"Operator panel" section#
The “Operator panel” section displays face recognition events (Figure 7). It shows the last 30 events generated according to the settings of the handling policy for processing incoming images from the video stream. Events are received and displayed with minimal delay, in near real time.
![“Operator panel” section, face detection events](../../Images/User/image9.png)
Click the “Get events” switch (1) to show events received via the WebSocket protocol. Click the "Filters" button (2) to filter events by parameters (for more details, see the "Filtering events" section).
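For reference, the sketch below shows how a client application might subscribe to such events over WebSocket. The endpoint URL and the event field names are assumptions made for illustration only; the actual address and event schema are defined by the specific Luna POINT deployment.

```python
# A minimal sketch of subscribing to operator panel events over WebSocket.
# The endpoint URL and field names below are illustrative assumptions;
# consult your Luna POINT deployment for the actual values.
import asyncio
import json

import websockets  # third-party package: pip install websockets

EVENTS_URL = "ws://luna-point.example/ws/events"  # hypothetical endpoint


async def listen_for_events() -> None:
    """Print incoming events as they arrive, in near real time."""
    async with websockets.connect(EVENTS_URL) as ws:
        async for message in ws:          # each message is one event frame
            event = json.loads(message)   # events are assumed to be JSON
            print(event.get("event_type"), event.get("source"))


if __name__ == "__main__":
    asyncio.run(listen_for_events())
```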
The page shows the data of the last event (3). An event can be either a face detection or a video analytics event of the "People Count", "Fight", "Weapon", or "Fire" type (Figure 8). An event contains the following data:
![Section](../../Images/User/image181.png)
- Photo image from a video stream;
- "Source"—the name of the event source that recorded the event;
- “Place”—location of the event—house number, street, city, district, area;
- For face detection events:
    - Top match—the reference photo image of the matched face. If no matches are found for the photo from an event, the top match column for this event remains empty;
    - Event type—"Detection" when a face is detected in the frame, and "Match" when a match for the detected face is found in the database;
    - The date and time the event was recorded;
    - Information from the database linked to a person from the control list;
    - “Person ID”—the identifier of a person card from the control list. The value is displayed if such an ID is present;
    - “Lists”—the name of the list (or lists) to which the person is attached;
    - Similarity of the identified face to the reference (in percent, with color coding of similarity thresholds*);
- For video analytics events:
    - "Event type"—takes the values "People Count", "Fire", "Weapon", or "Fight" depending on the type of video analytics;
    - The date and time of the start and end of the event;
    - For a people counting event, the user sees the number of people detected in the processed video segment;
    - For a fire detection event, the user sees the probability of fire in the frame, in percent, as well as the fire class, such as "Black Smoke", "White Smoke", or "Fire";
    - For a weapon detection event, the user sees the probability of a weapon in the frame, in percent;
    - For a fight detection event, the user sees the probability of a fight in the frame, in percent.
* The color coding of similarity thresholds is configured in the config.json configuration file (see "Luna POINT. Administration Manual"):
- similarity values below “low” will be marked in red;
- similarity values between “low” and “medium” will be marked in yellow;
- similarity values above “medium” will be marked in green.
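As an illustration of this rule, the sketch below maps a similarity value to one of the three colors. It assumes that config.json exposes the thresholds as keys named "low" and "medium" with percent values; the actual layout of the configuration file is described in "Luna POINT. Administration Manual".

```python
# A minimal sketch of the similarity color coding described above. The keys
# "low" and "medium" as top-level entries of config.json are an assumption
# for illustration; the real file layout may differ.
import json


def similarity_color(similarity: float, config_path: str = "config.json") -> str:
    """Return the display color for a similarity value in percent."""
    with open(config_path, encoding="utf-8") as f:
        thresholds = json.load(f)
    low = float(thresholds["low"])        # e.g. 75
    medium = float(thresholds["medium"])  # e.g. 90
    if similarity < low:
        return "red"      # below "low"
    if similarity < medium:
        return "yellow"   # between "low" and "medium"
    return "green"        # above "medium"
```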
When you click on a block with an event on the right, the event details open.
Event details in the operator panel#
Event details are divided into four blocks and contain information about the date and location of the event, a full-frame photo of the event, a map and the video stream of the event, a summary of the match, and information about the last events involving the person from the card (for face detection events).
In the figure, the blocks of event details are highlighted with numbers 1-4 (Figure 9). Descriptions of the blocks are presented in tables (Tables 2-5).
![Event details](../../Images/User/image10.png)
Table 2. Elements of the first (1) block of event details: detailed information about the event depending on the type of event

Title | Event Type | Description
---|---|---
"Date created" | Face detection | Date and time the event was recorded
"Date created" | All video analytics | Date and time of the event start and end
"Source" | All events | Name or ID of the source that recorded the event
"Location" | All video analytics | House number, street, city, district, area
"Sample" tab | Face detection | When clicked, a sample is opened: a centered, cropped photo of the face from the event. Click the "Download" button to download the sample
"Full frame" tab | All events | When clicked, the full frame of the video stream on which the event was recorded is opened. Click the "Download" button to download the full frame
"Add to card" button | Face detection | When clicked, the "Add to card" window opens, where the user can add the sample to a person card by filling out its fields. After completing the fields, click the "Save" button, and the sample will be attached to the specified card and lists
"Search" button | Face detection | When clicked, the user goes to the "Search" page, which shows the search results for the current event ID
"Maximum number of people" | "People count" video analytics | Maximum number of people detected in the processed video segment
"Fight score" | "Fight" video analytics | The probability of a fight in the frame, in percent
"Weapon score" | "Weapon" video analytics | The probability of a weapon in the frame, in percent
"Fire score" | "Fire" video analytics | The probability of a fire in the frame, in percent
"Fire class" | "Fire" video analytics | Information about the type of fire, such as "Black smoke" or "Fire"
"Alarm" button | All video analytics | When pressed, a notification is sent to the security system service to activate the response team. To configure the alarm button in the interface, see "LUNA POINT. Administrator guide", section "Setting up an alarm button in the "Operator panel" section"
Table 3. Elements of the second (2) block of event details: video stream from the event site and a map with the geotag of the event

Name | Description
---|---
"Map" tab | When clicked, a map indicating the geotag of the event is shown
"Realtime Video" tab (active only for face detection events) | Displays the video stream from the source that recorded the event
"Video stream" tab (active only for face detection events) | When clicked, a video stream from the source that recorded the event is shown, allowing you to track the movement of a person in the camera's video stream
Table 4. Elements of the third (3) block of event details—for face detection events: information about the person whose face matched the detected face

Name | Description
---|---
"Card" tab | Shows the person card, where you can view all faces similar to the template (for more details, see the "Card Index" section). Active if the person from the template is linked to a card
"Match" tab | Displays information about the match, if a match was found
Table 5. Elements of the fourth (4) block of event details—for face detection events: list of the last events with the person from the card

Name | Description
---|---
Photo of the event | Photo of a face from the video stream
Similarity percentage | The degree of similarity of the identified face to the face from the card, in percent
Date | Date and time the event was recorded
Source | Name of the source that recorded the event
Location | Event location: house number, street, city, district, area
Filtering events#
Using filters, the user can quickly find an event among the last 30, as well as limit which new events are displayed on the screen. When the user clicks the "Filters" button on the “Operator panel” page, a menu with settings and filters opens (Figure 10).
![Selecting filters in the “Operator panel” section](../../Images/User/image170.png)
The number next to the icon shows the number of applied filters. A short description of the elements and parameters of the filter block on the “Operator panel” page is presented in the table (Table 6).
Table 6. Filters available to the user to search for the last events

Name | Description
---|---
General | Filters applied to face detection events
Source | Select one or more sources from the list of available ones
Handling policies | Names of the handling policies according to which the face in the image was processed. One or several handling policies can be selected for searching
Gender | Gender of the person to be detected
Age category | Lower and/or upper limits of the age of the person to be detected
Similarity, % | Lower and/or upper limits of similarity for displaying faces identified by the lists
Video analytics | Filters applied to video analytics events
People count | Filters applied to events of the "People count" video analytics
Source | Select one or more sources from the list of available ones
Count of people | Maximum number of people detected in the processed video segment
Fight | Filters applied to events of the "Fight" video analytics
Source | Select one or more sources from the list of available ones
Fight score, % | The probability of a fight in the frame, in percent
Weapon | Filters applied to events of the "Weapon" video analytics
Source | Select one or more sources from the list of available ones
Weapon score | The probability of a weapon in the frame, in percent
Fire | Filters applied to events of the "Fire" video analytics
Source | Select one or more sources from the list of available ones
Fire score | The probability of a fire in the frame, in percent
The user selects one filter or a combination of filters and clicks the “Filter” button to apply them. The applied filters also determine which new events appear on the screen.
To reset the applied filters, click the “Reset” button.
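To make the effect of the filters concrete, the sketch below shows how a set of face detection filters might narrow down a list of events on the client side. The event field names ("source", "similarity") and the filter parameters are illustrative assumptions, not the actual interface of the "Operator panel".

```python
# A minimal sketch of applying "Operator panel"-style filters to face
# detection events. Field names and parameters are illustrative assumptions;
# the real filtering is performed by the product itself.
from typing import Iterable, Optional


def filter_face_events(
    events: Iterable[dict],
    sources: Optional[set] = None,
    min_similarity: Optional[float] = None,
    max_similarity: Optional[float] = None,
) -> list:
    """Keep only the events that match all of the provided filters."""
    kept = []
    for event in events:
        if sources and event.get("source") not in sources:
            continue
        similarity = event.get("similarity")
        if min_similarity is not None and (similarity is None or similarity < min_similarity):
            continue
        if max_similarity is not None and (similarity is None or similarity > max_similarity):
            continue
        kept.append(event)
    return kept


# Example: show only events from the "Entrance camera" source with similarity of 80% or higher.
# recent = filter_face_events(all_events, sources={"Entrance camera"}, min_similarity=80)
```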