“Last events” section#
The “Last events” section displays object (face and body) detection and recognition events, as well as identification events based on lists.
The general view of the “Last events” section is shown below (Figure 8).
The section displays the last 30 events recorded according to the handling policies that process incoming images from video streams, terminals, REST requests, etc. Events are received and displayed with minimal delay, in near real time.
At the bottom of the screen, there is a “View events archive” button that leads to the “Events archive” section (for more details, see the “Events archive” section).
If departments are created and users and event sources are attached to a department, a user sees only the events recorded by the event sources of their department. For a user to see events, the administrator creates a department in the "Departments" section and then adds the user and the selected event sources to it.

The filter icon (1) on the right hides the filtering settings block. The page shows the following event data (2):
- "Event image"
- a photo image of the face from the video stream;
- a photo image of the body from the video stream;
- "Top Match"—the column is shown if the “Display top match” checkbox is active (3). If no matches are found for a photo from an event, then the column with the top match for this event will remain empty. The "Top match" includes:
- reference photo images of the face and/or body;
- value of similarity of the identified face with the reference (in percentage terms and with the color coding of similarity thresholds);
- "Match type"—the type of object (face or event), according to which the similarity of the identified face/body with the reference was found;
- "External ID"—external identifier of the face, the field is shown if such an ID is available (for "Face" in "Match type");
- "User data"—information from the database, linked to a person from the control (for "Face" in "Match type");
- "List"—the name of the list to which the person is attached (for "Face" in "Match type");
- "Date created"—date and time of fixing the event (for "Event" in "Match type");
- "Source"—the name of the event source that recorded the event (for "Event" in "Match type");
- "Handling policy"—the name of the handler, according to which the reference photo image of the body was processed (for "Event" in "Match type").
- "Event details" shows the available event data:
- "Date of created"—date and time of event registration;
- "Source"—the name of the event source that recorded the event;
- "Handling policy"—the name of the handler, according to which the reference photo images of the face/body were processed
- "Metadata"*—button for uploading arbitrary user data in JSON format, the filed is shown if such data was added to the event (for "Event" in "Match type").
- Face attributes, if found:
  - "Gender"—gender based on the face image;
  - "Age category"—age of the detected person;
- Body attributes, if found:
  - "Upper body colors"—indication of the color of the clothes of the human body upper part;
  - "Lower body colors"—indication of the color of the clothes of the human body lower part;
  - "Headwear"—the presence or absence of a headdress, if determined;
  - "Backpack"—the presence or absence of a backpack, if determined.
* All detailed capabilities and limitations of the "Metadata" field are specified in the "Administrator Manual" of LUNA PLATFORM 5 in paragraph 6.9.4 "Events meta-information".
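As a brief illustration, the "Metadata" field holds an arbitrary JSON object; the field names below are purely hypothetical examples, not part of LUNA PLATFORM 5:

```python
import json

# Hypothetical example of arbitrary user metadata attached to an event.
# Field names are illustrative only; any valid JSON object is accepted.
metadata = {
    "ticket_id": "A-1042",
    "entrance": "north",
    "vip": False,
}

# Serialize to the JSON string that would be stored with the event ...
payload = json.dumps(metadata, ensure_ascii=False)

# ... and parse it back, e.g. after downloading it via the "Metadata" button.
restored = json.loads(payload)
assert restored == metadata
```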
The color coding of similarity thresholds (3):
- similarity values below “low” will be marked in red;
- similarity values between “low” and “medium” will be marked in yellow;
- similarity values above “medium” will be marked in green.
Color coding of similarity thresholds is configured in the config.json configuration file (see Luna Clementine 2.0. Administration Manual).
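The mapping from similarity value to color can be sketched as follows. This is a minimal illustration: the function name and the threshold values 50 and 75 are assumptions for the example, while the real "low" and "medium" values are taken from config.json:

```python
# Sketch of the similarity color coding described above.
# The "low" and "medium" thresholds here are example values;
# the actual ones are configured in the config.json file.
LOW_THRESHOLD = 50      # percent, assumed example value
MEDIUM_THRESHOLD = 75   # percent, assumed example value

def similarity_color(similarity: float,
                     low: float = LOW_THRESHOLD,
                     medium: float = MEDIUM_THRESHOLD) -> str:
    """Map a similarity percentage to its display color."""
    if similarity < low:
        return "red"      # below "low"
    if similarity < medium:
        return "yellow"   # between "low" and "medium"
    return "green"        # above "medium"
```

For example, with these assumed thresholds a 42% match is shown in red and a 90% match in green.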
Last events filtering#
The Service allows you to filter last events (2 in Figure 8) to find and display necessary events (Figure 9).
The user can quickly find an event among the last 30 events, as well as set a limit on the number of new events displayed on the screen.

When a user clicks on the icon (1 in Figure 8) on the “Last events” page, a menu with settings and filters opens. The number next to the icon shows the number of applied filters. A short description of the elements and parameters of the filter block on the “Last events” page is presented in the table (Table 2).
Table 2. Filters available to the user to search for last events
| Name | Description |
|---|---|
| Event details | |
| "Sound notification" toggle and "Similarity threshold" parameter | Allows the user to configure sound alerts about the detection of an object whose similarity is not below the value specified in the "Similarity threshold" field |
| General | |
| Source | Selection of one or more sources from the list of available ones |
| Handling policies | Names of the handling policies according to which the face or body in the image was processed. One or several handling policies can be selected for searching |
| Tags | Selection of one or more tags. For example, the "Temperature"* tag is intended for displaying information about human body temperature and filtering events by temperature |
| Labels | |
| Label | Label name (labels are rules by which the comparison is made) |
| Similarity, % | Lower and/or upper limits of similarity for displaying faces identified by the lists |
| Face attributes and properties | |
| Gender | Gender of a person to be detected, determined by the image of a face |
| Age category | Age range of a person to be detected, determined by the image of a face |
| Emotion | Emotion of a person to be detected. A combination of several values is possible |
| Mask | Indication of the presence of a mask. A combination of several values is possible |
| Liveness | Liveness status selection. A combination of several values is possible |
| Deepfake** | Deepfake check status selection. A combination of several values is possible |
| Body attributes and properties | |
| Upper body colors | Top clothing color specification. A combination of several values is possible |
| Lower body type | Bottom clothing type specification. A combination of several values is possible |
| Lower body colors | Bottom clothing color specification. A combination of several values is possible |
| Shoes color | Shoe color specification. A combination of several values is possible |
| Headwear | Headdress specification. A combination of several values is possible |
| Headwear colors | Headdress color specification. A combination of several values is possible |
| Backpack | Backpack presence specification. A combination of several values is possible |
| Sleeve | Sleeve length specification. A combination of several values is possible |
| Gender by body | Gender of a person to be detected, determined by the image of a body. A combination of several values is possible |
| Age category by body | Age range of a person to be detected, determined by the image of a body |
| Location | |
| City; Area; District; Street; House number; Longitude (-180…180) and Accuracy (0…90); Latitude (-90…90) and Accuracy (0…90) | Event location |
| Other | |
| Comma-separated track IDs | Specifying track IDs |
| Add filter by meta*** | Allows you to fill in a set of blocks to create a filter by the "meta" field. The number of meta filters is unlimited. Certain blocks are required to be filled in when creating a filter by meta |
* Color coding of temperature values:
- increased temperature values will be marked in yellow;
- abnormal temperature values will be marked in red;
- normal temperature values will be marked in green.
See the LUNA Access documentation for more information on setting of temperature ranges.
** Deepfake license required
*** For advanced users
The user selects one filter or a combination of filters and clicks the “Filter” button to apply them.
To reset the applied filters, click on the “Reset” button.
The applied filters will affect the appearance of new events on the screen.
To collapse “Filters”, click on the filter icon on the right side of the screen.
Event card#
A card with detailed event data (Figure 10) opens when clicking on an arrow button on the face or body from the event image (Figure 8).

When an event contains data on the detection of both a face and a body, the event card gives the ability to switch between these data. If an event contains detection data for only one object, such as a face, then no detection data for the other object will be shown.
The event card consists of four blocks. The description of the elements of the event card is presented in Table 3.
Table 3. Elements and parameters of the event card
| Name | Description |
|---|---|
| Event details | |
| Event information | |
| Date created | Date and time of event recording |
| Additional information | “Event ID” — when clicked on |
| Track | “Track ID” — when clicked on |
| Handling policy | Name of the policy by which image processing in the video stream is performed. Clicking on the name of the policy opens the form for editing its parameters |
| Tags | Names of the tags by which the event is filtered |
| Liveness | Result of the Liveness check for person identification purposes (KYC) |
| Deepfake | Result of the Deepfake check to detect face substitution |
| Face detection image and body detection image when two objects are detected | Clicking on the face detection or the body detection switches the displayed data to the face or body detection details |
| Location | Location data (“City”, “Area”, “District”, “Street”, “House number”, “Coordinates (latitude)”, “Coordinates (longitude)”) |
| Metadata | Downloading arbitrary user data in JSON format, if available |
| Face Detection or Body Detection (if available) | |
| Find similar: events | Clicking on |
| Find similar: faces | Clicking on |
| Photo image of a face from a video stream | When clicked on |
| Face attributes | “Attributes”: “Gender” — gender of a person (male/female); “Age category” — specification of age category |
| Body attributes | “Basic Attributes”: “Gender by body” — gender of a person (male/female); “Age category by body” — specification of age; "Upper body" — specification of the type and color of clothing of the upper body; "Lower body" — specification of the type and color of clothing of the lower body; "Accessories" — specification of the presence or absence of a backpack |
| Additional properties | Head tilt angle (roll); Head tilt angle (pitch); Head rotation angle (yaw); Eye direction (pitch); Eye direction (yaw); Mouth state; Eye state. Image quality determination: Light; Dark; Blur; Specularity; Illumination. Determination of the attributes and properties of the face/body is set in the handler settings |
| Best match (type - "Event" or "Person") | Similarity value of the identified face/body with the face/body from the control list/event (in percent) |
| Find similar: events | Clicking on |
| Find similar: faces | Clicking on |
| Photo image of a face/body | Reference photo image of the face or body (sample) or no photo; when clicked on |
| Face attributes | “Attributes”: “Gender” — gender of a person (male/female); “Age category” — specification of age category |
| Body attributes | “Basic Attributes”: “Gender by body” — gender of a person (male/female); “Age category by body” — specification of age; "Upper body" — specification of the type and color of clothing of the upper body; "Lower body" — specification of the type and color of clothing of the lower body; "Accessories" — specification of the presence or absence of a backpack |
| Additional properties | Head tilt angle (roll); Head tilt angle (pitch); Head rotation angle (yaw); Eye direction (pitch); Eye direction (yaw); Mouth state; Eye state. Image quality determination: Light; Dark; Blur; Specularity; Illumination. Determination of the attributes and properties of the face/body is set in the handler settings |
| "Matches" | |
| Event photo | “Face” type — an avatar, sample, or no photo image; similarity value of the identified face with the face from the control list (in percent). “Event” type — detection (photo image of a face from a video stream); similarity value of the identified face or body with the face or body from the event (in percent) |
| Type | |
| Date created | Date and time of creation in the Service of the card of the face for which the identification event occurred |
| Label | Label name. Labels are rules by which the comparison is made |
| Additional information | “Face” type — “User data”, “Lists”, “External ID”. “Event” type — “Handling policy”, “Source”. When clicked on |
The external ID is used to integrate LUNA CLEMENTINE 2.0 with external systems, to transfer data to other systems in order to analyze and quickly respond to an event.
Face card#
To open the face card (Figure 11), click the arrow button on the face or body image in the "Top match" column. Users can also open the face card from the event card by clicking in the "Best match" block or next to "Additional information" in the "Matches" block.

The face card consists of two blocks. The description of the elements of the face card is presented in Table 4.
Table 4. Elements and parameters of the face card
| Name | Description |
|---|---|
| Face details | |
| Photo image of a face | Avatar — a biometric sample that is created when a photo image is uploaded to the list (to the LUNA PLATFORM 5 system); when clicked on |
| Update photo | Opens the form for uploading a new face photo image |
| Edit face user data | Opens the form for editing face data (“User data”, “External ID”, “Lists”) |
| Delete face | Removes the face biometric sample, the face photo image, and the face card (available only for the administrator role) |
| Date created | Date and time of creation of the biometric sample |
| Information | User data from the database, linked to the face (if available) |
| External ID | External identifier of the face |
| Lists (N) | The lists, and their number, to which the person is attached. Clicking on a name opens the list |
| Additional information | “Face ID” — when clicked on |
| Face attributes | “Attributes”: “Gender” — gender of a person (male/female); “Age category” — indication of age category |
| Additional properties | Face properties: Head tilt angle (roll); Head tilt angle (pitch); Head rotation angle (yaw); Eye direction (pitch); Eye direction (yaw); Mouth state; Eye state. Image quality determination: Light; Dark; Blur; Specularity; Illumination |
| Additional block | |
| Faces with the same external ID. Could be empty (“No external ID”) | |
| Photo image of a face | Avatar, sample, or no photo |
| Date created | Date and time of creation of the biometric sample |
| User data | Information from the database, linked to the face (if available) |
| Lists (N) | The lists, and their number, to which the person is attached. Clicking on a name opens the list; when clicked on |
| Last events with this face (there may be no events with this face) | |
| Photo image of a face from a video stream | Normalized image; when clicked on. Similarity value of the identified face with the face from the control list (in percent) |
| Date created | Date and time of recording the event with the face |
| Event source | The name of the event source that recorded the event with the face; when clicked on the name of the event source, a form for editing its parameters opens (available only for the administrator role) |
| Handling policy | Name of the policy that processed the image in the video stream. Clicking on the name of the policy opens the form for editing its parameters |
| View all faces with the same external ID | When clicked, a search for faces by external ID is performed, and a list of all faces whose external ID matches that of the reference photo opens |
Face card editing#
Updating the photo image in the face card is performed by clicking the “Update photo” button (Figure 11). The general view of the photo image update form in the face card is shown below (Figure 12).
Image file requirements:
- *.png, *.jpeg, or *.bmp format;
- image size no more than 15 MB and no more than 3840x2160 pixels;
- the image may contain one or more people;
- the image must contain a person's face.
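The format, size, and resolution requirements above can be checked before upload. The sketch below is illustrative only: the function name is an assumption, and the image dimensions are passed in explicitly rather than read from the file (which would require an image library such as Pillow):

```python
import os

# Requirements from the manual: *.png, *.jpeg, or *.bmp; at most
# 15 MB; at most 3840x2160 pixels.
ALLOWED_EXTENSIONS = {".png", ".jpeg", ".bmp"}
MAX_SIZE_BYTES = 15 * 1024 * 1024
MAX_WIDTH, MAX_HEIGHT = 3840, 2160

def check_image_file(path: str, width: int, height: int) -> list[str]:
    """Return a list of violated requirements (empty list = acceptable)."""
    problems = []
    ext = os.path.splitext(path)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        problems.append(f"unsupported format: {ext or 'none'}")
    # Only check the size if the file actually exists on disk.
    if os.path.exists(path) and os.path.getsize(path) > MAX_SIZE_BYTES:
        problems.append("file larger than 15 MB")
    if width > MAX_WIDTH or height > MAX_HEIGHT:
        problems.append(f"resolution above {MAX_WIDTH}x{MAX_HEIGHT}")
    return problems
```

Note that the check for the presence of a face in the image is performed by the Service itself and cannot be replicated by a local file check.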

Editing the face user data is performed by clicking the “Edit face user data” button (Figure 11). The general view of the face user data editing form is shown below (Figure 13).
The face user data editing form contains:
- “User data” — information from the database, linked to a face (upon availability);
- “External ID” — external identifier of the face;
- “Lists” — lists to which the face is attached;
- “Save” button — button for saving changes.
If you need to go back to the page with face details during editing, press the Esc key on your keyboard.
