"Video analytics" section#
The "Video analytics" section is designed to work with video analytics streams. The section contains the following elements (Figure 93):
- The "Streams" tab—for showing video analytics stream data (2):
- Stream name;
- Stream status (e.g. "In progress", "Failure", etc.);
- Type of connected analytics ("Fight", "Fire", "Weapon", "People count");
- Location—a map showing the camera geotag;
- Stream ID—can be copied to search for streams by ID in filters;
- Group—name of the group to which the video stream is attached;
- Description—additional information about the video stream;
- The "Add" button, which opens a window for selecting the type of video analytics to create a stream;
- Filters—for searching streams. Click the button (1) and filter streams by status, video stream name, or group.
Manage video analytics streams using the following buttons:
- start stream processing;
- stop stream processing;
- delete stream.
Creating a video analytics stream#
To add a video analytics stream:
1. Click the "Add" button;
2. In the window that opens, select one or more video analytics types that the event source should process (Figure 94);
3. Set up the stream to evaluate events with a single video analytics type (e.g. a lying person) within one source, or with several types within the same source (e.g. people counting and a weapon detector), including duplicates of the same type (e.g. two fight detection processes with different thresholds), as sketched after this list (Figure 95);
4. Save the created stream.
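A minimal sketch of how such a stream could be represented is shown below. It only illustrates the idea from step 3 of combining several analytics (including two instances of the same type with different thresholds) within one source; the field names, values, and RTSP address are hypothetical and are not part of the Service's interface, which is configured through the form described below.

```python
# Hypothetical sketch of a stream definition combining several video analytics
# on one source, as described in step 3. Field names and values are invented
# for illustration only; the real stream is configured through the GUI form.
stream = {
    "name": "Entrance camera 1",
    "group": "Main office",
    "source": "rtsp://192.0.2.10/stream1",  # example RTSP path
    "analytics": [
        # People counting and a weapon detector within the same source
        {"type": "people_count", "parameters": {"number_of_people": 10}},
        {"type": "weapon", "parameters": {"threshold": 0.7}},
        # The same analytics type added twice with different thresholds
        {"type": "fight", "parameters": {"threshold": 0.6}},
        {"type": "fight", "parameters": {"threshold": 0.9}},
    ],
}
```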
Stream creation parameters are divided into basic parameters and parameters specific to the selected video analytics type(s) (Table 31).
Table 31. Basic parameters for creating a video analytics stream
| Parameter | Description |
|---|---|
| Basic parameters | |
| Name | The stream name displayed in the Service. Used to identify the source of sent frames |
| Description | User information about the video stream |
| Group | The name of the group to which the stream is attached. Grouping is intended to logically combine streams, for example by territory |
| Stream data | |
| Type | Video stream transmission type: |
| Full path to the stream | Path to the video stream source. To use video files, transfer them to a Docker container |
| Timing format | Select what the time dimension will be based on: |
| Start time | Starting point for timing. To activate, select |
| Camera rotation angle | Specify the angle to rotate the camera image: 90/180/270 degrees |
| Restart options | |
| Automatically restart | Enable if you want the stream to restart automatically |
| Attempts count | Number of attempts to automatically restart the stream |
| Restart delay (in seconds) | Delay before the stream is automatically restarted |
| Video analytics parameters | A set of parameters for the selected type of video analytics |
| Count type | For "People count" video analytics. Defines the condition for triggering the rule: |
| Number of people to create a track | For "People count" video analytics. Specify the number of people that meet the count type and must be in the frame to generate an event |
| Minimum event duration (in seconds) | For "People count" video analytics. Specify how many seconds the condition for the number of people must be continuously met for the system to record an event |
| Threshold for fight/weapon/fire/balaclava/niqab/lying person/hands up event generation, as well as the confidence threshold for the presence of an item in the frame of the small model for "Abandoned object" analytics | The minimum value required to create a track. Example: 0.7 means a 70% probability that a fight is present in the frame |
| Minimum object area (in pixels) | Specify the value obtained by multiplying the width and height of the object's bounding box (bbox). For 1080p camera resolution, the standard value is 450 px |
| Minimum amount of time an item must be left for an event to be created (in seconds) | For "Abandoned object" video analytics. Specify the time during which an object must remain motionless or without an owner to register an "Abandoned object" event |
| Threshold value accounting type | Value calculation type required to generate an event. The specified value must be greater than the threshold value for an event to be generated |
| The number of consecutive frames that satisfy the model and item area thresholds required to create a track | Specifies the number of frames required to create a track. The track begins when a detection of the specified video analytics type appears on the last frame of the specified number of consecutive frames |
| Frame analysis frequency measurement type | Frame rate unit: seconds or frames |
| Frame analysis frequency | Depending on the selected frame analysis frequency measurement type: |
| Event trigger point | When to generate events. Example: for people counting analytics, if there are 100 people in the frame for an hour, the service can generate events every 5 minutes to track changes or confirm a constant number of people |
| Event creation interval (in seconds) | The period at which events are created while the track is running |
| ROI coordinates | A limited frame area where detection occurs. Specify the ROI value in one of two formats: |
| DROI coordinates | A limited area inside the ROI. Detection is performed in the ROI, but the best frame is selected only in the DROI. A detection must be completely inside the DROI to be considered the best frame. Specify the coordinates in WKT format. Example: |
| Track fight/weapon/fire/hands up assessment | Determines the frame with the highest probability of fight/weapon/fire/hands up detection |
| Track frame with fight/weapon/fire/hands up | Adds to the event information an image of the frame where the maximum fight/weapon/fire/hands up probability was reached |
| Track people's coordinates | Calculates the coordinates of people in the frame |
| Track the frame with the highest number of people | Adds to the event information an image of the frame in which the maximum number of people was reached |
| Event image format | Determines the file format in which the image from the frame is saved: .png or .jpeg |
| Image quality | Specifies the image quality, from 0 to 1, to apply when processing the frame: 1 means the best quality, 0 the worst |
| Image size (in pixels) | Determines the size of the image saved from the frame |
| Save event to "Event archive" | Saves the event to the archive for later viewing of event details |
| Geoposition | Add information about the geographic location of the video stream by clicking the "Set geoposition" button: |
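Some of the numeric parameters in Table 31 are easier to pick with a small worked example. The sketch below is illustrative only: the bounding box size, confidence value, and ROI coordinates are assumed values, and the WKT polygon only demonstrates the general WKT syntax rather than a value taken from the Service.

```python
# Illustrative calculations for several numeric parameters from Table 31.
# All concrete numbers here are assumptions chosen for the example.

# Minimum object area (in pixels): the width of the object's bounding box
# multiplied by its height. For a 1080p camera the suggested default is 450 px.
bbox_width, bbox_height = 30, 15
object_area = bbox_width * bbox_height               # 30 * 15 = 450 px

# Threshold: the detector's confidence must exceed this value to create a track.
threshold = 0.7                                      # 0.7 corresponds to a 70% probability
detection_confidence = 0.82                          # example confidence reported by the model
creates_track = detection_confidence > threshold     # True in this example

# ROI / DROI coordinates: a polygon in WKT format limiting where detection
# (ROI) or best-frame selection (DROI) happens. A rectangle covering the
# pixel region from (100, 100) to (800, 600) can be written in WKT as:
roi_wkt = "POLYGON ((100 100, 800 100, 800 600, 100 600, 100 100))"

print(object_area, creates_track, roi_wkt)
```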