"Video analytics" section#

The "Video analytics" section is designed to work with video analytics streams. The section contains the following elements (Figure 93):

  • The "Streams" tab—for showing video analytics stream data (2):
    • Stream name;
    • Stream status (e.g. "In progress", "Failure", etc.);
    • Type of connected analytics ("Fight", "Fire", "Weapon", "People count");
    • Location—map with the camera geotag indication;
  • Stream ID—can be copied to search for streams by ID in the filters;
    • Group—name of the group to which the video stream is attached;
    • Description—additional information about the video stream;
  • The "Add" button, which opens a window for selecting the type of video analytics to create a stream;
  • Filters—for searching streams. Click the button (1) and filter streams by status, video stream name, or group.
Figure 93. "Video analytics" section

Manage video analytics streams using the buttons:

  • Start stream processing;
  • Stop stream processing;
  • Delete stream.

Creating a video analytics stream#

To add a video analytics stream:

1. Click the "Add" button;

2. In the opened window, select one or more video analytics types that the event source should process (Figure 94);

3. Set up the stream to evaluate events with one video analytics (e.g. a lying person) within one source, or with several (e.g. people counting and a weapon detector) within the same source, including identical ones (e.g. two fight detection processes with different thresholds) (Figure 95);

4. Save the created stream.

Figure 94. Selecting a video analytics type to create a stream
Figure 95. Creation parameters of a video analytics stream

Stream creation parameters are divided into basic parameters and parameters specific to the selected video analytics type(s) (Table 31).

Table 31. Basic parameters for creating a video analytics stream

Parameter

Description

Basic parameters

Name

The stream name displayed in the Service. Used to identify the source of sent frames.

Description

User-provided information about the video stream

Group

The name of the group to which the stream is attached. Grouping is designed to logically combine streams, for example by territorial principle.

Stream data

Type

Video stream transmission type:

  • Videofile;
  • Stream

Full path to the stream

Path to video stream source:

  • For the "Stream" type rtsp://some_stream_address;
  • For the "Videofile" type https://127.0.0.1:0000/super_server/

To use video files, transfer them to a Docker container

Timing format

Select what the time dimension will be based on:

  • Auto—uses the pts value if a timestamp is present in the video or stream, otherwise uses the server time for streams and the frame rate for video files;
  • Pts—based on the timestamp inside the stream or file;
  • Server—based on the server time, available only for streams;
  • Frames—based on the frame rate, available only for video files.

Start time

Starting point for timing. To activate, select Pts as the timing format.

Camera rotation angle

Specify the angle to rotate the camera image: 90/180/270 degrees

Restart options

Automatically restart

Enable if you want the stream to restart automatically

Attempts count

Number of attempts to automatically restart the stream

Restart delay (in seconds)

Automatic restart delay of the stream
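The restart options above amount to a bounded retry loop: try again up to the configured number of attempts, waiting the configured delay between attempts. A minimal sketch in Python, assuming a hypothetical `process_stream` callable that stands in for the service's internal stream loop (it returns on a clean stop and raises on failure):

```python
import time

def run_with_restart(process_stream, attempts_count=3, restart_delay=5.0):
    """Run a stream-processing callable, restarting it on failure.

    `process_stream`, `attempts_count` and `restart_delay` mirror the
    restart options described above; the function itself is an
    illustrative assumption, not the service's API.
    """
    attempt = 0
    while True:
        try:
            process_stream()
            return "stopped"           # clean stop: no restart needed
        except RuntimeError:
            attempt += 1
            if attempt > attempts_count:
                return "failed"        # restart attempts exhausted
            time.sleep(restart_delay)  # wait before the next attempt
```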

Video analytics parameters

A set of parameters for the selected type of video analytics

Count type

For "People count" video analytics

Defines the condition for triggering the rule:

  • More than ≥: the event is created if the number of people in the frame is equal to or exceeds the specified value;
  • Less than ≤: the event is created if the number of people in the frame is equal to or less than the specified value.

Number of people to create a track

For "People count" video analytics

Specify the number of people that meet the count type and must be in the frame to generate an event

Minimum event duration (in seconds)

For "People count" video analytics

Specify how many seconds the condition for the number of people must be continuously met for the system to record an event
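The three "People count" parameters above (count type, number of people, minimum event duration) combine into a single trigger rule: the count condition must hold continuously for the given duration before an event is recorded. A hedged sketch, with function and parameter names that are illustrative rather than the service's API:

```python
def people_count_events(frame_counts, fps, count_type=">=", people=5,
                        min_duration=3.0):
    """Return timestamps (s) at which a people-count event fires.

    `frame_counts` holds one people count per analyzed frame, taken at
    `fps` frames per second. An event fires once per continuous run in
    which the condition holds for at least `min_duration` seconds.
    """
    events = []
    held_since = None   # start of the current run satisfying the condition
    fired = False       # whether this run has already produced an event
    for i, count in enumerate(frame_counts):
        t = i / fps
        ok = count >= people if count_type == ">=" else count <= people
        if ok:
            if held_since is None:
                held_since, fired = t, False
            if not fired and t - held_since >= min_duration:
                events.append(t)   # condition held long enough
                fired = True
        else:
            held_since = None      # condition broken: reset the run
    return events
```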

Threshold for fight/weapon/fire/balaclava/niqab/lying person/hands up event generation, as well as the small model's confidence threshold for the presence of an item in the frame for "Abandoned object" analytics

The minimum value required to create a track. Example: with a threshold of 0.7, a track is created only when the model is at least 70% confident that a fight is present in the frame.

Minimum object area (in pixels)

Specify the value obtained by multiplying the width and height of the object's bounding box (bbox). For 1080p camera resolution, the standard parameter value is 450 px.

Minimum amount of time an item must be left for an event to be created (in seconds)

For "Abandoned object" video analytics

Specify the time during which an object must remain motionless or without an owner to register an "Abandoned object" event

Threshold value accounting type

Value calculation type required to generate an event:

  • "Median"—the median value of the results over the last processed frames is used;
  • "Minimum"—the minimum value over the last processed frames is used.

The calculated value must exceed the threshold for an event to be generated
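The two accounting types can be sketched as a check over the most recent per-frame confidence scores. The window size and function name below are assumptions for illustration:

```python
from statistics import median

def passes_threshold(scores, threshold, accounting="Median", window=5):
    """Check the last `window` per-frame confidence scores against the
    threshold, using the accounting type described above."""
    recent = scores[-window:]
    value = median(recent) if accounting == "Median" else min(recent)
    return value > threshold
```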

The number of consecutive frames that satisfy the models and item area thresholds required to create a track

Specifies the number of frames required to create a track. The track begins on the last frame of a run of the specified number of consecutive frames, each containing a detection from the specified video analytics
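The consecutive-frames rule above can be illustrated with a short sketch (names are hypothetical):

```python
def track_start_index(frame_hits, required):
    """Return the index of the frame on which a track begins, or None.

    `frame_hits` holds one boolean per analyzed frame: whether that
    frame satisfied the model and item-area thresholds. The track
    begins on the last frame of the first run of `required`
    consecutive satisfying frames.
    """
    run = 0
    for i, hit in enumerate(frame_hits):
        run = run + 1 if hit else 0   # count the current run; reset on a miss
        if run == required:
            return i
    return None
```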

Frame analysis frequency measurement type

Frame rate unit: seconds or frames

Frame analysis frequency

Depending on the selected frame analysis frequency measurement type:

  • The interval in seconds after which a frame is selected for analysis. By default, frames are processed every 10 seconds;
  • The interval between frames that will be selected for analysis: the first and every (N+1)-th frame will be taken, where N is the specified value. By default, every 10th frame is processed.

The selected frames are then analyzed by video analytics and, if the threshold value is met, are selected to create a track
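One reading of the frame-based selection rule can be sketched as follows; the exact off-by-one behaviour of the real service is an assumption here:

```python
def frames_to_analyze(total_frames, n=10):
    """Indices of frames selected for analysis when the frequency is
    measured in frames: take the first frame, then skip N frames, so
    frames 0, N+1, 2*(N+1), ... are analyzed."""
    return list(range(0, total_frames, n + 1))
```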

Event trigger point

When to generate events:

  • Start—at the beginning of track creation;
  • End—at the end of the track;
  • Period—periodically while the track is active, at the specified interval

Example: for people counting analytics, if there are 100 people in the frame for an hour, the service can generate events every 5 minutes to track changes or confirm a constant number of people

Event creation interval (in seconds)

The period at which events are created while the track is running

ROI coordinates

A limited frame area where detection occurs. Specify the ROI value in one of two formats: px or %. The first two values are the coordinates of the top left point of the area. The second two values are the width and height of the area of interest: in pixels if the values are specified in px, or relative to the current frame size if specified in %. For example: 0,0,1920,1080 px or 0,0,100,100 %
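Converting a %-format ROI into pixel values follows directly from the format described above. A small sketch; the helper name is an assumption:

```python
def roi_to_pixels(roi, unit, frame_w, frame_h):
    """Convert an ROI given as (x, y, width, height) to pixel values.

    (x, y) is the top-left point of the area and (width, height) its
    size, either in pixels or as a percentage of the current frame size.
    """
    x, y, w, h = roi
    if unit == "%":
        x, w = x * frame_w / 100, w * frame_w / 100  # scale horizontally
        y, h = y * frame_h / 100, h * frame_h / 100  # scale vertically
    return (round(x), round(y), round(w), round(h))
```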

DROI coordinates

A limited area inside the ROI. Detection is performed in the ROI, but the best frame is selected only in the DROI. Detection must be completely inside the DROI to be considered the best frame. Specify the coordinates in WKT format. Example: POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10)) will create one quadrilateral zone.
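Parsing the WKT polygon and checking whether a detection lies completely inside it can be sketched as follows. This is an illustration, not the service's exact geometry test: the all-corners-inside check is only sufficient for convex zones, and the parser handles only the plain single-ring POLYGON form shown above:

```python
import re

def parse_wkt_polygon(wkt):
    """Parse a single-ring WKT POLYGON into a list of (x, y) points,
    e.g. POLYGON ((30 10, 40 40, 20 40, 10 20, 30 10))."""
    coords = re.search(r"\(\(\s*(.*?)\s*\)\)", wkt).group(1)
    return [tuple(map(float, p.split())) for p in coords.split(",")]

def point_in_polygon(pt, poly):
    """Ray-casting point-in-polygon test; `poly` is a closed ring
    (first point repeated at the end)."""
    x, y = pt
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:]):
        # Count edges crossing a horizontal ray to the right of the point
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

def bbox_in_droi(bbox, poly):
    """Approximate 'detection completely inside the DROI' check: all
    four corners of the (x, y, w, h) bbox must fall inside the zone."""
    x, y, w, h = bbox
    corners = [(x, y), (x + w, y), (x, y + h), (x + w, y + h)]
    return all(point_in_polygon(c, poly) for c in corners)
```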

Track fight/weapon/fire/hands up assessment

Determines the frame with the highest probability of fight/weapon/fire/hands up detection

Track frame with fight/weapon/fire/hands up

Adds to the event information an image of the frame where the maximum fight/weapon/fire/hands up probability value was reached

Track people's coordinates

Calculates the coordinates of people in the frame

Track the frame with the highest number

Adds to the event information an image of the frame in which the maximum number of people was reached

Event image format

Determines the file format in which the image from the frame is recorded: .png or .jpeg

Image quality

Specifies the image quality from 0 to 1 to apply when processing the frame: 1 is the highest quality, 0 the lowest.

Image size (in pixels)

Determines the size of the image saved from the frame

Save event to "Event archive"

Saving an event to the archive for later viewing of event details

Geoposition

Add information about the geographic location of the video stream by clicking the "Set geoposition" button:

  • Longitude (in degrees);
  • Latitude (in degrees);
  • City;
  • Region;
  • District;
  • Street;
  • House