
"Event sources" section#

The “Event sources” section displays all event sources and video streams, as well as the status, preview, group, and parameter settings for each event source (Figure 76).

The administrator has access to all event sources and all source settings. Sources can be webcams, USB and IP cameras (via the RTSP protocol), video files, and images.

“Event sources” section (for administrator)

The “Event sources” section contains the following elements:

  • buttons to control video stream processing:
      • “Play” button — starts stream processing (sends a request for stream processing; the stream is assigned to a specific FaceStream 5 instance, which starts processing it);
      • “Pause” button — pauses stream processing, for example, to save resources (processing is paused, but the stream remains assigned to the same FaceStream 5 instance);
      • “Stop” button — stops stream processing (processing stops and the stream is no longer assigned to that FaceStream 5 instance);
  • the list of streams from LUNA Streams and their parameters:
      • “Last frame” — event source video stream preview;
      • “Status” — current status (state) of video stream activity;
      • “Stream ID” — video stream identifier in LUNA Streams, generated when the stream is created;
      • “Name” — video stream name;
      • “Description” — additional user-defined information about the video stream;
      • “Group” — the name of the group to which the video stream is attached;
  • buttons for working with streams:
      • button to edit video stream parameters;
      • button to view the event source video stream;
      • button to delete a video stream;
  • “Add” button — adds a video stream (available only to users with administrator rights);
  • the number of video streams displayed on the page is set by the switch in the lower right corner of the page; there can be 10, 25, 50, or 100 video streams per page.

To search and filter video streams, use the following fields: “Stream ID”, “Name”, “Description”, and “Group”.

For a detailed description of how FaceStream 5 interacts with LUNA Streams, see the FaceStream 5 Administrator manual.

LUNA Streams UI is a service that allows the user to interact with FaceStream 5 when working with video streams (event sources).

The LUNA Streams interface allows the user to configure video stream parameters such as the stream type, stream source address, filtering parameters, etc. FaceStream 5 takes these settings from LUNA Streams for further processing.
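
Since stream settings live in LUNA Streams, they can in principle also be created programmatically through its API rather than the UI. The sketch below is a minimal illustration only: the endpoint path, port, and payload field names are assumptions, not taken from this manual; consult the LUNA Streams API reference for the actual schema.

    # Minimal sketch: creating a stream in LUNA Streams programmatically.
    # The endpoint path ("/streams"), the service port (5160) and the payload
    # field names below are assumptions for illustration only.
    import requests

    LUNA_STREAMS_URL = "http://127.0.0.1:5160"   # hypothetical LUNA Streams address

    stream_config = {
        "name": "entrance-camera-01",                   # "Stream name"
        "description": "Lobby turnstile, left lane",    # "Description"
        "group": "head-office",                         # "Group"
        "data": {
            "type": "tcp",                              # UDP / TCP / Videofile / Images
            "reference": "rtsp://some_stream_address",  # "Full path to the source"
            "roi": [0, 0, 100, 100],                    # "ROI coordinates" in %
        },
    }

    response = requests.post(f"{LUNA_STREAMS_URL}/streams", json=stream_config, timeout=10)
    response.raise_for_status()
    print("Created stream:", response.json())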

The following video stream statuses are possible:

  • “pending” — the stream has been taken into operation, but no FaceStream 5 handler has been found yet;
  • “in_progress” — the stream is being processed by the FaceStream 5 handler;
  • “done” — the stream is fully processed (for video files and images);
  • “pause” — stream processing has been paused by the user (not applicable to video files and images);
  • “restart” — stream processing has been restarted by the server. The status is temporary and can only appear when using the “autorestart” settings group (for more details, see the FaceStream 5 User manual);
  • “cancel” — stream processing has been canceled by the user. The “cancel” status may occur after the “pending”, “restart” and “pause” statuses;
  • “failure” — stream processing by the FaceStream 5 handler failed (e.g., an error occurred);
  • “handler_lost” — the FaceStream 5 handler has been lost and the stream needs to be transferred to another handler (not applicable to video files and images);
  • “not_found” — the stream was deleted by the user while processing;
  • “deleted” — the stream was deleted by the user intentionally.

The “restart” and “handler_lost” statuses are temporary. With these statuses it is not possible to get the stream; however, the transition through these statuses is logged as usual.

The “not_found” status is internal and is returned in the response if the stream was deleted during processing. With this status it is not possible to receive the stream.

The “deleted” status is virtual. A stream with such a status cannot exist, but this status can be seen in the stream's logs.

See the FaceStream 5 Administrator manual for details on transitions between statuses.
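
For automation purposes, the statuses listed above can be grouped by how a client application might treat them. The grouping below is an interpretation of the descriptions in this section, not an official API constant; a minimal sketch in Python:

    # Grouping of stream statuses as described above (interpretation, not an API constant).
    ACTIVE    = {"pending", "in_progress", "pause"}   # stream is assigned or waiting for a handler
    TEMPORARY = {"restart", "handler_lost"}           # transitional; the stream cannot be fetched in these states
    FINISHED  = {"done", "cancel", "failure"}         # processing has ended
    INTERNAL  = {"not_found", "deleted"}              # stream was deleted by the user

    def needs_attention(status: str) -> bool:
        """True if an operator should check the stream: processing stopped abnormally."""
        return status in {"failure", "handler_lost"}

    print(needs_attention("in_progress"))  # False
    print(needs_attention("failure"))      # True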

Adding a video stream#

Only a user with the administrator role can add a video stream.

To add a video stream, click the “Add” button (Figure 55). The “Create stream” form will open for specifying the settings (Figure 77).

Specify the values of the video stream parameters according to the table (Table 29). If you need to go back to the page with the source list while creating a video stream, press the Esc key on your keyboard.

“Create stream” form

Table 29. Event source settings

Columns: Parameter, Description, Default value.

General stream parameters

Account ID

The parameter is used to bind the received data to a specific user. Required field

Account ID in LP5

Stream name

The stream name displayed in the Service. Used to identify the source of sent frames.

Description

User information about the video stream

Group

The name of the group to which the stream is attached

Status

The current state of the stream. Possible statuses: Pause; Pending.

Pending

Stream data

Type

Video stream transmission type:

UDP;

TCP;

Videofile;

Images

The TCP protocol implements an error control mechanism that minimizes information loss and missing key frames at the cost of increasing network latency.

The UDP protocol does not implement an error control mechanism, so the stream is not protected from damage.

The use of the UDP protocol is recommended only if a high-quality network infrastructure is available.

For a large number of streams (10 or more), it is recommended to use the UDP protocol. When using the TCP protocol, there may be problems with reading streams.

UDP

Full path to the source

The path to the source of the video stream. Required field

For example, for TCP/UDP type: rtsp://some_stream_address

USB device number for TCP/UDP type: /dev/video0

To use a USB device, you must specify the --device flag with the address of the USB device when starting the FaceStream 5 Docker container (see “Launching keys” section in the FaceStream 5 Installation manual).

The full path to the video file for the Videofile type: https://127.0.0.1:0000/super_server/

The full path to the directory with images for the Images type: /example1/path/to/images/

To use video files and images, you must first transfer them to a Docker container

ROI coordinates

A limited frame area where a face or body is detected and tracked (for example, in a dense flow of people). Specify the ROI value in one of two formats: px or %.

The first two values specify the coordinates of the top left point of the area.

The second two values specify the width and height of the area of interest in px, or the width and height of the area relative to the current frame size in %. For example: 0,0,1920,1080 px or 0,0,100,100% (see the conversion sketch after this table).

Parameter setting can be done visually on the preview image when editing the event source. To do this, click on the gear button. In the opened window, grab the border of the detection area and move it. The width, height and coordinates of the detection area will take on new values. If you need to detect over the entire frame, click the "Full frame" button. Save your changes.

DROI coordinates

A limited area within the ROI zone. Face detection is performed in the ROI region, but the best frame is selected only in the DROI region. The face detection must lie completely within the DROI for the frame to be considered the best one (see the containment check in the sketch after this table). Specify the DROI value in one of two formats: px or %.

The first two values specify the coordinates of the top left point of the area.

The second two values specify the width and height of the area of interest in px, or the width and height of the area relative to the current frame size in %. For example:

0,0,1920,1080 px or 0,0,100,100%

DROI is recommended for use when working with access control systems.

This parameter is used only for working with faces.

Parameter setting can be done visually on the preview image when editing the event source. To do this, click on the gear button. In the opened window, grab the border of the best frame selection area and move it. The width, height and coordinates of the area will take on new values. If you need to select the best frame over the entire frame, click the "Full frame" button. Save your changes.

Rotation angle of the image from the source

Used when the incoming video stream is rotated (for example, if the event source is installed on the ceiling)

0

Frame width

The parameter is used only for the TCP and UDP types and is designed to work with protocols that imply the existence of several channels with different bit rates and resolutions (e.g., HLS).

If the stream has several such channels, this parameter allows you to select the channel whose frame width is closest to the value specified in this parameter

800

Endless

This parameter allows you to control how the stream is restarted when a network error is received.

The parameter is available only for the TCP and UDP types.

If the parameter is set to Enabled, then after an error is received and reconnection succeeds, stream processing continues. If all reconnection attempts fail, the stream takes the “failure” status.

If the parameter is set to Disabled, stream processing does not continue and the stream takes the “done” status.

When broadcasting a video file, the Disabled value is assumed. This will avoid re-processing an already processed video file fragment when an error is received.

If the parameter value is Enabled when broadcasting a video file, then after the processing is completed, the video file will be processed from the beginning

Enabled

Stream handler parameters

This group of parameters defines the parameters of the policy (handler) created in LP5, which will be used to process streams. Different handlers should be used for faces and bodies. The handler must be created in LP5 beforehand

Handler URL

The full network path to the deployed LP5 API service, including the LUNA Handlers and LUNA Events services required to generate an event by handler:

http://<IP>:<port>/

Where <IP> is the LUNA API service address and <port> is the port used by the API service. By default: 5000

Required field

API version

API version for event generation in LP5. Required field

API version 6 is currently supported

Handler ID for best shots (static)

The parameter allows using an external static handler_id of the LP5 policy for processing biometric samples of faces or bodies according to the specified rules.

When using this policy, LP5 generates an event that contains all the information received from FaceStream 5 and processes it according to the processing rules.

For example: aaba1111-2111-4111-a7a7-5caf86621b5a

Required field

URL to save original frames

This parameter specifies the URL for saving original frames of faces or bodies in LP5.

The URL can be either the address of the LUNA Image Store service container, or the address of the /images resource of the LUNA API service. When specifying an address to the /images resource, the original frame will be saved under the image_id identifier.

To send a frame, the send_source_frame parameter must be enabled.

An example of the address to the LUNA Image Store service container:

http://127.0.0.1:5020/1/buckets/<frames>/images

Where 127.0.0.1 is the IP address where the LUNA Image Store service is deployed; 5020 is the default port of the LUNA Image Store service; 1 is the API version of the LUNA Image Store service; <frames> is the name of the LUNA Image Store bucket where the face or body image should be stored. The bucket must be created by the user beforehand.

An example of the address to the /images resource of the LUNA API service:

http://127.0.0.1:5000/6/images

Where 127.0.0.1 is the IP address where the LUNA API service is deployed; 6 is the API version of the LUNA API service; 5000 is the default API service port

Authorization (Token)

This parameter specifies either a token or an LP5 account ID for making requests to the LUNA API service.

If the authorization field is not filled in, the LP5 account ID specified when creating the stream is used

Geoposition

This group of parameters includes information about location of video stream source

City

Area

District

Street

House number

Longitude

Latitude

Event source geographical location

Autorestart

This group of parameters allows you to configure automatic restart of the stream

Autorestart

Whether or not to use automatic stream restart

Enabled

Attempt count

Number of attempts to automatically restart the stream

10

Autorestart delay (in seconds)

Stream auto restart delay

60

Sending parameters

This group of parameters defines the period during which frames will be analyzed to select the best shot, as well as all parameters associated with compiling a collection of the best shots

Frame analysis period after which the best shot will be sent

The period starts from the moment a person appears in the frame — the first detection.

Decreasing this parameter allows the person to be identified more quickly, but with a greater error.

Possible values:

number of frames;

number of seconds;

-1 — frames are analyzed until the end of the track. At the end of the track (when the object leaves the frame), the best shot is sent to LP5.

-1

Wait duration between track analysis periods

Specifies the timeout between two consecutive tracks.

Possible values:

number of frames;

number of seconds;

0 — there is no timeout;

-1 — the timeout will last indefinitely

0

Track analysis and waiting period duration measure

Specifies the measurement type of the frame analysis period and the timeout period:

Seconds;

Frames

The choice depends on the business task

Seconds

Number of frames that the user sets to receive from the track or certain periods of this track

Defines the creation of a collection of the best shots of the track or of the track time interval specified in the “Frame analysis period after which the best shot will be sent” parameter.

This collection will be sent to LP5.

Increasing the value increases the probability of correct recognition of the object, but affects the network load.

Possible values: 1 or more.

1

Send only full set

Allows sending data (best shots and detections) only if the required number of best shots (“Number of frames that the user sets to receive from the track or certain periods of this track”) and the required track length (“Minimum detection size for Primary Track mode”) have been collected

Enabled

Delete bestshot and detection data

Allows deleting the best shots and detections after the data is sent. If disabled, the data remains in memory

Disabled

Use Primary Track

This group of parameters is designed to work with access control systems (ACS, turnstiles at entrances) to simplify control and the use of face recognition technology at the entrance to a protected area.

This group of parameters is only used for working with faces.

This group of parameters is not used for the Image type

Use Primary Track

If this parameter is set to Enabled, the Primary Track mode is enabled.

Of all the detections in the frame, the detection with the maximum size is selected and its track becomes the main one. Further analysis is performed based on this track.

The best shot from that track is sent to LP5.

When using the parameter at the checkpoint, the best shots of only the person closest to the turnstile will be sent (the condition for the largest detection is met)

Disabled

Minimum detection size for Primary Track mode

Sets the minimum detection size (vertical size in pixels) at which the analysis of stream frames and the determination of the best shot begin

70

Size of detection for the main track

Sets the detection size in pixels for the Primary Track.

When the detection size in pixels reaches the specified value, the track immediately sends the best shot to the server

140

Healthcheck parameters

This parameter group is used only when working with streams (TCP, UDP) and video files.

In this group, the user can set the parameters for reconnecting to the stream in case of stream playback errors (a simplified sketch of how these parameters interact is given after this table)

Maximum number of stream errors to reconnect to the stream

The maximum number of errors during stream playback.

The parameter works in conjunction with the “Error count period duration (in seconds)” and “Time between reconnection attempts (in seconds)” parameters.

After the first error is received, the system waits for the time specified in the “Time between reconnection attempts (in seconds)” parameter and then retries the connection to the stream.

If during the time specified in the "Error count period duration (in seconds)" parameter, the number of errors is greater than or equal to the number specified in the parameter, then the processing of the stream will be terminated, and its status will change to "failure".

Errors can be caused by a problem with the network or video availability

10

Error count period duration (in seconds)

A time criterion for reconnecting to the video stream. If the maximum number of errors occurs within the specified time, an attempt is made to reconnect to the video stream

3600

Time between reconnection attempts (in seconds)

After the first error is received, the system waits for the time specified in this parameter and then retries the connection to the stream

5

Filtering parameters

This group of parameters defines the criteria for filtering images and sending the resulting best shots

Threshold value to filter detections

Also called Approximate Garbage Score (AGS) for faces and Detector score for bodies — threshold for filtering face or body detections sent to the server.

All detections with a score above the parameter value can be sent to the server in an HTTP request; otherwise, the detections are not considered acceptable for further work.

The recommended threshold value was identified through research and analysis of detections on various images of faces and bodies

0.5187

Head rotation angle threshold (to the left or right, yaw)

The maximum value of the angle of rotation of the head to the left and right relative to the source of the stream (in degrees).

If the head rotation angle on the frame is greater than the specified value, the frame is considered unacceptable for further processing.

This parameter is used only for working with faces

40

Head tilt angle threshold (up or down, pitch)

The maximum value of the head tilt angle up and down relative to the source of the stream.

If the head tilt angle on the frame is greater than the specified value, then the frame is considered unacceptable for further processing.

This parameter is used only for working with faces

40

Head tilt angle threshold (to the left or right, roll)

The maximum head tilt angle to the left and right relative to the source of the stream.

If the head tilt angle on the frame is greater than the specified value, then the frame is considered unacceptable for further processing.

This parameter is used only for working with faces

30

Number of frames used to filter photo images by the angle of rotation of the head

Filtering cuts off images with faces strongly turned away from the stream source.

Specifies the number of frames for analyzing head rotation angles on each of these frames. If the angle is drastically different from the group average, the frame will not be considered the best shot.

This parameter is used only for working with faces

With a value of 1, the parameter is disabled. Recommended value: 7

1

Number of frames the system must collect to analyze head yaw angle

The parameter indicates to the system that it is necessary to collect the number of frames specified in the “Number of frames used to filter photo images by the angle of rotation of the head“ parameter to analyze the head rotation angle.

If the parameter is disabled, the Service will sequentially analyze incoming frames, i.e., first, two frames are analyzed, then three, and so on. The maximum number of frames in this sequence is set in "Number of frames used to filter photo images by the angle of rotation of the head".

This parameter is used only for working with faces

Disabled

Mouth overlap threshold (minimum mouth visibility)

If the received value exceeds the specified threshold, the image is considered unacceptable for further processing. For example, with a parameter value of 0.5, up to 50% of the mouth area is allowed to be covered. This parameter is used only for working with faces

0

Minimum body detection size

The parameter specifies the minimum body detection size; detections smaller than this value are not sent for processing. If the value is 0, body detections are not filtered by size

0

Liveness parameters

Liveness is used to check if there is a live person in the frame and prevents a printed photo or a photo from a phone from being used to pass the check. This group of parameters is only used for working with faces. This group of parameters is not used for the Image type

Check RGB ACS Liveness

Enables the mode of checking the presence of a person in the frame, based on working with the background.

The check execution speed depends on the frame size of the video stream. If the processing speed drops with the enabled parameter, you need to reduce the video resolution in the event source settings

Disabled

Check FlyingFaces Liveness

Enables the mode of checking the presence of a person in the frame, based on working with the environment of the face

Disabled

Track frames to run liveness check on

The parameter specifies for which frames of track the Liveness check will be performed.

Frame selection options:

First N shots;

Last N shots before best shot sending;

All shots of track.

The value “N” is specified in the parameter “Number of frames in the track for Liveness check when liveness-mode is enabled”

First N frames

Number of frames in the track for Liveness check when liveness-mode is enabled

The number of frames in a track for checking Liveness when using the “Track frames to run liveness check on” parameter

0

Threshold value at which the system will consider that there is a real person in the frame

The threshold value at which the Service considers that there is a living person in the frame.

The Service verdict on the presence of a real person in the frame will follow only if Liveness returns a value higher than the specified threshold value

0

Livenesses weights (RGB ACS, FlyingFaces)

The coefficient of influence of each type of Liveness checking on the final estimate of the presence of a living person in the frame.

Three values are indicated, referring to different types of Liveness.

Values are specified as fractions of one.

The ratio is scaled based on the given numbers, regardless of whether they sum to one and which Liveness methods are enabled

0; 0; 0

Number of background frames that are used for the corresponding checks

Allows setting the number of background frames in the track for the Liveness check.

Recommended value: 300. It is not recommended to change this parameter

0

Additional parameters

Frame processing

The parameter is used only for TCP, UDP and Videofile types.

Possible values:

Auto;

Full frame;

Scale frame. The parameter is set for a specific instance of FaceStream 5.

With the Full frame value, the frame is immediately converted to an RGB image of the required size after decoding. This results in better image quality but reduces the frame rate.

When set to Scale frame, the image is scaled based on the TrackEngine settings.

The default value is Auto. In this case, one of the two modes is selected automatically

Auto

Number of threads for video decoding

Sets the number of threads for video decoding with FFmpeg. As the number of threads increases, the number of processor cores involved in decoding increases. Increasing the number of threads is recommended when processing high-definition video (4K and above)

0

Maximum FPS for video processing

The parameter is used only for the Videofile type. The video is processed at the specified FPS. The video cannot be processed at an FPS higher than specified in this parameter.

If the video has a high FPS value and FaceStream 5 cannot operate at the specified FPS, then frames will be skipped.

Thus, the video file imitates a stream from a real video camera. This can be useful for performance tuning. The video will be played at the selected speed, which is convenient for load testing and further analysis.

The parameter is not used if the value is 0

0
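
The ROI and DROI formats from Table 29 can be illustrated with a short sketch. It assumes the value order x, y, width, height described above; the helper names are illustrative and do not correspond to anything in FaceStream 5 or LUNA Streams.

    # Sketch for the ROI/DROI formats described in Table 29 (x, y, width, height).
    # Helper names are illustrative only.

    def roi_px_to_percent(roi_px, frame_w, frame_h):
        """Convert an ROI/DROI given in pixels into percent of the current frame size."""
        x, y, w, h = roi_px
        return (100 * x / frame_w, 100 * y / frame_h, 100 * w / frame_w, 100 * h / frame_h)

    def detection_inside_droi(detection, droi):
        """True if a detection box lies completely inside the DROI (both in the same units)."""
        dx, dy, dw, dh = detection
        rx, ry, rw, rh = droi
        return dx >= rx and dy >= ry and dx + dw <= rx + rw and dy + dh <= ry + rh

    # The full-frame ROI "0,0,1920,1080" in px equals "0,0,100,100" in % for a 1920x1080 frame:
    print(roi_px_to_percent((0, 0, 1920, 1080), 1920, 1080))                # (0.0, 0.0, 100.0, 100.0)
    # A detection fully inside the DROI can be considered for the best shot:
    print(detection_inside_droi((400, 300, 200, 250), (0, 0, 1920, 1080)))  # True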
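
The interplay of the three Healthcheck parameters can also be summarized in a simplified sketch. The real logic lives inside FaceStream 5; the function and the read_frame()/reconnect() callbacks below are hypothetical and only mirror the behavior described in the “Healthcheck parameters” group.

    # Simplified sketch of the Healthcheck behavior described in Table 29.
    # read_frame() and reconnect() are hypothetical callbacks standing in for FaceStream 5 internals.
    import time

    def process_with_healthcheck(read_frame, reconnect,
                                 max_errors=10,      # "Maximum number of stream errors to reconnect to the stream"
                                 period_s=3600,      # "Error count period duration (in seconds)"
                                 retry_delay_s=5):   # "Time between reconnection attempts (in seconds)"
        error_times = []
        while True:
            try:
                frame = read_frame()
                if frame is None:              # stream ended normally
                    return "done"
            except IOError:
                now = time.monotonic()
                # keep only the errors that happened within the counting period
                error_times = [t for t in error_times if now - t < period_s] + [now]
                if len(error_times) >= max_errors:
                    return "failure"           # too many errors within the period
                time.sleep(retry_delay_s)      # wait before the next reconnection attempt
                reconnect()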

After saving the settings for the newly created video stream, the message “Source has been successfully created” will appear on the screen (Figure 78).

Confirmation of the successful event source creation

Video stream editing#

Click the button (4 in Figure 55) to start editing a video stream, or click the “Edit” button on the page for viewing the video stream of the source (Figure 79).

Go to stream editing from the video stream viewing page

The general view of the form for editing a video stream is similar to the form for creating a video stream (Figure 53).

Start editing stream parameters. If you need to go back to the page with the list of sources during editing, press the Esc key on your keyboard.

After saving the settings for the stream parameters, the message “Source was successfully updated” will appear (Figure 80).

Confirmation of a successful stream update

Video stream deleting#

To delete a video stream, click the button in the line with the video stream you want to delete (3 in Figure 52). In the pop-up window (Figure 81), confirm the action by clicking the “Delete” button, or cancel it by clicking the “Cancel” button.

Confirmation of event source deletion

After clicking the “Delete” button, a message about successful stream deletion appears (Figure 82).

Message about stream deletion