«Cameras» Section#
The Cameras section displays all cameras, their statuses and previews, and allows configuring video stream parameters for each camera. Both an RTSP stream and a video file can serve as a camera source.
CARS Analytics UI supports simultaneous work with multiple sources of video streams.
Only the Administrator can add, edit, and delete cameras.
The general view of the Cameras section is shown in Figure 70. The description of the columns of the camera table is presented in Table 31.
Table 31. Columns description
Column name | Description |
---|---|
Preview | Camera image preview |
Camera name | Camera name |
Type | Type of video source: video file or RTSP stream |
Status | Camera status (see Table 32) |
Update date | Date and time of the last update of the camera settings |
Camera control | Camera control buttons (see Table 33) |
This section contains cameras registered in CARS Analytics and tools for working with them. The color indicator to the left of the camera name shows the status of the camera (Table 32).
Table 32. Camera statuses
Status | Description |
---|---|
| The camera is operating correctly |
| Camera is not responding to CARS Analytics UI requests |
| The camera has been disabled by the administrator, or the camera setup process has not been completed |
| The camera is connecting |
All interactions with cameras are performed using the buttons located on the camera preview (Table 33).
Table 33. Camera options
Button | Description |
---|---|
| Camera restart. Reconnection to the camera according to the specified parameters (see section 7.4) |
| Camera editing (see section 7.5) |
| View the location of the camera (see section 7.7) |
| Viewing the video stream (see section [7.8](#viewing-video-stream)) |
Camera creation#
Adding a camera takes place on the «Camera creation» page after clicking the «Add» button in the «Cameras» section (see Figure 70). The general view of the «Camera creation» page is shown in Figure 71.
The camera creation page contains three blocks:
Block | Description |
---|---|
Connection | Parameters for connecting to the source |
Settings | Settings for camera parameters and detection and recognition zones |
Geoposition | Section for connecting the geolocation of the camera |
Connecting camera#
The camera connection options depend on the type of source selected.
An example of camera connection settings in the «Connection» block for the RTSP stream and Video file source types is shown in Figure 72. The description of each field is presented in Table 34.
Each field of the "Connection" block form is required.
Table 34. Parameters description
Name | Description | Values |
---|---|---|
Camera Name | Display name in the camera list. The maximum title length is 255 characters. | Can consist of 1-255 characters and can contain letters, numbers and symbols. |
Source type | RTSP stream or video file. It is recommended to use an RTSP stream in production; video files can be used for debugging or testing. | RTSP stream<br>Video file |
CARS Stream server address | Field for selecting the IP address of the server with CARS Stream installed. If the source is in the same network as the CARS Stream server, the server will be displayed in the drop-down list. | CARS Stream server from the drop-down list |
CARS Stream server address (manual input) | Input field for the IP address of the CARS Stream server; used to connect to a server outside the network. | IP address of CARS Stream |
Source Path | RTSP stream address or video file location. The path to the video file is set in absolute form, or relative to the location of CARS Stream on the server. | RTSP stream: rtsp://ip-address/…<br>Video file: /var/lib/luna/cars/video.mp4 |
Data Transfer Protocol | Video streaming protocol used by CARS Analytics. | TCP<br>UDP |
Camera operation switch | Switching the camera operation (gray - the camera is disabled, blue - the camera is enabled) | On/Off |
To add a camera, click the «Save» button. To cancel adding a camera, press [Esc] on the keyboard or left-click on the area outside the form.
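The constraints in Table 34 can be illustrated with a minimal validation sketch in Python (illustrative only; the field names, the helper function and the regular expression are hypothetical and not part of CARS Analytics UI):

```python
import re

# Hypothetical representation of the «Connection» block fields for the
# RTSP stream and Video file source types (see Table 34).
ALLOWED_SOURCE_TYPES = {"RTSP stream", "Video file"}
ALLOWED_PROTOCOLS = {"TCP", "UDP"}

def validate_connection(name, source_type, source_path, protocol):
    """Return a list of human-readable problems; an empty list means the form looks valid."""
    problems = []
    if not 1 <= len(name) <= 255:
        problems.append("Camera Name must be 1-255 characters long")
    if source_type not in ALLOWED_SOURCE_TYPES:
        problems.append(f"Unknown source type: {source_type}")
    elif source_type == "RTSP stream" and not re.match(r"^rtsp://", source_path):
        problems.append("RTSP source path must start with rtsp://")
    elif source_type == "Video file" and not source_path:
        problems.append("Video file path must not be empty")
    if protocol not in ALLOWED_PROTOCOLS:
        problems.append(f"Unsupported data transfer protocol: {protocol}")
    return problems

# Example: an RTSP camera filled in as described in Table 34.
print(validate_connection("Entrance gate", "RTSP stream",
                          "rtsp://192.168.0.10/stream1", "TCP"))  # -> []
```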
An example of camera connection settings in the «Connection» block for the ANPR stream source type is shown in Figure 73. The description of each field is presented in Table 43.
Each field of the «Connection» block form is required.
Table 43. Parameters description
Name | Description | Values |
---|---|---|
Camera Name | Display name in the camera list. The maximum title length is 255 characters. | Can consist of 1-255 characters and can contain letters, numbers and symbols. |
Source type | ANPR stream | ANPR stream |
User | ANPR stream username | Can consist of 1-255 characters and can contain letters, numbers and symbols. |
Password | ANPR stream user password | Can consist of 1-255 characters and can contain letters, numbers and symbols. |
ANPR Stream server address | Field for selecting the IP address of the server with ANPR Stream installed. If the source is in the same network as the ANPR Stream server, the server will be displayed in the drop-down list. | ANPR Stream server from the drop-down list |
ANPR Stream server address (manual input) | Input field for the IP address of the ANPR Stream server; used to connect to a server outside the network. | IP address of ANPR Stream |
Source Path | ANPR stream address or video file location. The path to the video file is set in absolute form, or relative to the location of ANPR Stream on the server. | http://ip-address/… |
Camera operation switch | Switching the camera operation (gray - the camera is disabled, blue - the camera is enabled) | On/Off |
To add a camera, click the «Save» button. To cancel adding a camera, press [Esc] on the keyboard or left-click on the area outside the form.
Camera Settings#
The camera settings section includes two subsections: managing recognition zones and camera parameters.
Managing recognition zones#
The camera settings section is used to manage detection and recognition zones (Figure 74). Description of settings is in Table 35.
Table 35. Camera settings description
Name | Description | Values |
---|---|---|
Angle of rotation | The angle of rotation of the image from the source (step 90°). It is used when the incoming video stream is rotated, for example, if the camera is installed upside down. | 0, 90, 180, 270 |
Automatic restart | Automatically attempts to connect to the camera when communication with the camera is lost. The number of attempts and time interval are set during installation | On/Off |
Camera Preview | The preview (the image from the camera) is necessary for drawing recognition and detection zones. CARS Analytics cannot work correctly without a preview. The algorithm for adding a preview from a file:<br>1. Take a snapshot of the video stream in the original resolution.<br>2. Click «Download preview from file».<br>3. Select an image in the Explorer window.<br>To add a preview from CARS Stream, click the corresponding button. After clicking, the «Getting a preview» task will be created (working with tasks is described in section 9); after its execution, the preview will be automatically added to the settings | An image in a format that meets the requirements specified in Table 6 |
Editing the detection zone | Detection zone sets the zone of interest processed by CARS Stream for the purposes of detection and tracking of objects (see section 7.2). | See section 7.2 |
Adding a recognition zone | Recognition zone defines the zone of recognition of vehicle and LP attributes inside the detection zone (see section 7.3). | See section 7.3 |
Camera parameters#
This subsection is intended for controlling camera settings and object detection and recognition parameters.
Changing the camera parameters can lead to errors in the operation of the LUNA CARS System. Settings should be configured only by experienced users or under the guidance of VisionLabs engineers.
Interface of camera parameters in the «Settings» block for RTSP stream and Video file source types (Figure 75). Description of settings is in Table 45.
Table 45. Camera parameters description
Name | Description | Possible values | Default value |
---|---|---|---|
General parameters | The section where the general camera settings are set | - | - |
Save license plate detections | Enabling the function of saving information about LP detections: coordinates of the position of the detected LP in the general history of movement. The data is available when uploading a JSON file generated by the CARS API | On/Off | Off |
Stream preview mode | Selecting the real-time video stream broadcast mode (see section 7.8) | - Disabled – preview mode is disabled<br>- Simple – preview mode, which displays only the BBox of detected objects<br>- Interactive – preview mode, which displays the BBox of detected objects and recognized attributes | - |
Classifiers parameters | Section where vehicle and LP attributes are chosen for recognition | - | - |
Vehicle classifiers | Drop-down list of available vehicle classifiers for recognizing vehicle attributes. When a vehicle is detected, the system recognizes only the chosen attributes; all excluded ones won't be recognized. Changing the list of attributes impacts system performance: the fewer attributes chosen, the higher the performance. | - Brands and models<br>- Vehicle type<br>- Vehicle emergency type<br>- Vehicle color<br>- Vehicle descriptor<br>- Public transport type<br>- Special transport type<br>- Vehicle wheel axles count | All available classifiers |
License plate classifiers | Drop-down list of available LP classifiers for recognizing LP attributes. When an LP is detected, the system recognizes only the chosen attributes; all excluded ones won't be recognized. Changing the list of attributes impacts system performance: the fewer attributes chosen, the higher the performance. | License plate symbols and country | License plate symbols and country |
Save full frames | Section to manage image saving settings | On/Off | On |
Save all best shots | By default, only the best frames at the beginning and end of the track are saved. This flag activates the saving of all best frames received by the system | On/Off | Off |
Detector parameters | The section where the detector parameters are set. The optimal values depend on the camera installation location and the purpose of use | - | - |
Detector size | The size of the image in pixels on the larger side on which the object is detected. Increasing the size allows better detection of objects in the background (or small LPs on large vehicles), but increases the processing time | 320…1280 | 640 |
Detector threshold | The threshold value of the detector for object detection. If the object crosses the detection zone less than the specified threshold, the detection will not be registered | 0…1 | 0.3 |
Redetect threshold | The threshold value of the detector for object redetection. If the object crosses the detection zone less than the specified threshold, the redetection will not be registered | 0…1 | 0.25 |
FGS is used | Enabling FGS | On/Off | Off |
Redetect parameters | The section in which the parameters of the redetector are regulated | - | - |
Finder of lost redetections | Selection of the algorithm for searching for lost detections after the redetection procedure | - Tracker – an algorithm in which the search for lost detections occurs by predicting the trajectory of an object<br>- Detector – an algorithm in which the search for lost detections occurs by restarting the detection algorithm on the frame | Tracker |
Max number of skipped redetects | The maximum number of frames on which the launch of the search for lost redetections can be skipped using the tracker | 0…1000 | 0 |
Max number of skipped redetects by FGS | The maximum number of frames on which the launch of the search for lost redetections using FGS can be skipped if regions with movements are not found | 0…1000 | 0 |
Intersection threshold with FGS regions | The threshold value of the intersection of a detection with the FGS movement regions. If the intersection exceeds this threshold, the redetector launch cannot be skipped | 0…1 | 0.5 |
FGS parameters | The section where the FGS technology settings are set | - | - |
Model frequency | Refresh rate of the FGS model in frames | 0…1000 | 20 |
Model scale | The factor by which the frame is scaled to build the FGS model | 0…1 | 0.25 |
FGS open kernel size | The size of the noise areas (in pixels) that need to be removed when getting the areas of the frame where there was movement | - | 12 |
FGS close kernel size | The size of the areas (assumed to be a static background) that need to be removed from foreground objects in pixels when getting regions where motion is detected | - | 72 |
FGS minimum detection | Minimum size of the foreground area returned for the FGS model | - | 300 |
Attributes correction parameters | The section where parameters for filtering vehicle attributes are set in order to exclude conflicting values, for example, when the emergency service type and the public transport type are recognized for the same vehicle. Recognition accuracy thresholds are set for the specified attributes: if the attribute recognition accuracy is below the specified threshold, the value is not taken into account and the accuracy is estimated as 0 (see the sketch after this table). For more information, see "Attribute correction parameters" in the document "CARS API. Administrator's Guide" | On/Off | Off |
Model threshold | The threshold value of recognition accuracy for the attributes "Vehicle brand" and "Vehicle Model". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Vehicle type threshold | The threshold value of recognition accuracy for the attribute "Vehicle type". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Vehicle emergency type threshold | The threshold value of recognition accuracy for the "Emergency Services" attribute. Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Special transport type threshold | The threshold value of recognition accuracy for the attribute "Special transport". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Public transport type threshold | The threshold value of recognition accuracy for the attribute "Public transport". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Analysis of license plate motion vector | The section of the detection and recognition settings of the LP in accordance with the direction of movement of the vehicle. It is recommended to use only for cameras installed in courtyards | On/Off | Off |
License plate tracking length | The maximum number of frames on which one LP is tracked with track preservation | 0…1000 | 24 |
Max frame without license plate | The maximum number of frames on which the LP can be skipped to save the track | 0…1000 | 3 |
License plate intersection threshold | Threshold value of the intersection of the LP with the recognition zone | 0…1 | 0.15 |
AGS settings for license plate | Enabling AGS technology. Allows you to evaluate the quality of LP crop | On/Off | Off |
Upper threshold | The AGS value of the license plate above which the image will be considered good | 0…1 | 0.85 |
Lower threshold | The AGS value of the license plate number below which the image will be considered bad | 0…1 | 0.5 |
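The attribute correction logic described in Table 45 can be illustrated with a minimal sketch (hypothetical data structures and function; the actual filtering is performed by CARS API and configured as described in "Attribute correction parameters" of the CARS API Administrator's Guide): recognitions whose accuracy is below the configured threshold are treated as having accuracy 0.

```python
# Hypothetical per-attribute recognition results: attribute -> (value, accuracy).
recognitions = {
    "vehicle_emergency_type": ("ambulance", 0.42),
    "public_transport_type": ("bus", 0.91),
}

# Hypothetical correction thresholds, one per attribute (cf. the threshold rows in Table 45).
thresholds = {
    "vehicle_emergency_type": 0.60,
    "public_transport_type": 0.50,
}

def correct_attributes(recognitions, thresholds):
    """Discard attributes whose recognition accuracy is below the configured threshold."""
    corrected = {}
    for attribute, (value, accuracy) in recognitions.items():
        if accuracy < thresholds.get(attribute, 0.0):
            corrected[attribute] = (None, 0.0)  # the accuracy is estimated as 0
        else:
            corrected[attribute] = (value, accuracy)
    return corrected

print(correct_attributes(recognitions, thresholds))
# {'vehicle_emergency_type': (None, 0.0), 'public_transport_type': ('bus', 0.91)}
```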
Camera parameter interface in the «Settings» block for the ANPR camera source type (Figure 76). Description of settings is in Table 46.
Table 46. Camera parameters description
Name | Description | Possible values | Default value |
---|---|---|---|
Classifiers parameters | Section where vehicle and LP attributes are chosen for recognition | - | - |
Vehicle classifiers | Drop-down list of available vehicle classifiers for recognizing vehicle attributes. When a vehicle is detected, the system recognizes only the chosen attributes; all excluded ones won't be recognized. Changing the list of attributes impacts system performance: the fewer attributes chosen, the higher the performance. | - Brands and models<br>- Vehicle type<br>- Vehicle emergency type<br>- Vehicle color<br>- Vehicle descriptor<br>- Public transport type<br>- Special transport type<br>- Vehicle wheel axles count | All available classifiers |
License plate classifiers | Drop-down list of available LP classifiers for recognizing LP attributes. When an LP is detected, the system recognizes only the chosen attributes; all excluded ones won't be recognized. Changing the list of attributes impacts system performance: the fewer attributes chosen, the higher the performance. | License plate symbols and country | License plate symbols and country |
Save full frames | Section to manage image saving settings | On/Off | On |
Save all best shots | By default, only one best frame is saved, the frame of the beginning and end of the track. This flag activates the saving of all the best frames received by the system | On/Off | Off |
Attributes correction parameters | The section where parameters for filtering vehicle attributes are set in order to exclude conflicting values, for example, when the emergency service type and the public transport type are recognized for the same vehicle. Recognition accuracy thresholds are set for the specified attributes: if the attribute recognition accuracy is below the specified threshold, the value is not taken into account and the accuracy is estimated as 0. For more information, see "Attribute correction parameters" in the document "CARS API. Administrator's Guide" | On/Off | Off |
Model threshold | The threshold value of recognition accuracy for the attributes "Vehicle brand" and "Vehicle Model". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Vehicle type threshold | The threshold value of recognition accuracy for the attribute "Vehicle type". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Vehicle emergency type threshold | The threshold value of recognition accuracy for the "Emergency Services" attribute. Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Special transport type threshold | The threshold value of recognition accuracy for the attribute "Special transport". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Public transport type threshold | The threshold value of recognition accuracy for the attribute "Public transport". Recognitions with an accuracy below the specified one are not taken into account when displaying recognition results | 0.00…1000 | 0…1000 |
Camera geoposition#
This block contains fields for filling in information about the camera location (Figure 77).
Filling in these fields will allow you to show the exact location of the camera on the map when you click on the corresponding icon in the list of cameras. By default, the coordinates shown in Figure 70 are filled in for each camera.
Detection Zone#
The detection zone sets the zone of interest processed by CARS Stream for the purposes of detecting and tracking objects. It is configured as a rectangle inside the frame.
To set up and edit the detection zone, click on the «Edit detection zone» button (see Figure 73).
CARS Stream only processes the zone inside the selected rectangle.
To create a detection zone, you need to add an image to the camera preview (see «CARS Analytics. Administration Manual»).
Proper use of the detection zone significantly increases the performance of LUNA CARS. The detection zone makes it possible to cut off frame areas for which there is no need to detect objects and determine their attributes.
The administrator can adjust the detection zone at any time.
An example of the location of the detection zone (Figure 78).
The detection zones are displayed on the preview as a yellow rectangular zone (only the selected zone will be processed by CARS Stream).
Editing Detection Zone#
To edit the detection zone, click «Edit detection zone». The general view of the detection zone editing form is shown in Figure 73.
To change the shape of the detection zone, «drag» it by one of the control points located along the perimeter of the zone.
To move the detection zone, click on the yellow zone and move it inside the preview.
To reset the detection zone setting, press the «Full frame» button.
Below the preview are the geometrical parameters of the detection zone:
- Frame size – preview size.
- Detection zone coordinates – X and Y coordinates with the starting point in the upper left corner of the preview.
- Width and height of the detection zone.
After finishing setting up the detection zone, click the «Save» button in the lower left corner of the window. To cancel adding a zone, press [Esc] on the keyboard or left-click on the area outside the form.
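The geometric parameters above can be illustrated with a small sketch (a hypothetical helper, not part of CARS Analytics UI) that keeps a rectangular detection zone, defined by its X and Y coordinates, width and height, inside the preview frame:

```python
def clamp_detection_zone(x, y, width, height, frame_width, frame_height):
    """Clamp a rectangular detection zone so that it stays inside the preview frame.

    The coordinate convention matches the editor: the origin is the upper left
    corner of the preview, X grows to the right, Y grows down.
    """
    x = max(0, min(x, frame_width - 1))
    y = max(0, min(y, frame_height - 1))
    width = max(1, min(width, frame_width - x))
    height = max(1, min(height, frame_height - y))
    return x, y, width, height

# Example: a zone sticking out of a 1920x1080 preview is pulled back inside.
print(clamp_detection_zone(1700, 900, 400, 400, 1920, 1080))  # (1700, 900, 220, 180)
```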
Recognition Zone#
Recognition zone sets the zone for registration of recognition events within the detection zone. It is an arbitrary polygon adjusted to the camera preview. CARS Analytics UI supports the creation of an unlimited number of recognition zones.
The scenario of interaction between the vehicle and the LP with the recognition zone is determined by the processors (see section 5.4).
Object detection and tracking is performed throughout the entire detection zone, but the best shot is selected only for objects that crossed the recognition zone with a given threshold. Using the recognition zone allows you to limit the zone for determining the best frame without losing information about the vehicle track outside the boundaries of the recognition zone.
Changing the position and size of the detection zone does not affect the position and size of the recognition zone.
The principle of interaction between objects and the recognition zone (Figure 79).
In this figure, Si is the area of the recognition zone (red) that is overlapped by the object's BBox in the i-th frame. The Si value is compared with the threshold value: the scenario is triggered when Si exceeds the threshold.
When constructing zones, it is necessary to take into account the movement of the vehicle in the vicinity of the zone, since the area of intersection of the BBox and the recognition zone will be different on different frames and for different trajectories of movement.
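A minimal sketch of this check is shown below (illustrative only; it uses the shapely library, and normalizing Si by the BBox area is an assumption made for the example, since the exact formula is defined by CARS Stream):

```python
from shapely.geometry import Polygon, box

# Hypothetical recognition zone: an arbitrary polygon in preview-pixel coordinates.
recognition_zone = Polygon([(300, 400), (900, 380), (950, 700), (280, 720)])

def zone_overlap_ratio(bbox, zone):
    """Fraction of the BBox area that lies inside the recognition zone (Si normalized by the BBox area).

    bbox is (x_min, y_min, x_max, y_max) in the same coordinate system as the zone.
    """
    bbox_polygon = box(*bbox)
    s_i = bbox_polygon.intersection(zone).area  # Si for the current frame
    return s_i / bbox_polygon.area

THRESHOLD = 0.15  # e.g. the license plate intersection threshold from Table 45

bbox_frame_i = (500, 450, 700, 650)
if zone_overlap_ratio(bbox_frame_i, recognition_zone) > THRESHOLD:
    print("Scenario triggered: the object is considered to be inside the recognition zone")
```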
An example of the «tracks» of vehicle trajectories (Figure 80).
In Figure 80, the area of the recognition zone that will be overlapped by BBox 1 is marked in red, and the area overlapped by BBox 2 is marked in yellow. The intersection area of BBox 1 is larger than the intersection area of BBox 2, so the best frame will be selected only for BBox 1.
The list of created recognition zones is displayed in the camera adding form (Figure 81). «Recognition zone name» is the name of the recognition zone. Each zone has a color, and the corresponding name has the same color.
Recognition Zone Creation#
To create and edit a recognition zone, click on the «Edit recognition zone» button. The recognition zone management window is shown in Figure 82. The recognition zone creation algorithm is given in Table 37.
It is recommended to build recognition zones in the camera plane without reference to the environment, since the check of the object's entry into the zone occurs with rectangular BBox, and not an arbitrary shape of the object. Recommendations for creating zones with examples of using handlers are given in section 5.4.
Table 37. Algorithm for creating a recognition zone
№ | Description |
---|---|
1 | Create a recognition zone. To create a recognition zone, you must specify the vertices of the polygon, which will be the recognition zone. To create a vertex, left-click on the camera preview. |
2 | It is not recommended to create more than 10 vertices. The polygon faces must not intersect (see the sketch after this table). The recognition zone must be larger than the possible objects to be detected. |
3 | Confirm the creation of the zone - click on any vertex of the polygon. |
4 | Save the recognition zone - click the «Save» button located under the preview. To reset the zone, click the «Start over» button. |
5 | Enter the name of the recognition zone in the «Recognition zone name» field. Can consist of 1-255 characters and can contain letters, numbers and symbols. |
6 | Choose a zone color or leave the default color. The color can be selected from a list or by entering the color's HEX code. |
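The geometric constraints from Table 37 can be illustrated with a small sketch (a hypothetical helper using the shapely library; the UI performs its own checks):

```python
from shapely.geometry import Polygon

def check_recognition_zone(vertices, max_vertices=10):
    """Return warnings for a recognition zone defined by its polygon vertices."""
    warnings = []
    if len(vertices) < 3:
        return ["A recognition zone needs at least 3 vertices"]
    if len(vertices) > max_vertices:
        warnings.append(f"More than {max_vertices} vertices is not recommended")
    if not Polygon(vertices).is_valid:  # self-intersecting faces make the polygon invalid
        warnings.append("The polygon faces must not intersect")
    return warnings

# A self-intersecting "bow-tie" polygon is reported as invalid.
print(check_recognition_zone([(0, 0), (100, 100), (100, 0), (0, 100)]))
```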
Editing Recognition Zone#
To edit a recognition zone, click the corresponding edit button. Editing a recognition zone is similar to the process of adding a zone (see section 7.3.1).
Deleting Recognition Zone#
The zone can be deleted by pressing the button in the line.
In the pop-up window (Figure 83), you must confirm the action by clicking the «Delete» button or cancel the action using the «Cancel» button.
Camera Restart#
In cases where the camera has been disconnected due to an error (the status indicator is red), the administrator has the option to manually restart the camera. The restart button is located on the camera preview.
When the camera is restarted, it reconnects to the source with the parameters specified in the camera properties (see section 7.1).
Editing Camera#
To edit a camera, click on the camera preview.
After clicking, the camera settings menu will open, in which you can change any parameters. For a description of the camera parameters, see section 7.1.
Deleting Camera#
You can delete a camera only when editing the camera. The delete button is located at the top of the camera page (see Figure 71).
Click the «Delete» button; a warning will appear (Figure 84) in which you must confirm the operation by clicking the «Delete» button or cancel the action using the «Cancel» button.
View Camera Location#
To view the location of the camera on the map, click on the camera preview.
This opens the Yandex.Maps website window (Figure 85). The window displays a marker of the camera location on the map; the coordinates and the address corresponding to these coordinates are indicated.
Placing camera on the map occurs using the administrator interface and is described in the document «CARS Analytics. Administration Manual».
Viewing Video Stream#
To view the processed video stream, click on the camera preview.
This opens a window in the browser (Figure 86), where CARS Stream detects and tracks the vehicle in the registration zone in real time in the video stream. The frame boundaries are determined by the detection zone settings (see section 7.2).
Viewing the processed video stream is used to debug the camera.