
Sequence diagrams#

This chapter describes the general requests to LP and shows interactions between services during the request processing.

More information about the requests can be found in the OpenAPI specification.

Samples creation diagrams#

Sample creation diagram#

The request enables detecting faces in input photos. The photos can be sent in standard image formats (JPEG, PNG, etc.) or as Base64-encoded data.

See the "detect faces" request in OpenAPI specification for details.

| Request | Description | Type |
|---|---|---|
| detect faces | The request enables you to detect faces in incoming images, estimate face parameters, extract samples, and save them in Image Store. | POST |

The request result contains the parameters of the detected faces and the IDs for all the created samples.
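For illustration, a minimal request of this kind could look as follows. The base URL, endpoint path, and parameter names here are assumptions for the sketch; the exact contract is defined in the OpenAPI specification.

```python
import requests

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service

# Send a raw JPEG to a hypothetical face detection endpoint.
with open("photo.jpg", "rb") as image:
    response = requests.post(
        f"{API_URL}/detector",              # placeholder path, see the OpenAPI spec
        headers={"Content-Type": "image/jpeg"},
        params={"estimate_quality": 1},     # illustrative estimation flag
        data=image.read(),
    )

# The response is expected to contain detected face parameters and sample IDs.
print(response.json())
```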

Detect faces request

The general request processing scheme:

1․ The face detection request is sent to API.

2․ API receives the request, processes it and sends the task to the Remote SDK service.

3․ The Remote SDK processes the task according to the specified parameters.

4․ The Remote SDK sends the received samples to Image Store.

5․ Image Store saves the samples to the storage.

6․ The storage returns the result of sample saving.

7․ Image Store returns the sample IDs and URLs.

8․ The Remote SDK sends the sample IDs and the received face parameters to the API service.

9․ API generates and sends the response.

Get information about samples and save samples#

The following request adds external samples to Image Store.

See the "save samples" request in OpenAPI specification for details.

| Request | Description | Type |
|---|---|---|
| save sample | The external sample is saved in Image Store. | POST |

The following requests enable you to manage the already existing samples.

See the "samples" section in OpenAPI specification for details.

| Request | Description | Type |
|---|---|---|
| get sample | Receive a sample by its ID. | GET |
| remove sample | Delete a sample by its ID. | DEL |
| check to exist sample | Check existence of the sample with the specified ID. | HEAD |

All the requests above are processed by the Image Store service.
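A hedged sketch of these three calls is shown below; the sample endpoint path is a placeholder, and the real one is given in the "samples" section of the OpenAPI specification.

```python
import requests

API_URL = "http://127.0.0.1:5000"                     # assumed address of the API service
sample_url = f"{API_URL}/samples/faces/<sample_id>"   # placeholder path and sample ID

# HEAD — check that the sample exists (no body is returned).
exists = requests.head(sample_url).status_code == 200

# GET — download the sample image if it exists.
if exists:
    image_bytes = requests.get(sample_url).content

# DEL — remove the sample by its ID.
requests.delete(sample_url)
```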

Image Store request

The request processing scheme:

1․ A request is sent to the API service.

2․ API sends the request to Image Store.

3․ Image Store performs the required actions with the storage.

4․ The storage returns the required data.

5․ Image Store returns the result to API.

6․ API generates and sends the response.

More information about the requests can be found in the OpenAPI specification document in the "Samples" section.

Attributes diagrams#

Temporary attributes extraction diagram#

Handlers receives samples from Image Store and extracts information from them.

See the "extract attributes" request in OpenAPI specification for details.

You should specify the array of sample IDs.

| Request | Description | Type |
|---|---|---|
| extract attributes | Extracts the following attributes: descriptors and basic attributes (age, gender, ethnicity). | POST |
Temporary attributes extraction sequence diagram

The general request processing scheme:

1․ The extraction request is sent to API.

2․ API receives the request, processes it and sends the request to the Remote SDK service.

3․ The Remote SDK service requests samples from Image Store.

4․ Image Store requests the samples from the storage.

5․ The storage sends the samples.

6․ Image Store sends the samples to the Remote SDK service.

7․ The Remote SDK service processes the task according to the specified parameters.

8․ The Remote SDK service sends descriptors and basic attributes to the Faces service.

9․ The Faces service sends requests to store temporary attributes in the Redis database.

10․ Redis database sends the response to Faces.

11․ Faces sends the response to the Remote SDK service.

12․ The Remote SDK service sends received attribute IDs, basic attributes, attribute URLs, filtering results and GS score to the API service.

13․ API sends the response.
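As a sketch, an extraction request with an array of sample IDs might be sent like this; the path, flag names, and body layout are assumptions, not the exact OpenAPI schema.

```python
import requests

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service

response = requests.post(
    f"{API_URL}/extractor",                   # placeholder path
    params={
        "extract_descriptor": 1,              # illustrative flag: extract descriptors
        "extract_basic_attributes": 1,        # illustrative flag: age, gender, ethnicity
    },
    json=["<sample id 1>", "<sample id 2>"],  # the array of sample IDs to process
)

# The response is expected to contain temporary attribute IDs, URLs, and basic attributes.
print(response.json())
```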

Diagram of attribute creation using external data#

The diagram shows attributes creation using data from an external database.

See the "attributes" section in OpenAPI specification for details.

| Request | Description | Type |
|---|---|---|
| create temporary attribute | New temporary attributes creation. | POST |
External temporary attributes saving sequence diagram

1․ The request for adding new attributes (descriptors and/or basic attributes) is sent to the API service. All the data required for attribute creation is sent with the request. Optional: the request to the Image Store service (steps 2–5) is sent only when samples are provided along with the descriptors and/or attributes.

2․ The API service sends the request to the Image Store service to save the received samples.

3․ Image Store requests the samples from the storage.

4․ The storage sends samples.

5․ Image Store sends the samples to the API service.

6․ The API service sends descriptors and basic attributes to the Faces service.

7․ The Faces service sends requests to store temporary attributes in the Redis database.

8․ Redis database sends the response to Faces.

9․ Faces sends the response to the API service.

10․ The API service sends the response.

Attributes information diagrams#

The following requests enable you to receive data about the already existing attributes or delete them.

| Request | Description | Type |
|---|---|---|
| get temporary attributes | Receive all attribute IDs and their creation time according to the targets. | GET |
| get temporary attribute | Receive the temporary attribute information by ID. | GET |
| check temporary attribute | Check existence of an attribute by its ID. | HEAD |
| delete attributes | Delete the attribute with the specified ID. | DEL |
| get temporary attribute samples | Get all the temporary attribute samples by the attribute ID. | GET |
Get information about temporary attributes

The general request processing scheme:

1․ A request is sent to the API service.

2․ API sends the request to the Faces service.

3․ Faces service sends the request to the Redis database to receive information about temporary attributes.

4․ The Redis database returns requested information.

5․ The Faces service returns the result.

6․ The API service returns the information to the user. If the TTL of the attribute has expired, an error is returned.
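A small sketch of reading a temporary attribute and handling an expired TTL is given below; the endpoint path is a placeholder, see the OpenAPI specification for the exact one.

```python
import requests

API_URL = "http://127.0.0.1:5000"            # assumed address of the API service
attribute_id = "<temporary attribute id>"

response = requests.get(f"{API_URL}/attributes/{attribute_id}")  # placeholder path

if response.status_code == 200:
    print(response.json())  # attribute data: basic attributes, creation time, etc.
else:
    # Temporary attributes are stored in Redis with a TTL; once it expires,
    # the service returns an error instead of the attribute data.
    print("Attribute not found or its TTL has expired:", response.status_code)
```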

Faces and lists diagrams#

All the requests in this section are processed by the Faces service.

See the "faces" section in OpenAPI specification for details.

Face creation diagram#

| Request | Description | Type |
|---|---|---|
| create face | Create a new face with the specified attribute ID, user's data, avatar and lists. | POST |
Create new face

The general request processing scheme:

1․ A request is sent to the API service.

2․ API sends the request to the Faces service.

3․ The Faces service sends the request to the Redis database to receive temporary attributes. Optional: the request to the Redis DB is not sent if external attributes are specified for the face or if no attributes are set for the face.

4․ The Redis database returns requested information.

5․ The Faces service returns the result.

6․ The Faces service sends the request to the Faces database to create a new face using the specified data.

7․ The Faces database saves the data.

8․ The Faces service returns the information about the created face.

9․ The API service returns the information to the user.
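A minimal "create face" body could look like the sketch below; all field names are illustrative assumptions, and the exact schema is given in the "faces" section of the OpenAPI specification.

```python
import requests

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service

payload = {
    "attribute_id": "<temporary attribute id>",  # omit to create a face without attributes
    "user_data": "John Doe",
    "external_id": "employee-0001",
    "lists": ["<list id>"],                      # lists to attach the new face to
}

response = requests.post(f"{API_URL}/faces", json=payload)  # placeholder path
print("Created face:", response.json().get("face_id"))
```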

Faces and Lists information and management#

All the following requests have similar sequence diagrams.

The following requests enable you to create faces, receive data about the already existing faces or delete them.

| Request | Description | Type |
|---|---|---|
| get faces | Get the array of all the existing faces and their data: face ID, external ID, user's data, create time, avatar, account ID, and lists to which the face is attached. | GET |
| delete faces | Delete several faces by IDs. | DEL |
| get face count | Receive the number of existing faces according to the specified filter. | GET |
| get count of faces with attributes | Receive the count of faces with attributes. | GET |
| get face | Receive the data of the face (face ID, external ID, user's data, create time, avatar, account ID, and lists to which the face is attached) that corresponds to the specified ID. | GET |
| patch face | Update a face using the specified data: user's data, event ID, external ID, avatar. | PATCH |
| remove face | Delete the specified face. | DEL |
| check to exist a face | Check existence of the face by its ID. | HEAD |
| put face attribute | Put a face attribute, changing all the attribute data corresponding to the specified face. | PUT |
| get face attribute | Receive attributes of the specified face: gender, age, ethnicity, create time. | GET |
| delete face attribute | Remove a face attribute by ID. | DEL |
| get face attribute samples | Receive information about samples (sample IDs) that correspond to the specified face. | GET |

The following requests enable you to create lists and receive data about the already existing lists or delete them.

| Request | Description | Type |
|---|---|---|
| create list | Create a new empty list. You can specify user data for it. | POST |
| get lists | Receive the array of all existing lists with the following data: list ID, user data, account ID, creation time, last update time. | GET |
| delete lists | Delete several lists by IDs. | DEL |
| get list count | Get the number of existing lists. | GET |
| get list | Receive information (list ID, user data, account ID, creation time, last update time) about the list by the specified ID. | GET |
| check list existence | Check existence of the list with the specified ID. | HEAD |
| update list | Update the user_data field of the list. | PATCH |
| delete list | Delete the list with the specified ID. | DEL |
| attach/detach faces to the list | Update the list by attaching or detaching the specified faces from it. | PATCH |

The following diagram represents the workflow for all the requests to Faces listed above.

Faces request processing diagram

The general request processing scheme:

1․ A request is sent to the API service.

2․ API sends the request to the Faces service.

3․ The Faces service sends the request to the Faces database to receive information or manage the existing data.

4․ The Faces database returns requested information or information about database changes.

5․ The Faces service returns the result.

6․ The API service returns the result to the user.

Matching diagrams#

You should specify references and candidates for matching. You can limit the number of candidates with the highest similarity value.

| Request | Description | Type |
|---|---|---|
| matcher/faces | The matcher compares the given references with the given candidates. As a result, the matcher returns a similarity level for each of the candidates and additional information about the candidates. | POST |
| human body matching | Allows submitting tasks to a service that searches for human bodies similar to the given reference(s) by matching them. | POST |
| raw matching | Allows performing similarity calculations for input descriptors. | POST |

See the "matching faces", "human body matching" and "raw matching" sections in OpenAPI specification for details.

Matching using Python Matcher#

Matching by DB#

The example of matching events (references) with faces (candidates) is given below.

Matching of events and faces

1․ The matching request is sent to the API service.

2․ The API service sends the request to the Python Matcher service.

3․ The Python Matcher service requests references from the Events database.

4․ The Events database returns the data.

5․ The Python Matcher service sends the requests for matching to the Faces database.

6․ The matching is performed.

7․ The Faces database returns the matching results.

8․ The Python Matcher service requests additional data for candidates.

9․ The Faces database returns the data.

10․ The Python Matcher service returns results to the API service.

11․ The API service sends the response.

Matching by list#

The example of matching faces (references) with the list of faces (candidates) is given below.

Matching faces by list

1․ The matching request is sent to the API service.

2․ The API service sends the request to the Python Matcher service.

3․ The Python Matcher service requests references from the Faces database.

4․ The Faces database returns the data.

5․ The matching is performed in the Python Matcher service using the cached descriptors.

6․ The Python Matcher service requests additional data for candidates.

7․ The Faces database returns the data.

8․ The Python Matcher service returns results to the API service.

9․ The API service sends the response.

Handlers diagrams#

Handlers management requests#

A handler defines the logic for input image processing. You should specify a handler when a new event is created.

The following requests are related to the handler.

| Request | Description | Type |
|---|---|---|
| create handler | Create a handler. | POST |
| get handlers | Get handlers according to the specified filters. | GET |
| get handler count | Receive the number of existing handlers. | GET |
| get handler | Receive handler policies by the handler ID. | GET |
| replace handler | Update the fields of a handler. You should specify the handler ID and fill in all the required policies in the request body. Updating individual parameters of a handler is not allowed. | PUT |
| check to exist a handler | Check if the handler with the specified ID exists. | HEAD |
| remove handler | Remove a handler by ID. | DEL |

The general scheme of handler creation is shown in the figure below. All the handler requests have similar sequence diagrams.

Handler creation diagram

1․ A request for a new handler creation is sent from the API service to the Handlers service.

2․ The Handlers service processes the request and creates a handler.

3․ The Handlers service stores the handler in the Handlers database.

4․ The Handlers database returns the result.

5․ The Handlers service returns the ID of the created handler.

The handler is used upon Event creation and its usage example is described in the "Events" section. All the results are stored in the Events database.

Events diagrams#

Event creation general diagram#

An event is created after an image is processed according to a handler.

| Request | Description | Type |
|---|---|---|
| generate events | Create an event. You should specify the required handler ID or set dynamic handler policies, and specify images for processing. You can set additional parameters and data for the created event. | POST |
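A hedged "generate events" call might look as follows; the path and parameter names are placeholders, and the handler ID is assumed to have been created beforehand.

```python
import requests

API_URL = "http://127.0.0.1:5000"      # assumed address of the API service
handler_id = "<existing handler id>"   # created beforehand with "create handler"

with open("entrance_camera.jpg", "rb") as image:
    response = requests.post(
        f"{API_URL}/handlers/{handler_id}/events",  # placeholder path
        headers={"Content-Type": "image/jpeg"},
        params={"external_id": "camera-01"},        # illustrative additional event data
        data=image.read(),
    )

print(response.json())  # created events with detection, extraction, and matching results
```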

The sequence diagram for a new event creation will differ depending on the specified handler policies.

The sequence diagram below shows the general process of new event creation and includes only general entry points for the execution of policies.

Event creation diagram

1․ The API service sends the request for new Event creation to the Handlers service.

2․ The Handlers service receives the corresponding handler from the Handlers database.

3․ The Handlers database sends the handler to the Handlers service.

4․ The "detect_policy" and "extract_policy" are processed by the Remote SDK service. The received samples and attributes are stored in RAM.

5․ The "match_policy" is processed according to the provided filters. Descriptors received during the "extract_policy" execution are provided as references for matching. There can be several ways for matching execution:

  - Faces are set as matching candidates. The matching will be performed using the Faces DB. You can specify the necessary lists with faces using filters.
  - Events are set as matching candidates. The matching will be performed using the Events DB.
  - Attributes are set as matching candidates. The matching will be performed using the Redis DB.

6․ The "conditional_tags_policy" is processed by the Handlers service.

7․ The "storage_policy" is processed according to the specified data:

  - Face samples, body samples and source images are saved to the Image Store service;
  - Attributes are created and saved in the Redis database by the Faces service; 
  - Faces are created and saved in the Faces database by the Faces service. They also can be linked to lists using "link_to_lists_policy";
  - Events are saved to the Events database by the Events service;
  - Notifications are sent via pub/sub to Redis (see the ["Sender service"](services-description.md#sender-service) section).

8․ The Handlers service returns results to the API service.
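To make the policy flow above more concrete, the sketch below assembles a handler body from the policies named in steps 4–7. Only the policy names come from this document; every nested field and value is an illustrative assumption, not the exact schema from the OpenAPI specification.

```python
# A hedged sketch of a handler body; nested fields are illustrative placeholders.
handler_body = {
    "description": "entrance camera handler",
    "detect_policy": {"detect_face": 1},              # step 4: detection in Remote SDK
    "extract_policy": {"extract_descriptor": 1},      # step 4: descriptor extraction
    "match_policy": [                                 # step 5: matching by filters
        {
            "label": "vip",
            "candidates": {"origin": "faces", "filters": {"lists": ["<list id>"]}},
        },
    ],
    "conditional_tags_policy": [                      # step 6: tags based on match results
        {"tag": "vip_visitor", "filters": {"match": [{"label": "vip", "similarity__gte": 0.9}]}},
    ],
    "storage_policy": {                               # step 7: what to store and where
        "face_sample_policy": {"store_sample": 1},    # samples go to Image Store
        "face_policy": {
            "store_face": 1,                          # faces go to the Faces database
            "link_to_lists_policy": [{"list_id": "<list id>"}],
        },
        "event_policy": {"store_event": 1},           # events go to the Events database
        "notification_policy": {"send_notification": 1},  # pub/sub notification via Redis
    },
}
```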

Get statistics on events and events information#

| Request | Description | Type |
|---|---|---|
| get statistics on events | Receive statistics on events. The target group enables you to aggregate events according to the specified aggregators: number of items, max or min value, average value, grouping. You can also apply filters and periods to select the events that should be aggregated. Grouping by frequency or grouping by time intervals is available. | POST |
| get events | Receive all events that satisfy the specified filters. In the filters you can set values of one or several event fields. The target parameter enables you to select the fields that you want to receive in the response. You can leave the target field of the request empty; in this case, all data about the events is shown. You can also set sorting parameters, the number of events per page, and the number of pages. If create_time__gte is not set, it is set to one month by default. | GET |
| get event | Receive all fields of the event. You should specify "event_id" for the request. | GET |
| check event existence | Check existence of the specified event. You should specify "event_id" for the request. | HEAD |
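A hedged "get events" query with filters, targets, and paging could look like this; the parameter names are assumptions except create_time__gte, which is mentioned above.

```python
import requests

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service

params = {
    "create_time__gte": "2024-01-01T00:00:00Z",   # without it, the last month is used
    "handler_ids": "<handler id>",                # illustrative filter on an event field
    "targets": "event_id,create_time",            # fields to include in the response
    "order": "desc",                              # illustrative sorting parameter
    "page": 1,
    "page_size": 100,
}

response = requests.get(f"{API_URL}/events", params=params)  # placeholder path
print(response.json())
```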

The sequence diagram describes the request processing.

Events information

1․ The API service receives a request to Events.

2․ API sends the request to Events.

3․ Events receives the required data from the database.

4․ The database returns result.

5․ Events returns results.

6․ API returns results.

Tasks diagrams#

This section provides diagrams of the sequence of tasks.

The section "General diagram of task creation and execution" shows the general process of task processing. For some tasks, the process may be different. For example, for a Clustering task, only one subtask is created and only one worker is used. Separate comments are provided for such tasks.

General diagram of task creation and execution#

A diagram of the general process of creating and executing a task is presented below.

Note that the Tasks worker is constantly waiting for any data to be received. The service has a certain priority for the execution of work, namely:

1․ Checking whether the current task/subtask needs to be canceled (not reflected in the diagram).

2․ Requests for splitting into subtasks.

3․ Requests for merging results.

4․ Requests for processing subtasks.

This means that the Tasks worker will not process a subtask request until it performs a cancellation check on the current task.

Starting task creation#

This section describes the process of starting to create a task.

Task creation diagram

1․ The API service receives a request to create a task.

2․ The API service sends a request to the Tasks service.

3․ The Tasks service performs task validation, creates it and prepares for its processing. At this stage, the Tasks service can perform additional actions, for example, get the "descriptor_id" to perform the task "Additional extraction".

4․ The Tasks service sends data about the task being created to the Tasks database.

5․ The Tasks service receives a response.

6․ The Tasks service sends the task ID "task_id" to the API service.

7․ The user receives the task ID, i.e. the API service returns the ID of the created task and does not wait for the task to complete. Using this identifier, the user can check the status of the task.

To get a report with the status of task completion and perform other actions with the task, you should use the task ID. The relevant requests are listed in the "General information about tasks" section.
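A simple polling loop over the "get task" request could look like the sketch below; the endpoint path and the status values other than "in_progress" are assumptions.

```python
import requests
import time

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service
task_id = 123                      # ID returned when the task was created

# Poll "get task" until the task leaves the in-progress state.
while True:
    task = requests.get(f"{API_URL}/tasks/{task_id}").json()  # placeholder path
    if task.get("task_status") != "in_progress":
        break
    time.sleep(5)

print("Task finished with status:", task.get("task_status"))
```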

Splitting tasks into subtasks#

This section describes the process of splitting a task into subtasks. If only one subtask is assumed for some task (for example, Clustering task), then the task will be split into one subtask according to the process described below.

8․ The Tasks service sends data to Redis.

The Tasks service starts waiting for subtasks to appear in Redis.

The Tasks Worker service is waiting for data to appear in Redis to split them into subtasks.

9․ The Tasks Worker service receives data from Redis.

10․ The Tasks Worker service performs the process of splitting into subtasks.

11․ The Tasks Worker service sends subtasks back to Redis.

12․ The Tasks service receives subtasks.

13․ The Tasks service stores information about subtasks in the Tasks database.

14․ The Tasks service sends subtasks to Redis.

Processing each subtask#

This section describes the general process of processing a subtask. Each task type has its own processing flow, described in the corresponding section below.

The Tasks service starts waiting for a message to appear about the start of processing a subtask in Redis.

The Tasks Worker service is waiting for data to appear in Redis to process a subtask.

15․ The Tasks Worker service receives data from Redis.

16․ The Tasks Worker service starts processing the subtask.

17․ The Tasks Worker service sends a message to Redis that processing has started.

18․ The Tasks service receives a message about the start of processing.

19․ The Tasks service updates the task information in the Tasks database.

The Tasks service starts waiting for the result to appear.

20․ The Tasks Worker service saves the task report to the Image Store service.

21․ The Image Store service sends a request to the Storage to save the report.

22․ The Image Store service receives a response from the storage.

23․ The Image Store service returns a response to the Tasks Worker service.

24․ The Tasks Worker service sends a message to Redis that the subtask has been processed or that some error occurred during processing.

25․ The Tasks service reads the corresponding message from Redis.

26․ The Tasks service updates the task information in the Tasks database.

27․ After processing the last subtask, the Tasks service updates the task information in the Tasks database.

Merging results and completing processing#

This section describes the general process of merging the results and completing the processing. If only one subtask is assumed for some task (for example, Clustering task), then merging the results will not be performed. See the "Task processing completion" section for a specific task.

28․ The Tasks service creates an internal task to combine all the results and sends this task to Redis.

The Tasks service starts waiting for the result to appear.

The Tasks Worker service is waiting for a merge task to appear.

29․ The Tasks Worker service receives the task data for merging.

30․ The Tasks Worker service merges the results of all subtasks.

31․ The Tasks Worker service saves the task report to the Image Store service.

32․ The Image Store service sends a request to the Storage to save the report.

33․ The Image Store service receives a response from the storage.

34․ The Image Store service returns a response to the Tasks Worker service.

35․ The Tasks Worker service sends the result of the merge to Redis.

36․ The Tasks service reads the result of the merge from Redis.

37․ The Tasks service updates the task data in the Tasks database.

38․ The Tasks service updates the status of the task in the Tasks database.

General diagram of task cancellation#

A diagram of the general task cancellation process is presented below.

Task cancellation diagram

1․ The user sends a request to cancel the task to the API service.

2․ The API service sends a request to cancel the task to the Tasks service.

3․ The API service returns a response that the task cancellation process has been activated.

4․ The Tasks service updates the status of the task and its unfinished subtasks to "cancelled" in the Tasks database.

5․ The Tasks service receives a response.

6․ The Tasks service deletes entries in Redis that relate to the task being canceled.

7․ The Tasks service adds an entry to the Redis Stream stating that the task should be canceled if there are tasks whose status is "in_progress".

8․ The Tasks worker detects a cancellation entry in Redis Streams and deletes it.

The Tasks worker performs a check for the presence of records throughout the processing.

9․ The Tasks worker cancels task processing.

10․ The Tasks service receives the cancellation result from Redis.

Redis is used to store data, including records, which are represented as event streams in Redis Streams.

At this stage, the task is considered completely canceled, i.e. the status "2" ("cancelled") will be returned in responses to requests like "get task".
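A hedged sketch of cancelling a task and checking its final status is shown below; the endpoint paths and the response field name are assumptions, while status "2" ("cancelled") comes from the text above.

```python
import requests

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service
task_id = 123                      # ID of the task to cancel

# "cancel task" is a PATCH request; the path here is a placeholder.
requests.patch(f"{API_URL}/tasks/{task_id}")

# After the cancellation flow above completes, "get task" should report
# status "2" ("cancelled").
task = requests.get(f"{API_URL}/tasks/{task_id}").json()
print("Task status:", task.get("task_status"))
```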

Clustering task diagrams#

| Request | Description | Method |
|---|---|---|
| clustering task | Creates the task for clustering faces or events according to the specified filters. | POST |

Clustering task creating#

Creating a task is performed in the standard way described in the "Starting task creation" subsection of the general Tasks service diagram.

Splitting Clustering task into subtask#

A single subtask is created for this task type. The subtask contains filters according to which faces or events are selected. The subtask is processed by a single worker.

See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Clustering subtask processing#

Clustering task processing depends on the objects (faces or events) specified in the request.

The general workflow for processing a subtask is shown below.

Clustering task processing diagram

1․ The Tasks worker receives data from Redis.

2․ Faces: The Tasks worker sends the request to Faces service. The Tasks worker requests all the required attribute IDs according to the filters specified in the subtask.

3․ Faces: The Faces service sends the request to the Faces database.

4․ Faces: The Faces database returns IDs.

5․ Faces: The Faces service sends IDs to the Tasks worker.

6․ Events: The Tasks worker sends the request to Events service. The Tasks worker requests all the required attribute IDs according to the filters specified in the subtask.

7․ Events: The Events service sends a request to the database to receive data.

8․ Events: The Events database sends the required data.

9․ Events: The Events service sends data to the Tasks worker.

10․ The Tasks worker sends a matching request. The request is processed according to one of the schemes described in the "Matching diagrams" section.

11․ The Python Matcher service sends the results.

12․ The Tasks service saves the task report to the Image Store service.

13․ The Image Store service sends a request to the Storage to save the report.

14․ The Image Store service receives a response from the storage.

15․ The Image Store service returns a response to the Tasks service.

16․ The Tasks worker sends the result to Redis.

Clustering task processing completion#

Next, the Tasks service receives the result from Redis and updates the status of the task in the Tasks database.

Linker task diagrams#

| Request | Description | Method |
|---|---|---|
| linker task | The request enables you to create a linker task. | POST |

Linker task creation#

The linker task can be created for faces and events objects. Linker task creation process depends on the object type.

Attach faces to list

Creation of linking task for faces

1․ The API service receives a request to create a task.

2․ The API service sends a request to the Tasks service.

3․ The Tasks service creates a new linker task.

4․ The Tasks service sends information to the Tasks database.

5․ The task ID is returned from the Tasks database.

6․ The task ID is sent to the API service.

7․ The API service sends the task ID as a response.

8․ Optional. If you have specified a list ID in the request, the Tasks service checks the existence of the list.

9․ Optional. The Faces service checks the list existence in the Faces database.

10․ Optional. The Faces database sends response.

11․ Optional. The Faces service sends response to the Tasks service.

12․ Optional. If the specified list does not exist or you have specified new list creation in the request, the Tasks service sends request for new list creation.

13․ Optional. The Faces service creates the list in the Faces database.

14․ Optional. The Faces database sends response.

15․ Optional. The Faces service sends the response to the Tasks service.

16․ The Tasks service sends data to Redis.

Attach faces created from events to list

Creation of linking task for events

1․ The API service receives a request to create a task.

2․ The API service sends a request to the Tasks service.

3․ The Tasks service creates a new linker task.

4․ The Tasks service sends information to the Tasks database.

5․ The task ID is returned from the Tasks database.

6․ The task ID is sent to the API service.

7․ API service sends the task ID as a response.

8․ Optional. If you specified a list ID in the request, the Tasks service checks the existence of the list.

9․ Optional. The Faces service checks the list existence in the Faces database.

10․ Optional. The Faces database sends response.

11․ Optional. The Faces service sends the response to the Tasks service.

12․ Optional. If the list does not exist or you have specified to create a new list in the request, the Tasks service sends the request to create a new list.

13․ Optional. The Faces service creates the list in the Faces database.

14․ Optional. The Faces database sends response.

15․ Optional. The Faces service sends the response to the Tasks.

16․ The Tasks service receives face IDs and attribute IDs from the Events service.

17․ The Events service receives the data from the Events database.

18․ The Events database sends response.

19․ The Events service sends the IDs to the Tasks service.

20․ The Tasks service checks the existence of faces in Faces.

21․ The Faces service sends the request to the Faces database.

22․ The Faces database sends information about faces existence.

23․ The Faces service sends the data to Tasks service.

24․ Faces exist in the Faces database. If the face for an event exists, the Tasks service checks that the attribute ID of the face is equal to the attribute ID of the event.

25․ The Faces service sends a request to the Faces database.

26․ The Faces database sends attribute IDs.

27․ The Faces service sends attribute IDs to the Tasks service. If the attribute IDs for the existing face and event are not equal, the "28009: Attribute is not equal" error will be returned.

28․ Faces do not exist in the Faces database. If there are no faces with the specified IDs in the Faces database, the Tasks service sends the request to create new faces in the Faces database.

29․ The Faces service sends the request to the Faces database.

30․ The Faces database sends results of face creation.

31․ The Faces service sends the response to the Tasks service.

32․ The Tasks service sends data to Redis.

Splitting Linker task into subtasks#

There can be several workers used for task execution.

See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Linker subtasks processing#

The general workflow for processing each subtask is shown below.

Linker task processing

1․ The Tasks worker receives data from Redis.

2․ The Tasks worker requests faces in Faces service according to the received face IDs.

3․ The Faces service requests faces in the Faces database.

4․ The database sends faces to Faces.

5․ The Faces service sends faces to the Tasks worker.

6․ The Tasks worker sends requests for attaching faces to the specified list.

7․ The Faces service sends requests for attaching to the database.

8․ The database sends the result of attaching to Faces.

9․ The Faces service sends the result to the Tasks worker.

10․ The Tasks worker generates a report on the basis of the subtask results and stores it in the Image Store.

11․ Image Store stores the report in the storage.

12․ The storage sends the result of saving, which includes a result ID.

13․ The Image Store service returns the result to the Tasks worker.

14․ The Tasks worker sends the result to Redis.

Linker task processing completion#

Next, the results of the subtasks are merged and a general report is sent to the Image Store. See "Merging results and completing processing" in the section with the general diagram of the Tasks service.

Garbage collection task diagram#

| Request | Description | Method |
|---|---|---|
| garbage collection task | A task for deleting attributes according to the specified creation time filter. Only attributes that are not attached to any face are deleted. The task allows deleting unused data from the Faces database. | POST |

Garbage collection task creating#

Creating a task is performed in the standard way described in the "Starting task creation" subsection of the general Tasks service diagram. The only difference is that the Garbage collection task can be created not only through the API service, but also through the Admin or Tasks service.

Splitting Garbage collection task into subtasks#

Several subtasks are created for this task type. Each of the subtasks contains the range of attribute IDs. See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Garbage collection task processing#

The general workflow for processing each subtask is shown below.

Garbage collection task processing

1․ The Tasks worker receives data from Redis.

2․ Removing faces or face descriptors. The Tasks worker has a range of the attribute IDs that should be deleted. The Tasks worker requests deletion of temporary attributes and the corresponding descriptors in the Faces service.

3․ The Faces service sends the request to Faces database to delete attributes with the specified IDs.

4․ The database sends results of deletion.

5․ The Faces service sends the response to the Tasks worker.

6․ Removing events or event descriptors. The Tasks worker has a range of the attribute IDs that should be deleted. The Tasks worker requests deletion of temporary attributes and corresponding descriptors in Events service.

7․ The Events service sends the request to Events database to delete attributes with the specified IDs.

8․ The database sends results of deletion.

9․ The Events service sends the response to the Tasks worker.

10․ The Tasks worker stores the report in Image Store.

11․ Image Store sends the report to the storage.

12․ The storage sends information about the saving results to Image Store.

13․ Image Store sends the result to the Tasks worker.

14․ The Tasks worker sends the result to Redis.

Garbage collection task processing completion#

Next, the results of the subtasks are merged and a general report is sent to the Image Store. See "Merging results and completing processing" in the section with the general diagram of the Tasks service.

Reporter task diagram#

| Request | Description | Method |
|---|---|---|
| reporter task | Create a report on a completed clustering task. The report is in the CSV format. You can specify the columns that should be added to the report. The report is packed in a ZIP archive and includes avatar images for each of the objects in a cluster. | POST |

Reporter task creating#

Creating a task is performed in the standard way described in the "Starting task creation" subsection of the general Tasks service diagram.

Splitting Reporter task into subtasks#

A single subtask is created for this task type. The subtask includes filters for the data that should be obtained for inclusion into the report. See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Reporter subtask processing#

Request processing depends on the data, stored in the clustering report. If the report was created for faces, face data will be requested from the Faces service and added to the report. If the report was created for events, event data will be requested from the Events service and added to the report.

Reporter task processing

1․ The Tasks worker receives data from Redis.

2․ The Tasks worker requests a clustering report in Image Store.

3․ The Image Store service requests the report from the storage.

4․ The storage sends the report.

5․ The Image Store service sends the report.

6․ Report for faces. The Tasks worker receives all the face IDs from the cluster and requests additional face data from the Faces service. The requested data depends on the columns specified in the request.

7․ The Faces service requests data from the Faces database.

8․ The Faces database sends the required data.

9․ The Faces service sends the data to the Tasks worker.

10․ Report for events. The Tasks worker requests additional event data from the Events service. The requested data depends on the columns specified in the request.

11․ The Events service requests data from the Events database.

12․ The Events database sends the required data.

13․ The Events service sends the data to the Tasks worker.

14․ The Tasks worker requests the avatar image for each of the faces or events. The image for a face is specified in the "avatar" field of the Faces database. It can be stored in the Image Store storage or any other storage. The image for the event is the corresponding sample from the Image Store storage.

15․ The Image Store service requests the image from the storage.

16․ The storage sends the image.

17․ The Image Store service sends the image.

18․ The Tasks worker sends the report to Image Store.

19․ The Image Store service saves the report in the storage.

20․ The storage sends a response about saving.

21․ The Image Store service sends the response to the Tasks worker.

22․ The Tasks worker sends the result to Redis.

Reporter task processing completion#

Next, the Tasks service receives the result from Redis and updates the status of the task in the Tasks database.

Exporter task diagram#

| Request | Description | Method |
|---|---|---|
| exporter task | Collects event and/or face data and exports it to a CSV file. The exported CSV file is packed in a ZIP archive and includes avatar images for each of the objects. | POST |

Exporter task creating#

Creating a task is performed in the standard way described in the "Starting task creation" subsection of the general Tasks service diagram.

Splitting Exporter task into subtask#

A single subtask is created for this task type. See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Exporter subtask processing#

The general workflow for processing each subtask is shown below.

Exporter task processing

1․ The Tasks worker receives data from Redis.

2․ Export data for faces. The Tasks worker receives all the face IDs and requests additional face data from the Faces service. The requested data depends on the columns specified in the request.

3․ The Faces service requests data from the Faces database.

4․ The Faces database sends the required data.

5․ The Faces service sends the data to the Tasks worker.

6․ Export data for events. The Tasks worker requests additional event data from the Events service. The requested data depends on the columns specified in the request.

7․ The Events service requests data from the Events database.

8․ The Events database sends the required data.

9․ The Events service sends the data to the Tasks worker.

10․ The Tasks worker requests the avatar image for each of the faces or events. The image for a face is specified in the "avatar" field of the Faces database. It can be stored in the Image Store storage or any other storage. The image for the event is the corresponding sample from the Image Store storage.

11․ The Image Store service requests the image from the storage.

12․ The storage sends the image.

13․ The Image Store service sends the image.

14․ The Tasks worker sends the exported data to Image Store.

15․ The Image Store service saves the exported data in the storage.

16․ The storage sends a response about saving.

17․ The Image Store service sends the response to the Tasks worker.

18․ The Tasks worker sends the result to Redis.

Exporter task processing completion#

Next, the Tasks service receives the result from Redis and updates the status of the task in the Tasks database.

Cross-matching task diagrams#

| Request | Description | Method |
|---|---|---|
| cross-matching task | The task enables you to match events and faces according to the specified filters. Faces and events can be set both as references and as candidates. Additional filters can be specified for both references and candidates. | POST |
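A hedged request body for creating a cross-matching task might look like the sketch below; the field names and filter structure are illustrative assumptions, not the exact OpenAPI schema.

```python
import requests

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service

payload = {
    "filters": {
        "reference_filters": {"origin": "events", "handler_ids": ["<handler id>"]},
        "candidate_filters": {"origin": "faces", "lists": ["<list id>"]},
    },
    "limit": 5,         # keep only the top matches for each reference
    "threshold": 0.8,   # discard matches below this similarity
}

response = requests.post(f"{API_URL}/tasks/cross_match", json=payload)  # placeholder path
print("Created task:", response.json().get("task_id"))
```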

Cross-matching task creating#

Creating a task is performed in the standard way described in the "Starting task creation" subsection of the general Tasks service diagram.

Splitting Cross-matching task into subtasks#

A single subtask is created for this task type. The subtask includes all the required reference and candidate filters. See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Cross-matching subtasks processing#

Cross-matching subtask execution may vary depending on the specified references and candidates.

They can be specified using faces and/or events. In the diagrams below, requests to Faces and Events services are marked as alternative. The requests to Faces service are used when faces are set as references or candidates. The requests to Events service are used when events are set as references or candidates.

Cross-matching task processing depends on the objects selected (events or faces).

The general workflow for processing each subtask is shown below.

Cross-matching task processing

1․ The Tasks worker receives data from Redis.

2․ Faces are set as references or candidates. The Tasks worker sends the request to the Faces service to receive attribute IDs for faces. Faces are selected according to the specified filters.

3․ The Faces service requests IDs in the Faces database.

4․ The Faces database sends attribute IDs.

5․ The Faces service sends the attribute IDs to the Tasks worker.

6․ Events are set as references or candidates. The Tasks worker sends the request to the Events service to receive attribute IDs for events. Events are selected according to the specified filters.

7․ The Events service requests event IDs in the Events database.

8․ The Events database sends the IDs.

9․ The Events service sends the IDs to the Tasks worker.

10․ The Tasks worker sends a matching request. The request is processed according to one of the schemes described in the "Matching diagrams" section.

11․ The Python Matcher service sends the results.

12․ A report is sent from the Tasks worker to the Image Store.

13․ Using the Image Store, the report is saved to the storage.

14․ The storage sends a response about saving.

15․ The Image Store service sends a response to the Tasks worker.

16․ The Tasks worker sends the result to Redis.

Cross-matching task processing completion#

Next, the Tasks service receives the result from Redis and updates the status of the task in the Tasks database.

Estimator task diagram#

| Request | Description | Method |
|---|---|---|
| estimator task | Create an Estimator task. With this task, you can perform batch image processing with the specified policies. | POST |

Estimator task creating#

Creating a task is performed in the standard way described in the "Starting task creation" subsection of the general Tasks service diagram.

Splitting Estimator task into subtask#

A single subtask is created for this task type. See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Estimator task processing#

Estimator task diagram

1․ The Tasks worker receives data from Redis.

2․ The Tasks worker requests a batch of images from the following sources:

  - From the Image Store service in a ZIP archive if the "zip" type is selected as the image source.
  - From the S3 storage if the "s3" type is selected as the image source.
  - From the FTP server if the "ftp" type is selected as the image source.
  - From the Samba if the "samba" type is selected as the image source.
  - From a directory on a network disk that is mounted on a local drive and synchronized with the directory in the containers of the Tasks and Tasks Worker service (see the detailed description in the ["Estimator task"](services-description.md#estimator-task) section).

3․ The parameters specified in the request are processed: authorization, use of prefixes, etc.

4․ The source (Image Store/S3/network disk/FTP server/Samba) sends the images to the Tasks worker.

5․ handler_id exists: The Tasks worker sends a request to get the existing handler ID in the Handlers database.

6․ handler_id exists: The Handlers database returns a response with the handler_id to the Tasks worker.

7․ handler_id exists: The Tasks worker sends a request to the Handlers service to process the handler_id.

8․ handler_id exists: The Handlers service processes the batch of images by the received handler_id in accordance with the specified policies (see the event creation diagram).

9․ handler_id exists: The received response is returned to the Tasks worker.

10․ handler_id doesn't exist: The Tasks worker sends a request to the Handlers service to process the policies specified directly in the request.

11․ handler_id doesn't exist: The Handlers service processes the batch of images in accordance with the specified policies (see the event creation diagram).

12․ handler_id doesn't exist: The received response is returned to the Tasks worker.

13․ A report is sent from the Tasks worker to the Image Store.

14․ Using the Image Store, the report is saved to the storage.

15․ The storage sends a response about saving.

16․ The Image Store service sends a response to the Tasks worker.

17․ The Tasks worker sends the result to Redis.

Estimator task processing completion#

Next, the Tasks service receives the result from Redis and updates the status of the task in the Tasks database.

Additional extraction task diagram#

| Request | Description | Method |
|---|---|---|
| create additional extraction task | Extract all the existing descriptors using the new NN version. | POST |

Additional extraction task creation#

The request is sent using the Admin service. Task creation is shown in the diagram below.

Additional extraction task creation

1․ The administrator sends a request to the Admin service using the Admin UI or any other application.

2․ The Admin service sends a request to Tasks.

3․ The Tasks service sends a request to Faces to receive descriptor IDs that were not extracted using the new NN.

4․ The Faces service sends the request to the database.

5․ The Faces database returns IDs.

6․ The Faces service sends IDs to Tasks.

7․ The Tasks service sends data about the created task to the Tasks database.

8․ The Tasks service gets a "task_id".

9․ The Tasks service sends the task ID to the Admin service.

10․ The Admin service sends the created task ID to the user interface or application used by the administrator.

11․ The Tasks service sends data to Redis, from where the Tasks worker will take the task for further processing.

Splitting Additional extraction task into subtasks#

Several subtasks are created for this task type. Each of the subtasks contains the range of attribute IDs. See "Splitting tasks into subtasks" in the section with the general diagram of the Tasks service.

Additional extraction subtask processing#

The general workflow for each subtask processing is shown below.

Additional extraction task processing

1․ The Tasks worker receives data from Redis.

2․ The Tasks Worker sends the request to Faces to receive all the required descriptors according to their IDs.

3․ The Faces service sends the request to the database.

4․ The database sends back the required descriptors.

5․ The Faces service sends the descriptors to the Tasks worker.

6․ The Tasks Worker sends the extraction request to the Handlers service.

7․ The Handlers service requests samples for extraction from Image Store.

8․ The Image Store requests samples from the storage.

9․ The Image Store receives samples from the storage.

10․ The Image Store sends samples to the Handlers service.

11․ The Handlers service sends request for extracting descriptors to the Remote SDK service.

12․ The Remote SDK extracts descriptors.

13․ The Remote SDK service sends extracted descriptors to Handlers service.

14․ The Handlers service sends descriptors and basic attributes to Faces.

15․ Faces sends the request to store data in the database.

16․ The database returns the result of data storage.

17․ The Faces service returns the result of data storage.

18․ The Handlers service sends the result of the task execution to the Tasks service.

19․ A report is sent from the Tasks worker to the Image Store.

20․ Using the Image Store, the report is saved to the storage.

21․ The storage sends a response about saving.

22․ The Image Store service sends a response to the Tasks worker.

23․ The Tasks worker sends the result to Redis.

Additional extraction task processing completion#

Next, the results of the subtasks are merged and a general report is sent to the Image Store. See "Merging results and completing processing" in the section with the general diagram of the Tasks service.

Tasks information diagrams#

The following requests provide information on the status of existing tasks.

| Request | Description | Method |
|---|---|---|
| cancel task | Cancel the task execution by its ID. | PATCH |
| get tasks | Receive tasks information. You can set filters for the tasks. | GET |
| get tasks count | Receive the number of tasks according to the specified filters. | GET |
| get task | Receive information about a task by its ID. | GET |
| get subtasks | Receive information about all subtasks of the specified task. | GET |

The following requests provide information about errors that occurred while processing tasks.

| Request | Description | Method |
|---|---|---|
| get errors of task | Receive all the errors of the specified task by the task ID. | GET |
| get errors | Receive all the errors according to the specified filters. | GET |
| get errors count | Receive the number of errors according to the specified filter. | GET |
| get error | Receive information about an error by the error ID. | GET |

The corresponding diagram is shown below.

Receive task information or cancel task

1․ A request is sent to API.

2․ API service sends the request to Tasks to receive the required data or perform required actions.

3․ Tasks sends the request to the Tasks database to receive the required data or perform required actions.

4․ The database sends the response to Tasks.

5․ Tasks sends the response to API service.

6․ API service returns information.

| Request | Description | Method |
|---|---|---|
| delete task | Delete the specified task and its results. | DEL |
| get task result | Receive the results of the specified task. | GET |
Delete task/get task result

1․ The request is sent to API.

2․ API sends the request to Tasks to receive the required task data or delete a task.

3․ Tasks sends the request to the Tasks database to receive the required data or delete a task.

4․ The database sends the response to Tasks.

5․ Tasks sends the request to Image Store to receive or delete the results of the specified task.

6․ Image Store sends the request to the storage.

7․ The storage sends the response to Image Store.

8․ Image Store returns the result to Tasks.

9․ Tasks sends the response to API service.

10․ API service sends the response.

Lambda diagrams#

Lambda creation diagram#

This diagram describes the general process of creating a lambda.

| Request | Description | Method |
|---|---|---|
| create lambda | Create a lambda. | POST |
Lambda creation diagram

General scheme of request processing:

1․ The user determines which type of lambda is needed: Handlers-lambda, Standalone-lambda, or Tasks-lambda.

2․ The user writes Python code according to the type of future lambda and the code requirements.

3․ The user packs the module into an archive in accordance with the archive requirements.

4․ The user sends a "create lambda" request to the API service.

5․ The API service sends a request to the Lambda service.

6․ The Lambda service sends a request for license check to the Licenses service.

7․ The Lambda service receives a response.

8․ The Lambda service writes data to the Lambda database, updating the lambda creation status to "waiting".

9․ The Lambda service receives a response.

During lambda creation, it is possible to get the status of its creation using the request "get lambda status".

10․ The Lambda service returns the "lambda_id" to the API service.

11․ The API service returns the "lambda_id" to the user.

12․ The Lambda service adds additional files and creates a new archive.

13․ The Lambda service saves a new archive in S3 storage.

Image creation

14․ The Lambda service sends an image creation request to Kubernetes.

15․ The Lambda service updates the image creation status to "in_progress".

16․ The Lambda service starts monitoring the status of image creation in Kubernetes.

The user can get the status and logs of image creation using the requests "get lambda image creation status" and "get lambda image creation logs".

17․ Kubernetes gets the archive from S3 storage.

18․ Kubernetes creates an image.

19․ Kubernetes publishes the image in Docker registry.

S3, Docker registry and Kubernetes settings are set in the Lambda service settings.

20․ The Lambda service records the publication of the image.

21․ The Lambda service updates the image creation status to "completed".

Creation of service in Kubernetes cluster

22․ The Lambda service sends a request to create a lambda in the Kubernetes cluster.

23․ The Lambda service starts monitoring the status of lambda creation in Kubernetes.

During lambda creation, it is not possible to get the status and logs of image creation. The lambda creation status and logs can be obtained using the requests "get lambda status" and "get lambda logs".

24․ Kubernetes gets the image from Docker registry.

25․ Kubernetes deploys the resulting image.

26․ The Lambda service captures the state of the deployed image.

27․ The Lambda service updates the lambda status to "running" in the Lambda database.

28․ The Lambda service receives a response.

Then you can start sending lambda requests.
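As a sketch, waiting for the lambda to become available and then proxying a request to it could look like this; the endpoint paths and the status field name are assumptions, while the "running" status and the "proxy pass request to lambda" request are mentioned in this chapter.

```python
import requests
import time

API_URL = "http://127.0.0.1:5000"  # assumed address of the API service
lambda_id = "<lambda id>"          # returned by the "create lambda" request

# Poll "get lambda status" until the lambda is deployed and running.
while True:
    status = requests.get(f"{API_URL}/lambdas/{lambda_id}/status").json()  # placeholder path
    if status.get("status") == "running":
        break
    time.sleep(10)

# Proxy a request to the running lambda ("proxy pass request to lambda").
response = requests.post(
    f"{API_URL}/lambdas/{lambda_id}/proxy",   # placeholder path
    json={"payload": "custom lambda input"},  # body is defined by the user's lambda code
)
print(response.json())
```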

Lambda processing diagram#

This diagram describes the general lambda processing flow. The flow depends on the lambda type: Standalone-lambda, Handlers-lambda, or Tasks-lambda.

Processing of the Tasks-lambda type is not reflected in this diagram. It is performed according to the general task creation process, except that the Lambda service is used instead of the Tasks worker.

For Handlers-lambda and Standalone-lambda types, you can run "proxy post request to lambda" requests for direct interaction with Kubernetes. If Handlers-lambda can mimic the response of a classic handler, then you can perform the "generate events" request.

Lambda processing diagram

General request processing diagram:

Handlers-lambda

1․ The user performs a "create handler" request to the API service, specifying "handler_type" = "2" and the resulting "lambda_id" (see "Lambda creation diagram").

2․ The API service sends a request to the Handlers service.

3․ The Handlers service performs a healthcheck of the lambda deployed in Kubernetes and creates a handler.

4․ The Handlers service receives a response.

5․ The API service receives the "handler_id".

6․ The user receives the "handler_id".

Request to generate an event

7․ The user performs a "generate events" request to the API service, specifying the "handler_id" received in the previous step.

8․ The API service sends a request to the Handlers service.

9․ The Handlers service sends a request for processing to Kubernetes.

10․ The Kubernetes cluster processes user code.

During the operation of Handlers-lambda, interaction with some LUNA PLATFORM services will be performed. See the section "Handlers-lambda" for more information.

11․ The Handlers service receives a response.

12․ The API service receives a response.

13․ The user receives a response.

Proxy request

14․ The user performs a "proxy pass request to lambda" request to the API service.

15․ The API service sends a request to the Lambda service.

16․ The Lambda service sends a request for processing to Kubernetes.

17․ The Kubernetes cluster processes user code.

During the operation of Handlers-lambda, interaction with some LUNA PLATFORM services will be performed. See the section "Handlers-lambda" for more information.

18․ The Lambda service receives a response.

19․ The API service receives a response.

20․ The user receives a response.

Proxy request for Standalone-lambda

21․ The user sends a "proxy pass request to lambda" request to the API service.

22․ The API service sends a request to the Lambda service.

23․ The Lambda service sends a request for processing to Kubernetes.

24․ The Kubernetes cluster processes user code.

During the operation of Standalone-lambda, interaction with some LUNA PLATFORM services will be performed. See the section "Standalone-lambda" for more information.

25․ The Lambda service receives a response.

26․ The API service receives a response.

27․ The user receives a response.