The Luna Human Tracking analytic tracks people's faces and/or bodies in a video and dispatches the tracking results to a handler, which generates events according to its policies.
Before events are formed, tracking results are filtered with the specified filters and sent to handlers
according to the event policy settings. A handler can use raw images (raw_images) with detections, or face/body
warps (samples), as sources for event generation. Note that the analytic targets and parameters must be
set so that the handler receives the selected source type; otherwise there will be no data to send
and no events will be generated. Successfully generated events are sent to callbacks.
In addition, tracking results can be received by subscribing to web sockets.
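Generated events are delivered by HTTP POST to the configured callback URL. The following is a minimal receiver sketch using only the Python standard library; the port and the printed fields are illustrative assumptions, with the payload fields taken from the callback request schema in this document:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def summarize_callback(payload: dict) -> str:
    """Build a short log line from a human_tracking callback payload."""
    return "{} event for account {} ({} - {})".format(
        payload.get("event_type"),
        payload.get("account_id"),
        payload.get("event_create_time"),
        payload.get("event_end_time"),
    )


class CallbackHandler(BaseHTTPRequestHandler):
    """Accept callback POSTs and log a one-line summary of each event."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        print(summarize_callback(payload))
        self.send_response(200)
        self.end_headers()


# To run the receiver (hypothetical local port; point the handler's
# callback URL at it):
# HTTPServer(("0.0.0.0", 8080), CallbackHandler).serve_forever()
```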
Analytic parameters schema and example. Suitable for use as analytics.parameters of a create stream
request in Luna-Video-Manager.
| Field | Type | Description |
| --- | --- | --- |
| targets | Array of strings. Default: ["face_detection"]. Items Enum: "face_detection", "body_detection", "head_pose", "ags", "aggregated_face_samples", "aggregated_body_samples" | Estimations to perform on the video. Note that targets must correspond with the selected detector type. |
|  | Array of any (callback_human_tracking). <= 10 items | Callback parameters. |
|  | object (human_tracking_analytic_parameters) |  |
```json
{
  "parameters": {
    "probe_count": 2
  },
  "targets": [
    "face_detection",
    "body_detection"
  ]
}
```

Callback request schema and example.
| Field | Type | Description |
| --- | --- | --- |
| account_id | string | Account ID of the event. |
| event_create_time | string | Event creation time. |
| event_end_time | string | Event end time. |
| event_type | string. Value: "human_tracking" | Event type. |
| event | object (handlers_reply_event) | Event received after the handler processes detections. Contains information about the face and/or body of a single person. |
```json
{
  "account_id": "string",
  "event_create_time": "string",
  "event_end_time": "string",
  "event_type": "human_tracking",
  "event": { }
}
```

Web socket subscription query parameters:

| Parameter | Type | Description |
| --- | --- | --- |
| stream_id (required) | string <uuid> (stream_id). Pattern: ^[a-f0-9]{8}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]... Example: stream_id=557d54ec-29ad-4f3c-93b4-c9092ef12515 | Stream ID for the events you need to subscribe to. |
| account_id | string <uuid> (account_id). Pattern: ^[a-f0-9]{8}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]... Example: account_id=557d54ec-29ad-4f3c-93b4-c9092ef12515 | Account ID of the stream owner. |
```python
import asyncio

import websockets


async def main():
    # Subscribe to events of a single stream via the stream_id query parameter
    uri = "ws://127.0.0.1:5240/1/ws?stream_id=70c75b04-1ec8-4aa1-930c-ddad9e47b54b"
    async with websockets.connect(uri) as websocket:
        while True:
            data = await websocket.recv()
            print(data)


asyncio.run(main())
```
```json
{
  "stream_status": "in_progress",
  "error": null,
  "analytics_results": {
    "human_tracking": {
      "track_id": "557d54ec-29ad-4f3c-93b4-c9092ef12515",
      "event_id": "557d54ec-29ad-4f3c-93b4-c9092ef12515",
      "event_status": "started",
      "track_status": "started",
      "estimated": {
        "face": {
          "detections": {
            "rect": {
              "x": 1,
              "y": 1,
              "width": 100,
              "height": 100
            },
            "score": 0.9
          },
          "head_pose": {
            "pitch": 0.1,
            "roll": 0.1,
            "yaw": 0.1
          },
          "ags": 0.5
        }
      }
    }
  }
}
```
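As a sketch of consuming these messages on the client side, the nested fields can be pulled out like this (field names are taken from the message example above; error handling is kept minimal, and missing optional estimations simply come back as None):

```python
import json


def parse_tracking_message(raw: str) -> dict:
    """Flatten the human_tracking part of a web socket message."""
    message = json.loads(raw)
    tracking = message["analytics_results"]["human_tracking"]
    face = tracking.get("estimated", {}).get("face", {})
    return {
        "stream_status": message["stream_status"],
        "track_id": tracking["track_id"],
        "event_status": tracking["event_status"],
        # Optional estimations depend on the configured targets
        "detection_score": face.get("detections", {}).get("score"),
        "ags": face.get("ags"),
    }
```

A function like this would be called on each string received in the websocket loop above, e.g. `parse_tracking_message(data)` instead of `print(data)`.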