# Sending multiple frames for estimation aggregation to the backend
In LUNA ID, you can send multiple frames to the backend for aggregation. This capability is essential for certain resource-intensive estimations performed in LUNA PLATFORM 5, such as DeepFake Detection and OneShotLiveness.
## In LUNA ID for Android
### Getting multiple frames
To enable the acquisition of multiple frames:
1. Set the `multipartBestShotsEnabled` parameter of `LunaConfig` to `true`.
2. Specify the number of best shots to be returned by setting the `LunaConfig.bestShotsCount` parameter. Valid values for `bestShotsCount` range from 1 to 10.
3. Get the list of best shots by subscribing to the `BestShotsFound` event. Use the `bestShots` Flow to collect this list.
Structure of `BestShotsFound`:
```kotlin
data class BestShotsFound(
    val bestShots: List<BestShot>?
) : Event()
```
Usage example:
```kotlin
LunaID.bestShots
    .filterNotNull()
    .onEach { bestShotsList ->
        Log.e(TAG, "bestShots: ${bestShotsList.bestShots}")
    }
    .launchIn(viewModelScope)
```
This Flow continuously emits lists of best shots as they are detected during the session.
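For reference, a minimal configuration sketch is shown below. The `LunaConfig.create(...)` call and its named arguments are assumptions for illustration; only the `multipartBestShotsEnabled` and `bestShotsCount` parameters themselves are documented above.

```kotlin
// Sketch: enabling multipart best shots. LunaConfig.create(...) is a
// hypothetical construction call; the two parameters are documented above.
val config = LunaConfig.create(
    multipartBestShotsEnabled = true, // return several best shots per session
    bestShotsCount = 5                // valid range: 1 to 10
)
```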
> **Important:** If `multipartBestShotsEnabled` is set to `false`, the `bestShots` field will be returned as `null`.
### Implementing online aggregation
To implement online aggregation for resource-intensive estimations:
1. Use the `apiEventsStaticHandler` method of the `ApiHuman` class:

```kotlin
fun apiEventsStaticHandler(
    query: StaticEventRequest,
    consumer: Consumer<Result<EventGenerateResponse>>,
)
```

The method generates and sends an HTTP request that returns an `EventGenerateResponse` object. This object contains information about the aggregated DeepFake and OneShotLiveness estimations.
2. Use the `StaticEventRequest` class, which represents the request model:

```kotlin
class StaticEventRequest(
    override val handlerId: String,
    override val extraHeaders: Map<String, String> = emptyMap(),
    override val externalId: String? = null,
    override val userData: String? = null,
    override val imageType: Int? = null,
    override val aggregateAttributes: Int? = null,
    override val source: String? = null,
    override val tags: List<String>? = null,
    override val trackId: String? = null,
    override val useExifInfo: Int? = null,
    val requestBody: RequestBody
) : AbsEventRequest(
    handlerId,
    extraHeaders,
    externalId,
    userData,
    imageType,
    aggregateAttributes,
    source,
    tags,
    trackId,
    useExifInfo,
)
```
3. Get the results of the aggregated estimations from the `EventGenerateResponse` object:

```kotlin
// Getting the aggregated OneShotLiveness estimation
eventGenerateResponse().aggregateEstimations?.face?.attributes?.liveness

// Getting the aggregated DeepFake estimation
eventGenerateResponse().aggregateEstimations?.face?.attributes?.deepfake
```
## In LUNA ID for iOS
### Getting multiple frames
To enable the acquisition of multiple frames:
1. Set the `multipartBestShotsEnabled` parameter to `true`. You will then receive several best shots instead of one through the following method:

```swift
func multipartBestShots(_ bestShots: [LCBestShot], _ videoFile: String?)
```

Note that the method previously used to get a single best shot will no longer be called:

```swift
func bestShot(_ bestShot: LunaCore.LCBestShot, _ videoFile: String?)
```

2. Specify the number of best shots to be returned by setting the `numberOfBestShots` parameter.
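A minimal sketch of handling the callback is shown below. How this handler is registered with the LUNA ID session is an assumption; only the `multipartBestShots(_:_:)` signature comes from the documentation above.

```swift
import LunaCore

/// Sketch: an object receiving multiple best shots. The class itself and
/// its registration with the session are assumptions; only the
/// multipartBestShots(_:_:) signature is documented above.
final class BestShotsHandler {
    func multipartBestShots(_ bestShots: [LCBestShot], _ videoFile: String?) {
        print("Received \(bestShots.count) best shots")
        // Send the collected frames for aggregated estimation (see below).
    }
}
```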
### Getting aggregated data
To obtain aggregated OneShotLiveness and DeepFake estimation data, execute the following query:
```swift
generateEvents(handlerID: String, query: EventQuery, handler: @escaping (Result<EventsResponse, Error>) -> Void)
```
Query parameters:
| Parameter | Description |
|---|---|
| `handlerID` | Your custom handler. |
| `query` | An array of received images. Set the following values: `imageType = .rawImage`, `aggregateAttributes = true`. |
The aggregated data will be available in the `aggregateEstimations` section of the query response.
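A hedged usage sketch follows. The `lunaAPI` instance, the `EventQuery` construction, and the response field access are assumptions; the `generateEvents` signature and the two query values come from the documentation above.

```swift
// Sketch: requesting aggregated estimations. The lunaAPI instance and the
// exact EventQuery construction are assumptions; only the generateEvents
// signature and the two query values are documented above.
var query = EventQuery(images: bestShots)  // hypothetical initializer
query.imageType = .rawImage
query.aggregateAttributes = true

lunaAPI.generateEvents(handlerID: "your-handler-id", query: query) { result in
    switch result {
    case .success(let response):
        // The aggregated data lives in the aggregateEstimations section.
        print(response.aggregateEstimations ?? "no aggregated estimations")
    case .failure(let error):
        print("Request failed: \(error)")
    }
}
```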