Performing Offline OneShotLiveness estimation
With LUNA ID, you can perform liveness estimation directly on your device. Unlike Online OneShotLiveness estimation, which sends requests to the LUNA PLATFORM 5 /liveness endpoint, Offline OneShotLiveness estimation runs locally, providing faster processing and reducing dependency on backend services.
This feature allows you to determine whether the person in the image is a living individual or a spoof (for example, a photograph or mask).
In LUNA ID for Android
To perform Offline OneShotLiveness estimation:
1. Add the required dependency.
Add the dependency that matches your device's architecture to your build.gradle file. This dependency includes the neural networks required for Offline OneShotLiveness estimation.
- For ARM architecture:
implementation("ai.visionlabs.lunaid:oslm-arm:X.X.X@aar")
- For x86 architecture:
implementation("ai.visionlabs.lunaid:oslm-x86:X.X.X@aar")
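The dependency line above belongs in the `dependencies` block of your app module's build script. A minimal sketch in Gradle Kotlin DSL, assuming the ARM variant (keep the X.X.X placeholder until you pin the LUNA ID version you actually use):

```kotlin
// app/build.gradle.kts — minimal sketch
dependencies {
    // Neural networks for Offline OneShotLiveness on ARM devices;
    // use ai.visionlabs.lunaid:oslm-x86 instead for x86 devices.
    implementation("ai.visionlabs.lunaid:oslm-arm:X.X.X@aar")
}
```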
2. Specify the estimation type in LunaConfig:
LunaConfig.create(
livenessType = LivenessType.Offline
)
3. Specify the neural networks to be used for the estimation with the LunaConfig.livenessNetVersion parameter. This parameter is of type LivenessNetVersion and supports two values:
| Value | Description |
|---|---|
| V3_AND_V4 | Default. Loads both neural network models. |
| V4 | Loads only the oneshot_rgb_liveness_v8_model_4_device.plan model. Recommended for devices with lower performance. |
Important: After changing the livenessNetVersion parameter, restart the application for the changes to take effect.
LunaConfig.create(
livenessType = LivenessType.Offline,
livenessNetVersion = LivenessNetVersion.V3_AND_V4
)
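Because V4 is recommended for lower-performance devices, you may want to select the network set at startup. A minimal sketch, assuming an Android Context is available and using isLowRamDevice() as a rough proxy for a low-performance device (this heuristic is an assumption for illustration, not part of LUNA ID):

```kotlin
import android.app.ActivityManager
import android.content.Context

// Sketch: choose the lighter V4 network set on constrained devices.
// isLowRamDevice() is only an illustrative heuristic; adjust it for
// the device fleet you target.
fun buildLivenessConfig(context: Context): LunaConfig {
    val am = context.getSystemService(Context.ACTIVITY_SERVICE) as ActivityManager
    val netVersion =
        if (am.isLowRamDevice) LivenessNetVersion.V4
        else LivenessNetVersion.V3_AND_V4
    return LunaConfig.create(
        livenessType = LivenessType.Offline,
        livenessNetVersion = netVersion,
    )
}
```

Since changing livenessNetVersion requires an application restart to take effect, compute this value once at startup rather than switching it while the app is running.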
In LUNA ID for iOS
To perform Offline OneShotLiveness estimation:
1. Make sure that you have the following .plan files in your deployment directory:
- fsdk.framework/data/oneshot_rgb_liveness_v8_model_3_arm.plan
- fsdk.framework/data/oneshot_rgb_liveness_v8_model_4_arm.plan
2. Enable the estimation:
configuration.bestShotConfiguration.livenessType = LivenessType.Offline