LUNA ID for Android API reference#
Applies to LUNA ID for Android only.
- LUNA ID initialization
- Main classes
- Configuration
- Working with the camera
- Working with Luna Web API v6
- Events and result processing
- Usage examples
- Additional methods
- Error processing
LUNA ID initialization#
Before using any LUNA ID features, you need to initialize the engine with a valid configuration.
Main initialization method#
Use the initEngine() method to initialize LUNA ID with your desired configuration.
fun initEngine(
context: Context,
lunaConfig: LunaConfig,
apiHumanConfig: ApiHumanConfig? = null,
license: File? = null,
timeoutMillis: Long = 30_000L
)
| Parameter | Description |
|---|---|
| context: Context | Application context. |
| lunaConfig: LunaConfig | LUNA ID configuration. |
| apiHumanConfig: ApiHumanConfig? | Optional configuration for Luna Web API. |
| license: File? | License file. |
| timeoutMillis: Long | Initialization timeout in milliseconds (by default, 30 seconds). |
Example:
val lunaConfig = LunaConfig.create(
livenessType = LivenessType.Offline,
bestShotsCount = 1
)
LunaID.initEngine(
context = applicationContext,
lunaConfig = lunaConfig
)
Monitoring initialization status#
val engineInitStatus: StateFlow<EngineInitStatus>
Initialization states:
| Status | Description |
|---|---|
| EngineInitStatus.NotInitialized | LUNA ID is not initialized yet. |
| EngineInitStatus.InProgress | Initialization is in progress. |
| EngineInitStatus.Success | Initialization completed successfully. |
| EngineInitStatus.Failure(cause: Throwable?) | Initialization failed. |
Example:
lifecycleScope.launch {
LunaID.engineInitStatus.collect { status ->
when (status) {
is LunaID.EngineInitStatus.Success -> {
// SDK is ready for use
}
is LunaID.EngineInitStatus.Failure -> {
// Error handling
}
else -> { /* processing other statuses */ }
}
}
}
Getting the device fingerprint#
The getFingerprint() method returns a unique device identifier that can be used for license activation.
fun getFingerprint(app: Application): String?
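Example (a minimal sketch; it assumes getFingerprint() is called on the LunaID object, like the other methods in this reference):
// Retrieve the device fingerprint, for example to log or display it
// during license activation. `application` is the host Application
// instance (e.g., available inside an Activity).
val fingerprint = LunaID.getFingerprint(application)
if (fingerprint != null) {
    Log.d("LunaID", "Device fingerprint: $fingerprint")
} else {
    Log.w("LunaID", "Device fingerprint is not available")
}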
Resetting the license cache#
The resetLicenseCache() method clears all locally stored license data from the device.
suspend fun resetLicenseCache(context: Context): Boolean
| Return value | Description |
|---|---|
| true | One or more license cache files were successfully deleted. |
| false | No cache files were found, or they could not be deleted. |
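Example (a sketch; resetLicenseCache() is a suspend function, so it must be called from a coroutine):
lifecycleScope.launch {
    // Clear locally cached license data; `true` means at least one file was removed
    val cleared = LunaID.resetLicenseCache(applicationContext)
    if (cleared) {
        // License cache files were deleted
    } else {
        // Nothing was found or deletion failed
    }
}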
Main classes#
LunaID#
LunaID is the main LUNA ID object, providing all the core methods and properties.
Main properties#
| Property | Description |
|---|---|
| engineInitStatus: StateFlow<EngineInitStatus> | Initialization status. |
| bestShot: MutableStateFlow<Event.BestShotFound?> | Best face shot. |
| bestShots: MutableStateFlow<Event.BestShotsFound?> | List of best shots. |
| videoRecordingResult: MutableStateFlow<Event.VideoRecordingResult?> | Video recording result. |
| eventChannel: Channel<Event> | Event channel. |
| faceDetectionChannel: Channel<Effect.FaceDetected> | Face detection channel. |
| errorFlow: Flow<Effect.Error?> | Error stream. |
| apiHuman: ApiHuman? | Entry point for working with Luna Web API. |
BestShot#
The BestShot class describes the best face shot.
data class BestShot(
    val ags: Float,      // Image quality rating [0.0..1.0]
    val detection: Rect, // Face region in the image
    val image: Bitmap,   // Source image
    val warp: Bitmap     // Normalized face image
)
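The examples in this reference pass the best shot to Luna Web API as bytes via bestShot.image.toByteArray(). Bitmap has no such method out of the box, so the examples assume a helper along these lines (a hypothetical sketch; requires java.io.ByteArrayOutputStream):
// Hypothetical helper assumed by the examples below: compress a Bitmap
// into a JPEG byte array before sending it to Luna Web API.
fun Bitmap.toByteArray(quality: Int = 90): ByteArray =
    ByteArrayOutputStream().use { stream ->
        compress(Bitmap.CompressFormat.JPEG, quality, stream)
        stream.toByteArray()
    }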
InteractionFrame#
The InteractionFrame class represents a user interaction frame, including an interaction type and the associated visual frame.
data class InteractionFrame(
val interactionType: Int, // Interaction type
val image: Bitmap // Frame image
)
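When savingInteractionFrames is enabled in LunaConfig, the collected frames arrive together with the best shot (see event.interactionFrames in the event handling example below). A sketch of iterating over them; it assumes frame.interactionType uses the same interaction type codes as the events, and reuses the Interaction.message() helper shown later in this reference:
// Log each saved interaction frame (frames come from
// LunaID.Event.BestShotFound.interactionFrames).
fun logInteractionFrames(context: Context, frames: List<InteractionFrame>) {
    frames.forEach { frame ->
        val hint = Interaction.message(context = context, type = frame.interactionType)
        Log.d("LunaID", "Frame for interaction '$hint': ${frame.image.width}x${frame.image.height}")
    }
}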
Configuration#
LunaConfig#
LUNA ID configuration class. Created using the LunaConfig.create() method.
Detection parameters#
| Parameter | Type | Default value | Description |
|---|---|---|---|
| detectorStep | Int | 7 | Number of frames between full detections [0..30]. |
| skipFrames | Int | 36 | Number of frames to wait when no detection is found [0..50]. |
| minimalTrackLength | Int | 1 | Minimum number of detections for a track [0..10]. |
| minFaceSize | Int | 50 | Minimum face size in pixels [20..350]. |
Best shot parameters#
| Parameter | Type | Default value | Description |
|---|---|---|---|
| bestShotsCount | Int | 1 | Number of best shots [1..10]. |
| bestShotInterval | Long | 500 | Interval between shots in milliseconds [0..10000]. |
| foundFaceDelayMs | Long | 0 | Delay after face detection before starting best shot search. |
Quality parameters#
| Parameter | Type | Default value | Description |
|---|---|---|---|
| headPitch | Float | 25.0 | Head tilt up/down (pitch) threshold, in degrees [0..45]. |
| headYaw | Float | 25.0 | Head turn left/right (yaw) threshold, in degrees [0..45]. |
| headRoll | Float | 25.0 | Head tilt left/right (roll) threshold, in degrees [0..45]. |
| ags | Float | 0.5 | Image quality assessment threshold [0.0..1.0]. |
| blurThreshold | Float | 0.61 | Blur threshold [0.01..1.0]. |
| lightThreshold | Float | 0.57 | Lightness threshold [0.01..1.0]. |
| darknessThreshold | Float | 0.5 | Darkness threshold [0.01..1.0]. |
| livenessQuality | Float | 0.5 | Liveness quality threshold [0.0..1.0]. |
Liveness parameters#
| Parameter | Default value | Description |
|---|---|---|
| livenessType: LivenessType | LivenessType.None | OneShotLiveness estimation type. |
| livenessFormat: CompressFormat | CompressFormat.JPEG | Image format for OneShotLiveness estimation. |
| livenessCompressionQuality: Int | 50 | Compression quality [0..100]. |
| livenessNetVersion: LivenessNetVersion | LivenessNetVersion.MOBILE | OneShotLiveness estimation network version. |
| faceSimilarityThreshold: Float | 0.5 | Face similarity threshold [0.01..1.0]. |
Available LivenessType values:
| Value | Description |
|---|---|
| LivenessType.None | No OneShotLiveness estimation. |
| LivenessType.Online | Online OneShotLiveness estimation on the server. |
| LivenessType.Offline | Offline OneShotLiveness estimation on the device. |
Face acceptance parameters#
| Parameter | Type | Default value | Description |
|---|---|---|---|
| acceptOneEyed | Boolean | false | Accept faces with one eye. |
| acceptEyesClosed | Boolean | false | Accept faces with closed eyes. |
| acceptOccludedFaces | Boolean | true | Accept partially occluded faces. |
| acceptMask | Boolean | true | Accept masked faces. |
| glassesChecks | Set<GlassesCheckType> | Empty set | Glasses estimation. |
Other parameters#
| Parameter | Type | Default value | Description |
|---|---|---|---|
| interactionDelayMs | Long | 0 | Delay between interactions, in milliseconds. |
| usePrimaryFaceTracking | Boolean | true | Track only the primary face. |
| multipartBestShotsEnabled | Boolean | false | Enable collection of multiple best shots. |
| savingInteractionFrames | Boolean | false | Save interaction frames. |
| strictlyMinSize | Boolean | false | Strict minimum size check. |
| descriptorVersion | Int | V60 | Descriptor version. |
| useDescriptors | Boolean | true | Use descriptors. |
Configuration creation example#
val config = LunaConfig.create(
livenessType = LivenessType.Offline,
bestShotsCount = 3,
headPitch = 20f,
headYaw = 20f,
acceptMask = false,
multipartBestShotsEnabled = true
)
Working with the camera#
Launching the camera#
@JvmOverloads
fun showCamera(
context: Context,
params: ShowCameraParams = ShowCameraParams(),
interactions: Interactions = Interactions(),
commands: Commands = Commands()
)
| Parameter | Description |
|---|---|
| context: Context | Context for launching the camera activity. |
| params: ShowCameraParams | Camera display parameters. |
| interactions: Interactions | List of user interactions. |
| commands: Commands | Camera control commands. |
Example#
val params = ShowCameraParams(
recordVideo = false,
checkSecurity = true,
minFaceSideToMinScreenSide = 0.3f
)
val interactions = Interactions.Builder()
.addInteraction(BlinkInteraction(timeoutMs = 5000))
.addInteraction(YawLeftInteraction(timeoutMs = 5000))
.build()
LunaID.showCamera(
context = this,
params = params,
interactions = interactions
)
ShowCameraParams#
Camera display options.
| Parameter | Default value | Description |
|---|---|---|
| disableErrors: Boolean | true | Disables error display. |
| logToFile: Boolean | false | Log to a file. |
| recordVideo: Boolean | false | Record a video. |
| recordingTimeMillis: Long | 0 | Video recording duration, in milliseconds. |
| ignoreVideoWithoutFace: Boolean | true | Ignore video without a face. |
| borderDistanceStrategy: BorderDistancesStrategy | N/A | Border distance strategy. |
| autoFocus: Boolean | true | Autofocus. |
| checkSecurity: Boolean | false | Device security check. |
| minFaceSideToMinScreenSide: Float | 0.3f | Minimum face-to-screen ratio [0.0..1.0]. |
| cameraSelector: CameraSelector | Front camera | Camera selector. |
| previewResolutionSelector: ResolutionSelector | N/A | Preview resolution selector. |
| analysisResolutionSelector: ResolutionSelector | N/A | Analysis resolution selector. |
| videoQualitySelector: QualitySelector | N/A | Video quality selector. |
Example#
val params = ShowCameraParams(
recordVideo = true,
recordingTimeMillis = 5000,
checkSecurity = false,
minFaceSideToMinScreenSide = 0.25f,
cameraSelector = CameraSelector.DEFAULT_FRONT_CAMERA
)
Interactions#
List of user interactions for Dynamic Liveness estimation.
Interaction types#
1. BlinkInteraction - blink
BlinkInteraction(
timeoutMs: Int = 5000,
acceptOneEyed: Boolean = false
)
2. YawLeftInteraction - turn head left
YawLeftInteraction(
timeoutMs: Int = 5000,
startAngleDeg: Int = 10,
endAngleDeg: Int = 30
)
3. YawRightInteraction - turn head right
YawRightInteraction(
timeoutMs: Int = 5000,
startAngleDeg: Int = 10,
endAngleDeg: Int = 30
)
4. PitchUpInteraction - tilt head up
PitchUpInteraction(
timeoutMs: Int = 5000,
startAngleDeg: Int = 5,
endAngleDeg: Int = 20
)
5. PitchDownInteraction - tilt head down
PitchDownInteraction(
timeoutMs: Int = 5000,
startAngleDeg: Int = 5,
endAngleDeg: Int = 20
)
Example#
val interactions = Interactions.Builder()
.addInteraction(BlinkInteraction(timeoutMs = 5000))
.addInteraction(YawLeftInteraction(timeoutMs = 5000))
.addInteraction(YawRightInteraction(timeoutMs = 5000))
.build()
Commands#
Camera control commands. The following commands are available:
| Command | Description |
|---|---|
| CloseCameraCommand | Closes the camera. |
| StartBestShotSearchCommand | Starts the best shot search. |
Example#
val commands = Commands.Builder()
.override(StartBestShotSearchCommand)
.build()
Sending a command#
LunaID.sendCommand(CloseCameraCommand)
Working with Luna Web API v6#
ApiV6#
Interface for working with Luna Web API v6. An instance can be obtained via LunaID.apiHuman?.apiV6.
Extract (Create a user)#
Extract a descriptor from an image#
fun extract(
query: ExtractQuery,
consumer: Consumer<Result<ExtractResponse>>
)
ExtractQuery:
data class ExtractQuery(
val userData: String, // User data
val externalId: String, // External user ID
val warp: Boolean, // Use normalized image
val photo: ByteArray // Image
)
ExtractResponse:
data class ExtractResponse(
val personId: String, // ID of the created user
val errorCode: Int = -1, // Error code
val detail: String = "" // Error description
)
Extract a descriptor from a descriptor#
fun extract(
query: ExtractByDescriptorQuery,
consumer: Consumer<Result<ExtractResponse>>
)
ExtractByDescriptorQuery:
data class ExtractByDescriptorQuery(
val userData: String, // User data
val externalId: String, // User external ID
val descriptor: ByteArray // Descriptor
)
Example:
val query = ExtractQuery(
userData = "John Doe",
externalId = "user123",
warp = true,
photo = bestShot.image.toByteArray()
)
apiV6.extract(query) { result ->
when (result) {
is Result.Success -> {
val personId = result.data.personId
// Handle a successful result
}
is Result.Error -> {
// Error handling
}
}
}
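The descriptor-based variant follows the same pattern. A sketch, assuming a raw descriptor ByteArray (descriptorBytes) is already available:
val descriptorQuery = ExtractByDescriptorQuery(
    userData = "John Doe",
    externalId = "user123",
    descriptor = descriptorBytes // assumed: a previously obtained descriptor
)
apiV6.extract(descriptorQuery) { result ->
    when (result) {
        is Result.Success -> {
            val personId = result.data.personId
            // Handle a successful result
        }
        is Result.Error -> {
            // Error handling
        }
    }
}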
Verify (Verification)#
Image verification#
fun verify(
query: VerifyQuery,
consumer: Consumer<Result<VerifyResponse>>
)
VerifyQuery:
data class VerifyQuery(
val userId: String, // User ID
val warp: Boolean, // Use normalized image
val photo: ByteArray // Image
)
VerifyResponse:
data class VerifyResponse(
val images: List<Image> // List of results
)
data class Image(
val error: LunaWebErrorDescription,
val detections: Detections
)
data class Detections(
val faceDetections: List<FaceDetection>
)
data class FaceDetection(
val verifications: List<Verification>
)
data class Verification(
val status: Boolean, // Verification result
val similarity: Double, // Similarity level
val faceVerification: FaceVerification
)
Verification by descriptor#
fun verify(
query: VerifyByDescriptorQuery,
consumer: Consumer<Result<VerifyResponse>>
)
VerifyByDescriptorQuery:
data class VerifyByDescriptorQuery(
val userId: String, // User ID
val descriptor: ByteArray // Descriptor
)
Example:
val query = VerifyQuery(
userId = "user123",
warp = true,
photo = bestShot.image.toByteArray()
)
apiV6.verify(query) { result ->
when (result) {
is Result.Success -> {
val verified = result.data.images
.flatMap { it.detections.faceDetections }
.flatMap { it.verifications }
.any { it.status }
// Process the result
}
is Result.Error -> {
// Handle the error
}
}
}
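Verification by descriptor works the same way. A sketch, assuming descriptorBytes holds a previously obtained descriptor:
val descriptorQuery = VerifyByDescriptorQuery(
    userId = "user123",
    descriptor = descriptorBytes // assumed: a previously obtained descriptor
)
apiV6.verify(descriptorQuery) { result ->
    when (result) {
        is Result.Success -> {
            val verified = result.data.images
                .flatMap { it.detections.faceDetections }
                .flatMap { it.verifications }
                .any { it.status }
            // Process the result
        }
        is Result.Error -> {
            // Handle the error
        }
    }
}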
Identify (Identification)#
Identification by image#
fun identify(
query: IdentifyQuery,
consumer: Consumer<Result<IdentifyResponse>>
)
IdentifyQuery:
data class IdentifyQuery(
val warp: Boolean, // Use normalized image
val photo: ByteArray // Image
)
IdentifyResponse:
data class IdentifyResponse(
val images: List<Image> // List of results
)
Identification by descriptor#
fun identify(
query: IdentifyByDescriptorQuery,
consumer: Consumer<Result<IdentifyResponse>>
)
IdentifyByDescriptorQuery:
data class IdentifyByDescriptorQuery(
val descriptor: ByteArray // Descriptor
)
Example:
val query = IdentifyQuery(
warp = true,
photo = bestShot.image.toByteArray()
)
apiV6.identify(query) { result ->
when (result) {
is Result.Success -> {
val candidates = result.data.images
.flatMap { it.detections.faceDetections }
.flatMap { it.identifications }
.flatMap { it.candidates }
// Processing candidates
}
is Result.Error -> {
// Error handling
}
}
}
Liveness#
fun liveness(
query: PredictLivenessQuery,
consumer: Consumer<Result<LivenessResponseLegacy>>
)
PredictLivenessQuery:
data class PredictLivenessQuery(
val photos: List<ByteArray>, // List of images
val meta: Meta = Meta(), // Device metadata
val aggregate: Boolean = true // Aggregate results
)
LivenessResponseLegacy:
data class LivenessResponseLegacy(
val images: List<ImagesResult>, // Results for each image
val aggregateEstimation: AggregateEstimation? // Aggregated result
)
data class ImagesResult(
val filename: String,
val status: Int, // 0 - error, 1 - success
val predictionResult: PredictionResult?, // Validation result
val description: LivenessError // Error description
)
data class PredictionResult(
val prediction: Int, // 1 - alive, 0 - not alive
val estimations: Estimation
)
data class Estimation(
val probability: Float, // Probability of a living person [0.0..1.0]
val quality: Float // Estimate quality [0.0..1.0]
)
Example:
val photos = listOf(
bestShot1.image.toByteArray(),
bestShot2.image.toByteArray(),
bestShot3.image.toByteArray()
)
val query = PredictLivenessQuery(
photos = photos,
aggregate = true
)
apiV6.liveness(query) { result ->
when (result) {
is Result.Success -> {
val isAlive = result.data.aggregateEstimation
?.predictionResult
?.prediction == 1
// Process the result
}
is Result.Error -> {
// Error handling
}
}
}
Removing callbacks#
fun removeCallbacks()
Removes all registered callbacks.
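A typical place to call it is when the screen that issued the requests goes away, so pending consumers are not invoked afterwards (a sketch, assuming the method is called on the ApiV6 instance):
override fun onDestroy() {
    super.onDestroy()
    // Drop pending Luna Web API callbacks registered by this screen
    LunaID.apiHuman?.apiV6?.removeCallbacks()
}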
Events and result processing#
Events#
Events are available via LunaID.eventChannel and various StateFlows.
Event types#
| Event | Description |
|---|---|
| Started | Camera started. |
| LivenessCheckStarted | OneShotLiveness estimation started. |
| InteractionStarted(type: Int) | Dynamic Liveness interaction started. |
| InteractionEnded(type: Int) | Dynamic Liveness interaction ended. |
| InteractionTimeout(type: Int) | Dynamic Liveness interaction timed out. |
| InteractionFailed | Dynamic Liveness interaction failed. |
| LivenessCheckFailed | OneShotLiveness estimation failed. |
| LivenessCheckError(cause: Throwable?) | OneShotLiveness estimation error. |
| UnknownError(cause: Throwable?) | Unknown error. |
| BestShotFound | Best shot found. Carries the best shot, the recorded video path, and interaction frames (see the event handling example below). |
| BestShotsFound | Best shots found. Carries the list of best shots. |
| VideoRecordingResult | Video recording result. Carries the URI of the recorded video, if any. |
| FaceFound | Face detected. |
| SecurityCheck.Success | Security check passed. |
| SecurityCheck.Failure | Security check failed. |
| CameraPermissionDenied | Camera permission denied. |
| FatalError(e: Exception) | Critical error. |
Event handling example#
lifecycleScope.launch {
LunaID.eventChannel.consumeAsFlow().collect { event ->
when (event) {
is LunaID.Event.Started -> {
// Camera launched
}
is LunaID.Event.BestShotFound -> {
val bestShot = event.bestShot
val videoPath = event.videoPath
val frames = event.interactionFrames
// Processing the result
}
is LunaID.Event.LivenessCheckFailed -> {
// Liveness check failed
}
else -> { /* handle other events */ }
}
}
}
Effects#
Effects are available via LunaID.faceDetectionChannel and LunaID.errorFlow.
Effect types#
| Effect type | Declaration | Description |
|---|---|---|
| FaceDetected | data class FaceDetected(val data: RectF) | Face detected. |
| Error | data class Error(val error: DetectionError) | Detection error. |
Processing example#
lifecycleScope.launch {
LunaID.faceDetectionChannel.consumeAsFlow().collect { effect ->
when (effect) {
is LunaID.Effect.FaceDetected -> {
val faceRect = effect.data
// Update UI with face region
}
}
}
}
lifecycleScope.launch {
LunaID.errorFlow.collect { error ->
error?.let {
when (it.error) {
DetectionError.FaceLost -> {
// Face lost
}
DetectionError.FaceDetectSmall -> {
// Face too small
}
DetectionError.PrimaryFaceLost -> {
// Primary face lost
}
else -> { /* Handle other errors */ }
}
}
}
}
StateFlow for results#
bestShot#
lifecycleScope.launch {
LunaID.bestShot.collect { bestShotFound ->
bestShotFound?.let {
val bestShot = it.bestShot
// Process the best shot
}
}
}
bestShots#
lifecycleScope.launch {
LunaID.bestShots.collect { bestShotsFound ->
bestShotsFound?.let {
val bestShots = it.bestShots
// Process the list of best shots
}
}
}
videoRecordingResult#
lifecycleScope.launch {
LunaID.videoRecordingResult.collect { result ->
result?.let {
if (it.uri != null) {
// Video recorded successfully
} else {
// Video recording error
}
}
}
}
Usage examples#
Basic example: Face capture and verification#
class MainActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
// SDK initialization
val config = LunaConfig.create(
livenessType = LivenessType.Offline,
bestShotsCount = 1
)
LunaID.initEngine(
context = applicationContext,
lunaConfig = config
)
// Waiting for the initialization
lifecycleScope.launch {
LunaID.engineInitStatus
.filter { it is LunaID.EngineInitStatus.Success }
.first()
// Camera launch
val params = ShowCameraParams(
checkSecurity = true,
minFaceSideToMinScreenSide = 0.3f
)
val interactions = Interactions.Builder()
.addInteraction(BlinkInteraction(timeoutMs = 5000))
.build()
LunaID.showCamera(
context = this@MainActivity,
params = params,
interactions = interactions
)
}
// Result handling
lifecycleScope.launch {
LunaID.bestShot.collect { bestShotFound ->
bestShotFound?.let {
val bestShot = it.bestShot
// Performing verification
verifyUser(bestShot)
}
}
}
}
private fun verifyUser(bestShot: BestShot) {
val apiV6 = LunaID.apiHuman?.apiV6 ?: return
val query = VerifyQuery(
userId = "user123",
warp = true,
photo = bestShot.image.toByteArray()
)
apiV6.verify(query) { result ->
when (result) {
is Result.Success -> {
val verified = result.data.images
.flatMap { it.detections.faceDetections }
.flatMap { it.verifications }
.any { it.status }
if (verified) {
// The user has been verified
} else {
// Verification failed
}
}
is Result.Error -> {
// Error handling
}
}
}
}
}
Example: Registering a new user#
private fun registerUser(bestShot: BestShot) {
val apiV6 = LunaID.apiHuman?.apiV6 ?: return
val query = ExtractQuery(
userData = "John Doe",
externalId = "user123",
warp = true,
photo = bestShot.image.toByteArray()
)
apiV6.extract(query) { result ->
when (result) {
is Result.Success -> {
val personId = result.data.personId
// User registered with ID = personId
}
is Result.Error -> {
// Handling registration error
}
}
}
}
Example: User identification#
private fun identifyUser(bestShot: BestShot) {
val apiV6 = LunaID.apiHuman?.apiV6 ?: return
val query = IdentifyQuery(
warp = true,
photo = bestShot.image.toByteArray()
)
apiV6.identify(query) { result ->
when (result) {
is Result.Success -> {
val candidates = result.data.images
.flatMap { it.detections.faceDetections }
.flatMap { it.identifications }
.flatMap { it.candidates }
.sortedByDescending { it.similarity }
if (candidates.isNotEmpty()) {
val bestMatch = candidates.first()
// Found user with ID = bestMatch.personId
// Similarity = bestMatch.similarity
} else {
// User not found
}
}
is Result.Error -> {
// Error handling
}
}
}
}
Example: OneShotLiveness estimation with multiple frames#
private fun checkLiveness(bestShots: List<BestShot>) {
val apiV6 = LunaID.apiHuman?.apiV6 ?: return
val photos = bestShots.map { it.image.toByteArray() }
val query = PredictLivenessQuery(
photos = photos,
aggregate = true
)
apiV6.liveness(query) { result ->
when (result) {
is Result.Success -> {
val aggregate = result.data.aggregateEstimation
val isAlive = aggregate?.predictionResult?.prediction == 1
val probability = aggregate?.predictionResult?.estimations?.probability ?: 0f
if (isAlive && probability > 0.8f) {
// The user is alive, high probability
} else {
// Liveness estimation failed
}
}
is Result.Error -> {
// Error handling
}
}
}
}
Example: Process all events#
private fun setupEventHandlers() {
lifecycleScope.launch {
LunaID.eventChannel.consumeAsFlow().collect { event ->
when (event) {
is LunaID.Event.Started -> {
Log.d("LunaID", "Camera launched")
}
is LunaID.Event.LivenessCheckStarted -> {
Log.d("LunaID", "Liveness estimation started")
}
is LunaID.Event.InteractionStarted -> {
val message = Interaction.message(
context = this@MainActivity,
type = event.type
)
Log.d("LunaID", "Interaction started: $message")
// Show hint to a user
}
is LunaID.Event.InteractionEnded -> {
Log.d("LunaID", "Interaction completed")
}
is LunaID.Event.InteractionTimeout -> {
Log.w("LunaID", "Interaction timeout expired")
// Show an error message
}
is LunaID.Event.BestShotFound -> {
Log.d("LunaID", "Best shot found")
handleBestShot(event.bestShot)
}
is LunaID.Event.LivenessCheckFailed -> {
Log.w("LunaID", "Liveness estimation failed")
}
is LunaID.Event.SecurityCheck.Success -> {
Log.d("LunaID", "Security check passed")
}
is LunaID.Event.SecurityCheck.Failure -> {
Log.e("LunaID", "Security check failed")
}
is LunaID.Event.FatalError -> {
Log.e("LunaID", "Critical error", event.e)
}
else -> { /* other events */ }
}
}
}
}
Additional methods#
Log dump#
fun dumpLogs(
context: Context,
outputFile: File = File(context.filesDir, "logs.logcat")
)
Saves application logs to a file.
Saving a performance report#
fun saveToInternalStorage(context: Context, filename: String, content: String)
Saves the performance report to internal storage.
Getting the SDK version#
fun getVersion(): String
Returns the SHA-256 hash of the SDK version.
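A combined usage sketch for these utilities (it assumes, as with the rest of this reference, that they are exposed on the LunaID object):
// Dump SDK logs to the default location (<filesDir>/logs.logcat).
LunaID.dumpLogs(applicationContext)
// Store a performance report in internal storage; the report content
// here is illustrative.
val report = "sdkVersion=${LunaID.getVersion()}"
LunaID.saveToInternalStorage(
    context = applicationContext,
    filename = "performance_report.txt",
    content = report
)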
Error processing#
DetectionError#
Types of detection errors:
| Error | Description |
|---|---|
| PrimaryFaceLostCritical | Critical loss of the primary face. |
| PrimaryFaceLost | Loss of the primary face. |
| FaceLost | Face lost. |
| TooManyFaces | Too many faces. |
| FaceOutOfFrame | The face went outside the frame. |
| FaceDetectSmall | Face is too small. |
| BadHeadPose | Bad head pose. |
| BadQuality | Poor image quality. |
| BlurredFace | Blurred face. |
| TooDark | Too dark. |
| TooMuchLight | Too much light. |
| GlassesOn | Glasses detected. |
| OccludedFace | The face is partially occluded. |
| BadEyesStatus | Bad eye status. |
| FaceWithMask | Mask detected. |
Processing example#
lifecycleScope.launch {
LunaID.errorFlow.collect { error ->
error?.let {
val message = when (it.error) {
DetectionError.FaceLost -> "Face not found"
DetectionError.PrimaryFaceLost -> "The primary face is lost"
DetectionError.PrimaryFaceLostCritical -> "Critical face loss"
DetectionError.FaceDetectSmall -> "Move closer to the camera"
DetectionError.FaceOutOfFrame -> "Keep your face in the center of the frame"
DetectionError.BadHeadPose -> "Keep your head straight"
DetectionError.BadEyesStatus -> "Open your eyes"
DetectionError.TooDark -> "Too dark"
DetectionError.TooMuchLight -> "Too much light"
DetectionError.BlurredFace -> "Keep your device still"
DetectionError.OccludedFace -> "Take your hand away from your face"
DetectionError.GlassesOn -> "Take off your glasses"
DetectionError.FaceWithMask -> "Take off your mask"
DetectionError.TooManyFaces -> "There must be one person in the frame"
DetectionError.BadQuality -> "Poor image quality"
else -> "Detection error"
}
// Show message to a user
}
}
}
Important notes#
- Always check the initialization status before using the SDK.
- Handle all possible errors.
- Free up resources when shutting down the SDK.
- Use the appropriate permissions in AndroidManifest.xml: android.permission.CAMERA and android.permission.INTERNET (for working with the API).