Getting the best shot#
With LUNA ID, you can capture a video stream and get the best shot, that is, the frame in which the face is captured at the optimal angle for further processing.
Tip: In LUNA ID for Android, you can specify a face recognition area for best shot selection.
In LUNA ID for Android#
To get the best shot, call the LunaID.showCamera() method.
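The call below is a minimal sketch of launching the camera screen from an Activity. It is an illustration only: the exact parameters of showCamera() are an assumption here and depend on your SDK version, so check the SDK reference for the full signature.
// Minimal sketch: open the LUNA ID camera screen from an Activity to start
// the best shot search. The single-argument call is an assumption;
// showCamera() may require additional configuration objects depending on
// the SDK version.
LunaID.showCamera(this)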
To receive a result, subscribe to LunaID.finishStates() for the StateFinished(val result: FinishResult) events.
The value of the result field depends on the best shot search result. Possible values are:
class ResultSuccess(val data: FinishSuccessData) : FinishResult()
class ResultFailed(val data: FinishFailedData) : FinishResult()
// When the camera was closed before the best shot was found
class ResultCancelled(val data: FinishCancelledData) : FinishResult()
ResultSuccess
When the best shot is found, data: FinishSuccessData contains the found best shot and an optional path to the recorded video.
class FinishSuccessData(
    val bestShot: BestShot,
    val videoPath: String?,
)
ResultFailed
The search for the best shot can fail for various reasons. If the search fails, the data: FinishFailedData type defines the reason.
sealed class FinishFailedData {
    class InteractionFailed() : FinishFailedData()
    class LivenessCheckFailed() : FinishFailedData()
    class LivenessCheckError(val cause: Throwable?) : FinishFailedData()
    class UnknownError(val cause: Throwable?) : FinishFailedData()
}
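The sketch below shows one way to map each failure reason to a log or UI message. It assumes FinishFailedData is accessible as LunaID.FinishFailedData, mirroring the LunaID.FinishResult nesting used in the example later in this section; the message texts are illustrative only.
// Minimal sketch: turn a failure reason into a human-readable string.
// The LunaID.FinishFailedData nesting is an assumption based on the
// LunaID.FinishResult references used elsewhere in this section.
fun describeFailure(data: LunaID.FinishFailedData): String = when (data) {
    is LunaID.FinishFailedData.InteractionFailed -> "Interaction was not completed"
    is LunaID.FinishFailedData.LivenessCheckFailed -> "Liveness check failed"
    is LunaID.FinishFailedData.LivenessCheckError -> "Liveness check error: ${data.cause?.message}"
    is LunaID.FinishFailedData.UnknownError -> "Unknown error: ${data.cause?.message}"
}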
ResultCancelled
If the user closes the camera screen before the best shot is found, data: FinishCancelledData contains an optional path to the recorded video.
Since the camera opens in a new Activity class to get the best shot, pay special attention to the lifecycle of your code components. For example, the calling Activity class may be terminated, or a presenter or view model may be recreated, while the search for the best shot is in progress. In these cases, subscribe to any of the flows exposed via the LunaID class (allEvents(), interactions(), and so on) with respect to a component's lifecycle. To do this, consider using the flowWithLifecycle() and launchIn() extension functions available for the Flow type in Kotlin.
Example#
The example below shows how to subscribe to the StateFinished events with respect to components' lifecycles:
LunaID.finishStates()
    .flowOn(Dispatchers.IO)
    .flowWithLifecycle(lifecycleOwner.lifecycle, Lifecycle.State.STARTED)
    .onEach {
        when (it.result) {
            is LunaID.FinishResult.ResultSuccess -> {
                // The best shot was found.
                val image = (it.result as LunaID.FinishResult.ResultSuccess).data.bestShot
            }
            is LunaID.FinishResult.ResultCancelled -> {
                // The camera was closed before the best shot was found.
            }
            is LunaID.FinishResult.ResultFailed -> {
                // The search failed; inspect the reason.
                val failReason = (it.result as LunaID.FinishResult.ResultFailed).data
            }
        }
    }
    .launchIn(viewModelScope)
Face recognition area#
In some cases, you may need the best shot search to start only after the user places their face in a certain area on the screen. You can specify the face recognition area borders by implementing one of the following strategies:
- Border distances are not initialized
- Border distances are initialized with an Android custom view
- Border distances are initialized in dp
- Border distances are initialized automatically
Add a delay before starting face recognition#
You can optionally set a fixed delay or a specific moment in time that defines when face recognition starts after the camera is displayed on the screen. To do this, use the StartBestShotSearchCommand command.
Add a delay before getting the best shot#
You can optionally set a delay, in milliseconds, that defines how long the user's face must stay in the face detection bounding box before the best shot is taken. To do this, use the LunaID.foundFaceDelayMs parameter. The default value is 0.
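For instance, the one-line sketch below requires the face to stay in the bounding box for one second before the best shot is taken. It assumes foundFaceDelayMs is a settable property of the LunaID object; check the SDK reference for where this parameter is actually configured in your version.
// Minimal sketch, assuming foundFaceDelayMs can be set directly on LunaID.
LunaID.foundFaceDelayMs = 1000L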
In LUNA ID for iOS#
To get the best shot, pass an object that conforms to the LMCameraDelegate protocol to the delegate parameter of the LMCameraBuilder.viewController function, which creates a camera controller instance.
let controller = LMCameraBuilder.viewController(delegate: LMCameraDelegate,
configuration: LCLunaConfiguration,
livenessAPI: livenessAPI)
Through the implementation of the LMCameraDelegate protocol, the camera controller interacts with your application. In the implemented methods, you receive either the best shot or the corresponding error.
public protocol LMCameraDelegate: AnyObject {
    func bestShot(_ bestShot: LunaCore.LCBestShot, _ videoFile: String?)
    func error(_ error: LMCameraError, _ videoFile: String?)
}
Add a delay before starting face recognition#
You can optionally set a delay, in seconds, that defines when face recognition starts after the camera is displayed on the screen. To do this, use LCLunaConfiguration.startDelay.
Add a delay before getting the best shot#
You can optionally set a delay, in seconds, that defines how long the user's face must stay in the face detection bounding box before the best shot is taken. To do this, define the LCLunaConfiguration::faceTime property. The default value is 5. If the face disappears from the bounding box within the specified period, the BestShotError.FACE_LOST error is passed to the LCBestShotDelegate::bestShotError delegate method.