LivenessOneShotRGB example#
Note. This example shows how to use LivenessOneShotRGBEstimator on the backend.
What it does#
This example demonstrates how to use LivenessOneShotRGBEstimator to estimate the liveness of input face images.
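For orientation, below is a minimal sketch of the single-image flow: detect a face, then pass the detection and its five landmarks to the estimator. It follows the typical FSDK pattern, but the exact method signatures, return types, and error handling may differ between SDK versions, so treat it as an illustration under those assumptions rather than the example's actual source.

```cpp
// Hedged sketch: single-image LivenessOneShotRGB estimation.
// Error handling is omitted; exact signatures may differ between SDK versions.
#include <fsdk/FaceEngine.h>
#include <iostream>

int main(int argc, char** argv) {
    // Create the engine (the data directory path is an assumption).
    auto faceEngine = fsdk::createFaceEngine("./data");

    // Face detector and the OneShotRGB liveness estimator.
    auto detector  = faceEngine->createDetector().getValue();
    auto estimator = faceEngine->createLivenessOneShotRGBEstimator().getValue();

    // The estimator expects an RGB image.
    fsdk::Image image;
    image.load(argv[1], fsdk::Format::R8G8B8);

    // Detect one face with its bounding box and five landmarks.
    auto face = detector->detectOne(image, image.getRect(),
                                    fsdk::DetectionType::DT_ALL).getValue();

    // Estimate liveness for the detected face.
    fsdk::LivenessOneShotRGBEstimation estimation;
    estimator->estimate(image, face.detection, face.landmarks5.value(), estimation);

    std::cout << "score: " << estimation.score << std::endl;
    return 0;
}
```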
Prerequisites#
This example assumes that you have already read the FaceEngine Handbook (or at least have it nearby for reference) and know some core concepts, such as memory management, object ownership, and lifetime control. This sample will not explain these aspects in detail.
Example walkthrough#
To get familiar with FSDK usage and common practices, please go through example_extraction first.
How to run#
Use the following command to run the example.
Run the command from "FSDK_ROOT".
<install_prefix>/example_oneshot_liveness <image_0.ppm> <image_1.ppm>
Image paths must be absolute or relative to the working directory.
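For example, a hypothetical run on two images could look like this (the install prefix and the image paths below are placeholders, not part of the SDK layout):

```sh
# Run from FSDK_ROOT; <install_prefix> and the image paths are placeholders.
cd "$FSDK_ROOT"
<install_prefix>/example_oneshot_liveness ./images/image_0.ppm ./images/image_1.ppm
```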
Example output#
Info Image 0 Found face with coordinates x = 229 y = 282 width = 545 height = 721
Warn Image 1 Result may be incorrect. Reason: Found 2 faces. Only first detection will be handled.
Info Image 1 Found face with coordinates x = 277 y = 426 width = 77 height = 98
Warn Image 1 Result may be incorrect. Reason: Bounding Box width and/or height is less than `minDetSize` - 200
LivenessOneShotRGBEstimator - single image estimation:
Image 0: state: Alive, score: 0.997796
LivenessOneShotRGBEstimator - batched estimation:
Image 0: state: Alive, score: 0.997796
Image 1: state: Alive, score: 0.994598
LivenessOneShotRGBEstimator - aggregation:
aggregation: state: Alive, score: 0.996197
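The batched call produces a per-image result for each input and, optionally, a single aggregated verdict over all of them, as the output above shows. The sketch below continues the single-image snippet from earlier (it assumes image0/image1, face0/face1, and estimator were obtained by repeating those steps per image); the Span-based overload and the aggregated-output parameter are assumptions about the batched API, so verify them against the ILivenessOneShotRGBEstimator header before relying on this.

```cpp
// Hedged sketch: batched estimation with an aggregated result.
// The Span-based overload and the aggregation output parameter are assumptions.
std::vector<fsdk::Image> images = {image0, image1};
std::vector<fsdk::Detection> detections = {face0.detection, face1.detection};
std::vector<fsdk::Landmarks5> landmarks = {face0.landmarks5.value(),
                                           face1.landmarks5.value()};

std::vector<fsdk::LivenessOneShotRGBEstimation> estimations(images.size());
fsdk::LivenessOneShotRGBEstimation aggregated;

estimator->estimate(
    fsdk::Span<const fsdk::Image>(images.data(), images.size()),
    fsdk::Span<const fsdk::Detection>(detections.data(), detections.size()),
    fsdk::Span<const fsdk::Landmarks5>(landmarks.data(), landmarks.size()),
    fsdk::Span<fsdk::LivenessOneShotRGBEstimation>(estimations.data(), estimations.size()),
    &aggregated);

// estimations[i] holds each per-image state/score; `aggregated` holds the combined verdict.
```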