How to Integrate On-Device Liveness into Your Mobile Application
This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and performing on-device liveness checks without sending any data to a server.
The SDK provides a ready-to-use face capture user interface, which is essential for a seamless customer experience and accurate liveness results.
The Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. You can issue a 1-month trial license on our website or email us for a long-term license.
In the build.gradle of your project, add:
allprojects {
repositories {
maven { url "https://ozforensics.jfrog.io/artifactory/main" }
}
}
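On newer Android projects that manage repositories centrally, the `allprojects` block above may be rejected by Gradle. In that case the repository can go into settings.gradle instead; this is a sketch assuming the default dependencyResolutionManagement block generated by Gradle 7+ templates:

```groovy
// settings.gradle — alternative to the allprojects block above, for projects
// that declare repositories centrally (the default in Gradle 7+ templates).
dependencyResolutionManagement {
    repositories {
        google()
        mavenCentral()
        maven { url "https://ozforensics.jfrog.io/artifactory/main" }
    }
}
```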
In the build.gradle of the module, add:
dependencies {
implementation 'com.ozforensics.liveness:full:<version>'
// You can find the version needed in the Android changelog
}
Rename the license file to forensics.license and place it in your project's res/raw folder, then initialize the SDK:
Kotlin
Java
OzLivenessSDK.init(
context,
listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
)
OzLivenessSDK.INSTANCE.init(
context,
Collections.singletonList(new LicenseSource.LicenseAssetId(R.raw.forensics)),
null
);
To start recording, use startActivityForResult:
Kotlin
Java
val OZ_LIVENESS_REQUEST_CODE = 1
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE)
int OZ_LIVENESS_REQUEST_CODE = 1;
Intent intent = OzLivenessSDK.INSTANCE.createStartIntent(Collections.singletonList(OzAction.Blank));
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE);
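Note that startActivityForResult is deprecated in recent AndroidX versions. If your app already uses the Activity Result API, the same flow can be sketched as below; the OzLivenessSDK calls are the ones shown in this guide, while the launcher wiring is standard AndroidX and should be adapted to your Activity:

```kotlin
// Sketch: launching the Oz Liveness screen via the AndroidX Activity Result API
// instead of the deprecated startActivityForResult. Register this as a property
// of your Activity or Fragment.
private val ozLivenessLauncher =
    registerForActivityResult(ActivityResultContracts.StartActivityForResult()) { result ->
        val sdkMediaResult = OzLivenessSDK.getResultFromIntent(result.data)
        val sdkErrorString = OzLivenessSDK.getErrorFromIntent(result.data)
        if (!sdkMediaResult.isNullOrEmpty()) {
            analyzeMedia(sdkMediaResult)
        } else {
            println(sdkErrorString)
        }
    }

// Then, to start recording:
fun startLiveness() {
    val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
    ozLivenessLauncher.launch(intent)
}
```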
To obtain the captured video, use onActivityResult:
Kotlin
Java
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
super.onActivityResult(requestCode, resultCode, data)
if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
val sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
if (!sdkMediaResult.isNullOrEmpty()) {
analyzeMedia(sdkMediaResult)
} else println(sdkErrorString)
}
}
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
if (sdkMediaResult != null && !sdkMediaResult.isEmpty()) {
analyzeMedia(sdkMediaResult);
} else System.out.println(sdkErrorString);
}
}
The sdkMediaResult object contains the captured videos. To run the analyses, execute the code below. Note that mediaList is an array of the media objects to check: either those just captured (sdkMediaResult) or media you created on your own.
Kotlin
Java
private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
AnalysisRequest.Builder()
.addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList))
.build()
.run(object : AnalysisRequest.AnalysisListener {
override fun onSuccess(result: List<OzAnalysisResult>) {
result.forEach {
println(it.resolution.name)
println(it.folderId)
}
}
override fun onError(error: OzException) {
error.printStackTrace()
}
})
}
private void analyzeMedia(List<OzAbstractMedia> mediaList) {
new AnalysisRequest.Builder()
.addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList, Collections.emptyMap()))
.build()
.run(new AnalysisRequest.AnalysisListener() {
@Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
@Override
public void onSuccess(@NonNull List<OzAnalysisResult> list) {
for (OzAnalysisResult result: list) {
System.out.println(result.getResolution().name());
System.out.println(result.getFolderId());
}
}
@Override
public void onError(@NonNull OzException e) { e.printStackTrace(); }
});
}
Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add to Podfile:
pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' # You can find the version needed in the iOS changelog
Rename the license file to forensics.license, add it to the project, then initialize the SDK:
OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
if let error = error {
print(error.errorDescription)
}
}
Create a controller that will capture videos as follows:
let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions)
self.present(ozLivenessVC, animated: true)
The delegate object passed to the controller must implement the OZLivenessDelegate protocol.
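A minimal delegate can look like the following sketch. The callback names and signatures here are assumptions based on the SDK's public protocol; verify them against the iOS reference for your SDK version before use:

```swift
// Sketch of an OZLivenessDelegate implementation. The method signatures are
// assumptions — check the iOS SDK reference for your version.
class LivenessDelegate: OZLivenessDelegate {
    func onOZLivenessResult(results: [OZVerificationResult]) {
        // Pass the captured media on to the analysis step below.
    }
    func onError(status: OZVerificationStatus?) {
        // Handle capture cancellation or failure here.
    }
}
```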
Use AnalysisRequestBuilder to initiate the Liveness analysis.
let analysisRequest = AnalysisRequestBuilder()
let analysis = Analysis.init(
media: mediaToAnalyze,
type: .quality,
mode: .onDevice)
analysisRequest.uploadMedia(mediaToAnalyze)
analysisRequest.addAnalysis(analysis)
analysisRequest.run(
scenarioStateHandler: { state in }, // scenario steps progress handler
uploadProgressHandler: { (progress) in } // file upload progress handler
) { (analysisResults : [OzAnalysisResult], error) in
// receive and handle analyses results here
for result in analysisResults {
print(result.resolution)
print(result.folderID)
}
}
With these steps, the basic integration of the Mobile SDKs is complete. The data from the on-device analysis is not transferred anywhere, so bear in mind that you cannot access it via the API or the Web Console.