How to Integrate On-Device Liveness into Your Mobile Application

This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and performing on-device liveness checks without sending any data to a server.

The SDK provides a ready-to-use face capture user interface, which is essential for a seamless customer experience and accurate liveness results.

Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. Issue a 1-month trial license on our website or email us for a long-term license.

Android

1. Add SDK to your project

In the build.gradle of your project, add:

allprojects {
    repositories {
        maven { url "https://ozforensics.jfrog.io/artifactory/main" }
    }
}

In the build.gradle of the module, add:

dependencies {
    implementation 'com.ozforensics.liveness:full:<version>'
    // You can find the version needed in the Android changelog
}

2. Initialize SDK

Rename the license file to forensics.license and place it into the project's res/raw folder.

OzLivenessSDK.init(
    context,
    listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
)
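The SDK should be initialized once per process, before the first capture is started. A minimal sketch that does this from a custom Application class (the App class name is hypothetical; registering it in the manifest via android:name is assumed):

```kotlin
class App : Application() {
    override fun onCreate() {
        super.onCreate()
        // Initialize once, before any capture screen is launched
        OzLivenessSDK.init(
            this,
            listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
        )
    }
}
```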

3. Add face recording

To start recording, use startActivityForResult:

val OZ_LIVENESS_REQUEST_CODE = 1
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE)

To obtain the captured video, use onActivityResult:

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
        val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
        val sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
        if (!sdkMediaResult.isNullOrEmpty()) {
            analyzeMedia(sdkMediaResult)
        } else {
            println(sdkErrorString)
        }
    }
}
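On recent Android versions, startActivityForResult is deprecated in favor of the Activity Result API. The same flow can be sketched with registerForActivityResult, under the assumption that the SDK calls shown above behave identically:

```kotlin
// Register in your Activity or Fragment; replaces onActivityResult
private val ozLivenessLauncher = registerForActivityResult(
    ActivityResultContracts.StartActivityForResult()
) { result ->
    val sdkMediaResult = OzLivenessSDK.getResultFromIntent(result.data)
    val sdkErrorString = OzLivenessSDK.getErrorFromIntent(result.data)
    if (!sdkMediaResult.isNullOrEmpty()) {
        analyzeMedia(sdkMediaResult)
    } else {
        println(sdkErrorString)
    }
}

// Launch the capture screen
ozLivenessLauncher.launch(OzLivenessSDK.createStartIntent(listOf(OzAction.Blank)))
```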

The sdkMediaResult object contains the captured videos.

4. Run analyses

To run the analyses, execute the code below. Note that mediaList is a list of media objects that were either captured by the SDK (sdkMediaResult) or created on your own.

private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
    AnalysisRequest.Builder()
        .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList))
        .build()
        .run(object : AnalysisRequest.AnalysisListener {
            override fun onSuccess(result: List<OzAnalysisResult>) {
                result.forEach { 
                    println(it.resolution.name)
                    println(it.folderId)
                }
            }
            override fun onError(error: OzException) {
                error.printStackTrace()
            }
        })
} 

iOS

1. Add our SDK to your project

Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add to Podfile:

pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' # You can find the version needed in the iOS changelog

2. Initialize SDK

Rename the license file to forensics.license and put it into the project.

OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
    if let error = error {
        print(error.errorDescription)
    }
}

3. Add face recording

Create a controller that will capture videos as follows:

let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions) 
self.present(ozLivenessVC, animated: true)

The delegate object must implement the OZLivenessDelegate protocol, which receives either the captured media or an error. A sketch (check the exact method signatures against the SDK headers for your version):

extension YourViewController: OZLivenessDelegate {
    func onOZLivenessResult(results: [OZMedia]) {
        // The captured videos; pass them on to the analysis step below
    }
    func onError(status: OZVerificationStatus?) {
        // Capture was cancelled or failed
    }
}

4. Run analyses

Use AnalysisRequestBuilder to initiate the Liveness analysis.

let analysisRequest = AnalysisRequestBuilder()
let analysis = Analysis.init(
    media: mediaToAnalyze,
    type: .quality,
    mode: .onDevice)
analysisRequest.uploadMedia(mediaToAnalyze)
analysisRequest.addAnalysis(analysis)
analysisRequest.run(
    scenarioStateHandler: { state in },    // scenario steps progress handler
    uploadProgressHandler: { progress in } // file upload progress handler
) { (analysisResults: [OzAnalysisResult], error) in
    // receive and handle analysis results here
    for result in analysisResults {
        print(result.resolution)
        print(result.folderID)
    }
}

With these steps, the basic integration of the Mobile SDKs is complete. The data from the on-device analysis is not transferred anywhere, so bear in mind that you cannot access it via the API or the Web console. However, an internet connection is still required to check the license. Additionally, we recommend using our logging service, telemetry, as it is a great help when investigating the details of attacks. We'll provide you with credentials.
