How to Integrate Server-Based Liveness into Your Mobile Application

This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and subsequently analyzing them on the server.

The SDK implements a ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. The SDK methods for liveness analysis communicate with Oz API under the hood.

Before you begin, make sure you have Oz API credentials. When using the SaaS API, you get them from us:

Login: [email protected]

Password: …

API: https://sandbox.ohio.ozforensics.com

Web Console: https://sandbox.ohio.ozforensics.com

For the on-premise Oz API, you need to create a user yourself or ask the team that manages your API. See the guide on user creation via Web Console. Choose the proper user role: CLIENT in most cases, or CLIENT ADMIN if the SDK needs to work with folders pre-created by other API users. In the end, you obtain the same set of credentials as in the SaaS scenario.

We also recommend that you use our logging service, telemetry, as it helps a lot when investigating the details of attacks. For SaaS Oz API users, the service is enabled by default. For on-premise installations, we'll provide you with credentials.

Oz Liveness Mobile SDK requires a license. The license is bound to the bundle ID of your application, e.g., com.yourcompany.yourapp. Issue a 1-month trial license on our website or email us for a long-term license.

Android

1. Add SDK to your project

In the build.gradle of your project, add:
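The code block was not preserved here. As a sketch, the project-level change is typically a repository declaration; the URL below is an assumption based on Oz Forensics' public artifact hosting and should be checked against your onboarding materials:

```groovy
// Project-level build.gradle: register the Oz artifact repository.
// The URL is an assumption; verify it in the Oz developer guide.
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url "https://ozforensics.jfrog.io/artifactory/main" }
    }
}
```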

In the build.gradle of the module, add:
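The dependency line did not survive extraction either. A likely form, with artifact coordinates and version left as assumptions to verify against the Oz developer guide:

```groovy
// Module-level build.gradle: add the Oz Liveness SDK dependency.
// Coordinates and version are assumptions; check the developer guide.
dependencies {
    implementation 'com.ozforensics.liveness:full:<latest-version>'
}
```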

2. Initialize SDK

Rename the license file to forensics.license and place it into the project's res/raw folder.
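A minimal initialization sketch in Kotlin; the class and method names follow the public Oz Android SDK documentation but should be verified against your SDK version:

```kotlin
// Initialize the SDK with the license placed in res/raw/forensics.license.
// Names (OzLivenessSDK, LicenseSource) are taken from the public Oz docs;
// verify them against your SDK version.
OzLivenessSDK.init(
    context,
    listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
)
```

Initialization is usually done once, e.g., in your Application class, before any capture or analysis calls.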

3. Connect SDK to Oz API

Use the API credentials (login, password, and API URL) that you received from us.

In production, instead of hard-coding the login and password in the application, it is recommended to obtain an access token on your back end using the API auth method, then pass it to your application:
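A sketch of both connection options in Kotlin; the OzConnection factory names are assumptions based on the public Oz Android SDK docs, so verify them for your version:

```kotlin
// Development: connect with credentials obtained from Oz.
// OzConnection.fromCredentials / fromServiceToken are assumed names;
// check them against your SDK version.
OzLivenessSDK.setApiConnection(
    OzConnection.fromCredentials(apiUrl, login, password),
    statusListener
)

// Production: pass an access token fetched on your own back end instead,
// so credentials never ship inside the application.
OzLivenessSDK.setApiConnection(
    OzConnection.fromServiceToken(apiUrl, accessToken),
    statusListener
)
```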

4. Add face recording

To start recording, use startActivityForResult:

To obtain the captured video, use onActivityResult:
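The two steps above can be sketched in an Activity as follows; the intent helpers and OzAction values are assumptions based on the public Oz Android SDK docs:

```kotlin
// Launch the Oz capture screen and receive the recorded media.
// createStartIntent / getResultFromIntent and OzAction.Blank are assumed
// names from the public docs; verify against your SDK version.
private val REQUEST_CODE = 1

fun startLiveness() {
    val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
    startActivityForResult(intent, REQUEST_CODE)
}

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == REQUEST_CODE) {
        // The captured videos to be sent for server-based analysis.
        val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
    }
}
```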

5. Run analyses

To run the analyses, execute the code below.
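The analysis code did not survive here. As a hedged sketch of a server-based Quality (liveness) analysis, with class names taken from the public Oz Android SDK docs (verify them for your version):

```kotlin
// Run a server-based liveness (Quality) analysis on the captured media.
// AnalysisRequest / Analysis / RequestResult are assumed names from the
// public docs; sdkMediaResult is the media obtained in the previous step.
AnalysisRequest.Builder()
    .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, sdkMediaResult))
    .build()
    .run(object : AnalysisRequest.AnalysisListener {
        override fun onSuccess(result: RequestResult) {
            // Inspect the consolidated resolution, e.g., SUCCESS or DECLINED.
        }
        override fun onError(error: OzException) {
            // Handle network or analysis errors.
        }
    })
```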

iOS

1. Add our SDK to your project

CocoaPods

To integrate OZLivenessSDK into an Xcode project, add to Podfile:
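The Podfile line was not preserved. A likely entry, with the Git URL and tag left as assumptions to check against your onboarding materials:

```ruby
# Podfile entry for the Oz iOS SDK.
# The source URL and tag are assumptions; verify them in the developer guide.
pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-mobile-ios-sdk', :tag => '<latest-version>'
```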

SPM

Add the following package dependency via SPM: https://gitlab.com/oz-forensics/oz-mobile-ios-sdk (if you need a guide on adding package dependencies, please refer to the Apple documentation). OzLivenessSDK is mandatory. Skip the OzLivenessSDKOnDevice library, as this guide covers server-based analysis.

2. Initialize SDK

Rename the license file to forensics.license and put it into the project.
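A minimal Swift initialization sketch; the OZSDK entry point and license-source enum follow the public Oz iOS SDK docs but should be verified against your SDK version:

```swift
// Initialize the SDK with the bundled license file.
// OZSDK and .licenseFileName are assumed names from the public docs.
import OZLivenessSDK

OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
    if let error = error {
        // Handle a missing or invalid license here.
        print(error.localizedDescription)
    }
}
```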

3. Connect SDK to Oz API

Use the API credentials (login, password, and API URL) that you received from us.

In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end using the API auth method, then pass it to your application:
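Both connection options sketched in Swift; the Connection factory names are assumptions based on the public Oz iOS SDK docs:

```swift
// Development: connect with credentials obtained from Oz.
// Connection.fromCredentials / fromServiceToken are assumed names;
// verify them against your SDK version.
OZSDK.setApiConnection(Connection.fromCredentials(host: apiUrl, login: login, password: password)) { token, error in
    // Handle connection errors here.
}

// Production: pass an access token fetched on your own back end instead.
OZSDK.setApiConnection(Connection.fromServiceToken(host: apiUrl, token: accessToken)) { token, error in
    // Handle connection errors here.
}
```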

4. Add face recording

Create a controller that will capture videos as follows:

The delegate object must implement the OZLivenessDelegate protocol:
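The controller and delegate can be sketched together as below; the factory method, action values, and delegate callbacks are assumptions based on the public Oz iOS SDK docs:

```swift
// Present the Oz capture screen and receive the recorded media via the
// OZLivenessDelegate callbacks. Names are assumed from the public docs;
// verify them against your SDK version.
import UIKit
import OZLivenessSDK

final class LivenessViewController: UIViewController, OZLivenessDelegate {

    func startLiveness() {
        let ozLivenessVC = OZSDK.createVerificationVCWithDelegate(self, actions: [.selfie])
        present(ozLivenessVC, animated: true)
    }

    // MARK: - OZLivenessDelegate

    func onOZLivenessResult(results: [OZMedia]) {
        // The captured videos to be sent for server-based analysis.
    }

    func onError(status: OZVerificationStatus?) {
        // Capture was cancelled or failed.
    }
}
```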

5. Run analyses

Use AnalysisRequestBuilder to initiate the Liveness analysis. The communication with Oz API is under the hood of the run method.
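A hedged sketch of a server-based Quality (liveness) analysis; the builder and Analysis names follow the public Oz iOS SDK docs and should be verified for your version:

```swift
// Run a server-based liveness (Quality) analysis on the captured media.
// AnalysisRequestBuilder / Analysis are assumed names from the public docs;
// `results` is the [OZMedia] array received from the delegate.
let analysisRequest = AnalysisRequestBuilder()
let analysis = Analysis(media: results, type: .quality, mode: .serverBased)
analysisRequest.uploadMedia(results)
analysisRequest.addAnalysis(analysis)
analysisRequest.run(
    statusHandler: { status in
        // Track upload and analysis progress here.
    },
    errorHandler: { error in
        // Handle network or analysis errors.
    }
) { result in
    // Inspect the consolidated resolution, e.g., SUCCESS or DECLINED.
}
```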

With these steps, you are done with the basic integration of the Mobile SDKs. You can access the recorded media and analysis results in Web Console via a browser, or programmatically via API.

In the developer guides, you can also find instructions for customizing the SDK look and feel, along with the full list of our Mobile SDK methods. Check out the links below:

Android sample app source code

iOS sample app source code

Android OzLiveness SDK Developer Guide
