
Integration Quick Start Guides

This section contains the most common cases of integrating the Oz Forensics Liveness and Face Biometry system.

The scenarios can be combined; for example, you can integrate liveness into both web and mobile applications, or combine liveness with face matching.

Server-Based Liveness:

  • How to Integrate Server-Based Liveness into Your Web Application

  • How to Integrate Server-Based Liveness into Your Mobile Application

  • How to Check Your Media for Liveness without Oz Front End

On-Device Liveness:

  • How to Integrate On-Device Liveness into Your Mobile Application

Face Matching:

  • How to Add Face Matching of Liveness Video with a Reference Photo From Your Database

  • How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application

How to Integrate Server-Based Liveness into Your Web Application

This guide outlines the steps for integrating the Oz Liveness Web SDK into a customer web application for capturing facial videos and subsequently analyzing them on a server.

The SDK implements a ready-to-use face capture user interface, which is essential for a seamless customer experience and accurate liveness results. Under the hood, it communicates with Oz API.

Oz Liveness Web SDK detects both presentation and injection attacks. An injection attack is an attempt to feed pre-recorded video into the system using a virtual camera.

Finally, while the cloud-based service provides fully-fledged functionality, we also offer an on-premise version with the same functions but no need to send any data to our cloud. We recommend starting with the SaaS mode and then reconnecting your web app to the on-premise Web Adapter and Oz API, which ensures seamless integration between your front end and back end. With these guidelines in mind, integrating the Oz Liveness Web SDK into your web application is simple and straightforward.

1. Get your Web Adapter

Tell us the domain names of the pages from which you are going to call the Web SDK and an email for admin access, e.g.:

Domain names from which Web SDK will be called:

  1. www.yourbrand.com

  2. www.yourbrand2.com

Email for admin access:

  • [email protected]

In response, you’ll get URLs and credentials for further integration and usage. When using the SaaS API, you get them from us:

Login: [email protected]

Password: …

API: https://sandbox.ohio.ozforensics.com/

Web Console: https://sandbox.ohio.ozforensics.com

Web Adapter: https://web-sdk.cdn.sandbox.ozforensics.com/your_company_name/

For the on-premise Oz API, you need to create a user yourself or ask your team that manages the API. See the guide on user creation via Web Console. Consider the proper user role: CLIENT in most cases, or CLIENT ADMIN if you are going to make the SDK work with pre-created folders from other API users. In the end, you need to obtain a similar set of credentials as you would get in the SaaS scenario.

2. Add Web Plugin to your web pages

Add the following tag to your HTML code, using the Web Adapter URL received before:

<script src="https://<web-adapter-url>/plugin_liveness.php"></script>

3. Implement your logic around Web Plugin

Add the code that opens the plugin and handles the results:

OzLiveness.open({
  lang: 'en',
  action: [
    // 'photo_id_front', // request photo ID picture
    'video_selfie_blank' // request passive liveness video
  ],
  on_complete: function (result) {
    // This callback is invoked when the analysis is complete
    console.log('on_complete', result);
  }
});

Keep in mind that it is more secure to make your back end responsible for the decision logic. You can find more details, including code samples, in the Web Plugin Developer Guide; see also the Security recommendations.
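For illustration, a minimal sketch of such a back-end check in Node.js follows. It assumes an Express server, that your page sends the folder_id it received from the plugin, and that the back end reads the folder's analyses from Oz API; the GET /api/folders/{folder_id}/analyses/ call, the X-Forensic-Access-Token header, and the response shape are assumptions to verify against the API Developer Guide.

// Hypothetical back-end check: the browser only reports folder_id,
// the server fetches the analyses and makes the pass/fail decision.
const express = require('express');
const app = express();
app.use(express.json());

const OZ_API = 'https://sandbox.ohio.ozforensics.com'; // your API URL
const OZ_TOKEN = process.env.OZ_API_TOKEN; // access token kept server-side

app.post('/check-liveness', async (req, res) => {
  const { folderId } = req.body;
  // Assumed endpoint and header name -- verify against the API Developer Guide
  const r = await fetch(`${OZ_API}/api/folders/${folderId}/analyses/`, {
    headers: { 'X-Forensic-Access-Token': OZ_TOKEN },
  });
  const analyses = await r.json();
  // Pass only if every analysis finished with resolution SUCCESS
  const passed = Array.isArray(analyses) && analyses.every(a => a.resolution === 'SUCCESS');
  res.json({ passed });
});

app.listen(3000);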

With these steps, you are done with the basic integration of the Web SDK into your web application. You will be able to access recorded media and analysis results in the Web Console via browser or programmatically via API (please find the instructions here: retrieving an MP4 video, getting analysis results).

In the Web Plugin Developer Guide, you can find instructions for common next steps:

  • Customizing plugin look-and-feel

  • Adding custom language pack

  • Tuning plugin behavior

  • Plugin parameters and callbacks

Please find samples for Oz Liveness Web SDK here: Angular sample, React sample. To make them work, replace <web-adapter-url> with the Web Adapter URL you've received from us.

For Angular and React, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html with your Web Adapter URL.

Server-Based Liveness

This section lists the guides for the server-based liveness check integrations:

  • How to Integrate Server-Based Liveness into Your Web Application

  • How to Integrate Server-Based Liveness into Your Mobile Application

Face Matching

This section lists the guides for the face matching checks:

  • How to Add Face Matching of Liveness Video with a Reference Photo From Your Database

  • How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application

On-Device Liveness

This section contains the guide for integrating the on-device liveness check:

  • How to Integrate On-Device Liveness into Your Mobile Application

We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results.


How to Integrate On-Device Liveness into Your Mobile Application

We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results.

This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and performing on-device liveness checks without sending any data to a server.

The SDK implements the ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results.

Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. Issue the 1-month trial license on our website or email us for a long-term license.

Android

1. Add SDK to your project

In the build.gradle of your project, add:

allprojects {
    repositories {
        maven { url "https://ozforensics.jfrog.io/artifactory/main" }
    }
}

In the build.gradle of the module, add:

dependencies {
    implementation 'com.ozforensics.liveness:full:<version>'
    // You can find the version needed in the Android changelog
}

2. Initialize SDK

Rename the license file to forensics.license and place it into the project's res/raw folder.

Kotlin:

OzLivenessSDK.init(
    context,
    listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
)

Java:

OzLivenessSDK.INSTANCE.init(
        context,
        Collections.singletonList(new LicenseSource.LicenseAssetId(R.raw.forensics)),
        null
);

3. Add face recording

To start recording, use startActivityForResult:

Kotlin:

val OZ_LIVENESS_REQUEST_CODE = 1
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE)

Java:

int OZ_LIVENESS_REQUEST_CODE = 1;
Intent intent = OzLivenessSDK.INSTANCE.createStartIntent(Collections.singletonList(OzAction.Blank));
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE);

To obtain the captured video, use onActivityResult:

Kotlin:

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
        val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
        val sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
        if (!sdkMediaResult.isNullOrEmpty()) {
            analyzeMedia(sdkMediaResult)
        } else println(sdkErrorString)
    }
}

Java:

protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
        List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
        String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
        if (sdkMediaResult != null && !sdkMediaResult.isEmpty()) {
            analyzeMedia(sdkMediaResult);
        } else System.out.println(sdkErrorString);
    }
}

The sdkMediaResult object contains the captured videos.

4. Run analyses

To run the analyses, execute the code below. Mind that mediaList is an array of objects that were captured (sdkMediaResult) or otherwise created (media you captured on your own).

Kotlin:

private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
    AnalysisRequest.Builder()
        .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList))
        .build()
        .run(object : AnalysisRequest.AnalysisListener {
            override fun onSuccess(result: List<OzAnalysisResult>) {
                result.forEach {
                    println(it.resolution.name)
                    println(it.folderId)
                }
            }
            override fun onError(error: OzException) {
                error.printStackTrace()
            }
        })
}

Java:

private void analyzeMedia(List<OzAbstractMedia> mediaList) {
    new AnalysisRequest.Builder()
            .addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList, Collections.emptyMap()))
            .build()
            .run(new AnalysisRequest.AnalysisListener() {
                @Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
                @Override
                public void onSuccess(@NonNull List<OzAnalysisResult> list) {
                    for (OzAnalysisResult result : list) {
                        System.out.println(result.getResolution().name());
                        System.out.println(result.getFolderId());
                    }
                }
                @Override
                public void onError(@NonNull OzException e) { e.printStackTrace(); }
            });
}

iOS

1. Add our SDK to your project

Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add to Podfile:

pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' // You can find the version needed in the iOS changelog

2. Initialize SDK

Rename the license file to forensics.license and put it into the project.

OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
    if let error = error {
        print(error.errorDescription)
    }
}

3. Add face recording

Create a controller that will capture videos as follows:

let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions)
self.present(ozLivenessVC, animated: true)

The delegate object must implement the OZLivenessDelegate protocol.

4. Run analyses

Use AnalysisRequestBuilder to initiate the Liveness analysis.

let analysisRequest = AnalysisRequestBuilder()
let analysis = Analysis.init(
    media: mediaToAnalyze,
    type: .quality,
    mode: .onDevice)
analysisRequest.uploadMedia(mediaToAnalyze)
analysisRequest.addAnalysis(analysis)
analysisRequest.run(
    scenarioStateHandler: { state in }, // scenario steps progress handler
    uploadProgressHandler: { (progress) in } // file upload progress handler
) { (analysisResults: [OzAnalysisResult], error) in
    // receive and handle analyses results here
    for result in analysisResults {
        print(result.resolution)
        print(result.folderID)
    }
}

With these steps, you are done with the basic integration of the Mobile SDKs. The data from the on-device analysis is not transferred anywhere, so please bear in mind that you cannot access it via API or the Web Console. However, an internet connection is still required to check the license. Additionally, we recommend that you use our logging service called telemetry, as it helps a lot in investigating the details of attacks. We'll provide you with credentials.

Android sample app source codes

iOS sample app source codes

Android OzLiveness SDK Developer Guide

iOS OzLiveness SDK Developer Guide

Demo app in PlayMarket

Demo app in TestFlight

How to Integrate Server-Based Liveness into Your Mobile Application

This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and subsequently analyzing them on the server.

The SDK implements a ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. The SDK methods for liveness analysis communicate with Oz API under the hood.

Before you begin, make sure you have Oz API credentials. When using SaaS API, you get them from us:

Login: [email protected]

Password: …

API: https://sandbox.ohio.ozforensics.com

Web Console: https://sandbox.ohio.ozforensics.com

For the on-premise Oz API, you need to create a user yourself or ask your team that manages the API. See the guide on user creation via Web Console. Consider the proper user role (CLIENT in most cases or CLIENT ADMIN, if you are going to make SDK work with the pre-created folders from other API users). In the end, you need to obtain a similar set of credentials as you would get for the SaaS scenario.

We also recommend that you use our logging service called telemetry, as it helps a lot in investigating attacks' details. For Oz API users, the service is enabled by default. For on-premise installations, we'll provide you with credentials.

Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. Issue the 1-month trial license on our website or email us for a long-term license.

Android

1. Add SDK to your project

In the build.gradle of your project, add:

allprojects {
    repositories {
        maven { url "https://ozforensics.jfrog.io/artifactory/main" }
    }
}

In the build.gradle of the module, add:

dependencies {
    implementation 'com.ozforensics.liveness:full:<version>'
    // You can find the version needed in the Android changelog
}

2. Initialize SDK

Rename the license file to forensics.license and place it into the project's res/raw folder.

Kotlin:

OzLivenessSDK.init(
    context,
    listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
)

Java:

OzLivenessSDK.INSTANCE.init(
        context,
        Collections.singletonList(new LicenseSource.LicenseAssetId(R.raw.forensics)),
        null
);

3. Connect SDK to Oz API

Use the API credentials (login, password, and API URL) that you’ve got from us:

Kotlin:

OzLivenessSDK.setApiConnection(
    OzConnection.fromCredentials(host, username, password),
    statusListener(
        { token -> /* token */ },
        { ex -> /* error */ }
    )
)

Java:

OzLivenessSDK.INSTANCE.setApiConnection(
        OzConnection.Companion.fromCredentials(host, username, password),
        new StatusListener<String>() {
            @Override
            public void onStatusChanged(@Nullable String s) {}
            @Override
            public void onSuccess(String token) { /* token */ }
            @Override
            public void onError(@NonNull OzException e) { /* error */ }
        }
);

In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end with the auth API method, then pass it to your application:

Kotlin:

OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(host, token))

Java:

OzLivenessSDK.INSTANCE.setApiConnection(
        OzConnection.Companion.fromServiceToken(host, token),
        null
);
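For illustration, the back-end part might look like the following Node.js sketch. The auth endpoint path and payload shown here (POST /api/authorize/auth with a credentials object) are assumptions to verify against the API Developer Guide.

// Hypothetical back-end helper that obtains an Oz API access token
// and hands it over to the mobile application.
const OZ_API = 'https://sandbox.ohio.ozforensics.com'; // your API URL

async function getServiceToken(email, password) {
  // Assumed endpoint and payload shape -- verify against the API Developer Guide
  const response = await fetch(`${OZ_API}/api/authorize/auth`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ credentials: { email, password } }),
  });
  if (!response.ok) throw new Error(`Auth failed: ${response.status}`);
  const data = await response.json();
  return data.access_token; // pass this token to your app, not the credentials
}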

4. Add face recording

To start recording, use startActivityForResult:

Kotlin:

val OZ_LIVENESS_REQUEST_CODE = 1
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE)

Java:

int OZ_LIVENESS_REQUEST_CODE = 1;
Intent intent = OzLivenessSDK.INSTANCE.createStartIntent(Collections.singletonList(OzAction.Blank));
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE);

To obtain the captured video, use onActivityResult:

Kotlin:

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
        val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
        val sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
        if (!sdkMediaResult.isNullOrEmpty()) {
            analyzeMedia(sdkMediaResult)
        } else println(sdkErrorString)
    }
}

Java:

protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
        List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
        String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
        if (sdkMediaResult != null && !sdkMediaResult.isEmpty()) {
            analyzeMedia(sdkMediaResult);
        } else System.out.println(sdkErrorString);
    }
}

The sdkMediaResult object contains the captured videos.

5. Run analyses

To run the analyses, execute the code below. Mind that mediaList is an array of objects that were captured (sdkMediaResult) or otherwise created (media you captured on your own).

Kotlin:

private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
    AnalysisRequest.Builder()
        .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList))
        .build()
        .run(object : AnalysisRequest.AnalysisListener {
            override fun onSuccess(result: List<OzAnalysisResult>) {
                result.forEach {
                    println(it.resolution.name)
                    println(it.folderId)
                }
            }
            override fun onError(error: OzException) {
                error.printStackTrace()
            }
        })
}

Java:

private void analyzeMedia(List<OzAbstractMedia> mediaList) {
    new AnalysisRequest.Builder()
            .addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList, Collections.emptyMap()))
            .build()
            .run(new AnalysisRequest.AnalysisListener() {
                @Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
                @Override
                public void onSuccess(@NonNull List<OzAnalysisResult> list) {
                    for (OzAnalysisResult result : list) {
                        System.out.println(result.getResolution().name());
                        System.out.println(result.getFolderId());
                    }
                }
                @Override
                public void onError(@NonNull OzException e) { e.printStackTrace(); }
            });
}

iOS

1. Add our SDK to your project

Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add to Podfile:

pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' // You can find the version needed in the iOS changelog

2. Initialize SDK

Rename the license file to forensics.license and put it into the project.

OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
    if let error = error {
        print(error.errorDescription)
    }
}

3. Connect SDK to Oz API

Use the API credentials (login, password, and API URL) that you’ve got from us:

OZSDK.setApiConnection(Connection.fromCredentials(host: "https://sandbox.ohio.ozforensics.com", login: login, password: p)) { (token, error) in
    // Your code to handle error or token
}

In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end using the auth API method (see the back-end sketch in the Android section), then pass it to your application:

OZSDK.setApiConnection(Connection.fromServiceToken(host: "https://sandbox.ohio.ozforensics.com", token: token)) { (token, error) in
}

4. Add face recording

Create a controller that will capture videos as follows:

let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions)
self.present(ozLivenessVC, animated: true)

The delegate object must implement the OZLivenessDelegate protocol.

5. Run analyses

Use AnalysisRequestBuilder to initiate the Liveness analysis. The communication with Oz API is under the hood of the run method.

let analysisRequest = AnalysisRequestBuilder()
let analysis = Analysis.init(
    media: mediaToAnalyze,
    type: .quality,
    mode: .serverBased)
analysisRequest.uploadMedia(mediaToAnalyze)
analysisRequest.addAnalysis(analysis)
analysisRequest.run(
    scenarioStateHandler: { state in }, // scenario steps progress handler
    uploadProgressHandler: { (progress) in } // file upload progress handler
) { (analysisResults: [OzAnalysisResult], error) in
    // receive and handle analyses results here
    for result in analysisResults {
        print(result.resolution)
        print(result.folderID)
    }
}

With these steps, you are done with the basic integration of the Mobile SDKs. You will be able to access recorded media and analysis results in the Web Console via browser or programmatically via API.

In developer guides, you can also find instructions for customizing the SDK look-and-feel and the full list of our Mobile SDK methods. Check out the links below:

Android sample app source codes

iOS sample app source codes

Android OzLiveness SDK Developer Guide

iOS OzLiveness SDK Developer Guide

Demo app in PlayMarket

Demo app in TestFlight

How to Add Face Matching of Liveness Video with a Reference Photo From Your Database

This guide describes how to match a liveness video with a reference photo of a person that is already stored in your database.

However, if you prefer to add a photo ID capture step to your liveness process instead of using a stored photo, refer to How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application.

By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:

  • Integration of Oz Liveness Web SDK

  • Integration of Oz Liveness Mobile SDK

In this scenario, you upload your reference image to the same folder where you have a liveness video, initiate the BIOMETRY analysis, and poll for the results.

1. Get folder_id

Given that you already have the liveness video recorded and uploaded, you will be working with the same Oz API folder where your liveness video is. Obtain the folder ID as described below, and pass it to your back end.

  • For a video recorded by Web SDK, get the folder_id as described here.

  • For a video recorded by Android or iOS SDK, retrieve the folder_id from the analysis results as shown below:

Android (Kotlin):

AnalysisRequest.Builder()
        ...
        .run(object : AnalysisRequest.AnalysisListener {
            override fun onSuccess(result: List<OzAnalysisResult>) {
                // save folder_id that is needed for the next step
                val folderId = result.firstOrNull()?.folderId
            }
            ...
        })

Android (Java):

private void analyzeMedia(List<OzAbstractMedia> mediaList) {
    new AnalysisRequest.Builder()
            ...
            .run(new AnalysisRequest.AnalysisListener() {
                @Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
                @Override
                public void onSuccess(@NonNull List<OzAnalysisResult> list) {
                    // save folder_id that is needed for the next step
                    String folderId = list.get(0).getFolderId();
                }
                ...
            });
}

iOS:

analysisRequest.run(
    scenarioStateHandler: { state in },
    uploadProgressHandler: { (progress) in }
) { (analysisResults: [OzAnalysisResult], error) in
    // save folder_id that is needed for the next step
    let folderID = analysisResults.first?.folderID
}

2. Upload your reference photo

Call the POST /api/folders/{{folder_id}}/media/ method, replacing the folder_id with the ID you’ve got in the previous step. This will upload your new media to the folder where your ready-made liveness video is located.

Set the appropriate tags in the payload field of the request, depending on the nature of the reference photo that you have:

{
  "media:tags": {
    "photo1": [
        "photo_id", "photo_id_front" // for the front side of an ID
        // OR
        "photo_selfie" // for a non-ID photo
    ]
  }
}
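As an illustration, a reference photo upload from a Node.js back end could look like the sketch below. The multipart layout (the tags JSON in a payload field plus the image attached under the matching field name photo1) and the X-Forensic-Access-Token header are assumptions to verify against the API Developer Guide.

// Hypothetical upload of a reference photo to an existing folder (step 2)
const fs = require('fs');

async function uploadReferencePhoto(apiUrl, token, folderId) {
  const form = new FormData();
  // Tags describing the reference photo, as in the payload example above
  form.append('payload', JSON.stringify({
    'media:tags': { photo1: ['photo_id', 'photo_id_front'] }, // or ['photo_selfie']
  }));
  // Assumed: the file is attached under the same key as in media:tags
  form.append('photo1', new Blob([fs.readFileSync('reference.jpg')]), 'reference.jpg');
  const response = await fetch(`${apiUrl}/api/folders/${folderId}/media/`, {
    method: 'POST',
    headers: { 'X-Forensic-Access-Token': token }, // assumed header name
    body: form,
  });
  return response.json();
}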

3. Initiate the analysis

To launch the analysis, call POST /api/folders/{{folder_id}}/analyses/ with the folder_id from the previous step. In the request body, specify the biometry check to be launched:

{
    "analyses": [
        {
            "type": "biometry"
        }
    ]
}
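For illustration, launching the analysis from a Node.js back end might look like this sketch; the X-Forensic-Access-Token header and the response shape are assumptions to verify against the API Developer Guide.

// Hypothetical launch of the biometry analysis on the folder (step 3)
async function startBiometryAnalysis(apiUrl, token, folderId) {
  const response = await fetch(`${apiUrl}/api/folders/${folderId}/analyses/`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Forensic-Access-Token': token, // assumed header name
    },
    body: JSON.stringify({ analyses: [{ type: 'biometry' }] }),
  });
  // The response is expected to contain the created analyses with their IDs;
  // keep the analysis_id for polling in the next step
  return response.json();
}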

4. Poll for the results

Repeat calling GET /api/analyses/{{analysis_id}} with the analysis_id from the previous step once a second until the state changes from PROCESSING to something else (a polling sketch in Node.js follows the list below). For a finished analysis:

  • get the qualitative result from resolution (SUCCESS or DECLINED).

  • get the quantitative results from analyses.results_data.min_confidence.
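Here is the polling sketch; the exact response field names should be verified against the API Developer Guide (this guide names state, resolution, and results_data.min_confidence).

// Hypothetical polling until the analysis leaves the PROCESSING state (step 4)
async function waitForAnalysis(apiUrl, token, analysisId) {
  for (;;) {
    const response = await fetch(`${apiUrl}/api/analyses/${analysisId}`, {
      headers: { 'X-Forensic-Access-Token': token }, // assumed header name
    });
    const analysis = await response.json();
    if (analysis.state !== 'PROCESSING') {
      console.log(analysis.resolution); // qualitative: SUCCESS or DECLINED
      console.log(analysis.results_data?.min_confidence); // quantitative score
      return analysis;
    }
    await new Promise((resolve) => setTimeout(resolve, 1000)); // poll once a second
  }
}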

Here is the Postman collection for this guide: Face Matching with a Reference Photo.postman_collection.json.

With these steps completed, you are done with adding face matching via Oz API. You will be able to access your media and analysis results in Web UI via browser or programmatically via API.

Oz API methods can be combined with great flexibility. Explore Oz API using the API Developer Guide.


How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application

Please note that the Oz Liveness Mobile SDK does not include a user interface for scanning official documents. You may need to explore alternative SDKs that offer that functionality or implement it on your own. Web SDK does include a simple photo ID capture screen.

This guide describes the steps needed to add face matching to your liveness check.

By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:

  • Integration of Oz Liveness Web SDK

  • Integration of Oz Liveness Mobile SDK

Adding Photo ID Capture Step to Web SDK

Simply add photo_id_front to the list of actions for the plugin, e.g.:

OzLiveness.open({
  lang: 'en',
  action: [
    'photo_id_front',
    'video_selfie_blank'
  ],
  ...
});

Adding Face Matching to Android SDK

For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.

Modify the code that runs the analysis as follows:

Kotlin:

private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {

    val refFile = File(context.filesDir, "reference.jpg")
    val refMedia = OzAbstractMedia.OzDocumentPhoto(
        OzMediaTag.PhotoIdFront, // OzMediaTag.PhotoSelfie for a non-ID photo
        refFile.absolutePath
    )

    AnalysisRequest.Builder()
        .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList))
        .addAnalysis(Analysis(Analysis.Type.BIOMETRY, Analysis.Mode.SERVER_BASED, mediaList + refMedia))
        .build()
        .run(object : AnalysisRequest.AnalysisListener {
            override fun onSuccess(result: List<OzAnalysisResult>) {
                result.forEach {
                    println(it.resolution.name)
                    println(it.folderId)
                }
            }
            override fun onError(error: OzException) {
                error.printStackTrace()
            }
        })
}

Java:

private void analyzeMedia(List<OzAbstractMedia> mediaList) {
    File refFile = new File(context.getFilesDir(), "reference.jpg");
    OzAbstractMedia refMedia = new OzAbstractMedia.OzDocumentPhoto(
            OzMediaTag.PhotoIdFront, // OzMediaTag.PhotoSelfie for a non-ID photo
            refFile.getAbsolutePath()
    );
    ArrayList<OzAbstractMedia> mediaWithReferencePhoto = new ArrayList<>(mediaList);
    mediaWithReferencePhoto.add(refMedia);
    new AnalysisRequest.Builder()
            .addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList, Collections.emptyMap()))
            .addAnalysis(new Analysis(Analysis.Type.BIOMETRY, Analysis.Mode.SERVER_BASED, mediaWithReferencePhoto, Collections.emptyMap()))
            .build()
            .run(new AnalysisRequest.AnalysisListener() {
                @Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
                @Override
                public void onSuccess(@NonNull List<OzAnalysisResult> list) {
                    String folderId = list.get(0).getFolderId();
                }
                @Override
                public void onError(@NonNull OzException e) { e.printStackTrace(); }
            });
}

For on-device analyses, you can change the analysis mode from Analysis.Mode.SERVER_BASED to Analysis.Mode.ON_DEVICE.

Check also the Android sample app source code.

Adding Face Matching to iOS SDK

For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.

Modify the code that runs the analysis as follows:

let imageURL = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("reference.jpg")

let refMedia = OZMedia.init(movement: .selfie,
                   mediaType: .movement,
                   metaData: nil,
                   videoURL: nil,
                   bestShotURL: imageURL,
                   preferredMediaURL: nil,
                   timestamp: Date())

var mediaBiometry = [OZMedia]()
mediaBiometry.append(refMedia)
mediaBiometry.append(contentsOf: mediaToAnalyze)
let analysisRequest = AnalysisRequestBuilder()
let analysisBiometry = Analysis.init(media: mediaBiometry, type: .biometry, mode: .serverBased)
let analysisQuality = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased)
analysisRequest.addAnalysis(analysisBiometry)
analysisRequest.addAnalysis(analysisQuality)
analysisRequest.uploadMedia(mediaBiometry)
analysisRequest.run(
    scenarioStateHandler: { state in }, // scenario steps progress handler
    uploadProgressHandler: { (progress) in } // file upload progress handler
) { (analysisResults: [OzAnalysisResult], error) in
    // receive and handle analyses results here
    for result in analysisResults {
        print(result.resolution)
        print(result.folderID)
    }
}

For on-device analyses, you can change the analysis mode from mode: .serverBased to mode: .onDevice.

Check also the iOS sample app source code.

Final notes for all SDKs

You will be able to access your media and analysis results in Web UI via browser or programmatically via API.

Oz API methods as well as Mobile and Web SDK methods can be combined with great flexibility. Explore the options available in the Developer Guide section.
