This section covers the most common scenarios for integrating the Oz Forensics Liveness and Face Biometry system.
The scenarios can be combined, for example, integrating liveness into both web and mobile applications, or combining liveness with face matching.
Oz API is a rich REST API for facial biometrics that covers liveness checks and face matching. Its key features are:
Persistence: your media and analyses are stored for future reference unless you explicitly delete them,
Ability to work with videos as well as images,
Asynchronous analyses,
Authentication,
Roles and access management.
The unit of work in Oz API is a folder: you can upload interrelated media to a folder, run analyses on them, and check for the aggregated result.
This step-by-step guide describes how to run a liveness check on a facial image or video you already have, using the Oz backend: create a folder, upload your media to this folder, initiate the liveness check, and poll for the results.
For better accuracy and user experience, we recommend that you use our Web and/or Native SDK on your front end for face capturing. Please refer to the relevant guides:
Before you begin, make sure you have Oz API credentials. When using SaaS API, you get them from us:
Login: j.doe@yourcompany.com
Password: …
API: https://sandbox.ohio.ozforensics.com
Web Console: https://sandbox.ohio.ozforensics.com
For the on-premise Oz API, you need to create a user yourself or ask the team that manages your API. See the guide on user creation via Web Console. Choose the proper user role: CLIENT in most cases, or CLIENT ADMIN if the SDK needs to work with folders pre-created by other API users. Either way, you end up with a set of credentials similar to the SaaS scenario.
You can explore all API methods with Oz Postman collection.
For security reasons, we recommend obtaining an access token instead of using the credentials directly. Call POST /api/authorize/auth with your login and password in the request body.
Get the access_token from the response and add it to the X-Forensic-Access-Token header of all subsequent requests.
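This flow can be sketched in Python as follows. Note that the request-body field names ("credentials", "email", "password") are assumptions; verify them against the Oz API reference for your version.

```python
import json
import urllib.request

def get_token(api_url: str, email: str, password: str) -> str:
    # POST /api/authorize/auth with the credentials; the body schema
    # is an assumption -- check the Oz API reference.
    body = json.dumps({"credentials": {"email": email, "password": password}}).encode()
    req = urllib.request.Request(
        f"{api_url}/api/authorize/auth",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def auth_headers(token: str) -> dict:
    # Every subsequent request carries the token in this header.
    return {"X-Forensic-Access-Token": token}
```

Your back end would call get_token once and reuse auth_headers for the rest of the session.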
This step is not needed for API 5.0.0 and above.
With API 4.0.8 and below, Oz API requires a video or an archived sequence of images to perform a liveness check. To analyze a single image, transform it into a ZIP archive; Oz API will treat the archive as a video.
Make sure that you use the corresponding video-related tags later on.
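The archiving step takes only a few lines; a minimal in-memory sketch in Python (the archive entry name is arbitrary):

```python
import io
import zipfile

def image_to_zip(image_bytes: bytes, name: str = "selfie.jpg") -> bytes:
    # Wrap a single image into an in-memory ZIP archive so that
    # API 4.0.8 and below can treat it as a video.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED) as zf:
        zf.writestr(name, image_bytes)
    return buf.getvalue()
```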
To create a folder and upload your media into it, call the POST /api/folders/ method, attaching the media you need as body parts of the request.
In the payload field, set the appropriate tags:
A successful response returns code 201 and the folder_id you'll need later on.
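If you are not using a ready-made HTTP client, the multipart request can be sketched as below. The payload schema, the part name video1, and the tag names used in the test are assumptions; check the API reference for the tags matching your media type.

```python
import json
import uuid

def folder_request(video_bytes: bytes, tags: list) -> tuple:
    # Minimal multipart/form-data encoder for POST /api/folders/:
    # one file part plus a "payload" part carrying the tags.
    boundary = uuid.uuid4().hex
    payload = json.dumps({"media:tags": {"video1": list(tags)}})
    body = (
        (
            f"--{boundary}\r\n"
            'Content-Disposition: form-data; name="payload"\r\n\r\n'
            f"{payload}\r\n"
            f"--{boundary}\r\n"
            'Content-Disposition: form-data; name="video1"; filename="video1.mp4"\r\n'
            "Content-Type: video/mp4\r\n\r\n"
        ).encode()
        + video_bytes
        + f"\r\n--{boundary}--\r\n".encode()
    )
    return body, f"multipart/form-data; boundary={boundary}"
```

Send the returned body with the returned Content-Type header (and your X-Forensic-Access-Token) to POST /api/folders/.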
To launch the analysis, call POST /api/folders/{{folder_id}}/analyses/ with the folder_id from the previous step. In the request body, specify the liveness check to be launched.
The results will be available shortly. The method returns the analyse_id you'll need at the next step.
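A sketch of the launch call in Python; the QUALITY analysis type and the response shape (a list with one started analysis) are assumptions to verify against the API reference:

```python
import json
import urllib.request

def liveness_request_body() -> dict:
    # "QUALITY" is assumed to be the Oz API name for the liveness
    # analysis type -- verify against the API reference.
    return {"analyses": [{"type": "QUALITY"}]}

def start_liveness(api_url: str, token: str, folder_id: str) -> str:
    req = urllib.request.Request(
        f"{api_url}/api/folders/{folder_id}/analyses/",
        data=json.dumps(liveness_request_body()).encode(),
        headers={"X-Forensic-Access-Token": token,
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Assumes the response is a list with one started analysis.
        return json.load(resp)[0]["analyse_id"]
```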
Repeat calling GET /api/analyses/{{analyse_id}} with the analyse_id from the previous step once a second until the state changes from PROCESSING to something else. For a finished analysis:
Get the qualitative result from resolution (SUCCESS or DECLINED).
Get the quantitative result from results_media[0].results_data.confidence_spoofing, which ranges from 0.0 (real person) to 1.0 (spoofing).
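The polling loop and result extraction can be sketched as follows; the state and resolution values come from this guide, while the exact JSON layout is an assumption to verify against the API reference:

```python
import json
import time
import urllib.request

def interpret_liveness(analysis: dict) -> tuple:
    # Qualitative resolution plus the confidence_spoofing score
    # (0.0 = real person, 1.0 = spoofing).
    score = analysis["results_media"][0]["results_data"]["confidence_spoofing"]
    return analysis["resolution"] == "SUCCESS", score

def poll_liveness(api_url: str, token: str, analyse_id: str) -> tuple:
    while True:
        req = urllib.request.Request(
            f"{api_url}/api/analyses/{analyse_id}",
            headers={"X-Forensic-Access-Token": token},
        )
        with urllib.request.urlopen(req) as resp:
            analysis = json.load(resp)
        if analysis["state"] != "PROCESSING":
            return interpret_liveness(analysis)
        time.sleep(1)  # poll once a second, as recommended
```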
Here is the Postman collection for this guide.
With these steps completed, you are done with the liveness check via Oz API. You will be able to access your media and analysis results in the Web UI via browser or programmatically via API.
Oz API methods can be combined with great flexibility. Explore Oz API using the API Developer Guide.
In this section, we listed the guides for the face matching checks.
In this section, there's a guide for the integration of the on-device liveness check.
In this section, we listed the guides for the server-based liveness check integrations.
This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and subsequently analyzing them on the server.
The SDK implements a ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. The SDK methods for liveness analysis communicate with Oz API under the hood.
Before you begin, make sure you have Oz API credentials. When using the SaaS API, you get them from us.
We also recommend using our logging service, telemetry, as it helps greatly in investigating the details of attacks. For Oz API users, the service is enabled by default. For on-premise installations, we'll provide you with credentials.
In the build.gradle of your project, add:
In the build.gradle of the module, add:
Rename the license file to forensics.license and place it into the project's res/raw folder.
Use API credentials (login, password, and API URL) that you’ve got from us.
To start recording, use startActivityForResult:
To obtain the captured video, use onActivityResult:
The sdkMediaResult object contains the captured videos.
To run the analyses, execute the code below. Note that mediaList is an array of objects that were either captured by the SDK (sdkMediaResult) or created on your own (media you captured yourself).
Rename the license file to forensics.license and put it into the project.
Use API credentials (login, password, and API URL) that you’ve got from us.
Create a controller that will capture videos as follows:
The delegate object must implement OZLivenessDelegate protocol:
Use AnalysisRequestBuilder to initiate the Liveness analysis. The communication with Oz API is under the hood of the run method.
With these steps, you are done with basic integration of Mobile SDKs. You will be able to access recorded media and analysis results in Web Console via browser or programmatically via API.
In developer guides, you can also find instructions for customizing the SDK look-and-feel and access the full list of our Mobile SDK methods. Check out the table below:
This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and performing on-device liveness checks without sending any data to a server.
The SDK implements the ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results.
Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. You can issue a 1-month trial license or request a long-term license.
In the build.gradle of your project, add:
In the build.gradle of the module, add:
Rename the license file to forensics.license and place it into the project's res/raw folder.
To start recording, use startActivityForResult:
To obtain the captured video, use onActivityResult:
The sdkMediaResult object contains the captured videos.
To run the analyses, execute the code below. Note that mediaList is an array of objects that were either captured by the SDK (sdkMediaResult) or created on your own (media you captured yourself).
Rename the license file to forensics.license and put it into the project.
Create a controller that will capture videos as follows:
The delegate object must implement OZLivenessDelegate protocol:
Use AnalysisRequestBuilder to initiate the Liveness analysis.
With these steps, you are done with the basic integration of Mobile SDKs. The data from the on-device analysis is not transferred anywhere, so bear in mind that you cannot access it via API or Web Console. However, an internet connection is still required to check the license. Additionally, we recommend using our logging service, telemetry, as it helps greatly in investigating the details of attacks. We'll provide you with credentials.
Please note that the Oz Liveness Mobile SDK does not include a user interface for scanning official documents. You may need to explore alternative SDKs that offer that functionality or implement it on your own. Web SDK does include a simple photo ID capture screen.
This guide describes the steps needed to add face matching to your liveness check.
By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:
Simply add photo_id_front to the list of actions for the plugin, e.g.,
For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.
Modify the code that runs the analysis as follows:
For on-device analyses, you can change the analysis mode from Analysis.Mode.SERVER_BASED to Analysis.Mode.ON_DEVICE.
For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.
Modify the code that runs the analysis as follows:
For on-device analyses, you can change the analysis mode from mode: .serverBased to mode: .onDevice.
You will be able to access your media and analysis results in Web UI via browser or programmatically via API.
For the on-premise Oz API, you need to create a user yourself or ask the team that manages your API. See the guide on user creation via Web Console. Choose the proper user role: CLIENT in most cases, or CLIENT ADMIN if the SDK needs to work with folders pre-created by other API users. Either way, you end up with a set of credentials similar to the SaaS scenario.
Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. You can issue a 1-month trial license or request a long-term license.
In production, instead of hard-coding the login and password in the application, it is recommended to obtain an access token on your back end using the API method, then pass it to your application:
Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add it to your Podfile:
In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end using the API method, then pass it to your application:
Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add it to your Podfile:
Check also the Android source code.
Check also the iOS source code.
Oz API methods as well as Mobile and Web SDK methods can be combined with great flexibility. Explore the options available in the section.
Login: j.doe@yourcompany.com
Password: …
API: https://sandbox.ohio.ozforensics.com
Web Console: https://sandbox.ohio.ozforensics.com
Android sample app source codes
iOS sample app source codes
Android OzLiveness SDK Developer Guide
iOS OzLiveness SDK Developer Guide
Demo app in PlayMarket
Demo app in TestFlight
This guide outlines the steps for integrating the Oz Liveness Web SDK into a customer web application for capturing facial videos and subsequently analyzing them on a server.
The SDK implements the ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. Under the hood, it communicates with Oz API.
Oz Liveness Web SDK detects both presentation and injection attacks. An injection attack is an attempt to feed pre-recorded video into the system using a virtual camera.
Finally, while the cloud-based service provides fully fledged functionality, we also offer an on-premise version with the same functions and no need to send any data to our cloud. We recommend starting in SaaS mode and then reconnecting your web app to the on-premise Web Adapter and Oz API to ensure seamless integration between your front end and back end. With these guidelines in mind, integrating the Oz Liveness Web SDK into your web application is straightforward.
Tell us the domain names of the pages from which you are going to call the Web SDK and an email for admin access, e.g.:
Domain names from which WebSDK will be called:
www.yourbrand.com
www.yourbrand2.com
Email for admin access:
j.doe@yourcompany.com
In response, you'll get URLs and credentials for further integration and usage:
Login: j.doe@yourcompany.com
Password: …
API: https://sandbox.ohio.ozforensics.com/
Web Console: https://sandbox.ohio.ozforensics.com
Web Adapter: https://web-sdk.cdn.sandbox.ozforensics.com/your_company_name/
For the on-premise Oz API, you need to create a user yourself or ask the team that manages your API. See the guide on user creation via Web Console. Choose the proper user role: CLIENT in most cases, or CLIENT ADMIN if the SDK needs to work with folders pre-created by other API users. Either way, you end up with a set of credentials similar to the SaaS scenario.
Add the following tags to your HTML code. Use Web Adapter URL received before:
Add the code that opens the plugin and handles the results:
Keep in mind that it is more secure to make your back end responsible for the decision logic. You can find more details, including code samples, here.
With these steps, you are done with basic integration of Web SDK into your web application. You will be able to access recorded media and analysis results in Web Console via browser or programmatically via API (please find the instructions here: retrieving an MP4 video, getting analysis results).
In the Web Plugin Developer Guide, you can find instructions for common next steps:
Customizing plugin look-and-feel
Adding custom language pack
Tuning plugin behavior
Plugin parameters and callbacks
Security recommendations
Please find a sample for Oz Liveness Web SDK here. To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.
For Angular and React, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.
This guide describes how to match a liveness video with a reference photo of a person that is already stored in your database.
However, if you prefer to include a photo ID capture step in your liveness process instead of using a stored photo, refer to another guide in this section.
By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:
In this scenario, you upload your reference image to the same folder where you have a liveness video, initiate the BIOMETRY analysis, and poll for the results.
Given that you already have the liveness video recorded and uploaded, you will be working with the same Oz API folder where your liveness video is. Obtain the folder ID as described below, and pass it to your back end.
For a video recorded by Web SDK, get the folder_id as described here.
For a video recorded by the Android or iOS SDK, retrieve the folder_id from the analysis results as shown below:
Android:
iOS:
Call the POST /api/folders/{{folder_id}}/media/ method, replacing folder_id with the ID you've got in the previous step. This uploads your new media to the folder containing your ready-made liveness video.
Set the appropriate tags in the payload field of the request, depending on the nature of your reference photo.
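As a sketch, the payload part could be built like this. The photo_id_front tag for the front side of an ID document is mentioned in this section; the surrounding schema and the photo1 part name are assumptions to verify against the API reference:

```python
import json

def reference_photo_payload(tags=("photo_id_front",)) -> str:
    # Builds the "payload" part for POST /api/folders/{folder_id}/media/.
    # The schema is an assumption -- check the API reference.
    return json.dumps({"media:tags": {"photo1": list(tags)}})
```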
To launch the analysis, call POST /api/folders/{{folder_id}}/analyses/ with the folder_id from the previous step. In the request body, specify the biometry check to be launched.
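The request body can be sketched as follows; BIOMETRY is the analysis named in this guide, while the exact body shape is an assumption to verify against the API reference:

```python
def biometry_request_body() -> dict:
    # Mirrors the liveness launch request, but starts the BIOMETRY
    # (face matching) analysis instead.
    return {"analyses": [{"type": "BIOMETRY"}]}
```

Send it as JSON to POST /api/folders/{{folder_id}}/analyses/ with your X-Forensic-Access-Token header.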
Repeat calling GET /api/analyses/{{analyse_id}} with the analyse_id from the previous step once a second until the state changes from PROCESSING to something else. For a finished analysis:
Get the qualitative result from resolution (SUCCESS or DECLINED).
Get the quantitative result from analyses.results_data.min_confidence.
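Reducing a finished BIOMETRY analysis to a decision and score can be sketched as below; the exact JSON layout is an assumption to verify against the API reference:

```python
def biometry_result(analysis: dict) -> tuple:
    # Qualitative resolution plus the min_confidence matching score,
    # read from analyses.results_data.min_confidence.
    return (analysis["resolution"] == "SUCCESS",
            analysis["results_data"]["min_confidence"])
```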
Here is the Postman collection for this guide.
With these steps completed, you are done with adding face matching via Oz API. You will be able to access your media and analysis results in Web UI via browser or programmatically via API.
Oz API methods can be combined with great flexibility. Explore Oz API using the API Developer Guide.