This section contains the most common cases of integrating the Oz Forensics Liveness and Face Biometry system.
The scenarios can be combined together, for example, integrating liveness into both web and mobile applications or integrating liveness with face matching.
This guide outlines the steps for integrating the Oz Liveness Web SDK into a customer web application for capturing facial videos and subsequently analyzing them on a server.
The SDK implements the ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. Under the hood, it communicates with Oz API.
Oz Liveness Web SDK detects both presentation and injection attacks. An injection attack is an attempt to feed pre-recorded video into the system using a virtual camera.
Finally, while the cloud-based service provides the fully-fledged functionality, we also offer an on-premise version with the same functions but no need for sending any data to our cloud. We recommend starting with the SaaS mode and then reconnecting your web app to the on-premise Web Adapter and Oz API to ensure seamless integration between your front end and back end. With these guidelines in mind, integrating the Oz Liveness Web SDK into your web application can be a simple and straightforward process.
Tell us the domain names of the pages from which you are going to call Web SDK and an email for admin access, e.g.:
In response, you’ll get URLs and credentials for further integration and usage. When using the SaaS API, you get them from us:
For the on-premise Oz API, you need to create a user yourself or ask the team that manages the API. See the guide on user creation via Web Console. Consider the proper user role (CLIENT in most cases, or CLIENT ADMIN if you are going to make the SDK work with pre-created folders from other API users). In the end, you need to obtain a similar set of credentials as you would get for the SaaS scenario.
Add the following tags to your HTML code. Use the Web Adapter URL you received earlier:
Add the code that opens the plugin and handles the results:
With these steps, you are done with the basic integration of Web SDK into your web application. You will be able to access recorded media and analysis results in Web Console via browser or programmatically via API.
In the developer guide, you can find instructions for common next steps:
Customizing plugin look-and-feel
Adding custom language pack
Tuning plugin behavior
Plugin parameters and callbacks
Please find a sample for Oz Liveness Web SDK below. To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.
For Angular and React, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.
This section lists the guides for server-based liveness check integrations.
This section lists the guides for face matching checks.
How to Add Face Matching of Liveness Video with a Reference Photo From Your Database

In this section, there's a guide for the integration of the on-device liveness check.
How to Integrate On-Device Liveness into Your Mobile Application

Security recommendations
Domain names from which WebSDK will be called:
www.yourbrand.com
www.yourbrand2.com
Email for admin access:
Login: [email protected]
Password: …
API: https://sandbox.ohio.ozforensics.com/
Web Console: https://sandbox.ohio.ozforensics.com
Web Adapter: https://web-sdk.cdn.sandbox.ozforensics.com/your_company_name/

<script src="https://<web-adapter-url>/plugin_liveness.php"></script>

OzLiveness.open({
lang: 'en',
action: [
// 'photo_id_front', // request photo ID picture
'video_selfie_blank' // request passive liveness video
],
on_complete: function (result) {
// This callback is invoked when the analysis is complete
console.log('on_complete', result);
}
});

This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and performing on-device liveness checks without sending any data to a server.
The SDK implements the ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results.
Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. Issue the 1-month trial license or contact us for a long-term license.
Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add the following to your Podfile:
With these steps, you are done with the basic integration of the Mobile SDKs. The data from the on-device analysis is not transferred anywhere, so please bear in mind that you cannot access it via API or Web Console. However, an internet connection is still required to check the license. Additionally, we recommend that you use our logging service, telemetry, as it helps a lot in investigating attack details. We'll provide you with credentials.
Android sample app source codes
iOS sample app source codes
Android OzLiveness SDK Developer Guide
iOS OzLiveness SDK Developer Guide
Demo app in Google Play
Demo app in TestFlight
allprojects {
repositories {
maven { url "https://ozforensics.jfrog.io/artifactory/main" }
}
}

dependencies {
implementation 'com.ozforensics.liveness:full:<version>'
// You can find the version needed in the Android changelog
}

pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' # You can find the version needed in the iOS changelog
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
if (sdkMediaResult != null && !sdkMediaResult.isEmpty()) {
analyzeMedia(sdkMediaResult);
} else System.out.println(sdkErrorString);
}
}
The sdkMediaResult object contains the captured videos.
OzLivenessSDK.INSTANCE.init(
context,
Collections.singletonList(new LicenseSource.LicenseAssetId(R.raw.forensics)),
null
);

int OZ_LIVENESS_REQUEST_CODE = 1;
Intent intent = OzLivenessSDK.INSTANCE.createStartIntent(Collections.singletonList(OzAction.Blank));
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE);

private void analyzeMedia(List<OzAbstractMedia> mediaList) {
new AnalysisRequest.Builder()
.addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList, Collections.emptyMap()))
.build()
.run(new AnalysisRequest.AnalysisListener() {
@Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
@Override
public void onSuccess(@NonNull List<OzAnalysisResult> list) {
for (OzAnalysisResult result: list) {
System.out.println(result.getResolution().name());
System.out.println(result.getFolderId());
}
}
@Override
public void onError(@NonNull OzException e) { e.printStackTrace(); }
});
}

OzLivenessSDK.init(
context,
listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
)

val OZ_LIVENESS_REQUEST_CODE = 1
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE)

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
super.onActivityResult(requestCode, resultCode, data)
if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
val sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
if (!sdkMediaResult.isNullOrEmpty()) {
analyzeMedia(sdkMediaResult)
} else println(sdkErrorString)
}
}

private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
AnalysisRequest.Builder()
.addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList))
.build()
.run(object : AnalysisRequest.AnalysisListener {
override fun onSuccess(result: List<OzAnalysisResult>) {
result.forEach {
println(it.resolution.name)
println(it.folderId)
}
}
override fun onError(error: OzException) {
error.printStackTrace()
}
})
}

OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
if let error = error {
print(error.errorDescription)
}
}

let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions)
self.present(ozLivenessVC, animated: true)

let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions)
self.present(ozLivenessVC, animated: true)

let analysisRequest = AnalysisRequestBuilder()
let analysis = Analysis.init(
media: mediaToAnalyze,
type: .quality,
mode: .onDevice)
analysisRequest.uploadMedia(mediaToAnalyze)
analysisRequest.addAnalysis(analysis)
analysisRequest.run(
scenarioStateHandler: { state in }, // scenario steps progress handler
uploadProgressHandler: { (progress) in } // file upload progress handler
) { (analysisResults : [OzAnalysisResult], error) in
// receive and handle analyses results here
for result in analysisResults {
print(result.resolution)
print(result.folderID)
}
}

This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and subsequently analyzing them on the server.
The SDK implements a ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. The SDK methods for liveness analysis communicate with Oz API under the hood.
Before you begin, make sure you have Oz API credentials. When using the SaaS API, you get them from us:
Login: [email protected]
Password: …
API: https://sandbox.ohio.ozforensics.com
Web Console: https://sandbox.ohio.ozforensics.com
For the on-premise Oz API, you need to create a user yourself or ask the team that manages the API. See the guide on user creation via Web Console. Consider the proper user role (CLIENT in most cases, or CLIENT ADMIN if you are going to make the SDK work with pre-created folders from other API users). In the end, you need to obtain a similar set of credentials as you would get for the SaaS scenario.
We also recommend that you use our logging service, telemetry, as it helps a lot in investigating attack details. For Oz API users, the service is enabled by default. For on-premise installations, we'll provide you with credentials.
Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. Issue the 1-month trial license or contact us for a long-term license.
In the build.gradle of your project, add:
In the build.gradle of the module, add:
Rename the license file to forensics.license and place it into the project's res/raw folder.
Use API credentials (login, password, and API URL) that you’ve got from us.
In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end using the API method, then pass it to your application:
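The back-end side of this token exchange can be sketched in Python. This is a sketch under assumptions: the /api/authorize/auth path, the credentials payload shape, and the access_token response field are taken from common Oz API usage and should be verified against your Oz API reference.

```python
import json
import urllib.request

API_BASE = "https://sandbox.ohio.ozforensics.com"  # your Oz API URL

def auth_payload(login: str, password: str) -> dict:
    # Request body shape is an assumption; check your Oz API reference.
    return {"credentials": {"email": login, "password": password}}

def get_access_token(login: str, password: str) -> str:
    # The /api/authorize/auth path is an assumption; verify it for your installation.
    req = urllib.request.Request(
        f"{API_BASE}/api/authorize/auth",
        data=json.dumps(auth_payload(login, password)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

Your application then receives this token and passes it to setApiConnection instead of the raw credentials.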
To start recording, use startActivityForResult:
To obtain the captured video, use onActivityResult:
The sdkMediaResult object contains the captured videos.
To run the analyses, execute the code below. Note that mediaList is an array of objects that were captured (sdkMediaResult) or otherwise created (media you captured on your own).
Install OZLivenessSDK via CocoaPods. To integrate OZLivenessSDK into an Xcode project, add the following to your Podfile:
Rename the license file to forensics.license and put it into the project.
Use API credentials (login, password, and API URL) that you’ve got from us.
In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end using the API method, then pass it to your application:
Create a controller that will capture videos as follows:
The delegate object must implement OZLivenessDelegate protocol:
Use AnalysisRequestBuilder to initiate the Liveness analysis. The communication with Oz API is under the hood of the run method.
With these steps, you are done with basic integration of Mobile SDKs. You will be able to access recorded media and analysis results in Web Console via browser or programmatically via API.
In developer guides, you can also find instructions for customizing the SDK look-and-feel and access the full list of our Mobile SDK methods. Check out the table below:
This guide describes how to match a liveness video with a reference photo of a person that is already stored in your database.
However, if you prefer to include a photo ID capture step to your liveness process instead of using a stored photo, then you can refer to another guide in this section.
By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:
In this scenario, you upload your reference image to the same folder where you have a liveness video, initiate the BIOMETRY analysis, and poll for the results.
folder_id

Given that you already have the liveness video recorded and uploaded, you will be working with the same Oz API folder where your liveness video is. Obtain the folder ID as described below, and pass it to your back end.
For a video recorded by Web SDK, get the folder_id as described in the corresponding guide.
For a video recorded by Android or iOS SDK, retrieve the folder_id from the analysis’ results as shown below:
Android:
iOS:
Call the POST /api/folders/{{folder_id}}/media/ method, replacing folder_id with the ID you’ve got in the previous step. This will upload your new media to the folder where your ready-made liveness video is located.
Set the appropriate tags in the payload field of the request, depending on the nature of a reference photo that you have.
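As a Python sketch of this upload step: the payload field below follows the media:tags structure shown in this guide, while the multipart field names ("payload", "photo1") and the hand-rolled encoding are assumptions to verify against your Oz API reference.

```python
import json
import uuid

def media_tags_payload(tags: list) -> str:
    # Tag values come from this guide: "photo_id"/"photo_id_front" for an ID,
    # or "photo_selfie" for a non-ID reference photo.
    return json.dumps({"media:tags": {"photo1": tags}})

def build_multipart(file_name: str, file_bytes: bytes, payload: str):
    # Minimal multipart/form-data encoder for POST /api/folders/{folder_id}/media/.
    # Field names here are illustrative assumptions, not confirmed API contract.
    boundary = uuid.uuid4().hex
    head = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="payload"\r\n\r\n{payload}\r\n'
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="photo1"; filename="{file_name}"\r\n'
        "Content-Type: image/jpeg\r\n\r\n"
    ).encode()
    body = head + file_bytes + f"\r\n--{boundary}--\r\n".encode()
    return body, f"multipart/form-data; boundary={boundary}"
```

Send the returned body with the matching Content-Type header and your access token to the media endpoint.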
To launch the analysis, call POST /api/folders/{{folder_id}}/analyses/ with the folder_id from the previous step. In the request body, specify the biometry check to be launched.
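A minimal Python sketch of launching the analysis. The request body matches the JSON shown in this guide; the X-Forensic-Access-Token header name is an assumption to verify against your Oz API reference.

```python
import json
import urllib.request

API_BASE = "https://sandbox.ohio.ozforensics.com"  # your Oz API URL

def biometry_request_body() -> dict:
    # Matches the request body shown in this guide.
    return {"analyses": [{"type": "biometry"}]}

def start_biometry(folder_id: str, access_token: str) -> dict:
    # Header name is an assumption; check how your Oz API expects the token.
    req = urllib.request.Request(
        f"{API_BASE}/api/folders/{folder_id}/analyses/",
        data=json.dumps(biometry_request_body()).encode(),
        headers={
            "Content-Type": "application/json",
            "X-Forensic-Access-Token": access_token,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```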
Repeat GET /api/analyses/{{analysis_id}} with the analysis_id from the previous step once a second until the state changes from PROCESSING to something else. For a finished analysis:
get the qualitative result from resolution (SUCCESS or DECLINED).
get the quantitative results from analyses.results_data.min_confidence.
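The polling loop above can be sketched in Python. Here fetch is a placeholder for your GET /api/analyses/{{analysis_id}} call, and the exact nesting of results_data in the response is an assumption to check against the actual API output.

```python
import time

def poll_analysis(analysis_id, fetch, interval=1.0, timeout=60.0):
    # Poll once a second (by default) until the analysis leaves PROCESSING.
    # `fetch` is any callable that performs GET /api/analyses/{analysis_id}
    # and returns the decoded JSON; injecting it keeps this sketch testable.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        analysis = fetch(analysis_id)
        if analysis.get("state") != "PROCESSING":
            return analysis
        time.sleep(interval)
    raise TimeoutError(f"analysis {analysis_id} still PROCESSING after {timeout}s")

def interpret(analysis):
    # Qualitative result (SUCCESS or DECLINED) and quantitative score;
    # the results_data nesting is an assumption to verify against the response.
    resolution = analysis.get("resolution")
    confidence = analysis.get("results_data", {}).get("min_confidence")
    return resolution, confidence
```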
Here is the Postman collection for this guide.
With these steps completed, you are done with adding face matching via Oz API. You will be able to access your media and analysis results in Web UI via browser or programmatically via API.
Oz API methods can be combined with great flexibility. Explore Oz API using the Postman collection.
Android sample app source codes
iOS sample app source codes
Android OzLiveness SDK Developer Guide
iOS OzLiveness SDK Developer Guide
Demo app in Google Play
Demo app in TestFlight


allprojects {
repositories {
maven { url "https://ozforensics.jfrog.io/artifactory/main" }
}
}

dependencies {
implementation 'com.ozforensics.liveness:full:<version>'
// You can find the version needed in the Android changelog
}

OzLivenessSDK.init(
context,
listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
)

OzLivenessSDK.INSTANCE.init(
context,
Collections.singletonList(new LicenseSource.LicenseAssetId(R.raw.forensics)),
null
);

OzLivenessSDK.setApiConnection(
OzConnection.fromCredentials(host, username, password),
statusListener(
{ token -> /* token */ },
{ ex -> /* error */ }
)
)

OzLivenessSDK.INSTANCE.setApiConnection(
OzConnection.Companion.fromCredentials(host, username, password),
new StatusListener<String>() {
@Override
public void onStatusChanged(@Nullable String s) {}
@Override
public void onSuccess(String token) { /* token */ }
@Override
public void onError(@NonNull OzException e) { /* error */ }
}
);

OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(host, token))

OzLivenessSDK.INSTANCE.setApiConnection(
OzConnection.Companion.fromServiceToken(host, token),
null
);

val OZ_LIVENESS_REQUEST_CODE = 1
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE)

int OZ_LIVENESS_REQUEST_CODE = 1;
Intent intent = OzLivenessSDK.INSTANCE.createStartIntent(Collections.singletonList(OzAction.Blank));
startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE);

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
super.onActivityResult(requestCode, resultCode, data)
if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
val sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
if (!sdkMediaResult.isNullOrEmpty()) {
analyzeMedia(sdkMediaResult)
} else println(sdkErrorString)
}
}

protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
if (sdkMediaResult != null && !sdkMediaResult.isEmpty()) {
analyzeMedia(sdkMediaResult);
} else System.out.println(sdkErrorString);
}
}

private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
AnalysisRequest.Builder()
.addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList))
.build()
.run(object : AnalysisRequest.AnalysisListener {
override fun onSuccess(result: List<OzAnalysisResult>) {
result.forEach {
println(it.resolution.name)
println(it.folderId)
}
}
override fun onError(error: OzException) {
error.printStackTrace()
}
})
}

private void analyzeMedia(List<OzAbstractMedia> mediaList) {
new AnalysisRequest.Builder()
.addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList, Collections.emptyMap()))
.build()
.run(new AnalysisRequest.AnalysisListener() {
@Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
@Override
public void onSuccess(@NonNull List<OzAnalysisResult> list) {
for (OzAnalysisResult result: list) {
System.out.println(result.getResolution().name());
System.out.println(result.getFolderId());
}
}
@Override
public void onError(@NonNull OzException e) { e.printStackTrace(); }
});
}

pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' # You can find the version needed in the iOS changelog
OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
if let error = error {
print(error.errorDescription)
}
}

OZSDK.setApiConnection(Connection.fromCredentials(host: "https://sandbox.ohio.ozforensics.com", login: login, password: p)) { (token, error) in
// Your code to handle error or token
}

OZSDK.setApiConnection(Connection.fromServiceToken(host: "https://sandbox.ohio.ozforensics.com", token: token)) { (token, error) in
}

let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions)
self.present(ozLivenessVC, animated: true)

let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions)
self.present(ozLivenessVC, animated: true)

let analysisRequest = AnalysisRequestBuilder()
let analysis = Analysis.init(
media: mediaToAnalyze,
type: .quality,
mode: .serverBased)
analysisRequest.uploadMedia(mediaToAnalyze)
analysisRequest.addAnalysis(analysis)
analysisRequest.run(
scenarioStateHandler: { state in }, // scenario steps progress handler
uploadProgressHandler: { (progress) in } // file upload progress handler
) { (analysisResults : [OzAnalysisResult], error) in
// receive and handle analyses results here
for result in analysisResults {
print(result.resolution)
print(result.folderID)
}
}

AnalysisRequest.Builder()
...
.run(object : AnalysisRequest.AnalysisListener {
override fun onSuccess(result: List<OzAnalysisResult>) {
// save folder_id that is needed for the next step
val folderId = result.firstOrNull()?.folderId
}
...
})

private void analyzeMedia(List<OzAbstractMedia> mediaList) {
new AnalysisRequest.Builder()
...
.run(new AnalysisRequest.AnalysisListener() {
@Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
@Override
public void onSuccess(@NonNull List<OzAnalysisResult> list) {
String folderId = list.get(0).getFolderId();
}
}
...
});
}

analysisRequest.run(
scenarioStateHandler: { state in },
uploadProgressHandler: { (progress) in }
) { (analysisResults : [OzAnalysisResult], error) in
// save folder_id that is needed for the next step
let folderID = analysisResults.first?.folderID
}
}

{
"media:tags": {
"photo1": [
"photo_id", "photo_id_front" // for the front side of an ID
// OR
"photo_selfie" // for a non-ID photo
]
}
}

{
"analyses": [
{
"type": "biometry"
}
]
}

Please note that the Oz Liveness Mobile SDK does not include a user interface for scanning official documents. You may need to explore alternative SDKs that offer that functionality or implement it on your own. Web SDK does include a simple photo ID capture screen.
This guide describes the steps needed to add face matching to your liveness check.
By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:
Simply add photo_id_front to the list of actions for the plugin, e.g.,
For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.
Modify the code that runs the analysis as follows:
For on-device analyses, you can change the analysis mode from Analysis.Mode.SERVER_BASED to Analysis.Mode.ON_DEVICE.
Check also the Android sample app source code.
For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.
Modify the code that runs the analysis as follows:
For on-device analyses, you can change the analysis mode from mode: .serverBased to mode: .onDevice.
Check also the iOS sample app source code.
You will be able to access your media and analysis results in Web UI via browser or programmatically via API.
Oz API methods as well as Mobile and Web SDK methods can be combined with great flexibility. Explore the options available in the Developer Guide section.
private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
val refFile = File(context.filesDir, "reference.jpg")
val refMedia = OzAbstractMedia.OzDocumentPhoto(
OzMediaTag.PhotoIdFront, // OzMediaTag.PhotoSelfie for a non-ID photo
refFile.absolutePath
)
AnalysisRequest.Builder()
.addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList))
.addAnalysis(Analysis(Analysis.Type.BIOMETRY, Analysis.Mode.SERVER_BASED, mediaList + refMedia))
.build()
.run(object : AnalysisRequest.AnalysisListener {
override fun onSuccess(result: List<OzAnalysisResult>) {
result.forEach {
println(it.resolution.name)
println(it.folderId)
}
}
override fun onError(error: OzException) {
error.printStackTrace()
}
})
} private void analyzeMedia(List<OzAbstractMedia> mediaList) {
File refFile = new File(context.getFilesDir(), "reference.jpg");
OzAbstractMedia refMedia = new OzAbstractMedia.OzDocumentPhoto(
OzMediaTag.PhotoIdFront, // OzMediaTag.PhotoSelfie for a non-ID photo
refFile.getAbsolutePath()
);
ArrayList<OzAbstractMedia> mediaWithReferencePhoto = new ArrayList<>(mediaList);
mediaWithReferencePhoto.add(refMedia);
new AnalysisRequest.Builder()
.addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList, Collections.emptyMap()))
.addAnalysis(new Analysis(Analysis.Type.BIOMETRY, Analysis.Mode.SERVER_BASED, mediaWithReferencePhoto, Collections.emptyMap()))
.build()
.run(new AnalysisRequest.AnalysisListener() {
@Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
@Override
public void onSuccess(@NonNull List<OzAnalysisResult> list) {
String folderId = list.get(0).getFolderId();
}
@Override
public void onError(@NonNull OzException e) { e.printStackTrace(); }
});
}

OzLiveness.open({
lang: 'en',
action: [
'photo_id_front',
'video_selfie_blank'
],
...
});

let imageURL = URL(fileURLWithPath: NSTemporaryDirectory())
.appendingPathComponent("reference.jpg")
let refMedia = OZMedia.init(movement: .selfie,
mediaType: .movement,
metaData: nil,
videoURL: nil,
bestShotURL: imageURL,
preferredMediaURL: nil,
timestamp: Date())
var mediaBiometry = [OZMedia]()
mediaBiometry.append(refMedia)
mediaBiometry.append(contentsOf: mediaToAnalyze)
let analysisRequest = AnalysisRequestBuilder()
let analysisBiometry = Analysis.init(media: mediaBiometry, type: .biometry, mode: .serverBased)
let analysisQuality = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased)
analysisRequest.addAnalysis(analysisBiometry)
analysisRequest.addAnalysis(analysisQuality)
analysisRequest.uploadMedia(mediaBiometry)
analysisRequest.run(
scenarioStateHandler: { state in }, // scenario steps progress handler
uploadProgressHandler: { (progress) in } // file upload progress handler
) { (analysisResults : [OzAnalysisResult], error) in
// receive and handle analyses results here
for result in analysisResults {
print(result.resolution)
print(result.folderID)
}
}