
EN: Oz Knowledge Base

Oz Liveness and Biometry Key Concepts

Oz Forensics specializes in liveness and face matching: we develop products that help you identify your clients remotely and prevent any kind of spoofing or deepfake attack. Oz software lets you add facial recognition to your own systems and products, and you can integrate Oz modules in many ways depending on your needs. We are constantly improving our components and their quality.

Oz Liveness

  • Oz Liveness is responsible for recognizing a living person in the video it receives. Oz Liveness can distinguish a real human from their photo, video, mask, or other kinds of spoofing and deepfake attacks. The accuracy of our algorithms is confirmed by BixeLab, an independent biometric testing laboratory.

Here are the confirmation letters from the laboratory:

Oz Liveness is also certified by iBeta, a NIST-accredited biometric testing laboratory, with 100% accuracy (iBeta Level 1 and Level 2).

Our liveness technology protects both against injection and presentation attacks.

Injection attack detection is layered. Our SDK examines the user environment (browser, camera, etc.) to detect potential manipulations. A deep neural network then comes into play to defend against even the most sophisticated injection attacks.

Presentation attack detection is based on deep neural networks of various architectures, combined with a proprietary ensembling algorithm to achieve optimal performance. The networks consider multiple factors, including reflection, focus, background scene, motion patterns, etc. We offer both passive (no gestures) and active (various gestures) Liveness options, ensuring that your customers enjoy the user experience while you get accurate results. The iBeta test was conducted using passive Liveness, and since then, we have significantly enhanced our networks to better meet the needs of our clients.

Oz Face Matching (Biometry)

  • Oz Face Matching (Biometry) aims to identify the person, verifying that the person who performs the check and the document's owner are the same person. Oz Biometry looks through the video, finds the best shot where the person is clearly visible, and compares it with the photo from an ID or another document. The algorithm's accuracy is 99.99%, as confirmed by NIST FRVT.

Our biometry technology offers both 1:1 Face Verification and 1:N Face Identification, which are also based on ML algorithms. To train our neural networks, we use our own framework built on state-of-the-art technologies. A large private dataset (over 4.5 million unique faces) with wide representation of ethnic groups, along with other attributes (predicted race, age, etc.), helps our biometric models provide robust matching scores.

Our face detector can work with photos and videos. Also, the face detector excels in detecting faces in images of IDs and passports (which can be rotated or of low quality).

The Oz software combines accuracy in analysis with ease of integration and use. To further simplify the integration process, we have provided a detailed description of all the key concepts of our system in this section. If you're ready to get started, please refer to our integration guides, which provide step-by-step instructions on how to achieve your facial recognition goals quickly and easily.

Oz Forensics adheres to SOC 2® and ISO/IEC 27001:2022 standards.

Attachments: Confirmation Letter PAD.pdf (241 KB), Confirmation Letter IAD.pdf (259 KB).

Solution Architecture

This article describes Oz components that can be integrated into your infrastructure in various combinations depending on your needs.

The typical integration scenarios are described in the Integration Quick Start Guides section.

Oz API

Oz API is the central component of the system. It provides a RESTful application programming interface to the core Liveness and Face Matching analyses, along with many important supplemental features:

  • Persistence: your media and analyses are stored for future reference unless you explicitly delete them,

  • Authentication, roles and access management,

  • Asynchronous analyses,

For more information, please refer to Oz API Key Concepts and the Oz API Developer Guide. To test Oz API, please check the Postman collection.

Under the logical hood, Oz API has the following components:

  • File storage and database where media, analyses, and other data are stored,

  • The Oz BIO module that runs neural network models to perform facial biometry magic,

  • Licensing logic.

The front-end components (Oz Liveness Mobile or Web SDK) connect to Oz API to perform server-side analyses, either directly or via the customer's back end.

iOS and Android SDK

iOS and Android SDKs are collectively referred to as Mobile SDKs or Native SDKs. They are written in Swift and Kotlin/Java, respectively, and designed to be integrated into your native mobile application.

Mobile SDKs implement the out-of-the-box customizable user interface for capturing Liveness video and ensure that the two main objectives are met:

  • The capture process is smooth for users,

  • The quality of a video is optimal for the subsequent Liveness analysis.

After Liveness video is recorded and available to your mobile application, you can run the server-side analysis. You can use corresponding SDK methods, call the API directly from your mobile application, or pass the media to your backend and interact with Oz API from there.

The basic integration option is described in the Integration Quick Start Guide.

Mobile SDKs are also capable of On-device Liveness and Face matching. On-device analyses may be a good option in low-risk contexts, or when you don't want the media to leave the users' smartphones. Oz API is not required for On-device analyses. To learn how it works, please refer to the On-Device Liveness integration guide.


We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results.

Web SDK

Web Adapter and Web Plugin together constitute Web SDK.

Web SDK is designed to be integrated into your web applications and has the same main goals as the Mobile SDKs:

  • The capture process is smooth for users,

  • The quality of a video is optimal for the subsequent Liveness analysis.

Web Adapter needs to be set up on the server side. Web Plugin is called by your web application and works in the browser context. It communicates with Web Adapter, which, in turn, communicates with Oz API.

Web SDK adds the two-layer protection against injection attacks:

  1. Collects information about browser context and camera properties to detect usage of virtual cameras or other injection methods.

  2. Records liveness video in a format that allows server-side neural networks to search for traces of injection attack in the video itself.

Check the Integration Quick Start Guide for the basic integration scenario, and explore the Web SDK Developer Guide for more details.

Web UI (Web Console)

Web UI is a convenient user interface that lets you easily explore the stored API data. It relies on the API's authentication and database and does not store any data on its own.

The Web Console has an intuitive interface; a user guide is available as well.

Liveness, Face Matching, Collection Checks

This article describes the main types of analyses that Oz software is able to perform.

  • Liveness checks whether a person in a media is a real human.

  • Face Matching examines two or more media to identify similarities between the faces depicted in them.

  • Collection looks for resemblances between an individual featured in a media and individuals in a pre-existing photo database.

These analyses are accessible in the Oz API for both SaaS and On-Premise models. Liveness and Face Matching are also offered in the On-Device model. Please visit this page to learn more about the usage models.

hashtag
Liveness

The Liveness check is important to protect facial recognition from the two types of attacks.

A presentation attack, also known as a spoofing attack, is an attempt to deceive a facial recognition system by presenting to the camera a video, photo, or any other type of media that mimics the appearance of a genuine user. Such attacks can include the use of realistic masks or makeup.

An injection attack is an attempt to deceive a facial recognition system by replacing the physical camera input with a prerecorded image or video, manipulating the physical camera output before it becomes input to facial recognition, or injecting malicious code. Virtual camera software is the most common tool for injection attacks.

Oz Liveness is able to detect both types of attacks. Any component can detect presentation attacks; for injection attack detection, use the Oz Liveness SDK. To learn how to use Oz components to prevent attacks, check our integration quick start guides:

  • How to integrate server-based liveness into your web application,

  • How to integrate server-based liveness into your mobile application,

  • How to check your media for liveness without Oz front end.

Once the Liveness check is finished, you can check both qualitative and quantitative analysis results.

Results overview

Qualitative results

  • SUCCESS – everything went fine, the analysis has completed successfully;

  • DECLINED – the check failed (an attack detected).

If the analysis hasn't been finished yet, the result can be PROCESSING (the analysis is in progress) / FAILED (the analysis failed due to some error and couldn't get finished).

If you have analyzed multiple media, the aggregated status will be SUCCESS only if each analysis on each media has finished with the SUCCESS result.
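This aggregation rule can be sketched as follows (an illustrative snippet, not actual Oz API code):

```python
def aggregate_status(statuses):
    """Aggregate per-media analysis results into one overall status.

    The overall status is SUCCESS only if every analysis on every
    media finished with SUCCESS; otherwise the whole set is DECLINED.
    """
    if statuses and all(s == "SUCCESS" for s in statuses):
        return "SUCCESS"
    return "DECLINED"

print(aggregate_status(["SUCCESS", "SUCCESS"]))   # SUCCESS
print(aggregate_status(["SUCCESS", "DECLINED"]))  # DECLINED
```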

Quantitative results

  • 100% (1) – an attack is detected, the person in the video is not a real living person,

  • 0% (0) – a person in the video is a real living person.

Asking users to perform a gesture, such as smiling or turning their head, is a popular requirement when recording a Liveness video. With Oz Liveness Mobile and Web SDK, you can also request gestures from users. However, our Liveness check relies on other factors, analyzed by neural networks, and does not depend on gestures. For more details, please check Passive and Active Liveness.

The Liveness check can also return the best shot from a video: the best-quality frame, in which the face is most clearly visible.

Face Matching

The Biometry algorithm lets you compare several media and check whether the people in them are the same person. As sources, you can use images, videos, and scans of documents (with a photo). To perform the analysis, the algorithm requires at least two media.

Results overview

Qualitative results

  • SUCCESS – everything went fine, the analysis has completed successfully;

  • DECLINED – the check failed (faces don't match).

If the analysis hasn't been finished yet, the result can be PROCESSING (the analysis is in progress) / FAILED (the analysis failed due to some error and couldn't get finished).

Quantitative results

After comparison, the algorithm provides numbers that represent the similarity level. The numbers vary from 0 to 100% (0 to 1), where:

  • 100% (1) – faces are similar, media represent the same person,

  • 0% (0) – faces are not similar and belong to different people.

There are two scores to consider: the minimum and maximum. If you have analyzed two media, these scores will be equal. For three or more media, the similarity score is calculated for each pair. Once calculated, these scores get aggregated and analysis returns the minimum and maximum similarity scores for the media compared. Typically, the minimum score is enough.
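The pairwise aggregation described above can be sketched like this (illustrative code; `pairwise_score` is a stand-in for the real comparison algorithm):

```python
from itertools import combinations

def aggregate_similarity(pairwise_score, media):
    """Return (min, max) similarity over every pair of media.

    With exactly two media there is a single pair, so min == max;
    with three or more, the scores of all pairs are aggregated.
    """
    scores = [pairwise_score(a, b) for a, b in combinations(media, 2)]
    return min(scores), max(scores)

# Example with three media and stubbed pairwise scores:
stub = {frozenset({"selfie", "id"}): 0.99,
        frozenset({"selfie", "passport"}): 0.95,
        frozenset({"id", "passport"}): 0.80}
lo, hi = aggregate_similarity(lambda a, b: stub[frozenset({a, b})],
                              ["selfie", "id", "passport"])
print(lo, hi)  # 0.8 0.99
```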

Wondering how to integrate face matching into your processes? Check our integration quick start guides.

Collection

In Oz API, you can configure one or more face collections: databases of people depicted in photos. During the Collection analysis, Oz software compares the face in the photo or video taken with the faces in this pre-made database and shows whether the face exists in the collection.

Results overview

Qualitative results

  • SUCCESS – everything went fine, the analysis has completed successfully;

  • DECLINED – the check failed (faces match).

If the analysis hasn't been finished yet, the result can be PROCESSING (the analysis is in progress) / FAILED (the analysis failed due to some error and couldn't get finished).

Quantitative results

After comparison, the algorithm provides a score that represents the similarity level. The number varies from 0 to 100% (0 to 1), where:

  • 100% (1) – the person in an image or video matches with someone in your database,

  • 0% (0) – the person is not found in the collection.
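In practice, you compare the returned score against a decision threshold. A minimal sketch (the threshold value below is purely illustrative, not an Oz default):

```python
def found_in_collection(score, threshold=0.85):
    """Interpret the Collection similarity score (0..1).

    Scores near 1 mean the person matches someone in the database;
    scores near 0 mean the person was not found.
    """
    return score >= threshold

print(found_in_collection(0.99))  # True
print(found_in_collection(0.10))  # False
```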

For additional information, please refer to this article.


Developer Guide

In this section, you will find the description of both the API and SDK components of the Oz Forensics Liveness and Face Biometric System. API is the backend component of the system; it enables all the system modules to interact with each other. SDK is the frontend component that is used to:

1) take videos or images which are then processed via API,

2) display results.

We provide two versions of the API.

The full version gives you all the functionality of Oz API.

Oz API

The Lite version is simple and lightweight, with only the necessary functions included.

Oz API Lite

The SDK component consists of web SDK and mobile SDK.

Web SDK is a plugin that you can embed into your website page, plus the adapter for this plugin.

Mobile SDK comprises the SDKs for iOS and Android.

Oz Liveness Web SDK
iOS
Android

Integration Quick Start Guides

This section contains the most common cases of integrating the Oz Forensics Liveness and Face Biometry system.

The scenarios can be combined together, for example, integrating liveness into both web and mobile applications or integrating liveness with face matching.

Server-Based Liveness

  • How to Integrate Server-Based Liveness into Your Web Application
  • How to Integrate Server-Based Liveness into Your Mobile Application
  • How to Check Your Media for Liveness without Oz Front End

On-Device Liveness

  • How to Integrate On-Device Liveness into Your Mobile Application

Face Matching

  • How to Add Face Matching of Liveness Video with a Reference Photo From Your Database
  • How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application

Oz API Lite Postman Collection

Download and install the Postman client from this page. Then download the JSON file needed:

  • 1.2.0 – Oz API Lite 1.2.0.json (11 KB)
  • 1.1.1 – Oz API Lite 1.1.1.postman_collection.json (8 KB)

How to Import a Postman Collection

Launch the client and import the Oz API Lite collection for Postman by clicking the Import button.

Click files, locate the JSON needed, and hit Open to add it.

The collection will be imported and will appear in the Postman interface.

Passive and Active Liveness

This article describes how passive and active Liveness work.

The objective of the Liveness check is to verify the authenticity and physical presence of an individual in front of the camera. In the passive Liveness check, it is sufficient to capture a user's face while they look into the camera. Conversely, the active Liveness check requires the user to perform an action such as smiling, blinking, or turning their head. While passive Liveness is more user-friendly, active Liveness may be necessary in some situations to confirm that the user is aware of undergoing the Liveness check.

In our Mobile or Web SDKs, you can define what action the user is required to do. You can also combine several actions into a sequence. Actions vary in the following dimensions:

  • User experience,

  • File size,

  • Liveness check accuracy,

  • Suitability for review by a human operator or in court.

In most cases, the Selfie action is optimal, but you can choose other actions based on your specific needs. Here is a summary of available actions:

Passive Liveness

  • Selfie

A short video, around 0.7 sec. Users are not required to do anything. Recommended for most cases, as it offers the best combination of user experience and Liveness check accuracy.

  • One shot

Similar to "Simple selfie", but only one image is chosen instead of the whole video. Recommended when media size is the most important factor. Hard to evaluate for spoofing by a human, e.g., an operator or a court. We recommend avoiding the One shot gesture whenever possible, as it tends to produce less accurate results.

  • Scan

A 5-second video where the user is asked to follow the on-screen text with their eyes. Recommended when a longer video is required, e.g., for subsequent review by a human operator or in court.

Active Liveness

  • Smile

  • Blink

  • Tilt head up

  • Tilt head down

  • Turn head left

  • Turn head right

A user is required to complete a particular gesture within 5 seconds. Use active Liveness when you need confirmation that the user is aware of undergoing a Liveness check. Video length and file size may vary depending on how soon a user completes the gesture.

To recognize the actions from either passive or active Liveness, our algorithms refer to the corresponding tags. These tags indicate the type of action that a user is performing within a media. For more information, please read the Media Tags article, which also details how the actions (in other words, gestures) are named in different Oz Liveness components.

SaaS, On-premise, On-device: What to Choose

We offer different usage models for the Oz software to meet your specific needs. You can either use the software as a service from one of our cloud instances or integrate it into your existing infrastructure. Regardless of the usage model you choose, all Oz modules function identically; which one to pick depends only on your needs.

When to Choose SaaS

With the SaaS model, you access one of our clouds without having to install our software in your own infrastructure.

Choose SaaS when you want:

  • A faster start, as you don't need to procure or allocate hardware within your company and set up a new instance.

  • Zero infrastructure cost, as server components are located in the Oz cloud.

  • Lower maintenance cost, as Oz maintains and upgrades the server components.

  • No cross-border data transfer for the regions where Oz has cloud instances.

When to Choose On-Premise

The on-premise model implies that all the required Oz components are installed within your infrastructure. Choose on-premise for:

  • Your data not leaving your infrastructure.

  • Full and detailed control over the configuration.

Minimum hardware requirements:

  • Oz Biometry / Liveness Server

    • OS: please check the supported versions with our team

    • CPU: 16 cores

    • RAM: 32 GB

    • Disk: 80 GB

  • Oz API / Web UI / Web SDK Server

    • OS: please check the supported versions with our team

    • CPU: 8 cores

    • RAM: 16 GB

    • Disk: 300 GB

When to Choose On-Device

We also provide the option of using on-device Liveness and Face matching. This model is available in Mobile SDKs.

Consider the on-device option when:

  • You can't transmit facial images to any server due to privacy concerns.

  • The network conditions under which you plan to use Oz products are extremely poor.

We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results.

The choice is yours to make, but we're always available to provide assistance.

    Oz Licensing Options

    For commercial use of Oz Forensics products, a license is required. The license is time-limited and defines the software access parameters based on the terms of your agreement.

    Once you initialize mobile SDK, run Web Plugin, or use Oz Bio, the system checks if your license is valid. The check runs in the background and has minimal impact on the user experience.

The license is required for:

    • Mobile SDKs for iOS and Android,

    • Web SDK, which consists of Web Adapter and Web Plugin,

    • Oz BIO, which is needed for server analyses and is installed for the On-Premise model of use.

For each of these components, you need a separate license bound to that component. Thus, if you use all three components, three licenses are required.

    Native SDKs (iOS and Android)

    To issue a license for mobile SDK, we require your bundle (application) ID. There are two types of licenses for iOS and Android SDKs: online and offline. Any license type can be applied to any analysis mode: on-device, server-based, or hybrid.

    Online License

As its name suggests, an online license requires a stable connection. Once you initialize our SDK with this license, it connects to our license server and retrieves the license parameters, including counters of transactions or devices, where:

    • Transaction: increments each time you start a video capture.

    • Device: increments when our SDK is installed on a new device.

    The online license can be transaction-limited, device-limited, or both, according to your agreement.
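The two counters behave roughly like this (a toy model for illustration only, not actual license-server logic):

```python
class OnlineLicenseCounters:
    """Toy model of the online-license counters described above."""

    def __init__(self):
        self.transactions = 0
        self.devices = set()

    def install_on(self, device_id):
        # Device counter: grows only when the SDK lands on a new device.
        self.devices.add(device_id)

    def start_video_capture(self):
        # Transaction counter: increments each time a video capture starts.
        self.transactions += 1

c = OnlineLicenseCounters()
c.install_on("phone-a")
c.install_on("phone-a")   # same device, counted once
c.install_on("phone-b")
for _ in range(3):
    c.start_video_capture()
print(c.transactions, len(c.devices))  # 3 2
```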

    The main advantages of the online license are:

    • You don’t need to update your application after the license renewal,

    • And if you want to add a new bundle ID to the license, there's no need to re-issue it either. Everything is done on the fly.

The data exchange for the online license is quick, so your users will hardly experience any delay compared to using the offline license.

    Please note that even though on-device analyses don’t need the Internet themselves, you still require a connection for license verification.


    Online license is the default option for Mobile SDKs. If you require the offline license, please inform your manager.

    hashtag
    Offline License

The offline license is a type of license that can work without the Internet. All license parameters are set in the license file, and you just need to add the file to your project. This license type doesn't have any restrictions on transactions or devices.

The main benefit of the offline license is its autonomy: it functions without a network connection. However, when your license expires and you add a new one, you'll need to release a new version of your application in Google Play and the App Store. Otherwise, the SDK won't function.

Instructions on adding a license to the mobile SDK are available for Android and iOS.

    Web SDK

The Web SDK license is similar to the mobile SDK offline license. It can function without a network connection, and the license file contains all the necessary parameters, such as the expiration date. The Web SDK license also has no restrictions on transactions or devices.

The license is bound to the URLs of your domains and/or subdomains. To add the license to your SDK instance, place it into the Web SDK container as described in the documentation. In rare cases, it is also possible to add a license via Web Plugin.

    The difference between Mobile SDK offline license and Web SDK license is that you don’t need to release a new application version when Web SDK license is renewed.

    On-Premise (Oz BIO or Server License)

For on-premise installations, we offer a dedicated license with a limitation on activations, with each activation representing a separate Oz BIO seat. This license can be online or offline, depending on whether your Oz BIO servers have internet access. The online license is verified through our license server, while for offline licenses, we assist you in setting up an offline license server within your infrastructure and activating the license.

    Trial

For test integration purposes, we provide a free trial license that is sufficient for initial use, such as testing with your datasets to check analysis accuracy. For Mobile SDKs, you can generate a one-month license yourself on our website. If you would like to integrate with your web application, please contact us to obtain a license, and we will also assist you in configuring your dedicated instance of our Web SDK. With the license, you will receive credentials to access our services.

    Once you're ready to move to commercial use, a new production license will be issued. We’ll provide you with new production credentials and assist you with integration and configuration. Our engineers are always available to help.

    Our software offers flexible licensing options to meet your specific needs. Whether you prioritize seamless updates or prefer autonomous operation, we have a solution tailored for you. If you have any questions, please contact us.



    Adding the Plugin to Your Web Page

    Requirements

    A dedicated Web Adapter in our cloud or the adapter deployed on-premise. The adapter's URL is required for adding the plugin.

    Processing Steps

To embed the plugin in your page, add a reference to the primary script of the plugin (plugin_liveness.php), which you received from Oz Forensics, to the HTML code of the page. Here, web-sdk-root-url is the Web Adapter link you've received from us.

For versions below 1.4.0

    Add a reference to the file with styles and to the primary script of the plugin (plugin_liveness.php) that you received from Oz Forensics to the HTML code of the page. web-sdk-root-url is the Web Adapter link you've received from us.


For Angular and Vue, the script (and the styles file, if you use a version lower than 1.4.0) should be added in the same way. For React apps, load and initialize the OzLiveness plugin in the head of your template's main page. Please note: if you use <React.StrictMode>, you may experience issues with Web Liveness.

    On-Device Liveness

    In this section, there's a guide for the integration of the on-device liveness check.

How to Integrate On-Device Liveness into Your Mobile Application

    We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results.

    Server-Based Liveness

    In this section, we listed the guides for the server-based liveness check integrations.

How to Integrate Server-Based Liveness into Your Web Application
How to Integrate Server-Based Liveness into Your Mobile Application


For version 1.4.0 and above, the reference looks as follows:

```html
<script src="https://web-sdk-root-url/plugin_liveness.php"></script>
```

For versions below 1.4.0, add the styles file as well:

```html
<link rel="stylesheet" href="https://web-sdk-root-url/plugin/ozliveness.css" />
<script src="https://web-sdk-root-url/plugin_liveness.php"></script>
```

    Flutter

    In this section, we explain how to use Oz Flutter SDK for iOS and Android.

    Before you start, it is recommended that you install:

    • Flutter 3.0.0 or higher;

    • Android SDK 21 or higher;

    • Dart 2.18.6 or higher;

    • iOS platform 13 or higher;

    • Xcode.

Please find the Flutter repository here.

    Oz Mobile SDK (iOS, Android, Flutter)

Oz Mobile SDK is the software development kit of the Oz Forensics Liveness and Face Biometric System, providing seamless integration with customers' mobile apps for login and biometric identification.

Android
iOS

    Currently, both Android and iOS SDK work in the portrait mode.

Flutter

    Android Localization: Adding a Custom or Updating an Existing Language Pack

    Please note: this feature has been implemented in 8.1.0.

    To add or update the language pack for Oz Android SDK, please follow these instructions:


    The localization record consists of the localization key and its string value, e.g., <string name="about">"About"</string>.

1. Go to the folder for the locale needed, or create a new folder. Proceed to this guide for details.

    2. Create the file called strings.xml.

    3. Copy the strings from the attached file to your freshly created file.

    4. Redefine the strings you need in the appropriate localization records.

A list of keys for Android is provided in the attached file (Oz_SDK_Android_Strings.zip).

    The keys action_*_go refer to the appropriate gestures. Others refer to the hints for any gesture, info messages, or errors.

    When new keys appear with new versions, if no translation is provided in your file, the new strings are shown in English.
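For illustration, a redefined strings.xml for a German locale might look like the snippet below. The `about` key is taken from the example above; `action_smile_go` follows the `action_*_go` pattern mentioned, but the exact key names should be taken from the attached file.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/values-de/strings.xml: German overrides (illustrative keys) -->
<resources>
    <string name="about">"Über"</string>
    <string name="action_smile_go">"Bitte lächeln"</string>
</resources>
```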

    Instant API: Non-Persistent Mode

Instant API, or non-persistent operation mode, was introduced in API 6.0.1. In this mode, we do not save any data anywhere. All data is used only within a request: you send it and receive the response, and nothing gets recorded. This ensures you do not store any sensitive data, which might be crucial for GDPR compliance. It also significantly reduces storage requirements.

    To enable this mode, when you prepare the config.py file to run the API, set the OZ_APP_COMPONENTS parameter to stateless. Call POST /api/instant/folders/ to send the request without saving any data. Authorization for Instant API should be set on your side.
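As a sketch, a non-persistent request might be built like this. The endpoint path and the stateless setting are from the description above; the host, multipart field name, and auth header are placeholders that depend on your deployment:

```python
import urllib.request

API_HOST = "https://oz-api.example.com"  # placeholder host

def build_instant_request(media, filename, auth_header):
    """Build a POST /api/instant/folders/ request carrying the media inline.

    Nothing is persisted server-side in Instant API mode: the media
    travels with the request and only the response comes back.
    """
    boundary = "oz-instant-example"
    body = (
        (f"--{boundary}\r\n"
         f'Content-Disposition: form-data; name="media"; filename="{filename}"\r\n'
         "Content-Type: application/octet-stream\r\n\r\n").encode()
        + media
        + f"\r\n--{boundary}--\r\n".encode()
    )
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}", **auth_header}
    return urllib.request.Request(
        API_HOST + "/api/instant/folders/", data=body, headers=headers, method="POST"
    )

# Build (but do not send) a request; authorization is set on your side.
req = build_instant_request(b"\x00\x01", "liveness.webm", {"Authorization": "Bearer <token>"})
print(req.get_method(), req.full_url)
```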


    Connecting SDK to API

To connect the SDK to Oz API, specify the API URL and the access token as shown below.


Please note: in your host application, it is recommended that you set the API address on the screen that precedes the liveness check. Setting the API URL initiates a service call to the API, which may cause excessive server load if done at application initialization or startup.

    Alternatively, you can use the login and password provided by your Oz Forensics account manager:

By default, logs are saved along with the analyses' data. If you need to keep the logs distinct from the analysis data, set up a separate connection for them.

    Oz API

    Oz API is the most important component of the system. It makes sure all other components are connected with each other. Oz API:

    • provides the unified Rest API interface to run the Liveness and Biometry analyses,

    • processes authorization and user permissions management,

    Description of the on_error Callback

    This callback is called when the system encounters any error. It contains the error details and telemetry ID that you can use for further investigation.

    Authentication and Non-Instant Data Handling

    In API 6.0, we've implemented new analysis modes:

    Basic Scenarios

    Liveness is checking that a person on a video is a real living person.

    Biometry compares two or more faces from different media files and shows whether the faces belong to the same person or not.

    Best shot is an addition to the Liveness check. The system chooses the best frame from a video and saves it as a picture for later use.

    Blacklist checks whether a face on a photo or a video matches with one of the faces in the pre-created database.

    Attachment: Oz_SDK_Android_Strings.zip (15 KB)

    • tracks and records requested orders and analyses to the database (for example, in 6.0, the database size for 100 000 orders with analyses is ~4 GB),

  • archives the inbound media files,

  • collects telemetry from connected mobile apps,

  • provides settings for specific device models,

  • generates reports with analyses' results.

  • For the latest API methods collection, please refer to our API reference.

    In API 6.0, we introduced two new operation modes: Instant API and single request.

    In the Instant API mode – also known as non-persistent – no data is stored at any point. You send a request, receive the result, and can be confident that nothing is saved. This mode is ideal for handling sensitive data and helps ensure GDPR compliance. Additionally, it reduces storage requirements on your side.

    Single request mode allows you to send all media along with the analysis request in one call and receive the results in the same response. This removes the need for multiple API calls – one is sufficient. However, if needed, you can still use the multi-request mode.

    on_error { 
        "code": "error_code", 
        "event_session_id": "id_of_telemetry_session_with_error", 
        "message": "<error description>", 
        "context": {}  // additional information if any
    }
    Please note: as Instant API doesn't store data, it is not intended to work with Blacklist (1:N).

    If you use Instant API with Web SDK, in Web Adapter configuration, set architecture to lite. The version of Web SDK should be 1.7.14 or above.

    hashtag
    Requirements

    • CPU: 16 cores, 32 threads, base frequency – 2.3 GHz, single-core maximum turbo frequency – 4 GHz.

    • RAM: 32 GB, DDR 5, Dual Channel.

    To evaluate your RPS and RPM and configure your system for optimal performance, please contact us.

    hashtag
    Configuration File Parameters

    Prior to the launch, prepare a configuration file with the parameters listed below.

    hashtag
    Mandatory Parameters

    These parameters are crucial to run Instant API.

    hashtag
    Installation

    hashtag
    Docker

    hashtag
    Docker Compose

    hashtag
    Instant API Methods

    You can find the Instant API methods here or download the collection below.

    Attachment: OZ-Forensic Instant 6.0.0.postman_collection.json (10 KB)
    as shown below:
    OZSDK.setApiConnection(Connection.fromServiceToken(host: "https://sandbox.ohio.ozforensics.com", token: token)) { (token, error) in
    }
    OZSDK.setApiConnection(Connection.fromCredentials(host: "https://sandbox.ohio.ozforensics.com", login: login, password: p)) { (token, error) in
        // Your code to handle error or token
    }
    let eventsConnection = Connection.fromCredentials(host: "https://echo.cdn.ozforensics.com/",
                                    login: "<your_telemetry_login>",
                                    password: "your_telemetry_password")
    OZSDK.setEventsConnection(eventsConnection) { (token, error) in
    }
    # application components list, values for Instant API: auth,stateless
    # auth is for Oz authentication component
    OZ_APP_COMPONENTS=stateless
    # local storage support enable
    OZ_LOCAL_STORAGE_SUPPORT_ENABLE=false
    # service tfss host
    OZ_SERVICE_TFSS_HOST=http://xxx.xxx.xxx.xxx:xxxx
    # allowed hosts
    APP_ALLOWED_HOSTS=example-host1.com,example-host2.com
    # secret key
    OZ_API_SECRET_KEY=long_secret_key
    CONTAINER_NAME=<container name>
    DEPLOY_INSTANT_PORT_EXT=<external port>
    INSTANT_IMAGE=<provided image name>
    ADDITIONAL_PARAMS="-e LICENSE_KEY=<your license key>"
    
    docker run -d --name $CONTAINER_NAME \
          $ADDITIONAL_PARAMS \
          -p ${DEPLOY_INSTANT_PORT_EXT}:8080 \
          $INSTANT_IMAGE
    services:
      oz-api-instant:
        image: <provided image>
        container_name: oz-api-instant
        environment:
            - LICENSE_KEY=<your license key>
            # - TF_ENABLE_ONEDNN_OPTS=1 # In some cases, especially for AMD CPUs, set to 0
            # - API_LISTEN_PORT=8080
            # - LOG_LEVEL=info # ['critical', 'error', 'warning', 'info', 'debug', 'trace']
        restart: always
        ports:
          - 8080:8080

    iOS

    To start using Oz iOS SDK, follow the steps below.

    1. Embed Oz iOS SDK into your project as described here.

    2. Get a trial license for SDK on our website or a production license by email. We'll need your bundle id. Add the license to your project as described here.

    3. Connect SDK to API as described here. This step is optional, as this connection is required only when you need to process data on a server.

    4. Capture videos by creating the controller as described here. You'll send them for analysis afterwards.

    5. Upload and analyze the media you've taken at the previous step. The process of checking liveness and face biometry is described here.

    6. If you want to customize the look-and-feel of Oz iOS SDK, please refer to this section.

    hashtag
    Resources

    Minimal iOS version: 11.

    Minimal Xcode version: 16.

    Available languages: EN, ES, HY, KK, KY, TR, PT-BR.

    A sample app source code using the Oz Liveness SDK is located in the GitLab repository:

    Follow the link below to see a list of SDK methods and properties:

    Download the demo app latest build here.

    Collection (1:N) Management in Oz API

    This article describes how to create a collection via API, how to add persons and photos to this collection and how to delete them and the collection itself if you no longer need it. You can do the same in Web console, but this article covers API methods only.

    Collection in Oz API is a database of facial photos that are used for comparison with the face from the captured photo or video via the Collection analysis.

    Person represents a human in the collection. You can upload several photos for a single person.

    hashtag
    How to Create a Collection

    The collection should be created within a company, so you need your company's company_id as a prerequisite.

    If you don't know your ID, call GET /api/companies/?search_text=test, replacing "test" with your company name or its part. Save the company_id you've received.

    Now, create a collection via POST /api/collections/. In the request body, specify the alias for your collection and company_id of your company:

    In a response, you'll get your new collection identifier: collection_id.
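As a minimal sketch, the request body follows the example shown in this article; the host and the HTTP client are placeholders you supply yourself:

```python
import json

def create_collection_body(alias, company_id):
    """Request body for POST /api/collections/."""
    return json.dumps({"alias": alias, "company_id": company_id})

body = create_collection_body("your_collection", "your_company_id")
# Send it with your HTTP client, e.g. POST {host}/api/collections/ with this body
# and your X-Forensics-Access-Token header.
```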

    hashtag
    How to Add a Person or a Photo to a Collection

    To add a new person to your collection, call POST /api/collections/{{collection_id}}/persons/, using the collection_id of the collection needed. In the request body, add a photo or several photos. Mark them with the appropriate tags in the payload:

    The response will contain the person_id which stands for the person identifier within your collection.

    If you want to add a name of the person, in the request payload, add it as metadata:

    To add more photos of the same person, call POST {{host}}/api/collections/{{collection_id}}/persons/{{person_id}}/images/ using the appropriate person_id. The request body should be filled as you did it before with POST /api/collections/{{collection_id}}/persons/.

    To obtain information on all the persons within the single collection, call GET /api/collections/{{collection_id}}/persons/.

    To obtain a list of photos for a single person, call GET /api/collections/{{collection_id}}/persons/{{person_id}}/images/. For each photo, the response will contain person_image_id. You'll need this ID, for instance, if you want to delete the photo.
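Putting the payload pieces together, a request body that adds a person with one tagged photo and a name might be built like this. The field names follow the payload examples in this article; combining them in one body is an illustrative sketch, not a guaranteed schema:

```python
import json

def build_person_body(first_name, last_name):
    """Body for POST /api/collections/{collection_id}/persons/:
    tags the uploaded image and attaches the person's name as metadata."""
    return {
        "media:tags": {
            "image1": ["photo_selfie", "orientation_portrait"],
        },
        "person:meta_data": {
            "person_info": {"first_name": first_name, "last_name": last_name},
        },
    }

body = json.dumps(build_person_body("John", "Doe"))
```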

    hashtag
    How to Remove a Photo or a Person from a Collection

    To delete a person with all their photos, call DELETE /api/collections/{{collection_id}}/persons/{{person_id}} with the appropriate collection and person identifiers. All the photos will be deleted automatically. However, you can't delete a person entity if it has any related analyses, which means the Collection analysis used this photo for comparison and found a match. To delete such a person, you'll need to delete these analyses using DELETE /api/analyses/{{analysis_id}} with analysis_id of the Collection (1:N) analysis.

    To delete all the collection-related analyses, get a list of folders where the Collection analysis has been used: call GET /api/folders/?analyse.type=COLLECTION. For each folder from this list (GET /api/folders/{{folder_id}}/), find the analysis_id of the required analysis, and delete the analysis – DELETE /api/analyses/{{analysis_id}}.
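The cleanup described above can be sketched as a loop. The `api_get`/`api_delete` helpers stand in for your HTTP client, and the response field names (`folder_id`, `analyses`, `type`, `analysis_id`) are assumptions — adapt them to the actual response shape of your API version:

```python
def delete_collection_analyses(api_get, api_delete):
    """Delete every Collection (1:N) analysis so that persons can be removed.

    api_get(path) is assumed to return parsed JSON; api_delete(path) issues
    a DELETE request. Both are placeholders for your HTTP client.
    """
    folders = api_get("/api/folders/?analyse.type=COLLECTION")
    for folder in folders:
        details = api_get(f"/api/folders/{folder['folder_id']}/")
        for analysis in details.get("analyses", []):
            if analysis.get("type") == "COLLECTION":
                api_delete(f"/api/analyses/{analysis['analysis_id']}")
```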

    To delete a single photo of a person, call DELETE /api/collections/{{collection_id}}/persons/{{person_id}}/images/{{media_id}}/ with the collection, person, and image identifiers specified.

    hashtag
    How to Delete a Collection

    Delete the information on all the persons from this collection as described above, then call DELETE /api/collections/{{collection_id}}/ to delete the remaining collection data.

    Adding SDK to a Client’s Mobile App

    hashtag
    CocoaPods

    To integrate OZLivenessSDK into an Xcode project via the CocoaPodsarrow-up-right dependency manager, add the following code to Podfile:

    Version is optional as, by default, the newest version is integrated. However, if necessary, you can find the older version number in Changelog.

    Since 8.1.0, you can also use a simpler code:

    By default, the full version is being installed. It contains both server-based and on-device analysis modes. To install the server-based version only, use the following code:

    For 8.1.0 and higher:

    hashtag
    SPM

    Please note: installation via SPM is available for versions 8.7.0 and above.

    Add the following package dependencies via SPM (if you need a guide on adding package dependencies, please refer to the Apple documentation). OzLivenessSDK is mandatory. If you don't need the on-device analyses, skip the OzLivenessSDKOnDevice file.

    hashtag
    Manual Installation

    You can also add the necessary frameworks to your project manually.

    1. Download the SDK files from https://gitlab.com/oz-forensics/oz-mobile-ios-sdk and add them to your project.

    • OZLivenessSDK.xcframework,

    • OZLivenessSDKResources.bundle,

    • OZLivenessSDKOnDeviceResources.bundle (if you don't need the on-device analyses, skip this file).

    2. Download the TensorFlow framework 2.11.

    3. Make sure that:

    • both xcframeworks are in Target-Build Phases -> Link Binary With Libraries and Target-General -> Frameworks, Libraries, and Embedded Content;

    • the bundle file(s) are in Target-Build Phases -> Copy Bundle Resources.

    Localization: Adding a Custom Language Pack

    The add_lang(lang_id, lang_obj) method allows adding a new or customized language pack.

    Parameters:

    • lang_id: a string value that can be subsequently used as lang parameter for the open() method;

    • lang_obj: an object that includes identifiers of translation strings as keys and translation strings themselves as values.

    A list of language identifiers:

    lang_id
    Language

    An example of usage:

    OzLiveness.add_lang('en', enTranslation), where enTranslation is a JSON object.

    To set the SDK language, when you launch the plugin, specify the language identifier in lang:

    You can check which locales are installed in Web SDK: use the ozLiveness.get_langs() method. If you have added a locale manually, it will also be shown.

    A list of all language identifiers:

    The keys oz_action_*_go refer to the appropriate gestures. oz_tutorial_camera_* – to the hints on how to enable camera in different browsers. Others refer to the hints for any gesture, info messages, or errors.

    Since 1.5.0, if your language pack doesn't include a key, the message for this key will be shown in English.

    Before 1.5.0:

    If your language pack doesn't include a key, the translation for this key won't be shown.

    Checking Liveness and Face Biometry

    circle-info

    hashtag
    If you use our SDK just for capturing videos, omit this step.

    No-Server Licensing

    circle-info

    In most cases, the license is set on the server side (Web Adapter). This article covers the rare case when you use the Web Plugin only.

    To generate the license, we need the domain name of the website where you are going to use Oz Forensics Web SDK, for instance, your-website.com. You can also define subdomains.

    circle-info

    pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => 'VERSION'
    // for the latest version
    pod 'OZLivenessSDK'
    // OR, for the specific version
    // pod 'OZLivenessSDK', '8.22.0'

    en – English

    es – Spanish

    pt-br – Portuguese (Brazilian)

    kz, kk – Kazakh

    Attachment: Web SDK Strings 1.6.0. EN.zip (23 KB)
    hashtag
    OzCapsula (SDK v8.22 and newer)

    At the Capturing Videos step, you've created a data container with all the required information in it, so now just send it for analysis using the addContainer(container) and run methods.

    hashtag
    SDK 8.21 and older

    To check liveness and face biometry, you need to upload media to our system and then analyze them.

    circle-info

    To interpret the results of analyses, please refer to Types of Analyses.

    Here’s an example of performing a check:

    To delete media files after the checks are finished, use the clearActionVideos method.

    hashtag
    Adding Metadata

    To add metadata to a folder, use the addFolderMeta method.

    hashtag
    Extracting the Best Shot

    In the params field of the Analysis structure, you can pass any additional parameters (key + value), for instance, to extract the best shot on the server side.

    hashtag
    Using Media from Another SDK

    To use a media file that is captured with another SDK (not Oz Android SDK), specify the path to it in OzAbstractMedia:

    hashtag
    Adding Media to a Certain Folder

    If you want to add your media to the existing folder, use the setFolderId method:

    To find the origin, open the developer console and run window.origin on the page you are going to embed Oz Web SDK in. At localhost / 127.0.0.1, the license can work without this information.

    Set the license as shown below:

    • With license data:

    • With license path:

    Check whether the license is updated properly.

    Example

    Proceed to your website origin and launch Liveness -> Simple selfie.

    Once the license is added, the system will check its validity on launch.

    OzLiveness.open({
        license: {
            'payload_b64': 'some_payload',
            'signature': 'some_data',
            'enc_public_key': 'some_key'
        },
        ...,
    })
    {
      "alias": "your_collection",
      "company_id": "your_company_id"
    }
    {
        "media:tags": {
            "image1": [
                "photo_selfie",
                "orientation_portrait"
            ]
        }
    }
        "person:meta_data": {
            "person_info": {
                "first_name": "John",
                "middle_name": "Jameson",
                "last_name": "Doe"
            }
        },
    pod 'OZLivenessSDK/Core', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios.git',  :tag => 'VERSION'
    pod 'OZLivenessSDK/Core'
    // OR
    // pod 'OZLivenessSDK/Core', '8.22.0'
    // Editing the button text
    OzLiveness.add_lang('en', {
      action_photo_button: 'Take a photo'
    });
    OzLiveness.open({
        lang: 'es', // the identifier of the needed language
        ...
    });
    analysisCancelable = AnalysisRequest.Builder()
     // mediaToAnalyze is an array of OzAbstractMedia that were captured or otherwise created 
        .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaToAnalyze))// or ON_DEVICE if you want the on-device analysis
        .build()
    //initiating the analyses and setting up a listener
        .run(object : AnalysisRequest.AnalysisListener {
            override fun onStatusChange(status: AnalysisRequest.AnalysisStatus) { handleStatus(status) // or your status handler
            }
            override fun onSuccess(result: RequestResult) {
                handleResults(result) // or your result handler
            }
            override fun onError(error: OzException) { handleError(error) // or your error handler 
            }
        })
    analysisCancelable = new AnalysisRequest.Builder()
    // mediaToAnalyze is an array of OzAbstractMedia that were captured or otherwise created 
            .addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaToAnalyze)) // or ON_DEVICE if you want the on-device analysis
            .build()
    //initiating the analyses and setting up a listener
            .run(new AnalysisRequest.AnalysisListener() { 
                @Override
                public void onSuccess(@NonNull RequestResult list) { handleResults(list); } // or your result handler
                @Override
                public void onError(@NonNull OzException e) { handleError(e); } // or your error handler
                @Override
                public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) { handleStatus(analysisStatus); } // or your status handler
            })
        .addFolderMeta(
            mapOf(
                "key1" to "value1",
                "key2" to "value2"
            )
        )
    .addFolderMeta(Collections.singletonMap("key", "value")) 
    Kotlin
    private fun runAnalysis(container: DataContainer?) {
        if (container == null) return
        AnalysisRequest.Builder()
            .addContainer(container)
            .build()
            .run(
                { result ->
                    val isSuccess = result.analysisResults.all { it.resolution == Resolution.SUCCESS }
                },
                { /* show error */ },
                { /* update status */ },
            )
    }
    mapOf("extract_best_shot" to true)
           val file = File(context.filesDir, "media.mp4") // use context.getExternalFilesDir(null) instead of context.filesDir for external app storage
           val media = OzAbstractMedia.OzVideo(OzMediaTag.VideoSelfieSmile, file.absolutePath)
        .setFolderId(folderId)
    OzLiveness.open({
        licenseUrl: 'https://some_url',
        ...,
    })
    iOS SDK Methods and Properties

    Checking Liveness and Face Biometry

    circle-info

    hashtag
    If you use our SDK just for capturing videos, omit this step.

    hashtag
    OzCapsula (SDK v8.22 and newer)

    At the Capturing Videos step, you've created a data container with all the required information in it, so now just send it for analysis using the addContainer(container) and run methods.

    hashtag
    SDK 8.21 and older

    To check liveness and face biometry, you need to upload media to our system and then analyze them.

    circle-info

    To interpret the results of analyses, please refer to Types of Analyses.

    Below, you'll see the example of performing a check and its description.

    To delete media files after the checks are finished, use the cleanTempDirectory method.

    hashtag
    Adding Metadata

    To add metadata to a folder, use AnalysisRequest.addFolderMeta.

    hashtag
    Extracting the Best Shot

    In the params field of the Analysis structure, you can pass any additional parameters (key + value), for instance, to extract the best shot on the server side.

    hashtag
    Using Media from Another SDK

    To use a media file that is captured with another SDK (not Oz iOS SDK), specify the path to it in the structure (the bestShotURL property):

    hashtag
    Adding Media to a Certain Folder

    If you want to add your media to the existing folder, use the addFolderId method:

    Oz API Key Concepts

    The Oz API is a comprehensive Rest API that enables facial biometrics, allowing for both face matching and liveness checks. This write-up provides an overview of the essential concepts that one should keep in mind while using the Oz API.

    hashtag
    Authentication, Roles, and Access Management

    To ensure security, every Oz API call requires an access token in its HTTP headers. To obtain this token, execute the POST /api/authorize/auth method with login and password provided by us. Pass this token in X-Forensics-Access-Token header in subsequent Oz API calls.

    The authentication guide provides comprehensive details on the authentication process. Kindly refer to it for further information.
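A sketch of the token flow described above. The endpoint and header name come from this article; the shape of the credentials body is an assumption, so consult the authentication guide for the exact schema:

```python
import json

def auth_request(host, login, password):
    """Pieces of the POST /api/authorize/auth call that exchanges credentials
    for an access token. The "credentials" body shape is illustrative."""
    return {
        "url": f"{host}/api/authorize/auth",
        "body": json.dumps({"credentials": {"email": login, "password": password}}),
    }

def with_token(headers, access_token):
    """Every subsequent Oz API call carries the token in X-Forensics-Access-Token."""
    return {**headers, "X-Forensics-Access-Token": access_token}

req = auth_request("https://api.example.com", "login", "password")
headers = with_token({"Content-Type": "application/json"}, "<access_token>")
```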

    Furthermore, the Oz API offers distinct user roles, ranging from CLIENT, who can perform checks and access reports but lacks administrative rights, e.g., deleting folders, to ADMIN, who enjoys nearly unrestricted access to all system objects. For additional information, please consult the description of user roles.

    hashtag
    Persistence

    The unit of work in Oz API is a folder: you can upload interrelated media to a folder, run analyses on them, and check the aggregated result. A folder can contain an unlimited number of media files, and each of them can be the target of several analyses. Also, a single analysis can be performed on a group of media files.

    hashtag
    Media Types and Tags

    Oz API works with photos and videos. A video can be either a regular video container, e.g., MP4 or MOV, or a ZIP archive with a sequence of images. Oz API uses the file MIME type to define whether media is an image, a video, or a shot set.

    It is also important to determine the semantics of the content, e.g., whether an image is a photo of a document or a selfie of a person. This is achieved by using tags. The selection of tags impacts whether specific types of analyses will recognize or ignore particular media files. The most important tags are:

    • photo_id_front – for the front side of a photo ID

    • photo_selfie – for a non-document reference photo

    • video_selfie_blank – for a liveness video recorded beyond Oz Liveness SDK

    The full list of Oz media tags with their explanations and examples can be found here. If a media file is captured using the Oz Liveness SDK, the tags are assigned automatically.

    hashtag
    Asynchronous Analyses

    Since video analysis may take a few seconds, the analyses are performed asynchronously. This implies that you initiate an analysis (/api/folders/{{folder_id}}/analyses/) and then monitor the outcome by polling until processing is complete (/api/analyses/{{analyse_id}} for a single analysis or /api/folders/{{folder_id}}/analyses/ for all of a folder's analyses). Alternatively, there is a webhook option available; an example of using both the polling and webhook options is given in the corresponding guide.
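The polling option can be sketched like this. The `get_analysis` callable stands in for a GET /api/analyses/{analyse_id} request, and the state names are illustrative — check the API reference for the actual terminal states:

```python
import time

def wait_for_analysis(get_analysis, analyse_id, interval=1.0, timeout=60.0):
    """Poll a single analysis until it leaves the processing state
    or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        analysis = get_analysis(analyse_id)
        if analysis["state"] != "PROCESSING":  # illustrative state name
            return analysis
        time.sleep(interval)
    raise TimeoutError(f"analysis {analyse_id} did not finish within {timeout}s")
```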

    These were the key concepts of Oz API. To gain a deeper understanding of its capabilities, please refer to the rest of our developer guide.

    Closing or Hiding the Plugin

    hashtag
    Closing the Plugin

    To force the closing of the plugin window, use the close() method. All requests to server and callback functions (except on_close) within the current session will be aborted.

    Example:

    var session_id = 123;
    
    OzLiveness.open({
      // We pass arbitrary metadata by which we can later identify the session in Oz API
      meta: {
        session_id: session_id 
      },
      // After sending the data, forcibly close the plugin window and independently request the result
      on_submit: function() {
        OzLiveness.close();
        my_result_function(session_id);
      }
    });

    hashtag
    Hiding the Plugin Window without Cancelling the Callbacks

    To hide the plugin window without cancelling the requests for analysis results and user callback functions, call the hide() method. Use this method, for instance, if you want to display your own upload indicator after submitting data.

    An example of usage:

    Description of the on_result Callback

    This callback is called periodically while the analyses are being processed. It retrieves an intermediate result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode configuration parameter.

    circle-info

    Keep in mind that it is more secure to make your back end responsible for the decision logic. You can find more details, including code samples, here.

    hashtag
    Safe

    When result_mode is safe, the on_result callback contains the state of the analysis only:

    or

    triangle-exclamation

    Please note: the options listed below are for testing purposes only. If you require more information than what is available in the Safe mode, please follow the security recommendations.

    hashtag
    Status

    For the status value, the callback contains the state of the analysis, and for each of the analysis types, the name of the type, state, and resolution.

    or

    hashtag
    Folder

    The folder value is almost identical to the status value, with the only difference being that folder_id is added.

    hashtag
    Full

    In this case, you receive the detailed response possibly containing sensitive data. This mode is deprecated; for security reasons, we recommend using the safe mode.

    Proprietary Format: OzCapsula Data Container

    To improve the overall security level of Oz products, we’ve introduced a new proprietary data exchange format that provides improved data confidentiality and integrity: OzCapsula.

    hashtag
    How It Works

    A proprietary data exchange format is a container that safely stores and transmits transaction-related media data.

    When you capture video using Oz SDK, your media is placed into the OzCapsula container along with all required information. The package can be processed only in Oz API due to internal mechanisms, so it is significantly more difficult to access the package contents.

    circle-info

    Please note: we do not disclose specific technical details for security reasons.

    hashtag
    Benefits

    • Integrity and Authenticity Guaranteed. Each container’s content is verified by the API, ensuring data arriving from the user device is genuine, untampered, and fully intact.

    • Full Confidentiality of Internal Data. Multi-layered cryptographic protection keeps all data, metadata, and technical details hidden from unauthorized viewing, extraction, or interpretation.

    • Unified Data Package. All content is stored in a single, secure file, simplifying transmission and enabling consistent, predictable processing.

    hashtag
    Requirements

    Minimal component versions:

    • API: 6.4.1.

    • Web SDK: 1.9.2.

    • Native SDKs: 8.22.

    You will also require a new token: session_token.

    hashtag
    Usage

    circle-info

    Please note: currently, the container works with Instant API only.

    1. Configure SDK and API:

    Rules of Assigning Analyses

    This article covers the default rules of applying analyses.

    Analyses in Oz system can be applied in two ways:

    • manually, for instance, when you choose the Liveness scenario in our demo application;

    • automatically, when you don’t choose anything and just assign all possible analyses (via API or SDK).

    The automatic assignment means that the Oz system decides itself which analyses to apply to media files based on their tags.

    How to Issue a Service Token

    Here’s a step-by-step guide on how to issue a service token in Oz API 5 and 6.


    hashtag
    Step 1

    Authorize using your ADMIN account: {{host}}/api/authorize/auth

    Master License for iOS

    A master license is an offline license that allows using Mobile SDKs with any bundle_id, unlike regular licenses. To get a master license, create a pair of keys as shown below. Email us the public key, and we will email you the master license shortly after that. Your application needs to sign its bundle_id with the private key, and the Mobile SDK checks the signature using the public key from the master license. Master licenses are time-limited.

    hashtag
    Generating Keys

    This section describes the process of creating your private and public keys.

    Metadata

    hashtag
    Overview

    Metadata is any optional data you might need to add to a folder. In the meta_data section, you can include any information you want, simply by providing any number of fields with their values:

    hashtag

    Using a Webhook to Get Results

    The webhook feature simplifies getting analyses' results. Instead of polling after the analyses are launched, add a webhook that will call your website once the results are ready.

    When you create a folder, add the webhook endpoint (resolution_endpoint) into the payload section of your request body:

    You'll receive a notification each time the analyses are completed for this folder. The webhook request will contain information about the folder and its corresponding analyses.
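The payload section described above can be sketched like this; only the `payload` wrapper and the `resolution_endpoint` field come from this article, while the endpoint URL and the surrounding folder-creation call are placeholders:

```python
import json

def folder_creation_body(webhook_url):
    """Payload section carrying the webhook endpoint that Oz API will call
    once the folder's analyses are completed."""
    return json.dumps({"payload": {"resolution_endpoint": webhook_url}})

body = folder_creation_body("https://your-backend.example.com/oz-results")
# Include this body when creating the folder; your endpoint then receives
# the folder and analyses information when processing finishes.
```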

    How to Restore the Previous Design after an Update

    If you want to get back to the previous (up to 6.4.2) versions' design, reset the customization settings of the capture screen and apply the parameters that are listed below.

    Security Recommendations

    hashtag
    Retrieve the analysis response and process it on the back end

    Even though the analysis result is available to the host application via Web Plugin callbacks, it is recommended that the application back end receives it directly from Oz API. All decisions of the further process flow should be made on the back end as well. This eliminates any possibility of malicious manipulation with analysis results within the browser context.

    To find your folder from the back end, you can follow these steps:

    iOS Localization: Adding a Custom or Updating an Existing Language Pack

    Please note: this feature has been implemented in 8.1.0.

    To add or update the language pack for Oz iOS SDK, use the set(languageBundle: Bundle) method. It shows the SDK that you are going to use the non-standard bundle. In OzLocalizationCode, use the custom language (optional).

    circle-info

    The localization record consists of the localization key and its string value, e.g., "about" = "About";

    Browser Compatibility

Please note: for the plugin to work, your browser must support JavaScript ES6 and match one of the versions below or newer.

    Browser
    Version

    Description of the on_complete Callback

This callback is called after the check is completed. It retrieves the analysis result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode setting.

    circle-info

Keep in mind that it is more secure to make your back end responsible for the decision logic. You can find more details, including code samples, in Security Recommendations.

    Oz API Lite

    What is Oz API Lite, when and how to use it.

    triangle-exclamation

API Lite is deprecated and no longer maintained. Its functionality has been added to the full API: see the Oz API section.

Oz API Lite is the lightweight yet powerful version of Oz API. The Lite version is less resource-demanding, faster, and easier to work with. The analyses are made within the API Lite image. As Oz API Lite doesn't include any additional services like statistics or data storage, this version is the one to use when you need high performance.

    hashtag
    Examples of methods

    To check the Liveness processor, call GET /v1/face/liveness/health.

    To check the Biometry processor, call GET /v1/face/pattern/health.

To perform the liveness check for an image, call POST /v1/face/liveness/detect (it takes an image as an input and evaluates the chance of a spoofing attack in this image).

    To compare two faces in two images, call POST /v1/face/pattern/extract_and_compare (it takes two images as an input, derives the biometry templates from these images, and compares them).

    To compare an image with a bunch of images, call POST /v1/face/pattern/extract_and_compare_n.

    For the full list of Oz API Lite methods, please refer to API Methods.
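As a sketch of how such a call could be assembled (the host name and image content type below are placeholders, not values from the documentation — the request is built but not sent):

```python
import urllib.request

# Hypothetical API Lite host; replace with your deployment address.
API_LITE_HOST = "https://api-lite.example.com"

def build_liveness_request(image_bytes: bytes) -> urllib.request.Request:
    """Build (but do not send) a POST /v1/face/liveness/detect request.
    The exact content type expected by your instance may differ."""
    return urllib.request.Request(
        url=f"{API_LITE_HOST}/v1/face/liveness/detect",
        data=image_bytes,
        method="POST",
        headers={"Content-Type": "image/jpeg"},
    )

req = build_liveness_request(b"...image bytes...")
print(req.get_method(), req.full_url)
```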

    Instant API
    – for a liveness video recorded beyond Oz Liveness SDK
  • if a media file is captured using the Oz Liveness SDK, the tags are assigned automatically.


• Built-In Investigation Tools. Every action within the container is logged, giving complete visibility for incident analysis and rapid troubleshooting.

  • Strict, Built-In Access Control. Only authorized systems can open or use the container, preventing misuse, tampering attempts, or unauthorized integration.

  • Ready for High-Volume Workflows. The container is designed to scale effortlessly, supporting any number of transmissions without performance or integration issues.

• Native SDK doesn't require additional configuration; you'll just need to use the new methods.
  • On your backend, obtain a session token as described here to use it on the frontend.

• Using Oz SDK methods, take a video and package it along with all required data into a container:

    • Web SDK

    • Native SDK

  • Send the container to Oz API using Oz SDK or through your backend (optional) and get the results.


    Face Matching

    In this section, we listed the guides for the face matching checks.

    circle-exclamation

    Please note: this section applies to the non-container flow only.

• How to Add Face Matching of Liveness Video with a Reference Photo From Your Database

• How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application

| Browser | Version |
| --- | --- |
| Google Chrome (and other browsers based on the Chromium engine) | 56 |
| Mozilla Firefox | 55 |
| Safari | 11 |
| Microsoft Edge* | 17 |
| Opera | 47 |

*Web SDK doesn't work in Internet Explorer compatibility mode due to lack of important functions.

    OzLiveness.open({
      // When receiving an intermediate result, hide the plugin window and show your own loading indicators
      on_result: function(result) {
        OzLiveness.hide();
        if (result.state === 'processing') {
          show_my_loader();
        }
      },
      on_complete: function() {
        hide_my_loader();
      }
    });
    Payload example
    {    
      "resolution_endpoint": "address.com", // use address of your website here
        ... // other request details - folder etc.
    }
    OzLivenessSDK.config.customization = UICustomization(
        // customization parameters for the toolbar
        toolbarCustomization = ToolbarCustomization(
            closeIconTint = Color.ColorHex("#FFFFFF"),
            backgroundColor = Color.ColorHex("#000000"),
            backgroundAlpha = 100,
        ),
        // customization parameters for the center hint
        centerHintCustomization = CenterHintCustomization(
            verticalPosition = 70
        ),
        // customization parameters for the hint animation
    hintAnimationCustomization = HintAnimation(
            hideAnimation = true
        ),
        // customization parameters for the frame around the user face
        faceFrameCustomization = FaceFrameCustomization(
            strokeDefaultColor = Color.ColorHex("#EC574B"),
            strokeFaceInFrameColor = Color.ColorHex("#00FF00"),
            strokeWidth = 6,
        ),
        // customization parameters for the background outside the frame
        backgroundCustomization = BackgroundCustomization(
            backgroundAlpha = 100
        ),
    )

    Customizing iOS SDK Interface

    To customize the Oz Liveness interface, use OZCustomization as shown below. For the description of customization parameters, please refer to iOS SDK Methods and Properties.

    circle-exclamation

    Please note: the customization methods should be called before the video capturing ones.

    How to Restore the Previous Design after an Update

    If you want to get back to the previous (up to 6.4.2) versions' design, reset the customization settings of the capture screen and apply the parameters that are listed below.

    // customization parameters for the toolbar
    let toolbarCustomization = ToolbarCustomization(
       closeButtonColor: .white,
       backgroundColor: .black)
    
    // customization parameters for the center hint
    let centerHintCustomization = CenterHintCustomization(
       verticalPosition: 70)
       
    // customization parameters for the center hint animation
    let hintAnimationCustomization = HintAnimationCustomization(
        hideAnimation: true)
    
    // customization parameters for the frame around the user face
    let faceFrameCustomization = FaceFrameCustomization(
       strokeWidth: 6,
       strokeFaceAlignedColor: .green,
       strokeFaceNotAlignedColor: .red)
    
    // customization parameters for the background outside the frame
    let backgroundCustomization = BackgroundCustomization(
       backgroundColor: .clear)
    
OZSDK.customization = OZCustomization(toolbarCustomization: toolbarCustomization,
   centerHintCustomization: centerHintCustomization,
   hintAnimationCustomization: hintAnimationCustomization,
   faceFrameCustomization: faceFrameCustomization,
   backgroundCustomization: backgroundCustomization)

    hashtag
    Creating a Private Key

    To create a private key, run the commands below one by one.

    You will get these files:

    • privateKey.der is a private .der key;

• privateKey.txt is privateKey.der encoded in base64. The content of this key will be used to sign the host app bundle_id.

The OpenSSL command specification: https://www.openssl.org/docs/man1.1.1/man1/openssl-pkcs8.html

    hashtag
    Creating a Public Key

    To create a public key, run this command.

    You will get the public key file: publicKey.pub. To get a license, please email us this file. We will email you the license.

    hashtag
    SDK Integration

    SDK initialization:

    License setting:

    Prior to the SDK initializing, create a base64-encoded signature for the host app bundle_id using the private key.

    Signature creation example:

    Pass the signature as the masterLicenseSignature parameter either during the SDK initialization or license setting.

If the signature is invalid, the initialization continues as usual: the SDK checks the list of bundle_id values included in the license, as it does by default without a master license.

    Objects and Methods

    Metadata is available for most Oz system objects. Here is the list of these objects with the API methods required to add metadata. Please note: you can also add metadata to these objects during their creation.

| Object | API Method |
| --- | --- |
| User | PATCH /api/users/{{user_id}} |
| Folder | PATCH /api/folders/{{folder_id}}/meta_data/ |
| Media | PATCH /api/media/{{media_id}}/meta_data |
| Analysis | PATCH /api/analyses/{{analyse_id}}/meta_data |
| Collection | PATCH /api/collections/{{collection_id}}/meta_data/ and, for a person in a collection, PATCH /api/collections/{{collection_id}}/persons/{{person_id}}/meta_data |

    You can also change or delete metadata. Please refer to our API documentationarrow-up-right.

    hashtag
    Usage Examples

    You may want to use metadata to group folders by a person or lead. For example, if you want to calculate conversion when a single lead makes several Liveness attempts, just add the person/lead identifier to the folder metadata.

    Here is how to add the client ID iin to a folder object.

    In the request body, add:

    circle-info

    You can pass an ID of a person in this field, and use this ID to combine requests with the same person and count unique persons (same ID = same person, different IDs = different persons). This ID can be a phone number, an IIN, an SSN, or any other kind of unique ID. The ID will be displayed in the report as an additional column.

    Another case is security: when you need to process the analyses’ result from your back end, but don’t want to perform this using the folder ID. Add an ID (transaction_id) to this folder and use this ID to search for the required information. This case is thoroughly explained here.

    If you store PII in metadata, make sure it complies with the relevant regulatory requirements.

    You can also add metadata via SDK to process the information later using API methods. Please refer to the corresponding SDK sections:

    • iOS

    • Android

    • Web
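On the back end, adding metadata comes down to a plain HTTP PATCH. A minimal sketch, assuming a placeholder host and the X-Forensic-Access-Token authorization header (verify the header name against your Oz API setup; the request is built but not sent):

```python
import json
import urllib.request

API_HOST = "https://api.example.com"  # hypothetical Oz API host
ACCESS_TOKEN = "<your_access_token>"  # obtained via authorization

def build_meta_data_patch(folder_id: str, meta: dict) -> urllib.request.Request:
    """Build a PATCH /api/folders/{folder_id}/meta_data/ request adding metadata
    to an existing folder."""
    return urllib.request.Request(
        url=f"{API_HOST}/api/folders/{folder_id}/meta_data/",
        data=json.dumps(meta).encode(),
        method="PATCH",
        headers={
            "Content-Type": "application/json",
            # Header name is an assumption; use your instance's auth scheme.
            "X-Forensic-Access-Token": ACCESS_TOKEN,
        },
    )

req = build_meta_data_patch("f-123", {"iin": "123123123"})
print(req.get_method(), req.full_url)
```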

    1. On the front end, add your unique identifier to the folder metadata.

    You can add your own key-value pairs to attach user document numbers, phone numbers, or any other textual information. However, ensure that tracking personally identifiable information (PII) complies with relevant regulatory requirements.

2. Use the on_complete callback of the plugin to be notified when the analysis is done. Then call your back end and pass the transaction_id value.

3. On the back end side, find the folder by the identifier you've specified using the Oz API Folder LIST method:

  To speed up the processing of your request, we recommend adding the time filter as well:

4. In the response, find the analysis results and folder_id for future reference.
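The Folder LIST query shown above can be assembled like this (the time filter value is a placeholder; the docs use [CURRENT_TIME]-1hour as a notation, not a literal value):

```python
def folder_search_query(transaction_id: str, time_created_min: str = "") -> str:
    """Build the Folder LIST query filtered by the transaction_id metadata field,
    optionally narrowed by a creation-time lower bound."""
    query = f"/api/folders/?meta_data=transaction_id=={transaction_id}&with_analyses=true"
    if time_created_min:
        query += f"&time_created.min={time_created_min}"
    return query

print(folder_search_query("unique_id1"))
```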

    hashtag
    Limit amount of the information sent to Web Plugin from the server

Web Adapter may send analysis results to the Web Plugin with various levels of verbosity. In production, it is recommended to set the verbosity to the minimum: in the Web Adapter configuration file, set the result_mode parameter to "safe".

    hashtag
    Safe

    When result_mode is safe, the on_complete callback contains the state of the analysis only:

    triangle-exclamation

    Please note: The options listed below are for testing purposes only. If you require more information than what is available in the Safe mode, please follow Security Recommendations.

    hashtag
    Status

    For the status value, the callback contains the state of the analysis, and for each of the analysis types, the name of the type, state, and resolution.

    hashtag
    Folder

The folder value is almost identical to the status value, with the only difference that folder_id is added.

    hashtag
    Full

In this case, you receive a detailed response that may contain sensitive data. This mode is deprecated; for security reasons, we recommend using the safe mode.

    func onResult(container: DataContainer) {
      let analysisRequest = AnalysisRequestBuilder()
      analysisRequest.addContainer(container)
      analysisRequest.run(
                statusHandler: { status in
                },
                errorHandler: { error in
                }
            ) { result in
            }
    }
    let analysisRequest = AnalysisRequestBuilder()
    // create one or more analyses
    let analysis = Analysis.init(
    	media: mediaToAnalyze, // mediaToAnalyze is an array of OzMedia that were captured or otherwise created
    	type: .quality, // check the analysis types in iOS methods
    	mode: .serverBased) // or .onDevice if you want the on-device analysis
    analysisRequest.uploadMedia(mediaToAnalyze)
    analysisRequest.addAnalysis(analysis)
    // initiate the analyses
    analysisRequest.run(
    	statusHandler: { state in }, // scenario steps progress handler
    	errorHandler: { _ in }  
    ) { result in
        // receive and handle analyses results here 
    }
    let analysis = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased)
    var folderMeta: [String: Any] = ["key1": "value1"]
    analysisRequest.addFolderMeta(folderMeta)
    ...
let analysis = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased, params: ["extract_best_shot" : true])
    let referenceMedia = OZMedia.init(movement: .selfie,
                      mediaType: .movement,
                      metaData: ["meta":"data"],
                      videoURL: nil,
                      bestShotURL: imageUrl,
                      preferredMediaURL: nil,
                      timestamp: Date())
    let analysis = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased)
    analysisRequest.addFolderId(IdRequired)
    {
     "state": "processing"
    }
    {
     "state": "finished"
    }
    {
     "state": "processing",
     "analyses": {
       "quality": {
         "state": "processing",
         "resolution": ""
       }
     }
    }
    {
     "state": "finished",
     "analyses": {
       "quality": {
         "state": "finished",
         "resolution": "success"
       }
     }
    }
    {
     "state": "processing",
     "folder_id": "your_folder_id",
     "analyses": {
       "quality": {
         "state": "processing",
         "resolution": ""
       }
     }
    }
OzLivenessSDK.INSTANCE.getConfig().setCustomization(new UICustomization(
// customization parameters for the toolbar
new ToolbarCustomization(
    R.drawable.ib_close,
    new Color.ColorHex("#FFFFFF"),
    new Color.ColorHex("#000000"),
    100 // toolbar text opacity (in %)
    ),
// customization parameters for the center hint
new CenterHintCustomization(
    70 // vertical position (in %)
),
// customization parameters for the hint animation
new HintAnimation(
    true // hide animation
),
// customization parameters for the frame around the user face
new FaceFrameCustomization(
    new Color.ColorHex("#EC574B"),
    new Color.ColorHex("#00FF00"),
    6 // frame stroke width (in dp)
 ),
// customization parameters for the background outside the frame
new BackgroundCustomization(
    100 // background opacity (in %)
)
  )
);
    // customization parameters for the toolbar
    let toolbarCustomization = ToolbarCustomization(
       closeButtonIcon: UIImage(named: "example"),
       closeButtonColor: .black.withAlphaComponent(0.8),
       titleText: "",
       titleFont: .systemFont(ofSize: 18, weight: .regular),
       titleColor: .gray,
       backgroundColor: .lightGray)
    
    // customization parameters for the center hint
    let centerHintCustomization = CenterHintCustomization(
       textColor: .white,
       textFont: .systemFont(ofSize: 22, weight: .regular),
       verticalPosition: 42,
       backgroundColor: UIColor.init(hexRGBA: "1C1C1E8F")!,
       hideTextBackground: false,
       backgroundCornerRadius: 14)
       
    // customization parameters for the center hint animation
    let hintAnimationCustomization = HintAnimationCustomization(
       hideAnimation: false,
       animationIconSize: 80,
       toFrameGradientColor: UIColor.red)
    
    // customization parameters for the frame around the user face
    let faceFrameCustomization = FaceFrameCustomization(
       strokeWidth: 4,
       strokeFaceAlignedColor: .green,
       strokeFaceNotAlignedColor: .red,
       geometryType: .rect(cornerRadius: 10),
       strokePadding: 3)
    
    // customization parameters for the SDK version text
    let versionCustomization = VersionLabelCustomization(
       textFont: .systemFont(ofSize: 12, weight: .regular),
       textColor: .gray
    )
    
    // customization parameters for the background outside the frame
    let backgroundCustomization = BackgroundCustomization(
       backgroundColor: .lightGray
    )
    
    // customization parameters for the antiscam protection text
    let antiscamCustomization: AntiscamCustomization = AntiscamCustomization(
       customizationEnableAntiscam: false,
       customizationAntiscamTextMessage: "Face recognition",
       customizationAntiscamTextFont: UIFont.systemFont(ofSize: 15, weight: .semibold),
       customizationAntiscamTextColor: UIColor.black,
       customizationAntiscamBackgroundColor: UIColor.init(hexRGBA: "F2F2F7FF")!,
       customizationAntiscamCornerRadius: 18,
       customizationAntiscamFlashColor: UIColor.init(hexRGBA: "FF453AFF")!)
    
    // customization parameters for your logo
    // should be allowed by license
    let logoCustomization = LogoCustomization(
       image: UIImage(), 
       size: CGSize(width: 100, height: 100), 
       verticalPosition: 100, 
       horizontalPosition: 50)
    
    OZSDK.customization = Customization(toolbarCustomization: toolbarCustomization,
       antiscamCustomization: antiscamCustomization,
       centerHintCustomization: centerHintCustomization,
       hintAnimationCustomization: hintAnimationCustomization,
       faceFrameCustomization: faceFrameCustomization,
   versionCustomization: versionCustomization,
       backgroundCustomization: backgroundCustomization,
       logoCustomization: logoCustomization)
    
    openssl genpkey -algorithm RSA -outform DER -out privateKey.der -pkeyopt rsa_keygen_bits:2048
    # for MacOS
    base64 -i privateKey.der -o privateKey.txt
    # for Linux 
    base64 -w 0 privateKey.der > privateKey.txt
    openssl rsa -pubout -in privateKey.der -out publicKey.pub
    OZSDK(licenseSources: [LicenseSource], masterLicenseSignature: String? = nil, completion: @escaping ((LicenseData?, LicenseError?) -> Void))
    setLicense(licenseSource: LicenseSource, masterLicenseSignature: String? = nil)
    private func getSignature() -> String? {
        let privateKeyBase64String = "the string copied from the privateKey.txt file"
        // with key example:
        // let privateKeyBase64String = "MIIEogIBAAKCAQEAvxpyONpif2AjXiiG8fs9pQkn5C9yfiP0lD95Z0UF84t0Ox1S5U1UuVE5kkTYYGvS2Wm7ykUEGuHhqt/PyOAxrrNkAGz3OcVTsvcqPmwcf4UNdYZmug6EnQ5ok9wxYARS0aYqJUdzUb4dKOYal6WpHZE4yLx08R0zQ5jPkg5asT2u2PLB7JHZNnXwBcvdUonAgocNzdakUbWTNHKMxjwdAvwdIICdIneLZ9nCqe1d0cx7JBIhLzSPu/DVRANF+DOsE9JM8DT/Snnjok2xXzqpxBs1GwqiMJh98KYP78AVRWFuq3qbq0hWpjbq+mWl8xa7UMv8WxVd4PvQkWVYq/ojJwIDAQABAoIBAEvkydXwTMu/N2yOdcEmAP5I25HQkgysZNZ3OtSbYdit2lQbui8cffg23MFNHA125L65Mf4LWK0AZenBhriE6NYzohRVMf28cxgQ9rLhppOyGH1DCgr79wiUj02hVe6G6Qkfj39Ml+yvrs7uS0NMZBQ89yspRNv4t8IxrsWXc8cNQr33fdArlZ021Z12u2wdamaagiFwTa6ZYcQ5OYl3d/xL+oAwf9ywHwRrVM2JksGCxrcLJ7JCOL6lLyjp8rRrIG4iZ1V8YDfUNHmwD4w1fl30H6ejA+Cy5qge7CBZK+hqKr+hOcfBfakfOtgcEbFq2L8DqHoSaTeY6n9wjPJiFrkCgYEA8fc/Cg+d0BN98aON80TNT+XLEZxXMH+8qdmJYe/LB2EckBj1N0en87OEhqMFm9BjYswGf/uO+q2kryEat1n3nejbENN9XaO36ahlXTpQ6gdHO3TuO+hnnUkXeUNgiGYs+1L8Ot6PuNykwL0BZ09U0iBVoawEjTAg9tLNfVW2upsCgYEAyi/75YFT3ApxfSf/9XR74a0aKrfGGlBppcGvxuhVUC2aW2fPEZGXo4qlEhP2tyVTfj78F2p0Fnf3NSyvLZAzSwKo4w8EyZfXn1RI4sM95kzIMhH3Gz8VxCZWKEgr7cKNU5Zhs8un/VFd9Mc0KyZfmVy4VrZ5JumgahBRzSn9zGUCgYA7TTt3/cfRvVU6qbkajBw9nrYcRNLhogzdG+GdzSVXU6eqcVN4DunMwoySatXvEC2rgxF8wGyUZ4ZbHaPsl/ImE3HNN+gb0Qo8C/d719UI5mvA2LGioRzz4XwNTkQUaeZQWlBTJUTYK8t9KVV0um6xaRdTnlMnP0p088lFFILKTQKBgDsR98selKyF1JBXPl2s8YCGfU2bsWIAukz2IG/BcyNgn2czFfkxCxd5qy5z7LGnUxRgPHBu5oml9PBxJKDwLzwsA8GKosBu/00KZ9zwY8ZECn0uaH5qWOacuLE+HK9zFq0kE1lfF65XtlaMWH5+0JFS2HxlBVJMEVTLfcquCPtNAoGAG6ytPm24AS1satPwlKnZODQEg0kc7d5S42jbt4X7lwICY/7gORSdbNS8OmrqYhP/8tDOAUtzPQ20fEt43/VA87bq88BVzoSp4GVQcSL44MzRBQHQwTVkoVnbCXSD9K9gZ71wii+m+8rZZ0EMdiTR3hsRXRuSmw4t8y3CuzlZ9k4="
        guard let data = Data(base64Encoded: privateKeyBase64String, options: [.ignoreUnknownCharacters]) else {
          return nil
        }
         
        let sizeInBits = data.count * 8
        let keyDict: [CFString: Any] = [
          kSecAttrKeyType: kSecAttrKeyTypeRSA,
          kSecAttrKeyClass: kSecAttrKeyClassPrivate,
          kSecAttrKeySizeInBits: NSNumber(value: sizeInBits)
        ]
         
        var error: Unmanaged<CFError>?
        guard let secKey = SecKeyCreateWithData(data as CFData, keyDict as CFDictionary, &error) else {
          return nil
        }
         
        guard let bundleID = Bundle.main.bundleIdentifier else {
          return nil
        }
        guard let signature = SecKeyCreateSignature(secKey,
                              .rsaSignatureMessagePKCS1v15SHA512,
                              Data(bundleID.utf8) as CFData,
                              &error) else {
          return nil
        }
        return (signature as Data).base64EncodedString()
      }
    …
    meta_data:
    {
      "field1": "value1",
      "field2": "value2"
    }
    …
    {
      "iin": "123123123"
    }
    /api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true
    /api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true&time_created.min=([CURRENT_TIME]-1hour)
    OzLiveness.open({
      ...
      meta: { 
      // the user or lead ID from an external lead generator 
      // that you can pass to keep track of multiple attempts made by the same user
        'end_user_id': '<user_or_lead_id>',
      // the unique attempt ID
        'transaction_id': '<unique_transaction_id>'
      }
    });
    /api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true
    "result_mode": "safe"
    {
      "state": "finished"
    }
    {
     "state": "finished",
     "analyses": {
       "quality": {
         "state": "finished",
         "resolution": "success"
       }
     }
    }
    {
     "state": "finished",
     "folder_id": "your_folder_id",
     "analyses": {
       "quality": {
         "state": "finished",
         "resolution": "success"
       }
     }
    }
and type. If you upload files via the web console, you select the tags needed; if you take a photo or video via Web SDK, the SDK picks the tags automatically. As for the media type, it can be IMAGE (a photo), VIDEO, or SHOTS_SET, where SHOTS_SET is a .zip archive that is treated as a video.

Below, you will find the tag and type requirements for all analyses. If a media file doesn't match the requirements for a certain analysis, it is ignored by the algorithms.

The rules listed below act by default. To change the mapping configuration, please contact us.

    hashtag
    Quality (Liveness)

This analysis is applied to all media, regardless of the gesture recorded (gesture tags begin with video_selfie).

    Important: to process a photo in API 4.0.8 and below, pack it into a .zip archive, apply the SHOTS_SET type, and mark it with video_*. Otherwise, it will be ignored.
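A sketch of packing a single photo into such an archive (the inner file name is arbitrary; the tagging itself still happens when you upload the media):

```python
import io
import zipfile

def pack_photo_as_shots_set(photo_bytes: bytes, name: str = "frame_0.jpg") -> bytes:
    """Pack a single photo into a .zip archive so that API 4.0.8 and below can
    process it as a SHOTS_SET media (remember to mark it with a video_* tag)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name, photo_bytes)
    return buf.getvalue()

archive = pack_photo_as_shots_set(b"\xff\xd8...jpeg bytes...")
print(zipfile.ZipFile(io.BytesIO(archive)).namelist())  # ['frame_0.jpg']
```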

    hashtag
    Biometry

    This analysis is applied to all media.

If the folder contains fewer than two matching media files, the system will return an error. If there are more than two files, all pairs are compared, and the system returns the result for the pair with the least similar faces.
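The pairing rule above can be sketched as follows; compare here is a stand-in for the actual biometric comparison, and the scores are toy values:

```python
from itertools import combinations

def folder_biometry_score(media_ids, compare):
    """Illustration of the pairing rule: with more than two media files, every
    pair is compared and the folder result is the score of the least similar pair."""
    if len(media_ids) < 2:
        # mirrors the API's "requires at least 2 media objects" error
        raise ValueError("Biometry requires at least 2 media objects")
    return min(compare(a, b) for a, b in combinations(media_ids, 2))

# Toy similarity table for three media files:
scores = {
    frozenset({"m1", "m2"}): 0.99,
    frozenset({"m1", "m3"}): 0.97,
    frozenset({"m2", "m3"}): 0.91,
}
print(folder_biometry_score(["m1", "m2", "m3"], lambda a, b: scores[frozenset({a, b})]))  # 0.91
```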

    hashtag
    Collection

    This analysis works only when you have a pre-made image database, which is called collection. The analysis is applied to all media in the folder (or the ones marked as source media).

    hashtag
    Best Shot

    Best Shot is an addition to the Quality (Liveness) analysis. It requires the appropriate option enabled. The analysis is applied to all media files that can be processed by the Quality analysis.

    hashtag
    Documents

    circle-info

    This analysis type is deprecated.

    The Documents analysis is applied to images with tags photo_id_front and photo_id_back (documents), and photo_selfie (selfie). The result will be positive if the system finds the selfie photo and matches it with a photo on one of the valid documents from the following list:

    • personal ID card

    • driver license

    • foreign passport

    hashtag
    Tag-Related Errors

| Code | Message | Description |
| --- | --- | --- |
| 202 | Could not locate face on source media [media_id] | No face is found in the media being processed, or the source media has a wrong (photo_id_back) and/or missing tag. |
| 202 | Biometry. Analysis requires at least 2 media objects to process | The algorithms did not find two appropriate media files for the analysis. This might happen when only a single media file has been sent for the analysis, or a media file is missing a tag. |
| 202 | Processing error - did not found any document candidates on image | The Documents analysis can't be finished because the uploaded photo doesn't seem to be a document, or it has wrong (not photo_id_*) and/or missing tags. |


    hashtag
    Example request

    hashtag
    Example response


    hashtag
    Step 2 (optional)

    circle-info

    This step can be omitted if a company already exists.

    As a user must belong to a company, create a company: call {{host}}/api/companies/ with your company name.

    hashtag
    Example request

    hashtag
    Example response


    hashtag
    Step 3

    Create a service user. Call {{host}}/api/users/ and write down user_id that you will get in response.

    hashtag
    Example request

    hashtag
    Example response

In API 6.0, the logic of issuing a service token has slightly changed, so here are examples for both API 6 and API 5 (and below).

    hashtag
    API 6

    In the request body, define user_type as CLIENT_SERVICE.

    hashtag
    API 5 and below

    Set the is_service flag value to true.
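As a sketch, the two request-body variants differ only in the field described above; any other user fields (name, email, and so on) are hypothetical and omitted here:

```python
# Request-body sketches for creating a service user.
payload_api6 = {
    "user_type": "CLIENT_SERVICE",  # API 6: define user_type as CLIENT_SERVICE
    # ...other user fields
}
payload_api5 = {
    "is_service": True,             # API 5 and below: set the is_service flag
    # ...other user fields
}
print(payload_api6["user_type"], payload_api5["is_service"])
```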


    hashtag
    Step 4

    If you need to obtain the service token to use it, for instance, with Web SDK, authorize as ADMIN (same as in Step 1) and call:

    • API 6: {{host}}/api/authorize/service_token/{user_id} with user_id from the previous step.

    • API 5 and below: {{host}}/api/authorize/service_token.

    hashtag
    Example request

    hashtag
    Example response

    In response, you will get a service token that you can use in any service processes.

    circle-info

For Web SDK, specify this token’s value as api_token in the Web Adapter configuration file.

    • If you don’t set the custom language and bundle, the SDK uses the pre-installed languages only.

• If the custom bundle is set (and the language is not), it takes priority when checking translations, i.e., the SDK checks for the localization record in the custom bundle localization file. If the key is not found in the custom bundle, the standard bundle text for this key is used.

    • If both custom bundle and language are set, SDK retrieves all the translations from the custom bundle localization file.

    A list of keys for iOS:

    The keys Action.*.Task refer to the appropriate gestures. Others refer to the hints for any gesture, info messages, or errors.

    When new keys appear with new versions, if no translation is provided by your custom bundle localization file, you’ll see the default (English) text.
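The lookup priority described above can be sketched as a plain dictionary fallback (an illustration only, not the actual SDK code):

```python
def localized_string(key, custom_bundle, standard_bundle):
    """Sketch of the fallback rule: the custom bundle's translation is checked
    first; a missing key falls back to the standard bundle's (English) text."""
    return custom_bundle.get(key) or standard_bundle.get(key)

custom = {"about": "Acerca de"}                    # partial custom language pack
standard = {"about": "About", "retry": "Retry"}    # pre-installed (default) pack

print(localized_string("about", custom, standard))  # Acerca de
print(localized_string("retry", custom, standard))  # Retry (falls back to default)
```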

Oz_SDK_iOS_Strings.zip (12KB)

    Types of Analyses and What They Check

    Here, you'll get acquainted with types of analyses that Oz API provides and will learn how to interpret the output.

    Using Oz API, you can perform one of the following analyses:

    • biometry,

    • quality (liveness, best shot),

• documents (deprecated),

• collection (1:N).

    circle-exclamation

The possible results of the analyses are explained in a separate article.

Each of the analyses has its own threshold that determines the output. By default, the threshold for Liveness is 0.5 (50%); for Collection and Biometry (Face Matching), it is 0.85 (85%).
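A sketch of how these defaults might be applied on your back end (whether the comparison is strict or inclusive at the exact threshold is an assumption here):

```python
# Default thresholds from the documentation.
LIVENESS_THRESHOLD = 0.5        # Quality (Liveness)
FACE_MATCHING_THRESHOLD = 0.85  # Biometry and Collection (Face Matching)

def liveness_passed(spoofing_score: float) -> bool:
    """Quality returns the chance of a spoofing attack: lower is better."""
    return spoofing_score < LIVENESS_THRESHOLD

def faces_match(similarity: float) -> bool:
    """Biometry/Collection return a similarity level: higher is better."""
    return similarity >= FACE_MATCHING_THRESHOLD

print(liveness_passed(0.12), faces_match(0.91))  # True True
```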

    hashtag
    Biometry

    hashtag
    Purpose

The Biometry algorithm compares several media and checks whether they represent the same person. As sources, you can use images, videos, and scans of documents (with a photo). To perform the analysis, the algorithm requires at least two media files (for details, please refer to the media tag requirements).

    hashtag
    Output

    After comparison, the algorithm provides a number that represents the similarity level. The number varies from 100 to 0% (1 to 0), where:

    • 100% (1) – faces are similar, media represent the same person,

• 0% (0) – faces are not similar and belong to different people.

    hashtag
    Quality (Liveness, Best Shot)

    hashtag
    Purpose

    The Liveness detection (Quality) algorithm aims to check whether a person in a media is a real human acting in good faith, not a fake of any kind.

The Best Shot algorithm chooses the best shot from a video (the best-quality frame where the face is seen most clearly). It is an addition to Liveness.

    hashtag
    Output

After checking, the analysis shows the chance of a spoofing attack as a percentage:

    • 100% (1) – an attack is detected, the person in the video is not a real living person,

    • 0% (0) – a person in the video is a real living person.

*Spoofing in biometry is a kind of scam where a person disguises themselves as another person using software and non-software tools such as deepfakes, masks, ready-made photos, or fake videos.

    hashtag
    Documents

    circle-info

    This analysis type is deprecated.

    hashtag
    Purpose

    The Documents analysis aims to recognize the document and check if its fields are correct according to its type.

    circle-info

Oz API uses a third-party OCR analysis service provided by our partner. If you want to change this service to another one, please contact us.

    hashtag
    Output

    As an output, you'll get a list of document fields with recognition results for each field and a result of checking that can be:

    • The documents passed the check successfully,

    • The documents failed to pass the check.

    Additionally, the result of Biometry check is displayed.

    hashtag
    Collection (1:N)

    hashtag
    Purpose

The Collection checking algorithm determines whether the person in a photo or video is present in a database of pre-uploaded images, which is called a collection. Depending on your needs, the person's face can be compared with the faces of known fraudsters or a list of VIPs.

    hashtag
    Output

    After comparison, the algorithm provides a number that represents the similarity level. The number varies from 100 to 0% (1 to 0), where:

    • 100% (1) – the person in an image or video matches with someone in the collection,

    • 0% (0) – the person is not found in the collection.

    Master License for Android

    Master license is the offline license that allows using Mobile SDKs with any bundle_id, unlike the regular licenses. To get a master license, create a pair of keys as shown below. Email us the public key, and we will email you the master license shortly after that. Your application needs to sign its bundle_id with the private key, and the Mobile SDK checks the signature using the public key from the master license. Master licenses are time-limited.

    hashtag
    Generating Keys

    This section describes the process of creating your private and public keys.

    hashtag
    Creating a Private Key

    To create a private key, run the commands below one by one.

    You will get these files:

    • privateKey.der is a private .der key;

    • privateKey.txt is privateKey.der encoded in base64. Its contents will be used to sign the host app bundle_id.

    The OpenSSL command specification:

    hashtag
    Creating a Public Key

    To create a public key, run this command.

    You will get the public key file: publicKey.pub. To get a license, please email us this file, and we will email you the license.

    hashtag
    SDK Integration

    SDK initialization:

    circle-info

    For Android 6.0 (API level 23) and older:

    1. Add the implementation 'com.madgag.spongycastle:prov:1.58.0.0' dependency;

    Prior to initializing the SDK, create a base64-encoded signature for the host app bundle_id using the private key.

    Signature creation example:

    Pass the signature as the masterLicenseSignature parameter during the SDK initialization.

    If the signature is invalid, the initialization continues as usual: the SDK checks the list of bundle_id values included in the license, as it does by default without a master license.

    Security Recommendations

    In 8.8.0, we’ve implemented SSL pinning to protect our clients from MITM attacks. We strongly recommend adding a built-in certificate whitelist to your application to prevent fraud with third-party certificates set as trusted.

    chevron-rightWhat is a MITM attackhashtag

    A MITM (man-in-the-middle) attack is a type of attack in which a cyber fraudster intercepts the communication between the application and the backend, setting up a proxy to intercept and alter the traffic (e.g., to substitute the video being sent). Typically, these attacks require the fraudster to set their certificate as trusted on the user's device beforehand.

    You can add a list of certificates your application should trust at the moment of connection to Oz API via the optional sslPins field of OzConnection class. As an input, this field takes a list of public certificate key hashes with their expiration dates as shown below:

    hashtag
    Android

    hashtag
    iOS

    hashtag
    Getting keys and dates: the simplest way

    1. Go to the website.

    2. Enter your domain address. Once the address is processed, you’ll see a list of your servers.

    3. Click the server address needed to load a list of certificates. Certificate key is in the Pin SHA256 line of the Subject field. Expiration date is shown in the Valid until field.

    Certificate number one is your host certificate. Your root certificate is at the very bottom of the list; the others are intermediate. Any of them fits for SSL pinning.

    hashtag
    Choosing a certificate

    The higher the certificate is on the list, the better the level of protection against theft. Thus, if you use the host certificate to pin in your application, you get the highest security level. However, the lifetime of these certificates is significantly shorter than that of intermediate or root certificates. To keep your application secure, you will need to change your pins as soon as they expire; otherwise, functionality might become unavailable.

    As a reasonable balance between safety and the resources needed to maintain it, we recommend using intermediate or even root certificate keys for pinning. While the security level is slightly lower, you won’t need to change these pins as often because these certificates have a much longer lifetime.

    hashtag
    Obtaining Hash and Date

    circle-info

    The commands listed in this section have been tested on Ubuntu, but they should work on other Linux-based OSes as well.

    To obtain the hash, run the following command with your server domain and port:

    In the response, you’ll receive the hash for your SslPin.

    To get the certificate’s expiration date, run the next command – again with your server domain and port:

    The date you require will be in the notAfter parameter.
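    To feed notAfter into an SslPin, the date must be converted to a UNIX timestamp (UTC), as the SDK examples below expect. A minimal Python sketch of that conversion, assuming the default openssl date format (e.g., "Jun  1 12:00:00 2026 GMT"):

```python
from datetime import datetime, timezone

def not_after_to_unix(openssl_dates_output: str) -> int:
    """Extract the notAfter date from `openssl x509 -noout -dates` output
    and convert it to a UNIX timestamp (UTC)."""
    for line in openssl_dates_output.splitlines():
        if line.startswith("notAfter="):
            # openssl prints dates like: "Jun  1 12:00:00 2026 GMT"
            raw = line.split("=", 1)[1].strip()
            dt = datetime.strptime(raw, "%b %d %H:%M:%S %Y %Z")
            return int(dt.replace(tzinfo=timezone.utc).timestamp())
    raise ValueError("notAfter line not found")
```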

    We’ll provide you with the hash and date of our API server certificate.

    Quantitative Results

    This article describes how to get the analysis scores.

    When you perform an analysis, the result you get is a number. For biometry, it reflects the likelihood that the two or more people represented in your media are the same person. For liveness, it shows the likelihood of a deepfake or spoofing attack, i.e., that the person in the uploaded media is not real. You can get these numbers via API from a JSON response.

    1. Authorize.

    2. Make a request to the folder or folder list to get a JSON response. Set the with_analyses parameter to true.

    3. For the Biometry analysis, check the response for the min_confidence value:

    This value is a quantitative result of matching the people on the media uploaded.

    4. For the Liveness analysis, look for the confidence_spoofing value related to the video you need:

    This value is a chance that a person is not a real one.

    To process a bunch of analysis results, you can parse the appropriate JSON response.
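    As a sketch of such parsing, the following Python function collects both scores from a folder-list response; the exact field nesting is assumed from the excerpts in this article and may differ between API versions:

```python
import json

def extract_scores(response_json: str) -> dict:
    """Collect min_confidence (Biometry) and confidence_spoofing (Liveness)
    values from a folder-list JSON response."""
    scores = {"biometry": [], "liveness": []}
    data = json.loads(response_json)
    for item in data.get("items", []):
        for analysis in item.get("analyses", []):
            results = analysis.get("results_data") or {}
            if "min_confidence" in results:
                scores["biometry"].append(results["min_confidence"])
            for media in analysis.get("results_media", []):
                inner = media.get("results_data") or {}
                if "confidence_spoofing" in inner:
                    scores["liveness"].append(inner["confidence_spoofing"])
    return scores
```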

    Oz API Postman Collections

    Download and install the Postman client from this page.arrow-up-right Then download the JSON file needed:

    hashtag
    Oz API Postman collections

    hashtag
    6.0

    hashtag
    6.0

    hashtag
    5.3 and 5.2

    hashtag
    5.0

    Oz API 5.1.0 works with the same collection.

    hashtag
    4.0

    hashtag
    3.33

    hashtag
    How to Import a Postman Collection:

    Launch the client and import Oz API collection for Postman by clicking the Import button:

    Click files, locate the JSON needed, and hit Open to add it:

    The collection will be imported and will appear in the Postman interface:

    Best Shot

    The "Best shot" algorithm is intended to choose the highest-quality frame with a face from a video record. This algorithm works as part of the Liveness analysis, so here, we describe only the best shot part.

    circle-info

    Please note: historically, some instances are configured to allow Best Shot only for certain gestures.

    hashtag
    Processing steps

    1. Initiate the analysis similar to , but make sure that extract_best_shot is set to true as shown below:

    If you want to use a webhook for response, add it to the payload at this step, as described .

    2. Check and interpret results in the same way as for the pure analysis.

    3. The URL to the best shot is located in the results_media -> output_images -> original_url response.
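    A minimal Python sketch of extracting that URL from an analysis object; the exact nesting of output_images inside results_media items is an assumption based on the path above:

```python
from typing import Optional

def best_shot_url(analysis: dict) -> Optional[str]:
    """Walk results_media -> output_images -> original_url.
    Returns None when no best shot was extracted."""
    for media in analysis.get("results_media", []):
        for image in media.get("output_images", []):
            url = image.get("original_url")
            if url:
                return url
    return None
```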

    How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application

    circle-exclamation

    Please note: this guide applies to the non-container flow only.

    circle-info

    Please note that the Oz Liveness Mobile SDK does not include a user interface for scanning official documents. You may need to explore alternative SDKs that offer that functionality or implement it on your own. Web SDK does include a simple photo ID capture screen.

    Adding SDK to a Project

    Add the following URL to the build.gradle of the project:

    Add this to the build.gradle of the module (VERSION is the version you need to implement. Please refer to ):

    hashtag
    for the server-based version only

    Web Plugin (Web SDK Frontend part)

    Web Plugin is a plugin called by your web application. It works in a browser context. The Web Plugin communicates with Web Adapter, which, in turn, communicates with Oz API.

    Please find a sample for Oz Liveness Web SDK . To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.

    For the samples below, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.

    curl -L 'https://{{host}}/api/authorize/auth' \
    -H 'Content-Type: application/json' \
    --data-raw '{
        "credentials": {
            "email": "[email protected]",
            "password": "your_admin_password"
        }
    }'
    {
      …
        "user": {
            "user_type": "ADMIN",
      …
        },
        "access_token": "<token>",
        …
    }

    Invalid/missed tag values to process quality check

    The tags applied can't be processed by the Quality algorithm (most likely, the tags begin with photo_*; for Quality, they should be marked as video_*).

    5

    Invalid/missed tag values to process collection check

    The tags applied can't be processed by the Collection algorithm. This might happen when a media is missing a tag.

  • Biometry: if the final score is equal to or above the threshold, the faces on the analyzed media are considered similar.

  • Collection: if the final score is equal to or above the threshold, the face on the analyzed media matches with one of the faces in the database.

  • Quality: if the final score is equal to or above the threshold, the result is interpreted as an attack.

  • To configure the threshold depending on your needs, please contact us.

    For more information on how to read the numbers in analyses' results, please refer to Quantitative Results.
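    The interpretation rules above can be sketched in Python as follows (the threshold values themselves are configured on the server side):

```python
def interpret(analysis_type: str, score: float, threshold: float) -> str:
    """Apply the threshold rules: for Quality, a score at or above the
    threshold means an attack; for Biometry/Collection it means a match."""
    passed = score >= threshold
    if analysis_type == "quality":
        return "attack" if passed else "genuine"
    if analysis_type in ("biometry", "collection"):
        return "match" if passed else "no match"
    raise ValueError("unknown analysis type: " + analysis_type)
```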

    documents
    collection (1:N)
    here
    Rules of Assigning Analyses
    contact usenvelope
    file-download
    256KB
    OZ-Forensic 6.0.0.postman_collection.json
    arrow-up-right-from-squareOpen
    Instant API
    file-download
    10KB
    OZ-Forensic Instant 6.0.0.postman_collection.json
    arrow-up-right-from-squareOpen
    file-download
    301KB
    OZ-Forensic 5.2.0-.postman_collection.json
    arrow-up-right-from-squareOpen
    file-download
    299KB
    OZ-Forensic 5.0.0.postman_collection.json
    arrow-up-right-from-squareOpen
    file-download
    165KB
    OZ-Forensic 4.0.0.postman_collection.json
    arrow-up-right-from-squareOpen
    file-download
    168KB
    OZ-Forensic 3.33.0.postman_collection.json
    arrow-up-right-from-squareOpen
    Web Adapter configuration file
    Before creating a signature, call Security.insertProviderAt(org.spongycastle.jce.provider.BouncyCastleProvider(), 1)
    https://www.openssl.org/docs/man1.1.1/man1/openssl-pkcs8.htmlarrow-up-right

    Certificate owner, trust level, and resource requirements (which depend on the certificate’s lifetime):

    • Host: the highest trust level. High security, but requires the most resources to maintain: the keys’ list should be updated at the same time as the certificate.

    • Intermediate certificate authority: above-average trust level (the application considers all certificates issued by this authority as trusted). Average resource requirements.

    • Root certificate authority: average trust level (the application considers all certificates issued by this authority as trusted, including the intermediate authority-issued certificates). Low resource requirements.

    https://globalsign.ssllabs.com/arrow-up-right
    "items": 
     [
      {
       "analyses": 
        [
         {
      "analysis_id": "biometry_analysis_id",
          "folder_id": "some_folder_id", 
          "type": "BIOMETRY", 
          "state": "FINISHED", 
          "results_data": 
           {
            "max_confidence": 0.997926354, 
            "min_confidence": 0.997926354
           }
    request body
    {
      "analyses": [{
        "type": "quality",
        "source_media": ["1111aaaa-11aa-11aa-11aa-111111aaaaaa"], // optional; omit to include all media from the folder
        "params" : {
          "extract_best_shot": true // the mandatory part for the best shot analysis
        }
      }]
    }
    Liveness
    here
    Liveness
    Please note: this is the default version.

    hashtag
    for both server-based and on-device versions

    Please note: the resulting file will be larger.

    Also, regardless of the mode chosen, add:

    allprojects {
      repositories {
        maven { url "https://ozforensics.jfrog.io/artifactory/main" }
      }
    }
    Changelog
    dependencies {
      implementation 'com.ozforensics.liveness:sdk:VERSION'
    }
    curl -L 'https://{{host}}/api/companies/' \
    -H 'X-Forensic-Access-Token: token_id' \
    -H 'Content-Type: application/json' \
    -d '{ "name": "your_company_name" }'
    {
        "company_id": "company_id",
        "name": "your_company_name",
        "in_deletion": false,
        "technical_meta_data": {}
    }
    curl -L 'https://{{host}}/api/users/' \
    -H 'X-Forensic-Access-Token: token_id' \
    -H 'Content-Type: application/json' \
    --data-raw '{
      "credentials": {
        "email": "<[email protected]>",
        "password": "<your_service_user_password>"
      },
      "profile": {
        "company_id": "company_id",
        <!-- the next line is for API 6 -->
        "user_type": "CLIENT_SERVICE",
        "first_name": "first_name",
        "last_name": "last_name",
        "middle_name": "",
        "is_admin": false,
        <!-- the next line is for API 5 and below -->
        "is_service": true,
        "can_start_analyse_biometry": true,
        "can_start_analyse_collection": true,
        "can_start_analyse_documents": true,
        "can_start_analyse_quality": true
      }
    }
    {
        "user_id": "user_id",
        "user_type": "CLIENT_SERVICE",
         …
        "is_active": true,
         …
        "is_service": true
    }
    {
      "credentials": {
        "email": " <[email protected]> ",
        "password": "your_client_service_user_password"
      },
      "profile": {
        "company_id": "{{company_id}}",
        "user_type": "CLIENT_SERVICE",
        "first_name": "john",
        "last_name": "doe",
        "middle_name": "",
        "can_start_analysis_biometry": true,
        "can_start_analysis_collection": true,
        "can_start_analysis_documents": true,
        "can_start_analysis_quality": true
      }
    }
    
    {
        "credentials": {
            "email": "[email protected]",
            "password": "your_client_service_user_password"
        },
        "profile": {
            "company_id": "{{company_id}}",
            "user_type": "CLIENT",
            "first_name": "john",
            "last_name": "doe",
            "middle_name": "",
            "is_admin": false,
            "is_service": true,
            "can_start_analyse_biometry": true,
            "can_start_analyse_collection": true,
            "can_start_analyse_documents": true,
            "can_start_analyse_quality": true
        }
    }
    curl -L 'https://{{host}}/api/authorize/service_token/{{user_id}}' \
    -H 'X-Forensic-Access-Token: token_id' \
    -H 'Content-Type: application/json'
    {
        "token_id": "token_id",
        "user_id": "user_id",
        "access_token": "service_token",
        "expire_date": 1904659888.282587,
        "session_id": 0
    }
    openssl genpkey -algorithm RSA -outform DER -out privateKey.der -pkeyopt rsa_keygen_bits:2048
    # for MacOS
    base64 -i privateKey.der -o privateKey.txt
    # for Linux 
    base64 -w 0 privateKey.der > privateKey.txt
    openssl rsa -pubout -inform DER -in privateKey.der -out publicKey.pub
    fun init(
        context: Context,
        licenseSources: List<LicenseSource>,
        masterLicenseSignature: String,
        statusListener: StatusListener<LicensePayload>? = null,
    )
    private fun getMasterSignature(): String {
        Security.insertProviderAt(org.spongycastle.jce.provider.BouncyCastleProvider(), 1)
    
        val privateKeyBase64String = "the string copied from the privateKey.txt file"
        // with key example:
        // val privateKeyBase64String = "MIIEpAIBAAKCAQEAxnpv02nNR34uNS0yLRK1o7Za2hs4Rr0s1V1/e1JZpCaK8o5/3uGV+qiaTbKqU6x1tTrlXwE2BRzZJLLQdTfBL/rzqVLQC/n+kAmvsqtHMTUqKquSybSTY/zAxqHF3Fk59Cqisr/KQamPh2tmg3Gu61rr9gU1rOglnuqt7FioNMCMvjW7ciPv+jiawLxaPrzNiApLqHVN+xCFh6LLb4YlGRaNUXlOgnoLGWSQEsLwBZFkDJDSLTJheNVn9oa3PXg4OIlJIPlYVKzIDDcSTNKdzM6opkS5d+86yjI1aTKEH3Zs64+QoEuoDfXUxS3TOUFx8P+wfjOR5tYAT+7TRN4ocwIDAQABAoIBAATWJPV05ZCxbXTURh29D/oOToZ0FVn78CS+44Vgy1hprAcfG9SVkK8L/r6X9PiXAkNJTR+Uivly64Oua8//bNC7f8aHgxRXojFmWwayj8iOMBncFnad1N2h4hy1AnpNHlFp3I8Yh1g0RpAZOOVJFucbTxaup9Ev0wLdWyGgQ3ENmRXAyLU5iUDwUSXg59RCBFKcmsMT2GmmJt1BU4P3lL9KVyLBktqeDWR/l5K5y8pPo6K7m9NaOkynpZo+mHVoOTCtmTj5TC/MH9YRHlF15VxQgBbZXuBPxlYoQCsMDEcZlMBWNw3cNR6VBmGiwHIc/tzSHZVsbY0VRCYEbxhCBZkCgYEA+Uz0VYKnIWViQF2Na6LFuqlfljZlkOvdpU4puYTCdlfpKNT3txYzO0T00HHY9YG9k1AW78YxQwsopOXDCmCqMoRqlbn1SBe6v49pVB85fPYU2+L+lftpPlx6Wa0xcgzwOBZonHb4kvp1tWhUH+B5t27gnvRz/rx5jV2EfmWinycCgYEAy8/aklZcgoXWf93N/0EZcfzQo90LfftkKonpzEyxSzqCw7B9fHY68q/j9HoP4xgJXUKbx1Fa8Wccc0DSoXsSiQFrLhnT8pE2s1ZWvPaUqyT5iOZOW6R+giFSLPWEdwm6+BeFoPQQFHf8XH3Z2QoAepPrEPiDoGN1GSIXcCwoe9UCgYEAgoKj4uQsJJKT1ghj0bZ79xVWQigmEbE47qI1u7Zhq1yoZkTfjcykc2HNHBaNszEBks45w7qo7WU5GOJjsdobH6kst0eLvfsWO9STGoPiL6YQE3EJQHFGjmwRbUL7AK7/Tw2EJG0wApn150s/xxRYBAyasPxegTwgEj6j7xu7/78CgYEAxbkI52zG5I0o0fWBcf9ayx2j30SDcJ3gx+/xlBRW74986pGeu48LkwMWV8fO/9YCx6nl7JC9dHI+xIT/kk8OZUGuFBRUbP95nLPHBB0Hj50YRDqBjCBh5qaizSEGeGFFNIfFSKddri3U8nnZTNiKLGCx7E3bjE7QfCh5qoX8ZF0CgYAtsEPTNKWZKA23qTFI+XAg/cVZpbSjvbHDSE8QB6X8iaKJFXbmIC0LV5tQO/KT4sK8g40m2N9JWUnaryTiXClaUGU3KnSlBdkIA+I77VvMKMGSg+uf4OdfJvvcs4hZTqZRdTm3dez8rsUdiW1cX/iI/dJxF4964YIFR65wL+SoRg=="
        val sig = Signature.getInstance("SHA512WithRSA")
        val keySpec = PKCS8EncodedKeySpec(Base64.decode(privateKeyBase64String, Base64.DEFAULT))
        val keyFactory = KeyFactory.getInstance("RSA")
        sig.initSign(keyFactory.generatePrivate(keySpec))
        sig.update(packageName.toByteArray(Charsets.UTF_8))
        return Base64.encodeToString(sig.sign(), Base64.DEFAULT).replace("\n", "")
    }
    Connection.fromServiceToken(
       "your API server host",
       "your token",
       listOf(
         SslPin(
           "your hash", // SHA256 key hash in base64
           <date> // key expiration date as a UNIX timestamp, UTC time
         )
       ),
     )
    let pins = [SSLPin.pin(
          publicKeyHash: "your hash", // SHA256 key hash in base64
          expirationDate: date)] // key expiration date as a UNIX timestamp, UTC time
    OZSDK.setApiConnection(.fromServiceToken(
            host: "your API server host",
            token: "your token",
            sslPins: pins)) { (token, error) in
              //
            }
    echo | openssl s_client -connect {SERVER_DOMAIN_NAME}:{PORT} 2> /dev/null | openssl x509 -pubkey -noout | openssl pkey -pubin -outform der | openssl dgst -sha256 -binary| openssl enc -base64
    openssl s_client -servername {SERVER_DOMAIN_NAME} -connect {SERVER_DOMAIN_NAME}:{PORT} | openssl x509 -noout -dates
    echo -n Q | openssl s_client -servername {SERVER_DOMAIN_NAME} -connect {SERVER_DOMAIN_NAME}:{PORT} | openssl x509 -noout -dates
    "items": 
     [
      {
       "analyses": 
        [
         {
          "source_media": 
           [
            {
             "media_id": "your_media_id", 
             "media_type": "VIDEO_FOLDER"
            }
           ],
          "results_media": 
           [
            {
             "analysis_id": "liveness_analysis_id",
             "results_data": 
              {
               "confidence_spoofing": 0.55790174
              }
    dependencies {
    implementation 'com.ozforensics.liveness:full:VERSION'
    }
    android {
      compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
      }
    }
    This guide describes the steps needed to add face matching to your liveness check.

    By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:

    • Integration of Oz Liveness Web SDK

    • Integration of Oz Liveness Mobile SDK

    hashtag
    Adding Photo ID Capture Step to Web SDK

    Simply add photo_id_front to the list of actions for the plugin, e.g.,

    hashtag
    Adding Face Matching to Android SDK

    For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.

    Modify the code that runs the analysis as follows:

    For on-device analyses, you can change the analysis mode from Analysis.Mode.SERVER_BASED to Analysis.Mode.ON_DEVICE

    Check also the Android sample apparrow-up-right source code.

    hashtag
    Adding Face Matching to iOS SDK

    For the purpose of this guide, it is assumed that your reference photo (e.g., front side of an ID) is stored on the device as reference.jpg.

    Modify the code that runs the analysis as follows:

    For on-device analyses, you can change the analysis mode from mode: .serverBased to mode: .onDevice

    Check also the iOS sample apparrow-up-right source code.

    hashtag
    Final notes for all SDKs

    You will be able to access your media and analysis results in Web UI via browser or programmatically via API.

    Oz API methods as well as Mobile and Web SDK methods can be combined with great flexibility. Explore the options available in the Developer Guide section.

    sample
  • Reactarrow-up-right sample

  • Vuearrow-up-right sample

  • Sveltearrow-up-right sample

  • herearrow-up-right
    Angulararrow-up-right
    Adding the Plugin to Your Web Pagechevron-right
    Launching the Pluginchevron-right
    Closing or Hiding the Pluginchevron-right
    Localization: Adding a Custom Language Packchevron-right
    Look-and-Feel Customizationchevron-right
    Security Recommendationschevron-right
    Browser Compatibilitychevron-right
    No-Server Licensingchevron-right

    Authentication

    hashtag
    Getting an Access Token

    To get an access token, call POST /api/authorize/auth/ with credentials (which you've got from us) containing the email and password needed in the request body. The host address should be the API address (the one you've also got from us).

    The successful response will return a pair of tokens: access_token and expire_token.

    access_token is a key that grants you access to system resources. To access a resource, you need to add your access_token to the header.

    headers = {'X-Forensic-Access-Token': <access_token>}

    access_token is time-limited; the lifetime depends on the account type:

    • service accounts – OZ_SESSION_LONGLIVE_TTL (5 years by default),

    • other accounts – OZ_SESSION_TTL (15 minutes by default).

    expire_token is the token you can use to renew your access token if necessary.
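    As an illustration, here is a minimal Python sketch that builds the auth request body and the X-Forensic-Access-Token header from the response; the field names follow this section:

```python
import json

def build_auth_request(email: str, password: str):
    """Body and headers for POST /api/authorize/auth/."""
    body = json.dumps({"credentials": {"email": email, "password": password}})
    return body, {"Content-Type": "application/json"}

def auth_headers(auth_response: str) -> dict:
    """Pull access_token out of the auth response and build the header
    required to access protected resources."""
    access_token = json.loads(auth_response)["access_token"]
    return {"X-Forensic-Access-Token": access_token}
```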

    hashtag
    Automatic session extension

    If expire_date > the current date, the session's expire_date is set to the current date plus the time period defined above (depending on the account type).
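    The extension rule can be sketched in Python; the TTL constants mirror the defaults listed above (server-side configuration may differ):

```python
OZ_SESSION_TTL = 15 * 60                       # non-service accounts: 15 minutes by default
OZ_SESSION_LONGLIVE_TTL = 5 * 365 * 24 * 3600  # service accounts: ~5 years by default

def extend_session(expire_date: float, now: float, is_service: bool) -> float:
    """Sliding expiration: while the session is still valid, each request
    pushes expire_date to now + TTL; an expired session is left unchanged."""
    if expire_date > now:
        return now + (OZ_SESSION_LONGLIVE_TTL if is_service else OZ_SESSION_TTL)
    return expire_date
```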

    hashtag
    Token Renewal

    To renew access_token and expire_token, call POST /api/authorize/refresh/. Add expire_token to the request body and X-Forensic-Access-Token to the header.

    In case of success, you'll receive a new pair of access_token and expire_token. The "old" pair will be deleted upon the first authentication with the renewed tokens.
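    A minimal Python sketch of building the refresh request; the body is assumed to be a JSON object with a single expire_token field:

```python
import json

def build_refresh_request(expire_token: str, access_token: str):
    """Body and headers for POST /api/authorize/refresh/: expire_token goes
    into the body, the current access token into X-Forensic-Access-Token."""
    body = json.dumps({"expire_token": expire_token})
    headers = {
        "Content-Type": "application/json",
        "X-Forensic-Access-Token": access_token,
    }
    return body, headers
```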

    hashtag
    Errors

    Customization Options for Older Versions (before 1.0.1)

    To set your own look-and-feel options, use the style section in the Ozliveness.open method. Here is what you can change:

    • faceFrame – the color of the frame around a face:

      • faceReady – the frame color when the face is correctly placed within the frame;

      • faceNotReady – the frame color when the face is placed improperly and can't be analyzed.

    • centerHint – the text of the hint that is displayed in the center.

      • textSize – the size of the text;

      • color

    • closeButton – the button that closes the plugin:

      • image – the button image, can be an image in PNG or dataURL in base64.

    • backgroundOutsideFrame – the color of the overlay filling (outside the frame):

      • color – the fill color.

    Example:

    API Error Codes

    hashtag
    HTTP Response Codes

    • Response codes 2XX indicate a successfully processed request (e.g., code 200 for retrieving data, code 201 for adding a new entity, code 204 for deletion, etc.).

    • Response codes 4XX indicate that a request could not be processed correctly because of some client-side data issues (e.g., 404 when addressing a non-existing resource).

    • Response codes 5XX indicate that an internal server-side error occurred during the request processing (e.g., when database is temporarily unavailable).

    hashtag
    Response Body with Errors

    Each error response includes an HTTP code and JSON data with an error description. It has the following structure:

    • error_code – integer error code;

    • error_message – text error description;

    • details

    Sample error response:

    Error codes:

    • 0 – UNKNOWN An unknown server error.

    • 1 – NOT ALLOWED An unallowed method was called; usually accompanied by the 405 HTTP response status. For example, requesting the PATCH method while only GET/POST are supported.

    • 2 – NOT REALIZED
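    A small Python helper for rendering such an error response; the mapping covers only the codes shown above:

```python
import json

KNOWN_ERRORS = {
    0: "UNKNOWN",      # unknown server error
    1: "NOT ALLOWED",  # unallowed method, usually with HTTP 405
    2: "NOT REALIZED",
}

def describe_error(response_body: str) -> str:
    """Render the error_code / error_message pair from an error response body."""
    err = json.loads(response_body)
    name = KNOWN_ERRORS.get(err.get("error_code"), "UNRECOGNIZED")
    return "{} {}: {}".format(err.get("error_code"), name, err.get("error_message"))
```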

    OzCapsula Data Container

    hashtag
    Configuring Oz API

    The main change in interaction with the API is that you need to send data with a different content type, as the data container is a binary file: Content-Type = application/octet-stream. We've added support for this content type along with the container functionality.

    Also, Instant API now requires client’s private and public keys to function. The paths to these keys should be specified in OZ_JWT_PRIVATE_KEY_PATH and OZ_JWT_PUBLIC_KEY_PATH in the configuration file.

    To generate them, use commands as listed below.

    hashtag
    Examples

    POST api/folders:

    POST api/instant/folders:

    hashtag
    Exceptions

    hashtag
    Obtaining a Session Token

    Before you start with SDK, obtain a session token:

    1. (Optional, only if you use stateful API) Authorize as any non-OPERATOR role.

    2. Call GET {{host}}/api/authorize/session_token.

    Example request

    Example response

    Capturing Videos

    hashtag
    OzCapsula (SDK v8.22 and newer)

    circle-exclamation

    Please note: all required data (other than the video) must be packaged into the container before starting the Liveness screen.

    Create a controller that will capture videos as follows:

    The delegate object must implement the OZLivenessDelegate protocol:

    hashtag
    SDK 8.21 and older

    Create a controller that will capture videos as follows:

    action – a list of user’s actions while capturing the video.

    Once video is captured, the system calls the onOZLivenessResult method:

    The method returns the results of video capturing: the [] objects. The system uses these objects to perform checks.

    circle-info

    If you use our SDK just for capturing videos, omit the Checking Liveness and Face Biometry step.

    If a user closes the capturing screen manually, the failedBecauseUserCancelled error appears.

    Uploading Media

    To launch one or more analyses for your media files, you need to create a folder via Oz API (or use an existing folder) and put the files into this folder. Each file should be marked by tags: they describe what's pictured in the media and determine the applicable analyses.

    circle-info

    For API 4.0.8 and below, please note: if you want to upload a photo for the subsequent Liveness analysis, put it into the ZIP archive and apply the video-related tags.
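    For those older API versions, the photo can be wrapped into a ZIP archive in memory before upload; a minimal Python sketch (the archive entry name is arbitrary):

```python
import io
import zipfile

def photo_to_zip(photo_bytes: bytes, name: str = "frame.jpg") -> bytes:
    """Wrap a single photo into an in-memory ZIP archive so it can be
    uploaded with video-related tags (API 4.0.8 and below)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name, photo_bytes)
    return buf.getvalue()
```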

    To create a folder and upload media to it, call POST /api/folders/

    To add files to the existing folder, call POST /api/folders/{{folder_id}}/media/

    Add the files to the request body; tags should be specified in the payload.

    Here's the example of the payload for a passive Liveness video and ID front side photo.

    An example of usage (Postman):

    The successful response will return the folder data.

    Changelog

    API Lite (FaceVer) changes

    hashtag
    1.2.3 – Nov., 2024

    • Fixed the bug with the time_created and folder_id parameters of the Detect method that sometimes might have been generated incorrectly.

    • Security updates.

    hashtag
    1.2.2 – Oct. 17, 2024

    • Updated models.

    hashtag
    1.2.1 – Sept. 05, 2024

    • The file size for the detect Liveness method is now capped at 15 MB, with a maximum of 10 files per request.

    • Updated the gesture list for best_shot analysis: it now supports head turns (left and right), tilts (up and down), smiling, and blinking.

    hashtag
    1.2.0 – July 26, 2024

    • Introduced the new that can process videos and archives as well.

    hashtag
    1.1.1 – Nov. 28, 2022

    • Added the .

    hashtag
    1.1.0

    • API Lite now accepts base64.

    hashtag
    09.2021

    • Improved the biometric model.

    • Added the 1:N mode.

    hashtag
    08.2021

    • Added the CORS policy.

    • Published the documentation.

    hashtag
    06.2021

    • Improved error messages – made them more detailed.

    • Simplified the Liveness/Detect methods.

    hashtag
    04.2021

    • Reworked and improved the core.

    • Added anti-spoofing algorithms.

    hashtag
    10.2020

    • Added the extract_and_compare method.

    Connecting SDK to API

    To connect SDK to Oz API, specify the API URL and as shown below.

    circle-exclamation

    Please note:

    General Security Recommendations

    This article covers common cyberattacks and the steps you can take to stay safe.

    hashtag
    Cyberattack Types

    The most common cyberattacks can be divided into three groups: injection, integration, and presentation attacks. Below, you’ll find some examples for mobile and web SDKs.

    hashtag

    Liveness

    The Liveness detection algorithm is intended to detect a real living person in a media.

    hashtag
    Requirements

    1. You're .

    private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
    
        val refFile = File(context.filesDir, "reference.jpg")
        val refMedia = OzAbstractMedia.OzDocumentPhoto(
            OzMediaTag.PhotoIdFront , // OzMediaTag.PhotoSelfie for a non-ID photo
            refFile.absolutePath
        )
    
        AnalysisRequest.Builder()
            .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList))
            .addAnalysis(Analysis(Analysis.Type.BIOMETRY, Analysis.Mode.SERVER_BASED, mediaList + refMedia))
            .build()
            .run(object : AnalysisRequest.AnalysisListener {
                override fun onSuccess(result: List<OzAnalysisResult>) {
                    result.forEach { 
                        println(it.resolution.name)
                        println(it.folderId)
                    }
                }
                override fun onError(error: OzException) {
                    error.printStackTrace()
                }
            })
    } 
    private void analyzeMedia(List<OzAbstractMedia> mediaList) {
        File refFile = new File(context.getFilesDir(), "reference.jpg");
        OzAbstractMedia refMedia = new OzAbstractMedia.OzDocumentPhoto(
                OzMediaTag.PhotoIdFront , // OzMediaTag.PhotoSelfie for a non-ID photo
                refFile.getAbsolutePath()
        );
        ArrayList<OzAbstractMedia> mediaWithReferencePhoto = new ArrayList<>(mediaList);
        mediaWithReferencePhoto.add(refMedia);
        new AnalysisRequest.Builder()
                .addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaList, Collections.emptyMap()))
                .addAnalysis(new Analysis(Analysis.Type.BIOMETRY, Analysis.Mode.SERVER_BASED, mediaWithReferencePhoto, Collections.emptyMap()))
                .build()
                .run(new AnalysisRequest.AnalysisListener() {
                    @Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
                    @Override
                    public void onSuccess(@NonNull List<OzAnalysisResult> list) {
                        String folderId = list.get(0).getFolderId();
                    }
                    @Override
                    public void onError(@NonNull OzException e) { e.printStackTrace(); }
        });
    }
    OzLiveness.open({
      session_token,
      lang: 'en',
      action: [
        'photo_id_front', 
        'video_selfie_blank'
      ],
      ...
    });
    let imageURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("reference.jpg")
    
    let refMedia = OZMedia.init(movement: .selfie,
                       mediaType: .movement,
                       metaData: nil,
                       videoURL: nil,
                       bestShotURL: imageURL,
                       preferredMediaURL: nil,
                       timestamp: Date())
       
    var mediaBiometry = [OZMedia]()
    mediaBiometry.append(refMedia)
    mediaBiometry.append(contentsOf: mediaToAnalyze)
    let analysisRequest = AnalysisRequestBuilder()
    let analysisBiometry = Analysis.init(media: mediaBiometry, type: .biometry, mode: .serverBased)
    let analysisQuality = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased)
    analysisRequest.addAnalysis(analysisBiometry)
    analysisRequest.addAnalysis(analysisQuality)
    analysisRequest.uploadMedia(mediaBiometry)
    analysisRequest.run(
        scenarioStateHandler: { state in }, // scenario steps progress handler
        uploadProgressHandler: { (progress) in } // file upload progress handler
    ) { (analysisResults : [OzAnalysisResult], error) in
        // receive and handle analyses results here
        for result in analysisResults {
            print(result.resolution)
            print(result.folderID)
        }
    }
    {
    	"credentials": {
    		"email": "{{user_email}}", // your login
    		"password": "{{user_password}}" // your password
    	}
    }
    Liveness detect method
    version check method

    Error code

    Error message

    What caused the error

    400

    Could not locate field for key_path expire_token from provided dict data

    expire_token hasn't been found in the request body.

    401

    Session not found

    The session with the expire_token you have passed doesn't exist.

    403

    You have not access to refresh this session

    The user making the request is not the owner of the session with this expire_token.

  • color – the color of the text;
  • yPosition – the vertical position measured from top;

  • letterSpacing – the spacing between letters;

  • fontStyle – the style of font (bold, italic, etc.).

  • details – additional error details (the format is specific to each case). Can be empty.
    The method is documented but is not implemented, for a temporary or permanent reason.
  • 3 - INVALID STRUCTURE Incorrect structure of request. Some required fields missing or a format validation error occurred.

  • 4 - INVALID VALUE Incorrect value of the parameter inside request body or query.

  • 5 - INVALID TYPE The invalid data type of the request parameter.

  • 6 - AUTH NOT PROVIDED Access token not specified.

  • 7 - AUTH INVALID The access token does not exist in the database.

  • 8 - AUTH EXPIRED Auth token is expired.

  • 9 - AUTH FORBIDDEN Access denied for the current user.

  • 10 - NOT EXIST The requested resource is not found (equivalent of HTTP status code 404).

  • 11 - EXTERNAL SERVICE Error in the external information system.

  • 12 – DATABASE Critical database error on the server host.
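For illustration, here is how a client might branch on the error codes listed above. This is a sketch of our own: the classify_error helper and the action labels are not part of Oz API; only the numeric codes come from the list.

```python
# Illustrative sketch (not part of Oz API): branch on the error_code
# values listed above to decide what the client should do next.
RETRYABLE = {11, 12}   # EXTERNAL SERVICE, DATABASE
REAUTH = {6, 7, 8}     # AUTH NOT PROVIDED, AUTH INVALID, AUTH EXPIRED

def classify_error(body: dict) -> str:
    """Map an Oz API error body to a coarse client action (names are ours)."""
    code = body.get("error_code")
    if code in REAUTH:
        return "reauthorize"   # obtain a fresh access token, then retry
    if code in RETRYABLE:
        return "retry"         # transient server-side problem
    if code == 10:
        return "not_found"     # analogue of HTTP 404
    return "fix_request"       # client-side problem: structure/value/type

print(classify_error({"error_code": 8, "error_message": "Auth token is expired"}))
# → reauthorize
```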

  • Error code

    Error message

    Description

    13

    No data container provided

    API didn’t receive the container

    14

    • Data container unpacking failed

    • Invalid Data Container

    • Invalid signature

    • Invalid SummingHash

    • Invalid or empty Session Token

    The container appears to contain errors and can’t be unpackaged.

    actions
    OZMedia

    You have already created a folder and added your media, marked with the correct tags, to this folder. For API 4.0.8 and below, please note: the Liveness analysis works with videos and shotsets; images are ignored. If you want to analyze an image, upload it as a shotset (an archive) with a single image and mark it with the video_selfie_blank tag.
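The single-image shotset workaround can be assembled in a few lines. The make_shotset helper below is our own illustrative code, not part of any Oz SDK; only the tag value comes from this documentation.

```python
# Illustrative helper (not part of Oz SDK): pack one image into a zip
# archive so it can be uploaded as a shotset for the Liveness analysis.
import io
import zipfile

def make_shotset(image_bytes: bytes, name: str = "frame_0.jpg") -> bytes:
    """Return an in-memory zip archive holding a single image."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name, image_bytes)
    return buf.getvalue()

# The archive is then uploaded like a video and tagged accordingly:
shotset_bytes = make_shotset(b"<your JPEG bytes>")
tags = ["video_selfie", "video_selfie_blank"]  # mark the shotset as a blank selfie
```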

    Processing Steps

    1. Initiate the analysis for the folder: POST /api/folders/{{folder_id}}/analyses/

    If you want to use a webhook for response, add it to the payload at this step, as described here.

    You'll need the analysis_id or folder_id from the response.

    2. If you use a webhook, just wait for it to return the information needed. Otherwise, initiate polling:

    • GET /api/analyses/{{analysis_id}} – for the analysis_id you have from the previous step.

    • GET api/folders/{{folder_id}}/analyses/ – for all analyses performed on media in the folder with the folder_id you have from the previous step.

    Repeat the check until the resolution_status and resolution fields change to any status other than PROCESSING, and treat this as the result.

    For the Liveness analysis, look for the confidence_spoofing value related to the video you need. It indicates the probability that the person on the video is not real.
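The polling loop from the steps above can be sketched as follows. The wait_for_analysis helper is our own; fetch() stands in for your HTTP GET of /api/analyses/{analysis_id} (e.g. requests.get(...).json()), and the stubbed responses, including the FINISHED value, only illustrate the shape of the data described above.

```python
# Sketch of the polling step: keep requesting the analysis until its
# resolution_status leaves PROCESSING, then read confidence_spoofing.
import time

def wait_for_analysis(fetch, analysis_id, interval=1.0, max_tries=60):
    """Poll until resolution_status leaves PROCESSING; return the final body."""
    for _ in range(max_tries):
        body = fetch(analysis_id)
        if body["resolution_status"] != "PROCESSING":
            return body
        time.sleep(interval)
    raise TimeoutError("analysis did not finish in time")

# Stubbed demonstration (no network involved):
responses = iter([
    {"resolution_status": "PROCESSING"},
    {"resolution_status": "PROCESSING"},
    {"resolution_status": "FINISHED", "resolution": "SUCCESS",
     "results_media": [{"results_data": {"confidence_spoofing": 0.057}}]},
])
final = wait_for_analysis(lambda _id: next(responses), "my-analysis-id", interval=0)
spoofing = final["results_media"][0]["results_data"]["confidence_spoofing"]
```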

    authorized
    {
        "expire_token": "{{expire_token}}"
    }
    OzLiveness.open({
      // ...
      style: {
            // the backward compatibility block
        doc_color: "", 
        face_color_success: "",
        face_color_fail: "", 
    	// the current customization block
        faceFrame: {
          faceReady: "",
          faceNotReady: "",
        },
        centerHint: {
          textSize: "",
          color: "",
          yPosition: "",
          letterSpacing: "", 
          fontStyle: "", 
        },
        closeButton: {
          image: "",
        },
        backgroundOutsideFrame: {
          color: "", 
        },
      },
      // ...
    });
    {
        "error_code": 0,
        "error_message": "Unknown server side error occurred",
        "details": null
    }
    # Generate private key:
    openssl ecparam -name secp384r1 -genkey -noout | openssl pkcs8 -topk8 -nocrypt -out ./jwt.key
    # Generate public key:
    openssl ec -in ./jwt.key -pubout -out ./jwt.pub
    curl -X POST \
      '{{host}}/api/folders' \
      -H 'Content-Type: application/octet-stream' \
      -H 'X-Forensic-Access-Token: <YOUR_TOKEN>' \
      --data-binary '@/path/to/container.dat'
    curl -X POST \
      '{{host}}/api/instant/folders' \
      -H 'Content-Type: application/octet-stream' \
      -H 'X-Forensic-Access-Token: <YOUR_TOKEN>' \
      --data-binary '@/path/to/container.dat'
    curl -L 'https://{{host}}/api/authorize/session_token' \
    -H 'X-Forensic-Access-Token: <token>' \
    -H 'Content-Type: application/json'
    {
        "session_token": "<session_token>"
    }
    getSessionToken() { sessionToken in
                DispatchQueue.main.async {
                    do {
                        let action:OZVerificationMovement = .selfie
                        let mediaRequest = MediaRequest.action(action)
                        let profile = AnalysisProfile(mediaList: [mediaRequest],
                                                      type: .quality,
                                                      params: [:] )
                        let request = CaptureRequest(analysisProfileList: [profile], cameraPosition: .front)
                        let ozLivenessVC = try OZSDK.createMediaCaptureScreen(self, request, sessionToken: sessionToken)
                        self.present(ozLivenessVC, animated: true)
                    } catch let error {
                        print(error.localizedDescription)
                    }
                }
            }
    extension ViewController: LivenessDelegate {
        func onResult(container: DataContainer) {
        }
        func onError(status: OZVerificationStatus?) {
        }
    }
    let actions: [OZVerificationMovement] = [.selfie]
    let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(self, actions: actions)
    self.present(ozLivenessVC, animated: true)
    extension ViewController: OZLivenessDelegate {
        func onError(status: OZVerificationStatus?) {
            // show error
        }
        func onOZLivenessResult(results: [OZMedia]) {
            // proceed to the checks step
        }
    }
    payload
    {
        "media:tags": { // this section sets the tags for the media files that you upload
        // media files are referenced by the keys in a multipart form
            "video1": [ // your file key
            // a typical set of tags for a passive Liveness video
                "video_selfie", // video of a person
                "video_selfie_blank", // no gesture used
                "orientation_portrait" // video orientation
            ],
            "photo1": [
            // a typical set of tags for an ID front side
                "photo_id",
                "photo_id_front"
            ]
        }
    }
    request body
    {  
      "analyses": [
        {
          "type": "quality",
          "source_media": ["1111aaaa-11aa-11aa-11aa-111111aaaaaa"], // optional; omit to include all media from the folder
          ...
        }
      ]
    }
    [
      {
        // you may have multiple analyses in the list
        // pick the one you need by analyse_id or type
        "analysis_id": "1111aaaa-11aa-11aa-11aa-111111aaaaaa",
        "type": "QUALITY",
        "results_media": [
          {
            // if you have multiple media in one analysis, match score with media by source_video_id/source_shots_set_id 
            "source_video_id": "1111aaaa-11aa-11aa-11aa-111111aaaaab", // for shots_set media, the key would be source_shots_set_id 
            "results_data": 
            {
              "confidence_spoofing": 0.05790174 // quantitative score for this media
            }
          }
        ],
        "resolution_status": "SUCCESS", // qualitative resolution (based on all media)
        ...
      }
      ...
    ]

    In your host application, it is recommended to set the API address on the screen that precedes the liveness check. Setting the API URL initiates a service call to the API, which may cause excessive server load when done at application initialization or startup. We recommend calling the setApiConnection method once, for example, in the Application class.

  • The order of SDK initialization and API connection does not matter, but both methods must be finished successfully before invoking the createStartIntent method.

  • Alternatively, you can use the login and password provided by your Oz Forensics account manager:

    OzLivenessSDK.setApiConnection(
        OzConnection.fromCredentials(host, username, password),
        statusListener(
            { token -> /* token */ },
            { ex -> /* error */ }
        )
    )

    OzLivenessSDK.INSTANCE.setApiConnection(
            OzConnection.Companion.fromCredentials(host, username, password),
            new StatusListener<String>() {
                @Override
                public void onStatusChanged(@Nullable String s) {}
                @Override
                public void onSuccess(String token) { /* token */ }
                @Override
                public void onError(@NonNull OzException e) { /* error */ }
            }
    );
    

    However, for security reasons, the preferred option is authentication via an access token.

    By default, logs are saved along with the analyses' data. If you need to keep the logs separate from the analysis data, set up a separate connection for telemetry as shown below:

    OzLivenessSDK.setEventsConnection(
        OzConnection.fromCredentials(
            "https://echo.cdn.ozforensics.com/",
            "<[email protected]>",
            "your_telemetry_password"
        )
    )

    OzLivenessSDK.setEventsConnection(
            OzConnection.fromCredentials(
                    "https://tm.ozforensics.com/",
                    "<[email protected]>",
                    "your_telemetry_password"
            )
    );
    

    Clearing authorization:

    Other Methods

    Check for the presence of the saved Oz API access token:

    LogOut:

    access token
    OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(host, token))
    OzLivenessSDK.INSTANCE.setApiConnection(
            OzConnection.Companion.fromServiceToken(host, token), 
            null
    );
    Injection Attacks

    An injection attack on a liveness detection system is an attempt to bypass liveness verification mechanisms by injecting falsified data or modifying the execution logic of the frontend SDK. This type of attack is typically carried out by code injection, tampering with the runtime environment, or replacing camera input components (e.g., intercepting and substituting the video stream in real time, using virtual cameras, or manipulating JavaScript logic in the Web SDK). The goal of such attacks is to trick the liveness system into accepting fake or pre-recorded content as a genuine live interaction with the user.

    Examples for mobile SDKs:

    • Virtual cameras.

    • File system integrity compromise.

    • Function hooking & application modification.

    • Emulators and cloud devices.

    For web:

    • Virtual cameras.

    • Code injection attacks.

    Presentation Attacks

    A presentation attack is an attempt to deceive the system by presenting pre-recorded or artificial content that mimics a real user. The goal of such attacks is to pass the liveness check without involving a real, live person. These attacks do not target the SDK directly but rather the biometric models on the backend. They may include:

    • Photos,

    • Videos,

    • 3D masks,

    • Screens of other devices, or

    • Other media used to create the illusion of live presence.

    Other (System Manipulation) Attacks

    These attacks include cyber fraudsters manipulating how the liveness detection module is integrated into the application or backend, bypassing or faking the check. Typically, these attacks involve patching the app, injecting hooks, or exploiting weak verification of liveness results. One example is a Man-in-the-Middle (MitM) / SSL interception attack, which is based on substitution or manipulation of captured data during network transmission, typically involving SSL/TLS violations or a certificate pinning bypass.

    Oz Software Built-In Security Measures

    With cyberattacks on the rise, cybersecurity has become crucial and is now our highest priority. We provide protection even from multi-vector complex attacks, ensuring your data is safe at all stages of processing, including media capture, data transmission, and analysis. This protection involves many mechanisms on multiple layers that work together, supporting and reinforcing each other. To name a few:

    • We do not accept virtual cameras and emulators.

    • In native SDKs, you can configure SSL pinning and add protection for media files using request payload.

    • For Web SDK, you can move the decision logic to backend to avoid manipulating data within the browser context.

    As our software aims to be embedded, it includes mechanisms to verify its runtime integrity, but it does not validate the integrity of the host application itself. Ensuring protection of the host application through anti-tampering techniques, code obfuscation, and runtime integrity verification is the responsibility of the host application owner. Without such safeguards, even a secure SDK may become susceptible to manipulation at the application or platform level.

    Recommendations for Host Application Protection

    Here are some measures we recommend to protect your application.

    1. Consider revising your policies. This might involve:

      • Creating and using corporate SSL certificates,

      • Limiting access to unverified sources,

      • Using SSL proxy,

      • Controlling connections via SNI / TLS Handshake,

      • Creating a security policy and adhering to it,

      • etc.

    2. For mobile applications, use Play Integrity (Android) and App Attest (iOS).

    3. As for our SDKs, we recommend:

      • Ensuring you always have the latest version of Oz software installed, as almost every release includes security enhancements.

      • Setting up SSL pinning for Native SDKs.

    For more detailed recommendations, please contact us. For us, clients' safety comes first, and we’ll be happy to help.

    How to Integrate Server-Based Liveness into Your Web Application

    This guide outlines the steps for integrating the Oz Liveness Web SDK into a customer web application for capturing facial videos and subsequently analyzing them on a server.

    The SDK implements the ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. Under the hood, it communicates with Oz API.

    Oz Liveness Web SDK detects both presentation and injection attacks. An injection attack is an attempt to feed pre-recorded video into the system using a virtual camera.

    Finally, while the cloud-based service provides the fully-fledged functionality, we also offer an on-premise version with the same functions but no need for sending any data to our cloud. We recommend starting with the SaaS mode and then reconnecting your web app to the on-premise Web Adapter and Oz API to ensure seamless integration between your front end and back end. With these guidelines in mind, integrating the Oz Liveness Web SDK into your web application can be a simple and straightforward process.

    1. Get your Web Adapter

    Tell us the domain names of the pages from which you are going to call Web SDK, and an email for admin access, e.g.:

    In response, you’ll get URLs and credentials for further integration and usage. When using SaaS API, you get them :

    2. Obtain a session token from Oz API

    A session token is required for the plugin to function.

    3. Add Web Plugin to your web pages

    Add the following tags to your HTML code. Use Web Adapter URL received before:

    4. Implement your logic around Web Plugin

    Add the code that opens the plugin and handles the results. You'll require a session token from step 2.


    With these steps, you are done with the basic integration of Web SDK into your web application. You will be able to access recorded media and analysis results in the Web Console via browser or programmatically via API (see the instructions on retrieving an MP4 video and getting analysis results).

    In the Web Plugin Developer Guide, you can find instructions for common next steps:

    • Customizing plugin look-and-feel

    • Adding custom language pack

    • Tuning plugin behavior

    Please find a sample for Oz Liveness Web SDK here. To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.

    For Angular and React, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.

    • sample

    Changelog

    8.22.0 – Dec. 23, 2025

    • Changed the way you add our SDK to pubspec.yaml. Please check the installation section.

    • Android:

      • Fixed the bug with green videos on some smartphone models.

      • Fixed occasional SDK crashes in specific cases and / or on specific devices.

      • Resolved the issue with mediaId appearing null.

    • iOS:

      • Fixed the bug with crashes that might happen during the Biometry analysis after taking a reference photo using camera.

      • Resolved the issue with SDK not returning the license-related callbacks.

    8.19.0 – Nov. 24, 2025

    • Android:

      • Resolved an issue with a warning that could appear when running a Fragment.

      • SDK no longer crashes when calling copyPlane.

    8.18.1 – Sept. 10, 2025

    • initSDK in the iOS debugging mode now works properly.

    8.18.0 – Aug. 26, 2025

    • You can now run an analysis for the particular folder.

    • Fixed an error in the example code.

    • The Scan gesture hint is now properly voiced.

    8.16.0 – Apr. 30, 2025

    • Changed the wording for the head_down gesture: the new wording is “tilt down”.

    • Updated the authorization logic.

    • Improved voiceover.

    8.14.0 – Dec. 17, 2024

    • Security and telemetry updates.

    • The SDK hints and UI controls can be voiced in accordance with WCAG requirements.

    • Improved user experience with head movement gestures.

    8.12.0 – Oct. 11, 2024

    • The executeLiveness method is now deprecated, please use startLiveness instead.

    • Updated the code needed to obtain the Liveness results.

    • Security and telemetry updates.

    8.8.2 – June 27, 2024

    • Added descriptions for the errors that occur when providing an empty string as an ID in the addFolderID (iOS) and setFolderID (Android) methods.

    • Android:

      • Fixed a bug causing an endless spinner to appear if the user switches to another application during the Liveness check.

    8.6.0 – Apr. 15, 2024

    • Android:

      • Upgraded the on-device Liveness model.

      • Security updates.

    8.5.0 – Mar. 20, 2024

    • The length of the Selfie gesture is now configurable (affects the video file size).

    • Removed the pause after the Scan gesture.

    • Security and logging updates.

    8.4.0 – Jan. 11, 2024

    • Android: updated the on-device Liveness model.

    • iOS: changed the default behavior in case a localization key is missing: now the English string value is displayed instead of a key.

    • Fixed some bugs.

    8.3.0 – Nov. 30, 2023

    • Implemented the possibility of using a master license that works with any bundle_id.

    • Fixed the bug with background color flashing.

    • Video compression failure on some phone models is now fixed.

    8.2.0 – Nov. 17, 2023

    • Initial release.

    See the oz-liveness-ios-sample project in the Oz Forensics public GitLab.

    How to Integrate Server-Based Liveness into Your Mobile Application

    This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and subsequently analyzing them on the server.

    The SDK implements a ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results. The SDK methods for liveness analysis communicate with Oz API under the hood.

    Before you begin, make sure you have Oz API credentials. When using SaaS API, you get them :

    For the on-premise Oz API, you need to create a user yourself or ask your team that manages the API. See the guide on user creation via Web Console.

    Launching the Plugin

    The plugin window is launched with open(options) method:

    Use GET /api/folders/?meta_data=transaction_id==<your_transaction_id> to find a folder in Oz API from your backend by your unique identifier.

    Read more about .


    How to Add Face Matching of Liveness Video with a Reference Photo From Your Database


    Please note: this guide applies to the non-container flow only.

    This guide describes how to match a liveness video with a reference photo of a person that is already stored in your database.

    However, if you prefer to include a photo ID capture step in your liveness process instead of using a stored photo, you can refer to another guide in this section.

    Media Tags

    What Are Tags for

    To work properly, the resolution algorithms need each uploaded media to be marked with special tags. For video and images, the tags are different. They help algorithms to identify what should be in the photo or video and analyze the content.


    OzLivenessSDK.setApiConnection(null)
    OzLivenessSDK.INSTANCE.setApiConnection(null, null);
    val isLoggedIn = OzLivenessSDK.isLoggedIn
    boolean isLoggedIn = OzLivenessSDK.INSTANCE.isLoggedIn();
    OzLivenessSDK.logout()
    OzLivenessSDK.INSTANCE.logout();
    For Web SDK, retrieving and processing the analyses' results on your backend to avoid possible manipulations within the browser context, and using the Safe mode when setting up responses.
    Improved SDK performance for some devices.
  • Updated SDK to support the upcoming security features.

  • Enhanced security.

  • Resolved the issue with SDK sometimes not responding to user actions on some devices.
  • Updated SDK to support the upcoming security features.

  • Enhanced security.

  • When you choose to send compressed videos for a hybrid analysis, SDK no longer saves original media as well as compressed.

  • iOS:

    • The Scan gesture animation now works properly.

    • Fixed the bug where SDK didn’t call completion during initialization in debug mode.

  • Enhanced security.

  • If you try to delete the reference photo, SDK now asks you to confirm deletion.
  • Background is no longer dark when you launch SDK.

  • SDK no longer flips one of the images during the Biometry analysis.

  • Fixed some bugs related to the on-device and hybrid analysis types.

  • Android:

    • Added support for Google Dynamic Feature Delivery.

    • Resolved the issue with possible SDK crashes when closing the Liveness screen.

  • iOS:

    • Resolved the issue with integration via Swift UI.

  • Enhanced security and updated telemetry.

  • Bug fixes.
  • Security updates.

  • Android: you can now disable the video validation that was implemented to avoid recording extremely short videos (3 frames or fewer).

  • iOS:

    • SDK now compresses videos if their size exceeds 10 MB.

    • Head movement gestures are now handled properly.

    • Xcode updated to version 16 to comply with Apple requirements.

  • Android:
    • Moved the large video compression step to the Liveness screen closure.

    • Fixed the bug when the best shot frame could contain an image with closed eyes.

    • Resolved codec issues on some smartphone models.

    • Fixed the bug when the recorded videos might appear green.

  • iOS:

    • Added Xcode 16 support.

    • The screen brightness no longer changes when the rear camera is used.

    • Fixed the video recording issues on some smartphone models.

  • Fixed some smartphone-model-specific bugs.

  • Security and logging updates.

  • iOS:
    • The messages displayed by the SDK after uploading media have been synchronized with Android.

    • The bug causing analysis delays that might have occurred for the One Shot gesture has been fixed.

    Bug fixes.
  • Android: if the recorded video is larger than 10 MB, it gets compressed.


    For the on-premise Oz API, you need to create a user yourself or ask your team that manages the API. See the guide on user creation via Web Console. Consider the proper user role (CLIENT in most cases or CLIENT ADMIN, if you are going to make SDK work with the pre-created folders from other API users). In the end, you need to obtain a similar set of credentials as you would get for the SaaS scenario.

    (Optional, only if you use stateful API) Authorize as any non-OPERATOR role.
  • Call GET {{host}}/api/authorize/session_token.

  • Example request

    Keep in mind that it is more secure to get your back end responsible for the decision logic. You can find more details including code samples here.
    Plugin parameters and callbacks
  • Security recommendations

  • Svelte sample

    Domain names from which Web SDK will be called:

    1. www.yourbrand.com

    2. www.yourbrand2.com

    Email for admin access:

    • [email protected]


    Login: [email protected]

    Password: …

    API: https://sandbox.ohio.ozforensics.com/

    Web Console: https://sandbox.ohio.ozforensics.com

    Web Adapter: https://web-sdk.cdn.sandbox.ozforensics.com/your_company_name/

    By this time you should have already implemented liveness video recording and liveness check. If not, please refer to these guides:
    • Integration of Oz Liveness Web SDK

    • Integration of Oz Liveness Mobile SDK

    In this scenario, you upload your reference image to the same folder where you have a liveness video, initiate the BIOMETRY analysis, and poll for the results.

    1. Get folder_id

    Given that you already have the liveness video recorded and uploaded, you will be working with the same Oz API folder where your liveness video is. Obtain the folder ID as described below, and pass it to your back end.

    • For a video recorded by Web SDK, get the folder_id as described here.

    • For a video recorded by Android or iOS SDK, retrieve the folder_id from the analysis’ results as shown below:

    Android:

    iOS:

    2. Upload your reference photo

    Call the POST /api/folders/{{folder_id}}/media/ method, replacing the folder_id with the ID you’ve got in the previous step. This will upload your new media to the folder where your ready-made liveness video is located.

    Set the appropriate tags in the payload field of the request, depending on the nature of a reference photo that you have.

    3. Initiate the analysis

    To launch the analysis, call POST /api/folders/{{folder_id}}/analyses/ with the folder_id from the previous step. In the request body, specify the biometry check to be launched.

    4. Poll for the results

    Repeat calling GET /api/analyses/{{analysis_id}} with the analysis_id from the previous step once a second until the state changes from PROCESSING to something else. For a finished analysis:

    • get the qualitative result from resolution (SUCCESS or DECLINED);

    • get the quantitative result from analyses.results_data.min_confidence.
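Steps 2–4 can be sketched end to end as follows. This is an illustration under assumptions: run_face_matching, api_post, and api_get are our names for your HTTP wrappers (which would carry the X-Forensic-Access-Token header), and the response shapes are simplified; consult the Postman collection for the exact bodies.

```python
# Sketch of the face-matching flow: upload a reference photo, start the
# biometry analysis, then poll until the state leaves PROCESSING.
import time

def run_face_matching(api_post, api_get, folder_id, photo_tags, interval=1.0):
    # 2. Upload the reference photo with its tags.
    api_post(f"/api/folders/{folder_id}/media/",
             {"media:tags": {"photo1": photo_tags}})
    # 3. Initiate the biometry analysis.
    started = api_post(f"/api/folders/{folder_id}/analyses/",
                       {"analyses": [{"type": "biometry"}]})
    analysis_id = started["analysis_id"]
    # 4. Poll once a second until the analysis finishes.
    while True:
        body = api_get(f"/api/analyses/{analysis_id}")
        if body["state"] != "PROCESSING":
            return body["resolution"], body["results_data"]["min_confidence"]
        time.sleep(interval)

# Stubbed demonstration (no network involved):
def fake_post(path, payload):
    return {"analysis_id": "a1"}

replies = iter([
    {"state": "PROCESSING"},
    {"state": "FINISHED", "resolution": "SUCCESS",
     "results_data": {"min_confidence": 0.91}},
])
resolution, score = run_face_matching(fake_post, lambda path: next(replies),
                                      "f1", ["photo_id", "photo_id_front"],
                                      interval=0)
```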

    Here is the Postman collection for this guide.

    With these steps completed, you are done with adding face matching via Oz API. You will be able to access your media and analysis results in Web UI via browser or programmatically via API.

    Oz API methods can be combined with great flexibility. Explore Oz API using the API Developer Guide.

    Face Matching with a Reference Photo.postman_collection.json (6KB)
    <script src="https://<web-adapter-url>/plugin_liveness.php"></script>
    OzLiveness.open({
      session_token,
      lang: 'en',
      action: [
        // 'photo_id_front', // request photo ID picture
        'video_selfie_blank' // request passive liveness video
      ],
      on_complete: function (result) {
        // This callback is invoked when the analysis is complete
        console.log('on_complete', result);
      }
    });
    curl -L 'https://{{host}}/api/authorize/session_token' \
    -H 'X-Forensic-Access-Token: <token>' \
    -H 'Content-Type: application/json'
    AnalysisRequest.Builder()
            ...
            .run(object : AnalysisRequest.AnalysisListener {
                override fun onSuccess(result: List<OzAnalysisResult>) {
                    // save folder_id that is needed for the next step
                    val folderId = result.firstOrNull()?.folderId
                }
                ...
            })
    private void analyzeMedia(List<OzAbstractMedia> mediaList) {
        new AnalysisRequest.Builder()
                ...
                .run(new AnalysisRequest.AnalysisListener() {
                    @Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
                    @Override
                    public void onSuccess(@NonNull List<OzAnalysisResult> list) {
                        // save folder_id that is needed for the next step
                        String folderId = list.get(0).getFolderId();
                    }
                    ...
        });
    }
    analysisRequest.run(
    scenarioStateHandler: { state in }, 
    uploadProgressHandler: { (progress) in }  
    )   { (analysisResults : [OzAnalysisResult], error) in 
            // save folder_id that is needed for the next step
            let folderID = analysisResults.first?.folderID
        }
    }
    {
      "media:tags": { 
        "photo1": [
            "photo_id", "photo_id_front" // for the front side of an ID
            // OR
            "photo_selfie" // for a non-ID photo
        ]
      }
    }
    {
        "analyses": [
            {
                "type": "biometry"
            }
        ]
    }
    . Consider the proper user role (CLIENT in most cases or CLIENT ADMIN, if you are going to make SDK work with the pre-created folders from other API users). In the end, you need to obtain a similar set of credentials as you would get for the SaaS scenario.

    We also recommend that you use our logging service called telemetry, as it helps a lot in investigating attacks' details. For Oz API users, the service is enabled by default. For on-premise installations, we'll provide you with credentials.

Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. Issue a 1-month trial license on our websitearrow-up-right or email usenvelope for a long-term license.

    hashtag
    Android

    1

    hashtag
    Add SDK to your project

    In the build.gradle of your project, add:

    In the build.gradle of the module, add:

    2

    hashtag
    Initialize SDK

    Rename the license file to forensics.license and place it into the project's res/raw folder.

    3

    hashtag
    Connect SDK to Oz API

    Use API credentials (login, password, and API URL) that you’ve got from us.

In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end with the autharrow-up-right API method, then pass it to your application:
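As an illustration, the back-end part of this step might look like the sketch below (Node 18+ with global fetch). The /api/authorize/auth path and the body shape are assumptions based on common Oz API usage; verify them against your API version.

```javascript
// Build the auth request body once, so it can be reused and unit-tested.
function buildAuthRequest(email, password) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ credentials: { email, password } })
  };
}

// Exchange service credentials for an access token on your back end,
// then hand the token to the mobile application.
async function getAccessToken(host, email, password) {
  const response = await fetch(`${host}/api/authorize/auth`, buildAuthRequest(email, password));
  const data = await response.json();
  return data.access_token; // pass this token to the application
}
```

This keeps the credentials on the server; the application only ever sees a short-lived token.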

    4

    hashtag
    Add face recording

    To start recording, use startActivityForResult:

To obtain the captured video, use onActivityResult:

    5

    hashtag
    Run analyses

    To run the analyses, execute the code below.

    hashtag
    iOS

    1

    hashtag
    Add our SDK to your project

    CocoaPodsarrow-up-right

    To integrate OZLivenessSDK into an Xcode project, add to Podfile:

    SPM

Add the following package dependencies via SPM: https://gitlab.com/oz-forensics/oz-mobile-ios-sdkarrow-up-right (if you need a guide on adding the package dependencies, please refer to the Apple documentationarrow-up-right). OzLivenessSDK is mandatory. Skip the OzLivenessSDKOnDevice file.

    2

    hashtag
    Initialize SDK

    Rename the license file to forensics.license and put it into the project.

    3

    hashtag
    Connect SDK to Oz API

    Use API credentials (login, password, and API URL) that you’ve got from us.

In production, instead of hard-coding the login and password in the application, it is recommended to get an access token on your back end using the autharrow-up-right API method, then pass it to your application:

    4

    hashtag
    Add face recording

    Create a controller that will capture videos as follows:

    The delegate object must implement the OZLivenessDelegate protocol:

    5

    hashtag
    Run analyses

    Use AnalysisRequestBuilder to initiate the Liveness analysis. The communication with Oz API is under the hood of the run method.

    With these steps, you are done with basic integration of Mobile SDKs. You will be able to access recorded media and analysis results in Web Console via browser or programmatically via API.

    In developer guides, you can also find instructions for customizing the SDK look-and-feel and access the full list of our Mobile SDK methods. Check out the table below:

Android sample apparrow-up-right source codes

iOS sample apparrow-up-right source codes

Android OzLiveness SDK Developer Guidearrow-up-right

iOS OzLiveness SDK Developer Guidearrow-up-right

Demo apparrow-up-right in PlayMarket

Demo apparrow-up-right in TestFlight

    Login: [email protected]

    Password: …

    API: https://sandbox.ohio.ozforensics.com

    Web Console: https://sandbox.ohio.ozforensics.com

    Parameters

    The full list of OzLiveness.open() parameters:

    • options – an object with the following settings:

      • session_token – a specific token required to use OzCapsula data container.

      • token – (optional) the auth token;

      • license – an object containing the license data;

      • licenseUrl – a string containing the path to the license;

      • lang – a string containing the identifier of one of the installed language packs;

  • meta – an object with names of meta fields in keys and their string values in values. Metadata is transferred to Oz API and can be used to obtain analysis results or for searching;

  • params – an object with identifiers and additional parameters:

    • extract_best_shot – true or false: run the best frame choice in the Quality analysis;

  • action – an array of strings with identifiers of actions to be performed. Available actions:

        • photo_id_front – photo of the ID front side;

    • photo_id_back – photo of the ID back side;

      • overlay_options – the document's template displaying options:

        • show_document_pattern: true/false – true by default, displays a template image, if set to false, the image is replaced by a rectangular frame;

  • on_submit – a callback function (no arguments) that is called after submitting customer data to the server (unavailable for the capture mode).

  • on_capture_complete – a callback function (with one argument) that is called after the video is captured and retrieves the information on this video. The example of the response is described here.

  • on_result – a callback function (with one argument) that is called periodically during the analysis and retrieves an intermediate result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode configuration parameter and is described here.

  • on_complete – a callback function (with one argument) that is called after the check is completed and retrieves the analysis result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode configuration parameter and is described here.

      • on_error – a callback function (with one argument) that is called in case of any error happened during video capturing and retrieves the error information: an object with the error code, error message, and telemetry ID for logging.

      • on_close– a callback function (no arguments) that is called after the plugin window is closed (whether manually by the user or automatically after the check is completed).

  • style – see the customization section.

      • device_id – (optional) identifier of camera that is being used.

  • enable_3d_mask – enables the 3D mask as the default face capture behavior. This parameter works only if the load_3d_mask configuration parameter in the Web Adapter is set to true; the default value is false.

  • cameraFacingMode (since 1.4.0) – the parameter that defines which camera to use; possible values: user (front camera), environment (rear camera). This parameter only works if the use_for_liveness option in the Web Adapter configuration file is undefined. If use_for_liveness is set (with any value), cameraFacingMode gets overridden and ignored.

  • disable_adaptive_aspect_ratio (since 1.5.0) – if true, disables the video adaptive aspect ratio, so your video doesn’t automatically adjust to the window aspect ratio. The default value is false; by default, the video adjusts to the closest ratio of 4:3, 3:4, 16:9, or 9:16. Please note: smartphones still require the portrait orientation to work.

  • get_user_media_timeout (since 1.5.0) – if Web SDK can’t get access to the user’s camera within this timeout, it displays a hint on how to solve the problem. The default value is 40000 (ms).

      • if the getUserMedia() function hangs, you can manage the SDK behavior using the following parameters (since 1.7.15):

    • get_user_media_promise_timeout_ms – set the timeout (in ms) after which SDK will throw an error or display an instruction. This parameter is an object with the following keys: "platform_browser", "browser", "platform", "default" (the priority matches the sequence).

    Tags for Video Files

    The following tag types should be specified in the system for video files.

    • To identify the data type of the video:

      • video_selfie

    • To identify the orientation of the video:

      • orientation_portrait – portrait orientation;

      • orientation_landscape – landscape orientation.

    • To identify the action on the video:

      • video_selfie_left – head turn to the left;

      • video_selfie_right – head turn to the right;

The tags listed allow the algorithms to recognize the files as suitable for the Quality (Liveness) and Biometry analyses.

    Important: in API 4.0.8 and below, to launch the Quality analysis for a photo, pack the image into a .zip archive, apply the SHOTS_SET type, and mark it with video_*. Otherwise, it will be ignored by algorithms.

    Example of the correct tag set for a video file with the “blink” action:

    hashtag
    Tags for Photo Files

    The following tag types should be specified in the system for photo files:

    • A tag for selfies:

      • photo_selfie – to identify the image type as “selfie”.

    • Tags for photos/scans of ID cards:

      • photo_id – to identify the image type as “ID”;

      • photo_id_front – for the photo of the ID front side;

  • photo_id_back – for the photo of the ID back side (ignored for any other analyses like Quality or Biometry).

    Important: in API 4.0.8 and below, to launch the Quality analysis for a photo, pack the image into a .zip archive, apply the SHOTS_SET type, and mark it with video_*. Otherwise, it will be ignored by algorithms.

    Example of the correct tag set for a “selfie” photo file:

    Example of the correct tag set for a photo file with the face side of an ID card:

    Example of the correct set of tags for a photo file of the back of an ID card:

    How to Integrate On-Device Liveness into Your Mobile Application

    circle-info

    We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results.

    This guide outlines the steps for integrating the Oz Liveness Mobile SDK into a customer mobile application for capturing facial videos and performing on-device liveness checks without sending any data to a server.

    The SDK implements the ready-to-use face capture user interface that is essential for seamless customer experience and accurate liveness results.

Oz Liveness Mobile SDK requires a license. The license is bound to the bundle_id of your application, e.g., com.yourcompany.yourapp. Issue a 1-month trial license on our websitearrow-up-right or email usenvelope for a long-term license.

    hashtag
    Android

    1

    hashtag
    Add SDK to your project

    In the build.gradle of your project, add:

    In the build.gradle of the module, add:

    2

    hashtag
    iOS

    1

    hashtag
    Add our SDK to your project

    Install OZLivenessSDK.

    hashtag

    With these steps, you are done with basic integration of Mobile SDKs. The data from the on-device analysis is not transferred anywhere, so please bear in mind you cannot access it via API or Web console. However, the internet is still required to check the license. Additionally, we recommend that you use our logging service called telemetry, as it helps a lot in investigating attacks' details. We'll provide you with credentials.

    Oz Liveness Web SDK

    Oz Liveness Web SDK is a module for processing data on clients' devices. With Oz Liveness Web SDK, you can take photos and videos of people via their web browsers and then analyze these media. Most browsers and devices are supported. Available languages: EN, ES, PT-BR, KK.

    Please find a sample for Oz Liveness Web SDK herearrow-up-right. To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.

    For Angular and React, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.

    • Angular sample

    • React sample

Web SDK requires HTTPS (with SSL encryption) to work; however, at localhost and 127.0.0.1, you can check the resources' availability via HTTP.

    Oz Liveness Web SDK consists of two components:

    1. Client side – a JavaScript file that is being loaded within the frontend part of your application. It is called .

2. Server side – a separate server module, the backend part. The module is called Liveness Web Adapter.

    The integration guides can be found here:

Oz Web SDK can be provided via SaaS, when the server part works on our servers and is maintained by our engineers, and you just use it, or on-premise, when Oz Web Adapter is installed on your servers. Contact us for more details and choose the model that is convenient for you.

Oz Web SDK requires a license to work. To issue a license, we need the domain name of the website where you are going to use our SDK.

    This is a guide on how to start with Oz Web SDK:

1. Add the plugin into your page.

    2. If you want to customize the look-and-feel of Oz Web SDK, please refer to .

    Single Request

    hashtag
    Overview and Benefits

    In version 6.0.1, we introduced a new feature which allows you to send all required data and receive the analysis result within a single request.

    Before 6.0.1, interacting with the API required multiple requests: you had to create a folder and upload media to it, initiate analyses (see Liveness, Biometry, and Blacklist), and then either poll for results or use webhooks for notifications when the result was ready. This flow is still supported, so if you need to send separate requests, you can continue using the existing methods that are listed above.

    However, the new API operation mode significantly simplifies the process by allowing you to send a single request and receive the response synchronously. The key benefits are:

    • Single request for everything – all data is sent in one package, eliminating the risk of data loss.

    • Synchronous response – no need for polling or webhooks to retrieve results.

    • High performance – supports up to 36 analyses per minute per instance.

    hashtag
    Usage

To use this method, call POST /api/folders/. In the X-Forensic-Access-Token header, pass your access token. Add media files to the request body and define the tags and metadata if needed in the payload part.
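A minimal client sketch of this call, assuming Node 18+ (global fetch, FormData, and Blob). The "payload" part name and the "video1" media field mirror the request example in this guide; treat the exact field names as assumptions to verify against your API version.

```javascript
// Build the multipart body for POST /api/folders/: one "payload" part
// with tags and analyses, plus the media file itself.
function buildFolderForm(videoBytes) {
  const payload = {
    "media:tags": {
      "video1": ["video_selfie", "video_selfie_eyes", "orientation_portrait"]
    },
    "analyses": [{ "type": "quality" }]
  };
  const form = new FormData();
  form.append("payload", JSON.stringify(payload));
  form.append("video1", new Blob([videoBytes]), "video1.mp4");
  return form;
}

// Send everything in one request and read the synchronous result.
async function createFolderAndAnalyze(host, token, videoBytes) {
  const response = await fetch(`${host}/api/folders/`, {
    method: "POST",
    headers: { "X-Forensic-Access-Token": token },
    body: buildFolderForm(videoBytes) // fetch sets the multipart boundary
  });
  return response.json(); // the folder object with its "analyses" results
}
```

Because the response is synchronous, no polling or webhook is needed after this call.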

    hashtag
    Request Example

    hashtag
    Response Example

    In response, you receive analysis results.
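Since the result arrives synchronously, the verdict can be read straight from the analyses array. A small helper, assuming the response shape shown in the response example (system_resolution plus a per-analysis resolution field):

```javascript
// Summarize a single-request response: overall verdict plus the list
// of analysis types that did not pass.
function folderVerdict(folder) {
  const failed = (folder.analyses || []).filter(a => a.resolution !== "SUCCESS");
  return {
    resolution: folder.system_resolution,    // e.g. "SUCCESS"
    failedAnalyses: failed.map(a => a.type)  // empty when everything passed
  };
}
```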

    You're done.

    Getting a License for Android SDK

You can generate the trial license herearrow-up-right or contact us by emailenvelope to get a production license. To create the license, your applicationId (bundle id) is required.

    To pass your license file to the SDK, call the OzLivenessSDK.init method with a list of LicenseSources. Use one of the following:

    • LicenseSource.LicenseAssetId should contain a path to a license file called forensics.license, which has to be located in the project's res/raw folder.

    • LicenseSource.LicenseFilePath should contain a file path to the place in the device's storage where the license file is located.

In case of any license errors, the onError function is called. Use it to handle the exception as shown above. Otherwise, the system will return information about the license. To check the license data manually, use the getLicensePayload method.

    hashtag
    Possible License Errors

    Error message
    What to Do

    Getting a License for iOS SDK

    hashtag
    License

You can generate the trial license herearrow-up-right or contact us by emailenvelope to get a production license. To create the license, your bundle id is required. After you get a license file, there are two ways to add the license to your project.

    1. Rename this file to forensics.license and put it into the project. In this case, you don't need to set the path to the license.

    2. During the runtime: when initializing SDK, use the following method.

    or

LicenseSource is the source of the license, and LicenseData is the information about your license. Please note: this method checks whether you already have an active license, and if so, that license won't be replaced with a new one. To force the license replacement, use the setLicense method.

In case of any license errors, the system will use your error handling code as shown above. Otherwise, the system will return information about the license. To check the license data manually, use OZSDK.licenseData.

    hashtag
    Possible License Errors

    Error message
    What to Do

    Android

    To start using Oz Android SDK, follow the steps below.

1. Embed Oz Android SDK into your project as described here.

2. Get a trial license for SDK on our websitearrow-up-right or a production license by emailing usenvelope. We'll need your application id. Add the license to your project as described here.

    Collection (1:N) Check

    How to compare a photo or video with ones from your database.

    The collection check algorithm is designed to check the presence of a person using a database of preloaded photos. A video fragment and/or a photo can be used as a source for comparison.

    hashtag
    Prerequisites:

1. You're authorized.

    allprojects {
        repositories {
            maven { url "https://ozforensics.jfrog.io/artifactory/main" }
        }
    }
    dependencies {
        implementation 'com.ozforensics.liveness:full:<version>'
        // You can find the version needed in the Android changelog
    }
    pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' // You can find the version needed in  iOS changelog
    
    OzLiveness.open({
    // omit session_token if your SDK version is older than 1.9.2 or you have set api_use_session_token to api
      session_token, 
      lang: 'en',
      action: [
        'photo_id_front', // request photo ID picture
        'video_selfie_blank' // request passive liveness video
      ],
      meta: { 
        // an ID of user undergoing the check
        // add for easier conversion calculation
        'end_user_id': '<user_or_lead_id>',
        // Your unique identifier that you can use later to find this folder in Oz API 
        // Optional, yet recommended
        'transaction_id': '<your_transaction_id>',
        // You can add iin if you plan to group transactions by the person identifier 
        'iin': '<your_client_iin>',
        // Other meta data
        'meta_key': 'meta_value',
      },
      on_error: function (result) {
      // error details
      console.error('on_error', result);
      },
      on_complete: function (result) {
        // This callback is invoked when the analysis is complete
        // It is recommended to commence the transaction on your backend, 
        // using transaction_id to find the folder in Oz API and get the results
        console.log('on_complete', result);
      },
      on_capture_complete: function (result) {
        // Handle captured data here if necessary
        console.log('on_capture_complete', result);
      }
    });
    "video1": [
      "video_selfie",
      "video_selfie_eyes",
      "orientation_portrait"
    ]
    "photo1": [
      "photo_selfie"
    ]
    "photo1": [
      "photo_id",
      "photo_id_front"
    ]
    "photo1": [
      "photo_id",
      "photo_id_back"
    ]
  • video_selfie_left – turn head to the left;

  • video_selfie_right – turn head to the right;

  • video_selfie_down – tilt head downwards;

  • video_selfie_high – raise head up;

  • video_selfie_smile – smile;

  • video_selfie_eyes – blink;

  • video_selfie_scan – scanning;

  • video_selfie_blank – no action, simple selfie;

  • video_selfie_best – special action to select the best shot from a video and perform analysis on it instead of the full video.

  • get_user_media_promise_timeout_throw_error – defines whether, after the time period defined in the parameter above, SDK should call an error (if true) or display a user instruction (if false).
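For illustration, the two getUserMedia() safeguards described above might be combined as follows. This is a config sketch only: the timeout values are arbitrary, not recommendations.

```javascript
// Illustrative only: arbitrary timeout values; keys besides "default"
// target specific platform/browser combinations (the priority matches
// the documented sequence).
OzLiveness.open({
  // ...other parameters...
  get_user_media_promise_timeout_ms: {
    platform_browser: 5000, // most specific match wins
    default: 10000
  },
  // throw an error after the timeout instead of showing the instruction
  get_user_media_promise_timeout_throw_error: true
});
```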


  • video_selfie_down – head tilt downwards;

  • video_selfie_high – head raise up;

  • video_selfie_smile – smile;

  • video_selfie_eyes – blink;

  • video_selfie_scan – scanning;

  • video_selfie_oneshot – a one-frame analysis;

  • video_selfie_blank – no action.


    License error. License at (your_URI) not found

    The license file is missing. Please check its name and path to the file.

    License error. Cannot parse license from (your_URI), invalid format

    The license file is somehow damaged. Please email us the file.

    License error. Bundle company.application.id is not in the list allowed by license (bundle.id1, bundle.id2)

The bundle (application) identifier you specified is missing in the allowed list. Please check the spelling; if it is correct, you need to get another license for your application.

    License error. Current date yyyy-mm-dd hh:mm:ss is later than license expiration date yyyy-mm-dd hh:mm:ss

    Your license has expired. Please contact us.

    License is not initialized.

    You haven't initialized the license. Please add the license to your project as described above.

2. You have already created a folder and added your media marked by correct tags into this folder.

hashtag
    Processing steps:

    1. Initiate the analysis: POST/api/folders/{{folder_id}}/analyses/

    If you want to use a webhook for response, add it to the payload at this step, as described here.

    You'll need analysis_id or folder_id from response.

    2. If you use a webhook, just wait for it to return the information needed. Otherwise, initiate polling:

    • GET /api/analyses/{{analysis_id}} – for the analysis_id you have from the previous step.

    • GET /api/folders/{{folder_id}} – for all analyses performed on media in the folder with the folder_id you have from the previous step.

    Wait for the resolution_status and resolution fields to change the status to anything other than PROCESSING and treat this as a result.

    If you want to know which person from your collection matched with the media you have uploaded, find the collection analysis in the response, check results_media, and retrieve person_id. This is the ID of the person who matched with the person in your media. To get the information about this person, use GET /api/collections/{{collection_id}}/persons/{{person_id}} with IDs of your collection and person.
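The polling and person_id lookup described above can be sketched as a small helper, assuming the folder object is what GET /api/folders/{{folder_id}} returns and that analysis types are uppercase as in the response examples in this guide:

```javascript
// Read the collection (1:N) result from a folder object.
// Returns: null while the analysis is still PROCESSING (keep polling),
// the matched person_id when a match was found, and undefined when the
// analysis finished without a match.
function collectionMatch(folder) {
  const analysis = (folder.analyses || [])
    .find(a => (a.type || "").toUpperCase() === "COLLECTION");
  if (!analysis || analysis.resolution_status === "PROCESSING") return null;
  const match = (analysis.results_media || []).find(m => m.person_id);
  return match ? match.person_id : undefined;
}
```

The returned person_id can then be passed to GET /api/collections/{{collection_id}}/persons/{{person_id}} to fetch the person's details.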

    {
      // (optional block) folder metadata if needed
      "folder:meta_data": {
        "partner_side_folder_id": "00000000-0000-0000-0000-000000000000",
        "person_info": {
          "first_name": "John",
          "middle_name": "Jameson",
          "last_name": "Doe"
        }
      },
      // (optional block) folder metadata if needed
      "media:meta_data": {
        "video1": {
          "foo": "bar"
        }
      },
      "media:tags": {
        "video1": [
          "video_selfie",
          "video_selfie_eyes",
          "orientation_portrait"
        ]
      },
      "analyses": [
        {
          "type": "quality",
          // (optional block) folder metadata if needed
          "meta_data": {
            "example1": "some_example1"
          },
          // additional parameters
          "params": {
            "threshold_spoofing": 0.5,
            "extract_best_shot": false
          }
        }
      ]
    }
    {
      "company_id": "00000000-0000-0000-0000-000000000000",
      "time_created": 1744017549.366616,
      "folder_id": "00000000-0000-0000-0000-000000000000",
      "user_id": "00000000-0000-0000-0000-000000000000",
      "resolution_endpoint": null,
      "resolution_status": "FINISHED",
      "resolution_comment": "[]",
      "system_resolution": "SUCCESS",
      ...
      // folder metadata if you've added it
      "meta_data": {
        "partner_side_folder_id": "00000000-0000-0000-0000-000000000000",
        "person_info": {
          "first_name": "John",
          "middle_name": "Jameson",
          "last_name": "Doe"
        }
      },
      "media": [
        {
          "company_id": "00000000-0000-0000-0000-000000000000",
          "folder_id": "00000000-0000-0000-0000-000000000000",
          "folder_time_created": 1744017549.366616,
          "original_name": "00000000-0000-0000-0000-000000000000.mp4",
          "original_url": null,
          "media_id": "00000000-0000-0000-0000-000000000000",
          "media_type": "VIDEO_FOLDER",
          "tags": "video1": [
    		"video_selfie",
    		"video_selfie_eyes",
    		"orientation_portrait"
    	]
          "info": {},
          "time_created": 1744017549.368665,
          "time_updated": 1744017549.36867,
    	  // media metadata if you've added it
          "meta_data": {
            "foo": "bar"
          },
          "thumb_url": null,
          "image_id": "00000000-0000-0000-0000-000000000000"
        }
      ],
      "time_updated": 1744017549.366629,
      "analyses": [
        {
          "company_id": "00000000-0000-0000-0000-000000000000",
          "group_id": "00000000-0000-0000-0000-000000000000",
          "folder_id": "00000000-0000-0000-0000-000000000000",
          "folder_time_created": 1744017549.366616,
          "analysis_id": "00000000-0000-0000-0000-000000000000",
          "state": "FINISHED",
          "resolution_operator": null,
          "results_media": [
            {
             ...
            }
          ],
          "results_data": null,
    	  // analysis metadata if you've added it
          "meta_data": {
            "example1": "some_example1"
          },
          "time_created": 1744017549.369485,
          "time_updated": 1744017550.659305,
          "error_code": null,
          "error_message": null,
          "source_media": [
            {
    	 ...
            }
          ],
          "type": "QUALITY",
          "analyse_id": "00000000-0000-0000-0000-000000000000",
          "resolution_status": "SUCCESS",
          "resolution": "SUCCESS"
        }
      ]
    }
OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
          if let error = error {
            print(error)
          }
        }
OZSDK(licenseSources: [.licenseFilePath("path_to_file")]) { licenseData, error in
          if let error = error {
            print(error)
          }
        }
    Kotlin
    OzLivenessSDK.init(
        context,
        listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
    )
    Kotlin
    OzLivenessSDK.setApiConnection(
        OzConnection.fromCredentials(host, username, password),
        statusListener(
            { token -> /* token */ },
            { ex -> /* error */ }
        )
    )
    Kotlin
    OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(host, token))
    Kotlin
    val sessionToken: String = getSessionToken()
    val captureRequest = CaptureRequest(
        listOf(
            AnalysisProfile(
                Analysis.Type.QUALITY,
                listOf(MediaRequest.ActionMedia(OzAction.Blank)),
            )
        )
    )
    val intent = OzLivenessSDK.createStartIntent(captureRequest, sessionToken)
    startActivityForResult(intent, REQUEST_LIVENESS_CONTAINER)
    Kotlin
    private fun runAnalysis(container: DataContainer?) {
        if (container == null) return
        AnalysisRequest.Builder()
            .addContainer(container)
            .build()
            .run(
                { result ->
                    val isSuccess = result.analysisResults.all { it.resolution == Resolution.SUCCESS }
                },
                { /* show error */ },
                { /* update status */ },
            )
    }
    OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
        if let error = error {
            print(error.errorDescription)
        }
    }
OZSDK.setApiConnection(Connection.fromCredentials(host: "https://sandbox.ohio.ozforensics.com", login: login, password: p)) { (token, error) in
        // Your code to handle error or token
    }
    OZSDK.setApiConnection(Connection.fromServiceToken(host: "https://sandbox.ohio.ozforensics.com", token: token)) { (token, error) in
    }
    getSessionToken() { sessionToken in
                DispatchQueue.main.async {
                    do {
                        let action:OZVerificationMovement = .selfie
                        let mediaRequest = MediaRequest.action(action)
                        let profile = AnalysisProfile(mediaList: [mediaRequest],
                                                      type: .quality,
                                                      params: [:] )
                        let request = CaptureRequest(analysisProfileList: [profile], cameraPosition: .front)
                        let ozLivenessVC = try OZSDK.createMediaCaptureScreen(self, request, sessionToken: sessionToken)
                        self.present(ozLivenessVC, animated: true)
                    } catch let error {
                        print(error.localizedDescription)
                    }
                }
            }
    extension ViewController: LivenessDelegate {
        func onResult(container: DataContainer) {
        }
        func onError(status: OZVerificationStatus?) {
        }
    }
    func onResult(container: DataContainer) {
      let analysisRequest = AnalysisRequestBuilder()
      analysisRequest.addContainer(container)
      analysisRequest.run(
                statusHandler: { status in
                },
                errorHandler: { error in
                }
            ) { result in
            }
    }
    Kotlin
    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_LIVENESS_CONTAINER) {
            when (resultCode) {
                OzLivenessResultCode.SUCCESS -> runAnalysis(OzLivenessSDK.getContainerFromIntent(data))
                OzLivenessResultCode.USER_CLOSED_LIVENESS -> { /* user closed the screen */ }
                else -> {
                    val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
                    /* show error */
                }
            }
        }
    }
    request body
    {
      "analyses": [{
        "type": "collection",
        "source_media": ["1111aaaa-11aa-11aa-11aa-111111aaaaaa"], // // optional; omit to include all media from the folder
      }]
    }

    hashtag
    Initialize SDK

    Rename the license file to forensics.license and place it into the project's res/raw folder.

    OzLivenessSDK.init(
        context,
        listOf(LicenseSource.LicenseAssetId(R.raw.forensics))
    )

    OzLivenessSDK.INSTANCE.init(
            context,
            Collections.singletonList(new LicenseSource.LicenseAssetId(R.raw.forensics))
    );
    3

    hashtag
    Add face recording

    To start recording, use startActivityForResult:

    val OZ_LIVENESS_REQUEST_CODE = 1
    val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank)) // the actions to record
    startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE)

    int OZ_LIVENESS_REQUEST_CODE = 1;
    Intent intent = OzLivenessSDK.INSTANCE.createStartIntent(Collections.singletonList(OzAction.Blank)); // the actions to record
    startActivityForResult(intent, OZ_LIVENESS_REQUEST_CODE);

    To obtain the captured video, use onActivityResult:

    The sdkMediaResult object contains the captured videos.

    4

    hashtag
    Run analyses

    To run the analyses, execute the code below. Note that mediaList is a list of objects that were either captured by the SDK (sdkMediaResult) or created on your own (media you captured yourself).

    private fun analyzeMedia(mediaList: List<OzAbstractMedia>) {
        AnalysisRequest.Builder()
            .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList))
            .build()
            .run(object : AnalysisRequest.AnalysisListener {
                override fun onSuccess(result: List<OzAnalysisResult>) {
                    result.forEach {
                        println(it.resolution.name)
                        println(it.folderId)
                    }
                }
                override fun onError(error: OzException) {
                    error.printStackTrace()
                }
            })
    }
    private void analyzeMedia(List<OzAbstractMedia> mediaList) {
        new AnalysisRequest.Builder()
            .addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.ON_DEVICE, mediaList, Collections.emptyMap()))
            .build()
            .run(new AnalysisRequest.AnalysisListener() {
                @Override public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) {}
                @Override
                public void onSuccess(@NonNull List<OzAnalysisResult> list) {
                    for (OzAnalysisResult result : list) {
                        System.out.println(result.getResolution().name());
                        System.out.println(result.getFolderId());
                    }
                }
                @Override
                public void onError(@NonNull OzException e) { e.printStackTrace(); }
            });
    }
    CocoaPods

    To integrate OZLivenessSDK into an Xcode project, add to Podfile:

    hashtag
    SPM

    Add the following package dependencies via SPM: https://gitlab.com/oz-forensics/oz-mobile-ios-sdk (if you need a guide on adding the package dependencies, please refer to the Apple documentation). OzLivenessSDK is mandatory. Ensure you've added the OzLivenessSDKOnDevice file.

    2

    hashtag
    Initialize SDK

    Rename the license file to forensics.license and put it into the project.

    3

    hashtag
    Add face recording

    Create a controller that will capture videos as follows:

    The delegate object must implement OZLivenessDelegate protocol:

    4

    hashtag
    Run analyses

    Use AnalysisRequestBuilder to initiate the Liveness analysis.

    Android sample app source codes

    iOS sample app source codes

    Android OzLiveness SDK Developer Guide

    iOS OzLiveness SDK Developer Guide

    Demo app in Google Play

    Demo app in TestFlight

    on our website
    email us

    Connect SDK to API as described here. This step is optional, as this connection is required only when you need to process data on a server.

  • Capture videos using methods described here. You'll send them for analysis afterward.

  • Analyze media you've taken at the previous step. The process of checking liveness and face biometry is described here.

  • If you want to customize the look-and-feel of Oz Android SDK, please refer to this section.

  • hashtag
    Resources

    Recommended Android version: 5+ (the newer the smartphone is, the faster the analyses are).

    Recommended versions of components:

    Gradle

    7.5.1

    Kotlin

    1.7.21

    AGP

    7.3.1

    Java Target Level

    1.8

    JDK

    17

    We do not support emulators.

    Available languages: EN, ES, HY, KK, KY, TR, PT-BR.

    To obtain the sample apps source code for the Oz Liveness SDK, proceed to the GitLab repository:

    Follow the link below to see a list of SDK methods and properties:

    Download the latest build of the demo app here.

    here
    website
    email
    here
    Android SDK Methods and Properties

    License error. License at (your_URI) not found

    The license file is missing. Please check its name and path to the file.

    License error. Cannot parse license from (your_URI), invalid format

    The license file is somehow damaged. Please email us the file.

    License error. Bundle company.application.id is not in the list allowed by license (bundle.id1, bundle.id2)

    The bundle (application) identifier you specified is missing in the allowed list. Please check the spelling; if it is correct, you need to get another license for your application.

    License error. Current date yyyy-mm-dd hh:mm:ss is later than license expiration date yyyy-mm-dd hh:mm:ss

    Your license has expired. Please contact us.

    License is not initialized. Call OzLivenessSDK.init before using SDK

    You haven't initialized the license. Call OzLivenessSDK.init with your license data as explained above.

    OzLivenessSDK.init(context,
        listOf(
            LicenseSource.LicenseAssetId(R.raw.your_license_name),
            LicenseSource.LicenseFilePath("absolute_path_to_your_license_file")
        ),
        object : StatusListener<LicensePayload> {
            override fun onSuccess(result: LicensePayload) { /*check the license payload*/ }
            override fun onError(error: OzException) { /*handle the exception */ }
        }
      )
    OzLivenessSDK.INSTANCE.getConfig().setBaseURL(BASE_URL);
    OzLivenessSDK.INSTANCE.init(context,
        Arrays.asList(
            new LicenseSource.LicenseAssetId(R.raw.forensics),
            new LicenseSource.LicenseFilePath("absolute_path_to_your_license_file")
        ),
        new StatusListener<LicensePayload>() {
            @Override public void onStatusChanged(@Nullable String s) {}
            @Override public void onSuccess(LicensePayload licensePayload) { /*check the license payload*/ }
            @Override public void onError(@NonNull OzException e) { /*handle the exception*/ }
        }
    );
    1. On-Premise

    • Install our Web SDK. Our engineers will help you to install the components needed using the standalone installer or manually. The license will be installed as well; to update it, please refer to this article.

    • Configure the adapter.

    2. SaaS

    • This part is fully covered by the Oz Forensics engineers. You get a link for Oz Web Plugin (see step 2).

    Angular sample
    React sample
    Vue
    Svelte
    Oz Liveness Web Plugin
    OZ API
    Oz
    Web Adapter
    How to Integrate Server-Based Liveness into Your Web Application
    How to Add Photo ID Capture and Face Matching to Your Web or Mobile Application
    Contact us
    license
    Integrate
    this section

    Statuses in API

    This article contains the full description of folders' and analyses' statuses in API.

    Field name / status | analyse.state | analyse.resolution_status | folder.resolution_status | system_resolution
    INITIAL | - | - | starting state | starting state
    PROCESSING | starting state | starting state | analyses in progress | analyses in progress
    FAILED | system error | system error | system error | system error
    FINISHED | finished successfully | - | finished successfully | -
    DECLINED | - | check failed | - | check failed
    OPERATOR_REQUIRED | - | additional check is needed | - | additional check is needed
    SUCCESS | - | check succeeded | - | check succeeded

    The details on each status are below.

    hashtag
    Analysis State (analyse.state)

    This is the state when the analysis is being processed. The values of this state can be:

    PROCESSING – the analysis is in progress;

    FAILED – the analysis failed due to some error and couldn't get finished;

    FINISHED – job's done, the analysis is finished, and you can check the result.

    hashtag
    Analysis Result (analyse.resolution_status)

    Once the analysis is finished, you'll see one of the following results:

    SUCCESS – everything went fine, the check succeeded (e.g., faces match or liveness confirmed);

    OPERATOR_REQUIRED (except the Liveness analysis) – the result should be additionally checked by a human operator;

    circle-info

    The OPERATOR_REQUIRED status appears only if it is set up in biometry settings.

    DECLINED – the check failed (e.g., faces don't match or some spoofing attack detected).

    If the analysis hasn't been finished yet, the result inherits a value from analyse.state: PROCESSING (the analysis is in progress) / FAILED (the analysis failed due to some error and couldn't get finished).

    hashtag
    Folder Status (folder.resolution_status)

    A folder is an entity that contains media to analyze. If the analyses have not been finished, the stage of processing media is shown in resolution_status:

    INITIAL – no analyses applied;

    PROCESSING – analyses are in progress;

    FAILED – any of the analyses failed due to some error and couldn't get finished;

    FINISHED – media in this folder are processed, the analyses are finished.

    hashtag
    Folder Result (system_resolution)

    Folder result is the consolidated result of all analyses applied to media from this folder. Please note: the folder result is the result of the last-finished group of analyses. If all analyses are finished, the result will be:

    SUCCESS – everything went fine, all analyses completed successfully;

    OPERATOR_REQUIRED (except the Liveness analysis) – there are no analyses with the DECLINED status, but one or more analyses have been completed with the OPERATOR_REQUIRED status;

    DECLINED – one or more analyses have been completed with the DECLINED status.

    circle-info

    The analyses you send in a single POST request form a group. The group result is the "worst" result of analyses this group contains: INITIAL > PROCESSING > FAILED > DECLINED > OPERATOR_REQUIRED > SUCCESS, where SUCCESS means all analyses in the group have been completed successfully without any errors.
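    The "worst result wins" rule above can be sketched in a few lines. This is a minimal illustration with our own names (Status, worst); the real statuses arrive in Oz API responses, not from these types.

    ```java
    // Minimal sketch of the "worst result wins" rule for a group of analyses.
    // Status and worst() are illustrative names, not part of the Oz API.
    public class GroupResult {
        // Declared from "worst" to "best", matching the ordering
        // INITIAL > PROCESSING > FAILED > DECLINED > OPERATOR_REQUIRED > SUCCESS,
        // so the smallest ordinal wins.
        public enum Status { INITIAL, PROCESSING, FAILED, DECLINED, OPERATOR_REQUIRED, SUCCESS }

        public static Status worst(java.util.List<Status> analyses) {
            Status result = Status.SUCCESS;
            for (Status s : analyses) {
                if (s.ordinal() < result.ordinal()) {
                    result = s;
                }
            }
            return result;
        }
    }
    ```

    For example, a group containing SUCCESS, OPERATOR_REQUIRED, and DECLINED resolves to DECLINED.
    
    
    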

    Biometry (Face Matching)

    The Biometry algorithm is intended to compare two or more photos and detect the level of similarity of the spotted faces. As a source media, the algorithm takes photos, videos, and documents (with photos).

    hashtag
    Requirements

    1. You're authorized.

    2. You have already created a folder and added your media, marked with the correct tags, to this folder.

    hashtag
    Processing steps

    1. Initiate the analysis for the folder: POST /api/folders/{{folder_id}}/analyses/

    If you want to use a webhook for the response, add it to the payload at this step, as described here.

    You'll need analysis_id or folder_id from the response.

    2. If you use a webhook, just wait for it to return the information needed. Otherwise, initiate polling:

    • GET /api/analyses/{{analysis_id}} – for the analysis_id you have from the previous step.

    • GET /api/folders/{{folder_id}} – for all analyses performed on media in the folder with the folder_id you have from the previous step.

    Repeat until the resolution_status and resolution fields change to any value other than PROCESSING, and treat this value as the result.
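    The polling loop can be sketched as follows. This is an illustrative fragment with our own names (AnalysisPolling, pollUntilDone); fetchStatus stands in for your GET /api/analyses/{{analysis_id}} call that returns the resolution_status field.

    ```java
    // Illustrative polling sketch; not part of the Oz SDK or API client.
    public class AnalysisPolling {
        public static String pollUntilDone(java.util.function.Supplier<String> fetchStatus,
                                           int maxAttempts, long delayMillis) {
            for (int i = 0; i < maxAttempts; i++) {
                String status = fetchStatus.get();
                // Any value other than PROCESSING (or INITIAL) is a final result.
                if (!"PROCESSING".equals(status) && !"INITIAL".equals(status)) {
                    return status;
                }
                try {
                    Thread.sleep(delayMillis); // wait before the next request
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return status;
                }
            }
            return "PROCESSING"; // still unfinished after maxAttempts
        }
    }
    ```

    In production you would also cap the total polling time and back off between requests; a webhook avoids polling altogether.
    
    
    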

    Check the response for the min_confidence value. It is a quantitative result of matching the people on the media uploaded.
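    If you want to turn the quantitative min_confidence score into a yes/no decision yourself, a helper like the one below is enough. The 0.85 threshold here is purely an assumption for illustration; tune it to your own risk policy, or simply rely on the qualitative resolution_status returned by the API.

    ```java
    // Hypothetical helper for interpreting min_confidence.
    // DEFAULT_THRESHOLD is an illustrative assumption, not an Oz API value.
    public class BiometryDecision {
        public static final double DEFAULT_THRESHOLD = 0.85;

        // true when the faces on the compared media are considered a match
        public static boolean facesMatch(double minConfidence, double threshold) {
            return minConfidence >= threshold;
        }

        public static boolean facesMatch(double minConfidence) {
            return facesMatch(minConfidence, DEFAULT_THRESHOLD);
        }
    }
    ```
    
    
    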

    System Objects

    The description of the objects you can find in Oz Forensics system.

    hashtag
    Objects Hierarchy

    System objects on Oz Forensics products are hierarchically structured as shown in the picture below.

    On the top level, there is a Company. You can use one copy of Oz API to work with several companies.

    OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
        if let error = error {
            print(error.errorDescription)
        }
    }
    let actions: [OZVerificationMovement] = [.selfie]
    let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(delegate, actions: actions) 
    self.present(ozLivenessVC, animated: true)
    let analysisRequest = AnalysisRequestBuilder()
    let analysis = Analysis.init(
        media: mediaToAnalyze,
        type: .quality,
        mode: .onDevice)
    analysisRequest.uploadMedia(mediaToAnalyze)
    analysisRequest.addAnalysis(analysis)
    analysisRequest.run(
        scenarioStateHandler: { state in }, // scenario steps progress handler
        uploadProgressHandler: { (progress) in } // file upload progress handler
    ) { (analysisResults: [OzAnalysisResult], error) in
        // receive and handle analyses results here
        for result in analysisResults {
            print(result.resolution)
            print(result.folderID)
        }
    }
    allprojects {
        repositories {
            maven { url "https://ozforensics.jfrog.io/artifactory/main" }
        }
    }
    dependencies {
        implementation 'com.ozforensics.liveness:full:<version>'
        // You can find the version needed in the Android changelog
    }
    pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => '<version>' // You can find the version needed in the iOS changelog
    


    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
            if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
                val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
                val sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
                if (!sdkMediaResult.isNullOrEmpty()) {
                    analyzeMedia(sdkMediaResult)
                } else println(sdkErrorString)
            }
        }
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == OZ_LIVENESS_REQUEST_CODE) {
            List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
            String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
            if (sdkMediaResult != null && !sdkMediaResult.isEmpty()) {
                analyzeMedia(sdkMediaResult);
            } else System.out.println(sdkErrorString);
        }
    }
    request body
    {
      "analyses": [{
        "type": "biometry",
        // optional; omit to include all media from the folder
        "source_media": [
          "1111aaaa-11aa-11aa-11aa-111111aaaaaa", 
          "2222bbbb-22bb-22bb-22bb-222222bbbbbb" 
          ]
      }]
    }
    [
      {
        // you may have multiple analyses in the list
        // pick the one you need by analyse_id or type
        "analysis_id": "1111aaaa-11aa-11aa-11aa-111111aaaaaa",
        "type": "BIOMETRY",
        "results_media": [
          {
            // if you have multiple media in one analysis, match score with media by source_video_id/source_shots_set_id 
            "source_video_id": "1111aaaa-11aa-11aa-11aa-111111aaaaab", // for shots_set media, the key would be source_shots_set_id 
            "results_data": 
            {
              "max_confidence": 0.997926354, 
              "min_confidence": 0.997926354 // quantitative score for this media
            }
          ...
        ]
        "resolution_status": "SUCCESS", // qualitative resolution (based on all media)
        ...
      }
      ...
    ]
    The next level is a User. A company can contain any amount of users. There are several roles of users with different permissions. For more information, refer to User Roles.

    When a user requests an analysis (or analyses), a new folder is created. This folder contains media. One user can create any number of folders. Each folder can contain any amount of media. A user applies analyses to one or more media within a folder. The rules of assigning analyses are described here. The media quality requirements are listed on this page.
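    The hierarchy described above can be modelled roughly as follows. These records are illustrative stand-ins, not the real Oz API types, and the tag values are made up for the example.

    ```java
    // Illustrative model of the object hierarchy:
    // Company -> Users -> Folders -> Media; analyses apply to media in a folder.
    public class OzHierarchySketch {
        public record Media(String mediaId, java.util.List<String> tags) {}
        public record Folder(String folderId, java.util.List<Media> media) {}
        public record User(String userId, java.util.List<Folder> folders) {}
        public record Company(String companyId, java.util.List<User> users) {}

        // Any number of users per company, folders per user, media per folder.
        public static int totalMedia(Company company) {
            int n = 0;
            for (User u : company.users())
                for (Folder f : u.folders())
                    n += f.media().size();
            return n;
        }
    }
    ```
    
    
    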

    hashtag
    Object parameters

    hashtag
    Common parameters

    Parameter | Type | Description
    time_created | Timestamp | Object (except user and company) creation time
    time_updated | Timestamp | Object (except user and company) update time
    meta_data | Json | Any user parameters
    technical_meta_data | Json | Module-required parameters; reserved for internal needs

    Besides these parameters, each object type has specific ones.

    hashtag
    Company

    Parameter | Type | Description
    company_id | UUID | Company ID within the system
    name | String | Company name within the system

    hashtag
    User

    Parameter | Type | Description
    user_id | UUID | User ID within the system
    user_type | String | See User Roles
    first_name | String | Name
    last_name | String | Surname
    middle_name | String | Middle name
    email | String | User email = login
    password | String | User password (only required for new users or to change)
    can_start_analyze_* | String | Depends on user roles
    company_id | UUID | Current user company’s ID within the system
    is_admin | Boolean | Whether this user is an admin or not
    is_service | Boolean | Whether this user account is service or not

    hashtag
    Folder

    Parameter | Type | Description
    folder_id | UUID | Folder ID within the system
    resolution_status | ResolutionStatus | The status of the latest analysis

    hashtag
    Media

    Parameter | Type | Description
    media_id | UUID | Media ID
    original_name | String | Original filename (how the file was called on the client machine)
    original_url | Url | HTTP link to this file on the API server
    tags | Array(String) | List of tags for this file

    hashtag
    Analysis

    Parameter | Type | Description
    analyse_id | UUID | ID of the analysis
    folder_id | UUID | ID of the folder
    type | String | Analysis type (BIOMETRY/QUALITY/DOCUMENTS)
    results_data | JSON | Results of the analysis

    Using OzCapsula Data Container in Native SDK

    OzCapsula is our proprietary data format, designed to provide end-to-end protection and maintain data integrity during transmission. We introduced this format in SDK 8.22 and implemented new methods for it:

    • createMediaCaptureScreen(request) for video capture: this method takes a video and packages it into a data container.

    • AnalysisRequest.addContainer for processing data: this method adds the container to the analysis request.

    hashtag
    Code Examples

    Please follow the links below to access the examples; you can also refer to the illustrative snippets shown after the links.

    hashtag
    Kotlin

    hashtag
    Swift

    hashtag
    Methods and Properties

    Please check the methods and properties below. You can also find them in the corresponding sections of the iOS and Android documentation.

    hashtag
    addContainer

    This method replaces addAnalysis in the AnalysisRequest structure when you use the data container flow.

    Input

    hashtag
    createMediaCaptureScreen

    Captures a media file with all the information you need and packages it into a data container.

    Input

    Output

    hashtag
    public data class CaptureRequest

    Detects a request for video capture.

    hashtag
    public data class AnalysisProfile

    Contains information on media files and analyses that should be applied to them.

    hashtag
    public sealed class MediaRequest

    Stores information about a media file.

    circle-exclamation

    Please note: you should add actionMedia OR userMedia, these parameters are mutually exclusive.
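    One way to see why the two parameters are mutually exclusive: a request is either an action the user performs on camera or an externally supplied file, never both. The sketch below models this with two distinct subtypes, so setting both at once is impossible by construction; the types only echo the SDK names and are not the real ones.

    ```java
    // Stand-in types illustrating the actionMedia/userMedia exclusivity.
    public class MediaRequestSketch {
        public interface Request {}
        public record ActionMedia(String action) implements Request {}   // user performs an action
        public record UserMedia(String filePath) implements Request {}   // external media file

        public static String describe(Request r) {
            if (r instanceof ActionMedia a) return "user performs: " + a.action();
            if (r instanceof UserMedia m) return "external file: " + m.filePath();
            throw new IllegalArgumentException("unknown request type");
        }
    }
    ```
    
    
    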

    hashtag
    Exceptions

    If, during video capture, the SDK encounters an error that prevents the user scenario from completing, the data container is deleted.

    Should you have any questions, please contact us.

    Capturing Videos

    hashtag
    OzCapsula (SDK v8.22 and newer)

    circle-exclamation

    Please note: all required data (other than the video) must be packaged into the container before starting the Liveness screen.

    To start recording, use startActivityForResult:

    To obtain the captured video, use onActivityResult:

    If you use fragment, please refer to the example below. LivenessFragment is the representation of the Liveness screen UI.

    hashtag
    SDK 8.21 and older

    To start recording, use the startActivityForResult method:

    actions – a list of user actions while recording video.

    For Fragment, use the code below. LivenessFragment is the representation of the Liveness screen UI.

    circle-exclamation

    To ensure the license being processed properly, we recommend initializing SDK first, then opening the Liveness screen.

    To obtain the captured video, use the onActivityResult method:

  • sdkMediaResult – an object with video capturing results for interactions with Oz API (a list of the OzAbstractMedia objects),

  • sdkErrorString – description of errors, if any.

    circle-info

    If you use our SDK just for capturing videos, omit the Checking Liveness and Face Biometry step.

    If a user closes the capturing screen manually, resultCode receives the Activity.RESULT_CANCELED value.

    Code example:

    Customizing Android SDK

    hashtag
    Configuration

    We recommend applying these settings when starting the app.

    // connecting to the API server
    OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(HOST, TOKEN))
    // settings for the number of attempts to detect an action
    OzLivenessSDK.config.attemptSettings = attemptSettings
    // the possibility to display additional debug information (you can do it by clicking the SDK version number)
    OzLivenessSDK.config.allowDebugVisualization = allowDebugVisualization
    // logging settings
    OzLivenessSDK.config.logging = ozLogging

    OzConfig config = OzLivenessSDK.INSTANCE.getConfig();
    // connecting to the API server
    OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(HOST, TOKEN));
    // settings for the number of attempts to detect an action
    config.setAttemptSettings(attemptSettings);
    // the possibility to display additional debug information (you can do it by clicking the SDK version number)
    config.setAllowDebugVisualization(allowDebugVisualization);
    // logging settings
    config.setLogging(ozLogging);

    hashtag
    Interface Customization

    To customize the Oz Liveness interface, use UICustomization as shown below. For the description of customization parameters, please refer to Android SDK Methods and Properties.

    By default, SDK uses the locale of the device. To switch the locale, use the code below:

    Migration to OzCapsula

    This guide describes the migration to the latest OzCapsula architecture: the approach where you use an encrypted container to securely transmit data between frontend and backend.

    hashtag
    Component Version Requirements

    Before starting the migration, ensure that all components are updated to the minimum required versions:


    addContainer – input:

    Parameter | Type | Description
    OzDataContainer | bytearray[] | An encrypted file containing media and collateral info, the output of the createMediaCaptureScreen method

    createMediaCaptureScreen – input:

    Parameter | Type | Description
    request | CaptureRequest | Detects a request for video capture
    session_token | String | Token for current session

    createMediaCaptureScreen – output:

    Parameter | Type | Description
    OzDataContainer | bytearray[] | An encrypted file containing media and collateral info

    CaptureRequest:

    Parameter | Type | Description
    analysisProfileList | List<AnalysisProfile> | A list of objects that contain information on media and analyses that should be applied to them
    folderMeta (optional) | Map<String, Any> | Additional folder metadata
    additionalMediaList (optional) | List<MediaRequest> | Media files that you need to upload to the server but that are not necessary for analyses
    cameraPosition (optional) | String | front (default) – front camera; back – rear camera

    AnalysisProfile:

    Parameter | Type | Description
    mediaList | List<MediaRequest> | A list of media to be analyzed
    type | String (Type (Android) or AnalysisType (iOS)) | Analysis type
    params (optional) | Map<String, Any> | Additional analysis parameters

    MediaRequest:

    Parameter | Type | Description
    id | String (UUID v4) | Media ID
    actionMedia | OzAction (Android) or OzVerificationMovement (iOS) | An action that the user should perform in a video
    userMedia | OzAbstractMedia (Android) or OZMedia (iOS) | An external media file, e.g., a reference or a document photo

    Errors:

    Error | Text | Description
    session_token_is_empty | Session token must not be empty | Session token is mandatory but hasn’t been provided
    data_container_internal_failure_1 | Internal failure occurred while processing the data container | The device doesn’t have enough memory to proceed
    data_container_internal_failure_2, data_container_internal_failure_3, data_container_internal_failure_4 | Internal failure occurred while processing the data container | SDK couldn’t generate the container. Try again
    data_container_internal_failure_1000 | Internal failure occurred while processing the data container | Any other error not from the list above

    Kotlin
    Java
    Swift
    iOS
    Android
    Fragment
    user actions
    Fragment
    OzAbstractMedia
    errors
    Android SDK Methods and Properties
    OzLivenessSDK.config.customization = UICustomization(
        // customization parameters for the toolbar
        toolbarCustomization = ToolbarCustomization(
            closeIconRes = R.drawable.ib_close,
            closeIconTint = Color.ColorRes(R.color.white),
            titleTextFont = R.font.roboto,
            titleTextSize = 18,
            titleTextAlpha = 100,
            titleTextColor = Color.ColorRes(R.color.white),
            backgroundColor = Color.ColorRes(R.color.black),
            backgroundAlpha = 60,
            isTitleCentered = true,
            title = "Analysis",
        ),
        // customization parameters for the center hint
       centerHintCustomization = CenterHintCustomization(
            textFont = R.font.roboto,
            textColor = Color.ColorRes(R.color.text_color),
            textSize = 20,
            verticalPosition = 50,
            textStyle = R.style.Sdk_Text_Primary,
            backgroundColor = Color.ColorRes(R.color.color_surface),
            backgroundOpacity = 56,
            backgroundCornerRadius = 14,
            textAlpha = 100,
        ),
        // customization parameters for the hint animation
        hintAnimation = HintAnimation(
            hintGradientColor = Color.ColorRes(R.color.red),
            hintGradientOpacity = 80,
            animationIconSize = 120,
            hideAnimation = false,
        ),
        // customization parameters for the frame around the user face
        faceFrameCustomization = FaceFrameCustomization(
            geometryType = GeometryType.Rectangle(10), // 10 is the corner radius
            strokeDefaultColor = Color.ColorRes(R.color.error_red),
            strokeFaceInFrameColor = Color.ColorRes(R.color.success_green),
            strokeAlpha = 100,
            strokeWidth = 5,
            strokePadding = 3,
        ),
        // customization parameters for the background outside the frame
        backgroundCustomization = BackgroundCustomization(
            backgroundColor = Color.ColorRes(R.color.black),
            backgroundAlpha = 60,
        ),
        // customization parameters for the SDK version text
        versionTextCustomization = VersionTextCustomization(
            textFont = R.font.roboto,
            textSize = 12,
            textColor = Color.ColorRes(R.color.white),
            textAlpha = 100,
        ),
        // customization parameters for the antiscam protection text
        antiScamCustomization = AntiScamCustomization(
            textMessage = "",
            textFont = R.font.roboto,
            textSize = 14,
            textColor = Color.ColorRes(R.color.text_color),
            textAlpha = 100,
            backgroundColor = Color.ColorRes(R.color.color_surface),
            backgroundOpacity = 100,
            cornerRadius = 20,
            flashColor = Color.ColorRes(R.color.green),
        ),
        // custom logo parameters
        // should be allowed by license
        logoCustomization = LogoCustomization(
            image = Image.Drawable(R.drawable.ic_logo),
            size = Size(176, 64),
            verticalPosition = 100,
            horizontalPosition = 50,
        )
    )
    // capture and pack media
    val referentPhoto = MediaRequest.UserMedia(OzAbstractMedia.OzDocumentPhoto(OzMediaTag.Blank, referentPhotoPath))
    val blinkVideo = MediaRequest.ActionMedia(OzAction.EyeBlink)
    val scanVideo = MediaRequest.ActionMedia(OzAction.Scan)
    
    val intent = OzLivenessSDK.createMediaCaptureScreen(
        CaptureRequest(
            listOf(
                AnalysisProfile(
                    Analysis.Type.BIOMETRY,
                    listOf(referentPhoto, scanVideo)
                ),
                AnalysisProfile(
                    Analysis.Type.QUALITY,
                    listOf(referentPhoto, scanVideo, blinkVideo)
                ),
            ),
        ),
        sessionToken
    )
    startActivityForResult(intent, REQUEST_CODE_SDK)
    
    // subscription to result
    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_CODE_SDK) {
            when (resultCode) {
                OzLivenessResultCode.USER_CLOSED_LIVENESS -> { /* user closed the screen */ }
                OzLivenessResultCode.SUCCESS -> {
                // result
                    val container = OzLivenessSDK.getContainerFromIntent(data)
                    ...
                }
                else -> {
                // error
                    val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
                    ...
                }
            }
        }
    }
    
    // launching analyses
    AnalysisRequest.Builder()
        .addContainer(container)
        .build()
        .run(
            object: AnalysisRequest.AnalysisListener {
                override fun onSuccess(result: RequestResult) {
                    ...
                }
                override fun onError(exception: OzException) {
                    ...
                }
            }
        )
    // capture and pack media
    let mediaRequest = MediaRequest.actionMedia(.selfie)
    let profile = AnalysisProfile(mediaList: [mediaRequest],
                               type: .quality,
                               params: ["extract_best_shot" : true])
                 
    let request = CaptureRequest(analysisProfileList: [profile], cameraPosition: cameraPosition)
    self.ozLivenessVC = try OZSDK.createMediaCaptureScreen(self, request, sessionToken: sessionToken)
    self.present(ozLivenessVC, animated: true)
    
    // subscription to result
    extension ViewController: LivenessDelegate {
       
      func onError(status: OZVerificationStatus?) {
          // error handling
      }
       
      func onResult(container: DataContainer) {
        let analysisRequest = AnalysisRequestBuilder()
        analysisRequest.addContainer(container)
         
        analysisRequest.run(statusHandler: { [weak self] state in
         },
                  errorHandler: { [weak self] error in
         // error
        }) { result in
    // result
            }
       }
    
    val sessionToken: String = getSessionToken()
    val captureRequest = CaptureRequest(
        listOf(
            AnalysisProfile(
                Analysis.Type.QUALITY,
                listOf(MediaRequest.ActionMedia(OzAction.Blank)),
            )
        )
    )
    val intent = OzLivenessSDK.createStartIntent(captureRequest, sessionToken)
    startActivityForResult(intent, REQUEST_LIVENESS_CONTAINER)
    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_LIVENESS_CONTAINER) {
            when (resultCode) {
                OzLivenessResultCode.SUCCESS -> runAnalysis(OzLivenessSDK.getContainerFromIntent(data))
                OzLivenessResultCode.USER_CLOSED_LIVENESS -> { /* user closed the screen */ }
                else -> {
                    val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
                    /* show error */
                }
            }
        }
    }
    childFragmentManager.beginTransaction()
        .replace(android.R.id.content, LivenessFragment.create(createCaptureRequest(), sessionToken))
        .commit()
    
    childFragmentManager.setFragmentResultListener(OzLivenessSDK.Extra.REQUEST_CODE, this) { _, result ->
        when (val resultCode = result.getInt(OzLivenessSDK.Extra.EXTRA_RESULT_CODE)) {
            OzLivenessResultCode.SUCCESS -> {
                val container = result.getParcelable(OzLivenessSDK.Extra.EXTRA_DATA_CONTAINER) as? DataContainer
                /* Run analysis */
            }
            OzLivenessResultCode.USER_CLOSED_LIVENESS -> { /* User closed the screen */ } 
            else -> { /* Show error */ }
        }
    }
    val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Smile, OzAction.Blank))
    startActivityForResult(intent, REQUEST_CODE)
    List<OzAction> actions  = Arrays.asList(OzAction.Smile, OzAction.Scan);
    Intent intent = OzLivenessSDK.createStartIntent(actions);
    startActivityForResult(intent, REQUEST_CODE);
    childFragmentManager.beginTransaction()
        .replace(R.id.content, LivenessFragment.create(actions))
        .commit()
    // subscribing to the Fragment result
    childFragmentManager.setFragmentResultListener(OzLivenessSDK.Extra.REQUEST_CODE, this) { _, result ->
        when (result.getInt(OzLivenessSDK.Extra.EXTRA_RESULT_CODE)) {
            OzLivenessResultCode.SUCCESS -> { /* start analysis */ }
            else -> { /* show error */ }  
        }
    }
    getSupportFragmentManager().beginTransaction()
            .replace(R.id.content, LivenessFragment.Companion.create(actions, null, null, false))
            .addToBackStack(null)
            .commit();
    // subscribing to the Fragment result
    getSupportFragmentManager().setFragmentResultListener(OzLivenessSDK.Extra.REQUEST_CODE, this, (requestKey, result) -> {
                switch (result.getInt(OzLivenessSDK.Extra.EXTRA_RESULT_CODE)) {
                    case OzLivenessResultCode.SUCCESS: {/* start analysis */}
                    default: {/* show error */}
                }
            });
    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
      super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_CODE) {
          sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
          sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
        }
    }
    @Override
    protected void onActivityResult(int requestCode, int resultCode, @androidx.annotation.Nullable Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_CODE) {
            List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
            String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
        }
    }
    when (resultCode) {
        Activity.RESULT_CANCELED -> { /* user closed the screen */ }
        OzLivenessResultCode.SUCCESS -> {
            val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
            /* success */
        }
        else -> {
            val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
            /* failure */
        }
    }
    OzLivenessSDK.INSTANCE.getConfig().setCustomization(new UICustomization(
    // customization parameters for the toolbar
    new ToolbarCustomization(
        R.drawable.ib_close,
        new Color.ColorRes(R.color.white),
        R.style.Sdk_Text_Primary,
        new Color.ColorRes(R.color.white),
        R.font.roboto,
        Typeface.NORMAL,
        100, // toolbar text opacity (in %)
        18, // toolbar text size (in sp)
        new Color.ColorRes(R.color.black),
        60, // toolbar alpha (in %)
        "Liveness", // toolbar title
        true // center toolbar title
        ),
    // customization parameters for the center hint
    new CenterHintCustomization(
        R.font.roboto,
        new Color.ColorRes(R.color.text_color),
        20,
        50,
        R.style.Sdk_Text_Primary,
        new Color.ColorRes(R.color.color_surface),
        100, // background opacity
        14, // corner radius for background frame   
        100 // text opacity
        ),
    // customization parameters for the hint animation
    new HintAnimation(
        new Color.ColorRes(R.color.red), // gradient color
        80, // gradient opacity (in %)
        120, // the side size of the animation icon square
        false // hide animation
        ),
    // customization parameters for the frame around the user face
    new FaceFrameCustomization(
        GeometryType.RECTANGLE,
        10, // frame corner radius (for GeometryType.RECTANGLE)
        new Color.ColorRes(R.color.error_red), 
        new Color.ColorRes(R.color.success_green),
        100, // frame stroke alpha (in %)
        5, // frame stroke width (in dp)
        3 // frame stroke padding (in dp)
        ),
    // customization parameters for the background outside the frame
    new BackgroundCustomization(
        new Color.ColorRes(R.color.black),
        60 // background alpha (in %)
        ),
     // customization parameters for the SDK version text
     new VersionTextCustomization(
         R.style.Sdk_Text_Primary,
         R.font.roboto,
         12, // version text size
         new Color.ColorRes(R.color.white),
         100 // version text alpha
         ),
     // customization parameters for the antiscam protection text
     new AntiScamCustomization(
        "Recording .. ",
        R.font.roboto,
        12,
        new Color.ColorRes(R.color.text_color),
        100,
        R.style.Sdk_Text_Primary,
        new Color.ColorRes(R.color.color_surface),
        100,
        14,
        new Color.ColorRes(R.color.green)
        ),
    // custom logo parameters
     new LogoCustomization(
        new Image.Drawable(R.drawable.ic_logo),
        new Size(176, 64),
        100,
        50
        )
      )
    );
    OzLivenessSDK.config.localizationCode = OzLivenessSDK.OzLocalizationCode.EN
    OzLivenessSDK.INSTANCE.getConfig().setLocalizationCode(OzLivenessSDK.OzLocalizationCode.EN)
  • API: 6.4.1.
  • Web SDK: 1.9.2.

  • Native SDKs (iOS / Android): 8.22.

    Older versions are not compatible with the new capture and analysis flow.

    For best results, migrate all components simultaneously and avoid partial upgrades.

    hashtag
    Authentication Changes

    All capture sessions now require a session_token issued by the backend.

    • The token must be obtained before starting video capture.

    • It is tied to the current session and the specific container.

    • The token has a limited lifetime.

    hashtag
    Migration actions

    • Implement a backend call to obtain session_token.

    • Pass the token explicitly when creating the capture screen.
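The token requirements above can be wrapped in a small backend-side helper. Below is a minimal Python sketch, assuming a `fetch_token` callable that calls whatever endpoint your backend exposes for issuing Oz session tokens; the class name, TTL, and safety-margin values are illustrative, not part of the SDK.

```python
import time

class SessionTokenProvider:
    """Caches a short-lived session token and refreshes it before expiry.

    `fetch_token` is a callable returning a fresh token string; the endpoint
    it wraps is your own backend's token-issuing call (hypothetical here).
    """

    def __init__(self, fetch_token, ttl_seconds=300, safety_margin=30):
        self._fetch_token = fetch_token
        self._ttl = ttl_seconds          # assumed token lifetime
        self._margin = safety_margin     # refresh a bit before actual expiry
        self._token = None
        self._expires_at = 0.0

    def get(self, now=None):
        """Returns a token that is still valid, fetching a new one if needed."""
        now = time.time() if now is None else now
        if self._token is None or now >= self._expires_at - self._margin:
            self._token = self._fetch_token()
            self._expires_at = now + self._ttl
        return self._token
```

Since the token is tied to the current session, request a fresh one per capture session rather than sharing a cached token across users.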

    hashtag
    Flow Changes

    hashtag
    Before Container (Legacy Flow)

    1. You launch media capture.

    2. Media is captured.

    3. The media, along with the required data, is sent to Oz API (using your backend as an intermediary if needed).

    hashtag
    With Container (New Flow)

    1. You request session_token from backend.

    2. You put additional data, such as metadata (if needed), into the container and launch video capture with the session_token.

    3. The SDK captures media, packages it into the container, and returns an encrypted file.

    4. The encrypted file is sent to Oz API (using your backend as an intermediary if needed).
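Step 4 boils down to forwarding the encrypted container file as raw bytes. Here is a minimal Python sketch of assembling such a request on your backend; the URL and the auth header name are assumptions to be replaced with your actual Oz API integration details.

```python
def build_container_upload(container_bytes, session_token,
                           api_url="https://api.example.com/api/folders/"):
    """Builds (url, headers, body) for uploading an encrypted container.

    The default URL and the auth header name below are illustrative; check
    your Oz API integration for the exact endpoint and auth scheme.
    """
    if not isinstance(container_bytes, (bytes, bytearray)):
        raise TypeError("container must be raw bytes, not base64 or JSON")
    headers = {
        # the new flow requires the container to be sent as a binary stream
        "Content-Type": "application/octet-stream",
        "X-Forensic-Access-Token": session_token,  # assumed header name
    }
    return api_url, headers, bytes(container_bytes)
```

The body is the container file as-is: no base64 wrapping and no multipart form is assumed here.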

    hashtag
    Migration actions

    hashtag
    API

    • Upgrade to v6.4.1-40 or higher.

    • Switch the Content-Type of the data you send to application/octet-stream.

    • For Instant API, obtain private and public keys as described here.

    hashtag
    Web SDK

    • Upgrade to v1.9.2 or higher.

    • Ensure backend supports session_token.

    • In the configuration file, set use_wasm_container to true and api_use_session_token to api or client (please refer to this article for details).

    • Update initialization to pass token explicitly.

    • If you use the capture architecture type, ensure you receive a blob object (application/octet-stream) and send it to your backend and then on to us.

    hashtag
    Mobile SDKs

    • Upgrade to v8.22 or higher.

    • Ensure backend supports session_token.

    • Implement new interfaces as described below.

    Android

    1. Launching Capture Screen

    2. Delegate

    3. Launching Analysis

    iOS

    1. Launching Capture Screen

    2. Delegate

    3. Launching Analysis

    hashtag
    Migration Checklist

    val sessionToken: String = getSessionToken()
    val captureRequest = CaptureRequest(
        listOf(
            AnalysisProfile(
                Analysis.Type.QUALITY,
                listOf(MediaRequest.ActionMedia(OzAction.Blank)),
            )
        )
    )
    val intent = OzLivenessSDK.createStartIntent(captureRequest, sessionToken)
    startActivityForResult(intent, REQUEST_LIVENESS_CONTAINER)
    val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
    startActivityForResult(intent, REQUEST_LIVENESS_MEDIA)
    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_LIVENESS_CONTAINER) {
            when (resultCode) {
                OzLivenessResultCode.SUCCESS -> runAnalysis(OzLivenessSDK.getContainerFromIntent(data))
                OzLivenessResultCode.USER_CLOSED_LIVENESS -> { /* user closed the screen */ }
                else -> {
                    val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
                    /* show error */
                }
            }
        }
    }
    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_LIVENESS_MEDIA) {
            when (resultCode) {
                OzLivenessResultCode.SUCCESS -> runAnalysis(OzLivenessSDK.getResultFromIntent(data))
                OzLivenessResultCode.USER_CLOSED_LIVENESS -> { /* user closed the screen */ }
                else -> {
                    val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
                    /* show error */
                }
            }
        }
    }
    getSessionToken() { sessionToken in
                DispatchQueue.main.async {
                    do {
                        let action:OZVerificationMovement = .selfie
                        let mediaRequest = MediaRequest.action(action)
                        let profile = AnalysisProfile(mediaList: [mediaRequest],
                                                      type: .quality,
                                                      params: [:] )
                        let request = CaptureRequest(analysisProfileList: [profile], cameraPosition: .front)
                        let ozLivenessVC = try OZSDK.createMediaCaptureScreen(self, request, sessionToken: sessionToken)
                        self.present(ozLivenessVC, animated: true)
                    } catch let error {
                        print(error.localizedDescription)
                    }
                }
            }
    do {
        let ozLivenessVC : UIViewController = try OZSDK.createVerificationVCWithDelegate(self, actions: .selfie)   
        self.present(ozLivenessVC, animated: true)
    } catch let error {
        print(error.localizedDescription)
    }
    extension ViewController: LivenessDelegate {
        func onResult(container: DataContainer) {
        }
        
        func onError(status: OZVerificationStatus?) {
        }
    }
    extension ViewController: OZLivenessDelegate {
        func onOZLivenessResult(results: [OZMedia]) {
        }
      
        func onError(status: OZVerificationStatus?) {
        }
    }
    private fun runAnalysis(container: DataContainer?) {
        if (container == null) return
    
        AnalysisRequest.Builder()
            .addContainer(container)
            .build()
            .run(
                { result ->
                    val isSuccess = result.analysisResults.all { it.resolution == Resolution.SUCCESS }
                },
                { /* show error */ },
                { /* update status */ },
            )
    }
    private fun runAnalysis(media: List<OzAbstractMedia>?) {
        if (media.isNullOrEmpty()) return
        AnalysisRequest.Builder()
            .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, media))
            .build()
            .run(
                { result ->
                    val isSuccess = result.analysisResults.all { it.resolution == Resolution.SUCCESS }
                },
                { /* show error */ },
                { /* update status */ },
            )
    }
    func onResult(container: DataContainer) {
      let analysisRequest = AnalysisRequestBuilder()
      analysisRequest.addContainer(container)
      analysisRequest.run(
                statusHandler: { status in
                },
                errorHandler: { error in
                }
            ) { result in
                
            }
    }
    func onResult(results: [OZMedia]) {
            let analysisRequest = AnalysisRequestBuilder()
            let analysis = Analysis(media: results,
                                    type: .quality,
                                    mode: .serverBased,
                                    params: nil)
            analysisRequest.addAnalysis(analysis)
            analysisRequest.run { status in
            } errorHandler: { error in
            } completionHandler: { results in
                
            }
        }

    Capturing Video and Description of the on_capture_complete Callback

    In this article, you’ll learn how to capture videos and send them through your backend to Oz API.

    hashtag
    1. Overview

    Here is the data flow for your scenario:

    1. Oz Web SDK takes a video and makes it available for the host application as a frame sequence.

    2. The host application calls your backend, passing an archive of these frames.

    3. After the necessary preprocessing steps, your backend calls Oz API, which performs all necessary analyses and returns the analyses’ results.

    4. Your backend responds back to the host application if needed.

    hashtag
    2. Implementation

    On the server side, Web SDK must be configured to operate in the Capture mode:

    The architecture parameter must be set to capture in the app_config.json file.

    In your Web app, add a callback to process captured media when opening the Web SDK:

    The result object structure depends on whether any virtual camera is detected or not.

    hashtag
    No Virtual Camera Detected

    hashtag
    Any Virtual Camera Detected

    Here’s the list of variables with descriptions.

    hashtag
    Please note:

    • The video from Oz Web SDK is a frame sequence, so, to send it to Oz API, you’ll need to archive the frames and transmit them as a ZIP file via the POST /api/folders request (check our Postman collections).

    • You can retrieve the MP4 video from a folder using the /api/folders/{{folder_id}} request with this folder's ID. In the JSON that you receive, look for the preview_url in source_media.

    • Oz API accepts data without the base64 encoding.
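To turn the frame sequence into a single ZIP attachment for POST /api/folders, the frames need to be decoded from their data URL form and archived. Here is a minimal Python sketch of that step; it assumes JPEG frames in the `data:image/jpeg;base64,...` form returned in frame_list, and the file-naming scheme inside the archive is illustrative.

```python
import base64
import io
import zipfile

def frames_to_zip(frame_list):
    """Packs data-URL frames (as returned in frame_list) into an in-memory ZIP.

    Returns the ZIP file contents as bytes, ready to be sent as the request
    attachment. Frame names inside the archive are an arbitrary choice here.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for i, data_url in enumerate(frame_list):
            header, _, payload = data_url.partition(",")
            if not header.startswith("data:") or not payload:
                raise ValueError("not a data URL: %r" % data_url[:32])
            # decode the base64 payload back into raw JPEG bytes
            zf.writestr("frame_%03d.jpg" % i, base64.b64decode(payload))
    return buf.getvalue()
```

The resulting bytes go into the request body directly; as noted above, Oz API accepts the data without base64 encoding.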

    How to Install and Use Oz Flutter Plugin

    Please find the Flutter repository here.

    hashtag
    Installation and Licensing

    Add the lines below in pubspec.yaml of the project you want to add the plugin to.

    For 8.22 and above:

    For 8.21 and below:

    Add the license file (e.g., license.json or forensics.license) to the Flutter application/assets folder. In pubspec.yaml, specify the Flutter asset:

    For Android, add the Oz repository to /android/build.gradle, allprojects → repositories section:

    For Flutter 8.24.0 and above or Android Gradle plugin 8.0.0 and above, add to android/gradle.properties:

    The minimum SDK version should be 21 or higher:

    For iOS, set the minimum platform to 13 or higher in the Runner → Info → Deployment target → iOS Deployment Target.

    In ios/Podfile, comment the use_frameworks! line (#use_frameworks!).

    hashtag
    Getting Started with Flutter

    hashtag
    Initializing SDK

    Initialize SDK by calling the init plugin method. Note that the license file name and path should match the ones specified in pubspec.yaml (e.g., assets/license.json).

    hashtag
    Connecting SDK to API

    Use the API credentials (login, password, and API URL) that you’ve received from us.

    In production, instead of hard-coding the login and password inside the application, it is recommended to get the access token on your backend via the API auth method, then pass it to your application:

    By default, logs are saved along with the analyses' data. If you need to keep the logs distinct from the analysis data, set up a separate connection for telemetry as shown below:

    or

    hashtag
    Capturing Videos

    To start recording, call the startLiveness method:

    Please note: for versions 8.11 and below, the method name is executeLiveness, and it returns the recorded media.

    To obtain the media result, subscribe to livenessResult as shown below:

    hashtag
    Checking Liveness and Face Biometry

    To run the analyses, execute the code below.

    Create the Analysis object:

    Execute the formed analysis:

    If you need to run an analysis for a particular folder, pass its ID:

    The analysisResult list of objects contains the result of the analysis.

    If you want to use media captured by another SDK, the code should look like this:

    The whole code block will look like this:

    ozsdk: ^8.22.0

    Variable | Type | Description

    best_frame | String | The best frame, JPEG in the data URL format

    best_frame_png | String | The best frame, PNG in the data URL format; it is required for protection against virtual cameras when video is not used

    best_frame_bounding_box | Array[Named_parameter: Int] | The coordinates of the bounding box where the face is located in the best frame

    best_frame_landmarks | Array[Named_parameter: Array[Int, Int]] | The coordinates of the face landmarks (left eye, right eye, nose, mouth, left ear, right ear) in the best frame

    frame_list | Array[String] | All frames in the data URL format

    frame_bounding_box_list | Array[Array[Named_parameter: Int]] | The coordinates of the bounding boxes where the face is located in the corresponding frames

    frame_landmarks | Array[Named_parameter: Array[Int, Int]] | The coordinates of the face landmarks (left eye, right eye, nose, mouth, left ear, right ear) in the corresponding frames

    action | String | An action code

    additional_info | String | Information about the client environment

    • The preview_url parameter contains the link to the video. From the plugin, MP4 videos are unavailable (only as frame sequences).

    • Also, in the POST {{host}}/api/folders request, you need to add the additional_info field. It is required for the capture architecture mode to gather the necessary information about the client environment. Here’s an example of filling in the request’s body:

    Parameter | Type | Description

    actions | List<VerificationAction> | Actions from the captured video

    use_main_camera | Boolean | If True, uses the main camera, otherwise the front one.
    OzLiveness.open({
      ... // other parameters
      on_capture_complete: function(result) {
             // Your code to process media/send it to your API, this is STEP #2
      }
    })
    {
    	"action": <action>,
    	"best_frame": <bestframe>,
    	"best_frame_png": <bestframe_png>,
    	"best_frame_bounding_box": {
    		"left": <bestframe_bb_left>,
    		"top": <bestframe_bb_top>,
    		"right": <bestframe_bb_right>,
    		"bottom": <bestframe_bb_bottom>
    		},
    	"best_frame_landmarks": {
    		"left_eye": [bestframe_x_left_eye, bestframe_y_left_eye],
    		"right_eye": [bestframe_x_right_eye, bestframe_y_right_eye],
    		"nose_base": [bestframe_x_nose_base, bestframe_y_nose_base],
    		"mouth_bottom": [bestframe_x_mouth_bottom, bestframe_y_mouth_bottom],
    		"left_ear": [bestframe_x_left_ear, bestframe_y_left_ear],
    		"right_ear": [bestframe_x_right_ear, bestframe_y_right_ear]
    		},
    	"frame_list": [<frame1>, <frame2>],
    	"frame_bounding_box_list": [
    		{
    		"left": <frame1_bb_left>,
    		"top": <frame1_bb_top>,
    		"right": <frame1_bb_right>,
    		"bottom": <frame1_bb_bottom>
    		},
    		{
    		"left": <frame2_bb_left>,
    		"top": <frame2_bb_top>,
    		"right": <frame2_bb_right>,
    		"bottom": <frame2_bb_bottom>
    		},
    	],
    	"frame_landmarks": [
    		{
    		"left_eye": [frame1_x_left_eye, frame1_y_left_eye],
    		"right_eye": [frame1_x_right_eye, frame1_y_right_eye],
    		"nose_base": [frame1_x_nose_base, frame1_y_nose_base],
    		"mouth_bottom": [frame1_x_mouth_bottom, frame1_y_mouth_bottom],
    		"left_ear": [frame1_x_left_ear, frame1_y_left_ear],
    		"right_ear": [frame1_x_right_ear, frame1_y_right_ear]
    		},
    		{
    		"left_eye": [frame2_x_left_eye, frame2_y_left_eye],
    		"right_eye": [frame2_x_right_eye, frame2_y_right_eye],
    		"nose_base": [frame2_x_nose_base, frame2_y_nose_base],
    		"mouth_bottom": [frame2_x_mouth_bottom, frame2_y_mouth_bottom],
    		"left_ear": [frame2_x_left_ear, frame2_y_left_ear],
    		"right_ear": [frame2_x_right_ear, frame2_y_right_ear]
    		}
    	],
    "from_virtual_camera": null,
    "additional_info": <additional_info>
    }
    {
    	"action": <action>,
    	"best_frame": null,
    	"best_frame_png": null,
    	"best_frame_bounding_box": null,
    	"best_frame_landmarks": null
    	"frame_list": null,
    	"frame_bounding_box_list": null,
    	"frame_landmarks": null,
    	"from_virtual_camera": {
    	"additional_info": <additional_info>,
    		"best_frame": <bestframe>,
    		"best_frame_png": <best_frame_png>,
    		"best_frame_bounding_box": {
    			"left": <bestframe_bb_left>,
    			"top": <bestframe_bb_top>,
    			"right": <bestframe_bb_right>,
    			"bottom": <bestframe_bb_bottom>
    			},
    		"best_frame_landmarks": {
    			"left_eye": [bestframe_x_left_eye, bestframe_y_left_eye],
    			"right_eye": [bestframe_x_right_eye, bestframe_y_right_eye],
    			"nose_base": [bestframe_x_nose_base, bestframe_y_nose_base],
    			"mouth_bottom": [bestframe_x_mouth_bottom, bestframe_y_mouth_bottom],
    			"left_ear": [bestframe_x_left_ear, bestframe_y_left_ear],
    			"right_ear": [bestframe_x_right_ear, bestframe_y_right_ear]
    			},
    		"frame_list": [<frame1>, <frame2>],
    		"frame_bounding_box_list": [
    			{
    			"left": <frame1_bb_left>,
    			"top": <frame1_bb_top>,
    			"right": <frame1_bb_right>,
    			"bottom": <frame1_bb_bottom>
    			},
    			{
    			"left": <frame2_bb_left>,
    			"top": <frame2_bb_top>,
    			"right": <frame2_bb_right>,
    			"bottom": <frame2_bb_bottom>
    			},
    			],
    		"frame_landmarks": [
    			{
    			"left_eye": [frame1_x_left_eye, frame1_y_left_eye],
    			"right_eye": [frame1_x_right_eye, frame1_y_right_eye],
    			"nose_base": [frame1_x_nose_base, frame1_y_nose_base],
    			"mouth_bottom": [frame1_x_mouth_bottom, frame1_y_mouth_bottom],
    			"left_ear": [frame1_x_left_ear, frame1_y_left_ear],
    			"right_ear": [frame1_x_right_ear, frame1_y_right_ear]
    			},
    			{
    			"left_eye": [frame2_x_left_eye, frame2_y_left_eye],
    			"right_eye": [frame2_x_right_eye, frame2_y_right_eye],
    			"nose_base": [frame2_x_nose_base, frame2_y_nose_base],
    			"mouth_bottom": [frame2_x_mouth_bottom, frame2_y_mouth_bottom],
    			"left_ear": [frame2_x_left_ear, frame2_y_left_ear],
    			"right_ear": [frame2_x_right_ear, frame2_y_right_ear]
    			}
    		]
    	}
    }
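When consuming this callback result on your backend, it is convenient to normalize the two shapes above, since the same fields move under the from_virtual_camera key when a virtual camera is detected. A small Python sketch, assuming the result has already been parsed into a dictionary; the helper name and output keys are illustrative.

```python
def extract_capture_payload(result):
    """Returns the frame data regardless of virtual-camera detection.

    When a virtual camera is detected, the fields live under the
    'from_virtual_camera' key (see the structures above); this helper picks
    whichever location is populated. Field names follow the callback schema.
    """
    inner = result.get("from_virtual_camera")
    source = inner if inner else result
    return {
        "virtual_camera_detected": bool(inner),
        "best_frame": source.get("best_frame"),
        "frame_list": source.get("frame_list") or [],
        "additional_info": source.get("additional_info"),
    }
```

Downstream code can then branch on `virtual_camera_detected` once instead of checking nested keys at every access.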
    "VIDEO_FILE_KEY": VIDEO_FILE_ZIP_BINARY
    "payload": "{
            "media:meta_data": {
               "VIDEO_FILE_KEY": {
                  "additional_info": <additional_info>
                  }
               }
    }"
      ozsdk:
        git:
          url: https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin.git
          ref: '8.8.2'
    assets:
      - assets/license.json # note that the license file name must match the one placed in assets
    allprojects {
        repositories {
            google()
            mavenCentral()
        maven { url 'https://ozforensics.jfrog.io/artifactory/main' } // repository URL
        }
    }
    android.nonTransitiveRClass=false
    defaultConfig {
      ...
      minSdkVersion 21
      ...
    }
    await OZSDK.initSDK([<% license path and license file name %>]);
    await OZSDK.setApiConnectionWithCredentials(<login>, <password>, <host>);
     await OZSDK.setApiConnectionWithToken(token, host);
    await OZSDK.setEventConnectionWithCredentials(<login>, <password>, <host>);
    await OZSDK.setEventConnectionWithToken(<token>, <host>);
    await OZSDK.startLiveness(<actions>, <use_main_camera>);
    class Screen extends StatefulWidget {
      static const route = 'liveness';
    
      const Screen({super.key});
    
      @override
      State<Screen> createState() => _ScreenState();
    }
    
    class _ScreenState extends State<Screen> {
      late StreamSubscription<List<Media>> _subscription;
    
      @override
      void initState() {
        super.initState();
    
        // subscribe to liveness result
        _subscription = OZSDK.livenessResult.listen(
          (List<Media> medias) {
              // media contains liveness media
          },
          onError: (Object error) {
            // handle error, in most cases PlatformException
          },
        );
      }
    
      @override
      Widget build(BuildContext context) {
        // omitted to shorten the example
      }
    
      void _startLiveness() async {
        // use startLiveness to start liveness screen
        OZSDK.startLiveness(<list of actions>);
      }
    
      @override
      void dispose() {
        // cancel subscription
        _subscription.cancel();
        super.dispose();
      }
    }
    List<Analysis> analysis = [ Analysis(Type.quality, Mode.serverBased, <media>, {}), ];
    final analysisResult = await OZSDK.analyze(analysis, [], {});
    final analysisResult = await OZSDK.analyze(analysis, folderID, [], {});
    media = Media(FileType.documentPhoto, VerificationAction.oneShot, "photo_selfie", null, <path to image>, null, null, "")
    // replace VerificationAction.blank with your Liveness gesture if needed
    final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);
    
    final analysis = [
      Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
    ];
    
    final analysisResult = await OZSDK.analyze(analysis, [], {});
    // replace VerificationAction.blank with your Liveness gesture if needed
    final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);
    final biometryMedia = [...cameraMedia];
    biometryMedia.add(
      Media(
        FileType.documentPhoto,
        VerificationAction.blank,
        MediaType.movement,
        null,
        <your reference image path>,
        null,
        null,
        MediaTag.photoSelfie,
      ),
    );
    
    final analysis = [
      Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
      Analysis(Type.biometry, Mode.serverBased, biometryMedia, {}),
    ];
    
    final analysisResult = await OZSDK.analyze(analysis, [], {});

    Changelog

    API changes

    hashtag
    6.4.2-18 – Feb. 12, 2026

    • Breaking changes: if you use S3, set OZ_STATIC_S3_BUCKET_URL="None" to avoid breaking the S3 connection.

    • You can now add a container to an existing folder using the POST api/folders/{folder_id}/media method.

    • Improved performance of the Collection analysis.

    • You can now trim a video if it is too long. Set the OZ_FFMPEG_VIDEO_CROPPING_ENABLED parameter to true to enable automatic trimming of videos longer than OZ_VIDEO_DURATION_MAX. If a video is trimmed, the corresponding records will be added to logs.

    • If a container cannot be processed (error 14), we now save the corresponding order’s data to ease investigation.

    • If a container is considered invalid, API logs now include the corresponding folder_id to simplify the search.

    • Fixed folder deletion in local-local and local-s3 configurations to properly remove static files.

    hashtag
    6.4.1-40 – Dec. 24, 2025

    • Implemented a new proprietary data format: OzCapsula Data Container.

    • Resolved the issue with occasional false rejections using videos received from Web SDK.

    • The Collection analysis now ignores images tagged with photo_id_back.

    hashtag
    6.4.0 – Nov. 24, 2025

    circle-info

    Only for SaaS.

    • Updated API to support upcoming features.

    hashtag
    6.3.5 – Nov. 03, 2025

    • Fixed bugs that could cause the GET /api/folders/ request to return incorrect results.

    hashtag
    6.3.4 – Oct. 20, 2025

    • Updated API to support upcoming features.

    • Fixed bugs.

    hashtag
    6.3.3 – Sept. 29, 2025

    • Resolved an issue with POST /api/instant/folders/ and POST /api/folders/ returning “500 Internal Server Error” when the video sent is corrupted. The system now returns “400 Bad Request”.

    • Updated API to support upcoming features.

    hashtag
    6.3.0 – Aug 05, 2025

    • Updated API 6 to support Kazakhstan regulatory requirements: added the functionality of extracting action shots from videos of a person performing gestures.

    • You can remove admin rights from a CLIENT ADMIN user and change their role to CLIENT via PATCH /api/users/{{user_id}}.

    • You can generate a service token for a user with the OPERATOR role.

    hashtag
    6.2.5 – June 18, 2025

    • Optimization and performance updates.

    hashtag
    6.2.3 – June 02, 2025

    • Analyses can now be done in parallel with each other. To enable this feature, add the OZ_ANALYSE_PARALLELED_CHECK_MEDIA_ENABLED parameter to config.py and set it to true (the default value is false).

    • For the instant mode, authorization can be disabled: add the OZ_AUTHORIZE_DISABLED_STATELESS parameter to config.py and set it to true (the default value is false) to use the instant API without authorization.
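A minimal config.py fragment combining both settings from this release; the values shown are illustrative (both parameters default to false).

```python
# config.py fragment (illustrative values)
OZ_ANALYSE_PARALLELED_CHECK_MEDIA_ENABLED = True  # run analyses in parallel
OZ_AUTHORIZE_DISABLED_STATELESS = True            # instant mode without authorization
```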

    hashtag
    6.0.1 – Apr. 30, 2025

    Please note: this version doesn't support the Kazakhstan regulatory requirements.

    • Optimized storage and database.

    • Implemented the single request mode, which involves creating a folder and executing analyses in a single request by attaching the analysis part to the payload.

    • Implemented the instant analysis mode, without storing any data locally or in the database. This mode can be used either with or without other API components.


    hashtag
    5.3.1 – Dec. 24, 2024

    • Improved the resource efficiency of server-based biometry analysis.

    hashtag
    5.3.0 – Nov. 27, 2024

    • API can now extract action shots from videos of a person performing gestures. This is done to comply with the new Kazakhstan regulatory requirements for biometric identification.

    • Created a new report template that also complies with the requirements mentioned above.

    • If action shots are enabled, the thumbnails for the report are generated from them.

    hashtag
    5.2.0 – Sept. 06, 2024

    • Updated the Postman collection. Please see the new collection at https://apidoc.ozforensics.com.

    • Added the new method to check the timezone settings: GET {{host}}/api/config

    • Added parameters to the GET {{host}}/api/event_sessions method:

    hashtag
    5.1.1 – July 16, 2024

    • Security updates.

    hashtag
    5.1.0 – Mar. 20, 2024

    • Face Identification 1:N is now live, significantly increasing the data processing capacity of the Oz API to find matches. Even huge face databases (containing millions of photos and more) are no longer an issue.

    • The Liveness (QUALITY) analysis now ignores photos tagged with photo_id, photo_id_front, or photo_id_back, preventing these photos from causing the tag-related analysis error.

    hashtag
    5.0.1 – July 16, 2024

    • Security updates.

    hashtag
    5.0.0 – Nov. 17, 2023

    • You can now apply the Liveness (QUALITY) analysis to a single image.

    • Fixed the bug where the Liveness analysis could finish with the SUCCESS result with no media uploaded.

    • The default value for the extract_best_shot parameter is now True.

    hashtag
    4.0.8-patch1 – July 16, 2024

    • Security updates.

    hashtag
    4.0.8 – May 22, 2023

    • Set up log autorotation.

    • Added the CLI command for user deletion.

    • You can now switch off the video preview generation.

    hashtag
    4.0.2 – Sept. 13, 2022

    • For the sliced video, the system now deletes the unnecessary frames.

    • Added new methods: GET and POST at media/<media_id>/snapshot/.

    • Replaced the default report template.

    hashtag
    3.33.0

    • The Photo Expert and KYC modules are now removed.

    • The endpoint for the user password change is now POST users/user_id/change-password instead of PATCH.

    hashtag
    3.32.1

    • Provided log for the Celery app.

    hashtag
    3.32.0

    • Added filters to the Folder [LIST] request parameters: analyse.time_created, analyse.results_data for the Documents analysis, results_data for the Biometry analysis, results_media_results_data for the QUALITY analysis. To enable filters, set the with_results_media_filter query parameter to True.
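A sketch of enabling these filters on a Folder [LIST] request. The parameter names come from the entry above; the host and value formats are assumptions for illustration only.

```python
from urllib.parse import urlencode

# Build a Folder [LIST] query with results_media filters enabled.
params = {
    "with_results_media_filter": "True",
    "analyse.time_created": "1704067200",  # hypothetical timestamp value
}
query = urlencode(params)
folder_list_url = f"https://api.example.com/api/folders/?{query}"
```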

    hashtag
    3.31.0

    • Added a new attribute for users – is_active (default True). If is_active == False, any user operation is blocked.

    • Added a new exception code (1401 with status code 401) for the actions of the blocked users.

    hashtag
    3.30.0

    • Added shots sets preview.

    • You can now save a shots set archive to a disk (with the original_local_path, original_url attributes).

    • A new original_info attribute is added to store the md5, size, and mime-type of a shots set.

    hashtag
    3.29.0

    • Added health check at GET api/healthcheck.

    hashtag
    3.28.1

    • Fixed the shots set thumbnail URL.

    hashtag
    3.28.0

    • The first frame of a shots set is now used as that shots set's thumbnail.

    hashtag
    3.27.0

    • Modified the retry policy: the default max count of analysis attempts is increased to 3, and a jitter configuration was introduced.
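A generic sketch of a capped-retry policy with jitter, as described in the entry above; the base delay and jitter range are invented for illustration and do not reflect the product's actual defaults.

```python
import random

MAX_ANALYSIS_ATTEMPTS = 3  # the new default max count of attempts

def retry_delays(base: float = 1.0, jitter: float = 0.5, rng=random):
    """Return one delay per attempt, each offset by a random jitter
    so that simultaneous retries do not all fire at the same moment."""
    return [base * n + rng.uniform(0.0, jitter)
            for n in range(1, MAX_ANALYSIS_ATTEMPTS + 1)]
```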

    • Changed the callback algorithm.

    • Refactored and documented the command line tools.

    hashtag
    3.25.0

    • Changed the delete personal information endpoint and method from delete_pi to /pi and from POST to DELETE, respectively.

    hashtag
    3.23.1

    • Improved the delete personal information algorithm.

    • It is now forbidden to add media to cleaned folders.

    hashtag
    3.23.0

    • Changed the authorize/restore endpoint name from auth to auth_restore.

    • Added a new tag – video_selfie_oneshot.

    hashtag
    3.22.2

    • Fixed a bug with no error while trying to synchronize empty collections.

    • If persons are uploaded, the analyse collection TFSS request is sent.

    hashtag
    3.22.0

    • Added the fields_to_check parameter to document analysis (by default, all fields are checked).

    • Added the double_page_spread parameter to document analysis (True by default).

    hashtag
    3.21.3

    • Fixed collection synchronization.

    hashtag
    3.21.0

    • The authorization token can now be refreshed using expire_token.

    hashtag
    3.20.1

    • Added support for application/x-gzip.

    hashtag
    3.20.0

    • Renamed shots_set.images to shots_set.frames.

    hashtag
    3.18.0

    • Added user sessions API.

    • Users can now change a folder owner (limited by permissions).

    • Changed dependencies rules.

    hashtag
    3.16.0

    • Moved oz_collection_binding (the collection synchronization functionality) to oz_core.

    hashtag
    3.15.3

    • Simplified the shots sets functionality: one archive now keeps one shots set.

    hashtag
    3.15.2

    • Improved the document sides recognition for the docker version.

    hashtag
    3.15.1

    • Moved the orientation tag check to liveness at quality analysis.

    hashtag
    3.15.0

    • Added a default report template for Admin and Operator.

    hashtag
    3.14.0

    • Updated the biometric model.

    hashtag
    3.13.2

    • A new ShotsSet object is not created if there are no photos for it.

    • Updated the data exchange format for the documents' recognition module.

    hashtag
    3.13.1

    • You can’t delete a Collection if analyses are associated with its Collection Persons.

    hashtag
    3.13.0

    • Added time marks to analysis: time_task_send_to_broker, time_task_received, time_task_finished.

    hashtag
    3.12.0

    • Added a new authorization engine. You can now connect with Active Directory by LDAP (settings configuration required).

    hashtag
    3.11.0

    • A new type of media in Folders – "shots_set".

    • You can’t delete a CollectionPerson if there are analyses associated with it.

    hashtag
    3.10.0

    • Renamed the folder field resolution_suggest to operator_status.

    • Added a folder text field operator_comment.

    hashtag
    3.9.0

    • Fixed a deletion error where deleting a report’s author also deleted their reports.

    hashtag
    3.8.1

    • Client can now view only their own profile.

    hashtag
    3.8.0

    • Client Operator can now edit only their profile.

    • Client can't delete own folders, media, reports, or analyses anymore.

    • Client Service can now create Collection Person and read reports within their company.

    hashtag
    3.7.1

    • Client, Client Admin, and Client Operator have read access to user profiles only within their company.

    • A/B testing is now available.

    • Added support for expiration date header.

    hashtag
    3.7.0

    • Added a new role of Client Operator (like Client Admin without permissions for company and account management).

    • Client Admin and Client Operator can change the analysis status.

    • Only Admin and Client Admin (for their company) can create, update and delete operations for Collection and CollectionPerson models from now on.

    Internal improvements.

    Security updates.

  • Fixed the issue with MP4 videos that sometimes could not be played after downloading from SDK.

  • We now return the correct error for the non-authorized requests.

  • Fixed the bug with “spontaneous” error 500 that had been caused by too few frames in video. Added the check for the number of frames and more descriptive error messages.

  • Performance, security, and installation updates.

  • You can now combine working schemes based on the asynchronous method or a Celery worker (local processing, Celery processing). Added S3 storage mechanics for each of the combinations.
  • Implemented security updates.

  • We no longer support RAR archives.

  • We no longer support Active Directory. This functionality will be returned in the upcoming releases.

  • Improved mechanics for calculating analysis time.

  • Replaced the is_admin and is_service flags for the CLIENT role with new roles: CLIENT ADMIN and CLIENT SERVICE, respectively. Set the roles in user_type.

  • To issue a service token for a user via {{host}}/api/authorize/service_token/, this user must have the CLIENT SERVICE role. You can also create a token for another user with this role: call {{host}}/api/authorize/service_token/{user_id}.
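The two endpoint forms above differ only in whether a user_id is appended. A small sketch of the URL construction (the host is a placeholder; authentication details are omitted):

```python
from typing import Optional

def service_token_url(host: str, user_id: Optional[str] = None) -> str:
    """Without user_id: a token for yourself (CLIENT SERVICE role required);
    with user_id: a token for another user holding that role."""
    base = f"{host}/api/authorize/service_token/"
    return base + user_id if user_id else base
```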

  • Removed collection and person attributes from COLLECTION.analysis.

  • We no longer store separate objects for each frame in SHOTS_SET. If you want to save an image from your video, consider enabling best shot.

  • We no longer support Podman for installation.

  • Updated the API reference: Oz API 6.0.

  • Changed endpoints and parameters:

    can_start_analyse_documents → can_start_analysis_documents

    can_start_analyse_quality → can_start_analysis_quality

    expire_date in {{host}}/api/authorize/auth and {{host}}/api/authorize/refresh → access_token.exp from payload

    session_id in {{host}}/api/authorize/auth and {{host}}/api/authorize/refresh → token_id

  • time_created

  • time_created.min

  • time_created.max

  • time_updated

  • time_updated.min

  • time_updated.max

  • session_id

  • session_id.exclude

  • sorting

  • offset

  • limit

  • total_omit

  • If you create a folder using SHOT_SET, the corresponding video will be in media.video_url.

  • Fixed the bug with CLIENT ADMIN being unable to change passwords for users from their company.

  • RAR archives are no longer supported.

  • By default, analyses.results_media.results_data now contains the confidence_spoofing parameter. However, if you need all three parameters for backward compatibility, you can change the response back to confidence_replay, confidence_liveness, and confidence_spoofing.
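A sketch of reading results_data in both response shapes described above; the exact dict layout is an assumption for illustration.

```python
def spoofing_score(results_data: dict) -> float:
    """confidence_spoofing is present in both the default single-parameter
    response and the backward-compatible three-parameter one."""
    return results_data["confidence_spoofing"]

def is_backward_compatible(results_data: dict) -> bool:
    """True when the response carries all three confidence parameters."""
    required = {"confidence_replay", "confidence_liveness", "confidence_spoofing"}
    return required <= results_data.keys()
```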

  • Updated the default PDF report template.

  • The name of the PDF report now contains folder_id.

  • The ADMIN access token is now valid for 5 years.
  • Added the folder identifier folder_id to the report name.

  • Fixed bugs and optimized the API work.

  • The shot set preview now keeps images’ aspect ratio.

  • ADMIN and OPERATOR receive system_company as a company they belong to.

  • Added the company_id attribute to User, Folder, Analyse, Media.

  • Added the Analysis group_id attribute.

  • Added the system_resolution attribute to Folder and Analysis.

  • The analysis resolution_status now returns the system_resolution value.

  • Removed the PATCH method for collections.

  • Added the resolution_status filter to Folder Analyses [LIST] and analyse.resolution_status filter to Folder [LIST].

  • Added the audit log for Folder, User, Company.

  • Improved the company deletion algorithm.

  • Reforged the blacklist processing logic.

  • Fixed a few bugs.

  • Fixed ReportInfo for shots sets.

  • Refactored modules.
    Added the password validation setting (OZ_PASSWORD_POLICY).
  • Added auth, rest_unauthorized, and rps_with_token throttling (configure via OZ_THROTTLING_RATES; off by default).

  • User permissions are now used to access static files (OZ_USE_PERMISSIONS_FOR_STATIC in configuration, false by default).

  • Added a new folder endpoint – /delete_pi. It clears all personal information from a folder and analyses related to this folder.

  • Changed the access_token prolongation policy to fix a bug where the token was prolonged before the expiration permission was checked.

  • The folder fields operator_status and operator_comment can be edited only by Admin, Operator, Client Service, Client Operator, and Client Admin.
  • Only Admin and Client Admin can delete folder, folder media, report template, report template attachments, reports, and analyses (within their company).

  • Added document recognition module Standalone/Dockered binding support.

    Added a check for user permissions to report template when creating a folder report.

  • Collection creation now returns status code 201 instead of 200.

  • Deprecated endpoints and parameters, with their replacements:

    PATCH users/{{user_id}}/ (change user password) → POST /api/users/{{user_id}}/change-password

    DELETE images|media/<media_id> (delete an image of a person from a collection) → DELETE collections/<collection_id>/persons/<person_id>/images/<media_id>/

    image_id, video_id, and shots_set_id → media_id

    analyse_id → analysis_id

    can_start_analyse_biometry → can_start_analysis_biometry

    can_start_analyse_collection → can_start_analysis_collection


    Changelog

    iOS SDK changes

    hashtag
    9.0.0 – Feb. 17, 2026

    • You can now customize the logo position (if allowed by license).

    • The Scan gesture is now voiced properly.

    hashtag
    8.23.0 – Feb. 06, 2026

    • Updated the code sample to align with the OzCapsula functionality.

    • Long antiscam messages are now displayed properly.

    • SDK no longer crashes when Liveness screen is called several times in a row with short intervals.

    hashtag
    8.22.0 – Dec. 09, 2025

    • Implemented a new proprietary data format: OzCapsula Data Container.

    • Resolved the issue with SDK not returning the license-related callbacks.

    • Enhanced security.

    hashtag
    8.21.0 – Nov. 21, 2025

    • Resolved the issue with SDK sometimes not responding to user actions on some devices.

    • Updated SDK to support the upcoming security features.

    hashtag
    8.20.0 – Oct. 17, 2025

    • Fixed the bug with crashes that might happen during the Biometry analysis after taking a reference photo using camera.

    • Enhanced security.

    hashtag
    8.19.0 – Sept. 23, 2025

    • The Scan gesture animation now works properly.

    • Fixed the bug where SDK didn’t call completion during initialization in debug mode.

    • Enhanced security.

    hashtag
    8.18.2 – Aug. 22, 2025

    • Addressed an SDK crash that occasionally happened when invoking the license.

    hashtag
    8.18.1 – Aug. 06, 2025

    circle-exclamation

    We highly recommend updating to this version.

    • Resolved the issue with integration via Swift UI.

    • SDK no longer crashes on smartphones that are running low on storage.

    • Security and telemetry updates.

    hashtag
    8.17.0 – June 12, 2025

    • Security updates.

    hashtag
    8.16.2 – Apr. 22, 2025

    • Xcode updated to version 16 to comply with Apple requirements.

    hashtag
    8.16.1 – Apr. 09, 2025

    • Security updates.

    hashtag
    8.16.0 – Mar. 11, 2025

    • Updated the authorization logic.

    • Improved voiceover.

    • SDK now compresses videos if their size exceeds 10 MB.

    hashtag
    8.15.0 – Dec. 30, 2024

    • Changed the wording for the head_down gesture: the new wording is “tilt down”.

    • Added proper focus order for VoiceOver when the antiscam hint is enabled.

    • Added the public setting extract_action_shot in the Demo Application.

    hashtag
    8.14.0 – Dec. 3, 2024

    • Accessibility updates according to WCAG requirements: the SDK hints and UI controls can be voiced.

    • Improved user experience with head movement gestures.

    • Minor bug fixes and telemetry updates.

    hashtag
    8.13.0 – Nov. 11, 2024

    • The screen brightness no longer changes when the rear camera is used.

    • Fixed the video recording issues on some smartphone models.

    • Security and telemetry updates.

    hashtag
    8.12.2 – Oct. 24, 2024

    • Internal SDK improvements.

    hashtag
    8.12.1 – Sept. 30, 2024

    • Added Xcode 16 support.

    • Security and telemetry updates.

    hashtag
    8.11.0 – Sept. 10, 2024

    • Security updates.

    hashtag
    8.10.1 – Aug. 23, 2024

    • Bug fixes.

    hashtag
    8.10.0 – Aug. 1, 2024

    • SDK now requires Xcode 15 and higher.

    • Security updates.

    • Bug fixes.

    hashtag
    8.9.1 – July 16, 2024

    • Internal SDK improvements.

    hashtag
    8.8.3 – July 11, 2024

    • Internal SDK improvements.

    hashtag
    8.8.2 – June 25, 2024

    • Bug fixes.

    hashtag
    8.8.1 – June 25, 2024

    • Logging updates.

    hashtag
    8.8.0 – June 18, 2024

    • Security updates.

    hashtag
    8.7.0 – May 10, 2024

    • You can now install iOS SDK via Swift Package Manager.

    • The sample is now available on SwiftUI. Please find it here.

    • Added a description for the error that occurs when providing an empty string as an ID in the addFolderID method.

    hashtag
    8.6.0 – Apr. 10, 2024

    • The messages displayed by the SDK after uploading media have been synchronized with Android.

    • The bug causing analysis delays that might have occurred for the One Shot gesture has been fixed.

    hashtag
    8.5.0 – Mar. 06, 2024

    • The length of the Selfie gesture is now configurable (affects the video file size).

    • You can set your own logo instead of the Oz logo if your license allows it.

    • Removed the pause after the Scan gesture.

    hashtag
    8.4.2 – Jan. 24, 2024

    • Security updates.

    hashtag
    8.4.0 – Jan. 09, 2024

    • Changed the default behavior in case a localization key is missing: now the English string value is displayed instead of a key.

    • Fixed some bugs.

    hashtag
    8.3.3 – Dec. 11, 2023

    • Internal licensing improvements.

    hashtag
    8.3.0 – Nov. 17, 2023

    • Implemented the possibility of using a master license that works with any bundle_id.

    • Fixed the bug with background color flashing.

    hashtag
    8.2.1 – Nov. 11, 2023

    • Bug fixes.

    hashtag
    8.2.0 – Oct. 30, 2023

    • The Analysis structure now contains the sizeReductionStrategy field. This field defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully.

    • The messages for the errors that are retrieved from API are now detailed.

    hashtag
    8.1.1 – Oct. 09, 2023

    • If multiple analyses are applied to the folder simultaneously, the system sends them as a group. This means that the “worst” of the results, not the latest, is taken as the resolution. Please refer to this article for details.

    • For the Liveness analysis, the system now treats the highest score as the quantitative result. The Liveness analysis output is described here.

    hashtag
    8.1.0 – Sept. 07, 2023

    • Updated the Liveness on-device model.

    • Added the Portuguese (Brazilian) locale.

    • You can now add a custom language pack or update an existing one. The instructions can be found here.

    hashtag
    8.0.2 – Aug. 15, 2023

    • Fixed some bugs and improved the SDK algorithms.

    hashtag
    8.0.0 – June 27, 2023

    • Added the new analysis mode – hybrid (Liveness only). If the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.

    • Improved the on-device models.

    • Updated the run method.

    hashtag
    7.3.0 – June 06, 2023

    • Added the center hint background customization.

    • Added new face frame forms (Circle, Square).

    • Added the antiscam widget and its customization. This feature allows you to alert your customers that the video recording is being conducted, for instance, for loan application purposes. The purpose is to safeguard against scammers who may attempt to deceive an individual into approving a fraudulent transaction.

    hashtag
    7.2.1 – May 24, 2023

    • Fixed the issue with the server-based One shot analysis.

    hashtag
    7.2.0 – May 18, 2023

    • Improved the SDK algorithms.

    hashtag
    7.1.6 – May 04, 2023

    • Fixed error handling when uploading a file to the API. From this version, an error during file upload is raised to the host application.

    hashtag
    7.1.5 – Apr. 03, 2023

    • Improved the on-device Liveness.

    hashtag
    7.1.4 – Mar. 24, 2023

    • Fixed the animation for sunglasses/mask.

    • Fixed the bug with the .document analysis.

    • Updated the descriptions of customization methods and structures.

    hashtag
    7.1.2 – Feb. 21, 2023

    • Updated the TensorFlow version to 2.11.

    • Fixed several bugs, including the Biometry check failures on some phone models.

    hashtag
    7.1.1 – Feb. 06, 2023

    • Added customization for the hint animation.

    hashtag
    7.1.0 – Jan. 20, 2023

    • Integrated a new model.

    • Added the uploadMedia method to AnalysisRequest. The addMedia method is now deprecated.

    hashtag
    7.0.0 – Dec. 08, 2022

    • Implemented a range of UI customization options and switched to the new design. To restore the previous settings, please refer to this article.

    hashtag
    6.7.0

    • The run method now works similarly to the one in Android SDK and returns an array of analysis results.

    hashtag
    6.4.0

    • Synchronized the version numbers with Android SDK.

    • Added a new field to the Analysis structure. The params field is for any additional parameters, for instance, if you need to set the server-side best shot extraction to true. The best shot algorithm chooses the highest-quality frame from a video.

    hashtag
    3.0.1

    • The Zoom in and Zoom out gestures are no longer supported.

    hashtag
    3.0.0

    • Added a new simplified analysis structure – AnalysisRequest.

    hashtag
    2.3.0

    • Added methods of on-device analysis: runOnDeviceLivenessAnalysis and runOnDeviceBiometryAnalysis.

    • You can choose the installation version. Standard installation gives access to full functionality. The core version (OzLivenessSDK/Core) installs SDK without the on-device functionality.

    hashtag
    2.2.3

    • Added the Turkish locale.

    hashtag
    2.2.1

    • Added the Kyrgyz locale.

    • Added Completion Handler for analysis results.

    • Added Error User Info to telemetry to show detailed info in case of an analysis error.

    hashtag
    2.2.0

    • Added local on-device analysis.

    • Added oval and rectangular frames.

    • Added Xcode 12.5.1+ support.

    hashtag
    2.1.4

    • Added SDK configuration with licenses.

    hashtag
    2.1.3

    • Added the One Shot gesture.

    • Improved OZVerificationResult: added bestShotURL, which contains the best shot image, and preferredMediaURL, which contains a URL to the best-quality video.

    hashtag
    2.1.2

    • Authorization sessions extend automatically.

    • Updated authorization interfaces.

    hashtag
    2.1.1

    • Added the Kazakh locale.

    • Added license error texts.

    hashtag
    2.1.0

    • You can cancel network requests.

    • You can specify Bundle for license.

    • Added analysis parameterization documentAnalyse.

    hashtag
    2.0.0

    • Added license support.

    • Added Xcode 12 support instead of 11.

    • Fixed the documentAnalyse error where you had to fill analyseStates to launch the analysis.

    Resolved the issue with container errors not being returned.
  • Updated dependencies.

  • Head movement gestures are now handled properly.
  • Security updates.

  • Bug fixes.
  • Security updates.

  • Bug fixes.

    The code in Readme.md is now up-to-date.
  • Security and logging updates.

  • The toFrameGradientColor option in hintAnimationCustomization is now deprecated, please use the hintGradientColor option instead.
  • Got back the iOS 11 support.

  • If a media hasn't been uploaded correctly, the system now repeats the upload.

  • Added a new method to retrieve the telemetry (logging) identifier: getEventSessionId.

  • The setPermanentAccessToken, configure and login methods are now deprecated. Please use the setApiConnection method instead.

  • The setLicense(from path:String) method is now deprecated. Please use the setLicense(licenseSource: LicenseSource) method instead.

  • Fixed some bugs and improved the SDK work.

  • Added new structures: RequestStatus (analysis state), ResultMedia (analysis result for a single media) and RequestResult (consolidated analysis result for all media).

  • The updated AnalysisResult structure should be now used instead of OzAnalysisResult.

  • For the OZMedia object, you can now specify additional tags that are not included into our tags list.

  • The Selfie video length is now about 0.7 sec, the file size and upload time are reduced.

  • The hint text width can now exceed the frame width (when using the main camera).

  • The methods below are no longer supported: AnalysisRequest.run, addMedia, and uploadMedia.

    Synchronized the default customization values with Android.

  • Added the Spanish locale.

  • iOS 11 is no longer supported, the minimal required version is 12.

  • Fixed the combo analysis error.
  • Added a button to reset the SDK theme and language settings.

  • Fixed some bugs and localization issues.

  • Extended the network request timeout to 90 sec.

  • Added a setting for the animation icon size.

  • Fixed some localization issues.
  • Changed the Combo gesture.

  • Now you can launch the Liveness check to analyze images taken with another SDK.

  • Added a method to upload data to server and start analyzing it immediately: uploadAndAnalyse.

  • Improved the licensing process, now you can add a license when initializing SDK: OZSDK(licenseSources: [LicenseSource], completion: @escaping ((LicenseData?, LicenseError?) -> Void)), where LicenseSource is a path to physical location of your license, LicenseData contains the license information.

  • Added the setLicense method to force license adding.

  • When performing a local check, you can now choose a main or back camera.

    Fixed building errors (Xcode 12.4 / Cocoapods 1.10.1).

    Fixed logging.

    Removed method → Replacement:

    analyse → AnalysisRequest.run

    addToFolder → uploadMedia

    documentAnalyse → AnalysisRequest.run

    uploadAndAnalyse → AnalysisRequest.run

    runOnDeviceBiometryAnalysis → AnalysisRequest.run

    runOnDeviceLivenessAnalysis → AnalysisRequest.run

    Changelog

    Android SDK changes

    hashtag
    9.0.2 – Feb. 25, 2026

    • Updated internal dependencies.

    hashtag
    9.0.1 – Feb. 18, 2026
    • Fixed the bug with “Unknown Error” being returned even in case of successful video capture.

    • Resolved the issue with occasional crashes caused by improper preview scaling on some devices.

    hashtag
    9.0.0 – Feb. 10, 2026

    • You can now customize the logo position (if allowed by license).

    • In case of lost internet connection during license checking, SDK now returns the proper error.

    • Updated the code sample to align with the OzCapsula functionality.

    hashtag
    8.23.1 – Jan. 30, 2026

    • Fixed the bug with green videos appearing on some devices.

    • Resolved the issue with occasional SDK crashes caused by restrictions of device camera.

    • Fixed minor bugs.

    hashtag
    8.23.0 – Dec. 30, 2025

    • Fixed the bug with SDK crashes caused by the “Invalid to call at Released state” and “Pending dequeue output buffer request cancelled” errors.

    • Fixed the bug with "java.util.concurrent.TimeoutException" causing crashes.

    • Android SDK now passes orientation and media type tags along with the action one.

    • Enhanced security.

    hashtag
    8.22.1 – Dec. 10, 2025

    • Fixed a minor bug.

    hashtag
    8.22.0 – Dec. 01, 2025

    • Implemented a new proprietary data format: OzCapsula Data Container.

    • Fixed occasional SDK crashes in specific cases and / or on specific devices.

    • Enhanced security.

    hashtag
    8.21.0 – Nov. 12, 2025

    • Improved SDK performance for some devices.

    • Updated SDK to support the upcoming security features.

    hashtag
    8.20.0 – Oct. 10, 2025

    • Fixed the bug with green videos on some smartphone models.

    • Resolved the issue with mediaId appearing null.

    • Enhanced security.

    hashtag
    8.19.0 – Sept. 15, 2025

    • Resolved an issue with a warning that could appear when running the Fragment.

    • SDK no longer crashes when calling copyPlane.

    • When you choose to send compressed videos for a hybrid analysis, the SDK no longer saves the original media alongside the compressed one.

    • Updated Oz Forensics website link.

    • Enhanced security.

    hashtag
    8.18.4 – Aug. 29, 2025

    • To support memory page size of 16 KB, switched TensorFlow to LiteRT.

    hashtag
    8.18.2 – Aug. 7, 2025

    circle-exclamation

    We highly recommend updating to this version.

    • Fixed the bug that caused video duration and file size to increase.

    hashtag
    8.18.0 – July 16, 2025

    • Added support for Google Dynamic Feature Delivery.

    • Resolved an issue that might have caused crashes when tapping the close button on the Liveness screen.

    • Fixed a bug where the SDK would crash with "CameraDevice was already closed" exception.

    • Security and telemetry updates.

    hashtag
    8.17.3 – July 02, 2025

    • Resolved the issue with OkHttp compatibility.

    • Fixed the bug with Fragment missing context.

    hashtag
    8.17.2 – June 26, 2025

    • Resolved a camera access issue affecting some mobile device models.

    hashtag
    8.17.1 – June 23, 2025

    • Security updates.

    hashtag
    8.17.0 – May 22, 2025

    • Security updates.

    hashtag
    8.16.3 – Apr. 08, 2025

    • Security updates.

    hashtag
    8.16.2 – Mar. 19, 2025

    • Resolved the issue with possible SDK crashes when closing the Liveness screen.

    hashtag
    8.16.1 – Mar. 14, 2025

    • Security updates.

    hashtag
    8.16.0 – Mar. 11, 2025

    • Updated the authorization logic.

    • Improved voiceover.

    • Fixed the issue with SDK lags and the non-responding error that users might have encountered on some devices after completing the video recording.

    • Resolved the issue with SDK crashes on some devices that might have occurred because of trying to access non-initialized or closed resources.

    • Security updates.

    hashtag
    8.15.6 – Feb. 26, 2025

    • Security updates.

    hashtag
    8.15.5 – Feb. 18, 2025

    • You can now disable the video validation that was implemented to avoid recording extremely short videos (3 frames or fewer): switch the option off using disableFramesCountValidation.

    • Fixed the bug with green videos on some smartphone models.

    • Security updates.

    hashtag
    8.15.4 – Feb. 11, 2025

    • Fixed bugs that could have caused crashes on some phone models.

    hashtag
    8.15.0 – Dec. 30, 2024

    • Changed the wording for the head_down gesture: the new wording is “tilt down”.

    • Added proper focus order for TalkBack when the antiscam hint is enabled.

    • Added the public setting extract_action_shot in the Demo Application.

    • Fixed bugs.

    • Security updates.

    hashtag
    8.14.1 – Dec. 5, 2024

    • Fixed the bug when the recorded videos might appear green.

    • Resolved codec issues on some smartphone models.

    hashtag
    8.14.0 – Dec. 2, 2024

    • Accessibility updates according to WCAG requirements: the SDK hints and UI controls can be voiced.

    • Improved user experience with head movement gestures.

    • Moved the large video compression step to the Liveness screen closure.

    • Fixed the bug when the best shot frame could contain an image with closed eyes.

    • Minor bug fixes and telemetry updates.

    hashtag
    8.13.0 – Nov. 12, 2024

    • Security and telemetry updates.

    hashtag
    8.12.4 – Oct. 01, 2024

    • Security updates.

    hashtag
    8.12.2 – Sept. 10, 2024

    • Security updates.

    hashtag
    8.12.0 – Aug. 29, 2024

    • Security and telemetry updates.

    hashtag
    8.11.0 – Aug. 19, 2024

    • Fixed the RuntimeException error with the server-based Liveness that appeared on some devices.

    • Security updates.

    hashtag
    8.10.0 – July 26, 2024

    • Security updates.

    • Bug fixes.

    hashtag
    8.9.0 – July 18, 2024

    • Updated the Android Gradle plugin version to 8.0.0.

    • Internal SDK improvements.

    hashtag
    8.8.3 – July 11, 2024

    • Internal SDK improvements.

    hashtag
    8.8.2 – June 21, 2024

    • Security updates.

    hashtag
    8.8.1 – June 12, 2024

    • Security updates.

    hashtag
    8.8.0 – June 04, 2024

    • Security updates.

    hashtag
    8.7.3 – June 03, 2024

    • Security updates.

    hashtag
    8.7.0 – May 06, 2024

    • Added a description for the error that occurs when providing an empty string as an ID in the setFolderID method.

    • Fixed a bug causing an endless spinner to appear if the user switches to another application during the Liveness check.

• Fixed some smartphone-model-specific bugs.

    hashtag
    8.6.0 – Apr. 05, 2024

    • Upgraded the on-device Liveness model.

    • Security updates.

    hashtag
    8.5.0 – Feb. 27, 2024

    • The length of the Selfie gesture is now configurable (affects the video file size).

    • You can set your own logo instead of Oz logo if your license allows it.

    • Removed the pause after the Scan gesture.

    • If the recorded video is larger than 10 MB, it gets compressed.

    • Security and logging updates.

    hashtag
    8.4.4 – Feb. 06, 2024

    • Changed the master license validation algorithm.

    hashtag
    8.4.3 – Jan. 29, 2024

    • Downgraded the required compileSdkVersion from 34 to 33.

    hashtag
    8.4.2 – Jan. 15, 2024

    • Security updates.

    hashtag
    8.4.0 – Jan. 04, 2024

    • Updated the on-device Liveness model.

    • Fixed some bugs.

    hashtag
    8.3.3 – Dec. 11, 2023

    • Internal licensing improvements.

    hashtag
    8.3.2 – Nov. 30, 2023

    • Internal SDK improvements.

    hashtag
    8.3.1 – Nov. 24, 2023

    • Bug fixes.

    hashtag
    8.3.0 – Nov. 17, 2023

    • Implemented the possibility of using a master license that works with any bundle_id.

    • Video compression failure on some phone models is now fixed.

    hashtag
    8.2.1 – Nov. 01, 2023

    • Bug fixes.

    hashtag
    8.2.0 – Oct. 23, 2023

    • The Analysis structure now contains the sizeReductionStrategy field. This field defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully.

    • The messages for the errors that are retrieved from API are now detailed.

    hashtag
    8.1.1 – Oct. 02, 2023

    • If multiple analyses are applied to the folder simultaneously, the system sends them as a group. It means that the “worst” of the results will be taken as resolution, not the latest. Please refer to this article for details.

    • For the Liveness analysis, the system now treats the highest score as a quantitative result. The Liveness analysis output is described here.

    hashtag
    8.1.0 – Sept. 07, 2023

    • Updated the Liveness on-device model.

    • Added the Portuguese (Brazilian) locale.

    • You can now add a custom or update an existing language pack. The instructions can be found here.

    • If a media hasn't been uploaded correctly, the system repeats the upload.

    • Created a new method to retrieve the telemetry (logging) identifier: getEventSessionId.

    • The login and auth methods are now deprecated. Use the setAPIConnection method instead.

    • OzConfig.baseURL and OzConfig.permanentAccessToken are now deprecated.

    • If a user closes the screen during video capture, the appropriate error is now being handled by SDK.

    • Fixed some bugs and improved the SDK work.

    hashtag
    8.0.3 – Aug. 24, 2023

    • Fixed errors.

    hashtag
    8.0.2 – July 13, 2023

    • The SDK now works properly with baseURL set to null.

    hashtag
    8.0.1 – June 28, 2023

• The dependencies' versions have been brought into line with the Kotlin version.

    hashtag
    8.0.0 – June 19, 2023

    • Added the new analysis mode – hybrid (Liveness only). If the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.

    • Kotlin version requirements lowered to 1.7.21.

    • Improved the on-device models.

    • For some phone models, fixed the fatal device error.

    • The hint text width can now exceed the frame width (when using the main camera).

    • Photos taken during the One Shot analysis are now being sent to the server in the original size.

• Removed the OzAnalysisResult class. The onSuccess method of AnalysisRequest.run now uses the RequestResult structure instead of List<OzAnalysisResult>.

    • All exceptions are moved to the com.ozforensics.liveness.sdk.core.exceptions package (See changes below).

• Classes related to AnalysisRequest are moved to the com.ozforensics.liveness.sdk.analysis package (See changes below).

    • The methods below are no longer supported:

    Removed method

    Replacement

    OzLivenessSDK.uploadMediaAndAnalyze

    AnalysisRequest.run

    OzLivenessSDK.uploadMedia

    AnalysisRequest.Builder.uploadMedia

    OzLivenessSDK.runOnDeviceBiometryAnalysis

    AnalysisRequest.run

    OzLivenessSDK.runOnDeviceLivenessAnalysis

    AnalysisRequest.run

    AnalysisRequest.build(): AnalysisRequest

    -
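As an illustration, the replacement flow might look like the following Kotlin-style pseudocode. Only AnalysisRequest.Builder, uploadMedia, run, RequestResult, and AnalysisError are documented above; every other name (addAnalysis, the onSuccess/onError hooks' exact placement) is an assumption, not the SDK's confirmed API:

    // Kotlin-style pseudocode sketch; exact signatures may differ.
    AnalysisRequest.Builder()
        .uploadMedia(mediaList)            // replaces OzLivenessSDK.uploadMedia
        .addAnalysis(livenessAnalysis)     // hypothetical: attach a Liveness/Biometry analysis
        .run(
            onSuccess = { result: RequestResult -> /* replaces List<OzAnalysisResult> */ },
            onError = { error: AnalysisError -> /* new error entity in 8.0.0 */ }
        )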

    chevron-rightPublic interface changeshashtag

    hashtag
    New entities

    • AnalysisRequest.Type.HYBRID in com.ozforensics.liveness.sdk.analysis.entity

    • AnalysisError in com.ozforensics.liveness.sdk.analysis.entity

    • SourceMedia in com.ozforensics.liveness.sdk.analysis.entity

    • ResultMedia in com.ozforensics.liveness.sdk.analysis.entity

    • RequestResult in com.ozforensics.liveness.sdk.analysis.entity

    hashtag
    Moved entities

• NoAnalysisException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions

• NoNetworkException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions

    hashtag
    Changed classes

    OzLivenessSDK

    • Removed uploadMediaAndAnalyze

    • Removed uploadMedia

    • Removed runOnDeviceBiometryAnalysis

    AnalysisRequest

    • Removed build(): AnalysisRequest

    AnalysisRequest.Builder

    • Removed addMedia

    • Removed onSuccess(result: List<OzAnalysisResult>)

    • Added onSuccess(result: RequestResult)

    hashtag
    7.3.1 – June 07, 2023

    • Restructured the settings screen.

    • Added the center hint background customization.

    • Added new face frame forms (Circle, Square).

• Added the antiscam widget and its customization. This feature allows you to alert your customers that the video recording is being conducted, for instance, for loan application purposes. The purpose of this is to safeguard against scammers who may attempt to deceive an individual into approving a fraudulent transaction.

    • The OzLivenessSDK::init method no longer crashes if there is a StatusListener parameter passed.

    • Changed the scan gesture animation.

    circle-info

    Please note: for this version, we updated Kotlin to 1.8.20.

    hashtag
    7.2.0 – May 04, 2023

    • Improved the SDK algorithms.

    hashtag
    7.1.4 – Mar. 30, 2023

    • Updated the model for the on-device analyses.

    • Fixed the animation for sunglasses/mask.

    • The oval size for Liveness is now smaller.

    hashtag
    7.1.3 – Mar. 03, 2023

    • Fixed the error with the server-based analyses while using permanentAccessToken for authorization.

    hashtag
    7.1.2 – Feb. 22, 2023

    • Added customization for the hint animation.

    • You can now hide the status bar and system buttons (works with 7.0.0 and higher).

    • OzLivenessSDK.init now requires context as the first parameter.

    • OzAnalysisResult now shows the server-based analyses' scores properly.

    • Fixed initialization issues, displaying of wrong customization settings, authorization failures on Android <7.1.1.

    hashtag
    7.1.1 – Jan. 16, 2023

    • Fixed crashes for Android v.6 and below.

    • Fixed oval positioning for some phone models.

    • Internal fixes and improvements.

    hashtag
    7.1.0 – Dec. 16, 2022

    • Updated security.

    • Implemented some internal improvements.

    • The addMedia method is now deprecated, please use uploadMedia for uploading.

    hashtag
    7.0.0 – Nov. 23, 2022

• Changed the way of sharing dependencies. Due to security issues, we now share two types of libraries as shown below: sdk provides server-based analysis only, while full provides both server-based and on-device analyses:

    • UICustomization has been implemented instead of OzCustomization.

    • Implemented a range of UI customization options and switched to the new design. To restore the previous settings, please refer to this article.

    • Added the Spanish locale.

    hashtag
    6.4.2.3

    • Fixed the bug with freezes that had appeared on some phone models.

    • SDK now captures videos in 720p.

    hashtag
    6.4.1

    • Synchronized the names of the analysis modes with iOS: SERVER_BASED and ON_DEVICE.

    • Fixed the bug with displaying of localization settings.

    hashtag
    6.4.0

    • Now you can use Fragment as Liveness screen.

• Added a new field to the Analysis structure. The params field is for any additional parameters, for instance, if you need to set best shot extraction on the server to true. The best shot algorithm chooses the highest-quality frame from a video.

    hashtag
    6.3.7

    • The Zoom in and Zoom out gestures are no longer supported.

    hashtag
    6.3.6

    • Updated the biometry model.

    hashtag
    6.3.5

    • Added a new simplified API – AnalysisRequest. With it, it’s easier to create a request for the media and analysis you need.

    hashtag
    6.3.4

    • Published the on-device module for on-device liveness and biometry analyses. To add this module to your project, use:

    To launch these analyses, use runOnDeviceBiometryAnalysis and runOnDeviceLivenessAnalysis methods from the OzLivenessSDK class:

    hashtag
    6.3.3

    • Liveness now goes smoother.

    • Fixed freezes on Xiaomi devices.

    • Optimized image converting.

    hashtag
    6.3.1

• Added a new metadata parameter for OzLivenessSDK.uploadMedia and a new OzLivenessSDK.uploadMediaAndAnalyze method to pass this parameter to folders.

    hashtag
    6.2.8

    • Added functions for SDK initialization with LicenseSources: LicenseSource.LicenseAssetId and LicenseSource.LicenseFilePath. Use the OzLivenessSDK.init method to start initialization.

• Now you can get the license info upon initialization: val licensePayload = OzLivenessSDK.getLicensePayload().
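Putting the two together, initialization with a license source might look like this Kotlin-style pseudocode (the file path is hypothetical, and the exact parameter list of OzLivenessSDK.init is an assumption):

    // Kotlin-style pseudocode; exact init signature is an assumption
    OzLivenessSDK.init(
        listOf(LicenseSource.LicenseFilePath("licenses/app.license")) // hypothetical path
    )
    val licensePayload = OzLivenessSDK.getLicensePayload()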

    hashtag
    6.2.4

    • Added the Kyrgyz locale.

    hashtag
    6.2.0

    • Added local analysis functions.

    • You can now configure the face frame.

• Fixed the version number on the Liveness screen.

    hashtag
    6.1.0

    • Added the main camera support.

    hashtag
    6.0.1

• Added support for configuration from the license.

    hashtag
    6.0.0

    • Added the OneShot gesture.

    • Added new states for OzAnalysisResult.Resolution.

• Added the uploadMediaAndAnalyze method to upload a batch of media to the server at once and send them to analysis immediately.

    • OzMedia is renamed to OzAbstractMedia and got subclasses for images and videos.

    • Fixed camera bugs for some devices.

    hashtag
    5.1.0

    • Access token updates automatically.

    • Renamed accessToken to permanentAccessToken.

    • Added R8 rules.

    • Configuration became easier: config settings are mutable.

    hashtag
    5.0.2

    • Fixed the oval frame.

    • Removed the unusable parameters from AnalyseRequest.

    • Removed default attempt limits.

    hashtag
    5.0.0

    • To customize the configuration options, the config property is added instead of baseURL, accessToken, etc. Use OzConfig.Builder for initialization.

• Added license support. Licenses should be installed as raw resources. To pass them to OzConfig, use setLicenseResourceId.

    • Replaced the context-dependent methods with analogs.

    • Improved the image analysis.

    • Removed unusable dependencies.

    • Fixed logging.

    implementation 'com.ozforensics.liveness:full:7.0.0'
    implementation 'com.ozforensics.liveness:sdk:7.0.0'
        implementation 'com.ozforensics.liveness:on-device:6.3.4'
    val mediaList: List<OzAbstractMedia> = ...
    val biometryAnalysisResult: OzAnalysisResult = OzLivenessSDK.runOnDeviceBiometryAnalysis(mediaList)
    val livenessAnalysisResult: OzAnalysisResult = OzLivenessSDK.runOnDeviceLivenessAnalysis(mediaList)
• TokenException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions

  • NoMediaInAnalysisException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions

  • EmptyMediaListException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions

  • NoSuchMediaException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions

  • LicenseException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.security.exception

  • Analysis from com.ozforensics.liveness.sdk.analysis.entity to com.ozforensics.liveness.sdk.core.model

  • AnalysisRequest from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core

  • AnalysisListener from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core

  • AnalysisStatus from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core

  • AnalysisRequest.Builder from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core

  • OzException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions

  • Removed runOnDeviceLivenessAnalysis

  • AnalysisRequest.Builder.addMedia

    AnalysisRequest.Builder.uploadMedia


    User Roles

    Each of the new API users should obtain a role to define access restrictions for direct API connections. Set the role in the user_type field when you create a new user.

    • ADMIN is a system administrator, who has unlimited access to all system objects, but can't change the analyses' statuses;

• OPERATOR is a system operator, who can view all system objects and choose the analysis result via the Make Decision button (usually needed if the analysis status is OPERATOR_REQUIRED);

• CLIENT is a regular consumer account that can upload media files, run analyses, view results in personal folders, and generate reports for analyses.

• can_start_analysis_biometry – an additional flag to allow access to BIOMETRY analyses (enabled by default);

• CLIENT ADMIN is a company administrator who can manage their company account and the users within it. Additionally, CLIENT ADMIN can view and edit data of all users within their company, delete files in folders, add or delete report templates (with or without attachments), reports, and single analyses, check statistics, and add new blacklist collections.

    • CLIENT OPERATOR is similar to OPERATOR within their company.

• CLIENT SERVICE is a service user account for automatic connection purposes. Authentication with this user creates a long-lived access token (5 years by default). The token lifetime for regular users is 15 minutes by default (parameterized), and, also by default, a token's lifetime is extended with each request (parameterized).

    chevron-rightFor API versions below 6.0hashtag

For API 5.3 and below, to create a CLIENT user with admin or service rights, you need to set the corresponding flags to true:

    • is_admin – if set, the user obtains access to other users' data within this admin's company.
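For instance, a user-creation payload for these older API versions might carry the following fields. Only the field names come from this page; the surrounding request structure is an assumption:

    {
      "user_type": "CLIENT",
      "is_admin": true,
      "is_service": false
    }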

    Here's the detailed information on access levels.

    hashtag
    Company

    hashtag
    Folder

    hashtag
    Report template

    hashtag
    Report template attachments

    hashtag
    Report

    hashtag
    Analysis

    hashtag
    Collection

    hashtag
    Person

    hashtag
    Person image

    hashtag
    User

    Look-and-Feel Customization

    To set your own look-and-feel options, use the style section in the Ozliveness.open method. The options are listed below the example.

    hashtag
    baseColorCustomization

    Main color settings.

    hashtag
    baseFontCustomization

    Main font settings.

    hashtag
    titleFontCustomization

    Title font settings.

    hashtag
    buttonCustomization

    Buttons’ settings.

    hashtag
    toolbarCustomization

    Toolbar settings.

    hashtag
    centerHintCustomization

    Center hint settings.

    hashtag
    hintAnimation

    Hint animation settings.

    hashtag
    faceFrameCustomization

    Face frame settings.

    hashtag
    documentFrameCustomization

    Document capture frame settings.

    hashtag
    backgroundCustomization

    Background settings.

    hashtag
    antiscamCustomization

Scam protection settings: the antiscam message warns users that their actions are being recorded.

    hashtag
    versionTextCustomization

    SDK version text settings.

    hashtag
    maskCustomization

    3D mask settings. The mask has been implemented in 1.2.1.

    hashtag
    switchCameraButtonCustomization

    Use this setting to hide the Switch Camera button (added in 1.7.14).

    hashtag
    loaderSlot

    Settings for a custom loader (added in 1.7.15).

    hashtag
    loaderTransition

    Loader transition settings (added in 1.8.0).

    hashtag
    ozliveness_face_stroke

    Use this class to customize the oval (added in 1.9.2). All standard CSS properties are supported.

    hashtag
    Migrating to the New Design from the Previous Versions (before 1.0.1)

    Table of parameters' correspondence:

    OzLiveness.open({
    style: {
        baseColorCustomization: {
            textColorPrimary: "#000000",
            backgroundColorPrimary: "#FFFFFF",
            textColorSecondary: "#8E8E93",
            backgroundColorSecondary: "#F2F2F7",
            iconColor: "#00A5BA"
            },
        baseFontCustomization: {
            textFont: "Roboto, sans-serif",
            textSize: "16px",
            textWeight: "400",
            textStyle: "normal"
            },
        titleFontCustomization: {
            textFont: "inherit",
            textSize: "36px",
            textWeight: "500",
            textStyle: "normal"
            },
        buttonCustomization: {
            textFont: "inherit",
            textSize: "14px",
            textWeight: "500",
            textStyle: "normal",
            textColorPrimary: "#FFFFFF",
            backgroundColorPrimary: "#00A5BA",
            textColorSecondary: "#00A5BA",
            backgroundColorSecondary: "#DBF2F5",
            cornerRadius: "10px"
            },
        toolbarCustomization: {
            closeButtonIcon: "cross",
            iconColor: "#707070"
            },
        centerHintCustomization: {
            textFont: "inherit",
            textSize: "24px",
            textWeight: "500",
            textStyle: "normal",
            textColor: "#FFFFFF",
            backgroundColor: "#1C1C1E",
            backgroundOpacity: "56%",
            backgroundCornerRadius: "14px",
            verticalPosition: "38%"
            },
        hintAnimation: {
            hideAnimation: false,
            hintGradientColor: "#00BCD5",
            hintGradientOpacity: "100%",
            animationIconSize: "80px"
            },
        faceFrameCustomization: {
            geometryType: "oval",
            cornersRadius: "0px",
            strokeDefaultColor: "#D51900",
            strokeFaceInFrameColor: "#00BCD5",
            strokeOpacity: "100%",
            strokeWidth: "6px",
            strokePadding: "4px"
            },
        documentFrameCustomization: {
            cornersRadius: "20px",
            templateColor: "#FFFFFF",
            templateOpacity: "100%"
            },
        backgroundCustomization: {
            backgroundColor: "#FFFFFF",
            backgroundOpacity: "88%"
            },
        antiscamCustomization: {
            enableAntiscam: false,
            textMessage: "",
            textFont: "inherit",
            textSize: "14px",
            textWeight: "500",
            textStyle: "normal",
            textColor: "#000000",
            textOpacity: "100%",
            backgroundColor: "#F2F2F7",
            backgroundOpacity: "100%",
            backgroundCornerRadius: "20px",
            flashColor: "#FF453A"
            },
        versionTextCustomization: {
            textFont: "inherit",
            textSize: "16px",
            textWeight: "500",
            textStyle: "normal",
            textColor: "#000000",
            textOpacity: "56%"
            },
        maskCustomization: {
                maskColor: "#008700",
                glowColor: "#000102",
                minAlpha: "30%", // 0 to 1 or 0% to 100%
                maxAlpha: "100%" // 0 to 1 or 0% to 100%
    
            },
        switchCameraButtonCustomization: {
            enableSwitchCameraButton: true
            },
        /* for an HTML string, use a string; for an HTMLElement, insert it via cloneNode(true) */
        loaderSlot: yourLoader, /* <string | HTMLElement> */
        loaderTransition: {type: 'fade', duration: 500}
        }
    });

    /* The oval frame is customized via the ozliveness_face_stroke CSS class
       in your stylesheet, not inside the style object: */
    .ozliveness_wrap_overlay .ozliveness_face_stroke {
        overflow: hidden;
        box-shadow: 0 0 10px rgb(90 201 207);
        border: none; /* important to remove the default border */
    }
  • can_start_analysis_quality – an additional flag to allow access to LIVENESS (QUALITY) analyses (enabled by default);
  • can_start_analysis_collection – an additional flag to allow access to BLACK LIST analyses (enabled by default).

  • is_service is a flag that marks the user account as a service account for automatic connection purposes. Authentication with this user creates a long-lived access token (5 years by default). The token lifetime for regular users is 15 minutes by default (parameterized), and, also by default, a token's lifetime is extended with each request (parameterized).

    CLIENT

    -

    their company data

    -

    -

    CLIENT SERVICE

    -

    their company data

    -

    -

    CLIENT OPERATOR

    -

    their company data

    -

    -

    CLIENT ADMIN

    -

    their company data

    their company data

    their company data

    CLIENT

    their folders

    their folders

    their folders

    -

    CLIENT SERVICE

    within their company

    within their company

    within their company

    -

    CLIENT OPERATOR

    within their company

    within their company

    within their company

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    within their company

    CLIENT

    -

    within their company

    -

    -

    CLIENT SERVICE

    -

    within their company

    -

    -

    CLIENT OPERATOR

    within their company

    within their company

    within their company

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    within their company

    within their company

    -

    CLIENT SERVICE

    -

    within their company

    -

    CLIENT OPERATOR

    within their company

    within their company

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    in their folders

    -

    CLIENT SERVICE

    within their company

    within their company

    -

    CLIENT OPERATOR

    within their company

    within their company

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    CLIENT

    in their folders

    in their folders

    -

    -

    CLIENT SERVICE

    within their company

    within their company

    within their company

    -

    CLIENT OPERATOR

    within their company

    within their company

    within their company

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    within their company

    CLIENT

    -

    within their company

    -

    -

    CLIENT SERVICE

    within their company

    within their company

    -

    -

    CLIENT OPERATOR

    -

    within their company

    -

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    within their company

    within their company

    -

    CLIENT SERVICE

    within their company

    within their company

    -

    CLIENT OPERATOR

    -

    within their company

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    within their company

    -

    CLIENT SERVICE

    -

    within their company

    -

    CLIENT OPERATOR

    -

    within their company

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    CLIENT

    -

    their data

    their data

    -

    CLIENT SERVICE

    -

    within their company

    their data

    -

    CLIENT OPERATOR

    -

    within their company

    their data

    -

    CLIENT ADMIN

    within their company

    within their company

    within their company

    within their company

    Create

    Read

    Update

    Delete

    ADMIN

    +

    +

    +

    +

    OPERATOR

    -

    +

    -

    Create

    Read

    Update

    Delete

    ADMIN

    +

    +

    +

    +

    OPERATOR

    +

    +

    +

    Create

    Read

    Update

    Delete

    ADMIN

    +

    +

    +

    +

    OPERATOR

    +

    +

    +

    Create

    Read

    Delete

    ADMIN

    +

    +

    +

    OPERATOR

    +

    +

    -

    CLIENT

    Create

    Read

    Delete

    ADMIN

    +

    +

    +

    OPERATOR

    +

    +

    -

    CLIENT

    Create

    Read

    Update

    Delete

    ADMIN

    +

    +

    +

    +

    OPERATOR

    +

    +

    +

    Create

    Read

    Update

    Delete

    ADMIN

    +

    +

    +

    +

    OPERATOR

    -

    +

    -

    Create

    Read

    Delete

    ADMIN

    +

    +

    +

    OPERATOR

    -

    +

    -

    CLIENT

    Create

    Read

    Delete

    ADMIN

    +

    +

    +

    OPERATOR

    -

    +

    -

    CLIENT

    Create

    Read

    Update

    Delete

    ADMIN

    +

    +

    +

    +

    OPERATOR

    -

    +

    their data


    -

    -

    -

    -

    in their folders

    -

    -

    -

    -

    -

    Main background color

    textColorSecondary

    Secondary text color

    backgroundColorSecondary

    Secondary background color

    cornerRadius

    Button corner radius

    Background color

    backgroundOpacity

    Background opacity

    backgroundCornerRadius

    Frame corner radius

    verticalPosition

    Vertical position

    Stroke width

    strokePadding

    Padding from stroke

    Text color

    textOpacity

    Text opacity

    backgroundColor

    Background color

    backgroundOpacity

    Background opacity

    backgroundCornerRadius

    Frame corner radius

    flashColor

    Flashing indicator color

    Text opacity

    {phase: 'start' | 'progress' | 'end', percent?}

    before / during / after data transmission

    loader:destroy

    when you need to hide the slot

    centerHintCustomization.verticalPosition

    centerHint.letterSpacing

    -

    centerHint.fontStyle

    centerHintCustomization.textStyle

    closeButton.image

    -

    backgroundOutsideFrame.color

    backgroundCustomization.backgroundColor

    Parameter

    Description

    textColorPrimary

    Main text color

    backgroundColorPrimary

    Main background color

    textColorSecondary

    Secondary text color

    backgroundColorSecondary

    Secondary background color

    iconColor

    Icons’ color

    Parameter

    Description

    textFont

    Font

    textSize

    Font size

    textWeight

    Font weight

    textStyle

    Font style

    Parameter

    Description

    textFont

    Font

    textSize

    Font size

    textWeight

    Font weight

    textStyle

    Font style

    Parameter

    Description

    textFont

    Font

    textSize

    Font size

    textWeight

    Font weight

    textStyle

    Font style

    textColorPrimary

    Main text color

    Parameter

    Description

    closeButtonIcon

    Close button icon

    iconColor

    Close button icon color

    Parameter

    Description

    textFont

    Font

    textSize

    Font size

    textWeight

    Font weight

    textStyle

    Font style

    textColor

    Text color

    Parameter

    Description

    hideAnimation

    Disable animation

    hintGradientColor

    Gradient color

    hintGradientOpacity

    Gradient opacity

    animationIconSize

    Animation icon size

    Parameter

    Description

    geometryType

    Frame shape: rectangle or oval

    cornersRadius

    Frame corner radius (for rectangle)

    strokeDefaultColor

    Frame color when a face is not aligned properly

    strokeFaceInFrameColor

    Frame color when a face is aligned properly

    strokeOpacity

    Stroke opacity

    Parameter

    Description

    cornersRadius

    Frame corner radius

    templateColor

    Document template color

    templateOpacity

    Document template opacity

    Parameter

    Description

    backgroundColor

    Background color

    backgroundOpacity

    Background opacity

    Parameter

    Description

    textMessage

    Antiscam message text

    textFont

    Font

    textSize

    Font size

    textWeight

    Font weight

    textStyle

    Font style

    Parameter

    Description

    textFont

    Font

    textSize

    Font size

    textWeight

    Font weight

    textStyle

    Font style

    textColor

    Text color

    Parameter

    Description

    maskColor

    The color of the mask itself

    glowColor

    The color of the glowing mask shape

    minAlpha

    Minimum mask transparency level. Implemented in 1.3.1

    maxAlpha

    Maximum mask transparency level. Implemented in 1.3.1

Parameter – Description
enableSwitchCameraButton – true (default) shows the Switch Camera button; false hides the button

Event – Payload – Is called
loader:init – {os, browser, platform} – immediately after inserting the slot
loader:waitingCamera – {os, browser, platform, waitedMs} – every waitedMs ms, while waiting for camera access
loader:cameraReady – when access is granted and the loader should be hidden
loader:processing – {phase: 'start' | 'end'} – before / after data preparation

Parameter – Description
type – Animation type: none, fade, slide, scale
duration – Animation length in ms
easing (optional) – Easing: linear, ease-in-out, etc.

Previous design – New design
doc_color – -
face_color_success, faceFrame.faceReady – faceFrameCustomization.strokeFaceInFrameColor
face_color_fail, faceFrame.faceNotReady – faceFrameCustomization.strokeDefaultColor
centerHint.textSize – centerHintCustomization.textSize
centerHint.color – centerHintCustomization.textColor
backgroundColorPrimary – backgroundColor

    strokeWidth

    textColor

    textOpacity

    loader:uploading

    centerHint.yPosition

    Flutter SDK Methods and Properties

    hashtag
    clearActionVideos

    Deletes all action videos from file system (iOS 8.4.0 and higher, Android).

    Returns

    Future<Void>.

    hashtag
    getSDKVersion

    Returns the SDK version.

    Returns

    Future<String>.

    hashtag
    initSDK

    Initializes SDK with license sources.

    Returns

    hashtag
    setApiConnectionWithCredentials

    Authentication via credentials.

    Returns

    hashtag
    setApiConnectionWithToken

    Authentication via access token.

    Returns

    hashtag
    setEventConnectionWithCredentials

    Connection to the telemetry server via credentials.

    Returns

    hashtag
    setEventConnectionWithToken

    Connection to the telemetry server via access token.

    Returns

    hashtag
    isLoggedIn

    Checks whether an access token exists.

    Returns

    hashtag
    logout

    Deletes the saved access token.

    Returns

    Nothing (void).

    hashtag
    supportedLanguages

    Returns the list of SDK supported languages.

    Returns

    List<>.

    hashtag
    startLiveness

    Starts the Liveness video capturing process.

    hashtag
    setSelfieLength

    Sets the length of the Selfie gesture (in milliseconds).

    Returns

    Error if any.

    hashtag
    Analyze

    Launches the analyses.

    Returns

    List<>.

    hashtag
    setLocalization

    Sets the SDK localization.

    hashtag
    attemptSettings

    The number of attempts before SDK returns error.

    hashtag
    setUICustomization

    Sets the UI customization values for OzLivenessSDK. The values are described in the Customization structures section. Structures can be found in the lib\customization.dart file.

    hashtag
    setfaceAlignmentTimeout

    Sets the timeout for the face alignment for actions.

    hashtag
    Fonts and Other Customized Resources

    hashtag
    For iOS

    Add fonts and drawable resources to the application/ios project.

    hashtag
    For Android

    Fonts and images should be placed into related folders:

ozforensics_flutter_plugin\android\src\main\res\drawable
ozforensics_flutter_plugin\android\src\main\res\font

    hashtag
    Customization structures

    These are defined in the customization.dart file.

    hashtag
    UICustomization

    Contains the information about customization parameters.

    hashtag
    ToolbarCustomization

    Toolbar customization parameters.

    hashtag
    CenterHintCustomization

    Center hint customization parameters.

    hashtag
    HintAnimation

    Hint animation customization parameters.

    hashtag
    FaceFrameCustomization

    Frame around face customization parameters.

    hashtag
    VersionLabelCustomization

    SDK version customization parameters.

    hashtag
    BackgroundCustomization

    Background customization parameters.

    hashtag
    Flutter structures

    Defined in the models.dart file.

    hashtag
    enum Locale

    Stores the language information.

    hashtag
    enum MediaType

    The type of media captured.

    hashtag
    enum FileType

    The type of media captured.

    hashtag
    enum MediaTag

    Contains an action from the captured video.

    hashtag
    Media

    Stores information about media.

    hashtag
    RequestResult

    Stores information about the analysis result.

    hashtag
    Analysis

    Stores data about a single analysis.

    hashtag
    Structures

    hashtag
    enum Type

    Analysis type.

    hashtag
    enum Mode

    Analysis mode.

    hashtag
    enum VerificationAction

    Contains the action from the captured video.

    hashtag
    Resolution

    The general status for all analyses applied to the folder created.

    hashtag
    enum SizeReductionStrategy

    Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully. By default, the system uploads the compressed video.

    hashtag
    sslPin

Contains information about the whitelisted certificates.

    hashtag
    Customization resources

    This is a Map to define the platform-specific resources on the plugin level.

    hashtag
    closeButtonIcon

    This key is a Map for the close button icon.

    hashtag
    titleFont

    This key is a Map containing the data on the uploaded fonts.

    hashtag
    titleStyle

    This key is a Map containing the data on the uploaded font styles.

    hashtag
    faceFrameGeometry

This key is a Map containing the data on the frame shape.

    API Lite Methods

    circle-info

    From 1.1.0, Oz API Lite works with base64 as an input format and is also able to return the biometric templates in this format. To enable this option, add Content-Transfer-Encoding = base64 to the request headers.
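As a sketch, assuming a Python client, the headers and body for the base64 option can be prepared like this (the helper name and the sample bytes are illustrative, not part of API Lite):

```python
import base64

def build_base64_request(image_bytes):
    """Headers and body for an API Lite call that sends the image as base64.

    The Content-Transfer-Encoding = base64 header (API Lite 1.1.0+) tells
    the server the body is base64-encoded and that biometric templates may
    be returned in base64 as well."""
    headers = {
        "Content-Type": "image/jpeg",
        "Content-Transfer-Encoding": "base64",
    }
    body = base64.b64encode(image_bytes)
    return headers, body

# Illustrative call with fake image bytes:
headers, body = build_base64_request(b"\xff\xd8\xff\xe0 fake jpeg bytes")
```

The resulting headers and body can then be passed to any HTTP client of your choice.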


Text font

titleSize – int – Font size
titleFontStyle – String – Font style
titleColor – String – Color #XXXXXX
titleAlpha – int – Header text opacity
isTitleCentered – bool – Sets the text centered
backgroundColor – String – Header background color #XXXXXX
backgroundAlpha – int – Header background opacity

Font size

verticalPosition – int – Y position
textAlpha – int – Text opacity
centerBackground – bool – Sets the text centered

Gradient color
Color #XXXXXX

strokeFaceAlignedColor – String – Color #XXXXXX
strokeAlpha – int – Stroke opacity
strokePadding – int – Stroke padding

Font size

textAlpha – int – Text opacity

es – Spanish
pt_br – Portuguese (Brazilian)

videoSelfieSmile – A video with the smile gesture
videoSelfieHigh – A video with the lifting head up gesture
videoSelfieDown – A video with the tilting head downwards gesture
videoSelfieRight – A video with the turning head right gesture
videoSelfieLeft – A video with the turning head left gesture
photoIdPortrait – A photo from a document
photoIdBack – A photo of the back side of the document
photoIdFront – A photo of the front side of the document

A type of media
iOS

videoPath – String – A path to a video
bestShotPath – String – The path to the best shot in PNG for video, or the image path for Liveness
preferredMediaPath – String – URL of the API media container
photoPath – String – A path to a photo
archivePath – String – A path to an archive
tag – A tag for media
Android

The error code (Android only)
errorMessage – String – The error message
mode – The mode of the analysis
confidenceScore – Double – The resulting score
resolution – The completed analysis' result
status – Boolean – The analysis state: true – success; false – failed

params – Additional analysis parameters
sizeReductionStrategy – Defines what type of media is sent to the server for the hybrid analysis once the on-device analysis finishes successfully

headDown – Head tilted downwards
headUp – Head lifted up
eyeBlink – Blink
smile – Smile

Parameter – Type – Description
licenses – List<String> – A list of licences

Case – Text
True – Initialization has completed successfully
False – Initialization error

Parameter – Type – Description
email – String – User email
password – String – User password
host – String – Server URL
sslPins (optional) – sslPin – Whitelisted certificates

Case – Text
Success – Nothing (void)
Failed – PlatformException:
• code = AUTHENTICATION_FAILED
• message = exception details

Parameter – Type – Description
token – String – Access token
host – String – Server URL
sslPins (optional) – sslPin – Whitelisted certificates

Case – Text
Success – Nothing (void)
Failed – PlatformException:
• code = AUTHENTICATION_FAILED
• message = exception details

Parameter – Type – Description
email – String – User email
password – String – User password
host – String – Server URL
sslPins (optional) – sslPin – Whitelisted certificates

Case – Text
Success – Nothing (void)
Failed – PlatformException:
• code = AUTHENTICATION_FAILED
• message = exception details

Parameter – Type – Description
token – String – Access token
host – String – Server URL
sslPins (optional) – sslPin – Whitelisted certificates

Case – Text
Success – Nothing (void)
Failed – PlatformException:
• code = AUTHENTICATION_FAILED
• message = exception details

Case – Returns
Token exists – True
Token does not exist – False

Parameter – Type – Description
actions – List<VerificationAction> – Actions to execute
mainCamera – Boolean – Use main (True) or front (False) camera

Parameter – Type – Description
selfieLength – Int – The length of the Selfie gesture (in milliseconds). Should be within 500-5000 ms; the default length is 700

Parameter – Type – Description
analysis – List<Analysis> – The list of Analysis structures
folder ID (optional) – String – Folder ID, if you want to perform an analysis for a particular folder
uploadMedia – List<Media> – The list of the captured videos
params

Parameter – Type – Description
locale – Locale – The SDK language

Parameter – Type – Description
singleCount – int – Attempts on a single action/gesture
commonCount – int – Total number of attempts on all actions/gestures if you use a sequence of them

Parameter – Type – Description
timeout – int – Timeout in milliseconds

Parameter – Type – Description
closeButtonIcon – String – Close button icon received from plugin
closeButtonColor – String – Color #XXXXXX
titleText – String – Header text
titleFont

Parameter – Type – Description
textFont – String – Text font
textFontStyle – String – Font style
textColor – String – Color #XXXXXX
textSize

Parameter – Type – Description
hideAnimation – bool – Hides the hint animation
animationIconSize – int – Animation icon size in px (40-160)
hintGradientColor – String – Color #XXXXXX
hintGradientOpacity

Parameter – Type – Description
geometryType – String – Frame shape received from plugin
geometryTypeRadius – int – Corner radius for rectangle
strokeWidth – int – Frame stroke width
strokeFaceNotAlignedColor

Parameter – Type – Description
textFont – String – Text font
textFontStyle – String – Font style
textColor – String – Color #XXXXXX
textSize

Parameter – Type – Description
backgroundColor – String – Color #XXXXXX
backgroundAlpha – int – Background opacity

Case – Description
en – English
hy – Armenian
kk – Kazakh
ky – Kyrgyz
tr – Turkish

Case – Description
movement – A media with an action
documentBack – The back side of the document
documentFront – The front side of the document

Case – Description
documentPhoto – A photo of a document
video – A video
shotSet – A frame archive

Case – Description
blank – A video with no gesture
photoSelfie – A selfie photo
videoSelfieOneShot – A video with the best shot taken
videoSelfieScan – A video with the scanning gesture
videoSelfieEyes – A video with the blink gesture

Parameter – Type – Description – Platform
fileType – FileType – The type of the file – Android
movement – VerificationAction – An action on a media – iOS
mediatype

Parameter – Type – Description – Platform
folderId – String – The folder identifier
type – Type – The analysis type
errorCode

Parameter – Type – Description
type – Type – The type of the analysis
mode – Mode – The mode of the analysis
mediaList – List<Media> – Media to analyze
params

Case – Description
biometry – The algorithm that compares several media and checks whether they show the same person
quality – The algorithm that checks whether a person in a video is a real human acting in good faith, not a fake of any kind

Case – Description
onDevice – The on-device analysis with no server needed
serverBased – The server-based analysis
hybrid – The hybrid analysis for Liveness: if the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check

Case – Description
oneShot – The best shot from the video taken
blank – A selfie with face alignment check
scan – Scan
headRight – Head turned right
headLeft – Head turned left

Case – Description
failed – One or more analyses failed due to some error and couldn't get finished
declined – The check failed (e.g., faces don't match or some spoofing attack detected)
success – Everything went fine, the check succeeded (e.g., faces match or liveness confirmed)
operatorRequired – The result should be additionally checked by a human operator

Case – Description
uploadOriginal – The original video
uploadCompressed – The compressed video
uploadBestShot – The best shot taken from the video
uploadNothing – Nothing is sent (note that no folder will be created)

Parameter – Type – Description
hash – String – SHA256 key hash in base64
expired_at – UNIX timestamp, UTC time – The date of certificate expiration, ms

Key – Value
Close – Android drawable resource / iOS Pods resource
Arrow – Android drawable resource / iOS Pods resource

Key – Value
Flutter application font name – Android font resource / iOS Pods resource, used to retrieve the font on the plugin level

Key – Value
Flutter application font style name – Name of the style retrieved for the font creation on the plugin level

Key – Value
Oval – Oval shape
Rectangle – Rectangular shape


    version – component version check

    Use this method to check what versions of components are used (available from 1.1.1).

    Call GET /version

    hashtag
    Input parameters

    -

    hashtag
    Request example

    GET localhost/version

    hashtag
    Successful response

    In case of success, the method returns a message with the following parameters.

    HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
core – String – API Lite core version number
tfss – String – TFSS version number
models – [String] – An array of model versions; each record contains the model name and model version number

    hashtag
    Response example

    hashtag
    Biometry

    hashtag
    health – biometric processor status check

    Use this method to check whether the biometric processor is ready to work.

    Call GET /v1/face/pattern/health

    hashtag
    Input parameters

    -

    hashtag
    Request example

    GET localhost/v1/face/pattern/health

    hashtag
    Successful response

    In case of success, the method returns a message with the following parameters.

    HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
status – Int – 0: the biometric processor is working correctly; 3: the biometric processor is inoperative
message – String – Message
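As an illustration, the status codes above can be checked on the client side like this (the helper function is hypothetical, not part of API Lite; the response dict mirrors the output parameters above):

```python
def is_processor_healthy(payload):
    """True when GET /v1/face/pattern/health reports status 0 (working
    correctly); status 3 means the processor is inoperative."""
    return payload.get("status") == 0

# Example payload shaped like the output parameters above:
healthy = is_processor_healthy({"status": 0, "message": "OK"})
```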

    hashtag
    Response example

    hashtag
    extract – the biometric template extraction

    The method is designed to extract a biometric template from an image.

HTTP request content type: “image/jpeg” or “image/png”

    Call POST /v1/face/pattern/extract

    hashtag
    Input parameters

Parameter name – Type – Description
Not specified* – Stream – Required parameter. Image to extract the biometric template. The “Content-Type” header field must indicate the content type.

* The name itself is not mandatory for a parameter of the Stream type.

    circle-info

    To transfer data in base64, add Content-Transfer-Encoding = base64 to the request headers.

    hashtag
    Request example

    hashtag
    Successful response

    In case of success, the method returns a biometric template.

    The content type of the HTTP response is “application/octet-stream”.

    circle-info

    If you've passed Content-Transfer-Encoding = base64 in headers, the template will be in base64 as well.

    hashtag
    Output parameters

Parameter name – Type – Description
Not specified* – Stream – A biometric template derived from an image

* The name itself is not mandatory for a parameter of the Stream type.

    hashtag
    Response example

    hashtag
    compare – the comparison of biometric templates

    The method is designed to compare two biometric templates.

The content type of the HTTP request is “multipart/form-data”.

Call POST /v1/face/pattern/compare

    hashtag
    Input parameters

Parameter name – Type – Description
bio_feature – Stream – Required parameter. First biometric template.
bio_template – Stream – Required parameter. Second biometric template.

    circle-info

    To transfer data in base64, add Content-Transfer-Encoding = base64 to the request headers.
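A minimal sketch of the multipart fields, assuming a Python client such as requests (the helper is illustrative; the field names bio_feature and bio_template come from the input parameters above):

```python
def build_compare_files(first_template, second_template):
    """Multipart form fields for POST /v1/face/pattern/compare.

    The dict is shaped for the files= argument of requests.post:
    field name -> (filename, data, part content type)."""
    return {
        "bio_feature": ("bio_feature", first_template, "application/octet-stream"),
        "bio_template": ("bio_template", second_template, "application/octet-stream"),
    }

files = build_compare_files(b"template-A", b"template-B")
```

Passing the resulting dict as files= to requests.post produces the multipart/form-data body described above.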

    hashtag
    Request example

    hashtag
    Successful response

    In case of success, the method returns the result of comparing the two templates.

    HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
score – Float – The result of comparing two templates
decision – String – Recommended solution based on the score:
• approved – positive. The faces match.
• operator_required – additional operator verification is required.
• declined – negative result. The faces don't match.

    hashtag
    Response example

    hashtag
    verify – the biometric verification

    The method combines the two methods from above, extract and compare. It extracts a template from an image and compares the resulting biometric template with another biometric template that is also passed in the request.

The content type of the HTTP request is “multipart/form-data”.

    Call POST /v1/face/pattern/verify

    hashtag
    Input parameters

Parameter name – Type – Description
sample – Stream – Required parameter. Image to extract the biometric template.
bio_template – Stream – Required parameter. The biometric template to compare with.

    circle-info

    To transfer data in base64, add Content-Transfer-Encoding = base64 to the request headers.

    hashtag
    Request example

    hashtag
    Successful response

    In case of success, the method returns the result of comparing two biometric templates and the biometric template.

    The content type of the HTTP response is “multipart/form-data”.

    hashtag
    Output parameters

Parameter name – Type – Description
score – Float – The result of comparing two templates
bio_feature – Stream – Biometric template derived from the image

    hashtag
    Response example

    hashtag
    extract_and_compare – extracting and comparison of templates derived from two images

    The method also combines the two methods from above, extract and compare. It extracts templates from two images, compares the received biometric templates, and transmits the comparison result as a response.

The content type of the HTTP request is “multipart/form-data”.

    Call POST /v1/face/pattern/extract_and_compare

    hashtag
    Input parameters

Parameter name – Type – Description
sample_1 – Stream – Required parameter. First image.
sample_2 – Stream – Required parameter. Second image.

    circle-info

    To transfer data in base64, add Content-Transfer-Encoding = base64 to the request headers.

    hashtag
    Request example

    hashtag
    Successful response

    In case of success, the method returns the result of comparing the two extracted biometric templates.

HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
score – Float – The result of comparing the two extracted templates.
decision – String – Recommended solution based on the score:
• approved – positive. The faces match.
• operator_required – additional operator verification is required.
• declined – negative result. The faces don't match.

    hashtag
    Response example

    hashtag
    compare_n – 1:N biometric template comparison

    Use this method to compare one biometric template to N others.

    The content type of the HTTP request is “multipart/form-data”.

    Call POST /v1/face/pattern/compare_n

    hashtag
    Input parameters

Parameter name – Type – Description
template_1 – Stream – This parameter is mandatory. The first (main) biometric template.
templates_n – Stream – A list of N biometric templates. Each of them should be passed separately, but the parameter name should be templates_n. You also need to pass the filename in the header.
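The field naming above can be sketched as follows, assuming a Python client such as requests (the helper is illustrative): every extra template goes under the same field name, templates_n, with its own filename.

```python
def build_compare_n_parts(main_template, named_templates):
    """Multipart parts for POST /v1/face/pattern/compare_n.

    Returns a list of (field name, (filename, data, content type)) tuples
    suitable for the files= argument of requests.post."""
    parts = [("template_1", ("template_1", main_template, "application/octet-stream"))]
    for filename, data in named_templates.items():
        parts.append(("templates_n", (filename, data, "application/octet-stream")))
    return parts

parts = build_compare_n_parts(b"main", {"t1.bin": b"one", "t2.bin": b"two"})
```

The filenames carried in the part headers come back in the *filename field of each comparison result.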

    hashtag
    Request example

    hashtag
    Successful response

    In case of success, the method returns the result of the 1:N comparison.

HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
results – List[JSON] – A list of N comparison results. The Nth result contains the comparison result for the main and Nth templates. Each result has the following fields:
*filename – String – A filename for the Nth template.
*score – Float – The result of comparing the main and Nth templates.
*decision

    hashtag
    Response example

    hashtag
    verify_n – 1:N biometric verification

    The method combines the extract and compare_n methods. It extracts a biometric template from an image and compares it to N other biometric templates that are passed in the request as a list.

    The content type of the HTTP request is “multipart/form-data”.

    Call POST /v1/face/pattern/verify_n

    hashtag
    Input parameters

Parameter name – Type – Description
sample_1 – Stream – This parameter is mandatory. The main image.
templates_n – Stream – A list of N biometric templates. Each of them should be passed separately, but the parameter name should be templates_n. You also need to pass the filename in the header.

    circle-info

    To transfer data in base64, add Content-Transfer-Encoding = base64 to the request headers.

    hashtag
    Request example

    hashtag
    Successful response

    In case of success, the method returns the result of the 1:N comparison.

HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
results – List[JSON] – A list of N comparison results. The Nth result contains the comparison result for the template derived from the main image and the Nth template. Each result has the following fields:
*filename – String – A filename for the Nth template.
*score – Float – The result of comparing the template derived from the main image and the Nth template.
*decision

    hashtag
    Response example

    hashtag
    extract_and_compare_n – 1:N template extraction and comparison

    This method also combines the extract and compare_n methods but in another way. It extracts biometric templates from the main image and a list of other images and then compares them in the 1:N mode.

    The content type of the HTTP request is “multipart/form-data”.

    Call POST /v1/face/pattern/extract_and_compare_n

    hashtag
    Input parameters

Parameter name – Type – Description
sample_1 – Stream – This parameter is mandatory. The first (main) image.
samples_n – Stream – A list of N images. Each of them should be passed separately, but the parameter name should be samples_n. You also need to pass the filename in the header.

    circle-info

    To transfer data in base64, add Content-Transfer-Encoding = base64 to the request headers.

    hashtag
    Request example

    hashtag
    Successful response

    In case of success, the method returns the result of the 1:N comparison.

HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
results – List[JSON] – A list of N comparison results. The Nth result contains the comparison result for the main and Nth images. Each result has the following fields:
*filename – String – A filename for the Nth image.
*score – Float – The result of comparing the main and Nth images.
*decision

    hashtag
    Response example

    hashtag
    Method errors

HTTP response content type: “application/json”.

HTTP response code – “code” parameter value – Description
400 – BPE-002001 – Invalid Content-Type of HTTP request
400 – BPE-002002 – Invalid HTTP request method
400 – BPE-002003 – Failed to read the biometric sample*
400

* A biometric sample is an input image.

    hashtag
    Liveness

    hashtag
    health – checking the status of liveness processor

    Use this method to check whether the liveness processor is ready to work.

    Call GET /v1/face/liveness/health

    hashtag
    Input parameters

    • None.

    hashtag
    Request example

    GET localhost/v1/face/liveness/health

    hashtag
    Successful response

    In case of success, the method returns a message with the following parameters.

    HTTP response content type: “application/json”.

    hashtag
    Output parameters

Parameter name – Type – Description
status – Int – 0: the liveness processor is working correctly; 3: the liveness processor is inoperative
message – String – Message

    hashtag
    Response example

    hashtag
    detect – presentation attack detection

    The detect method is made to reveal presentation attacks. It detects a face in each image or video (since 1.2.0), sends them for analysis, and returns a result.

    The method supports the following content types:

    • image/jpeg or image/png for an image;

    • multipart/form-data for images, videos, and archives. You can use payload to add any parameters that affect the analysis.

    To run the method, call POST /{version}/face/liveness/detect.

    hashtag
    Image

    Accepts an image in JPEG or PNG format. No payload attached.

Request example
Successful response example

    hashtag
    Multipart/form-data

    Accepts the multipart/form-data request.

    • Each media file should have a unique name, e.g., media_key1, media_key2.

    • The payload parameters should be a JSON placed in the payload field.

    circle-info

    Temporary IDs will be deleted once you get the result.
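The media keys and payload field described above can be assembled like this (a sketch assuming a Python client such as requests; the helper and the sample analyses content are illustrative, not exhaustive):

```python
import json

def build_detect_parts(media, analyses):
    """Multipart parts for POST /v1/face/liveness/detect.

    Each media file goes under its own unique field name (media_key1,
    media_key2, ...); the analysis parameters are serialized as JSON in
    the payload field."""
    parts = [(name, (filename, data, "application/octet-stream"))
             for name, (filename, data) in media.items()]
    parts.append(("payload", (None, json.dumps({"analyses": analyses}),
                              "application/json")))
    return parts

parts = build_detect_parts(
    {"media_key1": ("video.mp4", b"...")},
    [{"type": "quality", "params": {"extract_best_shot": False}}],
)
```

Passing the list as files= to requests.post yields a multipart/form-data body shaped like the request example below.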

Request example
Successful response example

    hashtag
    Multipart/form-data with Best Shot

To extract the best shot from your video or archive, set extract_best_shot = true in analyses (as shown in the request example below). In this case, API Lite will analyze your archives and videos and return the best shot in the response, as a base64 image in analysis->output_images->image_b64.

Additionally, you can change the Liveness threshold: in analyses, set the new threshold in the threshold_spoofing parameter. If the resulting score is higher than this parameter's value, the analysis ends with the DECLINED status. Otherwise, the status is SUCCESS.
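The threshold rule can be mirrored client-side as follows (a sketch; the 0.6 default here is taken from the request example below and is an assumption, not a documented server default):

```python
def liveness_status(score, threshold_spoofing=0.6):
    """A score strictly above threshold_spoofing yields DECLINED;
    otherwise the status is SUCCESS."""
    return "DECLINED" if score > threshold_spoofing else "SUCCESS"
```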

Request example
Successful response example
The payload field

    hashtag
    Method errors

HTTP response content type: “application/json”.

HTTP response code – “code” parameter value – Description
400 – LDE-002001 – Invalid Content-Type of HTTP request
400 – LDE-002002 – Invalid HTTP request method
400 – LDE-002004 – Failed to extract the biometric sample*
400

* A biometric sample is an input image.

    POST /v1/face/liveness/detect HTTP/1.1
    Host: localhost
    Content-Type: image/jpeg
    Content-Length: [the size of the message body]
    [Image byte stream]
    HTTP/1.1 200 OK
    Content-Type: application/json
    {
      "passed": false,
      "score": 0.999484062
    }
    POST /v1/face/liveness/detect HTTP/1.1
    Host: localhost
    Content-Length: [the size of the message body]
    Content-Type: multipart/form-data; boundary=--BOUNDARY--
    
    --BOUNDARY--
    Content-Disposition: form-data; name="media_key1"; filename="video.mp4"
    Content-Type: multipart/form-data; 
    
    [media file byte stream]
    --BOUNDARY--
    Content-Disposition: form-data; name="payload"
    
        {
            "folder:meta_data": {
                "partner_side_folder_id": "partner_side_folder_id_if_needed",
                "person_info": {
                    "first_name": "John",
                    "middle_name": "Jameson",
                    "last_name": "Doe"
                }
            },
            "resolution_endpoint": "https://www.your-custom-endpoint.com",
            "media:meta_data": {
                "media_key1": {
                    "foo": "bar2"
                }
            },
            "media:tags": {
                "media_key1": [
                    "video_selfie",
                    "video_selfie_blank"
                ]
            },
            "analyses": [
              {
                "type": "quality",
                "meta_data": {
                  "example1": "some_example1"
                },
                "params": {
                    "threshold_spoofing": 0.6,
                    "extract_best_shot": false
                }
              }
    ]
        }
    --BOUNDARY--
    {
        "company_id": null,
        "time_created": 1720180784.769608,
        "folder_id": "folder_id", // temporary ID
        "user_id": null,
        "resolution_endpoint": "https://www.your-custom-endpoint.com",
        "resolution_status": "FINISHED",
        "resolution_comment": "[]",
        "system_resolution": "SUCCESS",
        "resolution_time": null,
        "resolution_author_id": null,
        "resolution_state_hash": null,
        "operator_comment": null,
        "operator_status": null,
        "is_cleared": null,
        "meta_data": {
            "partner_side_folder_id": "partner_side_folder_id_if_needed",
            "person_info": {
                "first_name": "John",
                "middle_name": "Jameson",
                "last_name": "Doe"
            }
        },
        "technical_meta_data": {},
        "time_updated": 1720180787.531983,
        "media": [
            {
                "folder_id": "folder_id", // temporary ID
                "media_id": "video_id", // temporary ID
                "media_type": "VIDEO_FOLDER",
                "info": {
                    "thumb": null,
                    "video": {
                        "duration": 3.76,
                        "FPS": 22.83,
                        "width": 960,
                        "height": 720,
                        "md5": "8879b4fa9ee7add77aceb8d7d5d7b92d",
                        "size": 6017119,
                        "mime-type": "video/mp4"
                    }
                },
                "tags": [
                    "video_selfie",
                    "video_selfie_blank",
                    "orientation_portrait"
                ],
                "original_name": "video-5mb.mp4",
                "original_url": null,
                "company_id": null,
                "technical_meta_data": {},
                "time_created": 1719573752.78253,
                "time_updated": 1720180787.531801,
                "meta_data": {
                    "foo4": "bar5"
                },
                "thumb_url": null,
                "folder_time_created": null,
                "video_id": "video_id", // temporary ID
                "video_url": null
            }
        ],
        "analyses": [
            {
                "analyse_id": null,
                "analysis_id": null,
                "folder_id": "folder_id", // temporary ID
                "folder_time_created": null,
                "type": "QUALITY",
                "state": "FINISHED",
                "company_id": null,
                "group_id": null,
                "results_data": null,
                "confs": {
                    "threshold_replay": 0.5,
                    "extract_best_shot": false,
                    "threshold_liveness": 0.5,
                    "threshold_spoofing": 0.42
                },
                "error_message": null,
                "error_code": null,
                "resolution_operator": null,
                "technical_meta_data": {},
                "time_created": 1720180784.769944,
                "time_updated": 1720180787.531877,
                "meta_data": {
                    "some_key": "some_value"
                },
                "source_media": [
                    {
                        "folder_id": "folder_id", // temporary ID
                        "media_id": "video_id", // temporary ID
                        "media_type": "VIDEO_FOLDER",
                        "info": {
                            "thumb": null,
                            "video": {
                                "duration": 3.76,
                                "FPS": 22.83,
                                "width": 960,
                                "height": 720,
                                "md5": "8879b4fa9ee7add77aceb8d7d5d7b92d",
                                "size": 6017119,
                                "mime-type": "video/mp4"
                            }
                        },
                        "tags": [
                            "video_selfie",
                            "video_selfie_blank",
                            "orientation_portrait"
                        ],
                        "original_name": "video-5mb.mp4",
                        "original_url": null,
                        "company_id": null,
                        "technical_meta_data": {},
                        "time_created": 1719573752.78253,
                        "time_updated": 1720180787.531801,
                        "meta_data": {
                            "foo4": "bar5"
                        },
                        "thumb_url": null,
                        "folder_time_created": null,
                        "video_id": "video_id", // temporary ID
                        "video_url": null
                    }
                ],
                "results_media": [
                    {
                        "company_id": null,
                        "media_association_id": "video_id", // temporary ID
                        "analysis_id": null,
                        "results_data": {
                            "confidence_spoofing": 0.000541269779
                        },
                        "source_media_id": "video_id", // temporary ID
                        "output_images": [],
                        "collection_persons": [],
                        "folder_time_created": null
                    }
                ],
                "resolution_status": "SUCCESS",
                "resolution": "SUCCESS"
            }
        ]
    }
    POST /v1/face/liveness/detect HTTP/1.1
    Host: localhost
    Content-Length: [the size of the message body]
    Content-Type: multipart/form-data; boundary=--BOUNDARY--
    
    --BOUNDARY--
    Content-Disposition: form-data; name="media_key1"; filename="video.mp4"
    Content-Type: video/mp4
    
    [media file byte stream]
    --BOUNDARY--
    Content-Disposition: form-data; name="payload"
    
        {
            "folder:meta_data": {
                "partner_side_folder_id": "partner_side_folder_id_if_needed",
                "person_info": {
                    "first_name": "John",
                    "middle_name": "Jameson",
                    "last_name": "Doe"
                }
            },
            "resolution_endpoint": "https://www.your-custom-endpoint.com",
            "media:meta_data": {
                "media_key1": {
                    "foo": "bar2"
                }
            },
            "media:tags": {
                "media_key1": [
                    "video_selfie",
                    "video_selfie_blank"
                ]
            },
            "analyses": [
              {
                "type": "quality",
                "meta_data": {
                  "example1": "some_example1"
                },
                "params": {
                    "threshold_spoofing": 0.6,
                    "extract_best_shot": true
                }
              }
    ]
        }
    --BOUNDARY--
    {
        "company_id": null,
        "time_created": 1720177371.120899,
        "folder_id": "folder_id", // temporary ID
        "user_id": null,
        "resolution_endpoint": "https://www.your-custom-endpoint.com",
        "resolution_status": "FINISHED",
        "resolution_comment": "[]",
        "system_resolution": "SUCCESS",
        "resolution_time": null,
        "resolution_author_id": null,
        "resolution_state_hash": null,
        "operator_comment": null,
        "operator_status": null,
        "is_cleared": null,
        "meta_data": {
            "partner_side_folder_id": "partner_side_folder_id_if_needed",
            "person_info": {
                "first_name": "John",
                "middle_name": "Jameson",
                "last_name": "Doe"
            }
        },
        "technical_meta_data": {},
        "time_updated": 1720177375.531137,
        "media": [
            {
                "folder_id": "folder_id", // temporary ID
                "media_id": "media_id", // temporary ID
                "media_type": "VIDEO_FOLDER",
                "info": {
                    "thumb": null,
                    "video": {
                        "duration": 3.76,
                        "FPS": 22.83,
                        "width": 960,
                        "height": 720,
                        "md5": "8879b4fa9ee7add77aceb8d7d5d7b92d",
                        "size": 6017119,
                        "mime-type": "video/mp4"
                    }
                },
                "tags": [
                    "video_selfie",
                    "video_selfie_blank",
                    "orientation_portrait"
                ],
                "original_name": "video-5mb.mp4",
                "original_url": null,
                "company_id": null,
                "technical_meta_data": {},
                "time_created": 1719573752.781861,
                "time_updated": 1720177373.772401,
                "meta_data": {
                    "foo4": "bar5"
                },
                "thumb_url": null,
                "folder_time_created": null,
                "video_id": "media_id", // temporary ID
                "video_url": null
            }
        ],
        "analyses": [
            {
                "analyse_id": null,
                "analysis_id": null,
                "folder_id": "folder_id", // temporary ID
                "folder_time_created": null,
                "type": "QUALITY",
                "state": "FINISHED",
                "company_id": null,
                "group_id": null,
                "results_data": null,
                "confs": {
                    "threshold_replay": 0.5,
                    "extract_best_shot": true,
                    "threshold_liveness": 0.5,
                    "threshold_spoofing": 0.42
                },
                "error_message": null,
                "error_code": null,
                "resolution_operator": null,
                "technical_meta_data": {},
                "time_created": 1720177371.121241,
                "time_updated": 1720177375.531043,
                "meta_data": {
                    "some_key": "some_value"
                },
                "source_media": [
                    {
                        "folder_id": "folder_id", // temporary ID
                        "media_id": "media_id", // temporary ID
                        "media_type": "VIDEO_FOLDER",
                        "info": {
                            "thumb": null,
                            "video": {
                                "duration": 3.76,
                                "FPS": 22.83,
                                "width": 960,
                                "height": 720,
                                "md5": "8879b4fa9ee7add77aceb8d7d5d7b92d",
                                "size": 6017119,
                                "mime-type": "video/mp4"
                            }
                        },
                        "tags": [
                            "video_selfie",
                            "video_selfie_blank",
                            "orientation_portrait"
                        ],
                        "original_name": "video-5mb.mp4",
                        "original_url": null,
                        "company_id": null,
                        "technical_meta_data": {},
                        "time_created": 1719573752.781861,
                        "time_updated": 1720177373.772401,
                        "meta_data": {
                            "foo4": "bar5"
                        },
                        "thumb_url": null,
                        "folder_time_created": null,
                        "video_id": "media_id", // temporary ID
                        "video_url": null
                    }
                ],
                "results_media": [
                    {
                        "company_id": null,
                        "media_association_id": "media_id", // temporary ID
                        "analysis_id": null,
                        "results_data": {
                            "confidence_spoofing": 0.000541269779
                        },
                        "source_media_id": "media_id", // temporary ID
                        "output_images": [
                            {
                                "folder_id": "folder_id", // temporary ID
                                "media_id": "media_id", // temporary ID
                                "media_type": "IMAGE_RESULT_ANALYSIS_SINGLE",
                                "info": {
                                    "thumb": null,
                                    "original": {
                                        "md5": "e6effeceb94e79b8cb204c6652283b57",
                                        "width": 720,
                                        "height": 960,
                                        "size": 145178,
                                        "mime-type": "image/jpeg"
                                    }
                                },
                                "tags": [],
                                "original_name": "<PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=720x960 at 0x766811DF8E90>",
                                "original_url": null,
                                "company_id": null,
                                "technical_meta_data": {},
                                "time_created": 1719573752.781861,
                                "time_updated": 1719573752.781871,
                                "meta_data": null,
                                "folder_time_created": null,
                                "image_b64": "",
                                "media_association_id": "media_id" // temporary ID
                            }
                        ],
                        "collection_persons": [],
                        "folder_time_created": null
                    }
                ],
                "resolution_status": "SUCCESS",
                "resolution": "SUCCESS"
            }
        ]
    }
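    To turn the returned best shot back into an image file, decode the base64 string from output_images. A minimal sketch (the tiny byte string below stands in for a real JPEG stream):

    ```python
    import base64

    # The best shot arrives as a base64 string in
    # analyses[..]["results_media"][..]["output_images"][..]["image_b64"].
    def save_best_shot(image_b64: str, path: str) -> int:
        raw = base64.b64decode(image_b64)
        with open(path, "wb") as f:
            f.write(raw)
        return len(raw)

    # A tiny stand-in payload instead of a real JPEG byte stream:
    demo_b64 = base64.b64encode(b"\xff\xd8\xff").decode()
    n = save_best_shot(demo_b64, "best_shot.jpg")
    print(n)
    ```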
    {
        "folder:meta_data": {
            "partner_side_folder_id": "partner_side_folder_id_if_needed",
            "person_info": {
                "first_name": "John",
                "middle_name": "Jameson",
                "last_name": "Doe"
            }   },
        "resolution_endpoint": "https://www.your-custom-endpoint.com",
        "media:meta_data": {
            "media_key1": {
                "foo": "bar2",
                "additional_info": "additional_info" // might affect the score
            },
            "media_key2": {
                "foo2": "bar3"
            },
            "media_key3": {
                "foo4": "bar5"
            }
        },
        "media:tags": {
            "media_key1": [
                "video_selfie",
                "video_selfie_blank",
                "orientation_portrait"
            ],
            "media_key2": [
                "photo_selfie"
            ],
            "media_key3": [
                "video_selfie",
                "video_selfie_blank",
                "orientation_portrait"
            ]
        },
    "analyses": [
        {
          "type": "quality",
          "meta_data": {
            "some_key": "some_value"
          },
          "params": {
          	"threshold_spoofing": 0.42, // affects resolution
          	"extract_best_shot":true // analysis will return the best shot
          }
        }
      ]
    }
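    Each key under media:tags and media:meta_data must match the field name of a media part in the multipart request (media_key1 and so on). A small helper to sanity-check that invariant before sending (the helper itself is illustrative, not part of the API):

    ```python
    # Verify that every key referenced in "media:tags" / "media:meta_data"
    # corresponds to an actual multipart field name.
    def check_payload_keys(payload: dict, part_names: set) -> bool:
        tag_keys = set(payload.get("media:tags", {}))
        meta_keys = set(payload.get("media:meta_data", {}))
        return tag_keys <= part_names and meta_keys <= part_names

    payload = {
        "media:tags": {"media_key1": ["video_selfie"], "media_key2": ["photo_selfie"]},
        "media:meta_data": {"media_key1": {"foo": "bar2"}},
        "analyses": [{"type": "quality", "params": {"threshold_spoofing": 0.42}}],
    }
    print(check_payload_keys(payload, {"media_key1", "media_key2"}))
    ```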
    200 OK
    Content-Type: application/json
    {
        "core": "core_version",
        "tfss": "tfss_version",
        "models": [
            {
                "name": "model_name",
                "version": "model_version"
            }
        ]
    }
    200 OK
    Content-Type: application/json
    {"status": 0, "message": ""}
    POST localhost/v1/face/pattern/extract
    Content-Type: image/jpeg
    {Image byte stream}
    200 OK
    Content-Type: application/octet-stream
    {Biometric template byte stream}
    POST localhost/v1/face/pattern/compare
    Content-Type: multipart/form-data;
    boundary=--BOUNDARY--
    Content-Length: Message body length
    --BOUNDARY--
    Content-Disposition: form-data; name="bio_feature"
    Content-Type: application/octet-stream
    {Biometric template byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="bio_template"
    Content-Type: application/octet-stream
    {Biometric template byte stream}
    --BOUNDARY--
    200 OK
    Content-Type: application/json
    {"score": 1.0, "decision": "approved"}
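    The multipart layout above can be assembled with the standard library alone. A minimal sketch (the builder function and boundary value are illustrative, not part of the API):

    ```python
    # Assemble the multipart body for /v1/face/pattern/compare from two
    # biometric templates using only the standard library.
    def build_compare_body(bio_feature: bytes, bio_template: bytes,
                           boundary: str = "BOUNDARY") -> bytes:
        parts = []
        for name, blob in (("bio_feature", bio_feature), ("bio_template", bio_template)):
            head = (f"--{boundary}\r\n"
                    f'Content-Disposition: form-data; name="{name}"\r\n'
                    "Content-Type: application/octet-stream\r\n\r\n").encode()
            parts.append(head + blob + b"\r\n")
        return b"".join(parts) + f"--{boundary}--\r\n".encode()

    body = build_compare_body(b"\x01\x02", b"\x03\x04")
    print(len(body))
    ```

    The same layout works for the other multiparted biometry methods; only the part names change.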
    POST localhost/v1/face/pattern/verify
    Content-Type: multipart/form-data;
    boundary=--BOUNDARY--
    Content-Length: Message body length
    --BOUNDARY--
    Content-Disposition: form-data; name="bio_template"
    Content-Type: application/octet-stream
    {Biometric template byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="sample"
    Content-Type: image/jpeg
    {Image byte stream}
    --BOUNDARY--
    200 OK
    Content-Type: multipart/form-data;
    boundary=--BOUNDARY--
    Content-Length: Message body length
    --BOUNDARY--
    Content-Disposition: form-data; name="score"
    Content-Type: application/json
    {"score": 1.0}
    --BOUNDARY--
    Content-Disposition: form-data; name="bio_feature"
    Content-Type: application/octet-stream
    {Biometric template byte stream}
    --BOUNDARY--
    --BOUNDARY--
    POST localhost/v1/face/pattern/extract_and_compare
    Content-Type: multipart/form-data;
    boundary=--BOUNDARY--
    Content-Length: Message body length
    --BOUNDARY--
    Content-Disposition: form-data; name="sample_1"
    Content-Type: image/jpeg
    {Image byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="sample_2"
    Content-Type: image/jpeg
    {Image byte stream}
    --BOUNDARY--
    200 OK
    Content-Type: application/json
    {"score": 1.0, "decision": "approved"}
    POST localhost/v1/face/pattern/compare_n
    Content-Type: multipart/form-data;
    boundary=--BOUNDARY--
    Content-Length: message body length
    --BOUNDARY--
    Content-Disposition: form-data; name="template_1"
    Content-Type: application/octet-stream
    {biometric template byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="templates_n"; filename="1.template"
    Content-Type: application/octet-stream
    {biometric template byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="templates_n"; filename="2.template"
    Content-Type: application/octet-stream
    {biometric template byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="templates_n"; filename="3.template"
    Content-Type: application/octet-stream
    {biometric template byte stream}
    --BOUNDARY--
    200 OK
    Content-Type: application/json
    {"results": [
        {"filename": "1.template", "score": 0.0, "decision": "declined"},
        {"filename": "2.template", "score": 1.0, "decision": "approved"},
        {"filename": "3.template", "score": 0.21, "decision": "declined"}
    ]}
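    On the client side, the results list is typically scanned for the approved entry or the highest score. A short sketch using the example response above:

    ```python
    # The "results" list pairs each template filename with a score and a
    # recommended decision; pick the best match and collect approvals.
    results = [
        {"filename": "1.template", "score": 0.0, "decision": "declined"},
        {"filename": "2.template", "score": 1.0, "decision": "approved"},
        {"filename": "3.template", "score": 0.21, "decision": "declined"},
    ]
    best = max(results, key=lambda r: r["score"])
    approved = [r["filename"] for r in results if r["decision"] == "approved"]
    print(best["filename"], approved)
    ```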
    POST localhost/v1/face/pattern/verify_n
    Content-Type: multipart/form-data;
    boundary=--BOUNDARY--
    Content-Length: message body length
    --BOUNDARY--
    Content-Disposition: form-data; name="sample_1"
    Content-Type: image/jpeg
    {image byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="templates_n"; filename="1.template"
    Content-Type: application/octet-stream
    {biometric template byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="templates_n"; filename="2.template"
    Content-Type: application/octet-stream
    {biometric template byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="templates_n"; filename="3.template"
    Content-Type: application/octet-stream
    {biometric template byte stream}
    --BOUNDARY--
    200 OK
    Content-Type: application/json
    {"results": [
        {"filename": "1.template", "score": 0.0, "decision": "declined"},
        {"filename": "2.template", "score": 1.0, "decision": "approved"},
        {"filename": "3.template", "score": 0.21, "decision": "declined"}
    ]}
    POST localhost/v1/face/pattern/extract_and_compare_n
    Content-Type: multipart/form-data;
    boundary=--BOUNDARY--
    Content-Length: message body length
    --BOUNDARY--
    Content-Disposition: form-data; name="sample_1"
    Content-Type: image/jpeg
    {image byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="samples_n"; filename="1.jpeg"
    Content-Type: image/jpeg
    {image byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="samples_n"; filename="2.jpeg"
    Content-Type: image/jpeg
    {image byte stream}
    --BOUNDARY--
    Content-Disposition: form-data; name="samples_n"; filename="3.jpeg"
    Content-Type: image/jpeg
    {image byte stream}
    --BOUNDARY--
    200 OK
    Content-Type: application/json
    {"results": [
        {"filename": "1.jpeg", "score": 0.0, "decision": "declined"},
        {"filename": "2.jpeg", "score": 1.0, "decision": "approved"},
        {"filename": "3.jpeg", "score": 0.21, "decision": "declined"}
    ]}
    200 OK
    Content-Type: application/json
    {"status": 0, "message": ""}
    sslPin
    MediaTag
    Mode
    Resolution
    SizeReductionStrategy

    String

    Recommended solution based on the score.

    approved – positive. The faces match.

    operator_required – additional operator verification is required.

    declined – negative result. The faces don't match.

    String

    Recommended solution based on the score.

    approved – positive. The faces match.

    operator_required – additional operator verification is required.

    declined – negative result. The faces don't match.

    String

    Recommended solution based on the score.

    approved – positive. The faces match.

    operator_required – additional operator verification is required.

    declined – negative result. The faces don't match.

    400 | BPE-002004 | Failed to read the biometric template
    400 | BPE-002005 | Invalid Content-Type of the multiparted HTTP request part
    400 | BPE-003001 | Failed to retrieve the biometric template
    400 | BPE-003002 | The biometric sample* is missing a face
    400 | BPE-003003 | More than one person is present on the biometric sample*
    500 | BPE-001001 | Internal bioprocessor error
    400 | BPE-001002 | TFSS error. Call the biometry health method.

    400 | LDE-002005 | Invalid Content-Type of the multiparted HTTP request part
    500 | LDE-001001 | Liveness detection processor internal error
    400 | LDE-001002 | TFSS error. Call the Liveness health method.

    iOS SDK Methods and Properties

    hashtag
    OZSDK

    A singleton for Oz SDK.

    hashtag
    Methods

    hashtag
    OZSDK

    Initializes OZSDK with the license data. The closure contains either the license data or an error.

    Returns

    -

    hashtag
    setLicense

    Forces the license installation.

    hashtag
    setApiConnection

    Retrieves an access token for a user.

    Returns

    The access token or an error.

    hashtag
    setEventsConnection

    Retrieves an access token for a user to send telemetry.

    Returns

    The access token or an error.

    hashtag
    isLoggedIn

    Checks whether an access token exists.

    Parameters

    -

    Returns

    The result – the true or false value.

    hashtag
    logout

    Deletes the saved access token

    Parameters

    -

    Returns

    -

    hashtag
    createVerificationVCWithDelegate

    Creates the Liveness check controller.

    Returns

    UIViewController or an exception.

    hashtag
    createVerificationVC

    Creates the Liveness check controller.

    Returns

    UIViewController or an exception.

    hashtag
    cleanTempDirectory

    Deletes all videos.

    Parameters

    -

    Returns

    -

    hashtag
    getEventSessionId

    Retrieves the telemetry session ID.

    Parameters

    -

    Returns

    The telemetry session ID (String parameter).

    hashtag
    set

    Sets the bundle to look for translations in.

    Returns

    -

    hashtag
    setSelfieLength

    Sets the length of the Selfie gesture (in milliseconds).

    hashtag
    generateSignedPayload

    Generates the payload with media signatures.

    Returns

    Payload to be sent along with media files that were used for generation.

    hashtag
    Properties

    hashtag
    localizationCode

    SDK locale (if not set, works automatically).

    hashtag
    host

    The host to call for Liveness video analysis.

    hashtag
    attemptSettings

    Holds the number of attempts allowed before the SDK returns an error.

    hashtag
    version

    The SDK version.

    hashtag
    OZLivenessDelegate

    A delegate for OZSDK.

    hashtag
    Methods

    hashtag
    onOZLivenessResult

    Gets the Liveness check results.

    Returns

    -

    hashtag
    onError

    The error processing method.

    Returns

    -

    hashtag
    AnalysisRequest

    A protocol for performing checks.

    hashtag
    Methods

    hashtag
    AnalysisRequestBuilder

    Creates the AnalysisRequest instance.

    Returns

    The AnalysisRequest instance.

    hashtag
    addAnalysis

    Adds an analysis to the AnalysisRequest instance.

    Returns

    -

    hashtag
    uploadMedia

    Uploads media on server.

    Returns

    -

    hashtag
    addFolderId

    Adds the folder ID to upload media to a certain folder.

    Returns

    -

    hashtag
    addFolderMeta

    Adds metadata to a folder.

    Returns

    -

    hashtag
    run

    Runs the analyses.

    Returns

    The analysis result or an error.

    hashtag
    Customization

    Customization for OzLivenessSDK (use OZSDK.customization).

    hashtag
    toolbarCustomization

    A set of customization parameters for the toolbar.

    hashtag
    centerHintCustomization

    A set of customization parameters for the center hint that guides a user through the process of taking an image of themselves.

    hashtag
    hintAnimationCustomization

    A set of customization parameters for the hint animation.

    hashtag
    faceFrameCustomization

    A set of customization parameters for the frame around the user face.

    hashtag
    backgroundCustomization

    A set of customization parameters for the background outside the frame.

    hashtag
    versionCustomization

    A set of customization parameters for the SDK version text.

    hashtag
    antiscamCustomization

    A set of customization parameters for the antiscam message that warns user about their actions being recorded.

    hashtag
    logoCustomization

    Logo customization parameters. A custom logo must be allowed by the license. By default, the logo is placed at the bottom left.

    hashtag
    Variables and Objects

    hashtag
    enum LicenseSource

    A source of a license.

    hashtag
    struct LicenseData

    The license data.

    hashtag
    enum OzVerificationMovement

    Contains action from the captured video.

    hashtag
    enum OZLocalizationCode

    Contains the locale code according to ISO 639-1.

    hashtag
    struct OZMedia

    Contains all the information on the media captured.

    hashtag
    enum MediaType

    The type of media captured.

    hashtag
    enum OZVerificationStatus

    Error description. These errors are deprecated and will be deleted in the upcoming releases.

    hashtag
    struct Analysis

    Contains information on what media to analyze and what analyses to apply.

    hashtag
    enum AnalysisType

    The type of the analysis.

    circle-info

    Currently, the .document analysis can't be performed in the on-device mode.

    hashtag
    enum AnalysisMode

    The mode of the analysis.

    hashtag
    enum ScenarioState

    Shows the media processing status.

    hashtag
    struct AnalysisStatus

    Shows the files' uploading status.

    hashtag
    RequestStatus

    Shows the analysis processing status.

    hashtag
    ResultMedia

    Describes the analysis result for the single media.

    hashtag
    RequestResult

    Contains the consolidated analysis results for all media.

    hashtag
    class AnalysisResult

    Contains the results of the checks performed.

    hashtag
    enum AnalyseResolutionStatus

    The general status for all analyses applied to the folder created.

    hashtag
    struct AnalyseResolution

    Contains the results for single analyses.

    hashtag
    enum GeometryType

    Frame shape settings.

    hashtag
    enum LicenseError

    Possible license errors.

    hashtag
    enum Connection

    The authorization type.

    hashtag
    struct UploadMediaSettings

    Defines the settings for the repeated media upload.

    Parameter
    Type
    Description

    hashtag
    enum SizeReductionStrategy

    Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully. By default, the system uploads the compressed video.

    hashtag
    sslPin

    Contains information about the .

    hashtag
    Data Container

    The methods below apply to the new data container feature that has been implemented in 8.22.

    hashtag
    addContainer

    This method replaces addAnalysis in the AnalysisRequest structure when you use the data container flow.

    Input

    hashtag
    createMediaCaptureScreen

    Captures a media file with all the information you need and packages it into a data container.

    Input

    Output

    hashtag
    public data class CaptureRequest

    Detects a request for video capture.

    hashtag
    public data class AnalysisProfile

    Contains information on media files and analyses that should be applied to them.

    hashtag
    public sealed class MediaRequest

    Stores information about a media file.

    circle-exclamation

    Please note: add either actionMedia or userMedia; these parameters are mutually exclusive.

    Sets the number of attempts and timeout between them

    Toolbar title text color

    backgroundColor

    UIColor

    Toolbar background color

    titleText

    String

    Text on the toolbar

    Center hint vertical position from the screen top (in %, 0-100)

    hideTextBackground

    Bool

    Hides text background

    backgroundCornerRadius

    Int

    Center hint background frame corner radius

    Frame color when a face is aligned properly

    strokeWidth

    CGFloat

    Frame stroke width (in dp, 0-20)

    strokePadding

    CGFloat

    A padding from the stroke to the face alignment area (in dp, 0-10)

    Antiscam message text color

    customizationAntiscamBackgroundColor

    UIColor

    Antiscam message text background color

    customizationAntiscamCornerRadius

    CGFloat

    Background frame corner radius

    customizationAntiscamFlashColor

    UIColor

    Color of the flashing indicator close to the antiscam message

    Additional configuration

    Head turned left

    right

    Head turned right

    down

    Head tilted downwards

    up

    Head lifted up

    Spanish

    pt-BR

    Portuguese (Brazilian)

    custom(String)

    Custom language (language ISO 639-1 code, two letters)

    URL of the Liveness video

    bestShotURL

    URL

    URL of the best shot in PNG

    preferredMediaURL

    URL

    URL of the API media container

    timestamp

    Date

    Timestamp for the check completion

    The Liveness check can't be performed: attempts limit exceeded

    failedBecausePreparingTimout

    The Liveness check can't be performed: face alignment timeout

    failedBecauseOfLowMemory

    The Liveness check can't be performed: no memory left

    Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully

    params (optional)

    String

    Additional parameters

    Object uploading status

    Resulting score

    mediaType

    String

    Media file type: VIDEO / IMAGE / SHOT_SET

    media

    Media that is being analyzed

    error

    AnalysisError (inherits from Error)

    Error

    Analysis identifier

    error

    AnalysisError (inherits from Error)

    Error

    resultMedia

    []

    Results of the analysis for single media files

    confidenceScore

    Float

    The resulting score

    serverRawResponse

    String

    Server response

    Everything went fine, the check succeeded (e.g., faces match or liveness confirmed)

    OPERATOR_REQUIRED

    The result should be additionally checked by a human operator

    The result of the check performed

    Parameter

    Type

    Description

    licenseSources

    [LicenseSource]

    The source of the license

    Parameter

    Type

    Description

    licenseSource

    LicenseSource

    Source of the license

    Parameter

    Type

    Description

    apiConnection

    Connection

    Authorization parameters

    Parameter

    Type

    Description

    eventsConnection

    Connection

    Telemetry authorization parameters

    Parameter

    Type

    Description

    delegate

    OZLivenessDelegate

    The delegate for Oz Liveness

    actions

    OzVerificationMovement

    Captured action

    cameraPosition (optional)

    AVCaptureDevice.Position

    front – front camera (default), back – rear camera

    Parameter

    Type

    Description

    actions

    OzVerificationMovement

    Captured action

    FaceCaptureCompletion

    type alias used as follows:

    public typealias FaceCaptureCompletion = (_ results: [OZMedia]?, _ error: OZVerificationStatus?) -> Void

The handler that is executed when the method completes. The closure receives either an array of OZMedia objects or an error.

    cameraPosition (optional)

    AVCaptureDevice.Position

    front – front camera (default), back – rear camera

    Parameter

    Type

    Description

    languageBundle

    Bundle

    The bundle that contains translations

    Parameter

    Type

    Description

    selfieLength

    Int

    The length of the Selfie gesture (in milliseconds). Should be within 500-5000 ms, the default length is 700

    Parameter

    Type

    Description

    media

    OZMedia

    An array of media files

    folderMeta

    [String]

    Additional folder metadata

    Parameter

    Type

    Description

    localizationCode

    OZLocalizationCode

    The localization code

    Parameter

    Type

    Description

    host

    String

    Host address

    Parameter

    Type

    Description

    singleCount

    Int

    Attempts on a single action/gesture

    commonCount

    Int

    Total number of attempts on all actions/gestures if you use a sequence of them

    faceAlignmentTimeout

    Float

    Time needed to align face into frame

    uploadMediaSettings

    Parameter

    Type

    Description

    version

    String

    Version number

    Parameter

    Type

    Description

    results

    [OzMedia]

    An array of the OzMedia objects.

    Parameter

    Type

    Description

    status

    OZVerificationStatus

    The error description.

    Parameter

    Type

    Description

    folderId (optional)

    String

Set this identifier when you need to upload media to a certain folder.

    Parameter

    Type

    Description

    analysis

    Analysis

    A structure containing information on the analyses required.

    Parameter

    Type

    Description

    media

    OZMedia

    Media or an array of media objects to be uploaded.

    Parameter

    Type

    Description

    folderId

    String

    The folder identifier.

    Parameter

    Type

    Description

    meta

    [String]

    An array of metadata as follows:

    ["meta1": "data1"]

    Parameter

    Type

    Description

    statusHandler

    A callback function as follows:

    statusHandler: @escaping ((_ status: RequestStatus) -> Void)

    The handler that is executed when the scenario state changes

    errorHandler

    A callback function as follows:

    errorHandler: @escaping ((_ error: Error) -> Void)

    Error handler

    completionHandler

    A callback function as follows:

    completionHandler: @escaping (_ results : RequestResult) -> Void)

    The handler that is executed when the run method completes.

    Parameter

    Type

    Description

    closeButtonIcon

    UIImage

    An image for the close button

    closeButtonColor

    UIColor

    Close button tintColor

    titleFont

    UIFont

    Toolbar title text font

    titleColor

    Parameter

    Type

    Description

    textFont

    UIFont

    Center hint text font

    textColor

    UIColor

    Center hint text color

    backgroundColor

    UIColor

    Center hint text background

    verticalPosition

    Parameter

    Type

    Description

    hideAnimation

    Bool

A switcher for the hint animation; if True, the animation is hidden

    animationIconSize

CGFloat

    A side size of the animation icon square

    hintGradientColor

    UIColor

    The close-to-frame gradient color

    Parameter

    Type

    Description

    geometryType

    GeometryType

    The frame type: oval, rectangle, circle, or square

    cornerRadius

    CGFloat

    Rectangle corner radius (in dp)

    strokeFaceNotAlignedColor

    UIColor

    Frame color when a face is not aligned properly

    strokeFaceAlignedColor

    Parameter

    Type

    Description

    backgroundColor

    UIColor

    Background color

    Parameter

    Type

    Description

    textFont

    UIFont

    SDK version text font

    textColor

    UIColor

    SDK version text color

    Parameter

    Type

    Description

    customizationEnableAntiscam

    Bool

    Adds the antiscam message

    customizationAntiscamTextMessage

    String

    Antiscam message text

    customizationAntiscamTextFont

    UIFont

    Antiscam message text font

    customizationAntiscamTextColor

    Parameter

    Type

    Description

    image

    UIImage

    Logo image

    size

    CGSize

    Logo size (in dp)

    verticalPosition

    Int (0-100), default: 100

    Vertical offset

    horizontalPosition

    Int (0-100), default: 0

    Horizontal offset

    Case

    Description

    licenseFilePath

    An absolute path to a license (String)

    licenseFileName

    The name of the license file

    Parameter

    Type

    Description

    appIDS

    [String]

    An array of bundle IDs

    expires

    TimeInterval

    The expiration interval

    features

    Features

    License features

    configs (optional)

    Case

    Description

    smile

    Smile

    eyes

    Blink

    scanning

    Scan

    selfie

    A selfie with face alignment check

    one_shot

    The best shot from the video taken

    Case

    Description

    en

    English

    hy

    Armenian

    kk

    Kazakh

    ky

    Kyrgyz

    tr

    Turkish

    Parameter

    Type

    Description

    movement

    OZVerificationMovement

    User action type

    mediaType

    MediaType

    Type of media

    metaData

    [String] as follows:

    ["meta1": "data1"]

    Metadata if any

    videoURL

    Case

    Description

    movement

    A media with an action

    documentBack

    The back side of the document

    documentFront

    The front side of the document

    Case

    Description

    userNotProcessed

    The Liveness check was not processed

    failedBecauseUserCancelled

The check was interrupted by the user

    failedBecauseCameraPermissionDenied

    The Liveness check can't be performed: no camera access

    failedBecauseOfBackgroundMode

    The Liveness check can't be performed: background mode

    failedBecauseOfTimeout

    The Liveness check can't be performed: timeout

    Parameter

    Type

    Description

    media

    [OzMedia]

    An array of the OzMedia objects

    type

    AnalysisType

    The type of the analysis

    mode

    AnalysisMode

    The mode of the analysis

    sizeReductionStrategy

    Case

    Description

    biometry

The algorithm that compares several media and checks whether they show the same person

    quality

    The algorithm that aims to check whether a person in a video is a real human acting in good faith, not a fake of any kind.

    document (deprecated)

    The analysis that aims to recognize the document and check if its fields are correct according to its type.

    blacklist

    The analysis that compares a face on a captured media with faces from the pre-made media database.

    Case

    Description

    onDevice

    The on-device analysis with no server needed. We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results

    serverBased

    The server-based analysis

    hybrid

    The hybrid analysis for Liveness: if the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.

    Case

    Description

    addToFolder

    The system is creating a folder and adding files to this folder

    addAnalyses

    The system is adding analyses

    waitAnalysisResult

    The system is waiting for the result

    Parameter

    Type

    Description

    media

    OzMedia

    The object that is being uploaded at the moment

    index

    Int

    Number of this object in a list

    from

    Int

    Objects quantity

    progress

    Parameter

    Type

    Description

    status

    ScenarioState

    Processing analysis status

    progressStatus

    AnalysisStatus

    Media uploading status

    Parameter

    Type

    Description

    resolution

    AnalysisResolutionStatus

    Consolidated analysis result

    sourceId

    String

    Media identifier

    isOnDevice

    Bool

    Analysis mode

    confidenceScore

    Parameter

    Type

    Description

    resolution

    AnalysisResolutionStatus

    Consolidated analysis result

    folderId

    String

    Folder identifier

    analysisResults

    [AnalysisResult]

    A list of analysis results

    Parameter

    Type

    Description

    resolution

    AnalyseResolutionStatus

    Analysis resolution

    type

    AnalysisType

    Analysis type

    mode

    AnalysisMode

    Analysis mode

    analysisId

    Case

    Description

    INITIAL

    No analyses have been applied yet

    PROCESSING

    The analyses are in progress

    FAILED

    One or more analyses failed due to some error and couldn't get finished

    FINISHED

    The analyses are finished

    DECLINED

    The check failed (e.g., faces don't match or some spoofing attack detected)

    Parameter

    Type

    Description

    analyseResolutionStatus

    AnalyseResolutionStatus

    The analysis status

    type

    AnalysisType

    The analysis type

    folderID

    String

    The folder identifier

    score

    Case

    Description

    oval

    Oval frame

    rectangle(cornerRadius: CGFloat)

    Rectangular frame (with corner radius)

    circle

    Circular frame

    square(cornerRadius: CGFloat)

    Square frame (with corner radius)

    Case

    Description

    licenseFileNotFound

    The license is not found

    licenseParseError

    Cannot parse the license file, the license might be invalid

    licenseBundleError

The bundle_id in the license file doesn't match the bundle_id used.

    licenseExpired

    The license is expired

    Case

    Description

    fromServiceToken

    Authorization with a token:

    • host: String

    • token: String

    • pins (optional): a list of sslPin

    fromCredentials

    Authorization with credentials:

    • host: String

    • login: String

    • password: String

• pins (optional): a list of sslPin

    attemptsCount

    Int

    Number of attempts for media upload

    attemptsTimeout

    Int

    Timeout between attempts

    uploadOriginal

    The original video

    uploadCompressed

    The compressed video

    uploadBestShot

    The best shot taken from the video

    uploadNothing

    Nothing is sent (note that no folder will be created)

    Parameter

    Type

    Description

    publicKeyHash

    String

    SHA256 key hash in base64

    expiration date

    UNIX timestamp, UTC time

    The date of certificate expiration

    Parameter

    Type

    Description

    OzDataContainer

    bytearray[]

    An encrypted file containing media and collateral info, the output of the createMediaCaptureScreen method

    Parameter

    Type

    Description

    request

    CaptureRequest

    Detects a request for video capture

    session_token

    String

    Stores additional information to protect against replay attacks

    Parameter

    Type

    Description

    OzDataContainer

    bytearray[]

    An encrypted file containing media and collateral info

    Parameter

    Type

    Description

    analysisProfileList

    List<AnalysisProfile>

    A list of objects that contain information on media and analyses that should be applied to them

    folderMeta (optional)

    Map<String, Any>

    Additional folder metadata

    additionalMediaList (optional)

    List<MediaRequest>

Media files that you need to upload to the server but that are not required for the analyses

    cameraPosition (optional)

    String

    front (default) – front camera

    back – rear camera

    Parameter

    Type

    Description

    mediaList

    List<MediaRequest>

    A list of media to be analyzed

    type

    AnalysisType

    Analysis type

    params (optional)

    Map<String, Any>

    Additional analysis parameters

    Parameter

    Type

    Description

    id

    String (UUID v4)

    Media ID

    actionMedia

    OzVerificationMovement

    An action that user should perform in a video

    userMedia

    OZMedia

    An external media file, e.g., a reference or a document photo

    LicenseError
ISO 639-1
    whitelisted certificates
    OzCapsula data container

    UIColor

    Int

    UIColor

    UIColor

    ABTestingConfigs

    left

    es

    URL

    failedBecauseOfAttemptLimit

    Progress

    Float

    String

    SUCCESS

    Float

    sslPin
    UploadMediaSettings
    SizeReductionStrategy
    OZMedia
    ResultMedia

    Android SDK Methods and Properties

    hashtag
    OzLivenessSDK

    A singleton for Oz SDK.

    hashtag
    clearActionVideos

Deletes all action videos from the file system.

    Parameters

    -

    Returns

    -

    hashtag
    createStartIntent

    Creates an intent to start the Liveness activity.

    Returns

    -

    hashtag
    getErrorFromIntent

    Utility function to get the SDK error from OnActivityResult's intent.

    Returns

The SDK error (String).

    hashtag
    getLicensePayload

    Retrieves the SDK license payload.

    Parameters

    -

    Returns

The license payload (LicensePayload) – the object that contains the extended info about licensing conditions.

    hashtag
    getResultFromIntent

    Utility function to get SDK results from OnActivityResult's intent.

    Returns

    A list of OzAbstractMedia objects.
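The three intent helpers above (createStartIntent, getErrorFromIntent, getResultFromIntent) fit together as follows. This is a minimal sketch: the activity-result plumbing and the request code are assumptions, not part of the SDK; only the documented SDK calls themselves are taken from this reference.

```kotlin
class LivenessHostActivity : AppCompatActivity() {

    private fun startLiveness() {
        // Build an intent for the Liveness activity with the required gesture(s)
        val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Smile))
        startActivityForResult(intent, REQUEST_CODE_LIVENESS)
    }

    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == REQUEST_CODE_LIVENESS && data != null) {
            val sdkError = OzLivenessSDK.getErrorFromIntent(data)  // String, if any
            val media = OzLivenessSDK.getResultFromIntent(data)    // List of OzAbstractMedia
            if (sdkError == null && media != null) {
                // pass the captured media on to an AnalysisRequest
            }
        }
    }

    companion object { private const val REQUEST_CODE_LIVENESS = 4021 }
}
```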

    hashtag
    init

    Initializes SDK with license sources.

    Returns

    -

    hashtag
    log

    Enables logging using the Oz Liveness SDK logging mechanism.

    Returns

    -

    hashtag
    setApiConnection

    Connection to API.

    hashtag
    setEventsConnection

    Connection to the telemetry server.
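Taken together, init, setApiConnection, and setEventsConnection form the startup sequence. A sketch, assuming a license shipped with the app and token-based authorization; the host, token, and listener bodies are placeholders:

```kotlin
// Initialize the SDK with a license source (asset ID or file path)
OzLivenessSDK.init(
    context,
    listOf(LicenseSource.LicenseAssetId("license.json")) // or LicenseSource.LicenseFilePath(path)
)

// Token-based connection; use OzConnection.fromCredentials(host, login, password) otherwise
val connection = OzConnection.fromServiceToken(
    host = "https://api.example.com", // placeholder host
    token = serviceToken
)
OzLivenessSDK.setApiConnection(connection) { /* StatusListener<String?>: check the auth result */ }
OzLivenessSDK.setEventsConnection(connection) { /* telemetry auth result */ }
```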

    hashtag
    logout

    Deletes the saved token.

    Parameters

    -

    Returns

    -

    hashtag
    getEventSessionId

    Retrieves the telemetry session ID.

    Parameters

    -

    Returns

    The telemetry session ID (String parameter).

    hashtag
    version

    Retrieves the SDK version.

    Parameters

    -

    Returns

    The SDK version (String parameter).

    hashtag
    generateSignedPayload

    Generates the payload with media signatures.

    Returns

    Payload to be sent along with media files that were used for generation.

    hashtag
    AnalysisRequest

    A class for performing checks.

    hashtag
    run

    The analysis launching method.

    hashtag
    class Builder

    A builder class for AnalysisRequest.

    hashtag
    build

    Creates the AnalysisRequest instance.

    Parameters

    -

    Returns

    The class instance.

    hashtag
    addAnalysis

    Adds an analysis to your request.

    Returns

    Error if any.

    hashtag
    addAnalyses

    Adds a list of analyses to your request. Allows executing several analyses for the same folder on the server side.

    Returns

    Error if any.

    hashtag
    addFolderMeta

Adds metadata to a folder you create (for the server-based analyses only). You can add a key-value pair as additional information to the folder with the analysis result on the server side.

    Returns

    Error if any.

    hashtag
    uploadMedia

    Uploads one or more media to a folder.

    Returns

    Error if any.

    hashtag
    setFolderId

Sets the folderId for a previously created folder. The folder must already exist on the server; otherwise, a new folder is created.

    Returns

    Error if any.
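The Builder methods above chain together as shown in this sketch. `capturedMedia` stands for the list returned by getResultFromIntent; the Analysis constructor's argument order is an assumption based on the parameter tables in this section, and the callback wiring follows the signatures shown later in it:

```kotlin
val request = AnalysisRequest.Builder()
    .addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, capturedMedia))
    .addFolderMeta("client_id", "12345")   // key-value metadata, server-based only
    .build()

request.run(
    onStatusChange = { status -> /* AnalysisRequest.AnalysisStatus: upload/processing progress */ },
    onError = { error -> /* OzException */ },
    onSuccess = { result -> /* RequestResult: inspect resolution and analysisResults */ }
)
```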

    hashtag
    OzConfig

    Configuration for OzLivenessSDK (use OzLivenessSDK.config).

    hashtag
    setSelfieLength

    Sets the length of the Selfie gesture (in milliseconds).

    Returns

    Error if any.

    hashtag
    allowDebugVisualization

Allows enabling additional debug info by clicking on the version text.

    hashtag
    attemptSettings

The number of attempts before the SDK returns an error.

    hashtag
    uploadMediaSettings

    Settings for repeated media upload.

    hashtag
    faceAlignmentTimeout

    Timeout for face alignment (measured in milliseconds).

    hashtag
    livenessErrorCallback

Interface implementation to retrieve errors from Liveness detection.

    hashtag
    localizationCode

    Locale to display string resources.

    hashtag
    logging

    Logging settings.

    hashtag
    useMainCamera

    Uses the main (rear) camera instead of the front camera for liveness detection.

    hashtag
    disableFramesCountValidation

Disables the check that prevents videos from being too short (3 frames or fewer).
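The OzConfig properties above can be set in one place. A configuration sketch: the values are examples only, and the OzAttemptsSettings / OzUploadMediaSettings field names are assumptions drawn from the parameter tables in this section:

```kotlin
OzLivenessSDK.config.apply {
    attemptSettings = OzAttemptsSettings(singleCount = 3, commonCount = 5)
    uploadMediaSettings = OzUploadMediaSettings(attemptsCount = 3, attemptsTimeout = 1000)
    faceAlignmentTimeout = 10_000L          // milliseconds
    localizationCode = OzLocalizationCode.EN
    useMainCamera = false                   // front camera
    disableFramesCountValidation = false    // keep the short-video check on
}
```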

    hashtag
    UICustomization

    Customization for OzLivenessSDK (use OzLivenessSDK.config.customization).

    hashtag
    hideStatusBar

    Hides the status bar and the three buttons at the bottom. The default value is True.

    hashtag
    toolbarCustomization

    A set of customization parameters for the toolbar.

    hashtag
    centerHintCustomization

    A set of customization parameters for the center hint that guides a user through the process of taking an image of themselves.

    hashtag
    hintAnimation

    A set of customization parameters for the hint animation.

    hashtag
    faceFrameCustomization

    A set of customization parameters for the frame around the user face.

    hashtag
    backgroundCustomization

    A set of customization parameters for the background outside the frame.

    hashtag
    versionTextCustomization

    A set of customization parameters for the SDK version text.

    hashtag
    antiscamCustomization

    A set of customization parameters for the antiscam message that warns user about their actions being recorded.

    hashtag
    logoCustomization

Logo customization parameters. A custom logo must be allowed by the license. By default, the logo is placed at the bottom left.
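The customization groups above are applied through OzLivenessSDK.config.customization. A sketch: the wrapper class names and constructor parameters are assumptions drawn from the parameter tables later in this section:

```kotlin
OzLivenessSDK.config.customization.apply {
    hideStatusBar = true
    toolbarCustomization = ToolbarCustomization(
        title = "Liveness check",
        titleTextSize = 16,        // sp, 12-18
        isTitleCentered = true
    )
    faceFrameCustomization = FaceFrameCustomization(
        geometryType = GeometryType.OVAL,
        strokeWidth = 4            // dp, 0-20
    )
}
```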

    hashtag
    Variables and Objects

    hashtag
    enum OzAction

    Contains the action from the captured video.

    hashtag
    class LicensePayload

    Contains the extended info about licensing conditions.

    hashtag
    sealed class OzAbstractMedia

    A class for the captured media that can be:

    hashtag
    OzDocumentPhoto

    A document photo.

    hashtag
    OzShotSet

    A set of shots in an archive.

    hashtag
    OzVideo

    A Liveness video.

    hashtag
    enum OzMediaTag

    Contains an action from the captured video.

    hashtag
    sealed class LicenseSource

A class for the license source that can be:

    hashtag
    LicenseAssetId

    Contains the license ID.

    hashtag
    LicenseFilePath

    Contains the path to a license.

    hashtag
    class AnalysisStatus

    A class for analysis status that can be:

    hashtag
    RunningAnalysis

    This status means the analysis is launched.

    hashtag
    UploadingMedia

    This status means the media is being uploaded.

    hashtag
    enum Type

    The type of the analysis.

    circle-info

    Currently, the DOCUMENTS analysis can't be performed in the on-device mode.

    hashtag
    enum Mode

    The mode of the analysis.

    circle-info

    We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results.

    hashtag
    class Analysis

    Contains information on what media to analyze and what analyses to apply.

    hashtag
    enum Resolution

    The general status for all analyses applied to the folder created.

    hashtag
    class OzAttemptsSettings

Holder for the number of attempts before the SDK returns an error.

    hashtag
    enum OzLocalizationCode

Contains the locale code according to ISO 639-1.

    hashtag
    class OzLogging

    Contains logging settings.

    hashtag
    sealed class Color

    A class for color that can be (depending on the value received):

    hashtag
    ColorRes

    hashtag
    ColorHex

    hashtag
    ColorInt

    hashtag
    enum GeometryType

    Frame shape settings.

    hashtag
    class AnalysisError

    Exception class for AnalysisRequest.

    hashtag
    class SourceMedia

    Structure that describes media used in AnalysisRequest.

    hashtag
    class ResultMedia

    Structure that describes the analysis result for the single media.

    hashtag
    class RequestResult

    Consolidated result for all analyses performed.

    hashtag
    class AnalysisResult

    Result of the analysis for all media it was applied to.

    hashtag
    class OzConnection

    Defines the authentication method.

    hashtag
    OzConnection.fromServiceToken

    Authentication via token.

    hashtag
    OzConnection.fromCredentials

    Authentication via credentials.

    hashtag
    class OzUploadMediaSettings

    Defines the settings for the repeated media upload.

    hashtag
    enum SizeReductionStrategy

    Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully. By default, the system uploads the compressed video.

    hashtag
    sslPin

Contains information about the whitelisted certificates.

    hashtag
    Data Container

The methods below apply to the OzCapsula data container, a new feature implemented in 8.22.

    hashtag
    addContainer

    This method replaces addAnalysis in the AnalysisRequest structure when you use the data container flow.

    Input

    hashtag
    createMediaCaptureScreen

Captures a media file with all the information you need and packages it into a data container.

    Input

    Output

    hashtag
    public data class CaptureRequest

    Detects a request for video capture.

    hashtag
    public data class AnalysisProfile

    Contains information on media files and analyses that should be applied to them.

    hashtag
    public sealed class MediaRequest

    Stores information about a media file.

    circle-exclamation

Please note: add either actionMedia or userMedia; these parameters are mutually exclusive.
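The data container flow described above can be sketched as: describe the capture in an AnalysisProfile, receive an encrypted OzDataContainer from createMediaCaptureScreen, and hand it to AnalysisRequest via addContainer. The MediaRequest construction and the movement case name are illustrative; as noted, set either actionMedia or userMedia, never both:

```kotlin
val profile = AnalysisProfile(
    mediaList = listOf(
        MediaRequest(id = "media-1", actionMedia = OzVerificationMovement.Selfie)
    ),
    type = AnalysisType.QUALITY
)

// Capture media and package it into an encrypted container
val container = createMediaCaptureScreen(
    analysisProfileList = listOf(profile),
    folderMeta = mapOf("client_id" to "12345")
)

AnalysisRequest.Builder()
    .addContainer(container)   // replaces addAnalysis in the container flow
    .build()
    .run(onStatusChange = { }, onError = { }, onSuccess = { })
```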

    hashtag
    Error Description

    Toolbar title text font style

    titleTextSize

    Int

    Toolbar title text size (in sp, 12-18)

    titleTextAlpha

    Int

    Toolbar title text opacity (in %, 0-100)

    titleTextColor

    Toolbar title text color

    backgroundColor

    Toolbar background color

    backgroundAlpha

    Int

    Toolbar background opacity (in %, 0-100)

    isTitleCentered

    Boolean

    Defines whether the text on the toolbar is centered or not

    title

    String

    Text on the toolbar

    Center hint text color

    textAlpha

    Int

    Center hint text opacity (in %, 0-100)

    verticalPosition

    Int

    Center hint vertical position from the screen bottom (in %, 0-100)

    backgroundColor

    Center hint background color

    backgroundOpacity

    Int

    Center hint background opacity

    backgroundCornerRadius

    Int

    Center hint background frame corner radius (in dp, 0-20)

A switcher for the hint animation; if True, the animation is hidden

    Frame color when a face is aligned properly

    strokeAlpha

    Int

    Frame opacity (in %, 0-100)

    strokeWidth

    Int

    Frame stroke width (in dp, 0-20)

    strokePadding

    Int

    A padding from the stroke to the face alignment area (in dp, 0-10)

    SDK version text opacity (in %, 20-100)

    Antiscam message text color

    textAlpha

    Int

    Antiscam message text opacity (in %, 0-100)

    backgroundColor

    Antiscam message background color

    backgroundOpacity

    Int

    Antiscam message background opacity

    cornerRadius

    Int

    Background frame corner radius (in px, 0-20)

    flashColor

    Color of the flashing indicator close to the antiscam message

    Head tilted downwards

    HeadUp

    Head lifted up

    EyeBlink

    Blink

    Smile

    Smile

    Media metadata

    Media metadata

    URL of the API media container

    additionalTags (optional)

    String

    Additional tags if needed (including those not from the OzMediaTag enum)

    metaData

    Map<String, String>

    Media metadata

    A video with the smile gesture

    VideoSelfieHigh

    A video with the lifting head up gesture

    VideoSelfieDown

    A video with the tilting head downwards gesture

    VideoSelfieRight

    A video with the turning head right gesture

    VideoSelfieLeft

    A video with the turning head left gesture

    PhotoIdPortrait

    A photo from a document

    PhotoIdBack

    A photo of the back side of the document

    PhotoIdFront

    A photo of the front side of the document

    Completion percentage

    Additional parameters

    sizeReductionStrategy

    Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully

    Spanish

    PT-BR

    Portuguese (Brazilian)

    Media object

    tags

    List<String>

    Tags for media

    Source media

    type

    Type of the analysis

    A list of results of the analyses for single media

    confidenceScore

    Float

    Resulting score

    analysisId

    String

    Analysis identifier

    params

    @RawValue Map<String, Any>

    Additional folder parameters

    error

    Error if any

    serverRawResponse

    String

    Response from backend

    Whitelisted certificates

No face found in a video

    FORCE_CLOSED = 7

Error. The Liveness activity was force-closed from the client application.

    A user closed the Liveness screen during video recording

    DEVICE_HAS_NO_FRONT_CAMERA = 8

Error. The device has no front camera.

    No front camera found

    DEVICE_HAS_NO_MAIN_CAMERA = 9

Error. The device has no main camera.

    No rear camera found

    DEVICE_CAMERA_CONFIGURATION_NOT_SUPPORTED = 10

    Error. Device camera configuration is not supported.

    Oz Liveness doesn't support the camera configuration of the device

    FACE_ALIGNMENT_TIMEOUT = 12

Error. Face alignment timed out after OzLivenessSDK.config.faceAlignmentTimeout milliseconds.

Time limit for the face alignment is exceeded

    ERROR = 13

The check was interrupted by the user

    User has closed the screen during the Liveness check.

    Parameter

    Type

    Description

    actions

    OzAction

    A list of possible actions

    Parameter

    Type

    Description

    data

    Intent

    The object to test

    Parameter

    Type

    Description

    data

    Intent

    The object to test

    Parameter

    Type

    Description

    context

    Context

    The Context class

    licenseSources

    [LicenseSource]

    A list of license references

    statusListener

    StatusListener

    Optional listener to check the license load result

    Parameter

    Type

    Description

    tag

    String

    Message tag

    log

    String

    Message log

    Parameter

    Type

    Description

    connection

    OzConnection

    Connection type

    statusListener

    StatusListener<String?>

    Listener

    Parameter

    Type

    Description

    connection

    OzConnection

    Connection type

    statusListener

    StatusListener<String?>

    Listener

    Parameter

    Type

    Description

    media

    OzAbstractMedia

    An array of media files

    folderMeta (optional)

    [string:any]

    Additional folder metadata

    Parameter

    Type

    Description

    onStatusChange

    A callback function as follows:

    onStatusChange(status: AnalysisRequest.AnalysisStatus) { handleStatus() }

    The function is executed when the status of the AnalysisRequest changes.

    onError

    A callback function as follows:

    onError(error: OzException) { handleError() }

    The function is executed in case of errors.

    onSuccess

    A callback function as follows:

    onSuccess(result: RequestResult) {

    handleResults() }

    The function is executed when all the analyses are completed.

    Parameter

    Type

    Description

    analysis

    Analysis

    A structure for analysis

    Parameter

    Type

    Description

    analysis

    [Analysis]

    A list of Analysis structures

    Parameter

    Type

    Description

    key

    String

    Key for metadata.

    value

    String

    Value for metadata.

    Parameter

    Type

    Description

    mediaList

    [OzAbstractMedia]

    An OzAbstractMedia object or a list of objects.

    Parameter

    Type

    Description

    folderID

    String

    A folder identifier.

    Parameter

    Type

    Description

    selfieLength

    Int

    The length of the Selfie gesture (in milliseconds). Should be within 500-5000 ms, the default length is 700

    Parameter

    Type

    Description

    allowDebugVisualization

    Boolean

    Enables or disables the debug info.

| Parameter | Type | Description |
| --- | --- | --- |
| attemptsSettings | OzAttemptsSettings | Sets the number of attempts |

| Parameter | Type | Description |
| --- | --- | --- |
| uploadMediaSettings | OzUploadMediaSettings | Sets the number of attempts and timeout between them |

| Parameter | Type | Description |
| --- | --- | --- |
| faceAlignmentTimeout | Long | A timeout value |

| Parameter | Type | Description |
| --- | --- | --- |
| livenessErrorCallback | ErrorHandler | A callback value |

| Parameter | Type | Description |
| --- | --- | --- |
| localizationCode | OzLocalizationCode | A locale code |

| Parameter | Type | Description |
| --- | --- | --- |
| logging | OzLogging | Logging settings |

| Parameter | Type | Description |
| --- | --- | --- |
| useMainCamera | Boolean | True – rear camera, False – front camera |

| Parameter | Type | Description |
| --- | --- | --- |
| disableFramesCountValidation | Boolean | True – validation is off, False – validation is on |

| Parameter | Type | Description |
| --- | --- | --- |
| closeIconRes | Int (@DrawableRes) | An image for the close button |
| closeIconTint | Color | Close button color |
| titleTextFont | Int (@FontRes) | Toolbar title text font |
| titleTextFontStyle | Int (values from android.graphics.Typeface properties, e.g., Typeface.BOLD) | Toolbar title text font style |

| Parameter | Type | Description |
| --- | --- | --- |
| textFont | String | Center hint text font |
| textStyle | Int (values from android.graphics.Typeface properties, e.g., Typeface.BOLD) | Center hint text style |
| textSize | Int | Center hint text size (in sp, 12-34) |
| textColor | Color | Center hint text color |

| Parameter | Type | Description |
| --- | --- | --- |
| hintGradientColor | Color | Gradient color |
| hintGradientOpacity | Int | Gradient opacity |
| animationIconSize | Int | A side size of the animation icon square |
| hideAnimation | Boolean | Hides the animation |

| Parameter | Type | Description |
| --- | --- | --- |
| geometryType | GeometryType | The frame type: oval, rectangle, circle, square |
| cornerRadius | Int | Rectangle corner radius (in dp, 0-20) |
| strokeDefaultColor | Color | Frame color when a face is not aligned properly |
| strokeFaceInFrameColor | Color | Frame color when a face is aligned properly |

| Parameter | Type | Description |
| --- | --- | --- |
| backgroundColor | Color | Background color |
| backgroundAlpha | Int | Background opacity (in %, 0-100) |

| Parameter | Type | Description |
| --- | --- | --- |
| textFont | Int (@FontRes) | SDK version text font |
| textSize | Int | SDK version text size (in sp, 12-16) |
| textColor | Color | SDK version text color |
| textAlpha | Int | SDK version text opacity |

| Parameter | Type | Description |
| --- | --- | --- |
| textMessage | String | Antiscam message text |
| textFont | String | Antiscam message text font |
| textSize | Int | Antiscam message text size (in px, 12-18) |
| textColor | Color | Antiscam message text color |

| Parameter | Type | Description |
| --- | --- | --- |
| image | Bitmap (@DrawableRes) | Logo image |
| size | Size | Logo size (in dp) |
| verticalPosition | Int (0-100), default: 100 | Vertical offset |
| horizontalPosition | Int (0-100), default: 0 | Horizontal offset |

| Case | Description |
| --- | --- |
| OneShot | The best shot from the video taken |
| Blank | A selfie with face alignment check |
| Scan | Scan |
| HeadRight | Head turned right |
| HeadLeft | Head turned left |
| HeadDown | Head tilted down |

| Parameter | Type | Description |
| --- | --- | --- |
| expires | Float | The expiration interval |
| features | Features | License features |
| appIDS | [String] | An array of bundle IDs |

| Parameter | Type | Description |
| --- | --- | --- |
| tag | OzMediaTag | A tag for a document photo. |
| photoPath | String | An absolute path to a photo. |
| additionalTags (optional) | String | Additional tags if needed (including those not from the OzMediaTag enum). |
| metaData | Map<String, String> | Additional metadata if needed. |

| Parameter | Type | Description |
| --- | --- | --- |
| tag | OzMediaTag | A tag for a shot set |
| archivePath | String | A path to an archive |
| additionalTags (optional) | String | Additional tags if needed (including those not from the OzMediaTag enum) |
| metaData | Map<String, String> | Additional metadata if needed |

| Parameter | Type | Description |
| --- | --- | --- |
| tag | OzMediaTag | A tag for a video |
| videoPath | String | A path to a video |
| bestShotPath (optional) | String | URL of the best shot in PNG |
| preferredMediaPath (optional) | String | A path to the preferred media file |

| Case | Description |
| --- | --- |
| Blank | A video with no gesture |
| PhotoSelfie | A selfie photo |
| VideoSelfieOneShot | A video with the best shot taken |
| VideoSelfieScan | A video with the scanning gesture |
| VideoSelfieEyes | A video with the blink gesture |
| VideoSelfieSmile | A video with the smile gesture |

| Parameter | Type | Description |
| --- | --- | --- |
| id | Int | License ID |

| Parameter | Type | Description |
| --- | --- | --- |
| path | String | An absolute path to a license |

| Parameter | Type | Description |
| --- | --- | --- |
| analysis | Analysis | Contains information on what media to analyze and what analyses to apply. |

| Parameter | Type | Description |
| --- | --- | --- |
| media | OzAbstractMedia | The object that is being uploaded at the moment |
| index | Int | Number of this object in a list |
| from | Int | Objects quantity |
| percentage | Int | Upload progress (in percent) |

| Case | Description |
| --- | --- |
| BIOMETRY | The algorithm that compares several media and checks whether they show the same person |
| QUALITY | The algorithm that checks whether a person in a video is a real human acting in good faith, not a fake of any kind |
| DOCUMENTS (deprecated) | The analysis that recognizes a document and checks whether its fields are correct according to its type |

| Case | Description |
| --- | --- |
| ON_DEVICE | The on-device analysis with no server needed |
| SERVER_BASED | The server-based analysis |
| HYBRID | The hybrid analysis for Liveness: if the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check. |
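The HYBRID mode above boils down to a two-step decision: trust the on-device verdict when its score is low, and escalate to the server only when the score looks suspicious. A minimal plain-Java sketch of that decision; the threshold value and the method names here are illustrative assumptions, not part of the Oz SDK.

```java
public class HybridLivenessSketch {
    // Hypothetical threshold: on-device scores above it trigger a server-based recheck.
    static final double SERVER_RECHECK_THRESHOLD = 0.5;

    /** Returns true when the on-device liveness score is too high to trust locally. */
    static boolean needsServerCheck(double onDeviceScore) {
        return onDeviceScore > SERVER_RECHECK_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(needsServerCheck(0.1)); // low score: on-device verdict is final
        System.out.println(needsServerCheck(0.9)); // suspicious score: escalate to the server
    }
}
```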

| Parameter | Type | Description |
| --- | --- | --- |
| type | Type | The type of the analysis |
| mode | Mode | The mode of the analysis |
| mediaList | [OzAbstractMedia] | An array of the OzAbstractMedia objects |
| params (optional) | Map<String, Any> | Additional analysis parameters |

| Case | Description |
| --- | --- |
| FAILED | One or more analyses failed due to some error and could not be completed |
| DECLINED | The check failed (e.g., faces don't match or a spoofing attack is detected) |
| SUCCESS | Everything went fine, the check succeeded (e.g., faces match or liveness is confirmed) |
| OPERATOR_REQUIRED | The result should be additionally checked by a human operator |
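Several tables in this reference mention a "consolidated analysis result". One natural reading is worst-case aggregation over all analyses; the severity order below (FAILED worst, then DECLINED, then OPERATOR_REQUIRED, then SUCCESS) is an assumption for illustration, not the SDK's documented rule.

```java
import java.util.List;

public class ResolutionSketch {
    // Declared from worst to best, so ordinal() doubles as a severity rank.
    enum Resolution { FAILED, DECLINED, OPERATOR_REQUIRED, SUCCESS }

    /** Picks the most severe resolution among the per-analysis results. */
    static Resolution consolidate(List<Resolution> perAnalysis) {
        Resolution worst = Resolution.SUCCESS;
        for (Resolution r : perAnalysis) {
            if (r.ordinal() < worst.ordinal()) {
                worst = r;
            }
        }
        return worst;
    }
}
```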

| Parameter | Type | Description |
| --- | --- | --- |
| singleCount | Int | Attempts on a single action/gesture |
| commonCount | Int | Total number of attempts on all actions/gestures if you use a sequence of them |

| Case | Description |
| --- | --- |
| EN | English |
| ES | Spanish |
| HY | Armenian |
| KK | Kazakh |
| KY | Kyrgyz |
| TR | Turkish |

| Parameter | Type | Description |
| --- | --- | --- |
| allowDefaultLogging | Boolean | Allows logging to LogCat |
| allowFileLogging | Boolean | Allows logging to an internal file |
| journalObserver | StatusListener | An event listener to receive journal events on the application side |

| Parameter | Type | Description |
| --- | --- | --- |
| resId | Int | Link to the color in the Android resource system |

| Parameter | Type | Description |
| --- | --- | --- |
| hex | String | Color hex (e.g., #FFFFFF) |

| Parameter | Type | Description |
| --- | --- | --- |
| color | Int | The Int value of a color in Android |

| Case | Description |
| --- | --- |
| Oval | Oval frame |
| Rectangle | Rectangular frame |
| Circle | Circular frame |
| Square | Square frame |

| Parameter | Type | Description |
| --- | --- | --- |
| apiErrorCode | Int | Error code |
| message | String | Error message |

| Parameter | Type | Description |
| --- | --- | --- |
| mediaId | String | Media identifier |
| mediaType | String | Type of the media |
| originalName | String | Original media name |
| ozMedia | OzAbstractMedia | The corresponding OzAbstractMedia object |

| Parameter | Type | Description |
| --- | --- | --- |
| confidenceScore | Float | Resulting score |
| isOnDevice | Boolean | Mode of the analysis |
| resolution | Resolution | Consolidated analysis result |
| sourceMedia | List<SourceMedia> | A list of source media |

| Parameter | Type | Description |
| --- | --- | --- |
| analysisResults | List<AnalysisResult> | Analysis results |
| folderId | String | Folder identifier |
| resolution | Resolution | Consolidated analysis result |

| Parameter | Type | Description |
| --- | --- | --- |
| resolution | Resolution | Consolidated analysis result |
| type | Type | Type of the analysis |
| mode | Mode | Mode of the analysis |
| resultMedia | List<ResultMedia> | Resulting media |

| Parameter | Type | Description |
| --- | --- | --- |
| host | String | API address |
| token | String | Access token |
| sslPins (optional) | List<sslPin> | Whitelisted certificates |

| Parameter | Type | Description |
| --- | --- | --- |
| host | String | API address |
| username | String | User name |
| password | String | Password |
| sslPins (optional) | List<sslPin> | Whitelisted certificates |

| Parameter | Type | Description |
| --- | --- | --- |
| attemptsCount | Int | Number of attempts for media upload |
| attemptsTimeout | Int | Timeout between attempts |
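OzUploadMediaSettings pairs an attempt count with a timeout between attempts. The loop below sketches how such settings are typically applied; the Uploader interface and the millisecond unit for the timeout are assumptions for illustration, not the SDK's actual upload code.

```java
public class UploadRetrySketch {
    /** Hypothetical upload operation: returns true on success, false on a retryable failure. */
    interface Uploader {
        boolean tryUpload();
    }

    /** Retries tryUpload() up to attemptsCount times, waiting attemptsTimeoutMs between tries. */
    static boolean uploadWithRetries(Uploader uploader, int attemptsCount, long attemptsTimeoutMs)
            throws InterruptedException {
        for (int attempt = 1; attempt <= attemptsCount; attempt++) {
            if (uploader.tryUpload()) {
                return true;
            }
            if (attempt < attemptsCount) {
                Thread.sleep(attemptsTimeoutMs); // wait before the next attempt
            }
        }
        return false; // all attempts exhausted
    }
}
```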

| Case | Description |
| --- | --- |
| UPLOAD_ORIGINAL | The original video |
| UPLOAD_COMPRESSED | The compressed video |
| UPLOAD_BEST_SHOT | The best shot taken from the video |
| UPLOAD_NOTHING | Nothing is sent (note that no folder will be created) |

| Parameter | Type | Description |
| --- | --- | --- |
| hash | String | SHA-256 key hash in base64 |
| expiredAt | UNIX timestamp, UTC time | The date of certificate expiration |
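An sslPin's hash field is a base64-encoded SHA-256 digest. The helper below shows how such a value can be computed from raw public-key bytes with the standard java.security APIs; obtaining the actual key bytes from a certificate (e.g., via getPublicKey().getEncoded()) is outside this sketch.

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

public class PinSketch {
    /** Returns the base64-encoded SHA-256 digest of the given bytes, as used for certificate pins. */
    static String sha256Base64(byte[] publicKeyBytes) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(publicKeyBytes);
        return Base64.getEncoder().encodeToString(digest);
    }
}
```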

| Parameter | Type | Description |
| --- | --- | --- |
| OzDataContainer | bytearray[] | An encrypted file containing media and collateral info, the output of the createMediaCaptureScreen method |

| Parameter | Type | Description |
| --- | --- | --- |
| request | CaptureRequest | Detects a request for video capture |
| session_token | String | Stores additional information to protect against replay attacks |

| Parameter | Type | Description |
| --- | --- | --- |
| OzDataContainer | bytearray[] | An encrypted file containing media and collateral info |

| Parameter | Type | Description |
| --- | --- | --- |
| analysisProfileList | List<AnalysisProfile> | A list of objects that contain information on media and the analyses that should be applied to them |
| folderMeta (optional) | Map<String, Any> | Additional folder metadata |
| additionalMediaList (optional) | List<MediaRequest> | Media files that need to be uploaded to the server but are not required for analyses |
| cameraPosition (optional) | String | front (default) – front camera; back – rear camera |

| Parameter | Type | Description |
| --- | --- | --- |
| mediaList | List<MediaRequest> | A list of media to be analyzed |
| type | String (Type) | Analysis type |
| params (optional) | Map<String, Any> | Additional analysis parameters |

| Parameter | Type | Description |
| --- | --- | --- |
| id | String (UUID v4) | Media ID |
| actionMedia | OzAction | An action that the user should perform in a video |
| userMedia | OzAbstractMedia | An external media file, e.g., a reference or a document photo |

| Error Code | Error Message | Description |
| --- | --- | --- |
| ERROR = 3 | Error. | An unknown error has happened |
| ATTEMPTS_EXHAUSTED_ERROR = 4 | Error. Attempts exhausted for liveness action. | The number of action attempts is exceeded |
| VIDEO_RECORD_ERROR = 5 | Error by video record. | An error happened during video recording |
| NO_ACTIONS_ERROR = 6 | Error. OzLivenessSDK started without actions. | The SDK was started without any actions specified |