Add the following URL to the build.gradle of the project:
allprojects {
repositories {
maven { url "https://ozforensics.jfrog.io/artifactory/main" }
}
}
Add this to the build.gradle of the module (VERSION is the version you need; please refer to the Changelog):
Please note: this is the default version.
dependencies {
implementation 'com.ozforensics.liveness:sdk:VERSION'
}
Please note: the resulting file will be larger.
Also, regardless of the mode chosen, add:
To start using Oz Android SDK, follow the steps below.
Embed Oz Android SDK into your project as described here.
Get a trial license for the SDK on our website or a production license by contacting us. We'll need your application id. Add the license to your project as described here.
dependencies {
implementation 'com.ozforensics.liveness:full:VERSION'
}
android {
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
}
Capture videos using methods described here. You'll send them for analysis afterward.
Analyze media you've taken at the previous step. The process of checking liveness and face biometry is described here.
If you want to customize the look-and-feel of Oz Android SDK, please refer to this section.
Recommended Android version: 5+ (the newer the smartphone is, the faster the analyses are).
Recommended versions of components:
Gradle: 7.5.1
Kotlin: 1.7.21
AGP: 7.3.1
Java Target Level: 1.8
JDK: 17
We do not support emulators.
Available languages: EN, ES, HY, KK, KY, TR, PT-BR.
To obtain the sample apps source code for the Oz Liveness SDK, proceed to the GitLab repository:
Follow the link below to see a list of SDK methods and properties:
Download the latest build of the demo app here.
To start recording, use the startActivityForResult method:
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Smile, OzAction.Blank))
startActivityForResult(intent, REQUEST_CODE)
List<OzAction> actions = Arrays.asList(OzAction.Smile, OzAction.Scan);
Intent intent = OzLivenessSDK.createStartIntent(actions);
startActivityForResult(intent, REQUEST_CODE);
actions – a list of user actions while recording video.
For Fragment, use the code below. LivenessFragment is the Fragment representation of the Liveness screen UI.
To ensure the license is processed properly, we recommend initializing the SDK first and then opening the Liveness screen.
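A minimal sketch of the recommended order (the license asset name is an assumption; the status listener argument from the initialization example below is omitted for brevity):
// 1. Initialize the SDK with the license first
OzLivenessSDK.init(context, listOf(LicenseSource.LicenseAssetId(R.raw.forensics)))
// 2. Only after initialization has finished, open the Liveness screen
val intent = OzLivenessSDK.createStartIntent(listOf(OzAction.Blank))
startActivityForResult(intent, REQUEST_CODE)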
To obtain the captured video, use the onActivityResult method:
sdkMediaResult – an object with video capturing results for interactions with Oz API (a list of OzAbstractMedia objects),
sdkErrorString – a description of the error, if any.
If a user closes the capturing screen manually, resultCode receives the Activity.RESULT_CANCELED value.
Code example:
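The resultCode handling below mirrors the when (resultCode) snippet that also appears further down in this section:
when (resultCode) {
    Activity.RESULT_CANCELED -> { /* the user closed the screen */ }
    OzLivenessResultCode.SUCCESS -> {
        val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
        /* success: proceed to analysis */
    }
    else -> {
        val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
        /* failure: show the error */
    }
}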
Master license is the offline license that allows using Mobile SDKs with any bundle_id, unlike the regular licenses. To get a master license, create a pair of keys as shown below. Email us the public key, and we will email you the master license shortly after that.
Your application needs to sign its bundle_id with the private key, and the Mobile SDK checks the signature using the public key from the master license. Master licenses are time-limited.
This section describes the process of creating your private and public keys.
To create a private key, run the commands below one by one.
You will get these files:
privateKey.der is a private .der key;
privateKey.txt is privateKey.der encoded in base64. The content of this key is used to create the host app bundle_id signature.
The OpenSSL command specification:
To create a public key, run this command.
You will get the public key file: publicKey.pub. To get a license, please email us this file. We will email you the license.
SDK initialization:
Prior to the SDK initializing, create a base64-encoded signature for the host app bundle_id using the private key.
Signature creation example:
Pass the signature as the masterLicenseSignature parameter during the SDK initialization.
If the signature is invalid, the initialization continues as usual: the SDK checks the list of bundle_ids included in the license, as it does by default without a master license.
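A minimal sketch of passing the signature during initialization (the license asset name is an assumption; getMasterSignature() refers to the signature creation example shown further down):
OzLivenessSDK.init(
    context,
    listOf(LicenseSource.LicenseAssetId(R.raw.forensics)),
    masterLicenseSignature = getMasterSignature(),
)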
If you want to get back to the previous (up to 6.4.2) versions' design, reset the customization settings of the capture screen and apply the parameters that are listed below.
If you use our SDK just for capturing videos, omit this step.
To check liveness and face biometry, you need to upload media to our system and then analyze them.
Here’s an example of performing a check:
You can generate the trial license or contact us by email to get a production license. To create the license, your applicationId (bundle id) is required.
To pass your license file to the SDK, call the OzLivenessSDK.init method with a list of LicenseSources. Use one of the following:
LicenseSource.LicenseAssetId should contain a path to a license file called forensics.license
childFragmentManager.beginTransaction()
.replace(R.id.content, LivenessFragment.create(actions))
.commit()
// subscribing to the Fragment result
childFragmentManager.setFragmentResultListener(OzLivenessSDK.Extra.REQUEST_CODE, this) { _, result ->
when (result.getInt(OzLivenessSDK.Extra.EXTRA_RESULT_CODE)) {
OzLivenessResultCode.SUCCESS -> { /* start analysis */ }
else -> { /* show error */ }
}
}
getSupportFragmentManager().beginTransaction()
.replace(R.id.content, LivenessFragment.Companion.create(actions, null, null, false))
.addToBackStack(null)
.commit();
// subscribing to the Fragment result
getSupportFragmentManager().setFragmentResultListener(OzLivenessSDK.Extra.REQUEST_CODE, this, (requestKey, result) -> {
switch (result.getInt(OzLivenessSDK.Extra.EXTRA_RESULT_CODE)) {
case OzLivenessResultCode.SUCCESS: {/* start analysis */}
default: {/* show error */}
}
});
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
super.onActivityResult(requestCode, resultCode, data)
if (requestCode == REQUEST_CODE) {
sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
sdkErrorString = OzLivenessSDK.getErrorFromIntent(data)
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, @androidx.annotation.Nullable Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_CODE) {
List<OzAbstractMedia> sdkMediaResult = OzLivenessSDK.INSTANCE.getResultFromIntent(data);
String sdkErrorString = OzLivenessSDK.INSTANCE.getErrorFromIntent(data);
}
}
OzLivenessSDK.config.customization = UICustomization(
// customization parameters for the toolbar
toolbarCustomization = ToolbarCustomization(
closeIconTint = Color.ColorHex("#FFFFFF"),
backgroundColor = Color.ColorHex("#000000"),
backgroundAlpha = 100,
),
// customization parameters for the center hint
centerHintCustomization = CenterHintCustomization(
verticalPosition = 70
),
// customization parameters for the hint animation
hintAnimation = HintAnimation( // note: the exact parameter name may differ by SDK version
hideAnimation = true
),
// customization parameters for the frame around the user face
faceFrameCustomization = FaceFrameCustomization(
strokeDefaultColor = Color.ColorHex("#EC574B"),
strokeFaceInFrameColor = Color.ColorHex("#00FF00"),
strokeWidth = 6,
),
// customization parameters for the background outside the frame
backgroundCustomization = BackgroundCustomization(
backgroundAlpha = 100
),
)
To delete the videos recorded during Liveness sessions, use the clearActionVideos method.
To add metadata to a folder, use the addFolderMeta method.
In the params field of the Analysis structure, you can pass any additional parameters (key + value), for instance, to extract the best shot on the server side.
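A minimal sketch, assuming the params map is passed to the Analysis constructor as an extra argument (the parameter name and position are assumptions; see also the combined snippet further down):
val analysis = Analysis(
    Analysis.Type.QUALITY,
    Analysis.Mode.SERVER_BASED,
    mediaToAnalyze,
    params = mapOf("extract_best_shot" to true) // ask the server to extract the best shot
)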
To use a media file that is captured with another SDK (not Oz Android SDK), specify the path to it in OzAbstractMedia:
If you want to add your media to the existing folder, use the setFolderId method:
analysisCancelable = AnalysisRequest.Builder()
// mediaToAnalyze is an array of OzAbstractMedia that were captured or otherwise created
.addAnalysis(Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaToAnalyze))// or ON_DEVICE if you want the on-device analysis
.build()
//initiating the analyses and setting up a listener
.run(object : AnalysisRequest.AnalysisListener {
override fun onStatusChange(status: AnalysisRequest.AnalysisStatus) { handleStatus(status) // or your status handler
}
override fun onSuccess(result: RequestResult) {
handleResults(result) // or your result handler
}
override fun onError(error: OzException) { handleError(error) // or your error handler
}
})
analysisCancelable = new AnalysisRequest.Builder()
// mediaToAnalyze is an array of OzAbstractMedia that were captured or otherwise created
.addAnalysis(new Analysis(Analysis.Type.QUALITY, Analysis.Mode.SERVER_BASED, mediaToAnalyze)) // or ON_DEVICE if you want the on-device analysis
.build()
//initiating the analyses and setting up a listener
.run(new AnalysisRequest.AnalysisListener() {
@Override
public void onSuccess(@NonNull RequestResult list) { handleResults(list); } // or your result handler
@Override
public void onError(@NonNull OzException e) { handleError(e); } // or your error handler
@Override
public void onStatusChange(@NonNull AnalysisRequest.AnalysisStatus analysisStatus) { handleStatus(analysisStatus); } // or your status handler
})
when (resultCode) {
Activity.RESULT_CANCELED -> *USER CLOSED THE SCREEN*
OzLivenessResultCode.SUCCESS -> {
val sdkMediaResult = OzLivenessSDK.getResultFromIntent(data)
*SUCCESS*
}
else -> {
val errorMessage = OzLivenessSDK.getErrorFromIntent(data)
*FAILURE*
}
}
openssl genpkey -algorithm RSA -outform DER -out privateKey.der -pkeyopt rsa_keygen_bits:2048
# for MacOS
base64 -i privateKey.der -o privateKey.txt
# for Linux
base64 -w 0 privateKey.der > privateKey.txt
openssl rsa -pubout -in privateKey.der -out publicKey.pub
fun init(
context: Context,
licenseSources: List<LicenseSource>,
masterLicenseSignature: String,
statusListener: StatusListener<LicensePayload>? = null,
)
private fun getMasterSignature(): String {
Security.insertProviderAt(org.spongycastle.jce.provider.BouncyCastleProvider(), 1)
val privateKeyBase64String = "the string copied from the privateKey.txt file"
// with key example:
// val privateKeyBase64String = "MIIEpAIBAAKCAQEAxnpv02nNR34uNS0yLRK1o7Za2hs4Rr0s1V1/e1JZpCaK8o5/3uGV+qiaTbKqU6x1tTrlXwE2BRzZJLLQdTfBL/rzqVLQC/n+kAmvsqtHMTUqKquSybSTY/zAxqHF3Fk59Cqisr/KQamPh2tmg3Gu61rr9gU1rOglnuqt7FioNMCMvjW7ciPv+jiawLxaPrzNiApLqHVN+xCFh6LLb4YlGRaNUXlOgnoLGWSQEsLwBZFkDJDSLTJheNVn9oa3PXg4OIlJIPlYVKzIDDcSTNKdzM6opkS5d+86yjI1aTKEH3Zs64+QoEuoDfXUxS3TOUFx8P+wfjOR5tYAT+7TRN4ocwIDAQABAoIBAATWJPV05ZCxbXTURh29D/oOToZ0FVn78CS+44Vgy1hprAcfG9SVkK8L/r6X9PiXAkNJTR+Uivly64Oua8//bNC7f8aHgxRXojFmWwayj8iOMBncFnad1N2h4hy1AnpNHlFp3I8Yh1g0RpAZOOVJFucbTxaup9Ev0wLdWyGgQ3ENmRXAyLU5iUDwUSXg59RCBFKcmsMT2GmmJt1BU4P3lL9KVyLBktqeDWR/l5K5y8pPo6K7m9NaOkynpZo+mHVoOTCtmTj5TC/MH9YRHlF15VxQgBbZXuBPxlYoQCsMDEcZlMBWNw3cNR6VBmGiwHIc/tzSHZVsbY0VRCYEbxhCBZkCgYEA+Uz0VYKnIWViQF2Na6LFuqlfljZlkOvdpU4puYTCdlfpKNT3txYzO0T00HHY9YG9k1AW78YxQwsopOXDCmCqMoRqlbn1SBe6v49pVB85fPYU2+L+lftpPlx6Wa0xcgzwOBZonHb4kvp1tWhUH+B5t27gnvRz/rx5jV2EfmWinycCgYEAy8/aklZcgoXWf93N/0EZcfzQo90LfftkKonpzEyxSzqCw7B9fHY68q/j9HoP4xgJXUKbx1Fa8Wccc0DSoXsSiQFrLhnT8pE2s1ZWvPaUqyT5iOZOW6R+giFSLPWEdwm6+BeFoPQQFHf8XH3Z2QoAepPrEPiDoGN1GSIXcCwoe9UCgYEAgoKj4uQsJJKT1ghj0bZ79xVWQigmEbE47qI1u7Zhq1yoZkTfjcykc2HNHBaNszEBks45w7qo7WU5GOJjsdobH6kst0eLvfsWO9STGoPiL6YQE3EJQHFGjmwRbUL7AK7/Tw2EJG0wApn150s/xxRYBAyasPxegTwgEj6j7xu7/78CgYEAxbkI52zG5I0o0fWBcf9ayx2j30SDcJ3gx+/xlBRW74986pGeu48LkwMWV8fO/9YCx6nl7JC9dHI+xIT/kk8OZUGuFBRUbP95nLPHBB0Hj50YRDqBjCBh5qaizSEGeGFFNIfFSKddri3U8nnZTNiKLGCx7E3bjE7QfCh5qoX8ZF0CgYAtsEPTNKWZKA23qTFI+XAg/cVZpbSjvbHDSE8QB6X8iaKJFXbmIC0LV5tQO/KT4sK8g40m2N9JWUnaryTiXClaUGU3KnSlBdkIA+I77VvMKMGSg+uf4OdfJvvcs4hZTqZRdTm3dez8rsUdiW1cX/iI/dJxF4964YIFR65wL+SoRg=="
val sig = Signature.getInstance("SHA512WithRSA")
val keySpec = PKCS8EncodedKeySpec(Base64.decode(privateKeyBase64String, Base64.DEFAULT))
val keyFactory = KeyFactory.getInstance("RSA")
sig.initSign(keyFactory.generatePrivate(keySpec))
sig.update(packageName.toByteArray(Charsets.UTF_8))
return Base64.encodeToString(sig.sign(), Base64.DEFAULT).replace("\n", "")
}
OzLivenessSDK.INSTANCE.getConfig().setCustomization(new UICustomization(
// customization parameters for the toolbar
new ToolbarCustomization(
R.drawable.ib_close,
new Color.ColorHex("#FFFFFF"),
new Color.ColorHex("#000000"),
100, // toolbar text opacity (in %)
),
// customization parameters for the center hint
new CenterHintCustomization(
70, // vertical position (in %)
),
// customization parameters for the hint animation
new HintAnimation(
true, // hide animation
),
// customization parameters for the frame around the user face
new FaceFrameCustomization(
new Color.ColorHex("#EC574B"),
new Color.ColorHex("#00FF00"),
6, // frame stroke width (in dp)
),
// customization parameters for the background outside the frame
new BackgroundCustomization(
100 // background opacity (in %)
),
)
);
.addFolderMeta(
mapOf(
"key1" to "value1",
"key2" to "value2"
)
)
.addFolderMeta(Collections.singletonMap("key", "value"))
mapOf("extract_best_shot" to true)
val file = File(context.filesDir, "media.mp4") // use context.getExternalFilesDir(null) instead of context.filesDir for external app storage
val media = OzAbstractMedia.OzVideo(OzMediaTag.VideoSelfieSmile, file.absolutePath)
.setFolderId(folderId)
LicenseSource.LicenseFilePath should contain a file path to the place in the device's storage where the license file is located.
OzLivenessSDK.init(context,
listOf(
LicenseSource.LicenseAssetId(R.raw.your_license_name),
LicenseSource.LicenseFilePath("absolute_path_to_your_license_file")
),
object : StatusListener<LicensePayload> {
override fun
OzLivenessSDK.INSTANCE.getConfig().setBaseURL(BASE_URL);
OzLivenessSDK.INSTANCE.init(context,
Arrays.asList(
new LicenseSource.LicenseAssetId(R.raw.forensics)
In case of any license errors, the onError function is called. Use it to handle the exception as shown above. Otherwise, the system will return information about the license. To check the license data manually, use the getLicensePayload method.
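For example (a sketch; the exact call site and return type may differ between SDK versions):
val payload = OzLivenessSDK.getLicensePayload() // inspect the license data manually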
License error. License at (your_URI) not found
The license file is missing. Please check its name and path to the file.
License error. Cannot parse license from (your_URI), invalid format
The license file is somehow damaged. Please email us the file.
License error. Bundle company.application.id is not in the list allowed by license (bundle.id1, bundle.id2)
The bundle (application) identifier you specified is missing in the allowed list. Please check the spelling, if it is correct, you need to get another license for your application.
License error. Current date yyyy-mm-dd hh:mm:ss is later than license expiration date yyyy-mm-dd hh:mm:ss
Your license has expired. Please contact us.
License is not initialized. Call OzLivenessSDK.init before using the SDK
You haven't initialized the license. Call OzLivenessSDK.init with your license data as explained above.
To connect SDK to Oz API, specify the API URL and access token as shown below.
OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(host, token))
OzLivenessSDK.INSTANCE.setApiConnection(
OzConnection.Companion.fromServiceToken(host, token),
null
);
Please note:
In your host application, it is recommended that you set the API address on the screen that precedes the liveness check. Setting the API URL initiates a service call to the API, which may cause excessive server load when done at application initialization or startup. We recommend calling the setApiConnection method once, for example, in the Application class.
The order of SDK initialization and API connection does not matter, but both methods must be finished successfully before invoking the createStartIntent method.
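A minimal sketch of calling setApiConnection once in the Application class, as recommended above (the Application subclass and the HOST/TOKEN constants are placeholders):
class HostApplication : Application() {
    override fun onCreate() {
        super.onCreate()
        // one-time connection setup; avoids repeated service calls to the API
        OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(HOST, TOKEN))
    }
}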
Alternatively, you can use the login and password provided by your Oz Forensics account manager:
However, the preferred option is authentication via an access token, for security reasons.
By default, logs are saved along with the analyses' data. If you need to keep the logs distinct from the analysis data, set up a separate connection for telemetry as shown below:
Clearing authorization:
Check for the presence of the saved Oz API access token:
LogOut:
Please note: this feature has been implemented in 8.1.0.
To add or update the language pack for Oz iOS SDK, use the set(languageBundle: Bundle) method. It shows the SDK that you are going to use the non-standard bundle. In OzLocalizationCode, use the custom language (optional).
To connect SDK to Oz API, specify the API URL and access token as shown below.
Alternatively, you can use the login and password provided by your Oz Forensics account manager:
By default, logs are saved along with the analyses' data. If you need to keep the logs distinct from the analysis data, set up a separate connection for telemetry as shown below:
Please note: this feature has been implemented in 8.1.0.
To add or update the language pack for Oz Android SDK, please follow these instructions:
This callback is called when the system encounters any error. It contains the error details and telemetry ID that you can use for further investigation.
Create the file called strings.xml.
Copy the strings from the attached file to your freshly created file.
Redefine the strings you need in the appropriate localization records.
A list of keys for Android:
The keys action_*_go refer to the appropriate gestures. Others refer to the hints for any gesture, info messages, or errors.
When new keys appear with new versions, if no translation is provided in your file, the new strings are shown in English.
OzLivenessSDK.setApiConnection(
OzConnection.fromCredentials(host, username, password),
statusListener(
{ token -> /* token */ },
{ ex -> /* error */ }
)
)
OzLivenessSDK.INSTANCE.setApiConnection(
OzConnection.Companion.fromCredentials(host, username, password),
new StatusListener<String>() {
@Override
public void onStatusChanged(@Nullable String s) {}
@Override
public void onSuccess(String token) { /* token */ }
@Override
public void onError(@NonNull OzException e) { /* error */ }
}
);
OzLivenessSDK.setEventsConnection(
OzConnection.fromCredentials(
"https://echo.cdn.ozforensics.com/",
"<[email protected]>",
"your_telemetry_password"
)
)
OzLivenessSDK.setEventsConnection(
OzConnection.fromCredentials(
"https://tm.ozforensics.com/",
"<[email protected]>",
"your_telemetry_password"
)
);
OZSDK.setApiConnection(Connection.fromServiceToken(host: "https://sandbox.ohio.ozforensics.com", token: token)) { (token, error) in
}
OZSDK.setApiConnection(Connection.fromCredentials(host: "https://sandbox.ohio.ozforensics.com", login: login, password: p)) { (token, error) in
// Your code to handle error or token
}
let eventsConnection = Connection.fromCredentials(host: "https://echo.cdn.ozforensics.com/",
login: "<[email protected]>",
password: "your_telemetry_password")
OZSDK.setEventsConnection(eventsConnection) { (token, error) in
}
on_error {
"code": "error_code",
"event_session_id": "id_of_telemetry_session_with_error",
"message": "<error description>",
"context": {} // additional information if any
}
Create a controller that will capture videos as follows:
let actions: [OZVerificationMovement] = [.selfie]
let ozLivenessVC: UIViewController = OZSDK.createVerificationVCWithDelegate(self, actions: actions)
self.present(ozLivenessVC, animated: true)
actions – a list of the user's actions while capturing the video.
Once the video is captured, the system calls the onOZLivenessResult method:
extension viewController: OZLivenessDelegate {
    func onError(status: OZVerificationStatus?) {
        // show error
    }
    func onOZLivenessResult(results: [OZMedia]) {
        // proceed to the checks step
    }
}
The method returns the results of video capturing: the [OZMedia] objects. The system uses these objects to perform checks.
If a user closes the capturing screen manually, the failedBecauseUserCancelled error appears.
When result_mode is safe, the on_complete callback contains the state of the analysis only:
Please note: The options listed below are for testing purposes only. If you require more information than what is available in the Safe mode, please follow Security Recommendations.
For the status value, the callback contains the state of the analysis, and for each of the analysis types, the name of the type, state, and resolution.
The folder value is similar to the status value; the only difference is that folder_id is added.
In this case, you receive the detailed response possibly containing sensitive data. This mode is deprecated; for security reasons, we recommend using the safe mode.
Add the script (plugin_liveness.php) that you received from Oz Forensics to the HTML code of the page. web-sdk-root-url is the Web Adapter link you've received from us.
OzLivenessSDK.setApiConnection(null)
OzLivenessSDK.INSTANCE.setApiConnection(null, null);
val isLoggedIn = OzLivenessSDK.isLoggedIn
boolean isLoggedIn = OzLivenessSDK.INSTANCE.isLoggedIn();
OzLivenessSDK.logout()
OzLivenessSDK.INSTANCE.logout();
{
"state": "finished"
}
{
"state": "finished",
"analyses": {
"quality": {
"state": "finished",
"resolution": "success"
}
}
}
{
"state": "finished",
"folder_id": "your_folder_id",
"analyses": {
"quality": {
"state": "finished",
"resolution": "success"
}
}
}
<script src="https://web-sdk-root-url/plugin_liveness.php"></script>
If you don't set the custom language and bundle, the SDK uses the pre-installed languages only.
If the custom bundle is set (and the language is not), it has priority when checking translations, i.e., the SDK first checks for the localization record in the custom bundle localization file. If the key is not found in the custom bundle, the standard bundle text for this key is used.
If both custom bundle and language are set, SDK retrieves all the translations from the custom bundle localization file.
A list of keys for iOS:
The keys Action.*.Task refer to the appropriate gestures. Others refer to the hints for any gesture, info messages, or errors.
When new keys appear with new versions, if no translation is provided by your custom bundle localization file, you’ll see the default (English) text.
To start using Oz iOS SDK, follow the steps below.
Embed Oz iOS SDK into your project as described here.
Connect SDK to API as described here. This step is optional, as this connection is required only when you need to process data on a server.
Capture videos by creating the controller as described here. You'll send them for analysis afterwards.
Upload and analyze media you've taken at the previous step. The process of checking liveness and face biometry is described here.
If you want to customize the look-and-feel of Oz iOS SDK, please refer to this section.
Minimum iOS version: 11.
Minimum Xcode version: 16.
Available languages: EN, ES, HY, KK, KY, TR, PT-BR.
A sample app source code using the Oz Liveness SDK is located in the GitLab repository:
Follow the link below to see a list of SDK methods and properties:
Download the latest build of the demo app here.
In this section, we explain how to use Oz Flutter SDK for iOS and Android.
Before you start, it is recommended that you install:
Flutter 3.0.0 or higher;
Android SDK 21 or higher;
Dart 2.18.6 or higher;
iOS platform 13 or higher;
Xcode.
Please find the Flutter repository .
To integrate OZLivenessSDK into an Xcode project via the dependency manager, add the following code to Podfile:
Version is optional as, by default, the newest version is integrated. However, if necessary, you can find older version numbers in the Changelog.
Since 8.1.0, you can also use a simpler code:
By default, the full version is being installed. It contains both server-based and on-device analysis modes. To install the server-based version only, use the following code:
If you use our SDK just for capturing videos, omit this step.
To check liveness and face biometry, you need to upload media to our system and then analyze them.
Below, you'll see the example of performing a check and its description.
To delete media files after the checks are finished, use the cleanTempDirectory method.
The add_lang(lang_id, lang_obj) method allows adding a new or customized language pack.
Parameters:
lang_id: a string value that can be subsequently used as lang parameter for the open() method;
To set your own look-and-feel options, use the style section in the Ozliveness.open method. Here is what you can change:
faceFrame – the color of the frame around a face:
faceReady – the frame color when the face is placed properly and can be analyzed;
Web Plugin is a plugin called by your web application. It works in a browser context. The Web Plugin communicates with Web Adapter, which, in turn, communicates with Oz API.
Please find a sample for Oz Liveness Web SDK . To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.
For the samples below, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.
sample
If you want to get back to the previous (up to 6.4.2) versions' design, reset the customization settings of the capture screen and apply the parameters that are listed below.
Please note: for the plugin to work, your browser version should support JavaScript ES6 and be the one as follows or newer.
To generate the license, we need the domain name of the website where you are going to use Oz Forensics Web SDK, for instance, your-website.com. You can also define subdomains.

Google Chrome (and other browsers based on the Chromium engine): 56
Mozilla Firefox: 55
Safari: 11
Microsoft Edge*: 17
Opera: 47
*Web SDK doesn't work in Internet Explorer compatibility mode due to lack of important functions.
// customization parameters for the toolbar
let toolbarCustomization = ToolbarCustomization(
closeButtonColor: .white,
backgroundColor: .black)
// customization parameters for the center hint
let centerHintCustomization = CenterHintCustomization(
verticalPosition: 70)
// customization parameters for the center hint animation
let hintAnimationCustomization = HintAnimationCustomization(
hideAnimation: true)
// customization parameters for the frame around the user face
let faceFrameCustomization = FaceFrameCustomization(
strokeWidth: 6,
strokeFaceAlignedColor: .green,
strokeFaceNotAlignedColor: .red)
// customization parameters for the background outside the frame
let backgroundCustomization = BackgroundCustomization(
backgroundColor: .clear)
OZSDK.customization = OZCustomization(toolbarCustomization: toolbarCustomization,
centerHintCustomization: centerHintCustomization,
hintAnimationCustomization: hintAnimationCustomization,
faceFrameCustomization: faceFrameCustomization,
versionCustomization: versionCustomization,
backgroundCustomization: backgroundCustomization)
Please note: installation via SPM is available for versions 8.7.0 and above.
Add the following package dependencies via SPM: https://gitlab.com/oz-forensics/oz-mobile-ios-sdk (if you need a guide on adding the package dependencies, please refer to the Apple documentation). OzLivenessSDK is mandatory. If you don't need the on-device analyses, skip the OzLivenessSDKOnDevice file.
You can also add the necessary frameworks to your project manually.
Download the SDK files from here and add them to your project.
OZLivenessSDK.xcframework,
OZLivenessSDKResources.bundle,
OZLivenessSDKOnDeviceResources.bundle (if you don't need the on-device analyses, skip this file).
Download the TensorFlow framework 2.11 from here.
Make sure that:
both xcframeworks are in Target-Build Phases -> Link Binary With Libraries and Target-General -> Frameworks, Libraries, and Embedded Content;
the bundle file(s) are in Target-Build Phases -> Copy Bundle Resources.
To add metadata to a folder, use AnalysisRequest.addFolderMeta.
In the params field of the Analysis structure, you can pass any additional parameters (key + value), for instance, to extract the best shot on the server side.
To use a media file that is captured with another SDK (not Oz iOS SDK), specify the path to it in the OzMedia structure (the bestShotURL property):
If you want to add your media to the existing folder, use the addFolderId method:
During the runtime: when initializing SDK, use the following method.
or
LicenseSource is a source of the license, and LicenseData is the information about your license. Please note: this method checks whether you have an active license, and if you do, this license won't be replaced with a new one. To force the license replacement, use the setLicense method.
In case of any license errors, the system will use your error handling code as shown above. Otherwise, the system will return information about the license. To check the license data manually, use OZSDK.licenseData.
License error. License at (your_URI) not found
The license file is missing. Please check its name and path to the file.
License error. Cannot parse license from (your_URI), invalid format
The license file is somehow damaged. Please email us the file.
License error. Bundle company.application.id is not in the list allowed by license (bundle.id1, bundle.id2)
The bundle (application) identifier you specified is missing in the allowed list. Please check the spelling, if it is correct, you need to get another license for your application.
License error. Current date yyyy-mm-dd hh:mm:ss is later than license expiration date yyyy-mm-dd hh:mm:ss
Your license has expired. Please contact us.
License is not initialized.
You haven't initialized the license. Please add the license to your project as described above.
When result_mode is safe, the on_result callback contains the state of the analysis only:
or
Please note: the options listed below are for testing purposes only. If you require more information than what is available in the Safe mode, please follow Security Recommendations.
For the status value, the callback contains the state of the analysis, and for each of the analysis types, the name of the type, state, and resolution.
or
The folder value is similar to the status value; the only difference is that folder_id is added.
In this case, you receive the detailed response possibly containing sensitive data. This mode is deprecated; for security reasons, we recommend using the safe mode.
lang_obj: an object that includes identifiers of translation strings as keys and translation strings themselves as values.
A list of language identifiers:
en – English
es – Spanish
pt-br* – Portuguese (Brazilian)
kz – Kazakh
*Formerly pt, changed in 1.3.1.
An example of usage:
OzLiveness.add_lang('en', enTranslation), where enTranslation is a JSON object.
To set the SDK language, when you launch the plugin, specify the language identifier in lang:
You can check which locales are installed in Web SDK: use the ozLiveness.get_langs() method. If you have added a locale manually, it will also be shown.
A list of all language identifiers:
The keys oz_action_*_go refer to the appropriate gestures. oz_tutorial_camera_* – to the hints on how to enable camera in different browsers. Others refer to the hints for any gesture, info messages, or errors.
Since 1.5.0, if your language pack doesn't include a key, the message for this key will be shown in English.
faceNotReady – the frame color when the face is placed improperly and can't be analyzed.
centerHint – the text of the hint that is displayed in the center.
textSize – the size of the text;
color – the color of the text;
yPosition – the vertical position measured from top;
letterSpacing – the spacing between letters;
fontStyle – the style of font (bold, italic, etc.).
closeButton – the button that closes the plugin:
image – the button image, can be an image in PNG or dataURL in base64.
backgroundOutsideFrame – the color of the overlay filling (outside the frame):
color – the fill color.
Example:
Set the license as shown below:
With license data:
With license path:
Check whether the license is updated properly.
Example
Proceed to your website origin and launch Liveness -> Simple selfie.
Once the license is added, the system will check its validity on launch.
OzLiveness.open({
license: {
'payload_b64': 'some_payload',
'signature': 'some_data',
'enc_public_key': 'some_key'
},
...,
})
OzLiveness.open({
licenseUrl: 'https://some_url',
...,
})
pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios', :tag => 'VERSION'
// for the latest version
pod 'OZLivenessSDK'
// OR, for the specific version
// pod 'OZLivenessSDK', '8.10.0'
pod 'OZLivenessSDK/Core', :git => 'https://gitlab.com/oz-forensics/oz-liveness-ios.git', :tag => 'VERSION'
pod 'OZLivenessSDK/Core'
// OR
// pod 'OZLivenessSDK/Core', '8.1.0'
let analysisRequest = AnalysisRequestBuilder()
// create one or more analyses
let analysis = Analysis.init(
media: mediaToAnalyze, // mediaToAnalyze is an array of OzMedia that were captured or otherwise created
type: .quality, // check the analysis types in iOS methods
mode: .serverBased) // or .onDevice if you want the on-device analysis
analysisRequest.uploadMedia(mediaToAnalyze)
analysisRequest.addAnalysis(analysis)
// initiate the analyses
analysisRequest.run(
statusHandler: { state in }, // scenario steps progress handler
errorHandler: { _ in }
) { result in
// receive and handle analyses results here
}
let analysis = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased)
var folderMeta: [String: Any] = ["key1": "value1"]
analysisRequest.addFolderMeta(folderMeta)
...
let analysis = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased, params: ["extract_best_shot" : true])
let referenceMedia = OZMedia.init(movement: .selfie,
mediaType: .movement,
metaData: ["meta":"data"],
videoURL: nil,
bestShotURL: imageUrl,
preferredMediaURL: nil,
timestamp: Date())
let analysis = Analysis.init(media: mediaToAnalyze, type: .quality, mode: .serverBased)
analysisRequest.addFolderId(IdRequired)
OZSDK(licenseSources: [.licenseFileName("forensics.license")]) { licenseData, error in
if let error = error {
print(error)
}
}
OZSDK(licenseSources: [.licenseFilePath("path_to_file")]) { licenseData, error in
if let error = error {
print(error)
}
}
{
"state": "processing"
}
{
"state": "finished"
}
{
"state": "processing",
"analyses": {
"quality": {
"state": "processing",
"resolution": ""
}
}
}
{
"state": "finished",
"analyses": {
"quality": {
"state": "finished",
"resolution": "success"
}
}
}
{
"state": "processing",
"folder_id": "your_folder_id",
"analyses": {
"quality": {
"state": "processing",
"resolution": ""
}
}
}
// Editing the button text
OzLiveness.add_lang('en', {
action_photo_button: 'Take a photo'
});
OzLiveness.open({
lang: 'es', // the identifier of the needed language
...
});
OzLiveness.open({
// ...
style: {
// the backward compatibility block
doc_color: "",
face_color_success: "",
face_color_fail: "",
// the current customization block
faceFrame: {
faceReady: "",
faceNotReady: "",
},
centerHint: {
textSize: "",
color: "",
yPosition: "",
letterSpacing: "",
fontStyle: "",
},
closeButton: {
image: "",
},
backgroundOutsideFrame: {
color: "",
},
},
// ...
});
To customize the Oz Liveness interface, use UICustomization as shown below. For the description of customization parameters, please refer to Android SDK Methods and Properties.
OzLivenessSDK.config.customization = UICustomization(
// customization parameters for the toolbar
toolbarCustomization = ToolbarCustomization(
closeIconRes = R.drawable.ib_close,
closeIconTint = Color.ColorRes(R.color.white),
titleTextFont = R.font.roboto,
titleTextSize =
By default, SDK uses the locale of the device. To switch the locale, use the code below:
// connecting to the API server
OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(HOST, TOKEN))
// settings for the number of attempts to detect an action
OzLivenessSDK.config.attemptSettings = attemptSettings
// the possibility to display additional debug information (you can do it by clicking the SDK version number)
OzLivenessSDK.config.allowDebugVisualization = allowDebugVisualization
// logging settings
OzLivenessSDK.config.logging = ozLogging
OzConfig config = OzLivenessSDK.INSTANCE.getConfig();
// connecting to the API server
OzLivenessSDK.setApiConnection(OzConnection.fromServiceToken(HOST, TOKEN));
// settings for the number of attempts to detect an action
config.setAttemptSettings(attemptSettings);
// the possibility to display additional debug information (you can do it by clicking the SDK version number)
config.setAllowDebugVisualization(allowDebugVisualization);
// logging settings
config.setLogging(ozLogging);
Vue sample
Svelte sample
In 8.8.0, we’ve implemented SSL pinning to protect our clients from MITM attacks. We strongly recommend adding a built-in certificate whitelist to your application to prevent fraud with third-party certificates set as trusted.
You can add a list of certificates your application should trust at the moment of connection to Oz API via the optional sslPins field of OzConnection class. As an input, this field takes a list of public certificate key hashes with their expiration dates as shown below:
Go to the website.
Enter your domain address. Once the address is processed, you’ll see a list of your servers.
Click the server address needed to load a list of certificates. Certificate key is in the Pin SHA256 line of the Subject field. Expiration date is shown in the Valid until field.
Certificate number one is your host certificate. Your root certificate is at the very bottom of the list; the others are intermediate. For SSL pinning, any of them fits.
The higher the certificate is on the list, the better the level of protection against theft. Thus, if you use the host certificate to pin in your application, you get the highest security level. However, the lifetime of these certificates is significantly shorter than that of intermediate or root certificates. To keep your application secure, you will need to change your pins as soon as they expire; otherwise, functionality might become unavailable.
As a reasonable balance between safety and the resources needed to maintain it, we recommend using intermediate or even root certificate keys for pinning. While the security level is slightly lower, you won’t need to change these pins as often because these certificates have a much longer lifetime.
To obtain the hash, run the following command with your server domain and port:
In the response, you’ll receive hash for your SslPin.
To get the certificate’s expiration date, run the next command – again with your server domain and port:
The date you require will be in the notAfter parameter.
We’ll provide you with the hash and date of our API server certificate.
To force the closing of the plugin window, use the close() method. All requests to server and callback functions (except on_close) within the current session will be aborted.
Example:
var session_id = 123;
OzLiveness.open({
// We transfer the arbitrary meta data, by which we can later identify the session in Oz API
meta: {
session_id: session_id
},
// After sending the data, forcibly close the plugin window and independently request the result
on_submit: function() {
OzLiveness.close();
my_result_function(session_id);
}
});
To hide the plugin window without cancelling the requests for analysis results and user callback functions, call the hide() method. Use this method, for instance, if you want to display your own upload indicator after submitting data.
An example of usage:
Even though the analysis result is available to the host application via Web Plugin callbacks, it is recommended that the application back end receives it directly from Oz API. All decisions of the further process flow should be made on the back end as well. This eliminates any possibility of malicious manipulation with analysis results within the browser context.
To find your folder from the back end, you can follow these steps:
On the front end, add your unique identifier to the folder metadata.
You can add your own key-value pairs to attach user document numbers, phone numbers, or any other textual information. However, ensure that tracking personally identifiable information (PII) complies with relevant regulatory requirements.
Use the on_complete callback of the plugin to be notified when the analysis is done. Once used, call your back end and pass the transaction_id value.
On the back end side, find the folder by the identifier you've specified using the Oz API Folder LIST method:
To speed up the processing of your request, we recommend adding the time filter as well:
Web Adapter may send analysis results to the Web Plugin with various levels of verbosity. It is recommended that, in production, the level of verbosity is set to minimum.
In the Web Adapter file, set the result_mode parameter to "safe".
To customize the Oz Liveness interface, use OZCustomization as shown below. For the description of customization parameters, please refer to iOS SDK Methods and Properties.
Please note: the customization methods should be called before the video capturing ones.
Oz Liveness Web SDK is a module for processing data on clients' devices. With Oz Liveness Web SDK, you can take photos and videos of people via their web browsers and then analyze these media. Most browsers and devices are supported. Available languages: EN, ES, PT-BR, KK.
Please find a sample for Oz Liveness Web SDK . To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.
For Angular and React, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.
OzLivenessSDK.INSTANCE.getConfig().setCustomization(new UICustomization(
// customization parameters for the toolbar
new ToolbarCustomization(
R.drawable.ib_close,
new Color.ColorRes(R.color.white),
R.style.Sdk_Text_Primary,
new Color.ColorRes(R.color.white),
R.font.roboto,
Typeface.NORMAL,
100, // toolbar text opacity (in %)
18, // toolbar text size (in sp)
new Color.ColorRes(R.color.black),
60, // toolbar alpha (in %)
"Liveness", // toolbar title
true // center toolbar title
),
// customization parameters for the center hint
new CenterHintCustomization(
R.font.roboto,
new Color.ColorRes(R.color.text_color),
20,
50,
R.style.Sdk_Text_Primary,
new Color.ColorRes(R.color.color_surface),
100, // background opacity
14, // corner radius for background frame
100 // text opacity
),
// customization parameters for the hint animation
new HintAnimation(
new Color.ColorRes(R.color.red), // gradient color
80, // gradient opacity (in %)
120, // the side size of the animation icon square
false // hide animation
),
// customization parameters for the frame around the user face
new FaceFrameCustomization(
GeometryType.RECTANGLE,
10, // frame corner radius (for GeometryType.RECTANGLE)
new Color.ColorRes(R.color.error_red),
new Color.ColorRes(R.color.success_green),
100, // frame stroke alpha (in %)
5, // frame stroke width (in dp)
3 // frame stroke padding (in dp)
),
// customization parameters for the background outside the frame
new BackgroundCustomization(
new Color.ColorRes(R.color.black),
60 // background alpha (in %)
),
// customization parameters for the SDK version text
new VersionTextCustomization(
R.style.Sdk_Text_Primary,
R.font.roboto,
12, // version text size
new Color.ColorRes(R.color.white),
100 // version text alpha
),
// customization parameters for the antiscam protection text
new AntiScamCustomization(
"Recording .. ",
R.font.roboto,
12,
new Color.ColorRes(R.color.text_color),
100,
R.style.Sdk_Text_Primary,
new Color.ColorRes(R.color.color_surface),
100,
14,
new Color.ColorRes(R.color.green)
)
// custom logo parameters
new LogoCustomization(
new Image.Drawable(R.drawable.ic_logo),
new Size(176, 64)
)
)
);
OzLivenessSDK.config.localizationCode = OzLivenessSDK.OzLocalizationCode.EN
OzLivenessSDK.INSTANCE.getConfig().setLocalizationCode(OzLivenessSDK.OzLocalizationCode.EN)
Certificate owner | Trust level | Resource requirements (depend on the certificate's lifetime)
Host | Highest | High, but requires the most resources to maintain: the key list should be updated at the same time as the certificate
Intermediate certificate authority | Above average; the application considers all certificates that have been issued by this authority as trusted | Average
Root certificate authority | Average; the application considers all certificates that have been issued by this authority as trusted, including the intermediate authority-issued certificates | Low


OzLiveness.open({
// When receiving an intermediate result, hide the plugin window and show your own loading indicators
on_result: function(result) {
OzLiveness.hide();
if (result.state === 'processing') {
show_my_loader();
}
},
on_complete: function() {
hide_my_loader();
}
});
Save the folder_id for future reference.
Connection.fromServiceToken(
"your API server host",
"your token",
listOf(
SslPin(
"your hash", // SHA256 key hash in base64
<date> // key expiration date as a UNIX timestamp, UTC time
)
),
)
let pins = [SSLPin.pin(
publicKeyHash: "your hash", // SHA256 key hash in base64
expirationDate: date)] // key expiration date as a UNIX timestamp, UTC time
OZSDK.setApiConnection(.fromServiceToken(
host: "your API server host",
token: "your token",
sslPins: pins)) { (token, error) in
//
}
echo | openssl s_client -connect {SERVER_DOMAIN_NAME}:{PORT} 2> /dev/null | openssl x509 -pubkey -noout | openssl pkey -pubin -outform der | openssl dgst -sha256 -binary | openssl enc -base64
openssl s_client -servername {SERVER_DOMAIN_NAME} -connect {SERVER_DOMAIN_NAME}:{PORT} | openssl x509 -noout -dates
echo -n Q | openssl s_client -servername {SERVER_DOMAIN_NAME} -connect {SERVER_DOMAIN_NAME}:{PORT} | openssl x509 -noout -dates
OzLiveness.open({
...
meta: {
// the user or lead ID from an external lead generator
// that you can pass to keep track of multiple attempts made by the same user
'end_user_id': '<user_or_lead_id>',
// the unique attempt ID
'transaction_id': '<unique_transaction_id>'
}
});
/api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true
/api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true&time_created.min=([CURRENT_TIME]-1hour)
"result_mode": "safe"
// customization parameters for the toolbar
let toolbarCustomization = ToolbarCustomization(
closeButtonIcon: UIImage(named: "example"),
closeButtonColor: .black.withAlphaComponent(0.8),
titleText: "",
titleFont: .systemFont(ofSize: 18, weight: .regular),
titleColor: .gray,
backgroundColor: .lightGray)
// customization parameters for the center hint
let centerHintCustomization = CenterHintCustomization(
textColor: .white,
textFont: .systemFont(ofSize: 22, weight: .regular),
verticalPosition: 42,
backgroundColor: UIColor.init(hexRGBA: "1C1C1E8F")!,
hideTextBackground: false,
backgroundCornerRadius: 14)
// customization parameters for the center hint animation
let hintAnimationCustomization = HintAnimationCustomization(
hideAnimation: false,
animationIconSize: 80,
toFrameGradientColor: UIColor.red)
// customization parameters for the frame around the user face
let faceFrameCustomization = FaceFrameCustomization(
strokeWidth: 4,
strokeFaceAlignedColor: .green,
strokeFaceNotAlignedColor: .red,
geometryType: .rect(cornerRadius: 10),
strokePadding: 3)
// customization parameters for the SDK version text
let versionCustomization = VersionLabelCustomization(
textFont: .systemFont(ofSize: 12, weight: .regular),
textColor: .gray
)
// customization parameters for the background outside the frame
let backgroundCustomization = BackgroundCustomization(
backgroundColor: .lightGray
)
// customization parameters for the antiscam protection text
let antiscamCustomization: AntiscamCustomization = AntiscamCustomization(
customizationEnableAntiscam: false,
customizationAntiscamTextMessage: "Face recognition",
customizationAntiscamTextFont: UIFont.systemFont(ofSize: 15, weight: .semibold),
customizationAntiscamTextColor: UIColor.black,
customizationAntiscamBackgroundColor: UIColor.init(hexRGBA: "F2F2F7FF")!,
customizationAntiscamCornerRadius: 18,
customizationAntiscamFlashColor: UIColor.init(hexRGBA: "FF453AFF")!)
// customization parameters for your logo
// should be allowed by license
let logoCustomization = LogoCustomization(image: UIImage(), size: CGSize(width: 100, height: 100))
OZSDK.customization = Customization(toolbarCustomization: toolbarCustomization,
antiscamCustomization: antiscamCustomization,
centerHintCustomization: centerHintCustomization,
hintAnimationCustomization: hintAnimationCustomization,
faceFrameCustomization: faceFrameCustomization,
versionCustomization: versionCustomization,
backgroundCustomization: backgroundCustomization,
logoCustomization: logoCustomization)
Web SDK requires HTTPS (with SSL encryption) to work; however, at localhost and 127.0.0.1, you can check the resources' availability via HTTP.
Oz Liveness Web SDK consists of two components:
Client side – a JavaScript file that is being loaded within the frontend part of your application. It is called Oz Liveness Web Plugin.
Server side – a separate server module with OZ API. The module is called Oz Liveness Web Adapter.
The integration guides can be found here:
Oz Web SDK can be provided via SaaS, when the server part works on our servers and is maintained by our engineers, and you just use it, or on-premise, when Oz Web Adapter is installed on your servers. Contact us for more details and choose the model that is convenient for you.
Oz Web SDK requires a license to work. To issue a license, we need the domain name of the website where you are going to use our SDK.
This is a guide on how to start with Oz Web SDK:
Integrate the plugin into your page.
If you want to customize the look-and-feel of Oz Web SDK, please refer to this section.
Master license is the offline license that allows using Mobile SDKs with any bundle_id, unlike the regular licenses. To get a master license, create a pair of keys as shown below. Email us the public key, and we will email you the master license shortly after that.
Your application needs to sign its bundle_id with the private key, and the Mobile SDK checks the signature using the public key from the master license. Master licenses are time-limited.
This section describes the process of creating your private and public keys.
To create a private key, run the commands below one by one.
You will get these files:
privateKey.der is a private .der key;
privateKey.txt is privateKey.der encoded in base64. The content of this key is used to create the host app bundle_id signature.
The OpenSSL command specification:
To create a public key, run this command.
You will get the public key file: publicKey.pub. To get a license, please email us this file. We will email you the license.
SDK initialization:
License setting:
Prior to the SDK initializing, create a base64-encoded signature for the host app bundle_id using the private key.
Signature creation example:
Pass the signature as the masterLicenseSignature parameter either during the SDK initialization or license setting.
If the signature is invalid, the initialization continues as usual: the SDK checks the list of bundle_ids included in the license, as it does by default without a master license.
Android: Resolved an issue with warning that could appear when running Fragment.
Android: SDK no longer crashes when calling copyPlane.
Android: When you choose to send compressed videos for a hybrid analysis, SDK no longer saves original media as well as compressed.
iOS: The Scan gesture animation now works properly.
iOS: Fixed the bug where SDK didn’t call completion during initialization in debug mode.
Enhanced security.
initSDK in the iOS debugging mode now works properly.
You can now .
Fixed an error in the example code.
The Scan gesture hint is now properly voiced.
If you try to delete the reference photo, SDK now asks you to confirm deletion.
Changed the wording for the head_down gesture: the new wording is “tilt down”.
Updated the authorization logic.
Improved voiceover.
Bug fixes.
Security and telemetry updates.
The SDK hints and UI controls can be voiced in accordance to WCAG requirements.
Improved user experience with head movement gestures.
Android: moved the large video compression step to the Liveness screen closure.
The executeLiveness method is now deprecated, please use startLiveness instead.
Updated the code needed to obtain the Liveness results.
Security and telemetry updates.
Added descriptions for the errors that occur when providing an empty string as an ID in the addFolderID (iOS) and setFolderID (Android) methods.
Android: fixed a bug causing an endless spinner to appear if the user switches to another application during the Liveness check.
Android: fixed some smartphone model specific-bugs.
Android: upgraded the on-device Liveness model.
Android: security updates.
iOS: the messages displayed by the SDK after uploading media have been synchronized with Android.
iOS: the bug causing analysis delays that might have occurred for the One Shot gesture has been fixed.
The length of the Selfie gesture is now (affects the video file size).
Removed the pause after the Scan gesture.
Security and logging updates.
Bug fixes.
Android: updated the on-device Liveness model.
iOS: changed the default behavior in case a localization key is missing: now the English string value is displayed instead of a key.
Fixed some bugs.
Implemented the possibility of using a master license that works with any bundle_id.
Fixed the bug with background color flashing.
Video compression failure on some phone models is now fixed.
First version.
Background is no longer dark when you launch SDK.
SDK no longer flips one of the images during the Biometry analysis.
Fixed some bugs related to the on-device and hybrid analysis types.
Android: added support for Google Dynamic Feature Delivery.
Android: resolved the issue with possible SDK crashes when closing the Liveness screen.
iOS: resolved the issue with integration via Swift UI.
iOS: Xcode updated to version 16 to comply with Apple requirements.
Enhanced security and updated telemetry.
Security updates.
Android: you can now disable video validation that has been implemented to avoid recording extremely short videos (3 frames and less).
iOS: SDK now compresses videos if their size exceeds 10 MB.
iOS: Head movement gestures are now handled properly.
iOS: Xcode updated to version 16 to comply with Apple requirements.
Android: fixed the bug when the best shot frame could contain an image with closed eyes.
Android: resolved codec issues on some smartphone models.
Android: fixed the bug when the recorded videos might appear green.
iOS: added Xcode 16 support.
iOS: the screen brightness no longer changes when the rear camera is used.
iOS: fixed the video recording issues on some smartphone models.
Android: if the recorded video is larger than 10 MB, it gets compressed.
On-Premise
Install our Web SDK. Our engineers will help you to install the components needed using the standalone installer or manually. The license will be installed as well; to update it, please refer to this article.
Configure the adapter.
SaaS
This part is fully covered by the Oz Forensics engineers. You get a link for Oz Web Plugin (see step 2).
openssl genpkey -algorithm RSA -outform DER -out privateKey.der -pkeyopt rsa_keygen_bits:2048
# for MacOS
base64 -i privateKey.der -o privateKey.txt
# for Linux
base64 -w 0 privateKey.der > privateKey.txt
openssl rsa -pubout -in privateKey.der -out publicKey.pub
OZSDK(licenseSources: [LicenseSource], masterLicenseSignature: String? = nil, completion: @escaping ((LicenseData?, LicenseError?) -> Void))
setLicense(licenseSource: LicenseSource, masterLicenseSignature: String? = nil)
private func getSignature() -> String? {
let privateKeyBase64String = "the string copied from the privateKey.txt file"
// with key example:
// let privateKeyBase64String = "MIIEogIBAAKCAQEAvxpyONpif2AjXiiG8fs9pQkn5C9yfiP0lD95Z0UF84t0Ox1S5U1UuVE5kkTYYGvS2Wm7ykUEGuHhqt/PyOAxrrNkAGz3OcVTsvcqPmwcf4UNdYZmug6EnQ5ok9wxYARS0aYqJUdzUb4dKOYal6WpHZE4yLx08R0zQ5jPkg5asT2u2PLB7JHZNnXwBcvdUonAgocNzdakUbWTNHKMxjwdAvwdIICdIneLZ9nCqe1d0cx7JBIhLzSPu/DVRANF+DOsE9JM8DT/Snnjok2xXzqpxBs1GwqiMJh98KYP78AVRWFuq3qbq0hWpjbq+mWl8xa7UMv8WxVd4PvQkWVYq/ojJwIDAQABAoIBAEvkydXwTMu/N2yOdcEmAP5I25HQkgysZNZ3OtSbYdit2lQbui8cffg23MFNHA125L65Mf4LWK0AZenBhriE6NYzohRVMf28cxgQ9rLhppOyGH1DCgr79wiUj02hVe6G6Qkfj39Ml+yvrs7uS0NMZBQ89yspRNv4t8IxrsWXc8cNQr33fdArlZ021Z12u2wdamaagiFwTa6ZYcQ5OYl3d/xL+oAwf9ywHwRrVM2JksGCxrcLJ7JCOL6lLyjp8rRrIG4iZ1V8YDfUNHmwD4w1fl30H6ejA+Cy5qge7CBZK+hqKr+hOcfBfakfOtgcEbFq2L8DqHoSaTeY6n9wjPJiFrkCgYEA8fc/Cg+d0BN98aON80TNT+XLEZxXMH+8qdmJYe/LB2EckBj1N0en87OEhqMFm9BjYswGf/uO+q2kryEat1n3nejbENN9XaO36ahlXTpQ6gdHO3TuO+hnnUkXeUNgiGYs+1L8Ot6PuNykwL0BZ09U0iBVoawEjTAg9tLNfVW2upsCgYEAyi/75YFT3ApxfSf/9XR74a0aKrfGGlBppcGvxuhVUC2aW2fPEZGXo4qlEhP2tyVTfj78F2p0Fnf3NSyvLZAzSwKo4w8EyZfXn1RI4sM95kzIMhH3Gz8VxCZWKEgr7cKNU5Zhs8un/VFd9Mc0KyZfmVy4VrZ5JumgahBRzSn9zGUCgYA7TTt3/cfRvVU6qbkajBw9nrYcRNLhogzdG+GdzSVXU6eqcVN4DunMwoySatXvEC2rgxF8wGyUZ4ZbHaPsl/ImE3HNN+gb0Qo8C/d719UI5mvA2LGioRzz4XwNTkQUaeZQWlBTJUTYK8t9KVV0um6xaRdTnlMnP0p088lFFILKTQKBgDsR98selKyF1JBXPl2s8YCGfU2bsWIAukz2IG/BcyNgn2czFfkxCxd5qy5z7LGnUxRgPHBu5oml9PBxJKDwLzwsA8GKosBu/00KZ9zwY8ZECn0uaH5qWOacuLE+HK9zFq0kE1lfF65XtlaMWH5+0JFS2HxlBVJMEVTLfcquCPtNAoGAG6ytPm24AS1satPwlKnZODQEg0kc7d5S42jbt4X7lwICY/7gORSdbNS8OmrqYhP/8tDOAUtzPQ20fEt43/VA87bq88BVzoSp4GVQcSL44MzRBQHQwTVkoVnbCXSD9K9gZ71wii+m+8rZZ0EMdiTR3hsRXRuSmw4t8y3CuzlZ9k4="
guard let data = Data(base64Encoded: privateKeyBase64String, options: [.ignoreUnknownCharacters]) else {
return nil
}
let sizeInBits = data.count * 8
let keyDict: [CFString: Any] = [
kSecAttrKeyType: kSecAttrKeyTypeRSA,
kSecAttrKeyClass: kSecAttrKeyClassPrivate,
kSecAttrKeySizeInBits: NSNumber(value: sizeInBits)
]
var error: Unmanaged<CFError>?
guard let secKey = SecKeyCreateWithData(data as CFData, keyDict as CFDictionary, &error) else {
return nil
}
guard let bundleID = Bundle.main.bundleIdentifier else {
return nil
}
guard let signature = SecKeyCreateSignature(secKey,
.rsaSignatureMessagePKCS1v15SHA512,
Data(bundleID.utf8) as CFData,
&error) else {
return nil
}
return (signature as Data).base64EncodedString()
}

Please find the Flutter repository .
Add the lines below in pubspec.yaml of the project you want to add the plugin to.
Add the license file (e.g., license.json or forensics.license) to the Flutter application/assets folder. In pubspec.yaml, specify the Flutter asset:
For Android, add the Oz repository to /android/build.gradle, allprojects → repositories section:
For Flutter 8.24.0 and above or Android Gradle plugin 8.0.0 and above, add to android/gradle.properties:
The minimum SDK version should be 21 or higher:
For iOS, set the minimum platform to 13 or higher in the Runner → Info → Deployment target → iOS Deployment Target.
In ios/Podfile, comment the use_frameworks! line (#use_frameworks!).
Initialize SDK by calling the init plugin method. Note that the license file name and path should match the ones specified in pubspec.yaml (e.g., assets/license.json).
Use the API credentials (login, password, and API URL) that you’ve received from us.
In production, instead of hard-coding the login and password inside the application, it is recommended to get the access token on your backend via the API auth method, then pass it to your application:
By default, logs are saved along with the analyses' data. If you need to keep the logs distinct from the analysis data, set up a separate connection for telemetry as shown below:
or
To start recording, use the startLiveness method to obtain the recorded media:
Parameters:
actions – List<VerificationAction> – actions from the captured video;
use_main_camera – Boolean – if True, uses the main camera, otherwise the front one.
Please note: for versions 8.11 and below, the method name is executeLiveness, and it returns the recorded media.
To obtain the media result, subscribe to livenessResult as shown below:
To run the analyses, execute the code below.
Create the Analysis object:
Execute the formed analysis:
If you need to run an analysis for a particular folder, pass its ID:
The analysisResult list of objects contains the result of the analysis.
If you want to use media captured by another SDK, the code should look like this:
The whole code block will look like this:
// replace VerificationAction.blank with your Liveness gesture if needed
final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);
final analysis = [
Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
];
final analysisResult = await OZSDK.analyze(analysis, [], {});

// replace VerificationAction.blank with your Liveness gesture if needed
final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);
final biometryMedia = [...cameraMedia];
biometryMedia.add(
Media(
FileType.documentPhoto,
VerificationAction.blank,
MediaType.movement,
null,
<your reference image path>,
null,
null,
MediaTag.photoSelfie,
),
);
final analysis = [
Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
Analysis(Type.biometry, Mode.serverBased, biometryMedia, {}),
];
final analysisResult = await OZSDK.analyze(analysis, [], {});

ozsdk:
  git:
    url: https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin.git
    ref: '8.8.2'

flutter:
  assets:
    - assets/license.json # please note that the license file name must match the one placed in assets

allprojects {
repositories {
google()
mavenCentral()
maven { url 'https://ozforensics.jfrog.io/artifactory/main' } // repository URL
}
}

android.nonTransitiveRClass=false

defaultConfig {
...
minSDKVersion 21
...
}

await OZSDK.initSDK([<% license path and license file name %>]);

await OZSDK.setApiConnectionWithCredentials(<login>, <password>, <host>);

await OZSDK.setApiConnectionWithToken(token, host);

await OZSDK.setEventConnectionWithCredentials(<login>, <password>, <host>);

await OZSDK.setEventConnectionWithToken(<token>, <host>);

await OZSDK.startLiveness(<actions>, <use_main_camera>);

class Screen extends StatefulWidget {
static const route = 'liveness';
const Screen({super.key});
@override
State<Screen> createState() => _ScreenState();
}
class _ScreenState extends State<Screen> {
late StreamSubscription<List<Media>> _subscription;
@override
void initState() {
super.initState();
// subscribe to liveness result
_subscription = OZSDK.livenessResult.listen(
(List<Media> medias) {
// media contains liveness media
},
onError: (Object error) {
// handle error, in most cases PlatformException
},
);
}
@override
Widget build(BuildContext context) {
// omitted to shorten the example
}
void _startLiveness() async {
// use startLiveness to start liveness screen
OZSDK.startLiveness(<list of actions>);
}
@override
void dispose() {
// cancel subscription
_subscription.cancel();
super.dispose();
}
}

List<Analysis> analysis = [ Analysis(Type.quality, Mode.serverBased, <media>, {}), ];

final analysisResult = await OZSDK.analyze(analysis, [], {});

final analysisResult = await OZSDK.analyze(analysis, folderID, [], {});

media = Media(FileType.documentPhoto, VerificationAction.oneShot, "photo_selfie", null, <path to image>, null, null, "")

The plugin window is launched with the open(options) method:
Call GET /api/folders/?meta_data=transaction_id==<your_transaction_id> to find a folder in Oz API from your backend by your unique identifier.
Read more about Oz API.
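A minimal sketch of this lookup from a Node.js (18+) backend; the host placeholder and the X-Forensic-Access-Token header name are assumptions, so adjust them to your Oz API setup:

// Sketch: find the folder created for a known transaction_id
const transactionId = '<your_transaction_id>'; // the value passed in meta.transaction_id
const response = await fetch(
  `https://<your-oz-api-host>/api/folders/?meta_data=transaction_id==${transactionId}`,
  { headers: { 'X-Forensic-Access-Token': '<access_token>' } } // assumed auth header
);
const folders = await response.json(); // inspect the folder and its analyses' results here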
The full list of OzLiveness.open() parameters:
options – an object with the following settings:
token – (optional) the auth token;
license – an object containing the license data;
OzLiveness.open({
lang: 'en',
action: [
'photo_id_front', // request photo ID picture
'video_selfie_blank' // request passive liveness video
],
meta: {
// an ID of user undergoing the check
// add for easier conversion calculation
'end_user_id': '<user_or_lead_id>',
// Your unique identifier that you can use later to find this folder in Oz API
// Optional, yet recommended
'transaction_id': '<your_transaction_id>',
// You can add iin if you plan to group transactions by the person identifier
'iin': '<your_client_iin>',
// Other meta data
'meta_key': 'meta_value',
},
on_error: function (result) {
// error details
console.error('on_error', result);
},
on_complete: function (result) {
// This callback is invoked when the analysis is complete
// It is recommended to commence the transaction on your backend,
// using transaction_id to find the folder in Oz API and get the results
console.log('on_complete', result);
},
on_capture_complete: function (result) {
// Handle captured data here if necessary
console.log('on_capture_complete', result);
}
});

licenseUrl – a string containing the path to the license;
lang – a string containing the identifier of one of the installed language packs;
meta – an object with names of meta fields in keys and their string values in values. Metadata is transferred to Oz API and can be used to obtain analysis results or for searching;
params – an object with identifiers and additional parameters:
extract_best_shot – true or false: run the best frame choice in the Quality analysis;
action – an array of strings with identifiers of actions to be performed.
Available actions:
photo_id_front – photo of the ID front side;
photo_id_back – photo of the ID back side;
video_selfie_left – turn head to the left;
video_selfie_right – turn head to the right;
video_selfie_down – tilt head downwards;
video_selfie_high – raise head up;
video_selfie_smile – smile;
video_selfie_eyes – blink;
video_selfie_scan – scanning;
video_selfie_blank – no action, simple selfie;
video_selfie_best – special action to select the best shot from a video and perform analysis on it instead of the full video.
overlay_options – the document's template displaying options:
show_document_pattern: true/false – true by default, displays a template image, if set to false, the image is replaced by a rectangular frame;
on_submit – a callback function (no arguments) that is called after submitting customer data to the server (unavailable for the capture mode).
on_capture_complete – a callback function (with one argument) that is called after the video is captured and retrieves the information on this video. The example of the response is described here.
on_result – a callback function (with one argument) that is called periodically during the analysis and retrieves an intermediate result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode configuration parameter and is described here.
on_complete – a callback function (with one argument) that is called after the check is completed and retrieves the analysis result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode configuration parameter and is described here.
on_error – a callback function (with one argument) that is called in case of any error that happens during video capturing and retrieves the error information: an object with the error code, error message, and telemetry ID for logging.
on_close – a callback function (no arguments) that is called after the plugin window is closed (whether manually by the user or automatically after the check is completed).
style – the customization section.
device_id – (optional) identifier of camera that is being used.
enable_3d_mask – enables the 3D mask as the default face capture behavior. This parameter works only if load_3d_mask in the Web Adapter configuration parameters is set to true; the default value is false.
cameraFacingMode (since 1.4.0) – the parameter that defines which camera to use; possible values: user (front camera), environment (rear camera). This parameter only works if the use_for_liveness option in the Web Adapter configuration file is undefined. If use_for_liveness is set (with any value), cameraFacingMode gets overridden and ignored.
disable_adaptive_aspect_ratio (since 1.5.0) – if True, disables the video adaptive aspect ratio, so your video doesn’t automatically adjust to the window aspect ratio. The default value is False, and by default, the video adjusts to the closest ratio of 4:3, 3:4, 16:9, or 9:16. Please note: smartphones still require the portrait orientation to work.
get_user_media_timeout (since 1.5.0) – when Web SDK can’t get access to the user camera, after this timeout it displays a hint on how to solve the problem. The default value is 40000 (ms).
If the getUserMedia() function hangs, you can manage the SDK behavior using the following parameters (since 1.7.15); a short sketch follows this list:
get_user_media_promise_timeout_ms – sets the timeout (in ms) after which SDK will throw an error or display an instruction. This parameter is an object with the following keys: "platform_browser", "browser", "platform", "default" (the priority matches this sequence).
get_user_media_promise_timeout_throw_error – defines whether, after the time period defined in the parameter above, SDK should call an error (if true) or display a user instruction (if false).
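A minimal sketch of these two options (the millisecond values are illustrative only):

OzLiveness.open({
  // timeout per environment, in ms; priority: platform_browser, browser, platform, default
  get_user_media_promise_timeout_ms: {
    platform_browser: 15000,
    browser: 20000,
    platform: 20000,
    default: 30000
  },
  // false – display the user instruction after the timeout; true – throw an error instead
  get_user_media_promise_timeout_throw_error: false,
  // ... other OzLiveness.open() options
});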
In this article, you’ll learn how to capture videos and send them through your backend to Oz API.
Here is the data flow for your scenario:
1. Oz Web SDK takes a video and makes it available for the host application as a frame sequence.
2. The host application calls your backend, passing the archive of these frames.
3. After the necessary preprocessing steps, your backend calls Oz API, which performs all necessary analyses and returns the analyses’ results.
4. Your backend responds back to the host application if needed.
On the server side, Web SDK must be configured to operate in the Capture mode:
The architecture parameter must be set to capture in the app_config.json file.
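For reference, the relevant fragment of app_config.json might look like this (all other settings omitted):

{
  "architecture": "capture"
}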
In your Web app, add a callback to process captured media when opening the Web SDK:
The result object structure depends on whether any virtual camera is detected or not.
Here’s the list of variables with descriptions.
The video from Oz Web SDK is a frame sequence, so, to send it to Oz API, you’ll need to archive the frames and transmit them as a ZIP file via the POST /api/folders request (check our).
You can retrieve the MP4 video from a folder using the /api/folders/{{folder_id}} request with this folder's ID. In the JSON that you receive, look for the preview_url in source_media. The preview_url parameter contains the link to the video. From the plugin, MP4 videos are unavailable (only as frame sequences).
Oz API accepts data without the base64 encoding.
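A hedged sketch of step 3 from a Node.js (18+) backend; the VIDEO_FILE_KEY media key, the frames.zip file name, and the X-Forensic-Access-Token header are placeholders or assumptions, and the exact request body format is shown at the end of this article:

// Sketch: forward the frames archive to Oz API and attach additional_info
import { readFile } from 'node:fs/promises';

const zip = await readFile('frames.zip'); // ZIP archive with the captured frames
const form = new FormData();
form.append('VIDEO_FILE_KEY', new Blob([zip]), 'frames.zip'); // your media key name
form.append('payload', JSON.stringify({
  'media:meta_data': { VIDEO_FILE_KEY: { additional_info: '<additional_info>' } }
}));

const response = await fetch('https://<your-oz-api-host>/api/folders/', {
  method: 'POST',
  headers: { 'X-Forensic-Access-Token': '<access_token>' }, // assumed auth header
  body: form,
});
console.log(await response.json()); // the created folder with the uploaded media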
best_frame – String – the best frame, JPEG in the data URL format.
best_frame_png – String – the best frame, PNG in the data URL format; it is required for protection against virtual cameras when video is not used.
best_frame_bounding_box – Array[Named_parameter: Int] – the coordinates of the bounding box where the face is located in the best frame.
best_frame_landmarks – Array[Named_parameter: Array[Int, Int]] – the coordinates of the face landmarks (left eye, right eye, nose, mouth, left ear, right ear) in the best frame.
frame_list – Array[String] – all frames in the data URL format.
frame_bounding_box_list – Array[Array[Named_parameter: Int]] – the coordinates of the bounding boxes where the face is located in the corresponding frames.
frame_landmarks – Array[Named_parameter: Array[Int, Int]] – the coordinates of the face landmarks (left eye, right eye, nose, mouth, left ear, right ear) in the corresponding frames.
action – String – an action code.
additional_info – String – information about the client environment.
Also, in the POST {{host}}/api/folders request, you need to add the additional_info field. It is required for the capture architecture mode to gather the necessary information about the client environment. Here’s the example of filling in the request’s body:
OzLiveness.open({
... // other parameters
on_capture_complete: function(result) {
// Your code to process media/send it to your API, this is STEP #2
}
})

{
"action": <action>,
"best_frame": <bestframe>,
"best_frame_png": <bestframe_png>,
"best_frame_bounding_box": {
"left": <bestframe_bb_left>,
"top": <bestframe_bb_top>,
"right": <bestframe_bb_right>,
"bottom": <bestframe_bb_bottom>
},
"best_frame_landmarks": {
"left_eye": [bestframe_x_left_eye, bestframe_y_left_eye],
"right_eye": [bestframe_x_right_eye, bestframe_y_right_eye],
"nose_base": [bestframe_x_nose_base, bestframe_y_nose_base],
"mouth_bottom": [bestframe_x_mouth_bottom, bestframe_y_mouth_bottom],
"left_ear": [bestframe_x_left_ear, bestframe_y_left_ear],
"right_ear": [bestframe_x_right_ear, bestframe_y_right_ear]
},
"frame_list": [<frame1>, <frame2>],
"frame_bounding_box_list": [
{
"left": <frame1_bb_left>,
"top": <frame1_bb_top>,
"right": <frame1_bb_right>,
"bottom": <frame1_bb_bottom>
},
{
"left": <frame2_bb_left>,
"top": <frame2_bb_top>,
"right": <frame2_bb_right>,
"bottom": <frame2_bb_bottom>
},
],
"frame_landmarks": [
{
"left_eye": [frame1_x_left_eye, frame1_y_left_eye],
"right_eye": [frame1_x_right_eye, frame1_y_right_eye],
"nose_base": [frame1_x_nose_base, frame1_y_nose_base],
"mouth_bottom": [frame1_x_mouth_bottom, frame1_y_mouth_bottom],
"left_ear": [frame1_x_left_ear, frame1_y_left_ear],
"right_ear": [frame1_x_right_ear, frame1_y_right_ear]
},
{
"left_eye": [frame2_x_left_eye, frame2_y_left_eye],
"right_eye": [frame2_x_right_eye, frame2_y_right_eye],
"nose_base": [frame2_x_nose_base, frame2_y_nose_base],
"mouth_bottom": [frame2_x_mouth_bottom, frame2_y_mouth_bottom],
"left_ear": [frame2_x_left_ear, frame2_y_left_ear],
"right_ear": [frame2_x_right_ear, frame2_y_right_ear]
}
],
"from_virtual_camera": null,
"additional_info": <additional_info>
}

{
"action": <action>,
"best_frame": null,
"best_frame_png": null,
"best_frame_bounding_box": null,
"best_frame_landmarks": null
"frame_list": null,
"frame_bounding_box_list": null,
"frame_landmarks": null,
"from_virtual_camera": {
"additional_info": <additional_info>,
"best_frame": <bestframe>,
"best_frame_png": <best_frame_png>,
"best_frame_bounding_box": {
"left": <bestframe_bb_left>,
"top": <bestframe_bb_top>,
"right": <bestframe_bb_right>,
"bottom": <bestframe_bb_bottom>
},
"best_frame_landmarks": {
"left_eye": [bestframe_x_left_eye, bestframe_y_left_eye],
"right_eye": [bestframe_x_right_eye, bestframe_y_right_eye],
"nose_base": [bestframe_x_nose_base, bestframe_y_nose_base],
"mouth_bottom": [bestframe_x_mouth_bottom, bestframe_y_mouth_bottom],
"left_ear": [bestframe_x_left_ear, bestframe_y_left_ear],
"right_ear": [bestframe_x_right_ear, bestframe_y_right_ear]
},
"frame_list": [<frame1>, <frame2>],
"frame_bounding_box_list": [
{
"left": <frame1_bb_left>,
"top": <frame1_bb_top>,
"right": <frame1_bb_right>,
"bottom": <frame1_bb_bottom>
},
{
"left": <frame2_bb_left>,
"top": <frame2_bb_top>,
"right": <frame2_bb_right>,
"bottom": <frame2_bb_bottom>
},
],
"frame_landmarks": [
{
"left_eye": [frame1_x_left_eye, frame1_y_left_eye],
"right_eye": [frame1_x_right_eye, frame1_y_right_eye],
"nose_base": [frame1_x_nose_base, frame1_y_nose_base],
"mouth_bottom": [frame1_x_mouth_bottom, frame1_y_mouth_bottom],
"left_ear": [frame1_x_left_ear, frame1_y_left_ear],
"right_ear": [frame1_x_right_ear, frame1_y_right_ear]
},
{
"left_eye": [frame2_x_left_eye, frame2_y_left_eye],
"right_eye": [frame2_x_right_eye, frame2_y_right_eye],
"nose_base": [frame2_x_nose_base, frame2_y_nose_base],
"mouth_bottom": [frame2_x_mouth_bottom, frame2_y_mouth_bottom],
"left_ear": [frame2_x_left_ear, frame2_y_left_ear],
"right_ear": [frame2_x_right_ear, frame2_y_right_ear]
}
]
}
}"VIDEO_FILE_KEY": VIDEO_FILE_ZIP_BINARY
"payload": "{
"media:meta_data": {
"VIDEO_FILE_KEY": {
"additional_info": <additional_info>
}
}
}"iOS SDK changes
Resolved the issue with SDK not returning the license-related callbacks.
Enhanced security.
Resolved the issue with SDK sometimes not responding to user actions on some devices.
Updated SDK to support the upcoming security features.
Fixed the bug with crashes that might happen during the Biometry analysis after taking a reference photo using camera.
Enhanced security.
The Scan gesture animation now works properly.
Fixed the bug where SDK didn’t call completion during initialization in debug mode.
Enhanced security.
Addressed an SDK crash that occasionally happened when invoking the license.
We highly recommend updating to this version.
Resolved the issue with integration via Swift UI.
SDK no longer crashes on smartphones that are running low on storage.
Security and telemetry updates.
Security updates.
Xcode updated to version 16 to comply with Apple requirements.
Security updates.
Updated the authorization logic.
Improved voiceover.
SDK now compresses videos if their size exceeds 10 MB.
Head movement gestures are now handled properly.
Changed the wording for the head_down gesture: the new wording is “tilt down”.
Added proper focus order for VoiceOver when the antiscam hint is enabled.
Added the public setting extract_action_shot in the Demo Application.
Bug fixes.
Accessibility updates according to WCAG requirements: the SDK hints and UI controls can be voiced.
Improved user experience with head movement gestures.
Minor bug fixes and telemetry updates.
The screen brightness no longer changes when the rear camera is used.
Fixed the video recording issues on some smartphone models.
Security and telemetry updates.
Internal SDK improvements.
Added Xcode 16 support.
Security and telemetry updates.
Security updates.
Bug fixes.
SDK now requires Xcode 15 and higher.
Security updates.
Bug fixes.
Internal SDK improvements.
Internal SDK improvements.
Bug fixes.
Logging updates.
Security updates.
You can now install iOS SDK via Swift Package Manager.
The sample is now available on SwiftUI. Please find it .
Added a description for the error that occurs when providing an empty string as an ID in the addFolderID method.
Bug fixes.
The messages displayed by the SDK after uploading media have been synchronized with Android.
The bug causing analysis delays that might have occurred for the One Shot gesture has been fixed.
The length of the Selfie gesture is now (affects the video file size).
You can use your own logo instead of the Oz logo if your license allows it.
Removed the pause after the Scan gesture.
The code in is now up-to-date.
Security updates.
Changed the default behavior in case a localization key is missing: now the English string value is displayed instead of a key.
Fixed some bugs.
Internal licensing improvements.
Implemented the possibility of using a master license that works with any bundle_id.
Fixed the bug with background color flashing.
Bug fixes.
The Analysis structure now contains the sizeReductionStrategy field. This field defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully.
The messages for the errors that are retrieved from API are now detailed.
Added the toFrameGradientColor option in hintAnimationCustomization.
If multiple analyses are applied to the folder simultaneously, the system sends them as a group. It means that the “worst” of the results will be taken as resolution, not the latest. Please refer to for details.
For the Liveness analysis, the system now treats the highest score as a quantitative result. The Liveness analysis output is described .
Updated the Liveness on-device model.
Added the Portuguese (Brazilian) locale.
You can now add a custom or update an existing language pack. The instructions can be found .
If a media hasn't been uploaded correctly, the system now repeats the upload.
Fixed some bugs and improved the SDK algorithms.
Added the new analysis mode – hybrid (Liveness only). If the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.
Improved the on-device models.
Updated the run method.
Added new structures: RequestStatus, ResultMedia, RequestResult.
Added the center hint background customization.
Added new face frame forms (Circle, Square).
Added the antiscam widget and its . This feature allows you to alert your customers that the video recording is being conducted, for instance, for loan application purposes. The purpose of this is to safeguard against scammers who may attempt to deceive an individual into approving a fraudulent transaction.
Fixed the issue with the server-based One shot analysis.
Improved the SDK algorithms.
Fixed error handling when uploading a file to API. From this version, an error will be raised to a host application in case of an error during file upload.
Improved the on-device Liveness.
Fixed the animation for sunglasses/mask.
Fixed the bug with the .document analysis.
Updated the descriptions of customization methods and structures.
Updated the TensorFlow version to 2.11.
Fixed several bugs, including the Biometry check failures on some phone models.
Added customization for the hint animation.
Integrated a new model.
Added the uploadMedia method to AnalysisRequest. The addMedia method is now deprecated.
Fixed the combo analysis error.
Implemented a range of options and switched to the new design. To restore the previous settings, please refer to .
The run method now works similar to the one in Android SDK and returns an .
Synchronized the version numbers with Android SDK.
Added a new field to the Analysis structure. The params field is for any additional parameters, for instance, if you need to set extracting the best shot on server to true. The best shot algorithm chooses the most high-quality frame from a video.
Fixed some localization issues.
The Zoom in and Zoom out gestures are no longer supported.
Added a new simplified analysis structure – AnalysisRequest.
Added methods of on-device analysis: runOnDeviceLivenessAnalysis and runOnDeviceBiometryAnalysis.
You can choose the installation version. Standard installation gives access to full functionality. The core version (OzLivenessSDK/Core) installs SDK without the on-device functionality.
Added the Turkish locale.
Added the Kyrgyz locale.
Added Completion Handler for analysis results.
Added Error User Info to telemetry to show detailed info in case of an analysis error.
Added local on-device analysis.
Added oval and rectangular frames.
Added Xcode 12.5.1+ support.
Added SDK configuration with licenses.
Added the One Shot gesture.
Improved OZVerificationResult: added bestShotURL which contains the best shot image and preferredMediaURL which contains an URL to the best quality video.
When performing a local check, you can now choose a main or back camera.
Authorization sessions extend automatically.
Updated authorization interfaces.
Added the Kazakh locale.
Added license error texts.
You can cancel network requests.
You can specify Bundle for license.
Added analysis parameterization documentAnalyse.
Added license support.
Added Xcode 12 support instead of 11.
Fixed the documentAnalyse error where you had to fill analyseStates to launch the analysis.
Security updates.
Security and logging updates.
Got back the iOS 11 support.
Added a new method to retrieve the telemetry (logging) identifier: getEventSessionId.
The setPermanentAccessToken, configure and login methods are now deprecated. Please use the setApiConnection method instead.
The setLicense(from path:String) method is now deprecated. Please use the setLicense(licenseSource: LicenseSource) method instead.
Fixed some bugs and improved the SDK work.
The updated AnalysisResult structure should now be used instead of OzAnalysisResult.
For the OZMedia object, you can now specify additional tags that are not included into our tags list.
The Selfie video length is now about 0.7 sec, the file size and upload time are reduced.
The hint text width can now exceed the frame width (when using the main camera).
The methods below are no longer supported:
AnalysisRequest.run
addMedia
uploadMedia
Added the Spanish locale.
iOS 11 is no longer supported, the minimal required version is 12.
Added a button to reset the SDK theme and language settings.
Fixed some bugs and localization issues.
Extended the network request timeout to 90 sec.
Added a setting for the animation icon size.
Changed the Combo gesture.
Now you can launch the Liveness check to analyze images taken with another SDK.
Improved the licensing process: now you can add a license when initializing SDK: OZSDK(licenseSources: [LicenseSource], completion: @escaping ((LicenseData?, LicenseError?) -> Void)), where LicenseSource is a path to the physical location of your license and LicenseData contains the license information.
Added the setLicense method to force license adding.
Removed method → Replacement
analyse → AnalysisRequest.run
addToFolder → uploadMedia
documentAnalyse → AnalysisRequest.run
uploadAndAnalyse → AnalysisRequest.run
runOnDeviceBiometryAnalysis → AnalysisRequest.run
runOnDeviceLivenessAnalysis → AnalysisRequest.run
Enhanced security.
You can now launch Web Plugin in the windowed mode. Define parent_container in OpenOptions: parent_container: string | HTMLElement.
Loader transitions are now customizable. Please refer to the customization settings.
Improved handling of head movement gestures (up / down).
Resolved an issue with excessive error messages in console.
Updated telemetry.
Enhanced security.
You can now replace the default loader with your custom one. Please refer to the customization settings.
The behavior of SDK, when getting camera access takes too long, is now manageable. Please refer to get_user_media_promise_timeout_* parameters listed here.
Security and telemetry updates.
Breaking changes: we no longer transmit scores in callbacks. Replace confidence_spoofing with 0 for SUCCESS and with 1 for other statuses.
Fixed the bug with Web SDK being unable to read the image_data_tensor properties.
Resolved the issue when OzLiveness was sometimes called later than expected.
Security update.
Added support for API 6.0.
CORS headers in server configuration should now be specified without quotation marks.
Added a new parameter to manage authorization. auth defines whether and what authorization is used:
true (the default value) – authorization is required and is based on the generated key;
user:pass – authorization is required and is based on the user login and password;
false – no authorization needed.
Fixed the bug with colors being applied incorrectly during 3D mask customization.
Resolved the issue with incorrect mirroring when the use_for_liveness parameter is not set.
The document scan in plugin now works properly.
Improved accessibility: the hints throughout the customer journey (camera access, processing data, uploading data, requesting results) are now properly and completely voiced by screen readers in assertive mode (changes in hints are announced immediately).
Created an endpoint for license verification: [GET] /check_license.php.
Reduced the bundle size.
Fixed the issue with missing analysis details in the on_complete callback when using result_mode: full.
Fixed the issue when the camera switch button might have been missed.
The front camera no longer displays the user's actions as in a mirror image.
Improved error handling.
Improved support for low-performance devices.
Added the closed eyes check to the Scan gesture.
Internal improvements, bug fixes, major telemetry and security updates.
Simplified the checks that require user to move their head: turning left or right, tilting, or looking down.
Decreased the distance threshold for the head-moving actions: turning left or right, tilting, or looking down.
The application's behavior when open dev tools are detected is now manageable.
You can now configure method signatures to make them trusted via checksum of the modified function.
Changed the wording for the head_down gesture: the new wording is “tilt down”.
For possible regulatory requirements, updated the with a new parameter: extract_action_shot. If True, for each gesture, the system saves the corresponding image to display it in the report (e.g., closed eyes for blinking) instead of a random frame for the thumbnail.
Fixed an issue where an arrow incorrectly appeared after capturing head-movement gestures.
Fixed an issue where the oval disappeared when the "Great!" phrase was displayed.
Improved the selection of .
Security updates.
Security updates.
Resolved the issue where a video could not be generated from a sequence of frames.
The on_complete callback now is called upon folder status change.
Updated instructions for camera access in the Android Chrome and Facebook browsers. New keys:
error_no_camera_access,
oz_tutorial_camera_android_chrome_with_screens_title,
oz_tutorial_camera_android_chrome_instruction_screen_click_settings,
oz_tutorial_camera_android_chrome_instruction_screen_permissions,
oz_tutorial_camera_android_chrome_instruction_screen_allow_access,
try_again,
oz_tutorial_camera_external_browser_button,
oz_tutorial_camera_external_browser_manual_open_link,
oz_tutorial_camera_external_browser_title.
Added the get_langs() method that returns a list of locales available in the installed Web SDK.
Added an error for the case of setting a non-available locale.
Added an error for the case of lacking of a necessary resource. New key: unable_load_resource.
Changed texts for the error_connection_lost and error_service_unavailable errors.
Uploaded new Web SDK string files.
The crop function no longer adds borders for images smaller than 512×512.
In case of camera access timeout, we now display a page with instructions for users to enable camera access: default for all browsers and specific for Facebook.
Added several localization records to the Web SDK strings file. New localization keys:
accessing_camera_switch_to_another_browser,
error_camera_timeout_instruction,
error_camera_timeout_title,
error_camera_timeout_android_facebook_instruction.
Improved user experience for card printer machines. Users no longer need to get that close to the screen with face frame.
Added the disable_adaptive_aspect_ratio parameter to the Web Plugin. This parameter switches off the default video aspect ratio adjustment to the window.
Implemented the get_user_media_timeout parameter for Web Plugin: when SDK can’t get access to the user camera, after this timeout it displays a hint on how to solve the problem.
Added several localization records into the . New keys:
oz_tutorial_camera_android_edge_browser
oz_tutorial_camera_android_edge_instruction
oz_tutorial_camera_android_edge_title
Improved the localization: when SDK can’t find a translation for a key, it displays a message in English.
You can now distribute the serverless Web SDK via Node Package Manager.
You can switch off the display of API errors in modal windows. Set the disable_adapter_errors_on_screen parameter in the configuration file to True.
The mobile browsers now use the rear camera to take the documents’ photos.
Updated samples.
Fixed the bug with abnormal 3D mask reaction when user needs to repeat a gesture.
Logging and security updates.
Fixed the bug where the warning about incorrect device orientation was not displayed when a mobile user attempted to take a video with their face in landscape orientation.
Some users may have experienced freezes while using WebView. Now, users can tap a button to continue working with the application. The corresponding string has been added to the string file in the localization section. Key: tap_to_continue.
Debugging improvements.
Major security updates: improved protection against virtual cameras and JavaScript tampering.
Improved WebView support:
Added camera access instructions for applications within the generic WebView browsers on Android and iOS. The corresponding events are added to telemetry.
Improved the React Native app integration by adding the webkit-playsinline attribute, thereby fixing the issue of the full-screen camera launch on iOS WebView.
The error shown when an iframe is used while iframe_allowed = False is now displayed properly.
New localization keys:
oz_tutorial_camera_android_webview_browser
oz_tutorial_camera_android_webview_instruction
oz_tutorial_camera_android_webview_title
You can now use Web SDK for the Black List analysis: to compare the face from your Liveness video with faces from your database. Create a collection (or collections) with these photos via API or Web UI, and add the corresponding ID (or IDs) to the analyses.collection_ids array in the Web Adapter configuration file.
The iframe support is back: set the iframe_allowed parameter in the Web Adapter configuration file to True.
The interval for polling for the analyses’ results is now configurable. Change it in the results_polling_interval parameter of the Web Adapter configuration file if necessary.
You can now select the front or back camera via Web Plugin. In the OzLiveness.open() method, set cameraFacingMode to user for the front camera and environment for the back one. This parameter only works when the use_for_liveness option in the Web Adapter configuration file is not set.
The plugin styles are now being added automatically. Please remove <link rel="stylesheet" href="/plugin/ozliveness.css" /> from your page to prevent style conflicts.
Fixed some bugs and updated telemetry.
Improved the protection against injection attacks.
Replaced the code for Brazilian Portuguese from pt to pt-br to match the ISO standard.
Removed the lang_default adapter parameter.
The 3D mask transparency became customizable.
Implemented the possibility of using a master license that works with any domain.
Added the master_license_signature option into Web Adapter configuration parameters.
Fixed some bugs.
Internal SDK improvements.
To enhance your clients’ experience with Web SDK, we implemented the 3D-mask that replaces the oval during face capture. To make it work, set the load_3d_mask in Configuration file settings to true.
Updated telemetry (logging).
Logging updates.
Security updates.
Internal SDK improvements.
Internal SDK improvements.
Fixed some bugs.
Changed the signature of the on_error() callback: now it returns an object with the error code, error message, and telemetry ID for logging.
Added the configuration parameter for the debug mode. If True, the Web SDK enables access to the /debug.php page, which contains information about the current configuration and the current license.
Fixed some bugs and improved logging.
If your device has multiple cameras, you can now choose one when launching the Web Plugin.
Implemented the new design for SDK and demo, including the scam protection option: the antiscam message warns user about their actions being recorded. Please check the new customization options here.
Added the Portuguese, Spanish, and Kazakh locales.
Added the combo gesture.
Added the progress bar for media upload.
Removed the Zoom in / Zoom out gestures.
On tablets, you can now capture video in landscape orientation.
Removed the lang_allow option from Web Adapter configuration file.
In the capture architecture, when a virtual camera is detected, the additional_info parameter is inside the from_virtual_camera section.
You can now crop the lossless frame without losing quality.
Fixed face landmarks for the capture architecture.
Improved the recording quality;
Reforged licensing:
added detailed error descriptions;
now you can set the license in JS during the runtime;
when you set a license in OzLiveness.open(), it rewrites the previous license;
the license no longer requires port and protocol;
you can now specify subdomains in the license;
upon the launch of the plugin on a server, the license payload is displayed in the Docker log;
localhost and 127.0.0.1 no longer ask for a license;
The on_capture_complete callback is now available on any architecture: it is called once a video is taken and returns info on actions from the video;
Oz Web Liveness and Oz Web Adapter versions are displayed in the Docker log upon launch;
Deleted the deprecated adapter_version field from order metadata;
to pass the information about the bounding box – landmarks that define where the face in the frame is;
Fixed the Switch camera button in Google Chrome;
Upon the start of Web SDK, the actual configuration parameters are displayed in the Docker log.
Changed the extension of some Oz system files from .bin to .dat.
Additional scripts are now called using the main script's address.
Web SDK now can be installed via static files only (works for the capture type of architecture).
Web SDK can now work with CDN.
Now, you can launch several Oz Liveness plugins on different pages. In this case, you need to specify the path to scripts in head of these pages.
Fixed a bug with the shooting screen.
Added licensing (requires origin).
You can now customize the look-and-feel of Web SDK.
Fixed Angular integration.
Fixed the bug where the IMAGE_FOLDER section was missed in the JSON response with the lossless frame enabled.
Fixed issues with the ravenjs library.
A frame for taking a documents photo is now customizable.
Implemented security updates.
Metadata now contains names of all cameras you can use.
Video and zip formats now allow loading a lossless image.
Fixed Best Shot.
Separated the error code and error description in server responses.
If the SDK mode is set in the environment variables (architecture, api_url), it is passed to the settings automatically.
In the Lite mode, you can select the best frame for any action.
In the Lite mode, an image sent via API gets the on_complete status only after a successful liveness.
You can manage CORS using the environment variables (CORS headers are not added by default).
Added the folder value for result_mode: it returns the same value as status but with folder_id.
Updated encryption: now only metadata required to decrypt an object is encrypted.
Updated data transfer: images are being sent in separate form fields.
Added the camera parameters check.
Enabled a new method for image encryption.
Optimized image transfer format.
Added the use_for_liveness option: mobile devices use back camera by default, on desktop, flip and oval circling are off. By default, the option is switched off.
Decreased video length for video_selfie_best (the Selfie gesture) from 1 to 0.2 sec.
Loading scripts is now customizable.
Improved UX.
Added the Kazakh locale.
Added a guide for accessing the camera on a desktop.
Improved logging: plugin_liveness.php requests and the user-agent are now recorded in the server log.
Added the Lite mode.
Added encryption.
Updated libraries.
You can now hide the Oz Forensics logo.
Updated a guide for Facebook, Instagram, Samsung, Opera.
Added handlers for unknown variables and a guide for “unknown” browsers.
Optimized memory usage for a frame.
Added a guide on how to switch cameras on using Android browsers.
Android SDK changes
Fixed a minor bug.
Fixed occasional SDK crashes in specific cases and / or on specific devices.
Enhanced security.
Improved SDK performance for some devices.
Updated SDK to support the upcoming security features.
Fixed the bug with green videos on some smartphone models.
Resolved the issue with mediaId appearing null.
Enhanced security.
Resolved an issue with warning that could appear when running Fragment.
SDK no longer crashes when calling copyPlane.
When you choose to send compressed videos for a hybrid analysis, SDK no longer saves the original media alongside the compressed one.
Updated Oz Forensics website link.
To support memory page size of 16 KB, switched TensorFlow to LiteRT.
We highly recommend updating to this version.
Fixed the bug that caused video duration and file size to increase.
Added support for Google Dynamic Feature Delivery.
Resolved an issue that might have caused crashes when tapping the close button on the Liveness screen.
Fixed a bug where the SDK would crash with "CameraDevice was already closed" exception.
Security and telemetry updates.
Resolved the issue with OkHttp compatibility.
Fixed the bug with Fragment missing context.
Resolved a camera access issue affecting some mobile device models.
Security updates.
Security updates.
Security updates.
Resolved the issue with possible SDK crashes when closing the Liveness screen.
Security updates.
Updated the authorization logic.
Improved voiceover.
Fixed the issue with SDK lags and the non-responding error that users might have encountered on some devices after completing the video recording.
Resolved the issue with SDK crashes on some devices that might have occurred because of trying to access non-initialized or closed resources.
Security updates.
You can now disable video validation that has been implemented to avoid recording of extremely short videos (3 frames and less): switch the option off using .
Fixed the bug with green videos on some smartphone models.
Security updates.
Fixed bugs that could have caused crashes on some phone models.
Changed the wording for the head_down gesture: the new wording is “tilt down”.
Added proper focus order for TalkBack when the antiscam hint is enabled.
Added the public setting extract_action_shot in the Demo Application.
Fixed bugs.
Fixed the bug when the recorded videos might appear green.
Resolved codec issues on some smartphone models.
Accessibility updates according to WCAG requirements: the SDK hints and UI controls can be voiced.
Improved user experience with head movement gestures.
Moved the large video compression step to the Liveness screen closure.
Fixed the bug when the best shot frame could contain an image with closed eyes.
Security and telemetry updates.
Security updates.
Security updates.
Security and telemetry updates.
Fixed the RuntimeException error with the server-based Liveness that appeared on some devices.
Security updates.
Security updates.
Bug fixes.
Updated the Android Gradle plugin version to 8.0.0.
Internal SDK improvements.
Internal SDK improvements.
Security updates.
Security updates.
Security updates.
Security updates.
Added a description for the error that occurs when providing an empty string as an ID in the setFolderID method.
Fixed a bug causing an endless spinner to appear if the user switches to another application during the Liveness check.
Fixed some smartphone model-specific bugs.
Upgraded the on-device Liveness model.
Security updates.
The length of the Selfie gesture is now (affects the video file size).
You can use your own logo instead of the Oz logo if your license allows it.
Removed the pause after the Scan gesture.
If the recorded video is larger than 10 MB, it gets compressed.
Changed the master license validation algorithm.
Downgraded the required compileSdkVersion from 34 to 33.
Security updates.
Updated the on-device Liveness model.
Fixed some bugs.
Internal licensing improvements.
Internal SDK improvements.
Bug fixes.
Implemented the possibility of using a master license that works with any bundle_id.
Video compression failure on some phone models is now fixed.
Bug fixes.
The Analysis structure now contains the sizeReductionStrategy field. This field defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully.
The messages for the errors that are retrieved from API are now detailed.
If multiple analyses are applied to the folder simultaneously, the system sends them as a group. It means that the “worst” of the results will be taken as resolution, not the latest. Please refer to for details.
For the Liveness analysis, the system now treats the highest score as a quantitative result. The Liveness analysis output is described .
Updated the Liveness on-device model.
Added the Portuguese (Brazilian) locale.
You can now add a custom or update an existing language pack. The instructions can be found .
If a media hasn't been uploaded correctly, the system repeats the upload.
Fixed errors.
The SDK now works properly with baseURL set to null.
The dependencies' versions have been brought into line with Kotlin version.
Added the new analysis mode – hybrid (Liveness only). If the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.
Kotlin version requirements lowered to 1.7.21.
Improved the on-device models.
For some phone models, fixed the fatal device error.
Restructured the settings screen.
Added the center hint background customization.
Added new face frame forms (Circle, Square).
Added the antiscam widget and its . This feature allows you to alert your customers that the video recording is being conducted, for instance, for loan application purposes. The purpose of this is to safeguard against scammers who may attempt to deceive an individual into approving a fraudulent transaction.
Improved the SDK algorithms.
Updated the model for the on-device analyses.
Fixed the animation for sunglasses/mask.
The oval size for Liveness is now smaller.
Fixed the error with the server-based analyses while using permanentAccessToken for authorization.
Added customization for the .
You can now hide the status bar and system buttons (works with 7.0.0 and higher).
OzLivenessSDK.init now requires context as the first parameter.
Fixed crashes for Android v.6 and below.
Fixed oval positioning for some phone models.
Internal fixes and improvements.
Updated security.
Implemented some internal improvements.
The addMedia method is now deprecated, please use uploadMedia for uploading.
Changed the way of sharing dependencies. Due to security issues, we now share two types of libraries as shown below: sdk provides server-based analysis only, full provides both server-based and on-device analyses:
UICustomization has been implemented instead of OzCustomization.
Implemented a range of options and switched to the new design. To restore the previous settings, please refer to .
Added the Spanish locale.
Fixed the bug with freezes that had appeared on some phone models.
SDK now captures videos in 720p.
Synchronized the names of the analysis modes with iOS: SERVER_BASED and ON_DEVICE.
Fixed the bug with displaying of localization settings.
Now you can use Fragment as Liveness screen.
Added a new field to the Analysis structure. The params field is for any additional parameters, for instance, if you need to set extracting the best shot on server to true. The best shot algorithm chooses the most high-quality frame from a video.
The Zoom in and Zoom out gestures are no longer supported.
Updated the biometry model.
Added a new simplified API – AnalysisRequest. With it, it’s easier to create a request for the media and analysis you need.
Published the on-device module for on-device liveness and biometry analyses. To add this module to your project, use:
To launch these analyses, use runOnDeviceBiometryAnalysis and runOnDeviceLivenessAnalysis methods from the OzLivenessSDK class:
Liveness now goes smoother.
Fixed freezes on Xiaomi devices.
Optimized image converting.
New metadata parameter for OzLivenessSDK.uploadMedia and new OzLivenessSDK.uploadMediaAndAnalyze method to pass this parameter to folders.
Added functions for SDK initialization with LicenseSources: LicenseSource.LicenseAssetId and LicenseSource.LicenseFilePath. Use the OzLivenessSDK.init method to start initialization.
Now you can get the license info upon initialization: val licensePayload = OzLivenessSDK.getLicensePayload().
Added the Kyrgyz locale.
Added local analysis functions.
You can now configure the face frame.
Fixed version number at the Liveness screen.
Added the main camera support.
Added configuration from license support.
Added the OneShot gesture.
Added new states for OzAnalysisResult.Resolution.
Added the uploadMediaAndAnalyze method to load a bunch of media to the server at once and send them to analysis immediately.
Access token updates automatically.
Renamed accessToken to permanentAccessToken.
Added R8 rules.
Fixed the oval frame.
Removed the unusable parameters from AnalyseRequest.
Removed default attempt limits.
To customize the configuration options, the config property is added instead of baseURL, accessToken, etc. Use OzConfig.Builder for initialization.
Added license support. Licences should be installed as raw resources. To pass them to OzConfig, use setLicenseResourceId.
Replaced the context-dependent methods with analogs.
Enhanced security.
Security updates.
Security updates.
Minor bug fixes and telemetry updates.
Security and logging updates.
Created a new method to retrieve the telemetry (logging) identifier: getEventSessionId.
The login and auth methods are now deprecated. Use the setAPIConnection method instead.
OzConfig.baseURL and OzConfig.permanentAccessToken are now deprecated.
If a user closes the screen during video capture, the appropriate error is now being handled by SDK.
Fixed some bugs and improved the SDK work.
The hint text width can now exceed the frame width (when using the main camera).
Photos taken during the One Shot analysis are now being sent to the server in the original size.
Removed the OzAnalysisResult class. The onSuccess method of AnalysisRequest.run now uses the RequestResult structure instead of List<OzAnalysisResult>.
All exceptions are moved to the com.ozforensics.liveness.sdk.core.exceptions package (see changes below).
Classes related to AnalysisRequest are moved to the com.ozforensics.liveness.sdk.analysis package (see changes below).
The methods below are no longer supported:
AnalysisRequest.Builder.uploadMedia
AnalysisRequest.Type.HYBRID in com.ozforensics.liveness.sdk.analysis.entity
AnalysisError in com.ozforensics.liveness.sdk.analysis.entity
SourceMedia in com.ozforensics.liveness.sdk.analysis.entity
ResultMedia in com.ozforensics.liveness.sdk.analysis.entity
RequestResult in com.ozforensics.liveness.sdk.analysis.entity
NoAnalysisException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions
NoNetworkException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions
TokenException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions
NoMediaInAnalysisException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions
EmptyMediaListException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions
NoSuchMediaException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions
LicenseException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.security.exception
Analysis from com.ozforensics.liveness.sdk.analysis.entity to com.ozforensics.liveness.sdk.core.model
AnalysisRequest from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core
AnalysisListener from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core
AnalysisStatus from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core
AnalysisRequest.Builder from com.ozforensics.liveness.sdk.analysis to com.ozforensics.liveness.sdk.core
OzException from com.ozforensics.liveness.sdk.exceptions to com.ozforensics.liveness.sdk.core.exceptions
OzLivenessSDK
Removed uploadMediaAndAnalyze
Removed uploadMedia
Removed runOnDeviceBiometryAnalysis
Removed runOnDeviceLivenessAnalysis
AnalysisRequest
Removed build(): AnalysisRequest
AnalysisRequest.Builder
Removed addMedia
Removed onSuccess(result: List<OzAnalysisResult>)
Added onSuccess(result: RequestResult)
The OzLivenessSDK::init method no longer crashes if there is a StatusListener parameter passed.
Changed the scan gesture animation.
OzAnalysisResult now shows the server-based analyses' scores properly.
Fixed initialization issues, displaying of wrong customization settings, authorization failures on Android <7.1.1.
OzMedia is renamed to OzAbstractMedia and got subclasses for images and videos.
Fixed camera bugs for some devices.
Improved the image analysis.
Removed unusable dependencies.
Fixed logging.
Removed method → Replacement
OzLivenessSDK.uploadMediaAndAnalyze → AnalysisRequest.run
OzLivenessSDK.uploadMedia → AnalysisRequest.Builder.uploadMedia
OzLivenessSDK.runOnDeviceBiometryAnalysis → AnalysisRequest.run
OzLivenessSDK.runOnDeviceLivenessAnalysis → AnalysisRequest.run
AnalysisRequest.build(): AnalysisRequest → -
AnalysisRequest.Builder.addMedia
implementation 'com.ozforensics.liveness:full:7.0.0'
implementation 'com.ozforensics.liveness:sdk:7.0.0'
implementation 'com.ozforensics.liveness:on-device:6.3.4'

val mediaList: List<OzAbstractMedia> = ...
val biometryAnalysisResult: OzAnalysisResult = OzLivenessSDK.runOnDeviceBiometryAnalysis(mediaList)
val livenessAnalysisResult: OzAnalysisResult = OzLivenessSDK.runOnDeviceLivenessAnalysis(mediaList)

To set your own look-and-feel options, use the style section in the OzLiveness.open method. The options are listed below the example.
Main color settings.
Parameter
Main font settings.
Title font settings.
Buttons’ settings.
Toolbar settings.
Center hint settings.
Hint animation settings.
Face frame settings.
Document capture frame settings.
Background settings.
Scam protection settings: the antiscam message warns user about their actions being recorded.
SDK version text settings.
3D mask settings. The mask has been implemented in 1.2.1.
Settings for a custom loader (added in 1.7.15).
Loader transition settings (added in 1.8.0).
Table of parameters' correspondence:
OzLiveness.open({
style: {
baseColorCustomization: {
textColorPrimary: "#000000",
backgroundColorPrimary: "#FFFFFF",
textColorSecondary: "#8E8E93",
backgroundColorSecondary: "#F2F2F7",
iconColor: "#00A5BA"
},
baseFontCustomization: {
textFont: "Roboto, sans-serif",
textSize: "16px",
textWeight: "400",
textStyle: "normal"
},
titleFontCustomization: {
textFont: "inherit",
textSize: "36px",
textWeight: "500",
textStyle: "normal"
},
buttonCustomization: {
textFont: "inherit",
textSize: "14px",
textWeight: "500",
textStyle: "normal",
textColorPrimary: "#FFFFFF",
backgroundColorPrimary: "#00A5BA",
textColorSecondary: "#00A5BA",
backgroundColorSecondary: "#DBF2F5",
cornerRadius: "10px"
},
toolbarCustomization: {
closeButtonIcon: "cross",
iconColor: "#707070"
},
centerHintCustomization: {
textFont: "inherit",
textSize: "24px",
textWeight: "500",
textStyle: "normal",
textColor: "#FFFFFF",
backgroundColor: "#1C1C1E",
backgroundOpacity: "56%",
backgroundCornerRadius: "14px",
verticalPosition: "38%"
},
hintAnimation: {
hideAnimation: false,
hintGradientColor: "#00BCD5",
hintGradientOpacity: "100%",
animationIconSize: "80px"
},
faceFrameCustomization: {
geometryType: "oval",
cornersRadius: "0px",
strokeDefaultColor: "#D51900",
strokeFaceInFrameColor: "#00BCD5",
strokeOpacity: "100%",
strokeWidth: "6px",
strokePadding: "4px"
},
documentFrameCustomization: {
cornersRadius: "20px",
templateColor: "#FFFFFF",
templateOpacity: "100%"
},
backgroundCustomization: {
backgroundColor: "#FFFFFF",
backgroundOpacity: "88%"
},
antiscamCustomization: {
enableAntiscam: false,
textMessage: "",
textFont: "inherit",
textSize: "14px",
textWeight: "500",
textStyle: "normal",
textColor: "#000000",
textOpacity: "100%",
backgroundColor: "#F2F2F7",
backgroundOpacity: "100%",
backgroundCornerRadius: "20px",
flashColor: "#FF453A"
},
versionTextCustomization: {
textFont: "inherit",
textSize: "16px",
textWeight: "500",
textStyle: "normal",
textColor: "#000000",
textOpacity: "56%"
},
maskCustomization: {
maskColor: "#008700",
glowColor: "#000102",
minAlpha: "30%", // 0 to 1 or 0% to 100%
maxAlpha: "100%" // 0 to 1 or 0% to 100%
},
/* for an HTML string, use string; for HTMLElement, insert it via cloneNode(true) */
loaderSlot: yourLoader, /* <string | HTMLElement> */
loaderTransition: {type: 'fade', duration: 500}
}
});

Main background color
textColorSecondary
Secondary text color
backgroundColorSecondary
Secondary background color
cornerRadius
Button corner radius
Background color
backgroundOpacity
Background opacity
backgroundCornerRadius
Frame corner radius
verticalPosition
Vertical position
Stroke width
strokePadding
Padding from stroke
Text color
textOpacity
Text opacity
backgroundColor
Background color
backgroundOpacity
Background opacity
backgroundCornerRadius
Frame corner radius
flashColor
Flashing indicator color
Text opacity
{phase: 'start' | 'progress' | 'end', percent?}
before / during / after data transmission
loader:destroy
when you need to hide the slot
centerHintCustomization.verticalPosition
centerHint.letterSpacing
-
centerHint.fontStyle
centerHintCustomization.textStyle
closeButton.image
-
backgroundOutsideFrame.color
backgroundCustomization.backgroundColor
Description
textColorPrimary
Main text color
backgroundColorPrimary
Main background color
textColorSecondary
Secondary text color
backgroundColorSecondary
Secondary background color
iconColor
Icons’ color
Parameter
Description
textFont
Font
textSize
Font size
textWeight
Font weight
textStyle
Font style
Parameter
Description
textFont
Font
textSize
Font size
textWeight
Font weight
textStyle
Font style
Parameter
Description
textFont
Font
textSize
Font size
textWeight
Font weight
textStyle
Font style
textColorPrimary
Main text color
Parameter
Description
closeButtonIcon
Close button icon
iconColor
Close button icon color
Parameter
Description
textFont
Font
textSize
Font size
textWeight
Font weight
textStyle
Font style
textColor
Text color
Parameter
Description
hideAnimation
Disable animation
hintGradientColor
Gradient color
hintGradientOpacity
Gradient opacity
animationIconSize
Animation icon size
Parameter
Description
geometryType
Frame shape: rectangle or oval
cornersRadius
Frame corner radius (for rectangle)
strokeDefaultColor
Frame color when a face is not aligned properly
strokeFaceInFrameColor
Frame color when a face is aligned properly
strokeOpacity
Stroke opacity
Parameter
Description
cornersRadius
Frame corner radius
templateColor
Document template color
templateOpacity
Document template opacity
Parameter
Description
backgroundColor
Background color
backgroundOpacity
Background opacity
Parameter
Description
textMessage
Antiscam message text
textFont
Font
textSize
Font size
textWeight
Font weight
textStyle
Font style
Parameter
Description
textFont
Font
textSize
Font size
textWeight
Font weight
textStyle
Font style
textColor
Text color
Parameter
Description
maskColor
The color of the mask itself
glowColor
The color of the glowing mask shape
minAlpha
Minimum mask transparency level. Implemented in 1.3.1
maxAlpha
Maximum mask transparency level. Implemented in 1.3.1
Event
Payload
Is called
loader:init
{os, browser, platform}
immediately after inserting the slot
loader:waitingCamera
{os, browser, platform, waitedMs}
every waitedMs ms, while waiting for camera access
loader:cameraReady
when access is granted and loader should be hidden
loader:processing
{phase: 'start' | 'end'}
before / after data preparation
Parameter
Description
type
Animation type: none, fade, slide, scale
duration
Animation length in ms
easing (optional)
Easing function: linear, ease-in-out, etc.
Previous design
New design
doc_color
-
face_color_success
faceFrame.faceReady
faceFrameCustomization.strokeFaceInFrameColor
face_color_fail
faceFrame.faceNotReady
faceFrameCustomization.strokeDefaultColor
centerHint.textSize
centerHintCustomization.textSize
centerHint.color
centerHintCustomization.textColor
backgroundColorPrimary
backgroundColor
strokeWidth
textColor
textOpacity
loader:uploading
centerHint.yPosition
Deletes all action videos from file system (iOS 8.4.0 and higher, Android).
Returns
Future<Void>.
Returns the SDK version.
Returns
Future<String>.
Initializes SDK with license sources.
Returns
Authentication via credentials.
Returns
Authentication via access token.
Returns
Connection to the telemetry server via credentials.
Returns
Connection to the telemetry server via access token.
Returns
Checks whether an access token exists.
Returns
Deletes the saved access token.
Returns
Nothing (void).
Returns the list of SDK supported languages.
Returns
List<>.
Starts the Liveness video capturing process.
Sets the length of the Selfie gesture (in milliseconds).
Returns
Error if any.
Launches the analyses.
Returns
List<>.
Sets the SDK localization.
The number of attempts before SDK returns error.
Sets the UI customization values for OzLivenessSDK. The values are described in the Customization structures section. Structures can be found in the lib\customization.dart file.
Sets the timeout for the face alignment for actions.
Add fonts and drawable resources to the application/iOS project.
Fonts and images should be placed into related folders:
ozforensics_flutter_plugin\android\src\main\res\drawable
ozforensics_flutter_plugin\android\src\main\res\font
These are defined in the customization.dart file.
Contains the information about customization parameters.
Toolbar customization parameters.
Center hint customization parameters.
Hint animation customization parameters.
Frame around face customization parameters.
SDK version customization parameters.
Background customization parameters.
Defined in the models.dart file.
Stores the language information.
The type of media captured.
The type of media captured.
Contains an action from the captured video.
Stores information about media.
Stores information about the analysis result.
Stores data about a single analysis.
Analysis type.
Analysis mode.
Contains the action from the captured video.
The general status for all analyses applied to the folder created.
Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully. By default, the system uploads the compressed video.
Contains information about the .
This is a Map to define the platform-specific resources on the plugin level.
This key is a Map for the close button icon.
This key is a Map containing the data on the uploaded fonts.
This key is a Map containing the data on the uploaded font styles.
This key is a Map containing the data on frame shape.
Whitelisted certificates
Whitelisted certificates
Additional parameters
Text font
titleSize
int
Font size
titleFontStyle
String
Font style
titleColor
String
Color #XXXXXX
titleAlpha
int
Header text opacity
isTitleCentered
bool
Sets the text centered
backgroundColor
String
Header background color #XXXXXX
backgroundAlpha
int
Header background opacity
Font size
verticalPosition
int
Y position
textAlpha
int
Text opacity
centerBackground
bool
Sets the text centered
Gradient color
Color #XXXXXX
strokeFaceAlignedColor
String
Color #XXXXXX
strokeAlpha
int
Stroke opacity
strokePadding
int
Stroke padding
Font size
textAlpha
int
Text opacity
Spanish
pt_br
Portuguese (Brazilian)
A video with the smile gesture
videoSelfieHigh
A video with the lifting head up gesture
videoSelfieDown
A video with the tilting head downwards gesture
videoSelfieRight
A video with the turning head right gesture
videoSelfieLeft
A video with the turning head left gesture
photoIdPortrait
A photo from a document
photoIdBack
A photo of the back side of the document
photoIdFront
A photo of the front side of the document
A type of media
iOS
videoPath
String
A path to a video
bestShotPath
String
A path to the best shot (in PNG) for a video, or the image path for Liveness
preferredMediaPath
String
URL of the API media container
photoPath
String
A path to a photo
archivePath
String
A path to an archive
tag
A tag for media
Android
The error code
Android only
errorMessage
String
The error message
mode
The mode of the analysis
confidenceScore
Double
The resulting score
resolution
The completed analysis' result
status
Boolean
The analysis state:
true – success;
false – failed
Additional analysis parameters
sizeReductionStrategy
Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully
Head tilted downwards
headUp
Head lifted up
eyeBlink
Blink
smile
Smile
Parameter
Type
Description
licenses
List<String>
A list of licences
Case
Text
True
Initialization has completed successfully
False
Initialization error
Parameter
Type
Description
String
User email
password
String
User password
host
String
Server URL
sslPins (optional)
Case
Text
Success
Nothing (void)
Failed
PlatformException:
code = AUTHENTICATION_FAILED
message = exception details
Parameter
Type
Description
token
String
User email
host
String
Server URL
sslPins (optional)
Whitelisted certificates
Case
Text
Success
Nothing (void)
Failed
PlatformException:
code = AUTHENTICATION_FAILED
message = exception details
Parameter
Type
Description
String
User email
password
String
User password
host
String
Server URL
sslPins (optional)
Case
Text
Success
Nothing (void)
Failed
PlatformException:
code = AUTHENTICATION_FAILED
message = exception details
Parameter
Type
Description
token
String
User email
host
String
Server URL
sslPins (optional)
Whitelisted certificates
Case
Text
Success
Nothing (void)
Failed
PlatformException:
code = AUTHENTICATION_FAILED
message = exception details
Case
Returns
Token exists
True
Token does not exist
False
Parameter
Type
Description
actions
List<VerificationAction>
Actions to execute
mainCamera
Boolean
Use main (True) or front (False) camera
Parameter
Type
Description
selfieLength
Int
The length of the Selfie gesture (in milliseconds). Should be within 500-5000 ms, the default length is 700
Parameter
Type
Description
analysis
List<Analysis>
The list of Analysis structures
folder ID (optional)
String
Folder ID, if you want to perform an analysis for a particular folder
uploadMedia
List<Media>
The list of the captured videos
params
Parameter
Type
Description
locale
The SDK language
Parameter
Type
Description
singleCount
int
Attempts on a single action/gesture
commonCount
int
Total number of attempts on all actions/gestures if you use a sequence of them
Parameter
Type
Description
timeout
int
Timeout in milliseconds
Parameter
Type
Description
closeButtonIcon
String
Close button icon received from plugin
closeButtonColor
String
Color #XXXXXX
titleText
String
Header text
titleFont
Parameter
Type
Description
textFont
String
Text font
textFontStyle
String
Font style
textColor
String
Color #XXXXXX
textSize
Parameter
Type
Description
hideAnimation
bool
Hides the hint animation
animationIconSize
int
Animation icon size in px (40-160)
hintGradientColor
String
Color #XXXXXX
hintGradientOpacity
Parameter
Type
Description
geometryType
String
Frame shape received from plugin
geometryTypeRadius
int
Corner radius for rectangle
strokeWidth
int
Frame stroke width
strokeFaceNotAlignedColor
Parameter
Type
Description
textFont
String
Text font
textFontStyle
String
Font style
textColor
String
Color #XXXXXX
textSize
Parameter
Type
Description
backgroundColor
String
Color #XXXXXX
backgroundAlpha
int
Background opacity
Case
Description
en
English
hy
Armenian
kk
Kazakh
ky
Kyrgyz
tr
Turkish
Case
Description
movement
A media with an action
documentBack
The back side of the document
documentFront
The front side of the document
Case
Description
documentPhoto
A photo of a document
video
A video
shotSet
A frame archive
Case
Description
blank
A video with no gesture
photoSelfie
A selfie photo
videoSelfieOneShot
A video with the best shot taken
videoSelfieScan
A video with the scanning gesture
videoSelfieEyes
A video with the blink gesture
Parameter
Type
Description
Platform
fileType
The type of the file
Android
movement
An action on a media
iOS
mediatype
Parameter
Type
Description
Platform
folderId
String
The folder identifier
type
The analysis type
errorCode
Parameter
Type
Description
type
The type of the analysis
mode
The mode of the analysis
mediaList
List<Media>
Media to analyze
params
Case
Description
biometry
The algorithm that compares several media and checks whether the people on them are the same person.
quality
The algorithm that aims to check whether a person in a video is a real human acting in good faith, not a fake of any kind.
Case
Description
onDevice
The on-device analysis with no server needed
serverBased
The server-based analysis
hybrid
The hybrid analysis for Liveness: if the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.
Case
Description
oneShot
The best shot from the video taken
blank
A selfie with face alignment check
scan
Scan
headRight
Head turned right
headLeft
Head turned left
Case
Description
failed
One or more analyses failed due to some error and couldn't get finished
declined
The check failed (e.g., faces don't match or some spoofing attack detected)
success
Everything went fine, the check succeeded (e.g., faces match or liveness confirmed)
operatorRequired
The result should be additionally checked by a human operator
uploadOriginal
The original video
uploadCompressed
The compressed video
uploadBestShot
The best shot taken from the video
uploadNothing
Nothing is sent (note that no folder will be created)
Parameter
Type
Description
hash
String
SHA256 key hash in base64
expired_at
UNIX timestamp, UTC time
The date of certificate expiration, ms
Key
Value
Close
Android drawable resource / iOS Pods resource
Arrow
Android drawable resource / iOS Pods resource
Key
Value
Flutter application font name
Android font resource / iOS Pods resource, used to retrieve the font on the plugin level
Key
Value
Flutter application font style name
Name of the style retrieved for the font creation on the plugin level
Key
Value
Oval
Oval shape
Rectangle
Rectangular shape
Map<String, Any>
String
int
int
String
int
es
videoSelfieSmile
String
int
Map<String, String>
headDown
A singleton for Oz SDK.
Initializes OZSDK with the license data. The closure is either license data or .
Returns
-
Forces the license installation.
Retrieves an access token for a user.
Returns
The access token or an error.
Retrieves an access token for a user to send telemetry.
Returns
The access token or an error.
Checks whether an access token exists.
Parameters
-
Returns
The result – the true or false value.
Deletes the saved access token
Parameters
-
Returns
-
Creates the Liveness check controller.
Returns
UIViewController or an exception.
Creates the Liveness check controller.
Returns
UIViewController or an exception.
Deletes all videos.
Parameters
-
Returns
-
Retrieves the telemetry session ID.
Parameters
-
Returns
The telemetry session ID (String parameter).
Sets the bundle to look for translations in.
Returns
-
Sets the length of the Selfie gesture (in milliseconds).
Generates the payload with media signatures.
Returns
Payload to be sent along with media files that were used for generation.
SDK locale (if not set, works automatically).
The host to call for Liveness video analysis.
The holder for attempts counts before SDK returns error.
The SDK version.
A delegate for OZSDK.
Gets the Liveness check results.
Returns
-
The error processing method.
Returns
-
A protocol for performing checks.
Creates the AnalysisRequest instance.
Returns
The AnalysisRequest instance.
Adds an analysis to the AnalysisRequest instance.
Returns
-
Uploads media on server.
Returns
-
Adds the folder ID to upload media to a certain folder.
Returns
-
Adds metadata to a folder.
Returns
-
Runs the analyses.
Returns
The analysis result or an error.
Customization for OzLivenessSDK (use OZSDK.customization).
A set of customization parameters for the toolbar.
A set of customization parameters for the center hint that guides a user through the process of taking an image of themselves.
A set of customization parameters for the hint animation.
A set of customization parameters for the frame around the user face.
A set of customization parameters for the background outside the frame.
A set of customization parameters for the SDK version text.
A set of customization parameters for the antiscam message that warns the user that their actions are being recorded.
Logo customization parameters. Custom logo should be allowed by license.
A source of a license.
The license data.
Contains action from the captured video.
Contains the locale code according to .
Contains all the information on the media captured.
The type of media captured.
Error description. These errors are deprecated and will be deleted in the upcoming releases.
Contains information on what media to analyze and what analyses to apply.
The type of the analysis.
The mode of the analysis.
Shows the media processing status.
Shows the files' uploading status.
Shows the analysis processing status.
Describes the analysis result for the single media.
Contains the consolidated analysis results for all media.
Contains the results of the checks performed.
The general status for all analyses applied to the folder created.
Contains the results for single analyses.
Frame shape settings.
Possible license errors.
The authorization type.
Defines the settings for the repeated media upload.
Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully. By default, the system uploads the compressed video.
Contains information about the .
Sets the number of attempts and timeout between them
Toolbar title text color
backgroundColor
UIColor
Toolbar background color
titleText
String
Text on the toolbar
Center hint vertical position from the screen top (in %, 0-100)
hideTextBackground
Bool
Hides text background
backgroundCornerRadius
Int
Center hint background frame corner radius
Frame color when a face is aligned properly
strokeWidth
CGFloat
Frame stroke width (in dp, 0-20)
strokePadding
CGFloat
A padding from the stroke to the face alignment area (in dp, 0-10)
Antiscam message text color
customizationAntiscamBackgroundColor
UIColor
Antiscam message text background color
customizationAntiscamCornerRadius
CGFloat
Background frame corner radius
customizationAntiscamFlashColor
UIColor
Color of the flashing indicator close to the antiscam message
Additional configuration
Head turned left
right
Head turned right
down
Head tilted downwards
up
Head lifted up
Spanish
pt-BR
Portuguese (Brazilian)
custom(String)
Custom language (language ISO 639-1 code, two letters)
URL of the Liveness video
bestShotURL
URL
URL of the best shot in PNG
preferredMediaURL
URL
URL of the API media container
timestamp
Date
Timestamp for the check completion
The Liveness check can't be performed: attempts limit exceeded
failedBecausePreparingTimout
The Liveness check can't be performed: face alignment timeout
failedBecauseOfLowMemory
The Liveness check can't be performed: no memory left
Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully
params (optional)
String
Additional parameters
Object uploading status
Resulting score
mediaType
String
Media file type: VIDEO / IMAGE / SHOT_SET
media
Media that is being analyzed
error
AnalysisError (inherits from Error)
Error
Analysis identifier
error
AnalysisError (inherits from Error)
Error
resultMedia
[]
Results of the analysis for single media files
confidenceScore
Float
The resulting score
serverRawResponse
String
Server response
Everything went fine, the check succeeded (e.g., faces match or liveness confirmed)
OPERATOR_REQUIRED
The result should be additionally checked by a human operator
The result of the check performed
Parameter
Type
Description
licenseSources
The source of the license
Parameter
Type
Description
licenseSource
Source of the license
Parameter
Type
Description
apiConnection
Authorization parameters
Parameter
Type
Description
eventsConnection
Telemetry authorization parameters
Parameter
Type
Description
delegate
The delegate for Oz Liveness
actions
Captured action
cameraPosition (optional)
AVCaptureDevice.Position
front – front camera (default),
back – rear camera
Parameter
Type
Description
actions
Captured action
FaceCaptureCompletion
type alias used as follows:
public typealias FaceCaptureCompletion = (_ results: [OZMedia]?, _ error: OZVerificationStatus?) -> Void
cameraPosition (optional)
AVCaptureDevice.Position
front – front camera (default),
back – rear camera
Parameter
Type
Description
languageBundle
Bundle
The bundle that contains translations
Parameter
Type
Description
selfieLength
Int
The length of the Selfie gesture (in milliseconds). Should be within 500-5000 ms, the default length is 700
Parameter
Type
Description
media
An array of media files
folderMeta
[String]
Additional folder metadata
Parameter
Type
Description
localizationCode
The localization code
Parameter
Type
Description
host
String
Host address
Parameter
Type
Description
singleCount
Int
Attempts on a single action/gesture
commonCount
Int
Total number of attempts on all actions/gestures if you use a sequence of them
faceAlignmentTimeout
Float
Time needed to align face into frame
uploadMediaSettings
Parameter
Type
Description
version
String
Version number
Parameter
Type
Description
results
[OzMedia]
An array of the OzMedia objects.
Parameter
Type
Description
status
The error description.
Parameter
Type
Description
folderId (optional)
String
The identifier to define when you need to upload media to a certain folder.
Parameter
Type
Description
analysis
A structure containing information on the analyses required.
Parameter
Type
Description
media
Media or an array of media objects to be uploaded.
Parameter
Type
Description
folderId
String
The folder identifier.
Parameter
Type
Description
meta
[String]
An array of metadata as follows:
["meta1": "data1"]
Parameter
Type
Description
statusHandler
A callback function as follows:
statusHandler: @escaping ((_ status: RequestStatus) -> Void)
The handler that is executed when the scenario state changes
errorHandler
A callback function as follows:
errorHandler: @escaping ((_ error: Error) -> Void)
Error handler
completionHandler
A callback function as follows:
completionHandler: @escaping (_ results : RequestResult) -> Void)
The handler that is executed when the run method completes.
Parameter
Type
Description
closeButtonIcon
UIImage
An image for the close button
closeButtonColor
UIColor
Close button tintColor
titleFont
UIFont
Toolbar title text font
titleColor
Parameter
Type
Description
textFont
UIFont
Center hint text font
textColor
UIColor
Center hint text color
backgroundColor
UIColor
Center hint text background
verticalPosition
Parameter
Type
Description
hideAnimation
Bool
A switcher for hint animation, if True, the animation is hidden
animationIconSize
CGfloat
A side size of the animation icon square
hintGradientColor
UIColor
The close-to-frame gradient color
Parameter
Type
Description
geometryType
The frame type: oval, rectangle, circle, or square
cornerRadius
CGFloat
Rectangle corner radius (in dp)
strokeFaceNotAlignedColor
UIColor
Frame color when a face is not aligned properly
strokeFaceAlignedColor
Parameter
Type
Description
backgroundColor
UIColor
Background color
Parameter
Type
Description
textFont
UIFont
SDK version text font
textColor
UIColor
SDK version text color
Parameter
Type
Description
customizationEnableAntiscam
Bool
Adds the antiscam message
customizationAntiscamTextMessage
String
Antiscam message text
customizationAntiscamTextFont
UIFont
Antiscam message text font
customizationAntiscamTextColor
Parameter
Type
Description
image
UIImage
Logo image
size
CGSize
Logo size (in dp)
Case
Description
licenseFilePath
An absolute path to a license (String)
licenseFileName
The name of the license file
Parameter
Type
Description
appIDS
[String]
An array of bundle IDs
expires
TimeInterval
The expiration interval
features
Features
License features
configs (optional)
Case
Description
smile
Smile
eyes
Blink
scanning
Scan
selfie
A selfie with face alignment check
one_shot
The best shot from the video taken
Case
Description
en
English
hy
Armenian
kk
Kazakh
ky
Kyrgyz
tr
Turkish
Parameter
Type
Description
movement
User action type
mediaType
Type of media
metaData
[String] as follows:
["meta1": "data1"]
Metadata if any
videoURL
Case
Description
movement
A media with an action
documentBack
The back side of the document
documentFront
The front side of the document
Case
Description
userNotProcessed
The Liveness check was not processed
failedBecauseUserCancelled
The check was interrupted by user
failedBecauseCameraPermissionDenied
The Liveness check can't be performed: no camera access
failedBecauseOfBackgroundMode
The Liveness check can't be performed: background mode
failedBecauseOfTimeout
The Liveness check can't be performed: timeout
Parameter
Type
Description
media
[OzMedia]
An array of the OzMedia objects
type
The type of the analysis
mode
The mode of the analysis
sizeReductionStrategy
Case
Description
biometry
The algorithm that compares several media and checks whether the people on them are the same person.
quality
The algorithm that aims to check whether a person in a video is a real human acting in good faith, not a fake of any kind.
document (deprecated)
The analysis that aims to recognize the document and check if its fields are correct according to its type.
blacklist
The analysis that compares a face on a captured media with faces from the pre-made media database.
Case
Description
onDevice
The on-device analysis with no server needed. We recommend using server-based analyses whenever possible, as on-device ones tend to produce less accurate results
serverBased
The server-based analysis
hybrid
The hybrid analysis for Liveness: if the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.
Case
Description
addToFolder
The system is creating a folder and adding files to this folder
addAnalyses
The system is adding analyses
waitAnalysisResult
The system is waiting for the result
Parameter
Type
Description
media
The object that is being uploaded at the moment
index
Int
Number of this object in a list
from
Int
Objects quantity
progress
Parameter
Type
Description
status
Processing analysis status
progressStatus
Media uploading status
Parameter
Type
Description
resolution
Consolidated analysis result
sourceId
String
Media identifier
isOnDevice
Bool
Analysis mode
confidenceScore
Parameter
Type
Description
resolution
Consolidated analysis result
folderId
String
Folder identifier
analysisResults
A list of analysis results
Parameter
Type
Description
resolution
Analysis resolution
type
Analysis type
mode
Analysis mode
analysisId
Case
Description
INITIAL
No analyses have been applied yet
PROCESSING
The analyses are in progress
FAILED
One or more analyses failed due to some error and couldn't get finished
FINISHED
The analyses are finished
DECLINED
The check failed (e.g., faces don't match or some spoofing attack detected)
Parameter
Type
Description
analyseResolutionStatus
The analysis status
type
The analysis type
folderID
String
The folder identifier
score
Case
Description
oval
Oval frame
rectangle(cornerRadius: CGFloat)
Rectangular frame (with corner radius)
circle
Circular frame
square(cornerRadius: CGFloat)
Square frame (with corner radius)
Case
Description
licenseFileNotFound
The license is not found
licenseParseError
Cannot parse the license file, the license might be invalid
licenseBundleError
The bundle_id in the license file doesn't match the bundle_id used.
licenseExpired
The license is expired
Case
Description
fromServiceToken
Authorization with a token:
host: String
token: String
pins (optional): a list of sslPin
fromCredentials
Authorization with credentials:
host: String
login: String
password: String
pins (optional): a list of sslPin
attemptsCount
Int
Number of attempts for media upload
attemptsTimeout
Int
Timeout between attempts
uploadOriginal
The original video
uploadCompressed
The compressed video
uploadBestShot
The best shot taken from the video
uploadNothing
Nothing is sent (note that no folder will be created)
Parameter
Type
Description
publicKeyHash
String
SHA256 key hash in base64
expiration date
UNIX timestamp, UTC time
The date of certificate expiration
UIColor
Int
UIColor
UIColor
ABTestingConfigs
left
es
URL
failedBecauseOfAttemptLimit
Progress
Float
String
SUCCESS
Float
A singleton for Oz SDK.
Deletes all action videos from file system.
Parameters
-
Returns
-
Creates an intent to start the Liveness activity.
Returns
-
Utility function to get the SDK error from OnActivityResult's intent.
Returns
The – String.
Retrieves the SDK license payload.
Parameters
-
Returns
The license payload () – the object that contains the extended info about licensing conditions.
Utility function to get SDK results from OnActivityResult's intent.
Returns
A list of OzAbstractMedia objects.
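A minimal sketch of reading both values in onActivityResult follows. The helper names getResultFromIntent and getErrorFromIntent are assumptions used here for illustration; the parameter tables for these utility functions are given further below.

override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
    super.onActivityResult(requestCode, resultCode, data)
    if (requestCode == LIVENESS_REQUEST_CODE && data != null) { // LIVENESS_REQUEST_CODE is your own constant
        val results: List<OzAbstractMedia> = OzLivenessSDK.getResultFromIntent(data) // assumed helper name
        val error: String? = OzLivenessSDK.getErrorFromIntent(data)                  // assumed helper name
        if (error == null) {
            // pass results to an AnalysisRequest
        }
    }
}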
Initializes SDK with license sources.
Returns
-
Enables logging using the Oz Liveness SDK logging mechanism.
Returns
-
Connection to API.
Connection to the telemetry server.
Deletes the saved token.
Parameters
-
Returns
-
Retrieves the telemetry session ID.
Parameters
-
Returns
The telemetry session ID (String parameter).
Retrieves the SDK version.
Parameters
-
Returns
The SDK version (String parameter).
Generates the payload with media signatures.
Returns
Payload to be sent along with media files that were used for generation.
A class for performing checks.
The analysis launching method.
A builder class for AnalysisRequest.
Creates the AnalysisRequest instance.
Parameters
-
Returns
The class instance.
Adds an analysis to your request.
Returns
Error if any.
Adds a list of analyses to your request. Allows executing several analyses for the same folder on the server side.
Returns
Error if any.
Adds metadata to a folder you create (for the server-based analyses only). You can add a pair key-value as additional information to the folder with the analysis result on the server side.
Returns
Error if any.
Uploads one or more media to a folder.
Returns
Error if any.
For the previously created folder, sets a folderId. The folder should exist on the server side. Otherwise, a new folder will be created.
Returns
Error if any.
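Taken together, the builder methods above can be combined as in the hedged sketch below; the enum value spellings, the builder finalizer, and the run callback shapes are assumptions based on the structures described later in this reference.

val capturedMedia: List<OzAbstractMedia> = listOf(/* media from the Liveness screen */)
val existingFolderId = "existing-folder-id" // hypothetical value from your backend

AnalysisRequest.Builder()
    .uploadMedia(capturedMedia)                                                   // a single media object or a list
    .addAnalysis(Analysis(Analysis.Type.BIOMETRY, Analysis.Mode.SERVER_BASED, capturedMedia))
    .addAnalyses(listOf(Analysis(Analysis.Type.QUALITY, Analysis.Mode.HYBRID, capturedMedia)))
    .addFolderMeta("meta1", "data1")                                              // key-value folder metadata (server-based only)
    .setFolderId(existingFolderId)                                                // optional: upload into an existing folder
    .build()                                                                      // assumed finalizer
    .run(
        onStatusChange = { status -> /* AnalysisStatus: media uploading, analysis running, ... */ },
        onError = { error -> /* OzException */ },
        onSuccess = { result -> /* RequestResult: resolution, folderId, analysisResults */ }
    )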
Configuration for OzLivenessSDK (use OzLivenessSDK.config).
Sets the length of the Selfie gesture (in milliseconds).
Returns
Error if any.
Allows enabling additional debug info by clicking on the version text.
The number of attempts before SDK returns error.
Settings for repeated media upload.
Timeout for face alignment (measured in milliseconds).
Interface implementation to retrieve errors from Liveness detection.
Locale to display string resources.
Logging settings.
Uses the main (rear) camera instead of the front camera for liveness detection.
Disables the option that prevents videos from being too short (3 frames or fewer).
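The configuration fields listed above are typically adjusted before opening the Liveness screen. A hedged sketch, assuming the holder class names (AttemptsSettings, UploadMediaSettings) and direct property assignment via OzLivenessSDK.config:

// Illustrative only: exact property and holder names may differ between SDK versions.
OzLivenessSDK.config.apply {
    attemptsSettings = AttemptsSettings(singleCount = 3, commonCount = 5)                // assumed holder name
    uploadMediaSettings = UploadMediaSettings(attemptsCount = 3, attemptsTimeout = 1000) // assumed holder name
    faceAlignmentTimeout = 10_000L      // milliseconds
    useMainCamera = false               // front camera
    allowDebugVisualization = false     // hide the debug info behind the version text
}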
Customization for OzLivenessSDK (use OzLivenessSDK.config.customization).
Hides the status bar and the three buttons at the bottom. The default value is True.
A set of customization parameters for the toolbar.
A set of customization parameters for the center hint that guides a user through the process of taking an image of themselves.
A set of customization parameters for the hint animation.
A set of customization parameters for the frame around the user face.
A set of customization parameters for the background outside the frame.
A set of customization parameters for the SDK version text.
A set of customization parameters for the antiscam message that warns the user that their actions are being recorded.
Logo customization parameters. Custom logo should be allowed by license.
Contains the action from the captured video.
Contains the extended info about licensing conditions.
A class for the captured media that can be:
A document photo.
A set of shots in an archive.
A Liveness video.
Contains an action from the captured video.
A class for license that can be:
Contains the license ID.
Contains the path to a license.
A class for analysis status that can be:
This status means the analysis is launched.
This status means the media is being uploaded.
The type of the analysis.
The mode of the analysis.
Contains information on what media to analyze and what analyses to apply.
The general status for all analyses applied to the folder created.
Holder for attempts counts before SDK returns error.
Contains the locale code according to .
Contains logging settings.
A class for color that can be (depending on the value received):
Frame shape settings.
Exception class for AnalysisRequest.
Structure that describes media used in AnalysisRequest.
Structure that describes the analysis result for the single media.
Consolidated result for all analyses performed.
Result of the analysis for all media it was applied to.
Defines the authentication method.
Authentication via token.
Authentication via credentials.
Defines the settings for the repeated media upload.
Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully. By default, the system uploads the compressed video.
Contains information about the .
Toolbar title text font style
titleTextSize
Int
Toolbar title text size (in sp, 12-18)
titleTextAlpha
Int
Toolbar title text opacity (in %, 0-100)
titleTextColor
Toolbar title text color
backgroundColor
Toolbar background color
backgroundAlpha
Int
Toolbar background opacity (in %, 0-100)
isTitleCentered
Boolean
Defines whether the text on the toolbar is centered or not
title
String
Text on the toolbar
Center hint text color
textAlpha
Int
Center hint text opacity (in %, 0-100)
verticalPosition
Int
Center hint vertical position from the screen bottom (in %, 0-100)
backgroundColor
Center hint background color
backgroundOpacity
Int
Center hint background opacity
backgroundCornerRadius
Int
Center hint background frame corner radius (in dp, 0-20)
A switcher for hint animation, if True, the animation is hidden
Frame color when a face is aligned properly
strokeAlpha
Int
Frame opacity (in %, 0-100)
strokeWidth
Int
Frame stroke width (in dp, 0-20)
strokePadding
Int
A padding from the stroke to the face alignment area (in dp, 0-10)
SDK version text opacity (in %, 20-100)
Antiscam message text color
textAlpha
Int
Antiscam message text opacity (in %, 0-100)
backgroundColor
Antiscam message background color
backgroundOpacity
Int
Antiscam message background opacity
cornerRadius
Int
Background frame corner radius (in px, 0-20)
flashColor
Color of the flashing indicator close to the antiscam message
Head tilted downwards
HeadUp
Head lifted up
EyeBlink
Blink
Smile
Smile
Media metadata
Media metadata
URL of the API media container
additionalTags (optional)
String
Additional tags if needed (including those not from the OzMediaTag enum)
metaData
Map<String, String>
Media metadata
A video with the smile gesture
VideoSelfieHigh
A video with the lifting head up gesture
VideoSelfieDown
A video with the tilting head downwards gesture
VideoSelfieRight
A video with the turning head right gesture
VideoSelfieLeft
A video with the turning head left gesture
PhotoIdPortrait
A photo from a document
PhotoIdBack
A photo of the back side of the document
PhotoIdFront
A photo of the front side of the document
Completion percentage
Additional parameters
sizeReductionStrategy
Defines what type of media is being sent to the server in case of the hybrid analysis once the on-device analysis is finished successfully
Spanish
PT-BR
Portuguese (Brazilian)
Media object
tags
List<String>
Tags for media
Source media
type
Type of the analysis
A list of results of the analyses for single media
confidenceScore
Float
Resulting score
analysisId
String
Analysis identifier
params
@RawValue Map<String, Any>
Additional folder parameters
error
Error if any
serverRawResponse
String
Response from backend
Whitelisted certificates
No found in a video
FORCE_CLOSED = 7
Error. Liveness activity is force closed from client application.
A user closed the Liveness screen during video recording
DEVICE_HAS_NO_FRONT_CAMERA = 8
Error. Device has not front camera.
No front camera found
DEVICE_HAS_NO_MAIN_CAMERA = 9
Error. Device has not main camera.
No rear camera found
DEVICE_CAMERA_CONFIGURATION_NOT_SUPPORTED = 10
Error. Device camera configuration is not supported.
Oz Liveness doesn't support the camera configuration of the device
FACE_ALIGNMENT_TIMEOUT = 12
Error. Face alignment timeout in OzLivenessSDK.config.faceAlignmentTimeout milliseconds
Time limit for the face alignment is exceeded
ERROR = 13
The check was interrupted by user
User has closed the screen during the Liveness check.
Parameter
Type
Description
actions
A list of possible actions
Parameter
Type
Description
data
Intent
The object to test
Parameter
Type
Description
data
Intent
The object to test
Parameter
Type
Description
context
Context
The Context class
licenseSources
A list of license references
statusListener
StatusListener
Optional listener to check the license load result
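For illustration, a matching init call might look like the sketch below; the license-source subclass name and argument labels are assumptions (the license-source parameters themselves are described at the end of this section), and the file path is hypothetical.

OzLivenessSDK.init(
    context = applicationContext,
    licenseSources = listOf(LicenseSource.LicenseFilePath("/absolute/path/to/forensics.license")), // hypothetical path, assumed subclass name
    statusListener = null // optional: pass a StatusListener to be notified about the license load result
)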
Parameter
Type
Description
tag
String
Message tag
log
String
Message log
Parameter
Type
Description
connection
Connection type
statusListener
StatusListener<String?>
Listener
Parameter
Type
Description
connection
Connection type
statusListener
StatusListener<String?>
Listener
Parameter
Type
Description
media
An array of media files
folderMeta (optional)
[string:any]
Additional folder metadata
Parameter
Type
Description
onStatusChange
A callback function as follows:
onStatusChange(status: AnalysisRequest.AnalysisStatus) { handleStatus() }
The function is executed when the status of the AnalysisRequest changes.
onError
A callback function as follows:
onError(error: OzException) { handleError() }
The function is executed in case of errors.
onSuccess
The function is executed when all the analyses are completed.
Parameter
Type
Description
analysis
A structure for analysis
Parameter
Type
Description
analysis
[Analysis]
A list of Analysis structures
Parameter
Type
Description
key
String
Key for metadata.
value
String
Value for metadata.
Parameter
Type
Description
mediaList
An OzAbstractMedia object or a list of objects.
Parameter
Type
Description
folderID
String
A folder identifier.
Parameter
Type
Description
selfieLength
Int
The length of the Selfie gesture (in milliseconds). Should be within 500-5000 ms, the default length is 700
Parameter
Type
Description
allowDebugVisualization
Boolean
Enables or disables the debug info.
Parameter
Type
Description
attemptsSettings
Sets the number of attempts
Parameter
Type
Description
uploadMediaSettings
Sets the number of attempts and timeout between them
Parameter
Type
Description
faceAlignmentTimeout
Long
A timeout value
Parameter
Type
Description
livenessErrorCallback
ErrorHandler
A callback value
Parameter
Type
Description
localizationCode
A locale code
Parameter
Type
Description
logging
Logging settings
Parameter
Type
Description
useMainCamera
Boolean
True – rear camera,
False – front camera
Parameter
Type
Description
disableFramesCountValidation
Boolean
True – validation is off,
False – validation is on
Parameter
Type
Description
closeIconRes
Int (@DrawableRes)
An image for the close button
closeIconTint
Close button color
titleTextFont
Int (@FontRes)
Toolbar title text font
titleTextFontStyle
Parameter
Type
Description
textFont
String
Center hint text font
textStyle
Int (values from android.graphics.Typeface properties, e.g., Typeface.BOLD)
Center hint text style
textSize
Int
Center hint text size (in sp, 12-34)
textColor
Parameter
Type
Description
hintGradientColor
Gradient color
hintGradientOpacity
Int
Gradient opacity
animationIconSize
Int
A side size of the animation icon square
hideAnimation
Parameter
Type
Description
geometryType
The frame type: oval, rectangle, circle, square
cornerRadius
Int
Rectangle corner radius (in dp, 0-20)
strokeDefaultColor
Frame color when a face is not aligned properly
strokeFaceInFrameColor
Parameter
Type
Description
backgroundColor
Background color
backgroundAlpha
Int
Background opacity (in %, 0-100)
Parameter
Type
Description
textFont
Int (@FontRes)
SDK version text font
textSize
Int
SDK version text size (in sp, 12-16)
textColor
SDK version text color
textAlpha
Parameter
Type
Description
textMessage
String
Antiscam message text
textFont
String
Antiscam message text font
textSize
Int
Antiscam message text size (in px, 12-18)
textColor
Parameter
Type
Description
image
Bitmap (@DrawableRes)
Logo image
size
Size
Logo size (in dp)
Case
Description
OneShot
The best shot from the video taken
Blank
A selfie with face alignment check
Scan
Scan
HeadRight
Head turned right
HeadLeft
Head turned left
Parameter
Type
Description
expires
Float
The expiration interval
features
Features
License features
appIDS
[String]
An array of bundle IDs
Parameter
Type
Description
tag
A tag for a document photo.
photoPath
String
An absolute path to a photo.
additionalTags (optional)
String
Additional tags if needed (including those not from the OzMediaTag enum).
metaData
Parameter
Type
Description
tag
A tag for a shot set
archivePath
String
A path to an archive
additionalTags (optional)
String
Additional tags if needed (including those not from the OzMediaTag enum)
metaData
Parameter
Type
Description
tag
A tag for a video
videoPath
String
A path to a video
bestShotPath (optional)
String
URL of the best shot in PNG
preferredMediaPath (optional)
Case
Description
Blank
A video with no gesture
PhotoSelfie
A selfie photo
VideoSelfieOneShot
A video with the best shot taken
VideoSelfieScan
A video with the scanning gesture
VideoSelfieEyes
A video with the blink gesture
Parameter
Type
Description
id
Int
License ID
Parameter
Type
Description
path
String
An absolute path to a license
Parameter
Type
Description
analysis
Contains information on what media to analyze and what analyses to apply.
Parameter
Type
Description
media
The object that is being uploaded at the moment
index
Int
Number of this object in a list
from
Int
Objects quantity
percentage
Case
Description
BIOMETRY
The algorithm that compares several media and checks whether the people on them are the same person.
QUALITY
The algorithm that aims to check whether a person in a video is a real human acting in good faith, not a fake of any kind.
DOCUMENTS (deprecated)
The analysis that aims to recognize the document and check if its fields are correct according to its type.
Case
Description
ON_DEVICE
The on-device analysis with no server needed
SERVER_BASED
The server-based analysis
HYBRID
The hybrid analysis for Liveness: if the score received from an on-device analysis is too high, the system initiates a server-based analysis as an additional check.
Parameter
Type
Description
type
Type
The type of the analysis
mode
Mode
The mode of the analysis
mediaList
An array of the OzAbstractMedia objects
params (optional)
Case
Description
FAILED
One or more analyses failed due to some error and couldn't get finished
DECLINED
The check failed (e.g., faces don't match or some spoofing attack detected)
SUCCESS
Everything went fine, the check succeeded (e.g., faces match or liveness confirmed)
OPERATOR_REQUIRED
The result should be additionally checked by a human operator
Parameter
Type
Description
singleCount
Int
Attempts on a single action/gesture
commonCount
Int
Total number of attempts on all actions/gestures if you use a sequence of them
Case
Description
EN
English
HY
Armenian
KK
Kazakh
KY
Kyrgyz
TR
Turkish
Parameter
Type
Description
allowDefaultLogging
Boolean
Allows logging to LogCat
allowFileLogging
Boolean
Allows logging to an internal file
journalObserver
StatusListener
An event listener to receive journal events on the application side
Parameter
Type
Description
resId
Int
Link to the color in the Android resource system
Parameter
Type
Description
hex
String
Color hex (e.g., #FFFFFF)
Parameter
Type
Description
color
Int
The Int value of a color in Android
Case
Description
Oval
Oval frame
Rectangle
Rectangular frame
Circle
Circular frame
Square
Square frame
Parameter
Type
Description
apiErrorCode
Int
Error code
message
String
Error message
Parameter
Type
Description
mediaId
String
Media identifier
mediaType
String
Type of the media
originalName
String
Original media name
ozMedia
Parameter
Type
Description
confidenceScore
Float
Resulting score
isOnDevice
Boolean
Mode of the analysis
resolution
Consolidated analysis result
sourceMedia
Parameter
Type
Description
analysisResults
List<AnalysisResult>
Analysis result
folderId
String
Folder identifier
resolution
Consolidated analysis result
Parameter
Type
Description
resolution
Consolidated analysis result
type
Type of the analysis
mode
Resulting score
resultMedia
Parameter
Type
Description
host
String
API address
token
String
Access token
sslPins (optional)
List<sslPin>
Whitelisted certificates
Parameter
Type
Description
host
String
API address
username
String
User name
password
String
Password
sslPins (optional)
Parameter
Type
Description
attemptsCount
Int
Number of attempts for media upload
attemptsTimeout
Int
Timeout between attempts
Case
Description
UPLOAD_ORIGINAL
The original video
UPLOAD_COMPRESSED
The compressed video
UPLOAD_BEST_SHOT
The best shot taken from the video
UPLOAD_NOTHING
Nothing is sent (note that no folder will be created)
Parameter
Type
Description
hash
String
SHA256 key hash in base64
expiredAt
UNIX timestamp, UTC time
The date of certificate expiration
Error Code
Error Message
Description
ERROR = 3
Error.
An unknown error has happened
ATTEMPTS_EXHAUSTED_ERROR = 4
Error. Attempts exhausted for liveness action.
The number of action attempts is exceeded
VIDEO_RECORD_ERROR = 5
Error by video record.
An error happened during video recording
NO_ACTIONS_ERROR = 6
Int (values from android.graphics.Typeface properties, e.g., Typeface.BOLD)
Boolean
Int
HeadDown
Map<String, String>
Map<String, String>
String
VideoSelfieSmile
Int
Map<String, Any>
ES
List<>
List<>
Error. OzLivenessSDK started without actions.