To start using the Oz iOS SDK, follow the steps below.
1. Embed Oz iOS SDK into your project as described here.
2. Connect the SDK to Oz API as described here. This step is optional: the connection is required only when you need to process data on a server. In the on-device mode, the data is not transferred anywhere, so no connection is needed.
3. Capture videos by creating the controller as described here. You'll send them for analysis afterwards.
4. Upload and analyze the media you've captured at the previous step. The process of checking liveness and face biometry is described here.
If you want to customize the look and feel of Oz iOS SDK, please refer to this section.
Minimum iOS version: 11.
Minimum Xcode version: 15 (for SDK versions 8.10.0 and newer).
Available languages: EN, ES, HY, KK, KY, TR, PT-BR.
The source code of a sample app that uses the Oz Liveness SDK is located in the GitLab repository:
Follow the link below to see a list of SDK methods and properties:
Download the latest build of the demo app here.
To integrate OZLivenessSDK into an Xcode project via the CocoaPods dependency manager, add the following code to your Podfile:
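A sketch of what the Podfile entry may look like; the repository URL and the tag below are illustrative placeholders rather than confirmed values, so check the installation details for your SDK version:

```ruby
# Illustrative Podfile entry; the :git URL and :tag are placeholders.
pod 'OZLivenessSDK', :git => 'https://gitlab.com/oz-forensics/oz-mobile-ios-sdk', :tag => '<version>'
```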
Specifying the version is optional: by default, the newest version is integrated. However, if necessary, you can find older version numbers in the Changelog.
Starting with 8.1.0, you can also use simpler code:
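For example, assuming the pod is published under the name OZLivenessSDK so that no repository URL is needed:

```ruby
# Assumed short form for 8.1.0 and newer.
pod 'OZLivenessSDK'
```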
By default, the full version is installed. It contains both the server-based and on-device analysis modes. To install the server-based version only, use the following code:
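A sketch, assuming the server-based part is exposed as a subspec (the Core name is an assumption; the URL and tag are placeholders):

```ruby
# Illustrative server-only entry; subspec name, URL, and tag are assumptions.
pod 'OZLivenessSDK/Core', :git => 'https://gitlab.com/oz-forensics/oz-mobile-ios-sdk', :tag => '<version>'
```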
For 8.1.0 and higher:
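With the same assumed subspec name:

```ruby
# Assumed short form of the server-only entry for 8.1.0 and newer.
pod 'OZLivenessSDK/Core'
```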
Please note: installation via SPM is available for versions 8.7.0 and above.
Add the following package dependencies via SPM: https://gitlab.com/oz-forensics/oz-mobile-ios-sdk (if you need a guide on adding the package dependencies, please refer to the Apple documentation). OzLivenessSDK is mandatory. If you don't need the on-device analyses, skip the OzLivenessSDKOnDevice file.
You can also add the necessary frameworks to your project manually.
Download the SDK files from here and add them to your project.
OZLivenessSDK.xcframework,
OZLivenessSDKResources.bundle,
OZLivenessSDKOnDeviceResources.bundle (if you don't need the on-device analyses, skip this file).
Download the TensorFlow framework 2.11 from here.
Make sure that:
both xcframeworks are listed in Target -> Build Phases -> Link Binary With Libraries and Target -> General -> Frameworks, Libraries, and Embedded Content;
the bundle file(s) are listed in Target -> Build Phases -> Copy Bundle Resources.
To connect the SDK to the Oz API, specify the API URL and the access token as shown below.
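A minimal sketch of this call, assuming the SDK exposes an OZSDK.setApiConnection method and a Connection.fromServiceToken factory (verify the names against the method reference for your version; the host and token values are placeholders):

```swift
import OZLivenessSDK

// Assumed API: connect the SDK to Oz API using the API URL and an access token.
OZSDK.setApiConnection(Connection.fromServiceToken(
    host: "https://api.example.com",   // your Oz API URL
    token: "<access token>"            // token issued for your account
)) { token, error in
    // Handle the connection result, e.g. log the error if the call failed.
}
```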
Please note: in your host application, it is recommended to set the API address on the screen that precedes the liveness check. Setting the API URL initiates a service call to the API, which may cause excessive server load if done at application initialization or startup.
Alternatively, you can use the login and password provided by your Oz Forensics account manager:
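A sketch of the credential-based variant, assuming a Connection.fromCredentials factory (the name and parameters are assumptions; the values are placeholders):

```swift
// Assumed API: connect with a login and password instead of an access token.
OZSDK.setApiConnection(Connection.fromCredentials(
    host: "https://api.example.com",
    login: "user@example.com",
    password: "<password>"
)) { token, error in
    // The resulting access token, if any, can be stored for later use.
}
```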
For telemetry, set up a separate connection as shown below:
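A sketch under the assumption that the telemetry connection is configured via an OZSDK.setEventsConnection method mirroring the API connection call (name and host are assumptions):

```swift
// Assumed API: a separate connection for telemetry events.
OZSDK.setEventsConnection(Connection.fromServiceToken(
    host: "https://telemetry.example.com",   // telemetry endpoint for your account
    token: "<access token>"
)) { token, error in
    // Handle the telemetry connection result.
}
```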
Create a controller that will capture videos as follows:
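A sketch of creating and presenting the capture controller, assuming an OZSDK.createVerificationVCWithDelegate factory and an OZVerificationMovement action enum (the names and the .selfie case are assumptions; check the method reference for your version):

```swift
import UIKit
import OZLivenessSDK

final class HostViewController: UIViewController {

    func startLivenessCheck() {
        // Gestures the user is asked to perform while the video is captured;
        // the exact case names depend on the SDK version.
        let actions: [OZVerificationMovement] = [.selfie]

        // Assumed factory method that builds the capture controller and
        // registers this object as the result delegate.
        let livenessVC = OZSDK.createVerificationVCWithDelegate(self, actions: actions)
        present(livenessVC, animated: true)
    }
}
```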
Once the video is captured, the system calls the onOZLivenessResult method:
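A sketch of the delegate callback for the controller shown above, assuming the delegate protocol is named OZLivenessDelegate and the results are media objects (the protocol name, the OZMedia type, and the onError signature are assumptions; verify them against the method reference):

```swift
extension HostViewController: OZLivenessDelegate {

    // Called when capturing finishes: the resulting media objects are what
    // you later upload for the liveness and face biometry analyses.
    func onOZLivenessResult(results: [OZMedia]) {
        // Store the results for the analysis step, or stop here if you only
        // need the captured videos.
    }

    // Called when capturing fails, e.g. with failedBecauseUserCancelled
    // when the user closes the capture screen manually.
    func onError(status: OZVerificationStatus?) {
        // Handle or log the error.
    }
}
```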
If you use our SDK just for capturing videos, omit the Checking Liveness and Face Biometry step.
If a user closes the capturing screen manually, the failedBecauseUserCancelled error appears.
action – a list of user's actions while capturing the video.
The method returns the results of video capturing as a list of media objects. The system uses these objects to perform checks.