This section contains the "old" documentation for the Android and iOS SDKs.
As we constantly upgrade our products, the documentation gets updated as well. This section covers previous versions of the products.
In February 2021, we launched new documentation, which includes revised articles from the previous documentation as well as new instructions. Articles are updated constantly so that our customers and site visitors always have up-to-date documentation.
If you started implementing our products before February 2021 and haven't changed anything since then, the previous documentation is available at this link.
For documentation on older product versions released after February 2021, please refer to the subpages of this page.
The SDK contains the IOzForensicsAPI interface, which describes the API network calls that can be used to create a Retrofit instance. This interface uses a Gson converter and operates on classes from the com.ozforensics.liveness.sdk.api.model package.
Apart from this, the interface specifies a static method for creating a default Retrofit instance (without logging or interceptors, and with 15-second timeouts). This instance will look for the server at the specified address:
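The original snippet is not reproduced here. Below is a minimal sketch of what such a default Retrofit instance typically looks like; the createDefaultApi helper and the base URL are illustrative, not part of the SDK.

```kotlin
import com.google.gson.Gson
import okhttp3.OkHttpClient
import retrofit2.Retrofit
import retrofit2.converter.gson.GsonConverterFactory
import java.util.concurrent.TimeUnit

// A sketch of a default Retrofit instance as described above:
// Gson converter, no logging or interceptors, 15-second timeouts.
// The helper name and base URL below are illustrative only.
fun createDefaultApi(baseUrl: String): IOzForensicsAPI {
    val client = OkHttpClient.Builder()
        .connectTimeout(15, TimeUnit.SECONDS)
        .readTimeout(15, TimeUnit.SECONDS)
        .writeTimeout(15, TimeUnit.SECONDS)
        .build()

    return Retrofit.Builder()
        .baseUrl(baseUrl) // e.g., the address of your Oz API server
        .client(client)
        .addConverterFactory(GsonConverterFactory.create(Gson()))
        .build()
        .create(IOzForensicsAPI::class.java)
}
```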
The SDK includes the OzForensicsService class that uses a Retrofit instance from OzForensicsAPI.create(). This class wraps the network calls of the Retrofit interface and checks for the presence of a token. When an auth request is performed, the token is stored automatically for internal use. The required metadata is also added where necessary when performing network requests (creating a folder, uploading media data to be analyzed). The method calls of this class are asynchronous (the StatusListener<> interface is used). You can obtain an instance of this class as follows:
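The original snippet is not reproduced here. The sketch below is purely illustrative: the actual constructor or factory parameters (for example, a Context, the API address, or the token) may differ between SDK versions.

```kotlin
// Illustrative sketch only; the real signature may differ in your SDK version.
val TOKEN: String? = null  // pass null if you have not obtained a token yet

val service = OzForensicsService(
    "https://api.example.ozforensics.com/", // placeholder for your Oz API address
    TOKEN
)
```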
If the TOKEN parameter value is set to null, authorization is required before performing any API call (except auth):
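The original snippet is not shown here. The sketch below only illustrates the general pattern, assuming a hypothetical auth method and StatusListener callbacks named onSuccess and onError; the real names and parameters may differ.

```kotlin
// Hypothetical call; substitute your credentials and the actual method signature.
service.auth("user@example.com", "password", object : StatusListener<AuthResponse> {
    override fun onSuccess(response: AuthResponse) {
        // The access token from AuthResponse is stored by the service automatically.
    }

    override fun onError(error: Throwable) {
        // Handle the authorization error.
    }
})
```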
After a successful request, onSuccessCallback is invoked so that the access token can be transferred with AuthResponse.
Please note: this method only works if your SDK version is below 3.0.0. If it is 3.0.0 or higher, please use the methods listed above.
Data to be uploaded and analyzed is stored in the results object (see above), obtained after capturing and recording the video. Upload it to the server and initiate the necessary analyses with the help of .
A simple scenario of interaction with Oz API can be implemented with the OZSDK.analyse method as described below.
The completion | resolution block will contain the result of the assigned analysis, including status (the status of the analysis), type (the type of the analysis), and folderID (the ID of the Oz API folder).
To perform a cross-functional analysis based on video and document photos, use the OZSDK.documentAnalyse method:
The completion | resolution block will contain the result of the assigned analysis (similar to OZSDK.analyse), where folderResolutionStatus is the general status of analyses for the folder.
For both document and face checks, you can use the OZSDK.uploadAndAnalyse method:
The resolution block will contain the result of the assigned analysis (similar to OZSDK.analyse and OZSDK.documentAnalyse).
The methods below are used to analyze media on the device.
Biometry:
Liveness:
Data to be uploaded and analyzed is stored in the sdkMediaResult object, obtained after capturing and recording the video. Upload it to the server and initiate the required analyses via .
A simple scenario of interaction with Oz API can be implemented with the uploadMediaAndAnalyze method as described below.
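The original snippet is not reproduced here. The sketch below shows the general pattern only; the enclosing OzLivenessSDK object and the listener callbacks are assumptions and may differ in your SDK version.

```kotlin
// Illustrative sketch; the actual entry point and listener interface may differ.
OzLivenessSDK.uploadMediaAndAnalyze(
    sdkMediaResult,                 // media captured after recording the video
    object : AnalysisListener {     // hypothetical listener interface
        override fun onSuccess(result: AnalysisResult) {
            // Inspect the analysis result (for example, resolution and folder ID).
        }

        override fun onError(error: Throwable) {
            // Handle upload or analysis errors.
        }
    }
)
```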
To run the on-device analyses instead of server-based, use the following methods:
How to work with Oz Forensics applications for iOS and Android.
To test the Oz Forensics Mobile SDK, use this demo kit. With it, you can check how accurate it is at face recognition and liveness detection. Contact us to get your credentials, and let's begin.
To start working with the Oz Forensics application, download it for your OS. The application is available for iOS and Android.
For iOS, you'll need the TestFlight application. Download it from the App Store: TestFlight. When the download completes, get the Oz Forensics iOS demo application here: iOS Demo App. Tap the Oz Liveness icon.
Upon the first login, you'll be prompted to enter the credentials you've received from us. Check the API Address field – it should contain the following address: “https://sandbox.ohio.ozforensics.com/”. Once entered, tap the LOGIN button. You'll be redirected to the Scenarios screen.
For Android, download the latest version of the application here: Android Demo App. Make sure you don't have the previous version installed. You might also need to enable installation of apps from unknown sources. Once downloaded, tap the package to process it and complete the installation. Launch the app and you'll see the Scenarios screen.
Tap the Profile icon in the top-right corner. On the new screen, enter your credentials. Check the API Address field – it should contain the following address: “https://api.sandbox.ohio.ozforensics.com”. After you log in, you'll get back to the Scenarios screen.
For both systems, if you entered the wrong credentials and need to change the user, tap the Profile icon in the top-right corner. You'll see your username and API server address. Tap Change User (iOS) / LOGOUT (Android) and repeat the login process with the correct credentials.
In your profile, you can also:
Switch to another document check method
Choose between server-based and on-device analyses (we'll describe the difference below)
Check the current version of the application and SDK
Within the app, you can follow one of the three scenarios.
Biometric check: a client takes a selfie, and the system compares their selfie with images from a database. This scenario is for transaction confirmations, self check-ins, access management, etc.
Liveness: a client takes a short selfie video to confirm they are a living person acting in good faith. This is a spoofing-protection scenario.
Below, we’ll explain how to manage each of the scenarios.
Please note: for testing purposes, you choose a photo to compare the selfie with from your phone library, or take it with your camera. Later on, at the production stage, you'll be able to create either a blacklist or a whitelist, which will be used as the database.
In the Biometric check section, tap Select Photo to specify an image for comparison. You can either take a photo or select it from your gallery. Confirm the choice. The photo will appear on your screen. Tap Do Check and follow the instructions.
If the faces match, you'll get the “Authentication successful” message. If they don't match, a warning message will appear.
In the Liveness section, you need to pick one of the gestures that will be used to confirm that you’re a real living person. They are:
One shot
Selfie
Scan
Blink
Smile
Tilt up
Slope down
Turn your head left
Turn your head right
Combo – a combination of several gestures from above
If the Liveness check is successful, you'll get the “Authentication successful” message. If not, a warning message will appear.
All photos and videos you take can be found in the Web console*. You can find documentation on the console usage here: Oz Web UI User Guide.
*There is also a way to perform analyses locally on your device. To enable this option, proceed to the Profile and change the Analysis type in the Settings section from Server-based to On-device. In this case, all checks are performed on your device, and no data is sent to the server. This is a faster and safer way of analyzing. However, the server check is more accurate, and the information is stored for future needs.
Should you have any questions, please contact us.