Add the lines below to the pubspec.yaml file of the project you want to add the plugin to.
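A minimal sketch of the dependency entry, assuming the plugin is distributed as a pub package named ozsdk (the package name and version are placeholders; use the values from your Oz Forensics distribution):

```yaml
dependencies:
  flutter:
    sdk: flutter
  # Plugin name and version are placeholders; use the values
  # provided with your Oz Forensics distribution.
  ozsdk: ^8.0.0
```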
Add the license file (e.g., license.json or forensics.license) to the assets folder of your Flutter application. In pubspec.yaml, specify the Flutter asset:
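For example, with a license file named license.json stored in the assets folder:

```yaml
flutter:
  assets:
    # The path and file name must match the ones you pass to the init method.
    - assets/license.json
```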
For Android, add the Oz repository to /android/build.gradle, allprojects → repositories section:
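A sketch of the repositories section; the repository URL shown is an assumption, so use the one supplied by Oz Forensics:

```groovy
// android/build.gradle
allprojects {
    repositories {
        google()
        mavenCentral()
        // Illustrative URL; replace it with the Oz repository URL from your integration instructions.
        maven { url "https://ozforensics.jfrog.io/artifactory/main" }
    }
}
```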
The minimum SDK version should be 21 or higher:
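For example, in android/app/build.gradle:

```groovy
android {
    defaultConfig {
        minSdkVersion 21
    }
}
```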
For iOS, set the minimum platform version to 13 or higher in Runner → Info → Deployment Target → iOS Deployment Target.
In ios/Podfile, comment out the `use_frameworks!` line (`# use_frameworks!`).
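A sketch of the relevant ios/Podfile lines; the platform line mirrors the iOS 13 minimum set in Xcode:

```ruby
# ios/Podfile (only the relevant lines shown)
platform :ios, '13.0'

# use_frameworks!   # keep this line commented out
```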
Initialize the SDK by calling the `init` plugin method. Note that the license file name and path should match the ones specified in pubspec.yaml (e.g., assets/license.json).
Use the API credentials (login, password, and API URL) that you’ve received from us.
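A minimal Dart sketch of the initialization step. Only the `init` method name, the license path, and the credential set (API URL, login, password) come from this guide; the package import, the plugin class name (OzSDK), and the connection call shown are assumptions:

```dart
import 'package:ozsdk/ozsdk.dart'; // package name is an assumption

Future<void> initOzSdk() async {
  // The license path must match the asset declared in pubspec.yaml.
  await OzSDK.init('assets/license.json');

  // Hypothetical connection call: use the API URL, login, and password you received from us.
  await OzSDK.setApiConnection(
    host: 'https://your-api-url.com',
    login: 'your-login',
    password: 'your-password',
  );
}
```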
In production, instead of hard-coding the login and password inside the application, it is recommended to get the access token on your backend via the API auth method, then pass it to your application:
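A sketch of the token flow, assuming your backend exposes its own endpoint that returns the token and that the plugin accepts a token-based connection (the backend URL and the setApiConnectionWithToken method are hypothetical):

```dart
import 'dart:convert';

import 'package:http/http.dart' as http; // add http to pubspec.yaml for this sketch
import 'package:ozsdk/ozsdk.dart';       // package name is an assumption

Future<void> connectWithBackendToken() async {
  // Your backend obtains the access token via the API auth method and hands it
  // to the application through its own (hypothetical) endpoint.
  final response = await http.get(Uri.parse('https://your-backend.com/oz-access-token'));
  final accessToken = jsonDecode(response.body)['access_token'] as String;

  // Hypothetical plugin call: connect to the Oz API with the token instead of credentials.
  await OzSDK.setApiConnectionWithToken(
    host: 'https://your-api-url.com',
    token: accessToken,
  );
}
```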
To start recording, use the `executeLiveness` method to obtain the recorded media:
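A sketch of the capture call; the `executeLiveness` name and its `actions` / `use_main_camera` parameters come from this guide, while the action value and the return type are assumptions:

```dart
// Start the Liveness capture; the action value below is illustrative.
final media = await OzSDK.executeLiveness(
  [VerificationAction.blank], // actions: what the user performs during capture
  false,                      // use_main_camera: false selects the front camera
);
```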
The `media` object contains the captured media data.
To run the analyses, execute the code below.
Create the `Analysis` object:
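A sketch of forming the analysis; the constructor fields and enum names are assumptions based on this guide's terminology:

```dart
// Form a server-based Liveness (quality) analysis on the captured media.
final analysis = [
  Analysis(
    type: AnalysisType.quality,      // hypothetical enum value
    mode: AnalysisMode.serverBased,  // hypothetical enum value
    media: media,                    // media returned by executeLiveness
  ),
];
```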
Execute the formed analysis:
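A sketch of running the formed analyses; the analyzing method name is an assumption:

```dart
// Run the analyses and wait for the results.
final analysisResult = await OzSDK.analyze(analysis);
```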
The `analysisResult` list of objects contains the results of the analyses.
If you want to use media captured by another SDK, the code should look like this:
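A sketch under the assumption that the plugin lets you wrap an externally captured file into a media object; the `Media` constructor, its field names, and the file path are all illustrative:

```dart
// Wrap a video file recorded by another SDK; field names are hypothetical.
final externalMedia = [
  Media(
    type: MediaType.video,                     // hypothetical enum value
    videoPath: '/path/to/external_video.mp4',  // placeholder path to the captured file
  ),
];

// Form and run the analysis on the external media, as in the previous steps.
final analysis = [
  Analysis(
    type: AnalysisType.quality,
    mode: AnalysisMode.serverBased,
    media: externalMedia,
  ),
];
final analysisResult = await OzSDK.analyze(analysis);
```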
| Parameter | Type | Description |
| --- | --- | --- |
| `actions` | `List<VerificationAction>` | Actions from the captured video |
| `use_main_camera` | Boolean | If True, uses the main camera; otherwise, the front one |