How to Install and Use Oz Flutter Plugin

Installation and Licensing

Add the lines below to the pubspec.yaml of the project you want to add the plugin to, under the dependencies section.

dependencies:
  ozsdk:
    git:
      url: https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin.git
      ref: '8.5.0'

Add the license file (e.g., license.json or forensics.license) to your Flutter application's assets folder. In pubspec.yaml, specify the Flutter asset:

assets:
  - assets/license.json # the license file name must match the one placed in assets

For Android, add the Oz repository to /android/build.gradle, allprojects → repositories section:

allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://ozforensics.jfrog.io/artifactory/main' } // repository URL
    }
}

The minimum SDK version should be 21 or higher:

defaultConfig {
  ...
  minSdkVersion 21
  ...
}

For iOS, set the minimum deployment target to 13 or higher in Runner → Info → Deployment target → iOS Deployment Target.

In ios/Podfile, comment out the use_frameworks! line (#use_frameworks!).

Getting Started with Flutter

Initializing SDK

Initialize the SDK by calling the initSDK plugin method. Note that the license file name and path must match the ones specified in pubspec.yaml (e.g., assets/license.json).

await OZSDK.initSDK([<% license path and license file name %>]);
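For example, if the license was bundled as assets/license.json as in the pubspec.yaml example above, the call might look like the sketch below; adjust the path to match your own asset declaration.

// A minimal sketch, assuming the license was declared as assets/license.json in pubspec.yaml.
await OZSDK.initSDK(['assets/license.json']);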

Connecting SDK to API

Use the API credentials (login, password, and API URL) that you’ve received from us.

await OZSDK.setApiConnectionWithCredentials(<login>, <password>, <host>);

In production, instead of hard-coding the login and password in the application, it is recommended to obtain the access token on your backend via the API auth method and then pass it to your application:

await OZSDK.setApiConnectionWithToken(token, host);
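As an illustration, here is a hedged sketch of fetching the token from your own backend and passing it to the plugin. The backend endpoint, its response shape, the host value, and the import path for the plugin are assumptions for illustration, not part of the plugin API.

import 'dart:convert';
import 'package:http/http.dart' as http;
import 'package:ozsdk/ozsdk.dart'; // import path assumed from the package name in pubspec.yaml

// A minimal sketch, assuming a hypothetical backend endpoint that returns
// {"access_token": "..."} obtained via the Oz API auth method.
Future<void> connectToOzApi() async {
  final response =
      await http.get(Uri.parse('https://backend.example.com/oz/token')); // hypothetical endpoint
  final token = jsonDecode(response.body)['access_token'] as String;
  await OZSDK.setApiConnectionWithToken(token, 'https://api.example.com'); // use the API URL you received
}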

Capturing Videos

To start recording and obtain the recorded media, use the executeLiveness method:

final media = await OZSDK.executeLiveness(<actions>, <use_main_camera>);

Parameter          Type                         Description
actions            List<VerificationAction>    Actions from the captured video
use_main_camera    Boolean                      If true, uses the main camera; otherwise, the front one

The media object contains the captured media data.
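For example, to capture a one-shot video with the front camera (a sketch using the oneShot action that also appears later in this guide):

// A minimal sketch: one-shot liveness capture with the front camera (use_main_camera = false).
final media = await OZSDK.executeLiveness([VerificationAction.oneShot], false);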

Checking Liveness and Face Biometry

To run the analyses, execute the code below.

Create the Analysis object:

List<Analysis> analysis = [ Analysis(Type.quality, Mode.onDevice, <media>, {}), ];

Execute the formed analysis:

final analysisResult = await OZSDK.analyze(analysis, [], {}) ?? [];

The analysisResult list contains the results of the analyses.
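Putting the capture and analysis steps together, a hedged end-to-end sketch (assuming the SDK is already initialized and connected to the API) might look like this:

// A minimal sketch combining the calls above; error handling is omitted.
final media = await OZSDK.executeLiveness([VerificationAction.oneShot], false);
final analysis = [Analysis(Type.quality, Mode.onDevice, media, {})];
final analysisResult = await OZSDK.analyze(analysis, [], {}) ?? [];
// analysisResult holds the analysis results described above.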

If you want to use media captured by another SDK, the code should look like this:

final media = Media(FileType.documentPhoto, VerificationAction.oneShot, "photo_selfie", null, <path to image>, null, null, "");
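For instance (a sketch; the file path below is an illustrative placeholder, and the resulting object is used wherever <media> appears in the Analysis construction above):

// A minimal sketch, assuming the photo already exists at the given path;
// replace the path with the real location of your image.
final externalMedia = Media(FileType.documentPhoto, VerificationAction.oneShot,
    "photo_selfie", null, "/path/to/photo_selfie.jpg", null, null, "");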
