# How to Install and Use Oz Flutter Plugin

Please find the Flutter repository [here](https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin/-/tree/develop/example?ref_type=heads).

## **Installation and Licensing** <a href="#installation-and-licensing" id="installation-and-licensing"></a>

Add the lines below to the **pubspec.yaml** file of the project you want to add the plugin to.

For 8.22 and above:

<pre class="language-yaml"><code class="lang-yaml">ozsdk: ^<a data-footnote-ref href="#user-content-fn-1">8.22.0</a>
</code></pre>

For 8.21 and below:

<pre class="language-yaml"><code class="lang-yaml">  ozsdk:
    git:
      url: https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin.git
      ref: '<a data-footnote-ref href="#user-content-fn-1">8.8.2</a>'
</code></pre>

Add the license file (e.g., *license.json* or *forensics.license*) to the **Flutter application/assets** folder. In pubspec.yaml, specify the Flutter asset:

```yaml
assets:
  - assets/license.json # note that the license file name must match the one placed in assets
```

For Android, add the Oz repository to **/android/build.gradle**, allprojects → repositories section:

```groovy
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://ozforensics.jfrog.io/artifactory/main' } // repository URL
    }
}
```

For Flutter 8.24.0 and above or Android Gradle plugin 8.0.0 and above, add to **android/gradle.properties**:

```properties
android.nonTransitiveRClass=false
```

The minimum Android SDK version should be 21 or higher (**android/app/build.gradle**, `defaultConfig` section):

```groovy
defaultConfig {
  ...
  minSdkVersion 21
  ...
}
```

For iOS, set the minimum deployment target to iOS 13 or higher in **Runner → Info → Deployment Target → iOS Deployment Target**.

In **ios/Podfile**, comment out the `use_frameworks!` line (`# use_frameworks!`).
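Taken together, the relevant part of the Podfile might look like the sketch below. This is only a fragment of the Flutter-generated Podfile, and the `'13.0'` value is an assumption matching the deployment target above:

```ruby
# ios/Podfile (fragment)
platform :ios, '13.0'        # minimum iOS version

target 'Runner' do
  # use_frameworks!          # commented out, as required for the Oz plugin
end
```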

## **Getting Started with Flutter** <a href="#getting-started-with-flutter" id="getting-started-with-flutter"></a>

### **Initializing SDK** <a href="#initialize-sdk" id="initialize-sdk"></a>

Initialize the SDK by calling the `init` plugin method. Note that the license file path and name must match the ones specified in pubspec.yaml (e.g., *assets/license.json*).

```dart
await OZSDK.initSDK([<license path and license file name>]);
```

### **Connecting SDK to API** <a href="#connect-sdk-to-oz-api" id="connect-sdk-to-oz-api"></a>

Use the API credentials (login, password, and API URL) that you’ve received from us.

```dart
await OZSDK.setApiConnectionWithCredentials(<login>, <password>, <host>);
```

In production, instead of hard-coding the login and password inside the application, we recommend obtaining the access token on your backend via the API auth method and then passing it to your application:

```dart
await OZSDK.setApiConnectionWithToken(token, host);
```

By default, logs are saved along with the analyses' data. If you need to keep the logs separate from the analysis data, set up a separate connection for [telemetry](https://doc.ozforensics.com/oz-knowledge/other/faq#what-is-telemetry-and-why-should-i-use-it) as shown below:

```dart
await OZSDK.setEventConnectionWithCredentials(<login>, <password>, <host>);
```

or

```dart
await OZSDK.setEventConnectionWithToken(<token>, <host>);
```

### **Capturing Videos** <a href="#add-face-recording" id="add-face-recording"></a>

To start recording, call the `startLiveness` method:

```dart
await OZSDK.startLiveness(<actions>, <use_main_camera>);
```

| **Parameter**     | **Type**                  | **Description**                                           |
| ----------------- | ------------------------- | --------------------------------------------------------- |
| actions           | List\<VerificationAction> | Liveness actions (gestures) to capture in the video       |
| use\_main\_camera | Boolean                   | If `true`, uses the main (back) camera; otherwise, the front one. |

Please note: for versions 8.11 and below, the method name is `executeLiveness`, and it returns the recorded media.
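For those older versions, `executeLiveness` returns the captured media directly instead of delivering it through a stream. A minimal sketch, using the document's angle-bracket placeholder style:

```dart
// 8.11 and below: the call completes when capture finishes and returns the media
final List<Media> media = await OZSDK.executeLiveness(<actions>, <use_main_camera>);
```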

To obtain the media result, subscribe to `livenessResult` as shown below:

```dart
import 'dart:async';

import 'package:flutter/material.dart';
// also import the Oz plugin package that provides OZSDK and Media

class Screen extends StatefulWidget {
  static const route = 'liveness';

  const Screen({super.key});

  @override
  State<Screen> createState() => _ScreenState();
}

class _ScreenState extends State<Screen> {
  late StreamSubscription<List<Media>> _subscription;

  @override
  void initState() {
    super.initState();

    // subscribe to liveness result
    _subscription = OZSDK.livenessResult.listen(
      (List<Media> medias) {
          // medias contains the captured liveness media
      },
      onError: (Object error) {
        // handle error, in most cases PlatformException
      },
    );
  }

  @override
  Widget build(BuildContext context) {
    // omitted to shorten the example
  }

  void _startLiveness() async {
    // use startLiveness to start liveness screen
    await OZSDK.startLiveness(<list of actions>);
  }

  @override
  void dispose() {
    // cancel subscription
    _subscription.cancel();
    super.dispose();
  }
}
```

### **Checking Liveness and Face Biometry** <a href="#run-analyses" id="run-analyses"></a>

To run the analyses, execute the code below.

Create a list of `Analysis` objects:

```dart
List<Analysis> analysis = [ Analysis(Type.quality, Mode.serverBased, <media>, {}), ];
```

Execute the configured analyses:

```dart
final analysisResult = await OZSDK.analyze(analysis, [], {});
```

If you need to run an analysis for a particular folder, pass its ID:

```dart
final analysisResult = await OZSDK.analyze(analysis, folderID, [], {});
```

The `analysisResult` list of objects contains the result of the analysis.

If you want to use media captured by another SDK, the code should look like this:

```dart
final media = Media(FileType.documentPhoto, VerificationAction.oneShot, "photo_selfie", null, <path to image>, null, null, "");
```

The whole code block will look like this:

{% tabs %}
{% tab title="Liveness" %}

```dart
// replace VerificationAction.blank with your Liveness gesture if needed
final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);

final analysis = [
  Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
];

final analysisResult = await OZSDK.analyze(analysis, [], {});
```

{% endtab %}

{% tab title="Biometry" %}

```dart
// replace VerificationAction.blank with your Liveness gesture if needed
final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);
final biometryMedia = [...cameraMedia];
biometryMedia.add(
  Media(
    FileType.documentPhoto,
    VerificationAction.blank,
    MediaType.movement,
    null,
    <your reference image path>,
    null,
    null,
    MediaTag.photoSelfie,
  ),
);

final analysis = [
  Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
  Analysis(Type.biometry, Mode.serverBased, biometryMedia, {}),
];

final analysisResult = await OZSDK.analyze(analysis, [], {});
```

{% endtab %}
{% endtabs %}

[^1]: Version number
