How to Install and Use Oz Flutter Plugin


Last updated 3 months ago

The Oz Flutter plugin repository is available at https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin.

Installation and Licensing

Add the dependency below to the pubspec.yaml of the project you want to add the plugin to, then run flutter pub get:

dependencies:
  ozsdk:
    git:
      url: https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin.git
      ref: '8.8.2'

Add the license file (e.g., license.json or forensics.license) to the Flutter application's assets folder. In pubspec.yaml, declare the asset:

assets:
  - assets/license.json # the license file name must match the one placed in assets
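Putting the two fragments together, the relevant part of pubspec.yaml might look like this (a sketch assuming the standard pubspec layout, where assets are declared under the top-level flutter: key):

```yaml
dependencies:
  flutter:
    sdk: flutter
  # Oz Flutter plugin, pinned to a tagged release
  ozsdk:
    git:
      url: https://gitlab.com/oz-forensics/oz-mobile-flutter-plugin.git
      ref: '8.8.2'

flutter:
  assets:
    # the file name must match the license file placed in assets/
    - assets/license.json
```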

For Android, add the Oz repository to /android/build.gradle, allprojects → repositories section:

allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://ozforensics.jfrog.io/artifactory/main' } // repository URL
    }
}

For Oz Flutter plugin 8.24.0 and above, or Android Gradle plugin 8.0.0 and above, add the following line to android/gradle.properties:

android.nonTransitiveRClass=false

In android/app/build.gradle, make sure the minimum SDK version is 21 or higher:

defaultConfig {
  ...
  minSdkVersion 21
  ...
}

For iOS, set the minimum platform to 13 or higher in the Runner → Info → Deployment target → iOS Deployment Target.

In ios/Podfile, comment out the use_frameworks! line (# use_frameworks!).
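For reference, a minimal sketch of what the relevant part of ios/Podfile could look like after both changes (the Runner target name and the use_modular_headers! line come from the default Flutter template; the rest of your generated Podfile stays as is):

```ruby
# ios/Podfile (fragment)

# minimum deployment target required by the SDK
platform :ios, '13.0'

target 'Runner' do
  # use_frameworks!   # commented out, as required by the plugin
  use_modular_headers!

  # ... the rest of the generated Flutter Podfile ...
end
```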

Getting Started with Flutter

Initializing SDK

Initialize SDK by calling the init plugin method. Note that the license file name and path should match the ones specified in pubspec.yaml (e.g., assets/license.json).

await OZSDK.initSDK([<% license path and license file name %>]);

Connecting SDK to API

Use the API credentials (login, password, and API URL) that you’ve received from us.

await OZSDK.setApiConnectionWithCredentials(<login>, <password>, <host>);

In production, instead of hard-coding the login and password in the application, we recommend obtaining the access token on your backend via the API auth method and passing it to your application:

await OZSDK.setApiConnectionWithToken(<token>, <host>);

To set up the connection for telemetry (events), use:

await OZSDK.setEventConnectionWithCredentials(<login>, <password>, <host>);

or

await OZSDK.setEventConnectionWithToken(<token>, <host>);

Capturing Videos

To start recording, call the startLiveness method:

await OZSDK.startLiveness(<actions>, <use_main_camera>);

Parameters:

  • actions (List<VerificationAction>): actions for the video being captured
  • use_main_camera (Boolean): if true, uses the main camera; otherwise, the front one

Please note: in versions 8.11 and below, the method is named executeLiveness, and it returns the recorded media directly.

To obtain the media result, subscribe to livenessResult as shown below:

class Screen extends StatefulWidget {
  static const route = 'liveness';

  const Screen({super.key});

  @override
  State<Screen> createState() => _ScreenState();
}

class _ScreenState extends State<Screen> {
  late StreamSubscription<List<Media>> _subscription;

  @override
  void initState() {
    super.initState();

    // subscribe to liveness result
    _subscription = OZSDK.livenessResult.listen(
      (List<Media> medias) {
          // media contains liveness media
      },
      onError: (Object error) {
        // handle error, in most cases PlatformException
      },
    );
  }

  @override
  Widget build(BuildContext context) {
    // omitted to shorten the example
  }

  void _startLiveness() async {
    // startLiveness opens the liveness screen; the result arrives via livenessResult
    OZSDK.startLiveness(<list of actions>, <use_main_camera>);
  }

  @override
  void dispose() {
    // cancel subscription
    _subscription.cancel();
    super.dispose();
  }
}

Checking Liveness and Face Biometry

To run the analyses, execute the code below.

Create the Analysis object:

List<Analysis> analysis = [ Analysis(Type.quality, Mode.serverBased, <media>, {}), ];

Execute the formed analysis:

final analysisResult = await OZSDK.analyze(analysis, [], {}) ?? [];

The analysisResult list of objects contains the result of the analysis.

If you want to use media captured by another SDK, the code should look like this:

final media = Media(FileType.documentPhoto, VerificationAction.oneShot, "photo_selfie", null, <path to image>, null, null, "");

The whole code block will look like this:

// replace VerificationAction.blank with your Liveness gesture if needed
final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);

final analysis = [
  Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
];

final analysisResult = await OZSDK.analyze(analysis, [], {});

To also run face matching against a reference photo, add the reference image to the media list and include the biometry analysis alongside quality:

// replace VerificationAction.blank with your Liveness gesture if needed
final cameraMedia = await OZSDK.executeLiveness([VerificationAction.blank], use_main_camera);
final biometryMedia = [...cameraMedia];
biometryMedia.add(
  Media(
    FileType.documentPhoto,
    VerificationAction.blank,
    MediaType.movement,
    null,
    <your reference image path>,
    null,
    null,
    MediaTag.photoSelfie,
  ),
);

final analysis = [
  Analysis(Type.quality, Mode.serverBased, cameraMedia, {}),
  Analysis(Type.biometry, Mode.serverBased, biometryMedia, {}),
];

final analysisResult = await OZSDK.analyze(analysis, [], {});

By default, telemetry logs are saved along with the analyses' data. If you need to keep the logs separate from the analysis data, set up a dedicated connection for telemetry using the setEventConnectionWithCredentials or setEventConnectionWithToken method described in the Connecting SDK to API section.