This callback is called after the check is completed. It retrieves the analysis result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode configuration parameter.
When result_mode is safe, the on_complete callback contains the state of the analysis only:
Please note: the options listed below are for testing purposes only. If you require more information than what is available in the safe mode, please follow .
For the status value, the callback contains the state of the analysis and, for each of the analysis types, the name of the type, its state, and its resolution.
The folder value is similar to the status value; the only difference is that folder_id is added.
In this case, you receive a detailed response that may contain sensitive data. This mode is deprecated; for security reasons, we recommend using the safe mode.
{
  "state": "finished"
}

{
  "state": "finished",
  "analyses": {
    "quality": {
      "state": "finished",
      "resolution": "success"
    }
  }
}

{
  "state": "finished",
  "folder_id": "your_folder_id",
  "analyses": {
    "quality": {
      "state": "finished",
      "resolution": "success"
    }
  }
}

This callback is called when the system encounters any error. It contains the error details and a telemetry ID that you can use for further investigation.

on_error

{
  "code": "error_code",
  "event_session_id": "id_of_telemetry_session_with_error",
  "message": "<error description>",
  "context": {} // additional information, if any
}
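The payloads above can be handled in small helpers. The function names below are hypothetical (not part of the SDK) and only illustrate working with the safe-mode on_complete result and the on_error object:

```javascript
// Hypothetical helpers (not SDK code) for the callback payloads above.

// Safe mode returns only the overall state of the analysis.
function interpretSafeResult(result) {
  if (result.state === 'finished') return 'done';
  if (result.state === 'processing') return 'pending';
  return 'unknown';
}

// Keep the telemetry session ID: it is what support needs for investigation.
function describeError(err) {
  return `[${err.code}] ${err.message} (telemetry: ${err.event_session_id})`;
}
```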
Add the plugin script (plugin_liveness.php) that you received from Oz Forensics to the HTML code of the page, where web-sdk-root-url is the Web Adapter link you've received from us:

<script src="https://web-sdk-root-url/plugin_liveness.php"></script>

Even though the analysis result is available to the host application via Web Plugin callbacks, it is recommended that the application back end receive it directly from Oz API. All decisions about the further process flow should be made on the back end as well. This eliminates any possibility of malicious manipulation of analysis results within the browser context.
To find your folder from the back end, you can follow these steps:
The add_lang(lang_id, lang_obj) method allows adding a new or customized language pack.
Parameters:
lang_id: a string value that can subsequently be used as the lang parameter for the open() method;
To generate the license, we need the domain name of the website where you are going to use Oz Forensics Web SDK, for instance, your-website.com. You can also define subdomains.
On the front end, add your unique identifier to the folder metadata.
You can add your own key-value pairs to attach user document numbers, phone numbers, or any other textual information. However, ensure that tracking personally identifiable information (PII) complies with relevant regulatory requirements.
Use the on_complete callback of the plugin to be notified when the analysis is done. Once it fires, call your back end and pass the transaction_id value.
On the back end side, find the folder by the identifier you've specified using the Oz API Folder LIST method:
To speed up the processing of your request, we recommend adding the time filter as well:
In the response, find the analysis results and folder_id for future reference.
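The lookup above can be sketched as a small URL builder. Here, apiHost and the one-hour window are assumptions for illustration; note also that URLSearchParams percent-encodes the == in the meta_data filter, which the API server would need to accept:

```javascript
// Sketch: build the Oz API Folder LIST query for a given transaction_id,
// restricted to folders created within the last hour.
function buildFolderListUrl(apiHost, transactionId) {
  const oneHourAgo = new Date(Date.now() - 60 * 60 * 1000).toISOString();
  const params = new URLSearchParams({
    meta_data: `transaction_id==${transactionId}`,
    with_analyses: 'true',
    'time_created.min': oneHourAgo, // the time filter speeds up the request
  });
  return `${apiHost}/api/folders/?${params.toString()}`;
}
```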
Web Adapter may send analysis results to the Web Plugin with various levels of verbosity. In production, it is recommended to set the verbosity to the minimum.
In the Web Adapter configuration file, set the result_mode parameter to "safe".
lang_obj: an object that includes identifiers of translation strings as keys and the translation strings themselves as values.

A list of language identifiers:

en – English
es – Spanish
pt-br* – Portuguese (Brazilian)
kz – Kazakh

*Formerly pt; changed in 1.3.1.
An example of usage:
OzLiveness.add_lang('en', enTranslation), where enTranslation is a JSON object.
To set the SDK language, when you launch the plugin, specify the language identifier in lang:
You can check which locales are installed in Web SDK using the OzLiveness.get_langs() method. If you have added a locale manually, it will also be shown.
A list of all language identifiers:
The keys oz_action_*_go refer to the appropriate gestures, and the keys oz_tutorial_camera_* to the hints on how to enable the camera in different browsers. The other keys refer to the hints for any gesture, info messages, or errors.
Since 1.5.0, if your language pack doesn't include a key, the message for this key will be shown in English.
Set the license as shown below:
With license data:
With license path:
Check whether the license is updated properly.
Example
Proceed to your website origin and launch Liveness -> Simple selfie.
Once the license is added, the system will check its validity on launch.
OzLiveness.open({
  license: {
    'payload_b64': 'some_payload',
    'signature': 'some_data',
    'enc_public_key': 'some_key'
  },
  ...,
})

OzLiveness.open({
  licenseUrl: 'https://some_url',
  ...,
})

/api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true

/api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true&time_created.min=([CURRENT_TIME]-1hour)

OzLiveness.open({
  ...
  meta: {
    // the user or lead ID from an external lead generator
    // that you can pass to keep track of multiple attempts made by the same user
    'end_user_id': '<user_or_lead_id>',
    // the unique attempt ID
    'transaction_id': '<unique_transaction_id>'
  }
});

/api/folders/?meta_data=transaction_id==unique_id1&with_analyses=true

"result_mode": "safe"

// Editing the button text
OzLiveness.add_lang('en', {
  action_photo_button: 'Take a photo'
});

OzLiveness.open({
  lang: 'es', // the identifier of the needed language
  ...
});

Web Plugin is a plugin called by your web application. It works in a browser context. The Web Plugin communicates with the Web Adapter, which, in turn, communicates with Oz API.
Please find a sample for Oz Liveness Web SDK here. To make it work, replace <web-adapter-url> with the Web Adapter URL you've received from us.
For the samples below, replace https://web-sdk.sandbox.ohio.ozforensics.com in index.html.
Angular sample
Please note: for the plugin to work, your browser must support JavaScript ES6 and be one of the following versions or newer:

Google Chrome (and other browsers based on the Chromium engine) – 56
Mozilla Firefox – 55
Safari – 11
Microsoft Edge* – 17
Opera – 47

*Web SDK doesn't work in Internet Explorer compatibility mode due to the lack of important functions.
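A rough pre-flight check can complement the version table above. This is an approximation only (it probes ES6 and camera-access support, which the plugin relies on); the table remains the authoritative list:

```javascript
// Rough capability probe; the browser-version table is authoritative.
function browserLooksSupported(nav) {
  const hasEs6 = typeof Symbol !== 'undefined';
  const hasCamera = !!(
    nav &&
    nav.mediaDevices &&
    typeof nav.mediaDevices.getUserMedia === 'function'
  );
  return hasEs6 && hasCamera;
}

// In the browser, call: browserLooksSupported(navigator)
```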
To force the closing of the plugin window, use the close() method. All requests to the server and all callback functions (except on_close) within the current session will be aborted.
Example:
var session_id = 123;
OzLiveness.open({
  // Pass arbitrary metadata by which the session can later be identified in Oz API
  meta: {
    session_id: session_id
  },
  // After the data is submitted, forcibly close the plugin window and request the result independently
  on_submit: function() {
    OzLiveness.close();
    my_result_function(session_id);
  }
});

To hide the plugin window without cancelling the requests for analysis results and user callback functions, call the hide() method. Use this method, for instance, if you want to display your own upload indicator after submitting data.
An example of usage:
OzLiveness.open({
  // When receiving an intermediate result, hide the plugin window and show your own loading indicators
  on_result: function(result) {
    OzLiveness.hide();
    if (result.state === 'processing') {
      show_my_loader();
    }
  },
  on_complete: function() {
    hide_my_loader();
  }
});

This callback is called periodically during the processing of the analysis. It retrieves an intermediate result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode configuration parameter.
When result_mode is safe, the on_result callback contains the state of the analysis only:
or
Please note: the options listed below are for testing purposes only. If you require more information than what is available in the safe mode, please follow .
For the status value, the callback contains the state of the analysis and, for each of the analysis types, the name of the type, its state, and its resolution.
or
The folder value is similar to the status value; the only difference is that folder_id is added.
In this case, you receive a detailed response that may contain sensitive data. This mode is deprecated; for security reasons, we recommend using the safe mode.
{
  "state": "processing"
}

{
  "state": "finished"
}

{
  "state": "processing",
  "analyses": {
    "quality": {
      "state": "processing",
      "resolution": ""
    }
  }
}

{
  "state": "finished",
  "analyses": {
    "quality": {
      "state": "finished",
      "resolution": "success"
    }
  }
}

{
  "state": "processing",
  "folder_id": "your_folder_id",
  "analyses": {
    "quality": {
      "state": "processing",
      "resolution": ""
    }
  }
}

To set your own look-and-feel options, use the style section in the OzLiveness.open method. Here is what you can change:
faceFrame – the color of the frame around a face:
faceReady – the frame color when the face is correctly placed within the frame;
faceNotReady – the frame color when the face is placed improperly and can't be analyzed.
centerHint – the text of the hint that is displayed in the center.
textSize – the size of the text;
color – the color of the text;
closeButton – the button that closes the plugin:
image – the button image, can be an image in PNG or dataURL in base64.
backgroundOutsideFrame – the color of the overlay filling (outside the frame):
color – the fill color.
Example:
In this article, you’ll learn how to capture videos and send them through your backend to Oz API.
Here is the data flow for your scenario:
1. Oz Web SDK captures a video and makes it available to the host application as a frame sequence.
2. The host application calls your backend, passing an archive of these frames.
3. After the necessary preprocessing steps, your backend calls Oz API, which performs all the necessary analyses and returns their results.
yPosition – the vertical position measured from top;
letterSpacing – the spacing between letters;
fontStyle – the style of font (bold, italic, etc.).
OzLiveness.open({
// ...
style: {
// the backward compatibility block
doc_color: "",
face_color_success: "",
face_color_fail: "",
// the current customization block
faceFrame: {
faceReady: "",
faceNotReady: "",
},
centerHint: {
textSize: "",
color: "",
yPosition: "",
letterSpacing: "",
fontStyle: "",
},
closeButton: {
image: "",
},
backgroundOutsideFrame: {
color: "",
},
},
// ...
});

On the server side, Web SDK must be configured to operate in the Capture mode:
The architecture parameter must be set to capture in the app_config.json file.
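As a minimal fragment (assuming app_config.json otherwise keeps its existing contents), the setting looks like:

```
{
  "architecture": "capture"
}
```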
In your Web app, add a callback to process captured media when opening the Web SDK plugin:
The result object structure depends on whether any virtual camera is detected or not.
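Because the structure differs depending on detection, a small guard (a hypothetical helper, not SDK code) can normalize access to the captured media:

```javascript
// Sketch: return the captured media regardless of virtual-camera detection.
// When a virtual camera is detected, the media are nested under
// from_virtual_camera; otherwise they sit at the top level of the result.
function capturedMedia(result) {
  return result.from_virtual_camera !== null && result.from_virtual_camera !== undefined
    ? result.from_virtual_camera
    : result;
}
```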
Here’s the list of variables with descriptions.
best_frame (String) – The best frame: JPEG in the data URL format.
best_frame_png (String) – The best frame: PNG in the data URL format; required for protection against virtual cameras when video is not used.
best_frame_bounding_box (Array[Named_parameter: Int]) – The coordinates of the bounding box where the face is located in the best frame.
best_frame_landmarks
The video from Oz Web SDK is a frame sequence, so, to send it to Oz API, you’ll need to archive the frames and transmit them as a ZIP file via the POST /api/folders request (check our Postman collections).
You can retrieve the MP4 video from a folder using the /api/folders/{{folder_id}} request with this folder's ID. In the JSON that you receive, look for the preview_url in source_media. The preview_url parameter contains the link to the video. From the plugin, MP4 videos are unavailable (only as frame sequences).
Also, in the POST {{host}}/api/folders request, you need to add the additional_info field. It is required in the capture architecture mode to gather the necessary information about the client environment. Here is an example of filling in the request's body:
Oz API accepts data without the base64 encoding.
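Since Oz API expects raw binary rather than base64, each frame taken from the plugin (a data URL) has to be decoded before zipping and uploading. A sketch using the Node Buffer API (in a browser you would use atob() and Uint8Array instead):

```javascript
// Sketch: convert a frame from the plugin (a data URL) to raw binary
// before adding it to the ZIP sent to POST /api/folders.
function dataUrlToBinary(dataUrl) {
  // Strip the "data:image/jpeg;base64," prefix, keep only the payload
  const base64 = dataUrl.slice(dataUrl.indexOf(',') + 1);
  return Buffer.from(base64, 'base64');
}
```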
OzLiveness.open(options) – options is an object with the following settings:
token – (optional) the auth token;
license – an object containing the license data;
licenseUrl – a string containing the path to the license;
lang – a string containing the identifier of one of the installed language packs;
meta – an object with the names of meta fields as keys and their string values as values. This data is transferred to Oz API and can be used to obtain analysis results or for searching;
params – an object with identifiers and additional parameters:
extract_best_shot – true or false: run the best frame choice in the Quality analysis;
action – an array of strings with identifiers of actions to be performed.
Available actions:
photo_id_front – photo of the ID front side;
photo_id_back – photo of the ID back side;
overlay_options – the document's template displaying options:
show_document_pattern: true/false – true by default, displays a template image, if set to false, the image is replaced by a rectangular frame;
on_submit – a callback function (no arguments) that is called after the customer data is submitted to the server (unavailable for the capture mode).
on_capture_complete – a callback function (with one argument) that is called after the video is captured and retrieves the information on this video. An example of the response is described .
on_result – a callback function (with one argument) that is called periodically during the analysis and retrieves an intermediate result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode and is described .
on_complete – a callback function (with one argument) that is called after the check is completed and retrieves the analysis result (unavailable for the capture mode). The result content depends on the Web Adapter result_mode and is described .
on_error – a callback function (with one argument) that is called in case of any error during video capturing and retrieves the error information: an object with the error code, error message, and telemetry ID for logging.
on_close – a callback function (no arguments) that is called after the plugin window is closed (whether manually by the user or automatically after the check is completed).
style – .
device_id – (optional) identifier of camera that is being used.
enable_3d_mask – enables the 3D mask as the default face capture behavior. This parameter works only if load_3d_mask in the Web Adapter is set to true; the default value is false.
cameraFacingMode (since 1.4.0) – the parameter that defines which camera to use; possible values: user (front camera), environment (rear camera). This parameter only works if the use_for_liveness option in the file is undefined. If use_for_liveness is set (with any value), cameraFacingMode gets overridden and ignored.
disable_adaptive_aspect_ratio (since 1.5.0) – if true, disables the adaptive video aspect ratio, so your video doesn't automatically adjust to the window aspect ratio. The default value is false; by default, the video adjusts to the closest ratio of 4:3, 3:4, 16:9, or 9:16. Please note: smartphones still require the portrait orientation to work.
get_user_media_timeout (since 1.5.0) – when Web SDK can’t get access to the user camera, after this timeout it displays a hint on how to solve the problem. The default value is 40000 (ms).
If the getUserMedia() function hangs, you can manage the SDK behavior using the following parameters (since 1.7.15):
get_user_media_promise_timeout_ms – sets the timeout (in ms) after which the SDK throws an error or displays an instruction. This parameter is an object with the following keys: "platform_browser", "browser", "platform", "default" (the priority matches this sequence).
OZLiveness.open({
... // other parameters
on_capture_complete: function(result) {
// Your code to process media/send it to your API, this is STEP #2
}
})

{
"action": <action>,
"best_frame": <bestframe>,
"best_frame_png": <bestframe_png>,
"best_frame_bounding_box": {
"left": <bestframe_bb_left>,
"top": <bestframe_bb_top>,
"right": <bestframe_bb_right>,
"bottom": <bestframe_bb_bottom>
},
"best_frame_landmarks": {
"left_eye": [bestframe_x_left_eye, bestframe_y_left_eye],
"right_eye": [bestframe_x_right_eye, bestframe_y_right_eye],
"nose_base": [bestframe_x_nose_base, bestframe_y_nose_base],
"mouth_bottom": [bestframe_x_mouth_bottom, bestframe_y_mouth_bottom],
"left_ear": [bestframe_x_left_ear, bestframe_y_left_ear],
"right_ear": [bestframe_x_right_ear, bestframe_y_right_ear]
},
"frame_list": [<frame1>, <frame2>],
"frame_bounding_box_list": [
{
"left": <frame1_bb_left>,
"top": <frame1_bb_top>,
"right": <frame1_bb_right>,
"bottom": <frame1_bb_bottom>
},
{
"left": <frame2_bb_left>,
"top": <frame2_bb_top>,
"right": <frame2_bb_right>,
"bottom": <frame2_bb_bottom>
},
],
"frame_landmarks": [
{
"left_eye": [frame1_x_left_eye, frame1_y_left_eye],
"right_eye": [frame1_x_right_eye, frame1_y_right_eye],
"nose_base": [frame1_x_nose_base, frame1_y_nose_base],
"mouth_bottom": [frame1_x_mouth_bottom, frame1_y_mouth_bottom],
"left_ear": [frame1_x_left_ear, frame1_y_left_ear],
"right_ear": [frame1_x_right_ear, frame1_y_right_ear]
},
{
"left_eye": [frame2_x_left_eye, frame2_y_left_eye],
"right_eye": [frame2_x_right_eye, frame2_y_right_eye],
"nose_base": [frame2_x_nose_base, frame2_y_nose_base],
"mouth_bottom": [frame2_x_mouth_bottom, frame2_y_mouth_bottom],
"left_ear": [frame2_x_left_ear, frame2_y_left_ear],
"right_ear": [frame2_x_right_ear, frame2_y_right_ear]
}
],
"from_virtual_camera": null,
"additional_info": <additional_info>
}

{
"action": <action>,
"best_frame": null,
"best_frame_png": null,
"best_frame_bounding_box": null,
"best_frame_landmarks": null,
"frame_list": null,
"frame_bounding_box_list": null,
"frame_landmarks": null,
"from_virtual_camera": {
"additional_info": <additional_info>,
"best_frame": <bestframe>,
"best_frame_png": <best_frame_png>,
"best_frame_bounding_box": {
"left": <bestframe_bb_left>,
"top": <bestframe_bb_top>,
"right": <bestframe_bb_right>,
"bottom": <bestframe_bb_bottom>
},
"best_frame_landmarks": {
"left_eye": [bestframe_x_left_eye, bestframe_y_left_eye],
"right_eye": [bestframe_x_right_eye, bestframe_y_right_eye],
"nose_base": [bestframe_x_nose_base, bestframe_y_nose_base],
"mouth_bottom": [bestframe_x_mouth_bottom, bestframe_y_mouth_bottom],
"left_ear": [bestframe_x_left_ear, bestframe_y_left_ear],
"right_ear": [bestframe_x_right_ear, bestframe_y_right_ear]
},
"frame_list": [<frame1>, <frame2>],
"frame_bounding_box_list": [
{
"left": <frame1_bb_left>,
"top": <frame1_bb_top>,
"right": <frame1_bb_right>,
"bottom": <frame1_bb_bottom>
},
{
"left": <frame2_bb_left>,
"top": <frame2_bb_top>,
"right": <frame2_bb_right>,
"bottom": <frame2_bb_bottom>
},
],
"frame_landmarks": [
{
"left_eye": [frame1_x_left_eye, frame1_y_left_eye],
"right_eye": [frame1_x_right_eye, frame1_y_right_eye],
"nose_base": [frame1_x_nose_base, frame1_y_nose_base],
"mouth_bottom": [frame1_x_mouth_bottom, frame1_y_mouth_bottom],
"left_ear": [frame1_x_left_ear, frame1_y_left_ear],
"right_ear": [frame1_x_right_ear, frame1_y_right_ear]
},
{
"left_eye": [frame2_x_left_eye, frame2_y_left_eye],
"right_eye": [frame2_x_right_eye, frame2_y_right_eye],
"nose_base": [frame2_x_nose_base, frame2_y_nose_base],
"mouth_bottom": [frame2_x_mouth_bottom, frame2_y_mouth_bottom],
"left_ear": [frame2_x_left_ear, frame2_y_left_ear],
"right_ear": [frame2_x_right_ear, frame2_y_right_ear]
}
]
}
}

"VIDEO_FILE_KEY": VIDEO_FILE_ZIP_BINARY
"payload": "{
"media:meta_data": {
"VIDEO_FILE_KEY": {
"additional_info": <additional_info>
}
}
}"

OzLiveness.open({
lang: 'en',
action: [
'photo_id_front', // request photo ID picture
'video_selfie_blank' // request passive liveness video
],
meta: {
// an ID of user undergoing the check
// add for easier conversion calculation
'end_user_id': '<user_or_lead_id>',
// Your unique identifier that you can use later to find this folder in Oz API
// Optional, yet recommended
'transaction_id': '<your_transaction_id>',
// You can add iin if you plan to group transactions by the person identifier
'iin': '<your_client_iin>',
// Other meta data
'meta_key': 'meta_value',
},
on_error: function (result) {
// error details
console.error('on_error', result);
},
on_complete: function (result) {
// This callback is invoked when the analysis is complete
// It is recommended to proceed with the transaction on your backend,
// using transaction_id to find the folder in Oz API and get the results
console.log('on_complete', result);
},
on_capture_complete: function (result) {
// Handle captured data here if necessary
console.log('on_capture_complete', result);
}
});

video_selfie_left – turn head to the left;
video_selfie_right – turn head to the right;
video_selfie_down – tilt head downwards;
video_selfie_high – raise head up;
video_selfie_smile – smile;
video_selfie_eyes – blink;
video_selfie_scan – scanning;
video_selfie_blank – no action, simple selfie;
video_selfie_best – special action to select the best shot from a video and perform analysis on it instead of the full video.
get_user_media_promise_timeout_throw_error – defines whether, after the time period set in the parameter above, the SDK should throw an error (if true) or display a user instruction (if false).
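A sketch of how such a timeout object might be resolved. The values and the resolver function are hypothetical (not SDK code); they only illustrate the documented priority order platform_browser > browser > platform > default:

```javascript
// Hypothetical timeout configuration (values are illustrative only).
const GUM_TIMEOUT_MS = {
  platform_browser: 15000, // most specific match wins
  browser: 20000,
  platform: 25000,
  default: 30000           // fallback
};

// Hypothetical resolver mirroring the documented priority order.
function resolveGumTimeout(config, availableKeys) {
  const priority = ['platform_browser', 'browser', 'platform', 'default'];
  for (const key of priority) {
    if (availableKeys.includes(key) && config[key] !== undefined) {
      return config[key];
    }
  }
  return undefined;
}
```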
(Array[Named_parameter: Array[Int, Int]]) – The coordinates of the face landmarks (left eye, right eye, nose, mouth, left ear, right ear) in the best frame.
frame_list (Array[String]) – All frames in the data URL format.
frame_bounding_box_list (Array[Array[Named_parameter: Int]]) – The coordinates of the bounding boxes where the face is located in the corresponding frames.
frame_landmarks (Array[Named_parameter: Array[Int, Int]]) – The coordinates of the face landmarks (left eye, right eye, nose, mouth, left ear, right ear) in the corresponding frames.
action (String) – An action code.
additional_info (String) – Information about the client environment.
To set your own look-and-feel options, use the style section in the OzLiveness.open method. The options are listed below the example.
Main color settings.
Parameter
Main font settings.
Title font settings.
Buttons’ settings.
Toolbar settings.
Center hint settings.
Hint animation settings.
Face frame settings.
Document capture frame settings.
Background settings.
Scam protection settings: the antiscam message warns user about their actions being recorded.
SDK version text settings.
3D mask settings. The mask has been implemented in 1.2.1.
Settings for a custom loader (added in 1.7.15).
Loader transition settings (added in 1.8.0).
Table of parameters' correspondence:
OzLiveness.open({
style: {
baseColorCustomization: {
textColorPrimary: "#000000",
backgroundColorPrimary: "#FFFFFF",
textColorSecondary: "#8E8E93",
backgroundColorSecondary: "#F2F2F7",
iconColor: "#00A5BA"
},
baseFontCustomization: {
textFont: "Roboto, sans-serif",
textSize: "16px",
textWeight: "400",
textStyle: "normal"
},
titleFontCustomization: {
textFont: "inherit",
textSize: "36px",
textWeight: "500",
textStyle: "normal"
},
buttonCustomization: {
textFont: "inherit",
textSize: "14px",
textWeight: "500",
textStyle: "normal",
textColorPrimary: "#FFFFFF",
backgroundColorPrimary: "#00A5BA",
textColorSecondary: "#00A5BA",
backgroundColorSecondary: "#DBF2F5",
cornerRadius: "10px"
},
toolbarCustomization: {
closeButtonIcon: "cross",
iconColor: "#707070"
},
centerHintCustomization: {
textFont: "inherit",
textSize: "24px",
textWeight: "500",
textStyle: "normal",
textColor: "#FFFFFF",
backgroundColor: "#1C1C1E",
backgroundOpacity: "56%",
backgroundCornerRadius: "14px",
verticalPosition: "38%"
},
hintAnimation: {
hideAnimation: false,
hintGradientColor: "#00BCD5",
hintGradientOpacity: "100%",
animationIconSize: "80px"
},
faceFrameCustomization: {
geometryType: "oval",
cornersRadius: "0px",
strokeDefaultColor: "#D51900",
strokeFaceInFrameColor: "#00BCD5",
strokeOpacity: "100%",
strokeWidth: "6px",
strokePadding: "4px"
},
documentFrameCustomization: {
cornersRadius: "20px",
templateColor: "#FFFFFF",
templateOpacity: "100%"
},
backgroundCustomization: {
backgroundColor: "#FFFFFF",
backgroundOpacity: "88%"
},
antiscamCustomization: {
enableAntiscam: false,
textMessage: "",
textFont: "inherit",
textSize: "14px",
textWeight: "500",
textStyle: "normal",
textColor: "#000000",
textOpacity: "100%",
backgroundColor: "#F2F2F7",
backgroundOpacity: "100%",
backgroundCornerRadius: "20px",
flashColor: "#FF453A"
},
versionTextCustomization: {
textFont: "inherit",
textSize: "16px",
textWeight: "500",
textStyle: "normal",
textColor: "#000000",
textOpacity: "56%"
},
maskCustomization: {
maskColor: "#008700",
glowColor: "#000102",
minAlpha: "30%", // 0 to 1 or 0% to 100%
maxAlpha: "100%" // 0 to 1 or 0% to 100%
},
/* for an HTML string, use string; for HTMLElement, insert it via cloneNode(true) */
loaderSlot: yourLoader, /* <string | HTMLElement> */
loaderTransition: {type: 'fade', duration: 500}
}
});

Main background color
textColorSecondary
Secondary text color
backgroundColorSecondary
Secondary background color
cornerRadius
Button corner radius
Background color
backgroundOpacity
Background opacity
backgroundCornerRadius
Frame corner radius
verticalPosition
Vertical position
Stroke width
strokePadding
Padding from stroke
Text color
textOpacity
Text opacity
backgroundColor
Background color
backgroundOpacity
Background opacity
backgroundCornerRadius
Frame corner radius
flashColor
Flashing indicator color
Text opacity
{phase: 'start' | 'progress' | 'end', percent?}
before / during / after data transmission
loader:destroy
when you need to hide the slot
centerHintCustomization.verticalPosition
centerHint.letterSpacing
-
centerHint.fontStyle
centerHintCustomization.textStyle
closeButton.image
-
backgroundOutsideFrame.color
backgroundCustomization.backgroundColor
textColorPrimary – Main text color
backgroundColorPrimary – Main background color
textColorSecondary – Secondary text color
backgroundColorSecondary – Secondary background color
iconColor – Icons' color
textFont – Font
textSize – Font size
textWeight – Font weight
textStyle – Font style
textFont – Font
textSize – Font size
textWeight – Font weight
textStyle – Font style
textFont – Font
textSize – Font size
textWeight – Font weight
textStyle – Font style
textColorPrimary – Main text color
closeButtonIcon – Close button icon
iconColor – Close button icon color
textFont – Font
textSize – Font size
textWeight – Font weight
textStyle – Font style
textColor – Text color
hideAnimation – Disable animation
hintGradientColor – Gradient color
hintGradientOpacity – Gradient opacity
animationIconSize – Animation icon size
geometryType – Frame shape: rectangle or oval
cornersRadius – Frame corner radius (for rectangle)
strokeDefaultColor – Frame color when a face is not aligned properly
strokeFaceInFrameColor – Frame color when a face is aligned properly
strokeOpacity – Stroke opacity
cornersRadius – Frame corner radius
templateColor – Document template color
templateOpacity – Document template opacity
backgroundColor – Background color
backgroundOpacity – Background opacity
textMessage – Antiscam message text
textFont – Font
textSize – Font size
textWeight – Font weight
textStyle – Font style
textFont – Font
textSize – Font size
textWeight – Font weight
textStyle – Font style
textColor – Text color
maskColor – The color of the mask itself
glowColor – The color of the glowing mask shape
minAlpha – Minimum mask transparency level (implemented in 1.3.1)
maxAlpha – Maximum mask transparency level (implemented in 1.3.1)
loader:init – payload {os, browser, platform} – called immediately after the slot is inserted
loader:waitingCamera – payload {os, browser, platform, waitedMs} – called every waitedMs ms while waiting for camera access
loader:cameraReady – called when access is granted and the loader should be hidden
loader:processing – payload {phase: 'start' | 'end'} – called before/after data preparation
type – Animation type: none, fade, slide, scale
duration – Animation length in ms
easing (optional) – Easing: linear, ease-in-out, etc.
Previous design → New design
doc_color → -
face_color_success, faceFrame.faceReady → faceFrameCustomization.strokeFaceInFrameColor
face_color_fail, faceFrame.faceNotReady → faceFrameCustomization.strokeDefaultColor
centerHint.textSize → centerHintCustomization.textSize
centerHint.color → centerHintCustomization.textColor
backgroundColorPrimary
backgroundColor
strokeWidth
textColor
textOpacity
loader:uploading
centerHint.yPosition