Background
Oh how great it is to be able to reset bank details from the comfort of home and avoid all the hassle of going to the bank, queuing up, and proving you are who you say you are.
All of this has become possible thanks to a piece of tech magic known as face verification, which is perfect for verifying a user's identity remotely. I'd been curious about how the tech works, so I decided to integrate the face verification service from HMS Core ML Kit into a demo app. Below is how I did it.
Development Process
Preparations
1) Make the necessary configurations as detailed here.
2) Configure the Maven repository address for the face verification service.
i. Open the project-level build.gradle file of the Android Studio project.
ii. Add the Maven repository address and AppGallery Connect plugin.
Go to allprojects > repositories and configure the Maven repository address for the face verification service.
allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
Go to buildscript > repositories to configure the Maven repository address.
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
Go to buildscript > dependencies to add the plugin configuration.
buildscript {
    dependencies {
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}
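The app-level build.gradle then needs the AppGallery Connect plugin applied and the face verification SDK added as a dependency. A minimal sketch, assuming the artifact coordinates below (double-check the exact name and version in the official ML Kit documentation):
apply plugin: 'com.huawei.agconnect'

dependencies {
    // Face verification SDK. The artifact name and version here are assumptions;
    // verify them against the official documentation before use.
    implementation 'com.huawei.hms:ml-computer-vision-faceverify:2.2.0.300'
}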
Function Building
1) Create an instance of the face verification analyzer.
MLFaceVerificationAnalyzer analyzer = MLFaceVerificationAnalyzerFactory.getInstance().getFaceVerificationAnalyzer();
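For reference, the Java snippets that follow assume imports along these lines (the package paths are per my reading of the HMS ML Kit SDK; treat them as assumptions if your SDK version differs):
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.SparseArray;
import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hmf.tasks.OnSuccessListener;
import com.huawei.hmf.tasks.Task;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.faceverify.MLFaceTemplateResult;
import com.huawei.hms.mlsdk.faceverify.MLFaceVerificationAnalyzer;
import com.huawei.hms.mlsdk.faceverify.MLFaceVerificationAnalyzerFactory;
import com.huawei.hms.mlsdk.faceverify.MLFaceVerificationResult;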
2) Create an MLFrame object via android.graphics.Bitmap. This object is used to set the face verification template image; supported formats are JPG, JPEG, PNG, and BMP.
// Create an MLFrame object.
MLFrame templateFrame = MLFrame.fromBitmap(bitmap);
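The Bitmap itself can come from anywhere Android can decode an image, for instance a file on local storage. A minimal sketch (templateImagePath is just a placeholder, not part of the SDK):
// Decode the template image from a local file (templateImagePath is a placeholder).
Bitmap bitmap = BitmapFactory.decodeFile(templateImagePath);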
3) Set the template image. If the template image does not contain a face, the setting fails and the face verification service keeps using the previously set template.
List<MLFaceTemplateResult> results = analyzer.setTemplateFace(templateFrame);
for (int i = 0; i < results.size(); i++) {
    MLFaceTemplateResult result = results.get(i);
    // Process the result of face detection in the template.
}
4) Use android.graphics.Bitmap to create an MLFrame object that is used to set the image for comparison. Supported formats are JPG, JPEG, PNG, and BMP.
// Create an MLFrame object.
MLFrame compareFrame = MLFrame.fromBitmap(bitmap);
5) Perform face verification by calling the asynchronous or synchronous method. The returned verification result (MLFaceVerificationResult) contains the facial information obtained from the comparison image and a confidence value indicating how likely it is that the faces in the comparison image and the template image belong to the same person.
Asynchronous method:
Task<List<MLFaceVerificationResult>> task = analyzer.asyncAnalyseFrame(compareFrame);
task.addOnSuccessListener(new OnSuccessListener<List<MLFaceVerificationResult>>() {
    @Override
    public void onSuccess(List<MLFaceVerificationResult> results) {
        // Callback when the verification is successful.
    }
}).addOnFailureListener(new OnFailureListener() {
    @Override
    public void onFailure(Exception e) {
        // Callback when the verification fails.
    }
});
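To give a sense of what the success callback might do, here is a sketch that compares each result's similarity score against a hand-picked threshold. getSimilarity() is per my reading of the SDK, and 0.7f is an arbitrary cutoff rather than an official recommendation:
for (MLFaceVerificationResult result : results) {
    // Confidence that the compared face and the template face belong to the same person.
    float similarity = result.getSimilarity();
    if (similarity >= 0.7f) { // arbitrary threshold; tune it for your scenario
        // Treat the faces as the same person.
    } else {
        // Treat the faces as different people.
    }
}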
Synchronous method:
SparseArray<MLFaceVerificationResult> results = analyzer.analyseFrame(compareFrame);
for (int i = 0; i < results.size(); i++) {
    MLFaceVerificationResult result = results.valueAt(i);
    // Process the verification result.
}
6) Stop the analyzer and release the resources it occupies once verification is complete.
if (analyzer != null) {
    analyzer.stop();
}
And that's how the face verification function is built. This kind of tech not only saves users hassle, but is also great for honing my development skills.