Capture the bills using Huawei ML Kit in Money Management Android app (Kotlin) – Part 4


Introduction

In this article, we will learn how to capture bills as text images using this Money Management app. The app improves the quality and visibility of the captured image by zooming, so whenever users purchase something or spend money, they can capture the bill with this application and save it to device storage.

This is part of a series of articles on the Money Management app; in upcoming articles, I will integrate other Huawei kits.

If you are new to this application, follow my previous articles.

Beginner: Find the introduction Sliders and Huawei Account Kit Integration in Money Management Android app (Kotlin) - Part 1

Beginner: Integration of Huawei Ads Kit and Analytics Kit in Money Management Android app (Kotlin) – Part 2

Beginner: Manage the Budget using Room Database in Money Management Android app (Kotlin) – Part 3

ML Kit - Text Image Super-Resolution

Text Image Super-Resolution is a feature of Huawei ML Kit that improves the quality and visibility of old or blurred text in an image. When you photograph a document from far away or cannot adjust the focus properly, the text may not be clear. In this situation, the feature can zoom in on an image that contains text up to three times, significantly improving the definition of the text.
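
At its core, the SDK usage is a simple create-analyze-stop flow: obtain the analyzer from the factory, wrap the bitmap in an MLFrame, analyze it asynchronously, and stop the analyzer when finished. Below is a minimal sketch of that flow; the helper function name and callback are illustrative, and the full integration appears in CaptureActivity later in this article.

import android.graphics.Bitmap
import com.huawei.hms.mlsdk.common.MLFrame
import com.huawei.hms.mlsdk.textimagesuperresolution.MLTextImageSuperResolutionAnalyzerFactory

// Illustrative helper: runs text image super-resolution on a bitmap and
// delivers the enhanced bitmap through the onResult callback.
fun enhanceTextImage(srcBitmap: Bitmap, onResult: (Bitmap) -> Unit) {
    // Obtain the text image super-resolution analyzer from the factory.
    val analyzer = MLTextImageSuperResolutionAnalyzerFactory.getInstance()
        .textImageSuperResolutionAnalyzer
    // Wrap the source bitmap in an MLFrame and analyze it asynchronously.
    val frame = MLFrame.Creator().setBitmap(srcBitmap).create()
    analyzer.asyncAnalyseFrame(frame)
        .addOnSuccessListener { result ->
            // result.bitmap holds the super-resolved text image.
            onResult(result.bitmap)
            analyzer.stop()
        }
        .addOnFailureListener { e ->
            e.printStackTrace()
            analyzer.stop()
        }
}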

Requirements

  1. Any operating system (MacOS, Linux and Windows).
  2. Must have a Huawei phone with HMS 4.0.0.300 or later.
  3. Must have a laptop or desktop with Android Studio, JDK 1.8, SDK platform 26, and Gradle 4.6 or later installed.
  4. Minimum API level 19 is required.
  5. Devices running EMUI 9.0.0 or later are required.

How to integrate HMS Dependencies

  • First register as a Huawei developer and complete identity verification on the Huawei Developers website. Refer to Register a Huawei ID.

  • Create a project in Android Studio. Refer to Creating an Android Studio Project.

  • Generate a SHA-256 certificate fingerprint.

  • To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android Studio project, choose Project Name > Tasks > android, and then click signingReport.


Note: Project Name depends on the name given when the project was created.

  • Create an App in AppGallery Connect.

  • Download the agconnect-services.json file from App information, then copy and paste it into the app directory of your Android project.

  • Enter the SHA-256 certificate fingerprint and click the Save button.

  • Click the Manage APIs tab and enable ML Kit.

  • Add the below Maven URL under the repositories of buildscript and allprojects, and the classpath under buildscript dependencies, in the build.gradle (Project) file. Refer to Add Configuration.

maven { url 'http://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'

  • Add the below plugin and dependencies in the build.gradle (Module) file.
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.5.0.300'
// Import the text image super-resolution base SDK.
implementation 'com.huawei.hms:ml-computer-vision-textimagesuperresolution:2.0.4.300'
// Import the text image super-resolution model package.
implementation 'com.huawei.hms:ml-computer-vision-textimagesuperresolution-model:2.0.4.300'
  • Now sync the Gradle files.
  • Add the required permissions to the AndroidManifest.xml file.
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
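
Note: CAMERA and READ_EXTERNAL_STORAGE are dangerous permissions, so on Android 6.0 (API level 23) and later the app must also request them at runtime before opening the camera or reading images from storage. The sketch below assumes the AndroidX core libraries; the request-code constant and helper function are illustrative and not part of the original project.

import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Illustrative request code used only to identify this permission request.
private const val PERMISSION_REQUEST_CODE = 100

// Requests the camera and storage permissions that have not been granted yet.
fun checkAndRequestPermissions(activity: Activity) {
    val permissions = arrayOf(
        Manifest.permission.CAMERA,
        Manifest.permission.READ_EXTERNAL_STORAGE
    )
    val notGranted = permissions.filter {
        ContextCompat.checkSelfPermission(activity, it) != PackageManager.PERMISSION_GRANTED
    }
    if (notGranted.isNotEmpty()) {
        ActivityCompat.requestPermissions(activity, notGranted.toTypedArray(), PERMISSION_REQUEST_CODE)
    }
}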

Let us move to development

I have created a project in Android Studio with an empty activity. Let us start coding.
In CaptureActivity.kt, we can find the business logic.

import android.graphics.Bitmap
import android.graphics.BitmapFactory
import android.os.Bundle
import android.view.View
import android.widget.ImageView
import android.widget.Toast
import androidx.appcompat.app.AppCompatActivity
import com.huawei.hms.mlsdk.common.MLException
import com.huawei.hms.mlsdk.common.MLFrame
import com.huawei.hms.mlsdk.textimagesuperresolution.MLTextImageSuperResolutionAnalyzer
import com.huawei.hms.mlsdk.textimagesuperresolution.MLTextImageSuperResolutionAnalyzerFactory

class CaptureActivity : AppCompatActivity(), View.OnClickListener {

    private var analyzer: MLTextImageSuperResolutionAnalyzer? = null
    private val QUALITY = 1
    private val ORIGINAL = 2
    private var imageView: ImageView? = null
    private var srcBitmap: Bitmap? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_capture)

        imageView = findViewById(R.id.bill)
        srcBitmap = BitmapFactory.decodeResource(resources, R.drawable.bill_1)
        findViewById<View>(R.id.btn_quality).setOnClickListener(this)
        findViewById<View>(R.id.btn_original).setOnClickListener(this)
        createAnalyzer()

    }

    // Find the on click listeners
    override fun onClick(v: View?) {
        if (v!!.id == R.id.btn_quality) {
            detectImage(QUALITY)
        } else if (v.id == R.id.btn_original) {
            detectImage(ORIGINAL)
        }
    }

    private fun release() {
        if (analyzer == null) {
            return
        }
        analyzer!!.stop()
    }

    // Find the method to detect bills or text images
    private fun detectImage(type: Int) {
        if (type == ORIGINAL) {
            setImage(srcBitmap!!)
            return
        }
        if (analyzer == null) {
            return
        }
        // Create an MLFrame by using the bitmap.
        val frame = MLFrame.Creator().setBitmap(srcBitmap).create()
        val task = analyzer!!.asyncAnalyseFrame(frame)
        task.addOnSuccessListener { result -> // success.
            Toast.makeText(applicationContext, "Success", Toast.LENGTH_LONG).show()
            setImage(result.bitmap)
        }.addOnFailureListener { e ->
            // Failure
            if (e is MLException) {
                val mlException = e
                // Get the error code, developers can give different page prompts according to the error code.
                val errorCode = mlException.errCode
                // Get the error message, developers can combine the error code to quickly locate the problem.
                val errorMessage = mlException.message
                Toast.makeText(applicationContext,"Error:$errorCode Message:$errorMessage", Toast.LENGTH_LONG).show()
                // Log.e(TAG, "Error:$errorCode Message:$errorMessage")
            } else {
                // Other exception
                Toast.makeText(applicationContext, "Failed:" + e.message, Toast.LENGTH_LONG).show()
                // Log.e(TAG, e.message!!)
            }
        }
    }

    // Display the resulting bitmap on the UI thread.
    private fun setImage(bitmap: Bitmap) {
        runOnUiThread {
            imageView?.setImageBitmap(bitmap)
        }
    }

    private fun createAnalyzer() {
        analyzer = MLTextImageSuperResolutionAnalyzerFactory.getInstance().textImageSuperResolutionAnalyzer
    }

    override fun onDestroy() {
        super.onDestroy()
        if (srcBitmap != null) {
            srcBitmap!!.recycle()
        }
        release()
    }

}

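CaptureActivity also needs to be declared in the AndroidManifest.xml file like any other activity. To open the capture screen from elsewhere in the app, for example from a button on the home screen, a plain Intent is enough. The helper function below is only an illustrative sketch.

import android.app.Activity
import android.content.Intent

// Illustrative helper: opens the bill-capture screen from any other activity.
fun openCaptureScreen(activity: Activity) {
    activity.startActivity(Intent(activity, CaptureActivity::class.java))
}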

In the activity_capture.xml we can create the UI screen.

<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".mlkit.CaptureActivity">


    <LinearLayout
        android:id="@+id/buttons"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:orientation="vertical"
        tools:ignore="MissingConstraints">
        <Button
            android:id="@+id/btn_quality"
            android:layout_width="match_parent"
            android:layout_height="50dp"
            android:layout_margin="15dp"
            android:gravity="center"
            android:textSize="19sp"
            android:text="Quality"
            android:textAllCaps="false"
            android:textColor="@color/Red"
            tools:ignore="HardcodedText" />
        <Button
            android:id="@+id/btn_original"
            android:layout_width="match_parent"
            android:layout_height="50dp"
            android:layout_margin="15dp"
            android:gravity="center"
            android:text="Original"
            android:textSize="19sp"
            android:textAllCaps="false"
            android:textColor="@color/Red"
            tools:ignore="HardcodedText" />
    </LinearLayout>

    <ScrollView
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:layout_above="@+id/buttons"
        android:layout_marginBottom="15dp">
        <ImageView
            android:id="@+id/bill"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:layout_centerInParent="true"
            android:layout_gravity="center"
            tools:ignore="ObsoleteLayoutParam" />
    </ScrollView>

</RelativeLayout>

Demo


Tips and Tricks

  1. Make sure you are already registered as a Huawei developer.
  2. Set the minSDK version to 19 or later, otherwise you will get an AndroidManifest merge issue.
  3. Make sure you have added the agconnect-services.json file to the app folder.
  4. Make sure you have added the SHA-256 fingerprint without fail.
  5. Make sure all the dependencies are added properly.

Conclusion

In this article, we have learned about the Text Image Super-Resolution feature of Huawei ML Kit and its functionality. It improves the quality and visibility of old or blurred text in an image by zooming in on an image that contains text up to three times, significantly improving the definition of the text.

Reference

ML Kit – Documentation

ML Kit – Training Video
