Yuval

Originally published at thecodingnotebook.com

OpenCV in Android native using C++

TL;DR

Source code on GitHub

Get OpenCV

First we have to download the OpenCV SDK for Android and set up our environment.

  • Download OpenCV Android SDK from here (this tutorial was tested against OpenCV version 4.2.0).
  • Extract the zip file to some folder; I use c:\tools.
  • Define a global environment variable OPENCV_ANDROID pointing to the root folder of the OpenCV Android SDK (i.e. c:\tools\OpenCV-android-sdk). By "global environment variable" I mean one that is available to Android Studio.

Create a Native Android project

Let's open Android Studio and create a new project. From the project type templates select "Native C++":

[Screenshot: the new project wizard with the "Native C++" template selected]

Hit "Next" and then choose a name for you project, and on the next step just leave C++ standard on the "Toolchain Default":

[Screenshot: the project configuration step with the C++ standard left on "Toolchain Default"]

Click "Finish" and wait for the project to fully load (let Gradle finish).

Add some C++ source files

Looking at the project tree in the "Android" view you will notice a "cpp" folder; by default it contains 2 files:

  1. CMakeLists.txt - Build instructions for the native code (this file can also be found under "External Build Files")
  2. native-lib.cpp - Will contain the "bridge" (JNI) code between the managed (Kotlin/Java) and native (C++) environments.

While it is perfectly "legal" to write all our code inside native-lib.cpp, we will keep only the JNI-related methods in that file and write the "real" image processing code in separate files.

  • Right-click on the cpp folder and choose New -> C/C++ Header file, call it opencv-utils.h
  • Right-click on the cpp folder and choose New -> C/C++ Source file, call it opencv-utils.cpp

The project tree should look like:

[Screenshot: the project tree showing CMakeLists.txt, native-lib.cpp, opencv-utils.cpp and opencv-utils.h under the cpp folder]

Configure OpenCV build

Open CMakeLists.txt and add the following at the top, right after "cmake_minimum_required":

# opencv
set(OpenCV_STATIC ON)
set(OpenCV_DIR $ENV{OPENCV_ANDROID}/sdk/native/jni)
find_package(OpenCV REQUIRED)

Here we are, in effect, "importing" the OpenCV package and build definitions from the OpenCV Android SDK. Note the line set(OpenCV_DIR $ENV{OPENCV_ANDROID}/sdk/native/jni), where we use the environment variable we defined above that points to the OpenCV SDK, so double check that it points to the correct location.

Next we should add our source files so they get compiled. Scroll down a bit and search for native-lib.cpp; it is an argument in a call to add_library. Change the call as follows:

add_library( # Sets the name of the library.
        native-lib

        # Sets the library as a shared library.
        SHARED

        # Provides a relative path to your source file(s).
        opencv-utils.cpp
        native-lib.cpp)

Here we defined our native-lib library and the sources it is built from.

Next we will include a library that helps us with bitmap manipulation inside C++ (in order to convert an Android Bitmap to an OpenCV Mat). Scroll down a bit, and right before the call to target_link_libraries add this:

# jnigraphics lib from NDK is used for Bitmap manipulation in native code
find_library(jnigraphics-lib jnigraphics)

Finally we have to include the OpenCV and jnigraphics libs in the link process; change target_link_libraries to:

target_link_libraries( # Specifies the target library.
        native-lib

        ${OpenCV_LIBS}
        ${jnigraphics-lib}
        # Links the target library to the log library
        # included in the NDK.
        ${log-lib})

Sync & Build

That's it. If Android Studio offers to "sync" the project, do it; if it doesn't, initiate a sync from the menu: File -> Sync Project with Gradle Files.

Then build with Build -> Make Project. If the build is successful, great! If you get an error like Error computing CMake server result with no "real" error behind it, something is wrong with the project definition. What worked for me was to remove the CMake version that is defined in the app build.gradle:

externalNativeBuild {
    cmake {
        path "src/main/cpp/CMakeLists.txt"
        version "3.10.2" # <<-- REMOVE THIS LINE
    }
}

Let's use OpenCV

Flip & Blur

For demo purposes, our app will flip and blur the image using OpenCV. Add the following to opencv-utils.h:

#pragma once

#include <opencv2/core.hpp>

using namespace cv;

void myFlip(Mat src);
void myBlur(Mat src, float sigma);

IT IS OK if Android Studio marks the opencv2 include in red; it should be fine after building the project.

And the implementation inside opencv-utils.cpp:

#include "opencv-utils.h"
#include <opencv2/imgproc.hpp>

void myFlip(Mat src) {
    // flipCode 0 flips the image around the x-axis (upside down)
    flip(src, src, 0);
}

void myBlur(Mat src, float sigma) {
    // An empty Size() lets OpenCV derive the kernel size from sigma
    GaussianBlur(src, src, Size(), sigma);
}
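If you later want to extend opencv-utils with more operations, the same pattern applies: declare the function in opencv-utils.h and implement it in opencv-utils.cpp. As a purely illustrative sketch (myGray is a hypothetical helper, not part of this demo), a grayscale conversion could look like this:

// Hypothetical helper: convert the RGBA Mat we get from bitmapToMat to grayscale.
// The declaration would go in opencv-utils.h, the implementation in opencv-utils.cpp.
Mat myGray(Mat src) {
    Mat dst;
    cvtColor(src, dst, COLOR_RGBA2GRAY); // bitmapToMat (shown below) produces RGBA Mats
    return dst;
}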

Expose native code to managed code

Next we need to expose our flip and blur methods to the "managed" world. This happens inside native-lib.cpp. We will not go over the JNI standards and rules on how to expose methods; we will just copy the pre-defined stringFromJNI method to use as a template. In my case Android Studio created this method:

extern "C" JNIEXPORT jstring JNICALL
Java_com_vyw_opencv_1demo_MainActivity_stringFromJNI(...

NOTE the method name starts with the fully qualified activity name, with dots replaced by underscores and underscores in the package name escaped as _1 (so com.vyw.opencv_demo becomes com_vyw_opencv_1demo). Make sure to copy yours correctly.

So our methods will look like this:

extern "C" JNIEXPORT void JNICALL
Java_com_vyw_opencv_1demo_MainActivity_flip(JNIEnv* env, jobject p_this, jobject bitmapIn, jobject bitmapOut) {
    Mat src;
    bitmapToMat(env, bitmapIn, src, false);
    // NOTE bitmapToMat returns Mat in RGBA format, if needed convert to BGRA using cvtColor

    myFlip(src);

    // NOTE matToBitmap expects Mat in GRAY/RGB(A) format, if needed convert using cvtColor
    matToBitmap(env, src, bitmapOut, false);
}

extern "C" JNIEXPORT void JNICALL
Java_com_vyw_opencv_1demo_MainActivity_blur(JNIEnv* env, jobject p_this, jobject bitmapIn, jobject bitmapOut, jfloat sigma) {
    Mat src;
    bitmapToMat(env, bitmapIn, src, false);
    myBlur(src, sigma);
    matToBitmap(env, src, bitmapOut, false);
}
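The same bridging pattern (bitmapToMat -> process -> matToBitmap) works for any other operation you expose. Purely as an illustration (gray and myGray are hypothetical and not part of the demo project), a grayscale bridge could look like this:

extern "C" JNIEXPORT void JNICALL
Java_com_vyw_opencv_1demo_MainActivity_gray(JNIEnv* env, jobject p_this, jobject bitmapIn, jobject bitmapOut) {
    Mat src;
    bitmapToMat(env, bitmapIn, src, false);  // RGBA Mat from the input bitmap
    Mat dst = myGray(src);                   // hypothetical helper from opencv-utils
    // matToBitmap accepts GRAY/RGB(A) Mats, so a single-channel Mat is fine here
    matToBitmap(env, dst, bitmapOut, false);
}

On the managed side such a method would need a matching external fun gray(bitmapIn: Bitmap, bitmapOut: Bitmap) declaration, just like flip and blur below.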

The code is pretty much self-explanatory; the interesting parts are bitmapToMat and matToBitmap. These two methods, as their names imply, convert between the Android Bitmap and OpenCV Mat classes: basically they copy the pixel bytes, taking the pixel format into consideration and making the needed conversions. The methods were taken from the OpenCV source (opencv/modules/java/generator/src/cpp/utils.cpp) with some slight adjustments.

void bitmapToMat(JNIEnv *env, jobject bitmap, Mat& dst, jboolean needUnPremultiplyAlpha)
{
    AndroidBitmapInfo  info;
    void*              pixels = 0;

    try {
        CV_Assert( AndroidBitmap_getInfo(env, bitmap, &info) >= 0 );
        CV_Assert( info.format == ANDROID_BITMAP_FORMAT_RGBA_8888 ||
                   info.format == ANDROID_BITMAP_FORMAT_RGB_565 );
        CV_Assert( AndroidBitmap_lockPixels(env, bitmap, &pixels) >= 0 );
        CV_Assert( pixels );
        dst.create(info.height, info.width, CV_8UC4);
        if( info.format == ANDROID_BITMAP_FORMAT_RGBA_8888 )
        {
            Mat tmp(info.height, info.width, CV_8UC4, pixels);
            if(needUnPremultiplyAlpha) cvtColor(tmp, dst, COLOR_mRGBA2RGBA);
            else tmp.copyTo(dst);
        } else {
            // info.format == ANDROID_BITMAP_FORMAT_RGB_565
            Mat tmp(info.height, info.width, CV_8UC2, pixels);
            cvtColor(tmp, dst, COLOR_BGR5652RGBA);
        }
        AndroidBitmap_unlockPixels(env, bitmap);
        return;
    } catch(const cv::Exception& e) {
        AndroidBitmap_unlockPixels(env, bitmap);
        jclass je = env->FindClass("java/lang/Exception");
        env->ThrowNew(je, e.what());
        return;
    } catch (...) {
        AndroidBitmap_unlockPixels(env, bitmap);
        jclass je = env->FindClass("java/lang/Exception");
        env->ThrowNew(je, "Unknown exception in JNI code {nBitmapToMat}");
        return;
    }
}

void matToBitmap(JNIEnv* env, Mat src, jobject bitmap, jboolean needPremultiplyAlpha)
{
    AndroidBitmapInfo  info;
    void*              pixels = 0;

    try {
        CV_Assert( AndroidBitmap_getInfo(env, bitmap, &info) >= 0 );
        CV_Assert( info.format == ANDROID_BITMAP_FORMAT_RGBA_8888 ||
                   info.format == ANDROID_BITMAP_FORMAT_RGB_565 );
        CV_Assert( src.dims == 2 && info.height == (uint32_t)src.rows && info.width == (uint32_t)src.cols );
        CV_Assert( src.type() == CV_8UC1 || src.type() == CV_8UC3 || src.type() == CV_8UC4 );
        CV_Assert( AndroidBitmap_lockPixels(env, bitmap, &pixels) >= 0 );
        CV_Assert( pixels );
        if( info.format == ANDROID_BITMAP_FORMAT_RGBA_8888 )
        {
            Mat tmp(info.height, info.width, CV_8UC4, pixels);
            if(src.type() == CV_8UC1)
            {
                cvtColor(src, tmp, COLOR_GRAY2RGBA);
            } else if(src.type() == CV_8UC3){
                cvtColor(src, tmp, COLOR_RGB2RGBA);
            } else if(src.type() == CV_8UC4){
                if(needPremultiplyAlpha) cvtColor(src, tmp, COLOR_RGBA2mRGBA);
                else src.copyTo(tmp);
            }
        } else {
            // info.format == ANDROID_BITMAP_FORMAT_RGB_565
            Mat tmp(info.height, info.width, CV_8UC2, pixels);
            if(src.type() == CV_8UC1)
            {
                cvtColor(src, tmp, COLOR_GRAY2BGR565);
            } else if(src.type() == CV_8UC3){
                cvtColor(src, tmp, COLOR_RGB2BGR565);
            } else if(src.type() == CV_8UC4){
                cvtColor(src, tmp, COLOR_RGBA2BGR565);
            }
        }
        AndroidBitmap_unlockPixels(env, bitmap);
        return;
    } catch(const cv::Exception& e) {
        AndroidBitmap_unlockPixels(env, bitmap);
        jclass je = env->FindClass("java/lang/Exception");
        env->ThrowNew(je, e.what());
        return;
    } catch (...) {
        AndroidBitmap_unlockPixels(env, bitmap);
        jclass je = env->FindClass("java/lang/Exception");
        env->ThrowNew(je, "Unknown exception in JNI code {nMatToBitmap}");
        return;
    }
}
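One thing to keep in mind: the snippets above are fragments of native-lib.cpp, so the file also needs the relevant includes, and bitmapToMat/matToBitmap must be defined (or at least declared) before the JNI methods that call them. A rough sketch of how the top of the file could look (the exact include list here is my assumption, not the generated template):

#include <jni.h>
#include <android/bitmap.h>     // AndroidBitmapInfo, AndroidBitmap_lockPixels, ...
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>  // cvtColor and the COLOR_* constants
#include "opencv-utils.h"       // myFlip / myBlur

using namespace cv;

// Forward declarations so the JNI methods can appear before the helper definitions
void bitmapToMat(JNIEnv* env, jobject bitmap, Mat& dst, jboolean needUnPremultiplyAlpha);
void matToBitmap(JNIEnv* env, Mat src, jobject bitmap, jboolean needPremultiplyAlpha);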

Calling Native from Managed

We arrived at the last part of our demo, calling the native methods from our MainActivity.

First I added a sample image to the res/drawable-nodpi folder (you might need to create it). I chose the nodpi flavor as I don't want Android to scale up my image, and I used a relatively small image (640x427) so blurring can run in real time.

Then I set up my MainActivity view with:

  • ImageView - Pre-loaded with the test image as app:srcCompat="@drawable/mountain"
  • Button - Will be used to flip the image
  • SeekBar - Will be used to control the blur sigma; goes from 0-100 (later in the code it will be converted to a float in the range 0.1-10)

Declaring the JNI methods

In order to use our methods from native-lib.cpp we need to declare them as external functions inside our activity, and we need to load our native-lib library (libnative-lib.so). If you created the project from the "Native C++" template this is already done; scroll to the very bottom of MainActivity.kt and you will see it. Then just add our blur and flip declarations, so it looks like this:

external fun stringFromJNI(): String
external fun blur(bitmapIn: Bitmap, bitmapOut: Bitmap, sigma: Float)
external fun flip(bitmapIn: Bitmap, bitmapOut: Bitmap)

companion object {
    // Used to load the 'native-lib' library on application startup.
    init {
        System.loadLibrary("native-lib")
    }
}

Processing Android Bitmap

Now we will flip and blur the ImageView bitmap. First let's create 2 bitmaps: the first will hold the original image (srcBitmap), the other will be used as the destination bitmap (dstBitmap), which is what gets displayed on screen.

class MainActivity : AppCompatActivity(), SeekBar.OnSeekBarChangeListener {
    var srcBitmap: Bitmap? = null
    var dstBitmap: Bitmap? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        ...

        // Load the original image
        srcBitmap = BitmapFactory.decodeResource(this.resources, R.drawable.mountain)

        // Create and display dstBitmap in image view, we will keep updating
        // dstBitmap and the changes will be displayed on screen
        dstBitmap = srcBitmap!!.copy(srcBitmap!!.config, true)
        imageView.setImageBitmap(dstBitmap)
    ...
    ...

Whenever the user moves the seekbar we will blur the image, using the seekbar value as the blur sigma:

// SeekBar event handler
override fun onProgressChanged(seekBar: SeekBar?, progress: Int, fromUser: Boolean) {
    this.doBlur()
}

fun doBlur() {
    // The SeekBar range is 0-100 convert it to 0.1-10
    val sigma = max(0.1F, sldSigma.progress / 10F)

    // This is the actual call to the blur method inside native-lib.cpp
    this.blur(srcBitmap!!, dstBitmap!!, sigma)
}

And finally we have the event handler for the flip button:

fun btnFlip_click(view: View) {
    // This is the actual call to the flip method inside native-lib.cpp.
    // Note we flip srcBitmap (which is not displayed) and then call doBlur, which will
    // eventually update dstBitmap (which is displayed)
    this.flip(srcBitmap!!, srcBitmap!!)
    this.doBlur()
}

THE END

That's it! The code can be found on GitHub
