DEV Community

Monitor Real-time Health during Workouts with Body and Face Tracking

#ar
Vivi-clevercoder
・4 min read

Still wearing a smartwatch to monitor health indicators during workouts? Curious about what makes AR apps so advanced? Still think AR is only used in movies? With HUAWEI AR Engine, you can integrate AR capabilities into your own apps in just a few easy steps. If this has piqued your interest, read on to learn more!

What is AR Engine?

HUAWEI AR Engine is an engine for building augmented reality (AR) apps that run on Android smartphones. Based on the HiSilicon chipset, it integrates core AR algorithms to provide a range of basic AR capabilities, such as motion tracking, environment tracking, body tracking, and face tracking, enabling your app to bridge the real and virtual worlds and offer a brand-new, visually interactive user experience.

AR Engine provides high-level health status detection based on facial information, covering a range of data indicators including heart rate, respiratory rate, facial health status, and heart rate waveform signals.

With the body and face tracking capability, one of the engine's three major capabilities (the other two being motion tracking and environment tracking), HUAWEI AR Engine can monitor and display the user's real-time health status during workouts.

Application scenarios:
Gym: Checking real-time body indicators during workouts.
Medical treatment: Monitoring patients' physical status in real time.
Caregiving: Monitoring health indicators of the elderly in real time.

Next, let's take a look at how to implement these powerful functions.

Advantages of AR monitoring and requirements for hardware:

  1. Detects facial health information and calculates key health indicators, such as real-time heart rate.

  2. The body and face tracking capabilities also equip your device to better understand users. By locating hand positions and recognizing specific gestures, AR Engine can assist in placing a virtual object in the real world, or overlaying special effects on a hand. With depth sensing components, the hand skeleton tracking capability can track 21 hand skeleton points to implement precise interactive controls and special effect overlays. As for body tracking, the capability can track 23 body skeleton points to detect human posture in real time, providing a strong foundation for motion sensing and fitness & health apps.

  3. For details about supported models, please refer to the software and hardware dependencies on the HUAWEI Developers website.


Demo Introduction

A demo is offered here for you to learn how to integrate AR Engine with the simplest code, in the fastest way.
 Enable the health check by using ENABLE_HEALTH_DEVICE.
 FaceHealthCheckStateEvent functions as a parameter of ServiceListener.handleEvent(EventObject eventObject), passing health check status information to the app.
 The health check parameter HealthParameter includes the heart rate, respiratory rate, facial attributes (such as age and gender), and heart rate waveform signal.
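To make the heart rate waveform indicator concrete: it is a sampled signal from which beats per minute can be derived. The following is a hypothetical plain-Java illustration (not an AR Engine API) that estimates BPM by counting peaks in a uniformly sampled waveform:

```java
public class WaveformBpm {
    // Estimate BPM from a uniformly sampled waveform by counting peaks.
    // A sample counts as a peak if it exceeds both neighbors and a threshold.
    static int estimateBpm(double[] samples, double sampleRateHz, double threshold) {
        int peaks = 0;
        for (int i = 1; i < samples.length - 1; i++) {
            if (samples[i] > threshold
                    && samples[i] > samples[i - 1]
                    && samples[i] >= samples[i + 1]) {
                peaks++;
            }
        }
        double seconds = samples.length / sampleRateHz;
        return (int) Math.round(peaks * 60.0 / seconds);
    }

    public static void main(String[] args) {
        // Synthetic 10-second waveform at 50 Hz with a 1.2 Hz "heartbeat" (72 BPM).
        int n = 500;
        double[] samples = new double[n];
        for (int i = 0; i < n; i++) {
            samples[i] = Math.sin(2 * Math.PI * 1.2 * i / 50.0);
        }
        System.out.println(estimateBpm(samples, 50.0, 0.5)); // prints 72
    }
}
```

The real waveform signal delivered by HealthParameter would replace the synthetic array here; the sample rate and threshold are illustrative assumptions.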

Development Practice

The following describes how to run the demo using the source code, enabling you to understand the implementation details.

Preparations

  1. Get the tools prepared.
    a) A Huawei P30 running Android 11.
    b) Development tool: Android Studio; development language: Java.

  2. Register as a Huawei developer and create an app.
    a) Register as a Huawei developer.
    b) Create an app.
    Follow instructions in the AR Engine Development Guide to add an app in AppGallery Connect.
    c) Build the demo app.
     Import the source code to Android Studio.
     Download the agconnect-services.json file of the created app from AppGallery Connect, and add it to the app directory in the sample project.

  3. Run the demo app.
    a) Install the demo app on the test device.
    b) After the app is started, access facial recognition. During recognition, the progress will be displayed on the screen in real time.
    c) Your heart rate, respiratory rate, and real-time heart rate waveform will be displayed after successful recognition.
    The results are as shown in the following figure.

Key Steps

  1. Add the following Huawei Maven repository address to the project-level build.gradle file of your Android Studio project:
buildscript {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
    dependencies {
        ...
        // Add the AppGallery Connect plugin configuration.
        classpath 'com.huawei.agconnect:agcp:1.4.2.300'
    }
}

allprojects {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
}

2. Add dependencies on the SDKs in the app-level build.gradle file.

dependencies {
   implementation 'com.huawei.hms:arenginesdk:2.15.0.1'
}
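Since the project-level file configures the agcp plugin classpath, the app-level build.gradle typically also needs to apply the AppGallery Connect plugin so that agconnect-services.json is processed. A minimal sketch (plugin ID as published for agcp; verify against your AGC version):

```groovy
// Top of the app-level build.gradle, after the Android application plugin.
apply plugin: 'com.android.application'
// Parses the agconnect-services.json file added earlier.
apply plugin: 'com.huawei.agconnect'
```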

3. Declare system permissions in the AndroidManifest.xml file.

The required permissions include the camera permission and network permission.

Camera permission: android.permission.CAMERA, which is indispensable for using AR Engine.
Network permission: android.permission.INTERNET, which is used to analyze API calling status and guide continuous capability optimization.


Note: The AR Engine SDK processes data only on the device side, and does not report data to the server.
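Based on the two permissions above, the declarations in AndroidManifest.xml look like this (standard Android permission syntax):

```xml
<!-- Required for the camera preview and face tracking. -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Used to analyze API calling status for capability optimization. -->
<uses-permission android:name="android.permission.INTERNET" />
```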

Key Code Description

  1. Check the AR Engine availability.

Check whether AR Engine has been installed on the current device. If yes, the app can run properly. If not, the app automatically redirects the user to AppGallery to install AR Engine. Sample code:

boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk) {
    // ConnectAppMarketActivity.class is the activity for redirecting to AppGallery.
    startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
    isRemindInstall = true;
}
  2. Create an ARFaceTrackingConfig scene.
// Create an ARSession.
mArSession = new ARSession(this);
// Select a specific config to initialize the ARSession based on the application scenario.
ARFaceTrackingConfig config = new ARFaceTrackingConfig(mArSession);
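The demo introduction mentions ENABLE_HEALTH_DEVICE; a hedged sketch of how the health check might be switched on and the config applied to the session (method names follow the AR Engine SDK's ARConfigBase; verify them against the SDK version you use):

```java
// Sketch, assuming AR Engine SDK 2.15.x APIs (Android-dependent, not runnable standalone).
// Enable the face health check capability on the config.
config.setEnableItem(ARConfigBase.ENABLE_HEALTH_DEVICE);
// Apply the config to the session and start tracking.
mArSession.configure(config);
mArSession.resume();
```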
  3. Add a listener to pass information such as the health check status and progress.
mArSession.addServiceListener(new FaceHealthServiceListener() {
    @Override
    public void handleEvent(EventObject eventObject) {
        // FaceHealthCheckStateEvent passes the health check status information to the app.
        if (!(eventObject instanceof FaceHealthCheckStateEvent)) {
            return;
        }
        // Obtain the health check status.
        final FaceHealthCheckState faceHealthCheckState =
                ((FaceHealthCheckStateEvent) eventObject).getFaceHealthCheckState();
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                mHealthCheckStatusTextView.setText(faceHealthCheckState.toString());
            }
        });
    }
    // handleProcessProgressEvent() passes the health check progress.
    @Override
    public void handleProcessProgressEvent(final int progress) {
        mHealthRenderManager.setHealthCheckProgress(progress);
        runOnUiThread(new Runnable() {
            @Override
            public void run() {
                setProgressTips(progress);
            }
        });
    }
});
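Because the session holds the camera, the demo also needs to manage the ARSession across the activity lifecycle. A minimal sketch, assuming the standard pause/resume/stop methods of ARSession (Android-dependent, not runnable standalone):

```java
@Override
protected void onPause() {
    super.onPause();
    if (mArSession != null) {
        // Release the camera while the activity is in the background.
        mArSession.pause();
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    if (mArSession != null) {
        // Free session resources when the activity is destroyed.
        mArSession.stop();
        mArSession = null;
    }
}
```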

For more information, please visit:
Documentation on the HUAWEI Developers website

HUAWEI Developers official website

Development Guide

Reddit to join developer discussions

GitHub or Gitee to download the demo and sample code

Stack Overflow to solve integration problems
