Jackson for HMS Core

Note on Developing a Person Tracking Function

Background

Videos are memories, so why not spend a little more time making them look better? Many mobile apps on the market offer only basic editing functions, such as applying filters and adding stickers. That is not enough for users who want to create dynamic videos in which a moving person stays in focus. Traditionally, this effect requires adding keyframes and manually adjusting the video image frame by frame, which can scare off amateur video editors.
I am one of those people, and I have been looking for an easier way to implement this kind of feature. Fortunately, I stumbled across the person tracking capability of HMS Core Video Editor Kit, which automatically generates a video that stays centered on a moving person, as the images below show.

[Image: Before using the capability]

[Image: After using the capability]

Thanks to the capability, I can now confidently create a video with the person tracking effect.
Let's see how the function is developed.

Development Process

Preparations

Configure the app information in AppGallery Connect.

Project Configuration

1) Set the authentication information for the app via an access token or API key.
Use the setAccessToken method to set an access token during app initialization. The access token needs to be set only once.

MediaApplication.getInstance().setAccessToken("your access token");

Or, use setApiKey to set an API key during app initialization. The API key needs to be set only once.

MediaApplication.getInstance().setApiKey("your ApiKey");
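
As a small illustration of "during app initialization", here is a minimal sketch (my own assumption, not from the kit documentation) that sets the API key exactly once in a custom Application subclass; only the setApiKey call is taken from the article, and imports are omitted to match its snippet style.

// Minimal sketch: set the authentication information once at app startup.
// The class name and the use of onCreate() are assumptions for illustration.
public class MyVideoApp extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Use either setApiKey or setAccessToken, and set it only once.
        MediaApplication.getInstance().setApiKey("your ApiKey");
    }
}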

2) Set a unique License ID.

MediaApplication.getInstance().setLicenseId("License ID");
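
The article only states that the License ID must be unique. One possible approach, purely as an assumption on my part, is to generate a UUID once, persist it, and reuse it as the License ID:

// Sketch: build a stable, unique License ID from a persisted UUID.
// Assumption: any unique string is acceptable; 'context' is your app Context.
SharedPreferences prefs = context.getSharedPreferences("video_editor_prefs", Context.MODE_PRIVATE);
String licenseId = prefs.getString("license_id", null);
if (licenseId == null) {
    licenseId = java.util.UUID.randomUUID().toString();
    prefs.edit().putString("license_id", licenseId).apply();
}
MediaApplication.getInstance().setLicenseId(licenseId);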

3) Initialize the runtime environment for HuaweiVideoEditor.
When creating a video editing project, first create a HuaweiVideoEditor object and initialize its runtime environment. Release this object when exiting a video editing project.
i. Create a HuaweiVideoEditor object.

HuaweiVideoEditor editor = HuaweiVideoEditor.create(getApplicationContext());

ii. Specify the preview area position.
This area renders video images, which the SDK implements by creating a SurfaceView. The preview area position must be specified before the SurfaceView is created.

<LinearLayout    
    android:id="@+id/video_content_layout"    
    android:layout_width="0dp"    
    android:layout_height="0dp"    
    android:background="@color/video_edit_main_bg_color"    
    android:gravity="center"    
    android:orientation="vertical" />
// Specify the preview area position.
LinearLayout mSdkPreviewContainer = view.findViewById(R.id.video_content_layout);

// Configure the preview area layout.
editor.setDisplay(mSdkPreviewContainer);

iii. Initialize the runtime environment. A LicenseException will be thrown if license verification fails.
Creating the HuaweiVideoEditor object does not occupy any system resources. You need to decide when to initialize the runtime environment; at that point, the necessary threads and timers are created in the SDK.

try {
    editor.initEnvironment();
} catch (LicenseException error) {
    SmartLog.e(TAG, "initEnvironment failed: " + error.getErrorMsg());
    finish();
    return;
}

4) Add a video or an image.
Create a video lane. Add a video or an image to the lane using the file path.

// Obtain the HVETimeLine object.
HVETimeLine timeline = editor.getTimeLine();

// Create a video lane.
HVEVideoLane videoLane = timeline.appendVideoLane();

// Add a video to the end of the lane.
HVEVideoAsset videoAsset = videoLane.appendVideoAsset("test.mp4");

// Add an image to the end of the video lane.
HVEImageAsset imageAsset = videoLane.appendImageAsset("test.jpg");
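
The code in the next section is called on a variable named visibleAsset, which the article does not define. A reasonable reading, and an assumption on my part, is that it refers to the video asset appended above:

// Assumption: the person tracking APIs below are called on the video asset
// that was just added to the lane.
HVEVideoAsset visibleAsset = videoAsset;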

Function Building

// Initialize the capability engine.
visibleAsset.initHumanTrackingEngine(new HVEAIInitialCallback() {
        @Override
        public void onProgress(int progress) {
            // Initialization progress.
        }

        @Override
        public void onSuccess() {
            // The initialization is successful.
        }

        @Override
        public void onError(int errorCode, String errorMessage) {
            // The initialization failed.
        }
});

// Select the person to track using the coordinates. The coordinates of the two vertices of the rectangle that contains the person are returned.
List<Float> rects = visibleAsset.selectHumanTrackingPerson(bitmap, position2D);

// Enable the person tracking effect.
visibleAsset.addHumanTrackingEffect(new HVEAIProcessCallback() {
        @Override
        public void onProgress(int progress) {
            // Processing progress.
        }

        @Override
        public void onSuccess() {
            // Processing succeeded.
        }

        @Override
        public void onError(int errorCode, String errorMessage) {
            // Processing failed.
        }
});

// Interrupt the effect.
visibleAsset.interruptHumanTracking();

// Remove the effect.
visibleAsset.removeHumanTrackingEffect();
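
To show how the calls above might be wired together, here is a rough usage sketch. It assumes that frameBitmap already holds the frame currently shown in the preview, that the tap coordinates can be wrapped in an HVEPosition2D (check the SDK reference for the exact type selectHumanTrackingPerson expects), that processCallback is the HVEAIProcessCallback shown above, and that mSdkPreviewContainer is the preview layout from the earlier step.

// Rough sketch with the assumptions noted above: select the tapped person,
// then enable tracking if a bounding rectangle is returned.
mSdkPreviewContainer.setOnTouchListener((view, event) -> {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
        List<Float> rects = visibleAsset.selectHumanTrackingPerson(
                frameBitmap, new HVEPosition2D(event.getX(), event.getY()));
        if (rects != null && !rects.isEmpty()) {
            // A person was found at the tapped position; apply the tracking effect.
            visibleAsset.addHumanTrackingEffect(processCallback);
        }
    }
    return true;
});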
