Cultural Relics at Your Fingertips with HMS Core

3D technology is developing fast, and more and more museums are relying on it to hold online exhibitions that open up history to more people. Visitors can immerse themselves in themed virtual environments with stunning lighting effects, and enlarge or shrink the exhibit models to view every detail. On top of this, such exhibitions often feature background music (BGM) and audio guides that provide the background of each exhibit.

Consider a virtual exhibit that showcases a high level of realism.
To create something like this, all we need is an Android Studio project written in Kotlin that implements three functions: 3D scene creation, model display, and audio playback. Let's see how it's done.

1. Preparing a 3D Model

This can be effortlessly done with 3D Modeling Kit, a service that was recently added to HMS Core. This kit automatically generates a textured 3D model using images shot from different angles with a common mobile phone camera. The kit equips an app with the ability to build and preview 3D models. For more details, please refer to How to Build a 3D Product Model Within Just 5 Minutes.
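
For reference, here is a minimal sketch of how a reconstruction task might be kicked off in code. It is only a sketch: the class and method names follow the 3D Modeling Kit object reconstruction guide, and picturesDir and uploadListener are hypothetical placeholders for your photo folder and upload listener, so please verify everything against the kit's documentation.

// Minimal sketch, assuming the 3D Modeling Kit object reconstruction API.
// picturesDir: hypothetical folder containing photos shot around the object.
// uploadListener: your Modeling3dReconstructUploadListener implementation.
val engine = Modeling3dReconstructEngine.getInstance(context)
val setting = Modeling3dReconstructSetting.Factory().create() // Default reconstruction settings.
val initResult = engine.initTask(setting)   // Create a reconstruction task.
val taskId = initResult.taskId              // Task ID used for uploading and querying.
engine.setReconstructUploadListener(uploadListener)
engine.uploadFile(taskId, picturesDir)      // Upload the photos; the textured model is generated in the cloud.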

2. Creating a 3D Scene View

Next, use Scene Kit to create an interactive 3D view for the model we just created.

Integrate Scene Kit.
Software requirements:
• JDK version: 1.8 (recommended)
• minSdkVersion: 19 or later
• targetSdkVersion: 30 (recommended)
• compileSdkVersion: 30 (recommended)
• Gradle version: 5.4.1 or later (recommended)

Configure the following information in the project-level build.gradle file:

buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
    ...
}

allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

Configure the following information in the app-level build.gradle file:

dependencies {
    ...
    implementation 'com.huawei.scenekit:full-sdk:5.1.0.300'
}

To enable the view binding feature, add the following code to the app-level build.gradle file:

android {
    ...
    buildFeatures {
        viewBinding true
    }
    ...
}

After syncing the project with the updated build.gradle files, we can use Scene Kit in the project.

This article only describes how to use the kit to display the 3D model of an exhibit and how to interact with the model. To try out other functions of Scene Kit, please refer to its official documentation.

Create a 3D scene view.
A custom SceneView subclass is created so that the first model is loaded automatically after the view is initialized.

import android.content.Context
import android.util.AttributeSet
import android.view.SurfaceHolder
import com.huawei.hms.scene.sdk.SceneView

class CustomSceneView : SceneView {
    constructor(context: Context?) : super(context)
    constructor(
        context: Context?,
        attributeSet: AttributeSet?
    ) : super(context, attributeSet)

    override fun surfaceCreated(holder: SurfaceHolder) {
        super.surfaceCreated(holder)
        loadScene("qinghuaci/scene.gltf")
        loadSpecularEnvTexture("qinghuaci/specularEnvTexture.dds")
        loadDiffuseEnvTexture("qinghuaci/diffuseEnvTexture.dds")
    }
}

To display the model, add the model files of the object to a folder under src > main > assets (for example, assets/qinghuaci, which contains scene.gltf and the .dds environment textures).

loadScene(), loadSpecularEnvTexture(), and loadDiffuseEnvTexture() in surfaceCreated() load the object model and its environment textures. Once the surface is created, the first object model is loaded onto it.

Next, open the XML file used to display the 3D scene view (activity_main.xml in this project) and add the CustomSceneView we just created. The following code also adds the arrow images used for switching between object models.

<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <com.example.sceneaudiodemo.CustomSceneView
        android:id="@+id/csv_main"
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>

    <ImageView
        android:id="@+id/iv_rightArrow"
        android:layout_width="32dp"
        android:layout_height="32dp"
        android:layout_margin="12dp"
        android:src="@drawable/ic_arrow"
        android:tint="@color/white"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <ImageView
        android:id="@+id/iv_leftArrow"
        android:layout_width="32dp"
        android:layout_height="32dp"
        android:layout_margin="12dp"
        android:rotation="180"
        android:src="@drawable/ic_arrow"
        android:tint="@color/white"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

We can now open the app to check out the first exhibit: a blue and white porcelain vase.

Add the model switching function.
This function allows users to switch between different exhibit models. Configure the following information in MainActivity:

private lateinit var binding: ActivityMainBinding
private var selectedId = 0
private val modelSceneList = arrayListOf(
    "qinghuaci/scene.gltf",
    "tangyong/scene.gltf",
)
private val modelSpecularList = arrayListOf(
    "qinghuaci/specularEnvTexture.dds",
    "tangyong/specularEnvTexture.dds",
)
private val modelDiffList = arrayListOf(
    "qinghuaci/diffuseEnvTexture.dds",
    "tangyong/diffuseEnvTexture.dds",
)

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    binding = ActivityMainBinding.inflate(layoutInflater)
    val view = binding.root
    setContentView(view)
    binding.ivRightArrow.setOnClickListener {
        if (modelSceneList.isEmpty()) return@setOnClickListener
        selectedId = (selectedId + 1) % modelSceneList.size // Ensure the exhibit model ID is within the range of the model list.
        loadImage()
    }
    binding.ivLeftArrow.setOnClickListener {
        if (modelSceneList.isEmpty()) return@setOnClickListener
        if (selectedId == 0) selectedId = modelSceneList.size - 1 // Ensure the exhibit model ID is within the range of the model list.
        else selectedId -= 1
        loadImage()
    }
}

private fun loadImage() {
    binding.csvMain.loadScene(modelSceneList[selectedId])
    binding.csvMain.loadSpecularEnvTexture(modelSpecularList[selectedId])
    binding.csvMain.loadDiffuseEnvTexture(modelDiffList[selectedId])
}

Simple switching logic is created in onCreate() to move to the next or previous model. The paths of the object models are saved as hard-coded strings in each model list, and selectedId indicates the index of the model currently being displayed. This logic can also be modified to display models dynamically.
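
To illustrate the dynamic option, below is a minimal sketch of building the model lists at runtime instead of hard-coding them. It assumes that each exhibit sits in its own assets folder (such as qinghuaci or tangyong) containing scene.gltf and the two .dds textures; the ExhibitModel class and buildExhibitModels() helper are hypothetical names used only for this example.

// Hypothetical helper: scan the assets folder and build the exhibit list at runtime.
data class ExhibitModel(val scene: String, val specular: String, val diffuse: String)

private fun buildExhibitModels(context: Context): List<ExhibitModel> =
    context.assets.list("")                              // Top-level folders under assets.
        .orEmpty()
        .filter { dir -> context.assets.list(dir).orEmpty().contains("scene.gltf") }
        .map { dir ->
            ExhibitModel(
                scene = "$dir/scene.gltf",
                specular = "$dir/specularEnvTexture.dds",
                diffuse = "$dir/diffuseEnvTexture.dds"
            )
        }

The resulting list could replace modelSceneList, modelSpecularList, and modelDiffList, with selectedId used as an index into it in the same way.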

Now we've successfully implemented 3D model display via SceneView.

3. Adding Audio Guides for Exhibits

To help users gain a deeper understanding of the exhibits, Audio Kit can play voice recordings that introduce each exhibit's history and background while its model is displayed.

Integrate Audio Kit.
Software requirements:
• JDK version: 1.8.211 or later
• minSdkVersion: 21 or later
• targetSdkVersion: 30 (recommended)
• compileSdkVersion: 30 (recommended)
• Gradle version: 4.6 or later (recommended)

Audio Kit has stricter software requirements than Scene Kit, so make sure your project meets them.

Add configurations for Audio Kit to the app-level build.gradle file:

dependencies {
    ...
    implementation 'com.huawei.hms:audiokit-player:1.1.0.300'
    ...
}

The project-level build.gradle file does not need to be changed, because the Maven repository that Audio Kit requires was already added when configuring Scene Kit.

Add a play button to the activity_main.xml file:

<Button
    android:id="@+id/btn_playSound"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Play"
    app:layout_constraintBottom_toBottomOf="parent"
    app:layout_constraintEnd_toEndOf="parent"
    app:layout_constraintStart_toStartOf="parent" />

This button is used to play audio for the object being displayed.
Add the following configurations to MainActivity:

private var mHwAudioManager: HwAudioManager? = null
private var mHwAudioPlayerManager: HwAudioPlayerManager? = null

override fun onCreate(savedInstanceState: Bundle?) {
    ...
    initPlayer(this)
    binding.btnPlaySound.setOnClickListener {
        mHwAudioPlayerManager?.play(selectedId) // Play the playlist item at index selectedId, matching the model being displayed.
    }
    ...
}

private fun initPlayer(context: Context) {
    val hwAudioPlayerConfig = HwAudioPlayerConfig(context)
    HwAudioManagerFactory.createHwAudioManager(hwAudioPlayerConfig,
    object : HwAudioConfigCallBack {
        override fun onSuccess(hwAudioManager: HwAudioManager?) {
            try {
                mHwAudioManager = hwAudioManager
                mHwAudioPlayerManager = hwAudioManager?.playerManager
                mHwAudioPlayerManager?.playList(getPlaylist(), 0, 0)
            } catch (ex: Exception) {
                ex.printStackTrace()
            }
        }

        override fun onError(p0: Int) {
            Log.e("init:onError: ","$p0")
        }
    })
}

fun getPlaylist(): List<HwAudioPlayItem>? {
    val playItemList: MutableList<HwAudioPlayItem> = ArrayList()
    val audioPlayItem1 = HwAudioPlayItem()
    val sound = Uri.parse("android.resource://yourpackagename/raw/soundfilename").toString() // Example of a local resource URI (not used in this sample); soundfilename is the audio file name without the extension.
    audioPlayItem1.audioId = "1000"
    audioPlayItem1.singer = "Taoge"
    audioPlayItem1.onlinePath =
        "https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-chengshilvren.mp3" // The sample code uses a song as an example.
    audioPlayItem1.setOnline(1)
    audioPlayItem1.audioTitle = "chengshilvren"
    playItemList.add(audioPlayItem1)
    val audioPlayItem2 = HwAudioPlayItem()
    audioPlayItem2.audioId = "1001"
    audioPlayItem2.singer = "Taoge"
    audioPlayItem2.onlinePath =
        "https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-dayu.mp3" // The sample code uses a song as an example.
    audioPlayItem2.setOnline(1)
    audioPlayItem2.audioTitle = "dayu"
    playItemList.add(audioPlayItem2)
    return playItemList
}

Once the configurations above are added, the app can begin to play audio guides for exhibits.
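
As a small extension, the audio guide could start automatically whenever the displayed model changes. The sketch below simply reuses the play(selectedId) call from the button listener inside loadImage(), assuming the playlist items are in the same order as the model lists so that selectedId is a valid index for both.

// Sketch: start the matching audio guide whenever the displayed model changes.
// Assumes the playlist order matches the order of the model lists.
private fun loadImage() {
    binding.csvMain.loadScene(modelSceneList[selectedId])
    binding.csvMain.loadSpecularEnvTexture(modelSpecularList[selectedId])
    binding.csvMain.loadDiffuseEnvTexture(modelDiffList[selectedId])
    mHwAudioPlayerManager?.play(selectedId) // Same call as the play button, reusing selectedId as the playlist index.
}
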
Note that the audio files added in the sample project are online files. To learn how to play local audio files instead, please refer to the Audio Kit API reference.
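
For illustration only, a local playlist item might look like the sketch below. It assumes HwAudioPlayItem exposes a filePath setter and that setOnline(0) marks the item as local, and the file path and title are hypothetical; please confirm the exact fields in the Audio Kit API reference.

// Sketch only: a playlist item backed by a local audio file (verify the fields against the API reference).
val localItem = HwAudioPlayItem()
localItem.audioId = "1002"
localItem.audioTitle = "qinghuaci guide"            // Hypothetical audio guide title.
localItem.filePath = "/sdcard/Music/qinghuaci.mp3"  // Hypothetical local file path; assumes a filePath setter.
localItem.setOnline(0)                              // Assumption: 0 marks the item as a local file.
playItemList.add(localItem)
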
By utilizing these HMS Core services, we've created exhibit models that can be rotated 360°, zoomed in and out, and paired with audio guides.
These services can be used in many scenarios beyond displaying cultural relics, for example:

  • In social media, to generate 3D Qmojis, video memes, and virtual video backgrounds for users.
  • In e-commerce, for 3D product display, indoor scene rendering for furniture layout preview, and AR try-on.
  • In audio and video, for 3D lock screen/theme generation, 3D special effect rendering, and generation of special effects for live streaming.
  • In education, for creating 3D teaching demonstrations/3D books and implementing VR distance learning.

Sounds interesting, right? To learn more about these kits, check out:

  • 3D Modeling Kit and its sample code
  • Scene Kit and its sample code
  • Audio Kit and its sample code
