
Syauqy Nurul Aziz


HandSign - Learn a Sign Language with Your Camera

What I built

HandSign is a real-time sign language translator that detects your hand pose and translates it into the corresponding letter of the American Sign Language (ASL) alphabet.

Category Submission:

Program for the People

App Link:

https://handsign-m4qq6.ondigitalocean.app/

Screenshots

Handsign demo

Description

HandSign is a simple AI-based hand gesture recognition app that translates hand poses into the American Sign Language (ASL) alphabet. It uses TensorFlow.js and its pretrained Handpose model to detect a hand and its parts. HandSign also uses an additional library called Fingerpose to classify custom hand gestures based on finger positions.

Link to Source Code

https://github.com/syauqy/handsign-tensorflow-gatsby

Permissive License

BSD 2-Clause

Background

HandSign is part of my ongoing web development learning journey, where I also try out interesting open-source projects and frameworks like TensorFlow.js. The best way to learn to code is by building a project. TensorFlow's Handpose model helps makers create new solutions, and I see its potential to help us communicate, or at least learn how to communicate, using sign language.

How I built it

So this project is an extension of my first TensorFlow.js learning project, an app that asks you to demonstrate hand gestures matching the emojis shown on screen.

My work for this project was mainly creating the custom gestures that identify sign language: logging each sign gesture and converting it into a Fingerpose gesture description.
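
For example, the ASL letter "A" is a closed fist with the thumb resting at the side. Here's a rough sketch of how such a sign can be described with Fingerpose (simplified, not the exact thresholds the app uses):

```js
import { GestureDescription, Finger, FingerCurl } from 'fingerpose';

// ASL letter "A": four fingers fully curled, thumb extended at the side.
const aSign = new GestureDescription('A');

aSign.addCurl(Finger.Thumb, FingerCurl.NoCurl, 1.0);
for (const finger of [Finger.Index, Finger.Middle, Finger.Ring, Finger.Pinky]) {
  aSign.addCurl(finger, FingerCurl.FullCurl, 1.0);
}

export default aSign;
```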

I'm using Gatsby as the foundation because it's fast to set up, with several helpful plugins that let me focus on building the core of the project and tidy it up later.

Then I added TensorFlow.js and the Handpose model so I could start detecting hands. I just followed their instructions here; it's pretty straightforward.
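
In case it helps, this is a minimal sketch of that step, assuming a `video` element that's already streaming from the webcam:

```js
import * as handpose from '@tensorflow-models/handpose';
import '@tensorflow/tfjs-backend-webgl';

async function detectHand(video) {
  // Load the pretrained Handpose model (weights are fetched on first load).
  const model = await handpose.load();

  // Returns one prediction per detected hand.
  const predictions = await model.estimateHands(video);

  if (predictions.length > 0) {
    // 21 [x, y, z] landmarks covering the palm and fingers.
    console.log(predictions[0].landmarks);
  }
}
```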

To detect and identify a certain gesture or custom gesture, I use the Fingerpose library. Its instructions are also very simple.
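
Classification then boils down to a single call. A sketch, assuming the `aSign` description from earlier and landmarks coming from Handpose:

```js
import { GestureEstimator } from 'fingerpose';
import aSign from './gestures/a-sign'; // hypothetical path to the description above

const estimator = new GestureEstimator([aSign]);

function classify(landmarks) {
  // The second argument is the minimum score (out of 10) to count as a match.
  const { gestures } = estimator.estimate(landmarks, 8.5);

  // Return the highest-scoring gesture name, if any.
  const best = gestures.sort((a, b) => b.score - a.score)[0];
  return best ? best.name : null;
}
```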

Then I worked on the app flow: how gesture detection starts, how the app shows the letter you need to demonstrate, and other things like the UI design.
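
Roughly, the flow ties the pieces above together: pick a target letter, classify each frame, and move on when they match. A simplified sketch (the names here are illustrative, not the app's actual code):

```js
async function runPractice(video, model, estimator) {
  const letters = ['A', 'B', 'C']; // a small subset for illustration
  let target = letters[Math.floor(Math.random() * letters.length)];

  async function frame() {
    const predictions = await model.estimateHands(video);
    if (predictions.length > 0) {
      const { gestures } = estimator.estimate(predictions[0].landmarks, 8.5);
      const best = gestures.sort((a, b) => b.score - a.score)[0];
      if (best && best.name === target) {
        console.log(`Matched ${target}!`);
        target = letters[Math.floor(Math.random() * letters.length)];
      }
    }
    requestAnimationFrame(frame);
  }

  frame();
}
```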

Lastly, I'm using Chakra UI as the CSS framework, since it's very simple to use and makes the interface look much better.
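
As a taste of how little code a styled component takes in Chakra, here's a hypothetical snippet (not the app's actual components):

```jsx
import { ChakraProvider, Box, Heading, Text } from '@chakra-ui/react';

// Shows the letter the player should sign next.
function TargetLetter({ letter }) {
  return (
    <Box p={4} borderWidth="1px" borderRadius="lg" textAlign="center">
      <Heading size="2xl">{letter}</Heading>
      <Text color="gray.500">Show this sign to the camera</Text>
    </Box>
  );
}

export default function App() {
  return (
    <ChakraProvider>
      <TargetLetter letter="A" />
    </ChakraProvider>
  );
}
```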

Deployment using Digital Ocean

Deploying an app (in my case, a Gatsby site) is super easy and fast: a few clicks and no confusing configuration. My app was deployed in 2 minutes!

Want to try? Simply click the button below to deploy it to your own site.

Deploy to DO

Notes

This app only helps you learn the ASL alphabet. The model still has a lot of limitations, since it only captures the positions/coordinates of the hand parts at a single moment in time. The app also can't capture more complex things like phrases and words, since recognizing those requires motion, not just a static hand gesture.

Additional Resources/Info

  • TensorFlow.js - A library for ML in JavaScript.

  • Handpose - A lightweight ML pipeline consisting of two models: a palm detector and a hand-skeleton finger-tracking model.

  • Fingerpose - A pose classifier for hand landmarks detected by TensorFlow.js's Handpose model.

  • The sign language illustrations were created by Pelin Kahraman.

If you want to learn more about TensorFlow.js and custom Handpose gestures, please check out these amazing videos:
