Last week I wrote about a crash in Android Lint, and how you can work around it until the issue is resolved. This time, I want to tell you about how you can use the sensors in your phone to accomplish amazing things.
Smartphones are incredible devices. Compared with phones made just 10 years ago, processor performance has improved more than 5x, camera resolution more than 6x, and storage more than 10x; screens have gotten substantially larger and clearer; and, most relevant for anyone reading this, you can now build and deploy your own apps to a smartphone!
Additionally, nearly all modern smartphones sport a vast array of sensors: cameras (sometimes 3 or more!), accelerometers, altimeters, gyroscopes, light sensors, barometers, proximity sensors, GPS, and microphones, not to mention WiFi, Bluetooth, NFC, and cellular connectivity!
This assortment of sensors, when used together, can provide a wealth of insights into what the user is doing, and luckily for us, Google has developed an easy-to-use API that lets you leverage these!
The ActivityRecognitionClient API is the newest way to detect what activity the user is currently performing, and when they change what they're doing.
The principle is simple. The API supports the following "activity" types: in vehicle, on bicycle, on foot, walking, running, still, tilting, and unknown.
Similar to how your fitness tracker can detect when you've started a run, Android constantly monitors your sensors, applying machine-learning models to combinations of sensor input to determine what the user is doing.
For each of these activity types, you can specify whether you are interested in being notified via a callback function when that activity has started or ended. You’re even able to specify with what frequency you’d like to receive updates, and the phone will do its best to monitor at the desired frequency.
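As a rough sketch of what that registration looks like (assuming the Play Services location dependency, and noting that Android 10+ also requires the ACTIVITY_RECOGNITION runtime permission), you ask for updates at a desired interval and read the results in a receiver:

```kotlin
import android.app.PendingIntent
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.ActivityRecognition
import com.google.android.gms.location.ActivityRecognitionResult

// Ask for activity updates roughly every 10 seconds. The interval is a hint:
// the phone will do its best, but may batch or slow updates to save power.
fun startActivityUpdates(context: Context, pendingIntent: PendingIntent) {
    ActivityRecognition.getClient(context)
        .requestActivityUpdates(10_000L, pendingIntent)
        .addOnFailureListener { e ->
            // e.g. the ACTIVITY_RECOGNITION permission was not granted
        }
}

// The PendingIntent above points at a receiver like this one.
class ActivityUpdatesReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        if (!ActivityRecognitionResult.hasResult(intent)) return
        val result = ActivityRecognitionResult.extractResult(intent) ?: return
        val activity = result.mostProbableActivity
        // activity.type is a DetectedActivity constant (e.g. WALKING);
        // activity.confidence is a percentage from 0 to 100.
    }
}
```

The receiver class name and the 10-second interval are my own choices for illustration; you'd register the receiver in your manifest (or dynamically) and pick an interval that balances freshness against battery.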
There are many great articles that detail how to use this API, so I won't rewrite that information here. This one was pretty good:
What I’d like to do to add on top of this is help inspire you to think about what you could accomplish for your users with this API.
Your app could detect when someone is driving and warn them against using the app, simplify its interface, or even turn on spoken alerts so the user doesn’t have to look down.
A navigation app could display a different UI depending on whether the user is walking/biking/driving.
An app with lots of text (email, a news app, etc.) could increase the text size and contrast when it detects the user is walking, since it's harder to read a screen that's jostling around while you're in motion.
Your app could expose user-configurable options: raise the volume when it detects I'm driving so I can hear over the road noise, then lower it again when it detects I've stopped.
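To make that last idea concrete, here is a tiny, self-contained sketch of the decision logic. The `volumeFor` helper and its integer percentages are hypothetical names of my own, not part of the API; the two constants mirror the values of `DetectedActivity.IN_VEHICLE` and `DetectedActivity.STILL`:

```kotlin
// Constants mirroring DetectedActivity.IN_VEHICLE and DetectedActivity.STILL.
const val IN_VEHICLE = 0
const val STILL = 3

// Hypothetical helper: choose a media volume (as a percent) for an activity.
// Boost the volume while driving, capped at 100; otherwise keep the user's base.
fun volumeFor(activityType: Int, basePercent: Int): Int = when (activityType) {
    IN_VEHICLE -> minOf(100, basePercent + 30)
    else -> basePercent
}
```

In a real app you'd call something like this from your receiver whenever the most probable activity changes, and feed the result to AudioManager.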
The possibilities are nearly endless! I encourage you to put on your most creative hat and really think outside the box — how could you leverage this type of data to provide a better experience for your users?
I’d love to hear your ideas — please leave a comment below and let me know how else you might use activity detection in your app. And, please follow me on Medium if you’re interested in being notified of future tidbits.
Interested in joining the awesome team here at Intrepid? We’re hiring!
This tidbit was originally delivered on October 25, 2019.