Discussion on: Acoustic activity recognition in JavaScript

Supun Kavinda

Okay, one question.

low-frequency whistling = brushing teeth.

How does this happen? Does the software recognize only the frequency, or does it also recognize the tone?

Charlie Gerard

I didn't train it with whistling, so it's predicting whatever is closest to what it was trained with, which must be brushing teeth in this case.
It's looking at all the data coming from the Web Audio API.

Supun Kavinda

Yep. I was asking whether it is trained on the frequency, the tone, or both?

Charlie Gerard

If by tone you mean notes, notes are named frequencies.

The data you get back from the Web Audio API is the set of frequencies picked up by the microphone, copied into a Uint8Array using the getByteFrequencyData method on the AnalyserNode.
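Roughly, that part looks something like this (a minimal sketch, not the exact code from the project; the analyser settings and variable names are just illustrative):

```javascript
// Grab the microphone, hook up an AnalyserNode, and read the
// byte-scaled frequency spectrum on every animation frame.
async function listen() {
  const audioContext = new AudioContext();
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const source = audioContext.createMediaStreamSource(stream);

  const analyser = audioContext.createAnalyser();
  analyser.fftSize = 2048; // illustrative value; gives 1024 frequency bins
  source.connect(analyser);

  // One slot per frequency bin, each value 0-255
  const frequencyData = new Uint8Array(analyser.frequencyBinCount);

  function readFrame() {
    // Copies the current frequency spectrum into the array
    analyser.getByteFrequencyData(frequencyData);
    // frequencyData is the kind of raw data a classifier could be trained on
    requestAnimationFrame(readFrame);
  }
  readFrame();
}

listen();
```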

Supun Kavinda

By tone, I meant what makes a C note from a violin and a piano sound distinct.

However, I got what you meant. Nice and creative work!

Josep Jaume Rey

Notes are just particular frequencies that are given a name. In any case, the term you're using is a bit off.

Matt Buck

Traditionally, that's referred to as timbre. It's the relative volume of specific frequencies in a note's harmonic series that accounts for differences in timbre in pitched instruments. Here's a great video explaining how that works.
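For a rough feel of that in code, here's a hypothetical sketch using the Web Audio API's PeriodicWave with made-up harmonic weights: the same C is played twice, only the relative volume of its harmonics changes, and it comes out sounding like two different instruments.

```javascript
// Sketch: play a middle C (~261.63 Hz) twice with different harmonic
// weightings. The pitch is identical; the timbre differs.
// (Browsers usually require this to run after a user gesture.)
const audioContext = new AudioContext();

function playTone(harmonicVolumes, startTime) {
  // real = cosine terms, imag = sine terms; index 0 is DC, index 1 the fundamental
  const real = new Float32Array(harmonicVolumes.length + 1);
  const imag = new Float32Array(harmonicVolumes.length + 1);
  harmonicVolumes.forEach((volume, i) => { imag[i + 1] = volume; });

  const oscillator = audioContext.createOscillator();
  oscillator.setPeriodicWave(audioContext.createPeriodicWave(real, imag));
  oscillator.frequency.value = 261.63;
  oscillator.connect(audioContext.destination);
  oscillator.start(startTime);
  oscillator.stop(startTime + 1);
}

// Hypothetical weights: mostly fundamental vs. strong upper harmonics
playTone([1, 0.1, 0.05], audioContext.currentTime);           // "smoother" timbre
playTone([1, 0.8, 0.6, 0.4], audioContext.currentTime + 1.5); // "brighter" timbre
```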