Mathew Chan


Make Your Animal Crossing Character React to Your Gestures With Machine Learning


Overview

By training models in Google's Teachable Machine to recognize our gestures, we can wave our arms or make a face to send the corresponding reaction command to Animal Crossing's API.

Reverse Engineer Animal Crossing's API

The Nintendo Switch Online (NSO) app on the phone allows us to send reaction commands to the game. Using a tool called mitmproxy, we can see exactly what requests our phone sends and then replay the reaction command ourselves.

brew install mitmproxy

Or use pip install mitmproxy.

Run mitmproxy

mitmproxy

Or, if you prefer the web interface, run mitmweb.

mitmweb

Install the mitmproxy certificate on your phone

With your phone connected to the same network as your computer, visit http://mitm.it/ and install the certificate. In the network settings on your phone, add a manual proxy that points to your computer's IP address (mitmproxy listens on port 8080 by default).

Checking IP Address on your Mac

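If you'd rather look up that IP address with Python instead of the Network preferences pane, a quick sketch (just a convenience; both show the same address):

import socket

# Opening a UDP socket to a public address reveals which local interface
# (and therefore which IP) the system would use; no packets are actually sent.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(('8.8.8.8', 80))
print(s.getsockname()[0])
s.close()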

Setting Manual Proxy


Download Certificate on http://mitm.it


About > Certificate Trust Settings > Enable Certificate


Sending Requests through Nintendo Switch App

Now launch the NSO app on the phone and play around with the Animal Crossing app. You should see your phone's requests coming in through the mitmproxy terminal. We can work out the request format for reactions by sending a few from the phone.



The request endpoint for messaging and reactions is api/sd/v1/messages. Click on it and you should see the cookies and form data of this POST request.


The post data is as follows.

{
  "body": "Smiling",
  "type": "emoticon"
}

Tip: Press q in the mitmproxy terminal to return to the request list.

These are some of the reaction types I've collected: Hello, Greeting, HappyFlower, Negative, Apologize, Aha, QuestionMark...


List of Reaction Values

Note: I don't have all the reactions in my game right now. It would be great if anyone could provide the other reaction values!
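
To keep these handy, the names mentioned above (plus Smiling from the request we captured) can go in a small Python list; treat it as incomplete until the remaining values are confirmed:

# Reaction values observed so far; the full set depends on which
# reactions you have unlocked in-game.
KNOWN_REACTIONS = [
    'Hello', 'Greeting', 'HappyFlower', 'Negative',
    'Apologize', 'Aha', 'QuestionMark', 'Smiling',
]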

Accessing Nintendo Switch API

Access to the Nintendo Switch API requires making multiple requests to Nintendo's servers with an authentication token; the authentication flow itself is covered in a separate full tutorial.

Successful authentication will give us three values:

  • _gtoken cookie
  • _park_session cookie
  • authentication bearer token
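
The helper functions below read these three values from a tokens dictionary. A minimal sketch of that structure (the key names ac_g, ac_p, and ac_b simply match the code that follows; the placeholder values come from the authentication step):

# Filled in by the authentication flow from the full tutorial.
tokens = {
    'ac_g': '<_gtoken cookie value>',
    'ac_p': '<_park_session cookie value>',
    'ac_b': '<authentication bearer token>',
}
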
import json
import requests

user_auth_app_head = {
    'Host': 'web.sd.lp1.acbaa.srv.nintendo.net',
    'User-Agent': 'Mozilla/5.0 (Linux; Android 7.1.2; Pixel Build/NJH47D; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/59.0.3071.125 Mobile Safari/537.36',
    'Accept': 'application/json, text/plain, */*',
    'Connection': 'keep-alive',
    'Referer': 'https://web.sd.lp1.acbaa.srv.nintendo.net/?lang=en-US&na_country=US&na_lang=en-US',
    'Authorization' : 'tmp',
    'Accept-Encoding': 'gzip, deflate, br',
    'Accept-Language': 'en-us'
}

def sendReaction(reaction_value):
    data = {
        'body': reaction_value,
        'type': 'emoticon'
    }
    res = post_AC_NSOJSON(user_auth_app_head, data, 'https://web.sd.lp1.acbaa.srv.nintendo.net/api/sd/v1/messages')
    if res is not None:
        if 'status' in res:
            if res['status'] == 'success':
                return 'Reaction sent!'
        elif 'code' in res:
            if res['code'] == '4102':
                # Token expired; re-authenticate (refresh_tokens comes from the
                # authentication flow in the full tutorial) and retry.
                refresh_tokens()
                return sendReaction(reaction_value)
            if res['code'] == '3002':
                return 'Reaction not found'
            if res['code'] == '1001':
                return 'Animal Crossing Session Not Connected'
    return res

def post_AC_NSOJSON(header, body, url):
    h = header
    h['Authorization'] = 'Bearer ' + tokens['ac_b']
    pcookie = {}
    pcookie['_gtoken'] = tokens['ac_g']
    pcookie['_park_session'] = tokens['ac_p']
    r = requests.post(url, headers=h, cookies=pcookie, json=body)
    thejson = json.loads(r.text)
    return thejson

Test and see if it works :)

sendReaction('Aha')


Teachable Machine

Google's Teachable Machine is an easy-to-use online tool for training models to recognize images, sounds, and poses. If you're new to machine learning, I highly recommend watching Google's five-minute tutorial.


First create a Pose Project.


Choose Webcam for Pose Samples. Name your first class neutral and record yourself without any gestures. Then add extra classes such as clapping or waving. You can be as creative as you want.


When you're done, press Train. When training is complete, you can test the model in the preview panel. Once you're satisfied, press Export Model above the preview panel and download the Tensorflow.js model.


We can use the provided Tensorflow.js sample script for a simple user interface. Copy the sample script into an empty HTML file and serve it with a static server such as Node's http-server.

npm install http-server -g
cd my-pose-model
http-server 

Insert our API call inside the predict() function. The API endpoint should point to our Python server, which sends the reaction to Nintendo (a minimal server sketch follows the script below).

const confidence = 0.8; // Confidence range is 0 to 1

async function predict() {
  ...
  const prediction = await model.predict(posenetOutput);
  for (let i = 0; i < maxPredictions; i++) {
    const classPrediction = prediction[i].className + ": " + prediction[i].probability.toFixed(2);
    // Insert the API call here
    if (prediction[i].probability > confidence) { 
      callReaction(prediction[i].className);
    }
    labelContainer.childNodes[i].innerHTML = classPrediction;
  }
  // finally draw the poses
  drawPose(pose);
};

let clapping = 0;
const threshold = 5; // number of detections required before calling the API
async function callReaction(predictionClassName) {
  if (predictionClassName == 'Clapping') {
    clapping += 1
    if (clapping > threshold) {
      fetch('https://myapi.com/?reaction=Clapping');  // Change to your own API endpoint
      clapping = 0; // reset for threshold
    }
  }
}
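
The Python server that the browser calls isn't shown in this post, so here is a minimal sketch using Flask (an assumption; any web framework works). It reads the reaction query parameter and passes it to sendReaction() from earlier; the Access-Control-Allow-Origin header lets the page served by http-server call it from another port.

# Minimal reaction server sketch. Assumes Flask is installed and that
# sendReaction() from the earlier snippet lives in reactions.py.
from flask import Flask, request

from reactions import sendReaction

app = Flask(__name__)

@app.route('/')
def react():
    reaction = request.args.get('reaction')
    if not reaction:
        return 'No reaction given', 400
    result = sendReaction(reaction)
    # Allow the Teachable Machine page (served on a different port) to call us.
    return str(result), 200, {'Access-Control-Allow-Origin': '*'}

if __name__ == '__main__':
    app.run(port=5000)

With this running, change the fetch URL in callReaction() to something like http://localhost:5000/?reaction=Clapping.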

Be creative and have fun!


Summary

  • Reverse engineer private APIs with mitmproxy
  • Send API requests with Python
  • Use Google's Teachable Machine for ML prototyping

References

Setting-up mitmproxy on macOS to intercept https requests
