After a looong weekend of getting pretty much wrecked, I finally found the time to write a proper blog post for my contest entry.
You start with seed capital of $30000.
Every round you create products by clicking as fast as you can and submitting your product.
Your product's price (= your profit) is calculated as
500000 * clicks / milliseconds
You also lose $5000 every round, because of the burn-rate of your startup.
So you'll probably lose money every round, haha.
If you lose all your money you are out of the game.
The last player left wins.
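In code, the pricing rule above boils down to something like this (names are illustrative, not the actual game code):

```javascript
// Price of a product: 500000 * clicks / milliseconds of the round.
// BURN_RATE is the $5000 every player loses per round.
const BURN_RATE = 5000;

function productPrice(clicks, milliseconds) {
  return Math.floor((500000 * clicks) / milliseconds);
}

function roundProfit(clicks, milliseconds) {
  return productPrice(clicks, milliseconds) - BURN_RATE;
}

// 80 clicks in a 10 second round:
// price  = 500000 * 80 / 10000 = 4000
// profit = 4000 - 5000 = -1000 -> you bleed money, as promised
```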
My idea was to build it completely serverless. While Pusher allows for a serverless setup, it sadly doesn't have a serverless pricing model, hehe.
I used AWS SAM, which is an extension of CloudFormation, so besides Pusher, it is 100% infrastructure as code.
- Amazon API Gateway for HTTP requests: joining games, submitting products every round, and receiving Pusher's webhook data.
- AWS Lambda for all the server-side calculations: checking for empty games, calculating profits every round, and notifying players of game events via Pusher.
- AWS Step Functions for game coordination: starting games, starting rounds, and calling Lambda every round to calculate things and notify players via Pusher.
- Amazon DynamoDB to store game data (products, player count) and make it accessible to the state-machine-controlled Lambda functions.
- Pusher Channels to get data from the back-end to the clients without any need for polling.
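To give you an idea of what the infrastructure-as-code part looks like, here is a minimal SAM template sketch with made-up resource names (not my actual template):

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  SubmitProductFunction:          # hypothetical name
    Type: AWS::Serverless::Function
    Properties:
      Handler: submit.handler
      Runtime: nodejs8.10
      Events:
        SubmitApi:
          Type: Api               # creates the API Gateway endpoint
          Properties:
            Path: /product
            Method: post
  GameTable:                      # hypothetical name
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: gameId
        Type: String
```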
I also made an architecture diagram:
As you can see, the data comes in from players and Pusher via HTTP to my serverless back-end and is fed back to the clients via Pusher Channels.
Pusher Channels lets you broadcast/multicast events to the clients without needing a direct connection to them.
You just call the Pusher HTTP API to send an event and Pusher takes care of distributing it to the clients.
This is pretty cool, because WebSockets would force you to keep an open connection to every client, which isn't possible with Lambda functions; they can only run for about 5 minutes.
When a Lambda function is called (via API Gateway or Step Functions), it can simply do its thing, send an HTTP request to Pusher, and get suspended again, while Pusher keeps track of the open connections.
The Pusher API also lets you fetch the state of all channels via HTTP, so you can start a Lambda, check who is online, and send data depending on channel state if you like.
Pusher advertises its Channels as realtime, but this isn't really the case.
It uses Web technology at its core, so it's built completely on TCP, and it adds at least one extra server (= network hop) to the whole architecture.
First, you have the WebSocket connections from Pusher to the clients, which have less latency than a full HTTP request for every event, but still bring a few round-trips with them.
And second, you use the HTTP API on the server side to send events to Pusher, which leads to
client -HTTP-> back-end -HTTP-> Pusher -WebSockets-> client in terms of latency.
Its unique selling point is pushing data to clients with a very simple setup (hence the name Pusher, haha), not minimal latency.
It lets you use client events to cut out your back-end as the middleman and reduce latency even more, but you can't run code on the Pusher side for every event, which rather limits their usefulness.
So at the moment we are talking about fewer than 10 events per second, which is more than enough for most applications.
This is why I don't send every click to the server, but gather them up every round, which allows for one HTTP request per 10-second round.
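On the client, this click batching can be as simple as the following sketch (illustrative, not the actual front-end code; `submit` stands in for the real HTTP call):

```javascript
// Collect clicks locally and submit them once per round instead of
// sending every single click to the back-end.
class ClickBatcher {
  constructor(submit) {
    this.submit = submit;
    this.clicks = 0;
  }
  click() {
    this.clicks++; // called on every click, stays local
  }
  endRound() {
    const batch = this.clicks;
    this.clicks = 0;
    this.submit(batch); // one request per 10 second round
    return batch;
  }
}
```

In the game, `endRound` would be triggered by the round timer.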
Something like Lambda for Pusher events, running on Pushers infrastructure would be a killer feature, hehe.
The next problem was AWS Step Functions. I found them rather nice for modeling the game state, the rounds, etc., but I haven't found an easy way to get data into the state machines.
The problem is as follows:
You define a state-machine. This state-machine can be executed multiple times.
Every state of the state-machine can either call a Lambda function with some input or create an activity task.
I had the idea to define the game as a state-machine and every execution of a state-machine would be a running game.
While every execution can wait for an activity to be completed by a worker (for example, a Lambda behind API Gateway), the worker can't filter the tasks of an activity by execution.
So I wasn't able to make workers execution- or game-specific.
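A heavily simplified version of such a game state machine in Amazon States Language could look like this (state names and the Lambda ARN are made up for illustration):

```json
{
  "StartAt": "WaitForRound",
  "States": {
    "WaitForRound": {
      "Type": "Wait",
      "Seconds": 10,
      "Next": "CalculateRound"
    },
    "CalculateRound": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:...:function:calculateRound",
      "Next": "CheckGameOver"
    },
    "CheckGameOver": {
      "Type": "Choice",
      "Choices": [
        { "Variable": "$.playersLeft", "NumericGreaterThan": 1,
          "Next": "WaitForRound" }
      ],
      "Default": "GameOver"
    },
    "GameOver": { "Type": "Succeed" }
  }
}
```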
I had to add DynamoDB to the mix to get data into the state-machine.
A player sends a finished product via HTTP (API Gateway -> Lambda) and the back-end stores it in DynamoDB, with the gameId as the primary key.
When the state-machine decides a round is finished, for example after a Wait state of 10 seconds, it starts a Lambda function that looks into DynamoDB, calculates the results, and publishes them to the clients.
DynamoDB has nice concurrency features, so it wasn't too bad and allowed me to synchronize players more easily in the end.
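That round-end Lambda then boils down to something like this sketch (the DynamoDB query is stubbed out and the names are invented; in the real function `items` would come from a Query on the gameId key):

```javascript
// Called by the state machine when a round ends: turn the products
// stored in DynamoDB for this game into per-player profits.
const BURN_RATE = 5000;  // fixed loss per round
const ROUND_MS = 10000;  // 10 second rounds

function calculateRound(items) {
  return items.map(item => ({
    playerId: item.playerId,
    profit: Math.floor((500000 * item.clicks) / ROUND_MS) - BURN_RATE
  }));
}

// The results would then be pushed to the game channel via Pusher,
// and players whose balance drops to zero are out of the game.
```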
I also wrote a bunch of blog posts along the way:
- Startup Clix: Pusher Presence Channels with AWS SAM
- Startup Clix: Cleanup & Fighting with Pusher Application State
- Startup Clix: ESLint & Winning with Pusher Application State
- Startup Clix: First Steps with AWS Step Functions
- Startup Clix: Pre-Authentication & Webhooks with Pusher
- Startup CliX: DynamoDB & React Front-End on GitHub Pages
- Startup CliX: Finally some Gameplay
- Startup CliX: RC1 with Private Games & Mobile UI
It was fun to use my new back-end skills and see myself doing full-stack stuff for real now.
I learned a lot about Pusher, AWS, and serverless in general.
I would appreciate all your likes, unicorns and whatnot, but even if I don't win, it's nice to add this project to my portfolio (maybe after a cleanup, lol).
Also, as always, issues and pull-requests are welcome. Maybe someone has ideas that would make the game really fun to play, hehe.