My $646 mistake with PHP

Jonathon Ringeisen on January 19, 2022

Yesterday I learned a very valuable lesson, one that cost me $646, while iterating a while loop using the Google Places API Web Service. That's right, I...
Peter Fox

I think the real lesson here is: always write unit tests when you can. A test running this function could easily find that a while loop never completes.

Equally, it never hurts to put a limit on a loop that calls an API so it never iterates more than, say, 50 times.
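
For example, a hard cap can live right in the loop condition. A rough sketch in Laravel (the endpoint parameters and config key are assumptions, not the code from the post):

```php
use Illuminate\Support\Facades\Http;

$maxPages = 50;        // hard upper bound so a bug can never loop forever
$pages = 0;
$results = [];
$nextPageToken = null;

do {
    $response = Http::get('https://maps.googleapis.com/maps/api/place/nearbysearch/json', array_filter([
        'key' => config('services.google.places_key'), // assumed config key
        'location' => $location,                       // assumed to be set earlier
        'radius' => $radius,
        'pagetoken' => $nextPageToken,
    ]));

    $results = array_merge($results, $response->json('results', []));
    $nextPageToken = $response->json('next_page_token');
    $pages++;

    if ($nextPageToken) {
        sleep(2); // Places page tokens take a moment to become valid
    }
} while ($nextPageToken && $pages < $maxPages);
```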

Anwar

Good idea, and since I see some Laravel code up there, just mock the API response so that you never actually consume credits (and you still catch infinite loops, but for free).
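
In Laravel that can be as simple as faking the HTTP client in the test, so the pagination logic runs without a single billable request. A minimal sketch (the payload shape is an assumption):

```php
use Illuminate\Support\Facades\Http;

// Every request to the Places API now returns this canned payload,
// so the code under test never spends real credits.
Http::fake([
    'maps.googleapis.com/*' => Http::response([
        'status' => 'OK',
        'results' => [],
        // no next_page_token, so a correct loop stops after one call
    ]),
]);
```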

Andrew McCombe • Edited

I did a similar thing but with the lookup taking place in a background job using SQS. A bug in the code meant the job always failed and got retried.

£20,000 later...

Luckily Google were cool about it.

Lesson: set decent retry counts on your Laravel worker jobs.
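
Something like this on the job class, for instance (a sketch with a made-up job name, not the actual job from this incident):

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class LookupPlaces implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public $tries = 3;    // give up after 3 attempts instead of retrying forever
    public $backoff = 60; // wait 60 seconds between attempts (Laravel 8+)

    public function handle(): void
    {
        // call the external API here...
    }
}
```

The same cap can also be set on the worker itself with php artisan queue:work --tries=3.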

Jean-Mark Wright

Thanks for sharing this, and capturing the lessons you learned along the way! I believe this example underscores the importance of targeted testing to confirm expected behavior.
Assuming that you have a mock set up for the test, some examples of tests that come to mind are (see the sketch after the list):

  • Verify that there's only one call to the API when there's no next page token
  • Verify that sleeps happen when there's a next page token
  • Verify the number of calls to the API when there are multiple pages
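
A rough sketch of the last one, using Laravel's HTTP fake (PlacesLookup and its search method are made-up names standing in for the code from the post):

```php
use Illuminate\Support\Facades\Http;
use Tests\TestCase;

class PlacesLookupTest extends TestCase
{
    public function test_it_stops_when_there_is_no_next_page_token(): void
    {
        // First page returns a token, second page does not, so the loop must stop.
        Http::fake([
            'maps.googleapis.com/*' => Http::sequence()
                ->push(['status' => 'OK', 'results' => [], 'next_page_token' => 'token-1'])
                ->push(['status' => 'OK', 'results' => []]),
        ]);

        (new PlacesLookup())->search('restaurants'); // hypothetical class under test

        Http::assertSentCount(2); // first page plus one follow-up, nothing more
    }
}
```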
Michael Tharrington

Ooof. 😅 Really happy that you can see the humor in this, and thanks for sharing!

Maybe you can pass on this story and they'll give ya some sorta credits for the pain ya went through here. 🤞

It would be a good publicity move, Google. We're watching... 👀

Derrick Grigg

Could you not have changed this one line

$new_results = Http::get('maps.googleapis.com ...

to

$response = Http::get('maps.googleapis.com ...

so the loop would work correctly by referencing the latest result set?

Daniel Stack • Edited

Full disclosure, I am a software engineer on Microsoft Azure Maps who built a feature to help address this exact problem because API calls == $.

My other suggestion, if you consider another product like Azure Maps, is to leverage the entry-level SKUs, which have upper limits that throttle requests. A second option is to look into the authentication support: SAS token authentication allows you to configure an upper bound on how many requests per second are allowed, which limits charges.

I have no affiliation with Google Maps or other competitors.

Ricardo Boss

Looks like you also need to use the correct next_key_token. You are using the initial next_key_token from the first request in your URL.

Jonathon Ringeisen

Good catch! I'm actually not using the Google Web Service anymore, so I won't be using this code, but it's still good to know.

Anwar

Thanks a lot for sharing your valuable experience with this! I see it resonated with other folks' experiences as well, including mine.

I definitely advise, even if your code is great, putting a hard-coded limit on the maximum number of calls, whether it matches the plan limit or something lower.

We ran into the same issue at my job, where one of the plans we subscribed to for a given service was mistakenly showing the limit as reached, but in the fine print we could actually go further by... paying extra per additional call (on top of the initial bucket of credits we had already paid for). Now that we have the hard limit, we get an error and can decide to raise the limit or just give up on that service until the next month.
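
One way to enforce that in application code is a counter checked before every call. A rough sketch using Laravel's cache (the limit, key, and endpoint are assumptions):

```php
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

$limit = 10000; // whatever matches the plan, or a safety margin below it
$key = 'places-api-calls:' . now()->format('Y-m'); // one counter per month

if (Cache::get($key, 0) >= $limit) {
    throw new \RuntimeException('Monthly Places API budget reached.');
}

Cache::increment($key);

$response = Http::get('https://maps.googleapis.com/maps/api/place/nearbysearch/json', [
    // ...same query as before...
]);
```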

RoestVrijStaal

You might want to cache the results from Google Places API for at least a day.

And use the same cached result for lookups at the same coordinates or within 5-10 meters of the given latitude and radius.
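
A rough sketch of what that could look like with Laravel's cache, keying on coordinates rounded to four decimal places (roughly 10 meters). Whether the API's terms allow caching at all is discussed in the reply below:

```php
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

// Rounding to 4 decimal places groups lookups within roughly 10 meters.
$key = sprintf('places:%.4f:%.4f:%d', $lat, $lng, $radius);

$results = Cache::remember($key, now()->addDay(), function () use ($lat, $lng, $radius) {
    return Http::get('https://maps.googleapis.com/maps/api/place/nearbysearch/json', [
        'location' => "{$lat},{$lng}",
        'radius'   => $radius,
        'key'      => config('services.google.places_key'), // assumed config key
    ])->json('results', []);
});
```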

Jonathon Ringeisen

As much as I would love to, Google's TOS don't allow for this.

RoestVrijStaal

But how would Google verify that?

In both cases, the servers of the Google Places API only see the IP address of your application server.

They just see it less frequently when you cache the results.

Kunal Rajput

I did a similar thing, costing me $500 with the Google Time Zone API.

I just sent them an email explaining it was a mistake, and they gave me a waiver on everything lol

ColdFox

It happened at my company as well. After a 2k€ bill, we set the max limit equal to the free-tier usage. This way we only pay if there's a need to increase the limit in any given month.

Junxiao Shi

Another error:
.status == "OK" should be checked before saving the results.
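
In the Laravel snippet that could look something like this (a sketch; $url, $query, and $results stand in for the code from the post):

```php
use Illuminate\Support\Facades\Http;

$response = Http::get($url, $query); // same request as in the post

// The Places API reports problems (OVER_QUERY_LIMIT, REQUEST_DENIED, ...)
// in the body's status field, so check it before saving anything.
if ($response->json('status') === 'OK') {
    $results = array_merge($results, $response->json('results', []));
}
```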

dimkiriakos

Pay as you go is a big trap for people. Be careful with these services; you are making the rich richer and yourselves poorer.

Jeremy Moore

Thanks for sharing. Lots of good advice going on.

tez123z

$7000 was my mistake. 😅

Not a bug in my case; I just overlooked the number of API calls attached to the external library I was using, and Google had just started charging at the time.