Care to share some painfully funny debugging stories?

John Alcher

Let me start:

I have a React + Django API setup that sits behind Nginx, which I'm trying to containerize with Docker. For some reason, React can't get past the API's CORS policy, even though I can manually open the API endpoints in my browser without any problems. I even tried using curl and HTTPie to issue the requests, and I can successfully get the response with the appropriate Access-Control-Allow-Origin header.

The culprit? uBlock Origin tags the endpoint /api/ping/ (which is used within the API) as an advertisement server. Big yikes!
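Here's a rough sketch of the kind of sanity check I was doing from outside the browser, using Python's requests library. The host and origin below are placeholders rather than my real setup; only the /api/ping/ path is the actual one.

import requests

# Placeholder host and origin; only /api/ping/ is the real path.
API = "http://localhost/api/ping/"
ORIGIN = "http://localhost:3000"

# What the browser does first: a CORS preflight (OPTIONS) request.
preflight = requests.options(API, headers={
    "Origin": ORIGIN,
    "Access-Control-Request-Method": "GET",
})
print(preflight.status_code, preflight.headers.get("Access-Control-Allow-Origin"))

# The plain GET also comes back with the expected CORS header, so the server
# was fine all along; the block happened inside the browser, courtesy of uBlock Origin.
response = requests.get(API, headers={"Origin": ORIGIN})
print(response.status_code, response.headers.get("Access-Control-Allow-Origin"))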

Top comments (11)

Gary Bell

I once spent 30 minutes trying to figure out why my debugger wasn't hitting my breakpoints, or getting to the code I was trying to debug.

I felt totally moronic when I realised I hadn't turned on the debug listener. Once I enabled the listener, I finally got to the code I wanted to debug, after finding yet another issue along the way.

John Alcher

Funny thing about these stories is that, in hindsight, it would be REALLY weird if what we were intending to do DID work.

Ben Halpern

The culprit? uBlock Origin tags the endpoint /api/ping/ (which is used within the API) as an advertisement server. Big yikes!

Roughly the same thing happened with our backend endpoint dev.to/admin/display_ads, which is where we add new stuff for the sidebar, like...

[image: ad]

But it didn't block the ad itself, it just blocked the backend page where we go to edit it (it would show as offline if we tried to visit, due to the way our service workers caught the error)...

It was a bizarre issue which didn't cause major problems, but having no idea why our admin area randomly displayed as offline made us a bit worried that something was going on that we didn't understand.

John Alcher

It really is a bizarre issue to encounter. What a way to find another entry for the long-running checklist in our debugging mental model.

Mykal • Edited

I had something relatively similar to your story happen to me as well, funnily enough 😅. It's not directly related to code, but I would definitely consider it debugging to some degree 😂

I typically work on web apps & websites, but one of my university term projects this year was a native app. My team and I chose to build an Android app.
One of the final steps in the project was to submit the app to the Google Play Store... As part of that submission, you have to sign the Google Ads ToS. For some reason, we just couldn't find the confirmation button. After literal hours of trying to figure it out, I turned off my adblocker and reloaded... turns out the confirmation button was inside an iframe from ads.google.com (a blocked domain).

I don't think I've ever felt so relieved and idiotic at the same time...

Raphael Habereder • Edited

Wow, I never would have thought about that, nicely done!
Imagine the pain if it was a centrally managed DNS adblocker like Pi-hole. Not even your local adblocker would have known there was a button :D

John Alcher

That would have been horrifying lmao. At least in my case, Firefox is kind enough to give a hint (from the dev tools) that maybe an extension of some sort is the cause of my problem.

Mykal

oh that would be terrible 😰
I definitely don't think I would have figured it out if that was the case.

John Alcher

There has to be a clause in the ToS we implicitly accept when we install these adblockers, stating that we'll have a "heart-dropping conundrum at least once while this extension is installed".

Mykal

LOL no kidding huh 😅

Raphael Habereder • Edited

When we switched to Kubernetes a few weeks back, we had quite a few services to migrate from our Docker Compose setup.

So we happily migrated with the utmost speed for our review, but one microservice didn't behave. Traefik, our ingress, just didn't see it. The IngressRoute was picked up fine, but pings and curls went nowhere. Our deployment was stuck in some kind of void.

I kid you not, I debugged this for a whole week straight. I did everything, from updating software versions to tearing down and setting up the whole environment again multiple times (thankfully this is completely automated by now).

I finally gave up and migrated a new service instead, just to have something to show in our review. I wrote my three k8s files and, as I expected, it worked smoothly. Which bugged me even more!

So I tackled the broken service again, and did a stupid vimdiff to compare it with a working service. Then it struck me, in colorful diff text. A label was wrong....

Since not everyone is familiar with Kubernetes, here are the three most important files of a Kubernetes deployment:

  • deployment.yaml
  • service.yaml
  • ingress.yaml

For Kubernetes to know that a Service belongs to a Deployment (more precisely, to the Pods it manages), you have to set matching labels as the glue.

Example:

apiVersion: apps/v1
kind: Deployment
spec:
  selector:
    matchLabels:
      app: prometheus
  template:
    metadata:
      labels:
        app: prometheus
<snip>
---
apiVersion: v1
kind: Service
metadata:
  name: prometheus
  namespace: monitoring
spec:
  selector:
    app: prometheus
---
apiVersion: traefik.containo.us/v1alpha1
kind: IngressRoute
spec:
  <snip>
    services:
    - name: prometheus

Guess what happens if the Service up there doesn't have app: prometheus in its selector, but ms: prometheus instead?
Kubernetes has no idea it belongs to the deployment with

matchLabels:
  app: prometheus

and routes requests to /dev/null. Guess what the reason was? A colleague had copied templates of those three files from some blog without checking whether the labels were correct. And dumbass that I am, I expected them to be correct and looked everywhere else...
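
In hindsight, a quick selector check would have caught it in seconds. Here's a rough sketch of such a check using the official kubernetes Python client (assuming it's installed and a kubeconfig is available); the service name and namespace come from the example above, nothing else is from our real setup.

from kubernetes import client, config

# Ask the cluster whether the Service's selector actually matches any Pods.
config.load_kube_config()
core = client.CoreV1Api()

svc = core.read_namespaced_service("prometheus", "monitoring")
selector = svc.spec.selector or {}  # e.g. {"app": "prometheus"}, or the broken {"ms": "prometheus"}
label_selector = ",".join(f"{key}={value}" for key, value in selector.items())

pods = core.list_namespaced_pod("monitoring", label_selector=label_selector)
if not pods.items:
    print(f"Selector {selector} matches no Pods at all. That's your void.")
else:
    print(f"Selector {selector} matches {len(pods.items)} Pod(s), labels look fine.")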

So that was my nightmare for a week.