Like many "rules of thumb" in programming, it's taught to beginners to prevent certain practices without properly explaining the concepts we're actually trying to impart. The practice in question, of course, is code that modifies variables that were not passed to it and are not "owned" by the object/library doing the modifying. Doing so makes the program incredibly hard to debug, its state hard to manage, and the code hard to refactor. Beyond that, it further complicates threading and forking.
Is there a place for "global" variables (however you want to define them)? Of course, but they should be wielded with caution and deliberation, not as standard behavior. Use them inside the code that creates them, in ways that validate their state/value without assumptions. And on projects that may one day become "mature," you'll have to accept that changing the functionality or purpose of a global variable will (potentially) have huge consequences across the entire code base. That's not something I expect a beginner to understand; therefore, "Global variables are bad!"
I don't like these rules of thumb because they get misapplied and misunderstood. You end up with programmers who think they're following the rules but still make the very mistakes the rule is meant to prevent, while also restricting themselves from using something that would actually help them.
It's why the definition of "global" is so important. If a programmer narrowly applies it to one specific variable type in their language, they may entirely miss the point. For example, they might freely use Windows registry values, DB values, or other shared stores to achieve the same thing, thinking it's alright since they don't have a mantra against it and it doesn't violate the "global" variable rule.
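Here's a sketch of that "global by another name" problem. The dict below (purely hypothetical names) stands in for any shared mutable store, whether that's a registry key, a DB row, or a config file:

```python
# Stand-in for an external shared store (registry key, DB row, etc.).
shared_store = {"discount_rate": 0.25}

def apply_discount(price):
    # Hidden dependency: the result changes if anyone edits shared_store,
    # even though nothing in this signature says so.
    return price * (1 - shared_store["discount_rate"])

def run_promotion():
    # Far-away code silently changes apply_discount's behavior.
    shared_store["discount_rate"] = 0.5

print(apply_discount(100.0))  # 75.0
run_promotion()
print(apply_discount(100.0))  # 50.0 -- same call, different answer
```

No `global` keyword appears anywhere, yet this has every problem the rule of thumb is warning about: spooky action at a distance, hard-to-reproduce bugs, and state that's invisible at the call site.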