DEV Community

Onorio Catenacci

On The Value Of Listening To Warnings

One common mistake I've seen repeatedly over the years is the assumption that compiler warnings (in those languages that provide them) are simply noise one can safely ignore. It's a bad habit to get into for a few reasons:

  1. It encourages the sloppy practice of letting warnings linger for month after month with no developer investigating the problem the warning indicates.

  2. It encourages a mentality of "it compiles, so it's right," which is almost never a good approach to take.

I was recently having a conversation with another developer about one of the most seemingly useless warnings: "unused variable."* Now, I'm unsure of the original reason for this warning, but it's not hard for me to imagine.

I believe (and may be wrong) that the notion of compiler warnings originated with the C language. The purpose of a warning was to flag code that is syntactically correct but may be suspect. Consider this code fragment:

if(x = 1)
{
  printf("It's 1\n");
}
else
{
  printf("It's not 1\n");
}

I'll bet more than one or two of you can spot the problem without trying very hard. This is legal C code: doing an assignment inside a test that way is almost never a good idea, but it is perfectly legal. You'd get a warning about it, but if you're in the habit of ignoring warnings, you may never bother to check it, because "it compiles, so it's right." When you look at a code fragment in isolation like this, what's wrong with it is usually pretty easy to spot. Now suppose the same if test occurred in the middle of a 200-line routine. Then, maybe not so easily spotted.

Ok, so that's all well and good but why is it (or would it be) useful to warn about unused variables? Consider this code fragment:

int addemup(int n, int m)
{
  int M = 1;
  /* code
     more code
     more code */
  return n + M;
}

Now, again, in isolation like this it's probably pretty easy to guess that someone should have written return n + m; (although that may not be right either). But the code is syntactically correct. Did the developer mean to ignore the parameter m? We can't tell from the context.

This is why I think the unused-variable (and unused-parameter) warning is one a lot of us should pay a lot more attention to. If you really don't need the parameter, then why is it being passed at all? You may say, "Well, there's a requirement to pass it due to old code that requires it." Consider that an odious code smell.

Code should be built intentionally, not accidentally. If you've truly got a routine with a parameter that it doesn't need then you're building code accidentally.

This, by the way, is an example of what I call code that works in spite of itself. This is when you run across code that makes you scratch your head as you're trying to figure out what the developer was doing and why they took such an odd approach. Did they mean to ignore the parameter passed in? The function is called "addemup" so it seems like it should add its parameters, right? The code doesn't seem to have any obvious problem but it just looks odd and you may have to spend a lot of time deciphering it.

Of course most code doesn't start this way; it acquires warts and oddities as it's maintained by multiple developers over the course of time.

In summary, not only should you not ignore warnings; if the option exists, treat them as errors so you cannot build your product until the warnings are addressed. You'll save yourself a lot of work, and your code will be easier for others to maintain.
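For GCC and Clang, treating warnings as errors is a single flag away (the file name main.c is just a placeholder):

```shell
# Enable common warnings (-Wall -Wextra) and promote them
# all to errors (-Werror) so the build fails until they're fixed.
gcc -Wall -Wextra -Werror main.c -o main
```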

Notes

As far as I can tell, assignment inside an if test is still legal according to the C standard; modern compilers such as GCC and Clang warn about it (for example, via -Wparentheses), and many projects promote such warnings to errors with -Werror.
