
What gives you a false sense of certainty/security/...?

_bigblind profile image Frederik 👨‍💻➡️🌐 Creemers ・1 min read

While driving on a small family trip this Sunday, my father remarked that advisory bike lanes (see image above) give people a false sense of safety. Advisory bike lanes are similar to normal bike lanes, but they're painted on a road that would normally be too narrow to have them. The idea is that cars in both directions use the space between the lanes, but may move into the bike lanes to pass cars going in the opposite direction.

Anything that looks like a bike lane makes a cyclist feel like it's a separate lane on the road, so they're safe. They don't think about the possibility of a car moving into their lane.

This got me thinking: are there other things, especially in the tech industry at large, that feel like they have a positive impact on security, productivity, company culture, and so on, but don't? I think these things can often even do damage, because we just assume their benefit is there.


Discussion

 

Honestly, unit tests. Even assuming 💯% coverage was achieved, how many of those tests are actually testing what they're meant to test, and what does that mean for the coverage statistic?

 

Unit tests are good at proving that the program does what the programmer says it's going to do.

 

Precisely, as humans make mistakes, tests also may be prone to the same (I am pro TDD to be clear, but this is always in the back of my mind).

 

Coverage is the biggest feel good fraud.

expect(() => foo.bar()).toThrow()

That counts as coverage, but it's (probably) useless.
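To make that concrete: a test can execute every line of a function, and thus register 100% coverage, without checking any of the function's actual results. A toy sketch (`divide` and `testDivide` are made-up names for illustration, not from any real codebase):

```javascript
// Hypothetical function under test.
function divide(a, b) {
  if (b === 0) throw new Error("division by zero");
  return a / b;
}

// This "test" executes both branches of divide(), so a coverage
// tool reports 100%...
function testDivide() {
  divide(10, 2);                      // happy path runs; result never checked
  try { divide(1, 0); } catch (e) {}  // error branch runs; error never checked
}

// ...yet divide() could return the wrong value or throw the wrong
// error, and this test would still pass.
testDivide();
```

Coverage tells you which lines ran, not whether anything meaningful was asserted about them.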

 

I totally agree. Unit tests are another false sense of security. Having 100% code coverage does not guarantee working functionality.

 

So the answer is to use humans to review code. But who really trusts humans to code? That's why we have tests.

 

There's an infinite number of things a piece of code could be doing, and in general tests only check that the code does the things we want. But there's another infinity of things we don't want that code to do that we're not testing for. The more I learn about how brittle and hard to understand large, complex systems are, the more I see the value of fuzzing, or property-based testing, for smaller pieces of them.
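The idea behind property-based testing is to assert properties that must hold for *any* input, then throw lots of random inputs at the code. A toy sketch in plain JavaScript (real projects would more likely use a library like fast-check; `sortNumbers` is a hypothetical function for illustration):

```javascript
// Function under test.
function sortNumbers(xs) {
  return [...xs].sort((a, b) => a - b);
}

// Random input generator: arrays of up to 20 integers in [-500, 500).
function randomArray() {
  const n = Math.floor(Math.random() * 20);
  return Array.from({ length: n }, () => Math.floor(Math.random() * 1000) - 500);
}

// Instead of checking a handful of hand-picked examples, check
// properties that should hold for every possible input.
for (let i = 0; i < 200; i++) {
  const input = randomArray();
  const once = sortNumbers(input);
  const twice = sortNumbers(once);
  console.assert(once.length === input.length, "length is preserved");
  console.assert(JSON.stringify(once) === JSON.stringify(twice), "idempotent");
  for (let j = 1; j < once.length; j++) {
    console.assert(once[j - 1] <= once[j], "output is ordered");
  }
}
```

This doesn't prove the code correct either, but it probes the space of inputs you didn't think to write examples for.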

 

After taking a course in wireless security, and following Taylor Swift on Twitter, nothing gives me a false sense of security. I no longer have any sense of security. Nothing is safe. Art is a lie.

 

Some less tech focussed ones...

Having a "permanent" job. No job is 100% secure.

Working in a big company. Once a company is big enough that you don't know everyone by name and what they all do there's a false sense of security that it must be part of someone's job to look after that important thing e.g. renewing the SSL certificates.

Job titles. Just because someone is a senior this or a principal that or a director of the other doesn't necessarily mean they know what they're doing.

Paying for something. If there's a license fee to pay that must mean the thing is better, more reliable, more secure, etc, right?

 

Sorry for going slightly off-topic... IMO, anything with a purpose can technically give us a false sense of security. Bike lanes make us think it's always safe to ride on them, when in fact it's not. Automated testing makes us think the code is foolproof and bug-free, while in fact it's not (depending on how you write the tests, the rest of the process, PR review sessions, etc.).

Maybe we should instead lower our expectations? When you ride on a bike lane, you know it should be safe to some degree, but not always, so you still keep an eye on your surroundings regardless of whether you're on a bike lane or not, and the same goes for the rest of the people on the road. When you're working in a codebase with 100% coverage, you have some confidence to ship or refactor the code, but not total confidence, so you still stick to best practices, improve the code over time, and pay close attention in PR reviews, regardless of whether the tests are doing their job or not.

Furthermore, making fewer assumptions is also critical. If you're not sure what degree of security something gives you, ask, or do some research. That way, you know what you're signing up for.

Just my two cents.

TL;DR: IMO we get a false sense of security because we didn't know enough and/or we expected too much.

 

While I tend to take a less pessimistic view of the industry (I have to; I just graduated, after all, and I don't want the last four years to be a waste), I more and more get the sense that everything we rely on on a daily basis runs on unicorn tears and a prayer. The left-pad incident comes to mind. I also get the feeling that there's an immeasurable amount of legacy code out there propping up just about everything, and nobody around anymore who knows it well.

On a more optimistic note, we've learned a lot from things like left-pad. We, as a whole, have made a push to write better code that's more maintainable and better documented. I think the open source movement has done a lot for that. We're always working toward more reliable and foolproof systems. Sure, we're still far from that, but there's no such thing as perfection, even though we're perpetually approaching it.