Dave Cridland

Honest Security

Honestly?

Not that long ago, I was in a company working heavily in cybersecurity.

One day, I started as usual, by opening my company-provided MacBook, and went to read the day's announcements. I'd just started to read—

The screen blinked off.

Surprised, I nudged the mouse, and sure enough, the screen came to life again, with a password prompt. Odd. I logged back in, found my place and started to—

The screen blinked off again.

What the heck?

The User as the Problem

Device Management solutions are pretty awful things. They enforce some arcane policy by changing your settings, usually without telling you. You, the user, have no control. In our case, we were a consultancy literally filled with experts in the cybersecurity industry, yet our laptops were working against us.

It was simply infuriating. In this case, a bug in the device management solution meant that, in enforcing a screen timeout policy, it set the timeout to one minute.

This meant that we were unable to work without gently moving the mouse near-constantly. Several of us gave up and downloaded the source for an open-source app that "jiggles" the mouse when it is left alone, defeating the errant software.
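
For the curious, the idea behind a jiggler is trivial. Here's a minimal sketch of the concept (not the actual app we used), assuming the third-party pyautogui package: if the pointer hasn't moved for a while, nudge it a pixel and back so the idle timer never fires.

```python
# Minimal mouse-jiggler sketch - illustrative only, not the app we actually used.
# Assumes the third-party pyautogui package (pip install pyautogui).
import time

import pyautogui

NUDGE_INTERVAL_SECONDS = 50  # act just before a one-minute idle timeout would trigger


def jiggle_forever() -> None:
    last_position = pyautogui.position()
    while True:
        time.sleep(NUDGE_INTERVAL_SECONDS)
        if pyautogui.position() == last_position:
            # The user hasn't touched the mouse: move one pixel right, then back.
            pyautogui.moveRel(1, 0, duration=0.1)
            pyautogui.moveRel(-1, 0, duration=0.1)
        last_position = pyautogui.position()


if __name__ == "__main__":
    jiggle_forever()
```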

If you think we were wrong, just bear in mind that we frequently had to give presentations to key customers. Having to change slides at least once a minute would be a challenging presentation style.

But fundamentally, this situation arose because in the security world, the user is not trusted or involved. They are seen as part of the problem - not part of the solution. Surely, in our case at least, our teammates were an asset?

In fact, aren't the staff always the front line for any organisation's security posture and device health?

The Insider Threat

All too many cybersecurity firms - those with impressive front pages with pictures of green-lit, hoodie-wearing hackers - like to talk about The Insider Threat. In capitals, just like that.

What they tend not to note is that the insider threat - while very real - consists almost entirely of people making honest mistakes. Trying to prevent mistakes by enforcing that they cannot be made has two problems. First, it is very complex - and, as we saw, prone to error. Second, it often damages the productivity of employees.

Surely the best way to reduce errors like this is by inclusion and education - turning your staff into a security asset, rather than a liability?

Surely security should be more than saying "No"?

Plenty of security experts have already found, for example, that the best way to reduce the effectiveness of phishing attacks is to send phishing attacks to users periodically, gamifying the task of spotting and avoiding them.

After all, this protects not only their corporate email, but their personal email too - and you can bet that a clever attacker will target that as well. By involving users in their own security, therefore, you are protecting areas that enforcement could never hope to cover.
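
To make the gamification angle concrete, here is a toy sketch - the names, sample sizes, and scoring are all made up for illustration, and real programmes use dedicated tooling - of what a periodic phishing drill might boil down to: pick a random sample of staff, record who reported versus clicked the simulated phish, and keep a friendly leaderboard.

```python
# Toy phishing-drill scorer - an illustrative sketch only; every name and score
# value here is an assumption, not a description of any real tool or programme.
import random
from dataclasses import dataclass


@dataclass
class Employee:
    email: str
    score: int = 0  # gamification points for spotting and reporting phishes


def run_drill(staff: list[Employee], sample_size: int, outcomes: dict[str, str]) -> None:
    """outcomes maps email -> 'reported', 'clicked', or 'ignored', as gathered by
    whatever simulated-phishing tooling is in use."""
    for target in random.sample(staff, k=min(sample_size, len(staff))):
        result = outcomes.get(target.email, "ignored")
        if result == "reported":
            target.score += 10  # reward spotting and reporting the phish
        elif result == "clicked":
            target.score = max(0, target.score - 2)  # gentle nudge, followed by training


def leaderboard(staff: list[Employee]) -> list[Employee]:
    return sorted(staff, key=lambda e: e.score, reverse=True)


if __name__ == "__main__":
    staff = [Employee("alice@example.com"), Employee("bob@example.com")]
    run_drill(staff, sample_size=2, outcomes={"alice@example.com": "reported"})
    for person in leaderboard(staff):
        print(person.email, person.score)
```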

Working from Home - and back again

As "Bring Your Own Device" and working from home builds momentum, the lines between corporate security and personal security blur to an unprecedented degree.

Just as we don't want our employers to gather information on our home lives systematically, we obviously don't want them to gather information on our personal devices without our understanding and consent.

For companies with staff in Europe, California, and other places around the world, this is a matter of more than idle concern. The GDPR makes gathering personal data without consent illegal. Perhaps worse, it requires companies to provide the data they do collect back to the user on demand.

Clearly, then, the old model of blind draconian enforcement isn't sustainable, even if it were desirable.

Security Leadership

What's needed is a model of corporate security that works in the best - and most effective - traditions of leadership. As security leaders, we should draw our users with us, rather than trying to corral and drive them from behind.

We need to reset the relationship users have with security. We can transform it into a positive force for not only the risk management of the company, but the personal safety of those we work with.

This will make our users happier - and perhaps even more productive. But it will also reduce the risks from security failures to the company as a whole.

Honest Security

Thoughts like these are behind the emergence of a new model of corporate security - "Honest Security". Built around concepts like consent, transparency, and inclusional security practice, the intent is to reverse the adversarial posture of security versus user.

I am not, I admit, the least cynical person on the planet. In the cybersecurity world, there's plenty to be cynical about, after all. I'm fully expecting a series of companies to jump on this bandwagon in name only.

But if the outcome is that security becomes less of a barrier and more of an enabler, I'm all for it. If this is a buzzword, it's a buzzword to watch.

Top comments (5)

Chad Perrin

This is quite good. Thanks for sharing it. Two things:

  1. I suspect what you wanted there was: s/inclusional/inclusionary/

  2. How do you feel about resharing (in contexts where a link back here isn't practical) under a copyfree license?

Dave Cridland
  1. No, I definitely meant inclusional, which encompasses the meaning of inclusionary but is more, well, inclusional.
  2. Nope. Realistically this site is open, so anyone can easily provide a link here.
Chad Perrin

Not all contexts are on the web, but okay.

Jason Meller

We need to reset the relationship users have with security. We can transform it into a positive force for not only the risk management of the company, but the personal safety of those we work with.

Bingo.

I think there are a lot of security practitioners reading this that likely undervalue their knowledge and underestimate how much it can improve the lives of the folks they work with. It's too much work to do manually, but if this knowledge can be codified and dispensed dynamically when it's most needed, it will have a major positive impact on individuals and, ultimately, improve security at their company.

Dave Cridland

I think the current nature of the cybersecurity market - and I say "market" in both the everyday buying-and-selling sense and the more purist sense an economist might use - is so fraught with smoke, mirrors, and snake oil that the average person can't tell the good advice from the bad without help. That help is often simply absent.

I wrote about this years ago from the perspective of national cybersecurity communities - Asymmetric Information in Cyber Communities - but it's equally true at the smaller scales of corporate and home security. It's something that's being picked up by the mainstream, slowly - a "market collapse" in cybersecurity is a very bad thing indeed, and people are understandably nervous that it might be occurring.

The fact is, your corporate security practitioner should be your most trusted source of what best practice really is. And sadly, all too often, corporate security ends up being a tiresome stick we are beaten by instead. That has to change - it must change - and it has the potential to effect a step change in the quality of our security.