Security Theatre and Practice

Before I engage in what will be my personal lambasting of what is commonly coined security theatre, let me preface this by clarifying that I have no doubt many who work in InfoSec (Information Security) are well-intentioned individuals who want to carry out their role as well as possible. With that out of the way, unfortunately I have little else positive to say about the majority of these seat-dwellers.

Security is hard; trust me, I’ve worked on it in some of the most testing environments you can think of - defending infrastructure, and the people behind it, from even state-orchestrated action. I am not talking about comparatively benign threats such as ransomware in these cases, but actual field experience where the lives of dissidents and activists are on the line. Such scenarios are undoubtedly technically exciting, but as with so much in security, one should never lose sight of the real objective and the impact the work has on real people.

Standards and Compliance

Let this not be seen as an attack against all compliance and standards procedures, but most are utterly useless. Companies that are truly secure typically embed security into their culture, rather than treating it as a series of checkboxes provided by the “compliance officer”. In setting these standards, not even the big names can agree on the advice given. The defensive branch of GCHQ, known as the NCSC, advised against mandated password rotation policies, while the PCI-DSS group still recommends this practice despite research over the past three years showing that it actually reduces security.

Taking the password example again, PCI-DSS, among many other organisations, recommends that “strong cryptography” is used to protect passwords. Not only is this statement extraordinarily vague given the variety and complexity of modern cryptography, but it can easily lead to misunderstandings even among qualified and well-versed engineers. For example, AES-256 is an example of strong cryptography, but it is reversible, and so as soon as the cryptographic key is exposed it becomes a meaningless measure.

It would be more responsible to modify the advice to “strong cryptographic hash functions”, at which point I could recommend a few such as Argon2, PBKDF2 or Bcrypt. Such hash functions will protect your passwords even if the data is copied, and will give the company time to discover the breach and advise users to change their passwords and other authentication credentials. They also make many efforts to brute-force the passwords futile.
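
To make that concrete, here is a minimal sketch of password hashing and verification. It uses PBKDF2-HMAC-SHA256 from Python’s standard library (Argon2 and Bcrypt require third-party packages), and the iteration count and salt size are illustrative assumptions rather than a recommendation for any particular deployment.

    import hashlib
    import hmac
    import os

    # Illustrative parameters only - tune the iteration count to your own hardware.
    ITERATIONS = 600_000
    SALT_BYTES = 16

    def hash_password(password: str) -> tuple[bytes, bytes]:
        """Return (salt, derived key) for storage; the raw password is never stored."""
        salt = os.urandom(SALT_BYTES)
        key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
        return salt, key

    def verify_password(password: str, salt: bytes, stored_key: bytes) -> bool:
        """Re-derive the key and compare in constant time."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
        return hmac.compare_digest(candidate, stored_key)

    salt, key = hash_password("correct horse battery staple")
    assert verify_password("correct horse battery staple", salt, key)
    assert not verify_password("wrong guess", salt, key)

Even if an attacker copies the stored salts and keys, they still have to brute-force each password through the slow derivation, which is exactly the breathing room described above.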

The Wannabe InfoSec Managers

All too often, my experience of compliance and security managers is that they have failed on a technical level and have instead opted for the paperwork route to advance their career. Those in charge of security should be at the forefront of technology in their field, and should fully understand technical advice and best practices. If you don’t have at least three years’ equivalent experience as a Windows, Linux or network administrator and can’t do some routine sysadmin tasks, then you are simply a mouthpiece for the advice of other people.

This isn’t to say you can’t learn, but certificates are a clear mile away from what I am recommending here. Go sit down with your sysadmins, network engineers or a good bunch of penetration testers, see what their everyday practice is like, and try to understand at least a summary of their role. If you have to continually justify your own role by introducing new measures and initiatives without considering their everyday impact, then you are doing it wrong. Security is a trade-off between usability and good practice, and if you fail to make your policies usable and sensible, then users will go out of their way to avoid them.

Calling yourself a security expert, and being completely unable to explain the importance of a nonce in cryptography (with applications in attacks such as Logjam) or other such details, is an easy way to get laughed out of any technically competent team.
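
As a purely illustrative aside (my own sketch, not a claim about how Logjam itself works), the snippet below shows one reason the nonce matters: reusing a nonce with AES-GCM means both messages share a keystream, so XORing the ciphertexts leaks the XOR of the plaintexts. It assumes the third-party Python cryptography package is installed.

    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)          # a nonce must only ever be used once per key
    p1 = b"attack at dawn!!"
    p2 = b"retreat at dusk!"        # same length as p1 for a clean comparison

    aesgcm = AESGCM(key)
    c1 = aesgcm.encrypt(nonce, p1, None)   # returns ciphertext || 16-byte tag
    c2 = aesgcm.encrypt(nonce, p2, None)   # same nonce reused - this is the mistake

    # GCM encrypts with a keystream, so the reused nonce cancels out:
    leak = bytes(a ^ b for a, b in zip(c1[:-16], c2[:-16]))
    assert leak == bytes(a ^ b for a, b in zip(p1, p2))

If you can explain why that assertion holds, you already understand more about nonces than many who carry “security” in their job title.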

Reward Outcomes - Not Talk

I have always been a huge advocate of bug bounty programmes, especially where they are well implemented and sufficiently rewarding. Platforms for this are now springing up all the time, with one of the best known being HackerOne. These programmes reward actual finds and incentivise the good guys to find and report your vulnerabilities before they are exploited.

Consultants have a place, but they are often relied upon to provide advice which is subsequently treated as gospel. If you hire a consultant who will talk to you about security and doesn’t mention bug bounties and red-teaming, fire them and find another one. Likewise, if you have internal security teams who aren’t running regular security tests against your infrastructure, then they aren’t doing their job.

If your company genuinely believes you are perfectly secure, put your money where your mouth is, open a $100,000 bounty on HackerOne, and see how long you last.

Security Culture

In most organisations I have worked with, almost everybody I have spoken to about security has admitted to knowing of a vulnerability or issue within their organisation that has not been addressed, or has not been reported. I liken this to the legal concept of Willful Blindness, a subject on which a wonderful lady called Margaret Heffernan delivered a TED talk a few years ago. It is well worth a watch.

These issues are only addressed where there is an incentive to report them, which can mean everything from offering rewards to ensuring that every person in the organisation knows they will not be negatively impacted for reporting such issues. Too many companies out there threaten legal action against whistleblowers, but in my experience such individuals are often the most loyal people there are, who want to address these issues to help the company, not to harm it.

Question Everything

To summarise everything I have written so far, it could best be described by a motto such as “question everything”. No single technology, measure, person or product can ever provide wide-ranging security. Best practices can be developed and guidelines released, but if you aren’t testing them all the time and incorporating secure practices into your culture, then they are useless. Security, like all things, needs to be usable and evidence-based to be effective - start with that in mind and you can’t go far wrong.