Hackers are more con artists than nerds these days.
Instead of probing networks and sending viruses, they impersonate co-workers, stage false urgency to provoke impulsiveness, or entice frustrated teams to reveal their secrets.
To do this, hackers use the entire range of human weaknesses: they create stress and distractions, they appeal to obedience to authority and ambition, and they exploit shame and arrogance.
Cybersecurity risks today are less about technology and more about behavior.
82% of all cybersecurity breaches involve the human element.
(Source: Verizon Data Breach Investigations Report 2022)
Yet cybersecurity remains nerdy and oblivious to 75% of this human risk.
Aviation, the nuclear industry, and the military manage twelve human risks. Most of these unfold outside of technology: 75% of them have more to do with management, team dynamics, and working conditions than with tools.
By actively addressing all twelve aspects of human risk, high-consequence industries have mastered it: flying, for instance, is the safest form of transportation, even though it is also the one where the most things can go wrong.
Cybersecurity, on the other hand, addresses only three of these risks:
- a lack of knowledge about threats and how they work
- a lack of awareness of the signs of a threat
- a lack of resources, i.e., wrong or poor tools
Why only these three? Because these are the ones that can be solved with technology: with interactive training systems, with flagging systems that raise awareness of potential threats, with better devices or software.
Cybersecurity providers don't consider fatigue. They don't talk about stress. They can't hold a conversation about the causes of complacency. There's a massive capability gap.
An example: when employees don't lock their computers because their devices take too long to unlock, the devices get replaced. But when they fail to lock their computers because they are distracted and exhausted, the risk remains unaddressed, simply because there is no technical pathway to mitigate it. Yet both situations pose exactly the same danger.
adair uncovers non-technical cybersecurity risks...
adair is the first software that can reveal all behavioral cybersecurity weaknesses of a team and show how to eliminate them.
It is a self-contained, intuitive, and automated tool with a measurement accuracy of ±4%.
... and reveals how to get them under control.
Too many diagnostic and analysis systems leave users alone and overwhelmed by data, without supporting them in their search for actual solutions.
adair instead leverages the latest academic research on critical pattern recognition to determine which teams require the most immediate or intensive attention. Our implementation is state of the art and was supported by a major research grant from the German state of Brandenburg.
Once it is clear which issues have priority, adair provides an extensive repository of explanations and recommendations on how to control the risks that it has revealed.
It's 2024. Behavioral science is no longer soft and fuzzy; it is accurate and precise. We make use of that.
Both the measurement methods and the recommendations behind adair come from reputable, publicly available, and widely accepted behavior analysis methods that have undergone extensive peer review. No buzzword bingo, no ideologies, no “proprietary research”: adair contains only proven practices and hard science.
Hundreds of thousands of studies with millions of participants.
A century of research on human risks meets 60 years of research on quantifying behavior.
adair is so easy to use that you'll master it in 20 minutes.
It measures and visualizes team realities that are relevant to cybersecurity …
… explains in detail what they mean and what their impact is …
… and then provides clear, pragmatic and highly effective instructions on how to shape them.
Team by team.
“If the airline industry had the same problems as cybersecurity, we'd have about 70 planes falling out of the sky every day. [...] The persistence of human performance problems in cybersecurity exists due to an under-education on human factors. [...] Now is the time for the cybersecurity industry to get serious about human factors.”
Dr. Calvin Nobles
Chair of the Information Technology and Management Department, Illinois Institute of Technology