Social engineering describes a range of malicious cybersecurity activities accomplished through psychological manipulation to trick users into making security mistakes or giving away sensitive information.
Simulated attacks are technical exercises that emulate the tactics, techniques and procedures of a real attacker, helping you to understand how well your incident response plans hold up. For most adversaries, simulating the whole attack chain means targeting not just the technology, but also the people involved.
People are often talked about as the weak link. While it is important to understand human fallibility – the ways people will fail, be fooled or be tricked – we also have a moral and ethical obligation to look after those we target and to avoid causing undue distress.
Would I Lie to You?
Social engineering is a sterile term that covers many different bases and sounds more professional than alternatives such as ‘lying to people’, ‘abusing trust’ or ‘betraying relationships’. But if we are to accurately simulate an adversary and their activities, we need to adopt these tactics too. It’s easy to overlook those on the receiving end – people with lives outside work and powerful feelings and emotions. So, when we are simulating advanced adversaries, we should not lose sight of the impact on our human targets.
Anybody involved in an urgent response to a cybersecurity incident will be familiar with the heightened emotions. People work long hours, adrenaline flows and the response team is under pressure. When making incident response plans, companies often consider the wellbeing of responders – ensuring they take breaks, eat, rest and have support – but possible ‘collateral damage’ is easy to overlook.
Imagine someone caught up in a simulated incident: in email and telephone contact with a person she believed to be a customer, building a relationship over several days, only to be targeted with malware as part of the simulation – opening a received document that compromised her workstation. This was a real case.
For the internal IT team, she was ‘Patient Zero’. They asked her what had happened, what had been said and what had been done in a sophisticated attack – involving a closely-registered domain and telephone conversations with a non-existent person – that had exposed the company to compromise.
No cybersecurity professional would expect an individual employee to defend against social engineering of that sophistication. What mattered was how the employee reacted: devastated to have been the cause, questioning whether she should have known better and whether she could trust her judgement elsewhere. She was deeply upset and blamed herself.
Reality Versus Simulation
When the incident was revealed to be part of an exercise, she was relieved that her error hadn’t caused any real damage, but also resentful and angry about the anxiety and upset her employer had put her through.
Was it wrong to conduct the adversary simulation and target the employee?
It’s a grey area that cybersecurity practitioners must navigate, balancing realistic simulation of adversaries who operate without ethical constraints against protecting those targeted from unnecessary distress.
When employing social engineering methods, cyber-criminals use whatever tactics give them the best chance of success. Most rely on a ‘hook’ – something that causes a victim to engage, luring them into clicking on a link or downloading a document containing malware.
The hooks with the greatest chance of success often have the greatest emotional impact. At the start of COVID-19, we saw huge changes in the hooks used by adversaries: mass phishing campaigns exploiting virus information. Fear and anxiety are powerful emotions that make a person less likely to think clearly.
Ethical decisions need to be made about the hooks we choose. Consider an employee posting on social media about their difficulty conceiving. This is information adversaries could use, and a hook this person is very likely to bite on: a new fertility treatment, or a change in employment benefits covering the costs of fertility treatment. The emotional distress could be devastating. A criminal might have no qualms about doing this, but it would be unconscionable for a cybersecurity professional to do the same. Most of the time, though, the possible impacts of a hook are less obvious.
Trust Equals Security
Every time a social engineering hook is chosen, there is an obligation to consider its emotional impact. Trust between employer and employee is critical to a positive and successful security culture. The most successful security programmes instil a positive, supportive culture of reporting, which is invaluable for resilience. People are the strongest asset here.
Breaching employees’ trust by neglecting the care of victims can damage the very relationships you rely on for the protection of assets and data. Whilst social engineering is necessary to support attack chain simulation, we must be mindful of the human cost and take steps to ensure harm to victims is minimized. This is a shades-of-grey landscape, but we forget the humanity of our targets at our peril.
Credit: Gemma Moore, Director, Cyberis