Human hacking explained.
Social engineering sounds like the kind of benign term that might signify some 1960's think tank strategy to improve society with science. Alas, it is not.
Social engineering is essentially the security term for hacking the human.
It has long been said that when looking to breach an organisation, the technology bit is hard, and getting harder. It is much easier to hack the human as they are all still on version 1.0.
Unfortunately, this is a position supported by one of the great practitioners of the art of social engineering, Jenny Radcliffe – also known as ‘the people hacker’.
Speaking to TechPro before a recent event for mobile and security specialist CWSI in Dublin, Radcliffe said social engineering, and manipulating people, is the oldest trick in the book.
“The reason that social engineering never goes away is because people are easier to attack than the technology,” said Radcliffe.
On her website, Radcliffe describes what she does as “the manipulation of people, through non-technical means, in order to gain access to information, data, finances or physical sites”.
“What a good social engineer knows is automatic behaviours as human beings,” she said. “Everyone has fears and desires and so can be manipulated—it doesn’t matter what your job title is.”
This practice is carried out as part of a full security audit of an organisation, or in conjunction with a penetration test, or pen-test. It is used to establish the level of awareness and capability to defend against an attack.
As phishing, and spear-phishing in particular, become ever more popular attack vectors, raising awareness so that people can protect themselves, and thus the organisation, is necessary. Radcliffe says that attackers are also looking to cultivate insiders, whether unwitting or not, to work on their behalf.
“Social engineers will try to grow an insider threat,” said Radcliffe.
Social engineers are always looking to get someone on the inside working for them, even though the person may not know that, she said. But there is also what Radcliffe terms the ‘disenchanted’—those employees who feel undervalued, put-upon or just disgruntled. Radcliffe characterises the insider threat under the headings of “mistake, mischief or malice”.
“It is not too difficult to persuade someone who is unhappy, feels ill-treated, is on minimum wage or is leaving the company anyway, to provide access to things they shouldn’t,” Radcliffe argues.
“The incidence of the inside threat is increasing, which is something many industries don’t want to talk about,” she warned. “Social media often enables that.”
In fact, Radcliffe said that although she has been doing this kind of work for many years, recent developments have made parts of her job much easier.
“Technology, and the Internet in particular, have speeded up the intelligence gathering period of a social engineering attack, as well as the scale,” she said.
This allows her to cast the net much wider than might previously have been the case.
During her talk, Radcliffe walked through a number of case studies demonstrating her techniques.
She said that it is imperative to present the familiar to a mark, to gain trust. In one instance, the mark’s Facebook page revealed enough information to point to an offspring’s profile. So although the mother had locked down her profile, the son’s revealed much, including an ageing pet dog. Some research soon revealed a rural address, a nearby village and a local veterinarian and pub. Spoofed emails for an unpaid vet’s bill and a query regarding a surprise party at the pub before an impending birthday had the mark clicking on attachments with payloads designed to extract information—human hacked.
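The spoofed emails in that case study depend on a forged sender address looking plausible. The article does not describe any defensive tooling, but one common, if partial, red flag that mail filters check is a mismatch between the visible From domain and the Return-Path (envelope sender) domain. A minimal sketch of that check, with invented example domains:

```python
from email import message_from_string
from email.utils import parseaddr

def sender_domains_mismatch(raw_email: str) -> bool:
    """Flag a common spoofing tell: the visible From domain differs
    from the Return-Path (envelope sender) domain. A mismatch is not
    proof of spoofing, just a reason for extra suspicion."""
    msg = message_from_string(raw_email)
    _, from_addr = parseaddr(msg.get("From", ""))
    _, return_addr = parseaddr(msg.get("Return-Path", ""))
    from_dom = from_addr.rsplit("@", 1)[-1].lower()
    return_dom = return_addr.rsplit("@", 1)[-1].lower()
    return bool(from_dom and return_dom and from_dom != return_dom)

# Hypothetical spoofed message: the displayed sender is the vet's
# practice, but the envelope sender belongs to the attacker.
spoofed = (
    "From: billing@village-vets.example\n"
    "Return-Path: <bounce@attacker.example>\n"
    "Subject: Unpaid bill\n"
    "\n"
    "Please see attached invoice."
)
print(sender_domains_mismatch(spoofed))  # True: domains differ
```

A determined attacker can align both headers, which is why, as Radcliffe argues, awareness remains the more durable defence than any single technical check.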
However, despite the white-hat nature of such actions, they can have serious consequences.
Radcliffe said that as part of such exercises, she always provides a debrief.
“We talk to them and tell them that they are not stupid, they are just human,” she said. “With a bit of research, and the right attack vector, anyone can fall for a social engineering attack.”
Many people feel foolish when it is shown how they were hacked, which can prompt extreme reactions, such as resignations. The reaction, she said, often depends on the organisation’s culture.
“If people feel like they are being blamed, that is counterproductive,” said Radcliffe.
Victim blaming in the information security industry has previously been highlighted as detrimental: it often makes people reluctant to report hack attempts or incidents, or leaves them feeling they have no option but to resign after failing the likes of Radcliffe’s tests.
“Things are changing,” she said, “but what goes on in public is not necessarily what goes on in private. The blame profile of an organisation is not always explicit.”
Referring to the human element of security as the weakest link is also counterproductive, even when the framing is subconscious, she adds.
Radcliffe said that while there is now wider knowledge and awareness of social engineering and its techniques in the world of information security, there is a dearth of knowledge in the wider business world.
“In non-technical conferences where I speak, it’s often a complete surprise to them to hear about social engineering.”
“Strangely enough,” she said, “the social engineering element of the cyber-threat story is the easiest for the general public to grasp, rather than the technical: the idea of one person targeting another is all too familiar.”
With increasing awareness, social engineering will get harder, but it is here to stay, in Radcliffe’s view.
“Social engineering leaves the lightest footprint possible in any attack. You can effectively get in and get out with almost no technical footprint, which means it remains very powerful,” Radcliffe warned.
However, she also said that guarding against social engineering need not be expensive. “It’s about awareness. If people are aware that these attacks are possible and that they can prevent them by being a little more suspicious and careful about how they do things, you will slow them down.”
“At least if people understand how these things work, it opens people’s perceptions for picking up on things,” said Radcliffe.