(With apologies to Pogo.) For all the great work we are seeing in improving both software and hardware security, we – not the technology – remain in many ways the weakest link in the security chain. Recent reports indicate we are surprisingly easy to fool, despite our much-proclaimed awareness of the risks.
In a recent experiment at FAU, over half of email recipients clicked on links from an unknown sender, as did 40% of Facebook users, even though roughly 80% of participants said they were aware of the risks of clicking on unknown links. The emails were engineered with a sender name common in the target group and claimed the link led to pictures taken at a recent party. My inference is that the appeal of seeing who was there, and whether the recipients themselves appeared in any pictures, was more immediately compelling than the risk they might incur.
Separately, a study at BYU found that security messages were ignored by nearly 90% of users when they appeared at an inconvenient time (while typing, watching a video, etc.). The researchers' inference is that (surprise, surprise) we're not nearly as good at multitasking as we think we are, and (my inference) we tend to prioritize the task at hand, not the interruption.
Of course there are well-known problems with USB flash drives and related products. Most of us will still happily pick these up at trade shows, conferences and other promotional events. We all too frequently use them ourselves or hand them out to friends and family. Yet vendors buy such devices as cheaply as they possibly can (wouldn't you, if you were going to hand them out like candy?), sourced through murky distribution chains, despite proven examples of pre-installed viruses. One instance at mobile network provider O2 was particularly embarrassing.
Then there are the “accidentally” dropped USB sticks that many of us will happily pick up. This story at times seems like a viral meme, but a recent study confirmed the behavior is very real and very current. Researchers dropped roughly 300 flash drives around the campus of the University of Illinois. Nearly half of the drives were picked up and connected to a computer, the first within six minutes. Motives were apparently altruistic (trying to find the owner so the drive could be returned), especially if keys were attached to the drive. Only a small percentage of users who opened a drive first checked it with anti-virus software.
The point of all this is that, while improvements in hardware and software security are very necessary, as these walls get higher evil-doers will naturally turn (and are already turning) to easier paths. There is no better target than the flaws in human architecture, which aren't easily fixed by brilliant hardware and software innovations.
We might imagine that we ourselves are too clever and too wise to be tricked in this way, but really, none of us is justified in that level of confidence. A sufficiently interesting or convincing email could fool the best of us, especially if we're even slightly distracted. And unless we want to retire into a life of paranoid seclusion, we will continue to share data and links with family and friends who may not be as vigilant as we are (or perhaps we are not as vigilant as they are?).
So while we’re busy engineering even more impenetrable walls around our hardware and software, spare a thought for how we might improve security in the weakest link – us. The FAU study is HERE, the BYU study HERE, the O2 incident HERE and the USB study HERE. For an entertaining list of some of the best social engineering attacks, see HERE.