In a recent conversation with Chris Hadnagy, which will be released as a podcast shortly (and trust me, you're not going to want to miss this episode!), we tripped over a little trick that malware authors use which just makes my blood boil.
Since the bad guys often rely on the end-user's lack of awareness and knowledge, you half-expect some of the dirty tricks, like a brilliant, convincing web page that looks just like your antivirus software... or something equally dastardly. But there's another trick Chris brought up that made me crazy.
Here's the deal - how many of you reading this right now have read the entire EULA (end user license agreement) on the last piece of software you installed?
It's OK, no one's looking. If you're anything like me, you probably skimmed it or skipped it altogether, opting instead for the 10 minutes of time savings just to get your software installed.
The bad guys count on this, and into their sketchy software's license agreement they slip terms that lend their criminal activity a veneer of legality. And it's all because you clicked "I accept".
Here's how the scene plays out...
- end-user lands on a site/page where a script has been planted
- end-user gets a pop-up (most likely) or some kind of pop-over/under
- 'warning' page attempts to scare users into downloading/executing script or app << the setup
- end-user is scared into clicking/downloading/executing what appears to be a legitimate application
- end-user skips the EULA by "clicking through" ... agreeing to be charged by the application maker, to have their computer intruded upon, or to have it held for ransom
- bad things happen, likely a 'pay us or else', or worse...
In case you missed it, the end-user did agree to have their computer or device pillaged and ransacked. This little trick makes me crazy, primarily because it's really the fault of the end-user for not knowing what they were getting into. But can you really blame the user?
Whose job is it to make sure the user is protected from themselves?
Let's apply that back to corporate or enterprise security ... the answer to the above question is you, Mr/Ms InfoSecurity Manager. So how do you do it?
You have options....
- Lock down corporate devices to make sure end-users can't install/modify the system in an undesirable way. This has its drawbacks: users will work against you to find ways to get their favorite anti-productivity widgets onto that machine, or you'll end up with employees who get out of the habit of working off-hours...
- Educate your employees to make sure they're smarter than the average bear and can tell when they're being phished, hooked into malware, or otherwise bamboozled. This is a lot easier said than done, though - so good luck.
- Provide virtualization, where the end-user does work on a locked-down, work-only machine while a virtual machine handles personal endeavors. It's an interesting crapshoot whether people actually keep the two worlds separate.
See, there really are no good defenses against human behavior and our desire to 'just move on' and save time. I know I haven't been fantastic about reading EULAs in the past, but I'm reading every word now. Whether I actually understand all of it is another question, but hey, at least I'm making the effort, right?
I don't know what the 'right' answer is, or if there really is one, to the human condition of vulnerability. What I can tell you is that exposing these dirty little tricks where we find them, and preaching about them from the mountaintops, is probably the most effective way to scare typical end-users into being more careful with your corporate data, and with their own lives.
Cross-posted from Following the White Rabbit