Putting a New Spin on Computer Security
By Matt Windsor

Nitesh Saxena (right, with graduate student Babins Shrestha) leads UAB's SPIES research group, which is testing everything from brain scans to "playful security" to keep users safe online.

Computer security researchers put themselves into the minds of cybercriminals to figure out what they might do next. Nitesh Saxena, Ph.D., takes a different approach. His mission is to get inside the minds of users—quite literally, in his latest project—to figure out how to protect them from new attacks. Saxena is the head of the SPIES (Security and Privacy in Emerging Computing and Networking Systems) research group in the UAB Department of Computer and Information Sciences. “Most traditional security research focuses on the attackers,” Saxena says. “We work on the defense side, with an emphasis on the end users.”
The SPIES lab puts the “strengths and weaknesses of the computer user” under the microscope, Saxena explains. Or under the brain scanner, to be precise. In one new project, Saxena has partnered with Rajesh Kana, Ph.D., a researcher in the UAB Department of Psychology who specializes in using brain imaging for autism research. The interdisciplinary duo has started scanning volunteers while they perform everyday security tasks. The subjects have to decide whether the sites they are looking at are real or fake—the actual Facebook home page or a knockoff, for example—or they are asked to heed a security warning while reading an article.
“We want to understand, from a neuroscience perspective, what happens when people are making these security decisions, and especially what happens when they are rushed into making decisions, as often happens online,” Saxena says. “We are still in the early stages, but this may give us clues on how to design warnings and safeguards that are more effective.”
The Person Is the Problem
It’s a well-known fact, Saxena says, that users themselves are the greatest security flaw. “When you want to log into a website, you really want to pay bills or read e-mails or listen to music,” he says. “If it looks like Facebook, you will log in with your password—even if it is actually a phishing site. People don’t scan sites well; they don’t pay attention to warnings. We say we are concerned about security online, but our behavior often runs counter to that. People just don’t care.”

In a new project, Saxena has partnered with Rajesh Kana in the Department of Psychology to scan subjects with fMRI machines as they perform security tasks.

Saxena’s team, which includes a high-energy group of undergraduates and graduate students, spends much of its time dreaming up ways to trick people into caring (a paradigm called “playful security”) and the rest coming up with answers to security issues that most experts—even the cybercriminals themselves—haven’t thought of yet.
The SPIES are particularly interested in emerging systems, such as location-tracking RFID tags and cloud computing: new devices and technologies that are gaining traction but haven’t reached the mainstream. “If you target these systems, start thinking about the security and privacy issues they bring, and develop solutions before they are widely deployed, you can really make a difference,” Saxena says.
The gadgetry crammed into today’s phones—cameras, GPS, motion-sensing accelerometers, and more—is already being exploited by criminals, Saxena says. Consider, for instance, those super-cool readers that allow vendors to accept credit-card payments through a smartphone. You may have seen them at the coffee shop or your local auto mechanic. They could be a major target, Saxena says. “What if you designed a program that turned on the cellphone camera every time a person swiped a card on their phone, then sent those pictures directly to you?” he explains.
Another question about these readers is whether users perceive them to be as safe as the traditional card readers at places like Walmart. Saxena has teamed up with David Schwebel, Ph.D., a faculty member in the Department of Psychology, and John Sloan, Ph.D., chair of the UAB Department of Justice Sciences, to investigate this question of consumer psychology.
Proactive Protection
The SPIES are making a name for themselves with their proactive approach to user protection—developing innovative solutions to criminal attacks before they enter widespread use.

Take, for example, the phantom-dialing smartphone app. Cunning criminals have developed apps that seem like innocent games. But when the phone is left unattended (but turned on), a malicious side emerges. The app starts calling or texting premium-rate numbers, racking up large phone bills for the unsuspecting user. The criminals know that this sort of behavior would probably be caught during the day, so they may program their software to make its calls in the middle of the night.
At the moment, these attacks are not yet widespread, but experts expect them to soon become one of the greatest threats to the billions of smartphone users worldwide.
The SPIES response? A dead-simple program called Tap-Wave-Rub (TWR) that takes advantage of the proximity sensors built into smartphones. When installed on a phone, the software forces users to tap the sensor, wave their hand in front of it, or rub it quickly in order to make a call. Saxena says the researchers purposely designed the program to not involve yes/no decisions on the part of the user. “We knew we had to force people to stop for a minute and think about whether or not the action requested by the phone is something they would really like to do,” he says.
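For readers who want a concrete picture, here is a minimal sketch of the gesture-gating idea in Kotlin. It is not the published Tap-Wave-Rub code; the readings are simulated, and the thresholds and classification rule are illustrative assumptions. It simply shows how a stream of proximity readings can stand between a stealthy app and an outgoing call.

```kotlin
// Illustrative sketch only, not the actual Tap-Wave-Rub implementation.
// Readings are simulated; a real app would feed values from the phone's
// proximity sensor instead of a hard-coded list.

enum class Gesture { NONE, TAP_OR_WAVE, RUB }

// A reading counts as "near" when something (a finger or hand) covers the sensor.
// The 3 cm threshold is an assumption for this sketch.
private fun isNear(cm: Float, threshold: Float = 3.0f) = cm < threshold

// Count how many times the sensor goes from "far" to "near" in the window.
// One quick cover-and-release looks like a tap or wave; several in a row look
// like a rub. No covering at all means no deliberate user gesture was made.
fun classify(readings: List<Float>): Gesture {
    var nearEvents = 0
    var wasNear = false
    for (r in readings) {
        val near = isNear(r)
        if (near && !wasNear) nearEvents++
        wasNear = near
    }
    return when {
        nearEvents == 0 -> Gesture.NONE
        nearEvents <= 2 -> Gesture.TAP_OR_WAVE
        else -> Gesture.RUB
    }
}

// Gate an outgoing call: only place it if the user made a deliberate gesture.
fun requestCall(number: String, recentReadings: List<Float>) {
    when (classify(recentReadings)) {
        Gesture.NONE -> println("Blocked call to $number: no tap/wave/rub detected")
        else -> println("Placing call to $number (user gesture confirmed)")
    }
}

fun main() {
    val idlePhone = listOf(8f, 8f, 8f, 8f)   // nothing ever comes near the sensor
    val quickWave = listOf(8f, 1f, 1f, 8f)   // one cover-and-release
    requestCall("1-900-555-0199", idlePhone) // a stealthy app's midnight attempt
    requestCall("205-555-0123", quickWave)   // a real user's call
}
```

Note how the sketch mirrors Saxena's design point: the phone never asks a yes/no question; it simply refuses to act until it sees a deliberate physical gesture.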
Is It Really You?
Saxena also is working on ways to use the phone’s embedded sensors to beef up security. “Strong” passwords—ones that involve combinations of upper- and lowercase characters, punctuation, and lengthy character strings—are now all the rage, but in practice the passwords people choose are notoriously easy to crack, Saxena notes. “People tend to use names and other simple combinations that can be easily cracked by brute-force techniques,” he says.

“We want to replace user-generated passwords completely, with stronger mechanisms like biometrics,” Saxena continues. “The idea is you have websites generate their own passwords, and make them sufficiently long and random that they are impossible to crack.” The phone’s camera could be used to recognize unique characteristics such as facial features or thumbprints, or its microphone could analyze vocal patterns to make sure it was only unlocked for authorized users. With those security measures in place, the phone itself could securely store complex, site-generated passwords, supplying them when necessary to unlock the doors of the Internet as the user surfs on in blissful ignorance.
The approach basically turns a user’s phone into a “Swiss Army knife,” Saxena says. “With this system in place, the user can authenticate to a website with a high level of security and usability.”
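As a rough illustration of that "Swiss Army knife" idea, the Kotlin sketch below generates a long random password per site and refuses to hand it back until the user has been verified on the device. The names here (PasswordVault, unlockWithBiometric) are hypothetical; a production phone app would rely on platform facilities such as a hardware-backed keystore and the system's biometric prompt rather than an in-memory map and a boolean flag.

```kotlin
import java.security.SecureRandom

// Hypothetical sketch of the idea described above: the password is long,
// random, and never chosen or typed by the user, and the phone releases it
// only after verifying who is holding the device.
object PasswordVault {
    private val rng = SecureRandom()
    private val alphabet =
        ('a'..'z') + ('A'..'Z') + ('0'..'9') + "!@#\$%^&*-_".toList()
    private val store = mutableMapOf<String, String>()   // site -> password
    private var unlocked = false

    // Stand-in for a fingerprint, face, or voice check on the device.
    fun unlockWithBiometric(userVerified: Boolean) { unlocked = userVerified }

    // Generate a long, random, per-site password that no human has to remember.
    fun enroll(site: String, length: Int = 32) {
        val pw = (1..length)
            .map { alphabet[rng.nextInt(alphabet.size)] }
            .joinToString("")
        store[site] = pw
    }

    // Release the stored password only if the user has unlocked the vault.
    fun passwordFor(site: String): String? = if (unlocked) store[site] else null
}

fun main() {
    PasswordVault.enroll("facebook.com")
    println(PasswordVault.passwordFor("facebook.com"))    // null: vault still locked
    PasswordVault.unlockWithBiometric(userVerified = true)
    println(PasswordVault.passwordFor("facebook.com"))    // 32-character random password
}
```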
CAPTCHA and Release
The holy grail of security would be a method that users actually look forward to using. It’s known as the “Tom Sawyer effect,” Saxena says. “He was asked to paint a fence, so he pretended to his friends that he was having fun, tricking them into painting the fence for him. We are looking at ways to package security as something fun.”

How well do you scan? Subjects in the fMRI experiment have to find the "L" among the confusing "T"s.

One thing that’s not fun: the dreaded CAPTCHA, the little box on many sites that forces users to reproduce wildly distorted characters. The idea is to prove that the user isn’t a “robot” program with nefarious intent. “Computers can’t figure out what the characters are—but trained humans can do it in seconds,” Saxena says. The trouble is, “criminals have figured out that they can pay people—a penny or less per time—to sit in front of a screen and ‘solve’ CAPTCHAs to let them do what they want.” This is known as a “CAPTCHA relay” attack.
The process only takes a few seconds, but Saxena has a solution—and a “playful” one at that. Instead of letters, the security challenge involves floating elements, such as a group of cartoon tools that users must click on and drag to a cartoon toolbox. “It isn’t hard for the user; in fact, it’s kind of fun,” Saxena says. And because the items are constantly moving, by the time criminals relay the game to their confederates and get an answer back, the targets have drifted and the response no longer lines up. This is a joint project with Saxena’s colleague Chengcui Zhang, Ph.D., in UAB’s computer sciences department. Saxena’s team is working with companies such as Areyouahuman, which offer similar CAPTCHA technologies.
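To see why constant motion frustrates a relay, consider this simplified Kotlin simulation. It is not the actual UAB or Areyouahuman game code; the path, speed, and tolerance are assumptions. The point is that the server knows where each floating item is at every instant and only accepts a click that lands near the item's current position, so an answer produced from a seconds-old screenshot misses.

```kotlin
import kotlin.math.hypot

data class Point(val x: Double, val y: Double)

// One floating item drifting across a 400-pixel-wide canvas at 40 px/s.
// (Illustrative animation path; a real challenge would use many items.)
fun itemPositionAt(timeMs: Long): Point =
    Point(x = (0.04 * timeMs) % 400.0, y = 150.0)

// Accept a click only if it lands within `tolerance` pixels of the item's
// position at the moment the server receives the click.
fun acceptAnswer(click: Point, receivedAtMs: Long, tolerance: Double = 25.0): Boolean {
    val item = itemPositionAt(receivedAtMs)
    return hypot(click.x - item.x, click.y - item.y) <= tolerance
}

fun main() {
    // A real user clicks where the item is right now.
    val now = 5_000L
    val honestClick = itemPositionAt(now)
    println(acceptAnswer(honestClick, now))            // true

    // A relay: the screenshot was taken at t = 5 s, but the remote solver's
    // answer arrives 4 seconds later, after the item has drifted ~160 px.
    println(acceptAnswer(honestClick, now + 4_000L))   // false
}
```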
Living on the Edge
The National Science Foundation and companies such as Google, Intel, Nokia, Research in Motion, and Cisco are sponsors of Saxena’s research. His students have moved on to positions at Columbia University and New York University, and at industry heavyweights such as Microsoft, VMware, and cloud-computing firm EMC. But before they head out into the job market, they have a blast exploring the fringes of the security arena. One student’s current project: trying to see if “brain interface” game systems such as NeuroSky could be used by criminals to read a user’s thoughts—and steal his or her passwords.

Saxena made his own entry into the security world through a master’s thesis on cipher machines used in World War II. He clearly gets a thrill from teaching his students to crack the codes of cybercriminals—and adding bright young minds to the security workforce. Saxena developed and co-directed the master’s degree program in cyber-security at the Polytechnic Institute of New York University (NYU-Poly) and now co-directs UAB’s new Master’s in Computer Forensics and Security Management program.
“Security is an exciting area applicable to many different industries,” Saxena says. “It’s not just security, but psychology and cognitive science as well. And it’s just plain fun.”
More Information
SPIES Lab
UAB Department of Computer Science
UAB College of Arts and Sciences
Apply to UAB