Is the Password “Swordfish”?

Dr. Charles Palmer lectures on cybersecurity and privacy.

Imagine, if you will, a day when the planet’s hackers have free rein. What could possibly go wrong? Consider everything that involves computers these days: water treatment centers, late-model vehicles, power plants. In our technology-driven world, anything can be a target. This sober message was part of “An(Other) Inconvenient Truth: The Collision of Security and Privacy in the Digital Age,” a lecture delivered on April 22 by Dr. Charles Palmer, Chief Technology Officer for Security and Privacy at IBM Research and adjunct computer science professor at the College.

He gave a couple of examples of what some enterprising criminals might try. What if someone wanted to attack an organization, say the military? Its medical records are stored somewhere; Palmer noted, “I’d change blood type plusses to minuses and minuses to plusses and watch people die. Cool.” Another: “Those [Tokyo] traffic lights are programmable. They’re also not particularly secure. My favorite hack [would be] to go to Tokyo, rush hour. Change all the lights to yellow.”

These are the things that security professionals have to consider. While there are a few perks to the job (he drily noted, “You can’t take your work home” in the security business), there are many challenges in how we plebeians approach security. “We want security on our own terms,” Palmer observed. “We like security when it’s convenient [and] when it makes us feel better, whether … it provides us with any security at all.” We engage in “security theater”: the most common passwords remain “password,” “123456,” “qwerty,” “baseball,” and “football.” Passwords like these are the technological equivalent of shambling shoeless and beltless through the millimeter-wave machine at the airport. Meanwhile, the world of threats keeps growing, as Stuxnet, Heartbleed, the Snowden revelations, and other security stories spill into the headlines with regularity.

To resolve these security issues while maintaining privacy, Palmer remarked, “We’re going to have to go to people. … Have you tried to set the privacy settings on Facebook? You can, until they change the rules. It’s not in their business model to help you protect yourself.” We could blame computers or the Internet or the government or our operating systems (is Windows the problem? “No,” Palmer said, “Linux is just as bad, and Mac OS is trying hard”). But in the end, “people are part of the problem and also part of the solution … We don’t have a magic bullet.”

Palmer places people into four categories. The first encompasses programmers. Pointing to a graphic, he remarked disdainfully, “This is what [early Internet era programmers] wanted. Dancing hamsters.” They took a mañana approach to security, since adding it would make everything too slow; that attitude led to today’s cycle of “patch, penetrate, and pray.” Programs and networks have since grown into webs of complexity, the enemy of security and privacy, and starting over is generally a non-starter. Palmer recommends that new projects integrate security and privacy as design goals from the start.

The next group is made up of policymakers. The government has a long history of bungling security and privacy matters. For instance, one state government tried to censor the Internet and ended up blocking sites that mentioned chicken breasts, breast cancer, and Essex High School (“It has ‘sex’ in it — all high schools do.”). Other times, different agencies trip over one another trying to keep us safe. Palmer brought up process control systems for nuclear, chemical, or water plants: one regulation says their doors must have keypad locks whose codes are changed regularly. This is all well and good. However, another regulation stipulates that all doors must be accessible to emergency personnel. After a great deal of thinking, the brilliant managers of these control systems began writing the secret codes on Post-it notes and sticking them to the locked doors.

The third group, made up of users, is the most important. Policies and programs must deal with them in the best of times and the worst of times, in the times of wisdom and the times of foolishness. That last trait comes up quite a bit. Palmer described jobs he had taken as an “ethical hacker,” in which he would, with consent, try to break into clients’ systems and then inform them of any security issues. “It was the most fun job ever until it became very clear that this was just too easy. We were winning eighty-five percent of the time. … [After a while] we automated it.” Even without programming, Palmer could reliably gain entry into sensitive systems through social engineering: peeling sticky notes with passwords off computer monitors, walking around in a jumpsuit with a fake handset, and pretending to be the UPS man while asking where the computer room is. People simply took too few security precautions. “When you break into a [pharmaceutical company database] system and you’re presented with a database titled ‘patient_data_1,’ [scoffs] okay. We found lots of patient data. We also found porn. In the patient database.”

The issues with user security will only grow as time goes on. Jurisdictional boundaries will get fuzzier, as companies in “the cloud” inevitably go out of business, change hands, or go international, leading to disputes over who actually owns customers’ data. The digital divide will erode as the Internet and computers spread across the globe. “[New Internet users] know less about security than we do! … Do you think the people selling them this stuff is going to tell them [about security]?” Palmer asserted that it is necessary to educate everyone about security, teaching children to “cross the Internet safely,” much as we teach them to cross streets safely.

This is especially true because of the fourth group: the “bad guys.” As the bank robber Willie Sutton once quipped, “I rob banks because that’s where the money is.” These days the money has shifted online, and two groups have sprung up to take advantage of the digital world: those who need the Internet to operate (organized crime, spies, identity thieves, and so on) and those who need the Internet to operate, but only once (terrorists and rogue nations).

To combat them, Dr. Palmer had suggestions for the first three groups: harmonize security and privacy from the start, with realistic expectations, capabilities, and goals. Utilize the intelligent people we have; let non-citizens take part in security work, if not in sensitive places then in foundational research. He listed spies of the past: “[Robert] Hanssen, [Aldrich] Ames, Snowden. They were citizens.” And catalyze solutions in both government and industry to make sure that policies are CREME-y: cooperative, relevant, enforceable, meaningful, and empowering.
