Amid the wall décor in Karen Levy’s Gates Hall office—across from an assortment of trucker memorabilia and a signed photo of Supreme Court Justice Ruth Bader Ginsburg ’54—is a poster from the classic thriller The Conversation. A Best Picture nominee written and directed by Francis Ford Coppola, the 1974 film stars Gene Hackman as a surveillance expert tasked with bugging an “unrecordable” event: two people talking as they walk through a busy, noisy city square.
It’s apt artwork for Levy, an assistant professor of information science whose research explores the often fraught intersection of technology and privacy. Levy, who also has an appointment in the Law School, had already earned a JD from Indiana University and clerked for a federal judge when she decided to pursue a PhD in sociology from Princeton—training that gives her a novel perspective as she explores a wide variety of topics, from the use of webcams in nursing homes to the ways in which retailers track customers to the role of technology in intimate partner abuse.
But first, there were the truckers. The subject of Levy’s doctoral thesis—and the reason she has a trucker patch and belt buckle on her office wall—was how surveillance technology has impacted workers in the trucking industry. Starting in 2011, she spent several years interviewing drivers at truck stops in eleven states as part of her research on the use of electronic monitors in their vehicles, installed to ensure that they adhere to regulations limiting their driving time to prevent fatigue and accidents. “Previously, they’d had a lot of autonomy in deciding how to get their work done—when they were going to work, which rules they were going to follow,” Levy says. “This was much, much more rigid.” She found that the tech was deeply unpopular with drivers—learning, among other things, that some developed clever ways of sabotaging the equipment, and that their employers sometimes altered data to conceal violations. “Truckers get paid by the mile, so if they’re not on the road they’re making zero money, even if they’re doing other things that are required of them by their companies or by the law,” Levy explains. “So of course they’re incentivized to stay on the road as much as they possibly can, and to break the law if they have to. There’s a lot of pressure on them to do that.”
One overarching issue she examined was how the new technology clashes with trucking’s traditionally independent culture; it’s a profession that has long treasured the freedom and romance of life on the open road. “Many truckers will tell you that the reason they get into the job is that it’s a way to have control over their lives,” says Levy, who’s currently working on a book called Data Driven: Truckers and the New Workplace Surveillance, to be published by Princeton University Press. “Part of why this technology was received so negatively is that it slams up against this idea of autonomy that has been really valued in the industry.” And paradoxically, she says, the tech may initially have had a negative effect on safety by alienating more experienced truckers—prompting the very people that society would want behind the wheel of a semi to flee the industry. “Older drivers don’t want to be told, ‘We don’t trust you; we’re going to watch you now,’ ” Levy says. “Almost every trucker told me that these monitors treated them either like criminals or like children.”
In addition to her many scholarly articles, Levy has been published widely in the lay press, including the Washington Post, Vox, the Atlantic, and the L.A. Times. In a March 2018 essay in Slate, she addressed the role that tech can play in facilitating intimate partner abuse—potentially allowing someone to surreptitiously track a partner’s movements, intercept their communications, and otherwise control their lives. “What we’ve discovered in our research is that digital abuse of intimate partners is both more mundane and more complicated than we might think,” she wrote. “It’s mundane in that many forms of digital abuse require little to no sophistication and are carried out using everyday devices and services: social media platforms, find-my-friends apps, cell phone family plans. Abusers aren’t hackers: though some do install surreptitious ‘spouseware’ to monitor their victims without consent, it’s much more common to abuse victims digitally in ways that don’t require any high-tech skill.”
But as Levy notes, while tech companies tend to focus on preventing sophisticated cyberattacks, much of everyday cybersecurity hinges on factors like passwords, security questions with preset answers (like the name of your first pet), and access to physical devices like a laptop or cell phone. “Passwords are saved on your home computer where your abuser probably is, and the abuser is going to know the answers to your security questions,” she says. “Many of these things that we’ve built up as checks fall flat when the abuser is in the home with you. Technologies are not designed with that in mind; they’re designed with the hacker in mind.”
For Levy, the most compelling research questions involve modes of technical surveillance that aren’t clearly good or bad; the same app that can help parents ensure their third grader gets safely home from school, for example, could allow an abusive husband to track his wife to the concealed location of a domestic violence shelter. In the American Journal of Bioethics: Empirical Bioethics, Levy and colleagues pondered the ethical issues around the use of webcams in nursing homes—often installed by relatives concerned that their loved ones aren’t getting proper care. While that’s a noble goal, she says, it raises concerns about privacy—not only of workers and roommates but of residents who may have dementia and can’t consent to being surveilled.

In the same vein, Levy’s trucker study explored the complicated effects of a technology intended to save lives by keeping exhausted drivers off the road. “We often use technical solutions as a way to put a Band-Aid on a bigger social and economic problem, rather than just dealing with the root of it; you see that over and over,” she observes. “If we paid drivers a fair wage, or paid them for all the time they’re actually working and not just moving on the highway, they wouldn’t be so tired. That feels to me a much more sensible place to change the policy.” Similarly, nursing home cameras are a way for relatives to cope with the fact that facilities are often understaffed—something that could be improved by higher salaries and tighter Medicaid regulations. “But we’ve chosen not to do those things,” she says, “and we’ve created a situation where families feel like they have no other choice.”
Among researchers on tech and privacy, there’s an anecdote so canonical that Levy laughingly notes it’s “almost a drinking game; someone will eventually bring it up.” It traces its infamy to a February 2012 New York Times Magazine story by Charles Duhigg in which he described an effort by Target to market products to expectant mothers, whom it identified by mining data on their seemingly unrelated purchases, like calcium supplements and unscented lotion. Among those potential customers: a high schooler whose father was incensed by what he thought was an inappropriate mailing—until he found out that she was, in fact, pregnant. “It was shocking,” Levy says of the tale (which, she goes on to note, may in fact have been apocryphal). “It’s so visceral. It has all these great ingredients that make it super sticky and memorable.” The anecdote seemed all the more egregious, Levy says, because it came before the news of notorious breaches of privacy—like Edward Snowden’s disclosure of NSA surveillance (in 2013) and the Cambridge Analytica scandal (of 2018), when it was revealed that millions of Facebook users had their data mined without their consent. “Every few years, this issue pops up and we say, ‘Oh my God, we have to do something about it,’ ” says Levy, “and then we don’t.”