A Matter of Opinion

Polling class aims to ‘take America’s pulse’ on the election and more.

How satisfied are you with your formal education? Which harms society more, marijuana or alcohol? Is it only acceptable to vote for a candidate from the political party with which you’re registered? Is physician-assisted suicide equivalent to murder?

Those questions, and dozens of others, have been asked by students in one of the University’s more unorthodox courses. Cross-listed in government and communication but open to students of all class years and majors, “Taking America’s Pulse” teaches undergrads how to design, execute, and analyze a national public opinion poll. Not only do students concoct the questions—with each person contributing one query to the overall poll, which also collects demographic data—they also serve on the front lines of administering it, working alongside professional academic pollsters at Cornell’s Survey Research Institute (SRI). “There’s enormous educational value in actually picking up the phone and cold-calling the American public for their opinions,” says co-teacher Jonathon Schuldt ’04, an assistant professor of communication. “They learn that the data are only as valuable as how rigorous the survey script was and how well the callers adhered to the exact wording of the questions. Just slight changes can have a meaningful effect on the results.”

Photo: Robert Barker/Cornell Marketing Group

Professor Peter Enns.

While “Taking America’s Pulse” was previously held in spring ’14 and ’15, this academic year it’s being offered in the fall—giving students the rare opportunity to conduct a poll in the thick of a contentious presidential race (though the questions won’t be limited to the election). The timing is key to getting an accurate understanding of how respondents view the candidates and related issues, observes Peter Enns, an associate professor of government and Schuldt’s co-teacher. The two are chatting with CAM in Enns’s White Hall office in mid-July—on a day when the national media is reporting the results of a poll finding that Hillary Clinton and Donald Trump are neck and neck. But Enns, who’s executive director of Cornell’s Roper Center for Public Opinion Research, points out that such “horse race” polls conducted so far in advance are notoriously unreliable. “If you want to understand voting and the choices the public makes, having the survey conducted close to the election really matters,” says Enns. “When a decision is far from now, the criteria used to make it are different.”

Photo: Robert Barker/Cornell Marketing Group

Professor Jonathon Schuldt ’04

By way of analogy, Enns says, imagine planning where to go out to dinner a year from now. “Where should we go?” he muses. “Maybe New York City? The fanciest restaurant? That would be cool.” But what if you’re deciding where to eat tonight? Then, he says, you’ll consider practicalities like distance, scheduling, and cost. That faraway Manhattan eatery, with its long waits and $40 entrees, is starting to look a lot less attractive. “We’re going to pick based on a certain set of fundamentals, and voting works a lot like that,” he says. “There’s your partisanship; your opinion of the current president; how you think the economy’s doing. That’s what tends to drive it.” He notes that in reporting that day’s poll, the New York Times attributed Clinton’s drop in popularity to the latest news about her use of a private e-mail server as Secretary of State. “That’s exactly what you’d expect,” he says. “If somebody asks me today how I’m going to vote I might say, ‘That e-mail thing doesn’t sound responsible.’ But let’s say I’m in the voting booth and I’m a Democrat; she’s my party’s candidate, she’s got all that political experience, and I’m relatively happy with what Obama did, so I’m going to vote for her. The chances of the e-mail scandal affecting an actual vote are slim—but in a survey today when the choice is months away, it makes complete sense that it’s influencing the responses.”

“Taking America’s Pulse,” which has two weekly lectures plus a discussion section, makes for an intense semester. The requirements include writing (and submitting for publication) an op-ed based on the survey results; students also have to create a research poster and defend their findings at a formal question-and-answer session with faculty and grad students. Plus, fully 20 percent of the grade is predicated on completing at least five telephone interviews, an undertaking that may sound straightforward but can require hours of dogged effort over the course of weeks. National polls, Enns notes, sometimes have response rates as low as 10 percent. As Kailin Koch ’15, a former government major who took the course in the spring of 2014, recalls: “Some days, it felt like all you were doing was talking to people’s voicemail.”

Danyoung Kim ’16, who’s currently studying for the LSAT after earning a government degree and interning at the White House, developed a variety of strategies for upping the odds of finding a willing interviewee. “My favorite time to call was Sundays after people come back from church,” she says with a laugh, “because I figured they’d be a little nicer.” Her own survey question was open-ended and a tad unconventional. She asked: “When you hear the phrase ‘Americans are stupid,’ which group of Americans comes to mind?” The responses didn’t turn out quite as she’d expected. “One thing we learned in class is that there’s no such thing as a perfect survey question—and a lot of people misinterpreted mine,” she says. “They thought I was asking which foreigners say that Americans are stupid. So I got a lot of answers like, ‘the French.’ ”

While Koch—whose own question involved whether meat consumption affects climate change—confesses to having been “terrified” on her first day of polling, she proved to have a knack for it. She went on to spend a summer working at SRI and now crunches polling data at a strategic consulting firm in New York, a job she credits the course with helping her land. Like Kim, she notes that having conducted a poll has made her more wary of how poll results are bandied about. “Knowing what I know about sample size, methodology differences, and how giant the margin of error can be, I’m definitely more skeptical,” she says. And she and Kim have something else in common: having languished on the other end of the line, they both pledge that if pollsters ever call them, they won’t hang up.