What’s even better than getting Lady Gaga to play your Election Day rally? Sending out a mailing that applies a subtle dose of peer pressure. (New York Times Magazine - Oct 10, 2010)
By Sasha Issenberg
Over the past few days, thousands of Democratic-leaning voters nationwide — including the young people, minorities and unmarried women who were a crucial part of Barack Obama’s 2008 coalition and whom the party is desperate to rouse again on Tuesday — received a message in their mailboxes that effectively said: we’re keeping an eye on you. The mailers are the handiwork of Hal Malchow, a political consultant who is acting on a theory that first intrigued him four years ago. Before the 2006 Michigan gubernatorial primary, three political scientists isolated a group of voters and mailed them copies of their voting histories, listing the elections in which they participated and those they missed. Included were their neighbors’ voting histories, too, along with a warning: after the polls closed, everyone would get an updated set.
After the primary, the academics examined the voter rolls and were startled by the potency of peer pressure as a motivational tool. The mailer proved 10 times as effective at turning nonvoters into voters as any piece of pre-election mail whose effectiveness had ever been measured. Malchow, a 58-year-old former Mississippi securities lawyer who managed Al Gore’s first Senate campaign and went on to start a direct-mail firm, read the academics’ study and wanted to put the device to work. But he had trouble persuading his firm’s clients to incorporate such a tactic into their get-out-the-vote programs. All feared a backlash from citizens who might regard the mailer as a threat from someone seeking their vote.
Then, as New Jersey prepared to elect its governor last fall, Malchow experimented with less ominous language, an idea he adopted from the Fordham political scientist Costas Panagopoulos. He removed all mention of neighbors and offered instead an expression of gratitude for having voted in the past — while still making it clear that recipients’ voting habits would continue to be monitored. “We hope to be able to thank you in the future for being the kind of citizen who makes our democracy work,” read the letter to more than 11,000 New Jerseyites.
Malchow found that the softer tone, while less effective than the original mailer, increased turnout among recipients by 2.5 percentage points. The D.N.C. ran a similar experiment during the special election in Pennsylvania’s 12th Congressional District this spring, with a letter from a Democratic senator telling voters that “our records indicate that you voted in the 2008 election” and thanking them for their “good citizenship.” By employing the device on a larger scale, for dozens of candidates and independent groups this fall, Malchow aims to deliver votes that would otherwise be lost to Democrats.
An increasingly influential cadre of Democratic strategists is finding new ideas in the same place Malchow did: behavioral-science experiments that treat campaigns as their laboratories and voters as unwitting guinea pigs. The growing use of experimental methods — Heather Smith, president of Rock the Vote, calls them “prescription drug trials for democracy” — is convulsing a profession where hunches and instinct have long ruled. Already, experimental findings have upended a lot of folk wisdom about how votes are won. The most effective direct mail might not be the most eye-catching in the mailbox but the least conspicuous. It is better to have an anonymous, chatty volunteer remind voters it’s Election Day than a recorded message from a celebrity like Jay-Z. The most winnable voters may be soft supporters of the opposition, not the voters who polls say are undecided. (“Undecided” may just be another word for “unlikely to vote.”)
Most of the activity on the left revolves around the Analyst Institute, a firm quietly founded in 2007 by A.F.L.-C.I.O. officials and liberal allies, which seeks to establish a set of empirically proven “best practices” for interacting with voters. The group’s executive director, a behavioral scientist named Todd Rogers, has managed dozens of experiments around the country this year. Their lessons have shaped how Democrats are approaching and cajoling the voters they think are on their side but who haven’t yet shown that they will act on their beliefs on Election Day.
Nearly all of the Analyst Institute’s research is private, shared only among the participating groups. The institute’s Web site is almost comically empty, and the group’s name — two abstract nouns, cryptically conjoined — evokes a front organization. There seem to be two types of political operatives in Washington: those who think Rogers is a genius transforming their field and those who have never heard of him.
The experimental movement in politics began a decade ago, when the Yale political scientists Alan Gerber and Donald Green conducted a study testing the relative effectiveness of basic political tools. As the 1998 elections approached, Gerber and Green partnered with the League of Women Voters to split 30,000 New Haven voters into four groups. Some received an oversize postcard encouraging them to vote, others the same message via a phone call or in-person visit. One control group received no contact whatsoever. After the election, Gerber and Green examined Connecticut records to see who actually voted. The in-person canvass yielded turnout 9.8 percent higher than for voters who were not contacted. Each piece of mail led to a turnout increase of only 0.6 percent. Telephone calls, Gerber and Green concluded, had no effect at all.
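The Gerber-Green design reduces to a simple computation: randomly assign voters to treatment arms, check official records for who actually voted, and subtract the control group’s turnout rate from each arm’s. A minimal sketch in Python — the arm sizes echo the New Haven study, but the turnout rates here are entirely invented, synthetic stand-ins for the real voter file:

```python
import random

random.seed(0)

# Hypothetical voter roll; the real study matched names against Connecticut records.
voters = list(range(30000))
random.shuffle(voters)

# Random assignment to four arms (sizes are illustrative, not the study's).
arms = {
    "canvass": voters[:5000],
    "mail": voters[5000:10000],
    "phone": voters[10000:15000],
    "control": voters[15000:],
}

# Simulated post-election turnout; these per-arm rates are made up.
TRUE_RATES = {"canvass": 0.52, "mail": 0.45, "phone": 0.44, "control": 0.44}
voted = {v: random.random() < TRUE_RATES[name]
         for name, members in arms.items() for v in members}

def turnout(name):
    """Fraction of an arm's members who voted."""
    members = arms[name]
    return sum(voted[v] for v in members) / len(members)

# Treatment effect = arm turnout minus control turnout.
baseline = turnout("control")
for name in ("canvass", "mail", "phone"):
    print(f"{name}: {turnout(name) - baseline:+.3f}")
```

Because assignment is random, any turnout gap beyond sampling noise can be attributed to the contact itself — the core advantage over simply comparing people who happened to be canvassed with those who weren’t.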
The findings were published in 2000 and quickly circulated among campaign operatives, who saw academics assailing many of their business models. A turf battle began within the political-consulting community: direct-mail vendors happily cited the Gerber-Green findings to argue that candidates were wasting money on phone calls.
Hal Malchow — who had previously approached the Democratic National Committee to propose using experimental controls to measure mail to voters but was repeatedly rebuffed — thought the Gerber-Green study was “the most important event in politics for a long time,” he says. “Eighty percent of what we’ve done in the past doesn’t work.” As the mail vendor for the A.F.L.-C.I.O., Malchow found a natural partner for his ideas in Mike Podhorzer, the organization’s deputy political director. Podhorzer saw Gerber and Green, who see themselves as researchers and not partisan advocates, as kindred spirits in a worldwide battle for knowledge between two camps he thought of as “gurus” versus “data.” As he says, “Until you get into a more rigorous approach, you are essentially left with what we had, which is that everything you did in a winning campaign was a good idea and everything that you did in a losing campaign was a bad idea.”
Podhorzer and Malchow began trying to adapt the Gerber-Green methods to the particular challenges faced by the A.F.L.-C.I.O., which regularly runs one of the largest independent campaign operations, almost always on behalf of Democrats. “Finding out the day after the election that Treatment A was the best is of limited value to an organization like ours,” Podhorzer says. “We’re actually trying to win the election.”
During the 2004 campaign, Podhorzer wanted to gauge voter reactions to his organization’s election messages in near-real time. A good poll shows how the electorate has moved over time, but it cannot isolate the effect of any individual appeal — and certainly not that of a single mailed leaflet, one of labor’s favorite tools for reaching member households. Focus groups offer a rich impression of how certain voters respond to that leaflet, but only the instant reaction of someone being paid $100 to offer one. A focus group cannot say anything about whether a typical voter will even notice the brochure if it shows up in the mail wedged between a birthday card and a water bill.
Experiments provided a solution. The A.F.L.-C.I.O. planned to mail members monthly in 2004, and Podhorzer set out to design a “continuous feedback loop” testing different messages with small samples and then sending the most influential ones to a much larger target audience. As he examined the results, Podhorzer became even more frustrated with conventional polling. Asked if they would be more or less likely to vote for a candidate who favored shipping jobs overseas — a typical way of auditioning a promising line — voters across the board would tell pollsters that it made them “less likely.” But a draft leaflet about Bush’s policies had little impact on autoworkers who received it; they already knew what the union wanted them to think about the subject. Construction workers, however, didn’t know as much, and their minds changed. Experiments allowed Podhorzer to see which voters actually moved, not just count those who said they might.
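Podhorzer’s “continuous feedback loop” amounts to running small randomized trials on each draft message and promoting the winner to the full audience. A toy sketch of that loop — the message labels, sample sizes and persuasion rates below are all invented, standing in for real mailings and follow-up polling:

```python
import random

random.seed(2)

MESSAGES = ["jobs-overseas", "health-care", "pensions"]  # invented labels

def trial(message, n=1000):
    """Mail one draft to n members and poll them against an n-member
    control; return the simulated persuasion lift (treated minus control
    support). The 'true' lifts here are fabricated for illustration."""
    true_lift = {"jobs-overseas": 0.02, "health-care": 0.05, "pensions": 0.01}[message]
    treated = sum(random.random() < 0.45 + true_lift for _ in range(n)) / n
    control = sum(random.random() < 0.45 for _ in range(n)) / n
    return treated - control

# Test every draft on a small sample, then send the best one at scale.
results = {m: trial(m) for m in MESSAGES}
best = max(results, key=results.get)
print("winning message:", best)
```

The payoff is exactly what Podhorzer describes: the loop identifies which message actually moves which voters before the bulk of the budget is spent, rather than after the election is over.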
Democrats have not been alone in experimenting with data-driven politics. As Dave Carney, once George H.W. Bush’s White House political director, prepared to guide the 2006 re-election campaign of Gov. Rick Perry of Texas, he invited Gerber and Green to conduct their experiments from within the campaign’s war room. Perry had spent more than $25 million to win a full term in 2002, much of it on broadcast advertising, and Carney thought a rigorous experimental regime could help “assure donors that we’re using their money as best as possible — spend it different, spend less of it.” Gerber and Green asked two political scientists who had informally advised George W. Bush’s 2004 re-election, James Gimpel and Daron Shaw, to collaborate on the project. Carney invited the quartet he called “our four eggheads” to impose experimental controls on nearly every aspect of campaign operations.
Perry won easy re-election in 2006, and their findings profoundly altered his 2010 tactics. Perry’s primary campaign this year sent out no direct voter-contact mail, made no paid phone calls, printed no lawn signs, visited no editorial boards and purchased no newspaper ads. His broadcast advertising strategy was informed by a 2006 experiment that isolated 18 TV media markets and 80 radio stations and randomly assigned each a different start date and volume for ad buys from a $2 million budget earmarked for the experiment. Public-opinion changes from the ads were then monitored with tracking polls. Carney estimates that the research saved Perry $3 million in this year’s primary campaign, and he still beat Kay Bailey Hutchison by 20 points. On Tuesday, the value of Perry’s unusually empirical approach to electioneering will be tested again, this time in a tough race against the Democratic nominee, Bill White, the former mayor of Houston.
After Bush’s 2004 re-election, Podhorzer invited other scientific-minded progressive operatives to A.F.L.-C.I.O. headquarters to share their research. Very few of its members would be recognizable to cable-news viewers; the group almost entirely bypassed the brand-name consultants whom campaigns like to unveil in press releases. “It’s not the big names on the door,” says Maren Hesla, who directed Emily’s List’s Women Vote! campaign. “It’s all the — God love them — geeky guys who don’t talk to clients but do the work and write the programs.”
The unofficial society called itself the Analyst Group, and it grew by word of mouth. By the time Democrats reclaimed Congress in 2006, as many as 60 people showed up for the regular lunches. In 2007, Podhorzer and his Analyst Group circle established the Analyst Institute, designed to operate with a scholarly sensibility but with the privacy of a for-profit consulting firm. Podhorzer became chairman and looked for an executive director. Gerber suggested Todd Rogers, on whose dissertation committee he had served.
Rogers, who had just turned 30, was a former college-lacrosse player from the Philadelphia suburbs who earned a joint degree in organizational behavior and psychology from Harvard Business School after performing studies that examined the way individuals managed their queues on services like Netflix. Rogers argued that this type of research — on how time delays alter preferences — could help policy makers shape policy design on issues like carbon taxes, which involve balancing your ideal preferences (watching documentaries, having a smaller carbon footprint) with your actual choices (watching action movies, buying an S.U.V.).
Shortly before Pennsylvania’s April 2008 presidential primary, Rogers scripted a phone call that went out to 19,411 Democratic households in the state. The disembodied call-center voice said it had three questions. Around what time do you expect you will head to the polls on Tuesday? Where do you expect you will be coming from when you head to the polls on Tuesday? What do you think you will be doing before you head out to the polls?
Rogers did not care what voters’ answers were to the questions, only whether they had any. He was testing a psychological concept known as “implementation intentions,” which suggests that people are more likely to perform an action if they have already visualized doing it. The subject was on a long list of psychology concepts that Rogers took to Washington. Many had been demonstrated only in situations outside politics or examined by psychologists only in laboratory settings. Enamored of the psychologist Robert Cialdini and the behavioral economist Richard Thaler, Rogers thought their research methods could be applied to elections. And Rogers saw the advantages of doing academic-style work outside the academy: he faced neither financing restrictions nor the need to navigate a university’s human-subjects review board.
This June, two years after the Pennsylvania experiment, Rogers traveled to Pittsburgh to present the findings at a behavioral-science conference. Before a room of professors and graduate students, Rogers explained that asking people about their voting plans increased turnout by 4 percentage points. A closer look, however, showed the effects were unevenly distributed. The self-predictive phone calls had little impact on multiple-voter households. But for those living alone, the effect was tremendous: turnout jumped by nearly 10 percentage points. The reason, Rogers surmised, was that making plans is a collaborative activity; spouses and roommates already talk through issues like child care as a condition of voting. For those who live alone, rehearsing their Election Day routine with a stranger helped them make a plan.
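What Rogers reported is a classic subgroup analysis: compute the treated-minus-control turnout difference separately within each household type, rather than only on average. A sketch with synthetic data — the sample size nods to the 19,411 households called, but the turnout rates and lifts are invented to mimic the reported pattern:

```python
import random

random.seed(1)

def simulate(n=19411):
    """Synthetic records: (got_plan_call, lives_alone, voted) tuples.
    The call is simulated to lift solo voters most; all rates are made up."""
    rows = []
    for _ in range(n):
        treated = random.random() < 0.5
        alone = random.random() < 0.3
        lift = (0.10 if alone else 0.01) if treated else 0.0
        rows.append((treated, alone, random.random() < 0.40 + lift))
    return rows

def effect(rows, alone):
    """Turnout difference, treated minus control, within one subgroup."""
    treat = [v for t, a, v in rows if a == alone and t]
    ctrl = [v for t, a, v in rows if a == alone and not t]
    return sum(treat) / len(treat) - sum(ctrl) / len(ctrl)

rows = simulate()
print(f"lives alone:      {effect(rows, True):+.3f}")
print(f"shared household: {effect(rows, False):+.3f}")
```

Because the subgroup (living alone) is defined before treatment and assignment is random within it, the split doesn’t compromise the experiment — it simply reveals where the average effect was coming from.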
When he finished, Rogers took a seat next to Richard Thaler, who draped a paternal arm across his back. In 2006, Thaler welcomed Rogers into the Consortium of Behavioral Scientists, a secretive group that helped Democrats apply academic research to policy making and advised party leaders on election-year tactics. Two years later, Thaler — a colleague of Obama’s — helped to bring many consortium members together, including Rogers, to informally advise Obama’s presidential campaign.
By that fall, Rogers’s implementation-intention device had become a standard campaign tool for many left-leaning groups, along with scripts declaring things like “turnout is going to be high today.” Rogers’s experiments have shown that voters respond better to everyone-is-doing-it messages emphasizing high turnout than don’t-be-part-of-the-problem appeals describing how few Americans vote.
Rogers spends a lot of time trying to convince activists that the central premise of randomized experiments — deciding not to contact a control group of voters — will not torpedo their short-term priority of winning elections. Meanwhile, a new Democratic establishment has brought the data-driven crowd in from the outside. Since Obama’s election, operatives with Analyst Group ties have moved into key party jobs and now attend meetings as representatives of the Democratic National Committee, the Democratic Congressional Campaign Committee and the Democratic Senatorial Campaign Committee.
While political experiments have proved successful at isolating what gets people to vote, they have been less useful at finding out how voters decide among candidates. Partly for that reason, while the Analyst Institute’s findings and sensibility inform how permanent institutions like unions operate, they have yet to transform candidates’ campaigns, where most money is spent in the least targetable way possible: on broadcast TV time. Rogers has been designing experiments to assess Internet advertising, whose effectiveness has traditionally been gauged by click-throughs and sign-ups that do little to measure the ads’ impact on more-passive viewers. For a study Rogers oversaw during Minnesota’s 2008 Senate campaign, an independent group bought Yahoo! banner ads introducing an issue absent from the campaign dialogue elsewhere: an obscure vote by the Republican, Norm Coleman, against financing a rural antidrug program. Through polling, Rogers discovered that those who saw the ads were more likely than others to believe that Coleman could have done “more to stop meth use.”
But experimental politicking is not always so provocative. Indeed, groups following Analyst Institute findings often end up abandoning their flashiest tools for more staid ones. The America Votes coalition has dropped get-out-the-vote robocalls. Rock the Vote has found that e-mail arriving from unexciting senders like “Election Center” often does better than messages with livelier “from” lines. Malchow has discovered that voters pay less attention to the glossy four-color brochures designed to “cut through” mailbox clutter than they do to spare envelopes evoking a letter from the tax man or a jury-duty summons. “People want information, they don’t want advertising,” Malchow says. “When they see our fingerprints on this stuff, they believe it less.”