When a new study came out late last year proving—scientifically!—how easy it is to turn opponents of gay marriage into supporters, the political scientist Andrew Gelman managed to summarize his reaction in a single unscientific word: “Wow!”
He was writing in the Washington Post, but his sentiment was echoed throughout the mainstream liberal press: the big daily newspapers, websites like Vox and the Huffington Post, TV network news, and public radio—especially public radio. Twitter lit up like a nonsectarian Holiday Tree. The men and women who write about “social science” were uniformly giddy.
Why the commotion? For years, social scientists have believed (scientifically) that it is extremely difficult to change another person’s political opinions, especially if the other person is an “everyday American” and not a social scientist. Multiple studies and mounds of research—by political scientists, sociologists, social psychologists, all kinds of scientists—have confirmed the bullheadedness of everyday Americans. People’s attitudes and opinions are not the consequence of argument or experience, research revealed, but rather of unreasoning bias and emotion. The studies proving this, by the way, are not the consequence of bias and emotion. They are designed by scientific researchers studying other people.
So imagine the surprise when a pair of researchers found that Americans weren’t so stubborn after all, at least when the subject was gay marriage. The scientists, one from Columbia, another from UCLA, published their paper last December. The paper was called “When Contact Changes Minds: An experiment on the transmission of gay equality.” What happened in the experiment was this: Homosexual canvassers and heterosexual canvassers were assigned to go door to door in Los Angeles talking to voters about gay marriage. Ultimately canvassers talked to nearly 10,000 voters living in precincts that had voted for Proposition 8 in 2008. Prop 8, if you need reminding, is the nefarious (among social scientists) constitutional amendment banning gay marriage in California. These Angelenos were hard cases, in other words. But they cracked with astonishing ease.
The writers who cover social science were deeply impressed with the scientific rigor of this experiment. For control purposes, some of the voters were canvassed by a gay canvasser, others by a straight canvasser (other controls were used too). The canvassers didn’t argue with the everyday Americans; they didn’t cajole or steamroll them with facts. Instead they read from a script provided by a local gay rights organization, which also recruited the canvassers. The canvassers were supposed to keep the conversation going for 20 minutes. The script could be loosely or closely followed, but at all times, according to the press release accompanying the study, the canvassers would be engaged in “heartfelt, reciprocal, and vulnerable conversations.”
Heartfelt, reciprocal, and highly persuasive. Over the course of the conversations, the researchers said, a very large number of voters changed their initial view on gay marriage from anti- to pro. (None of the voters who were already for gay marriage changed their minds.)
This wouldn’t be big news, necessarily. Short-term effects like this are pretty common among the people you find in social science experiments. They say they change their minds but then they change back again.
The big news in “When Contact Changes Minds” was that the minds stayed changed! When the canvassers went back to the voters at intervals of three weeks, six weeks, nine months, and a year, most of the changelings were still in favor of gay marriage. Even better, the gay canvassers managed to change many more minds for longer periods than the straight canvassers. And a further survey discovered that the converted respondents had often persuaded at least one member of their household to take the pro-gay-marriage view too.
Vox, a well-trafficked website that often explains the complexities of social science to its readers, put it like this: “Before, respondents felt the same about gay marriage as Nebraskans; after, they felt the same as folks from Massachusetts.” There is no better definition of heaven, scientists say.
The researchers wrote up their report and sent it off to the “peer-reviewed” journal Science. The peers reviewed it. They gave the editors a thumbs-up. The gay rights group in L.A. rushed out a press release. The swoon, as I say, was universal.
A month later, in January, two researchers from Stanford and Berkeley found themselves so impressed with the study that they decided to extend its methods into other areas of opinion research. (Note, they didn’t want to replicate the study. Though other sciences consider replication necessary to establish the validity of a finding, in the social sciences it’s strictly for chumps. Typically replication is done only by the people who did the original experiment. Replications are therefore wonderfully successful.)
The Stanford-Berkeley guys retrieved as much of the original data as they could and undertook a blizzard of statistical manipulations. To their surprise, they say, they concluded that the study was more or less worthless. One of the original researchers, a graduate student in political science at UCLA named Michael LaCour, had fabricated the data, hopelessly contaminating the results gleaned by the canvassers. Last week, the other original researcher, a well-known political scientist from Columbia named Donald Green, wrote an embarrassed letter to Science retracting the study, saying he had relied on the data provided by LaCour.
To their great credit the outlets and writers that reported the original bogus result, from Andrew “Wow!” Gelman in the Post to Vox to the New York Times, quickly ran news of Green’s retraction. Crow was eaten. The headline in New York magazine captured the crestfallen mood: “A Really Important Political Science Study About Gay Marriage Used Faked Data.”
Now, it’s much too easy to make sport of the hacks; anyone, any journalist, can fall for a fraud if it’s clever enough.
The thing is, this fraud wasn’t particularly clever. In fact, it looked like a typical endeavor in pop social science—which is to say, pretty ridiculous by the commonsense standards of people outside social science. Journalists have become habituated to the implausible, not to say preposterous, mechanics of experiments routinely undertaken in sociology, political science, social psychology, and the rest. Scientist and hack alike treat these artificial protocols as though they were guarantees of the kind of objectivity that the physical sciences strive for and attain. Outright fabrication of data is probably rare in the social sciences. As in Washington politics, the scandal lies not in what’s unusual but in what’s typical. In the case of “When Contact Changes Minds,” the clues were there in the original paper, evident to anyone not blinded by the dazzling claims of scientific hoo-ha.
Just for starters, the incredible size of the experiment’s effects should have struck people as . . . um, incredible. The study reported a 20 percent shift among respondents from the anti-gay-marriage view to the pro-gay-marriage view, after only 20 minutes of conversation with a stranger. No such permanent shift, of any size, had ever before been recorded in similar experiments. “We’re talking about a causal effect that’s a full 40 percent of what is pretty much the maximum change imaginable,” Gelman wrote in the Post when the study was first released. “Wow, indeed.”
Last week, after the debunking, Gelman wrote: “in my previous post on the topic, I expressed surprise at the published claim but no skepticism.”
There were lots of reasons for skepticism. The study was laced with evidence of self-interest that could tilt results. The simple term “gay equality” in the subtitle of the paper shows we are in the realm of advocacy rather than investigation: It has become a term of art among activists, an infinitely elastic portmanteau, used less for description than propagandizing. (Affirm traditional marriage and you oppose “equality.”) It’s a much more tendentious phrase, for example, than the now-neutral “gay rights,” and we should assume it was chosen with care.
The continuing involvement of the gay activist group would likewise have hoisted a red flag for someone with a skeptical eye. The group hired the canvassers and wrote the scripts, and it had undertaken the original canvass with the specific purpose of testing methods to advance the cause of gay marriage. In explaining the findings to reporters, LaCour, the researcher who cooked the data, paraphrased the 1970s gay activist Harvey Milk: “It’s harder to deny people rights if those people have names and faces.” If a scientist talks politics when he should be talking science, double-check the numbers.
And there’s much more. As is common in social science research, the respondents were paid to participate, and at intervals over time. Arrangements like this give voters an obvious incentive to please the canvassers by telling them what they want to hear. (Of course I still agree with you! Where’s my fifty bucks?) The sample of voters wasn’t randomly chosen, as it is in the most plausible experiments; instead the researchers used a more convenient “snowball sample,” which means respondents basically were self-selected.
In “When Contact Changes Minds,” the researchers used a technique known as the “feeling thermometer” to gauge the strength of voters’ opinions: The respondent rates his feelings about a person or issue at different points in time on a scale of 0 to 100, a supersized version of the ratings on the old McLaughlin Group. The feeling thermometer is notoriously imprecise, often producing wild swings, and scrupulous social scientists avoid it. Indeed, one clue that tipped off the Stanford-Berkeley debunkers to the bogus results was the strange uniformity in the feeling thermometer results.
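To make the “strange uniformity” clue concrete: here is a minimal sketch, using entirely invented numbers rather than anything from the actual study or from the debunkers’ real analysis, of why fabricated follow-up scores tend to stand out. Genuine panel surveys show noisy respondent-level drift between interviews; data manufactured by nudging every first-wave score by nearly the same amount produces changes that are implausibly tight.

```python
import random
import statistics

random.seed(0)

# Hypothetical "feeling thermometer" scores (0-100) for the same
# respondents at two interview waves. All numbers are invented for
# illustration; this is not the study's data or the debunkers' method.
n = 1000
wave1 = [random.uniform(0, 100) for _ in range(n)]

# Realistic re-interview: each respondent's score drifts by a noisy amount.
realistic_wave2 = [min(100, max(0, x + random.gauss(5, 15))) for x in wave1]

# Suspicious re-interview: every score shifts by almost exactly the same amount.
suspicious_wave2 = [min(100, max(0, x + random.gauss(5, 0.5))) for x in wave1]

def change_sd(w1, w2):
    """Standard deviation of respondent-level changes between waves."""
    return statistics.stdev(b - a for a, b in zip(w1, w2))

print(f"Spread of changes, realistic data:  {change_sd(wave1, realistic_wave2):.1f}")
print(f"Spread of changes, suspicious data: {change_sd(wave1, suspicious_wave2):.1f}")
```

The realistic data produce a wide spread of wave-to-wave changes, while the too-tidy data produce almost none, which is the general flavor of anomaly that can make fabricated survey results look "strangely uniform."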
No single one of these shortcomings is enough to cripple a study, but taken together they would arouse suspicion—if the subject weren’t a matter of such ideological urgency for researchers and journalists alike. You can’t help but suspect that had such a questionable piece of work produced a result unflattering to the cause of “gay equality,” social scientists and journalists would have flogged each of its methodological mistakes. But this assumes that such a study could get published in the first place.
Which leads us to what should have been the brightest red flag of all. The study confirms—perfectly, exquisitely, suspiciously—the picture that gay marriage advocates hold of the believers in traditional marriage, who are assumed to be at once brainless and heartless. Given that no rational or objective reasons exist for opposing gay marriage (goes the assumption), the only explanation for such a view is an unfamiliarity with gay people and a lack of sympathy for them. That’s why the gay canvassers just had to be more persuasive than the straight canvassers. Harvey Milk just had to be right.
Before the debunking, before the retraction, the scientists and their journalistic followers scratched their chins raw trying to explain the process the voters went through in changing their views. Green, the researcher who eventually retracted the study, tried to place himself in that typical American household after the gay canvasser had left.
“Perhaps,” Green mused, “the conversation was something like, ‘Honey, I met a gay man and he was nothing like the gay man I thought I would meet.’ ”
It’s the comment of a man who has yet to recover from watching All in the Family. The America of Green’s imagination must teem with Archie Bunkers. The uplifting message of “When Contact Changes Minds” was that such people could be redeemed, and pretty quick too.
In truth Green’s study tells us nothing about the people who hope to defend traditional marriage. It does speak volumes—whole libraries!—about the parochialism and ignorance of the social scientists who did the study, their peers who reviewed and published and cited it, and the journalists who swallowed it whole.