UC-Davis Psychology Professor Gregory Herek Aims to Debunk Anti-Gay Extremist Paul Cameron
Paul Cameron's discredited research remains a mainstay of the anti-gay religious right. But one key expert aims to change that.
As a scientist, Paul Cameron is a disgrace. Under his chairmanship, the Family Research Institute, a Colorado Springs, Colo., gay-bashing propaganda mill, has churned out reams of pseudo-scientific "studies" purporting to prove that gays and lesbians are more prone than heterosexuals to commit murder, die young and molest children, among many other dubious findings.
The American Psychological Association and the American Sociological Association have publicly discredited Cameron. But he remains a hero to homophobes on the religious right who deploy his junk science in their campaigns against civil rights for gays and lesbians. One of Cameron's most articulate critics is Dr. Gregory Herek, a professor of psychology at the University of California-Davis, who has also served on the faculties of Yale University and the City University of New York.
Herek in recent years has debunked Cameron's findings in dissections archived on-line at the UC-Davis website, including a point-by-point takedown of a 1983 national sexual behavior survey that Cameron used as the foundation for most of his subsequent research papers. Herek has identified numerous fatal flaws in that survey's methodology, meaning that any conclusions drawn from it are irremediably tainted -- "garbage in, garbage out," as the expression goes. Cameron has said very little in his own defense against Herek's withering critiques, other than to call his opponent "a flaming gay."
INTELLIGENCE REPORT: Dr. Herek, let's start off by discussing your critique of the 1983 national survey conducted by Paul Cameron as director of a group he called the Institute for the Scientific Investigation of Sexuality [ISIS]. Given that one of the best ways to evaluate the reliability of a research survey's results is to examine its sample selection, what do you see when you scrutinize the sampling in the 1983 ISIS survey?
GREGORY HEREK: First of all, the response rate is unacceptably low. Cameron reports his research team randomly targeted 18,418 households for their sample, and of that initial group, only 4,340 households actually completed a survey. The rest either weren't home when the doorbell rang, refused the survey outright, or accepted one but never completed and returned it. Divide 4,340 by 18,418 and you have a response rate of roughly 23%. There is no absolute scientific standard for what constitutes an acceptable rate of response for research of this kind, but it's important to judge the response rate in its historical context: high-quality surveys typically obtained response rates around 75% back in the early 1980s. By any stretch, however, 23% is inadequate.
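For readers who want to check the arithmetic, here is a minimal sketch -- not drawn from Herek's own materials -- that simply plugs in the figures Cameron reports:

```python
# Response-rate arithmetic using the figures Cameron reports (a rough check;
# the interview rounds the result to 23%).
targeted_households = 18_418   # households randomly targeted for the sample
completed_surveys = 4_340      # households that completed and returned a survey

response_rate = completed_surveys / targeted_households
print(f"Response rate: {response_rate:.1%}")              # ~23.6%

# For context: high-quality surveys of the early 1980s typically achieved ~75%.
benchmark_1980s = 0.75
print(f"Shortfall vs. that benchmark: {benchmark_1980s - response_rate:.1%}")
```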
IR: Considering that extremely low response rate, how do you think non-response bias affected the validity of the data collected?
HEREK: The simple answer is that we can't know for sure. However, given the survey's other methodological problems, it is reasonable to hypothesize that the people who took the time and initiative to complete and return a 550-question survey left at their front door -- which included many poorly worded and often intrusive questions about sexual matters -- differed in important respects from those who didn't complete it. I think most social scientists would be concerned that the pool of respondents included a disproportionate number of people who had a special interest in reporting their sexual experiences.
Another concern would be that some respondents may not have been providing accurate data but instead might have purposely exaggerated their sexual experiences to have some fun at Cameron's expense.
IR: You've criticized the design of the ISIS survey as being too long and needlessly complex, and for lacking certain basic accuracy safeguards. Could you outline those design flaws?
HEREK: Many of the questions required respondents to read a large number of alternative answers and then to follow intricate instructions. For example, in one section respondents were expected to read a list of 36 categories of persons, including, "My male YMCA counselor," "My male Scout counselor," "My male camp counselor," and "My female grad school teacher," and then to note the age at which each of these categories of persons made what was described in the instructions as "serious sexual advances."
There's good reason to believe the validity of the results obtained by such a questionnaire would suffer from what's termed "respondent fatigue." But there's no way to know for certain, because Cameron failed to follow the basic practice of including consistency checks in his questionnaire -- that is, repeating questions from the early sections of a survey with alternate phrasing toward the end.
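To illustrate the kind of consistency check Herek describes, here is a minimal, hypothetical sketch; the question names, answers, and data are invented for illustration and come from no actual survey instrument:

```python
# Hypothetical survey consistency check: the same question is asked early (q12)
# and again near the end with alternate phrasing (q487). Respondents whose
# answers disagree are flagged for possible fatigue or careless responding.
responses = [
    {"id": 1, "q12_ever_married": "yes", "q487_ever_married_reworded": "yes"},
    {"id": 2, "q12_ever_married": "no",  "q487_ever_married_reworded": "yes"},
    {"id": 3, "q12_ever_married": "no",  "q487_ever_married_reworded": "no"},
]

inconsistent = [
    r["id"] for r in responses
    if r["q12_ever_married"] != r["q487_ever_married_reworded"]
]
print(f"Respondents flagged as inconsistent: {inconsistent}")          # [2]
print(f"Inconsistency rate: {len(inconsistent) / len(responses):.0%}")  # 33%
```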
IR: Cameron also gave numerous media interviews in the cities where his surveys were being distributed, predicting and even promoting certain results. You've identified his conduct as a serious breach of professional standards. Why is that?
HEREK: One of the principal challenges of social research is that the individuals who are being studied can become aware of the researcher's expectations or goals, which can alter their behavior. For this reason, researchers do not communicate their expectations or hypotheses in advance. Nor do they bias participants' responses by suggesting that a particular answer is more correct or desirable than others.
Contrary to this well-established standard, Cameron publicly disclosed the survey's goals and his own political agenda. In one front-page interview, he characterized the survey as providing "ammunition for those who want laws adopted banning homosexual acts throughout the United States." This was in an article headlined "Poll Will Help Oppose Gays" that appeared while data collection was still in process.
IR: What other problems have you identified in Cameron's data collection methods?
HEREK: In a 1994 pamphlet, Cameron related an anecdote about interviewing a man who identified himself as homosexual [and] who, in response to a question about having ever killed another person, gave his interviewer a phone number and asked, Cameron wrote, "to keep him in mind if we ever wanted anyone killed." And then Cameron wrote, "His metallic eyes and steel spring sneer as he assured us of his sincerity are not readily forgotten."
This anecdote is significant because, if true, it suggests that Cameron was himself present. Legitimate researchers employ field staffers who don't know the study's hypotheses and are carefully trained to communicate a nonjudgmental and respectful attitude to all respondents. These standards are especially critical in surveys that involve sensitive information. Cameron's presence would have fundamentally tainted any survey's results, because he had previously expressed strong antipathy toward homosexuals in the news media.
IR: Even if Cameron's data collection methods had been totally sound and his raw data accurate, you've identified numerous ways he twisted his analysis of that data to support his sensational conclusions. What's an example of that?
HEREK: One example is a 1996 report in which Cameron claimed that children with a homosexual parent are 50 times more likely to be victims of incest than children with heterosexual parents. He based this finding on 17 responses to the 1983 survey in which a person reported both that one of their parents was a homosexual and that they had been an incest victim as a child. Now, those 17 represented 29% of the total number of respondents to Cameron's survey who reported having a homosexual parent. So Cameron compared that 29% to the 0.6% rate of reported incest among children of heterosexual parents and thereby arrived at the "50 times more likely" finding.
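The division behind that headline figure is straightforward to reproduce -- a quick check of the percentages quoted here, not Cameron's own calculation:

```python
# Reconstructing the arithmetic behind the "50 times more likely" claim,
# using the percentages quoted in the interview (not Cameron's own data files).
rate_children_of_homosexual_parents = 0.29     # 29% (the 17 respondents)
rate_children_of_heterosexual_parents = 0.006  # 0.6%

ratio = rate_children_of_homosexual_parents / rate_children_of_heterosexual_parents
print(f"Reported incest-rate ratio: about {ratio:.0f}x")  # ~48x, rounded up to "50 times"
```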
IR: What's the problem with that?
HEREK: The size of the sub-sample. Most of your readers are probably familiar with the idea of a "confidence interval." When they hear about public opinion poll results, they may have noticed that those are often framed in terms of a plus or minus. You know, "Fifty percent of the population said this, and these results are accurate within plus or minus three points or five points."
Cameron based his findings on a sub-sample of 17 respondents. The margin of error in a random poll or survey of 17 people is plus or minus 33 percentage points. Based on that margin of error, the only statistically valid conclusion Cameron could have drawn from those 17 responses is that the true proportion of children with a homosexual parent who were incest victims lies somewhere between minus 4%, or effectively zero, and 62% -- such a wide range as to be absolutely meaningless.
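Herek's numbers can be reproduced with a simple margin-of-error sketch. The convention assumed below -- an interval of roughly three standard errors around the observed 29% -- is an assumption on our part, but it matches the plus-or-minus 33 points and the minus-4%-to-62% range he describes:

```python
import math

# Reconstructing the interval Herek describes. The exact convention is assumed
# here: a ~3-standard-error (roughly 99.7%) interval around the observed proportion.
n = 17          # size of the sub-sample Cameron relied on
p_hat = 0.29    # observed proportion (29%)

standard_error = math.sqrt(p_hat * (1 - p_hat) / n)   # ~0.11
margin = 3 * standard_error                           # ~0.33, i.e. about 33 points

low, high = p_hat - margin, p_hat + margin
print(f"Margin of error: +/-{margin:.0%}")            # +/-33%
print(f"Interval: {low:.0%} to {high:.0%}")           # -4% to 62%
```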
It's important to understand that even one of these serious errors -- in Cameron's sampling techniques, his survey methodology or his interpretation of results -- would be enough to cast serious doubt on the legitimacy of any study's findings. In combination, they render his data worthless.
IR: What's your evaluation of the basis for Cameron's numerous published studies claiming homosexuals are far more likely to molest children than heterosexuals?
HEREK: For quite a long time, Cameron has operated on the equation that any adult male who molests a male child is homosexual. That has the effect of inflating, in Cameron's own analysis, the number of homosexual and bisexual men who fall into the child molestation category. The truth is that researchers in the area of child abuse have long questioned whether it is even valid to assign labels like "homosexual" and "heterosexual" to molesters in the sense of saying that these individuals have an adult sexual orientation in which they are attracted mainly to people of their own sex or to people of the other sex. A lot of researchers actually put pedophiles and child molesters into their own separate category: they are attracted to children, and it has everything to do with the age of the child and not so much with the gender of the child. So it's really questionable in general whether it makes sense to use labels like "homosexual" and "heterosexual" to describe a child molester's sexual orientation. But Cameron has repeatedly just assumed that if an adult male molests a male child, then that molester is gay, and that's not a valid assumption.
IR: How would you characterize Cameron's status within the scientific community?
HEREK: You couldn't say his findings have much currency at all. I think most scientists either ignore him or are unaware of him. I say this because his research is rarely cited in the scientific literature, and when it is cited, it's almost always to criticize it -- either as an example of bad research or as research done with a particular political agenda. Outside of academia, the place where his research mostly gets cited is on the websites of organizations that have a very strong anti-gay agenda and that are trying to work against legislation that would protect gay men and lesbians from discrimination, or are trying to further some other political goal that is hostile to gay men and lesbians. The problem is that Cameron's studies are presented with extensive footnotes and lengthy bibliographies, so members of the lay public who lack training in research methods and statistics may encounter them and mistakenly assume they're scientifically sound.
IR: You've testified to Congress about anti-gay violence, and in 1997 you were invited to President Clinton's White House Conference on Hate Crimes. What connections do you see between Cameron's studies and hate crimes against gays and lesbians?
HEREK: Well, that's a good question, and it raises a broader issue about what the connection is between the anti-gay political movement that exists in this country and violence against lesbians and gay men as well as individual acts of discrimination and prejudice.
After the Matthew Shepard murder, I was asked if I thought it had anything to do with the fact that just a week before, conservative Christian groups had started their campaign [arguing] that gay people can change their sexual orientation and don't have to be gay. And the thing is, I don't necessarily believe the sorts of people who commit hate crimes are reading the publications of these groups or studying Paul Cameron's research. On the other hand, I think there is an indirect influence, in that the anti-gay organizations that use Cameron's data are helping to create a social climate that vilifies people who are lesbian, gay, or bisexual -- that says they are not really human, that they are evil people. And what that conveys as well is a certain sense of permission to do something about it. It sets up a situation in which people might well perceive that attacking gay people, either verbally or physically, is somehow a way to be a virtuous person, because here are these terrible, evil, awful people.
Now, as far as the linkage between Cameron's work and more structural manifestations of prejudice, there I believe it definitely is the case that these different groups have tried to use Cameron's data in legislative and judicial hearings, and they've tried to actually bring his statistics to bear on public policy. The use of his data has been attempted a number of times in different courts and legislatures. And actually, I might add, this is not just happening in the United States. I've gotten a number of e-mails from people in Australia, Norway, Scotland, all over the world, where anti-gay groups have used Cameron's work and sometimes even had him there in person to try to pass anti-gay legislation or to try to have an influence on policy. And that's cause for concern, because were it not for the sensational nature of Cameron's data, the public would most likely never have heard of him -- from a scientific standpoint, his work is meaningless.