Tuesday, April 12, 2011

One of our readers destroys the 'study' that the new Title IX sexual assault directive relies on

Yesterday, we posted a very important story about the new Title IX directive that is going to change how rape cases are handled on U.S. college campuses.  We noted that the U.S. Education Department's Office for Civil Rights has just done something most people would not have thought possible. It's made college campuses even less friendly for male students. Among many other things, it has lowered the standard of proof for sexual assault cases -- from now on, to be in compliance with Title IX, a school need only find a male student responsible at a grievance hearing on a sexual assault charge by a preponderance of the evidence.

A reader of this blog -- I can vouch for this man's brilliance -- destroys the rape study that this new Title IX directive cites. If you want to know how dishonest our own government can be when the subject is rape, read this. The following are the words of our reader:

So I was reading FRS today, and following some links on that incredible Title IX stuff, and I soon realized that the “study” Biden and others are citing right now is a new one — one I’d never seen before. So I decided to take a look.


It took me about three minutes to realize that it was bullshit.

Never mind the first paragraph of the Executive Summary, which reads like it was crafted by the director of a rape crisis center. And never mind that the (ahem) research was done by a bunch of academics whose objectivity can immediately be called into question. And focus instead on this little nugget:

To recruit the students who were sampled to participate in the CSA Study, we relied on both recruitment e-mails and hard copy recruitment letters that were mailed to potential respondents. Sampled students were sent an initial recruitment e-mail that described the study, provided each student with a unique CSA Study ID#, and included a hyperlink to the CSA Study Web site. During each of the following 2 weeks, students who had not completed the survey were sent a follow-up e-mail encouraging them to participate. The third week, nonrespondents were mailed a hard-copy recruitment letter. Two weeks after the hard-copy letters were mailed, nonrespondents were sent a final recruitment e-mail. The overall response rates for survey completion for the undergraduate women sampled at the two universities were 42.2% and 42.8%, respectively.


Are you kidding me?

Are you fucking kidding me?

That response rate, to put it mildly, is a Statistics 101 problem. When fewer than half of the sampled students respond, the people who opt in can differ systematically from the people who don't, and self-selection of that kind can badly skew a prevalence estimate. This is covered before the mid-term, and likely before the first pop quiz.

These people are so shameless, so utterly and totally full of themselves and of their own shit, that they don’t even bother to hide their bias. Or their incompetence. Either of which a college freshman, when she’s not being sexually assaulted before Thanksgiving, should be able to discern in an instant.

In the study, they claim to have “cleaned” the data to adjust for this — as if that’s fucking possible! — but then also have the gall to suggest that, well, this method may have had the opposite effect:

The reasons for nonresponse could affect prevalence estimates in opposing ways. Some nonrespondents (nonvictims) may have chosen not to participate because they felt that they had no relevant experience, whereas other respondents (victims) may have chosen not to participate because they anticipated that taking the survey might be upsetting to them.
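To see how much that kind of differential nonresponse can swing a prevalence estimate, here's a quick back-of-the-envelope calculation. All the numbers below are hypothetical -- they are not from the CSA Study -- but they're chosen so the overall response rate lands near the study's reported ~42%:

```python
# Back-of-the-envelope: how differential nonresponse distorts a
# prevalence estimate. All rates here are hypothetical illustrations.

def observed_prevalence(true_prev, victim_rate, nonvictim_rate):
    """Expected prevalence among respondents when victims and
    nonvictims respond to the survey at different rates."""
    victims_responding = true_prev * victim_rate
    nonvictims_responding = (1 - true_prev) * nonvictim_rate
    return victims_responding / (victims_responding + nonvictims_responding)

true_prev = 0.05  # hypothetical true rate in the sampled population

# Case 1: victims more likely to respond ("I have something to report")
high = observed_prevalence(true_prev, victim_rate=0.60, nonvictim_rate=0.41)

# Case 2: victims less likely to respond (anticipate the survey as upsetting)
low = observed_prevalence(true_prev, victim_rate=0.25, nonvictim_rate=0.43)

print(f"estimate if victims over-respond:  {high:.1%}")
print(f"estimate if victims under-respond: {low:.1%}")
```

With the same true rate and roughly the same ~42% overall response rate, the survey's estimate can come out well above or well below the truth depending on who chose to answer -- and nothing in the response rate alone tells you which way the bias ran.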

You’ll excuse me, now, while I go throw up.