A Jan. 23 congressional study claimed that salaries for women
managers in seven out of 10 industries examined had declined
from 1995 to 2000. Uncritical newspapers rushed to announce that
the wage gap between men and women had widened.
But National Review columnist Betsy Hart took the time to
examine the study commissioned by Reps. Carolyn Maloney, D-N.Y.,
and John Dingell, D-Mich.
She found it to be a "biased and
highly-emotionalized reinterpretation" that the "creative" staff
of Maloney and Dingell had imposed upon otherwise
straightforward data. The reinterpretation allowed Maloney to
label the study explosively as "a wake-up call" for America and
to hint at the need for more federal regulation in the workplace.
Hart phoned Maloney's office, identified herself, and spoke
directly to the congresswoman, who mistakenly assumed the
journalist was also a liberal feminist. Maloney explained that
the existence of wage discrimination was considered a fact and
that the study had been a search for the supporting data. Then, in
Hart's words, "Maloney ... shared with me her intention to keep
the Right from finding out what she and Dingell are up to."
After all, she didn't "want to scare the right wing so that they
stop collecting data" on women and the workplace.
Anyone familiar with what passes for statistics within
feminism will not be surprised by the willful corruption of
data. An underlying assumption of data-manipulators is that
people are too stupid to notice the sleight of hand. The media
exacerbates the problem by not asking the most basic questions
about statistics, even ones with surprising conclusions. For
example, journalists rarely ask, "What is the margin of error?"
or "Does the average reflect a mean or a median?" Public schools
contribute their share by failing to teach the fundamentals of
statistical analysis.
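As a minimal sketch of why those two questions matter, consider
the following Python snippet. The salary figures in it are
invented purely for illustration, not drawn from any study: a
single outlier drags the mean far above the median, and the
margin of error shows how much play there is in a headline number.

```python
# Minimal sketch of two questions reporters rarely ask, using invented
# salary figures: is the "average" a mean or a median, and what is the
# margin of error?
import math
import statistics

# Hypothetical sample: nine modest salaries plus one large outlier.
salaries = [32_000, 34_000, 35_000, 36_000, 38_000,
            40_000, 41_000, 43_000, 45_000, 250_000]

mean = statistics.mean(salaries)      # dragged upward by the outlier
median = statistics.median(salaries)  # resistant to the outlier

# Rough 95% margin of error for the mean (normal approximation).
margin = 1.96 * statistics.stdev(salaries) / math.sqrt(len(salaries))

print(f"mean:   ${mean:,.0f}")    # $59,400
print(f"median: ${median:,.0f}")  # $39,000
print(f"95% margin of error on the mean: +/- ${margin:,.0f}")
```

A reporter who simply asks which of those two numbers a press
release is calling "the average" has already done more vetting
than most.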
It is important to guard against those who twist data and,
then, wield the results as political weapons. Every statistic
should be required to answer several questions before you accept
it:
1. Who says so? This inquires into the possible bias of the
researchers. For example, Maloney's staffers might well be
biased toward processing data in a manner that supports
legislation the congresswoman favors. The source of their income
doesn't invalidate what they say, but it does call for taking a
closer look at their data.
2. How do they know? Unbiased researchers may employ a
sloppy methodology that comes from laziness or error. For
example, a much-cited study entitled "Prostitution, Violence and
Post-Traumatic Stress Disorder" collected data from
streetwalkers in four "strolls" that were notorious for drug use
and violence. Yet the study draws conclusions about all
prostitutes, including high-paid call girls. In short, it uses
an unrepresentative sample to draw broad conclusions about a
general population.
3. What's missing? Always place the data within a proper
context. The GAO report upon which the congressional "study" is
based openly states its limitations: It does not control for
highly significant wage factors such as "years of continuous
presence in the workforce." As Hart comments, "studies which do
control for all relevant factors continually show that the wage
gap between men and women virtually or totally disappears."
4. Does the conclusion make sense? Do not let statistics
displace your common sense. Consider a "fact" popularized
several years ago by feminist Naomi Wolf: 150,000 American women
die each year of anorexia. According to the Centers for Disease
Control and Prevention, this would make anorexia the
fourth-leading cause of death in both males and females. Yet the
CDC has no record of such deaths. The grossly inflated number had
been taken
from a newsletter of the American Anorexia and Bulimia
Association, which claimed 150,000 to 200,000 women "suffered"
from anorexia nervosa. The actual annual death toll is closer to 100.
5. Did someone change the subject? Researchers often
redefine terms in such a manner as to produce desired results.
For example, by the word "rape" most people mean forced
intercourse. But feminist studies frequently include all sexual
assault under that label. In turn, sexual assault is sometimes
expanded to include harassment.
Popular statistics — e.g., "one
in four female college students will be victimized by rape or
attempted rape" — must include the definition of "rape" being
used in order to be meaningful.
Finally, and most importantly, remember that a correlation
does not indicate cause and effect. A correlation is a mutual
relationship between A and B — for example, when one rises, the
other tends to rise or fall along with it. A cause-and-effect
relationship means that
A causes B. Consider the claim that women make 75 percent as
much as men for doing the same job. The statement draws a
correlation between being a woman and earning power, but it says
nothing about cause and effect. The 75 percent (if true) may be
caused by other factors not weighed by the study. For example,
women often leave the workplace to have children. This factor
alone may cause much of the wage gap.
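To make the point concrete, here is a toy simulation in Python;
every number in it is invented for illustration. Wages are
generated from years of continuous workforce presence alone, with
no sex term at all. A raw gap still appears, and it largely
vanishes once workers with similar experience are compared.

```python
# Toy simulation with invented parameters: wages depend only on continuous
# years in the workforce, yet a raw female/male wage gap still appears.
import random
import statistics

random.seed(0)

def simulate_worker(is_woman: bool) -> tuple[float, float]:
    # Illustrative assumption: women average fewer continuous years,
    # e.g., because of time out of the workforce to raise children.
    years = max(0.0, random.gauss(12 if is_woman else 16, 4))
    wage = 30_000 + 2_000 * years + random.gauss(0, 3_000)  # no sex term
    return years, wage

women = [simulate_worker(True) for _ in range(5_000)]
men = [simulate_worker(False) for _ in range(5_000)]

raw = statistics.mean(w for _, w in women) / statistics.mean(w for _, w in men)
print(f"raw female/male wage ratio: {raw:.2f}")  # roughly 0.87, not 1.00

# "Control" for experience: compare only workers with 14-16 years.
def mean_wage(group, lo, hi):
    return statistics.mean(w for y, w in group if lo <= y < hi)

matched = mean_wage(women, 14, 16) / mean_wage(men, 14, 16)
print(f"ratio among workers with 14-16 years: {matched:.2f}")  # near 1.00
```

Nothing here proves what drives real-world wages; it only shows
that a raw ratio, by itself, cannot distinguish discrimination
from an uncontrolled variable.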
In his definitive yet delightfully simple book "How to Lie With
Statistics," Darrell Huff observes, "The secret language of
statistics, so appealing in a fact-minded culture, is employed
to sensationalize, inflate, confuse, and oversimplify."
Yet statistics are too useful to dismiss. Instead, the
secrecy should be removed.
With Huff tucked under your arm, unafraid "right-wingers"
should approach Maloney's statistics and ask: Who funded them?
Are the conclusions overbroad? What is their context? How is each
relevant term defined? Is this a correlation rather than cause
and effect? Develop this level of skepticism
toward data, and four out of three times you won't go wrong.