Showing posts with label pseudoscience. Show all posts

Saturday, April 03, 2010

Moral Confusion


Slightly shorter Sam Harris:

Good news, Freethinkers! I've come up with a way for us to define morality in universal, scientific terms, which will make it easier and more economical to use. Basically, it's a simple matter of defending vague utilitarian platitudes by means of 1) crotchety appeals to common sense; 2) a firm conviction that my critics are intellectually or emotionally defective; and 3) several platoons of strawmen.

I regret to say that a number of people have been going out of their way to disagree with me. I've even been accused, with malice aforethought, of logical fallacies like petitio principii. As if!

Why do people treat me this way, when all I'm trying to do is help them recognize that the foundations of human knowledge must perforce stand upon the ontological bedrock of the "rather obvious"? And to comprehend that all so-called "alternative" theories of moral judgment secretly agree with mine?

There are several reasons. Some of my critics have clearly fallen under the spell of philosophers who happen to be wrong, duh. Others, I'm sorry to say, are such appalling moral cripples that I can scarcely stand to be in the same room with them. (Ah me, how I've suffered on behalf of Truth!)

Another part of the problem is our modern tendency toward moral relativism. This may possibly have made some sort of sense as a response to the ethnocentric authoritarianism of the bad old days, but only an imbecile would use it as a shield against Eternal Verities like the ones I intend to discover.

If error can run so rampant among supposedly intelligent people, you can imagine how bad things are elsewhere. That's why we need to create — or acknowledge [waves modestly] — a class of "moral experts" who will all agree on certain basic, undeconstructible concepts, just like teh physicists do.

The first step is to purge ourselves of emotionalism and political correctness, and recognize that some people's lives simply aren't successful, in moral and therefore scientific terms. Any practical concerns about what might happen after "moral experts" make this judgment are misplaced, because the people making it are the smart ones, in scientific and therefore moral terms.

This, as anyone who's neither stupid nor bad will concede, is exactly what the world has been missing all these years.

It's all so obvious! And yet it's so hard to compel people to believe it, even though it's for their own good. Why can't everyone simply be reasonable?

PS. Don't be talkin' to me about Hume and Rawls and people like that. I've read them.
Now, if you'll excuse me, I'm going to lie down in a sewer and eat some tinfoil.

UPDATE: RMJ has much more, qualitatively and quantitatively speaking.

Friday, February 05, 2010

Playing a Role


A press release from the Society for Research in Child Development explains the findings of a new study:

Moms influence how children develop advanced cognitive functions

Executive functioning is a set of advanced cognitive functions — such as the ability to control impulses, remember things, and show mental flexibility — that help us plan and monitor what we do to reach goals. Although executive functioning develops speedily between ages 1 and 6, children vary widely in their skills in this area. Now a new longitudinal study tells us that moms play a role in how their children develop these abilities.
The study "looked at 80 pairs of middle-income Canadian moms and their year-old babies." (Apparently, all the fathers were out hunting mastodon.) Here's how the lead author describes her findings:
"The study sheds light on the role parents play in helping children develop skills that are important for later school success and social competence," according to Annie Bernier, professor of psychology at the University of Montreal and the study's lead author.
That's pretty straightforward, and reflects the language of the paper itself. So why does the press release focus on "moms," as though executive functioning were dependent specifically on the (middle-income) mother's presence and level of engagement? Beats me. But since this sort of distortion often happens when research into gender roles, sexuality, and parenthood is summarized for the media, I assume that the people in charge of PR know their market, and have a good sense of which emphasis is more likely to get the research noticed.

Which leads me, naturally enough, to this quote from Pierre Bourdieu:
[S]ocial scientists, and especially sociologists, are the object of very great solicitude, whether it be positive - and very often profitable, materially and symbolically, for those who opt to serve the dominant vision, if only by omission (and in this case, scientific inadequacy suffices) - or negative, and malignant, sometimes even destructive, for those who, just by practising their craft, contribute to unveiling a little of the truth of the world.

Thursday, January 07, 2010

The Big Questions


Barry Goldman is worried that we are becoming "a nation of fruitcakes." Once upon a time, y'see, Americans were reasonable and had a certain inborn respect for whatever authority they happened to find emotionally or intellectually or legally compelling. But nowadays, we're prone to syncretic religion and communication with the dead and Lord only knows what else.

A new poll by the Pew Research Center's Forum on Religion & Public Life concludes: "Large numbers of Americans engage in multiple religious practices, mixing elements of diverse traditions. Many also blend Christianity with Eastern or New Age beliefs such as reincarnation, astrology and the presence of spiritual energy in physical objects. And sizable minorities of all major U.S. religious groups say they have experienced supernatural phenomena, such as being in touch with the dead or with ghosts."
Shocking. It makes me long for the good old days, when Americans entrusted their souls to Madame Blavatsky, or the Mormons, or the Free Market, and the belief in ghosts was as rare as a sober Irishman.

On second thought, maybe the problem isn't simply that Americans believe outlandish things. Maybe it's that they can easily find confirmation of these beliefs on the Internets. It used to be that if you wanted to know the divine meaning of a monstrous birth in the next village, you had to buy a broadsheet from some itinerant peddler. Now, you can simply visit www.poughkeepsiedevilchild.com.

On third thought, maybe the problem is that "people feel entitled to make choices about things that used to be within the exclusive purview of the priestly class." The recent activities of Anne Hutchinson are a sobering example of this tendency, and a reminder that we who dwell in these latter days must keep close watch against the spirit of Apostasy.

Then again, perhaps this national crisis really boils down to a lack of consensus. Perhaps the trouble is not so much that we embrace false beliefs, as that we don't embrace them unanimously.
This is genuinely scary. And it's scary in a new way. For the last several thousand years, large groups of human beings enjoyed consensus about the big questions. We may have believed that the universe rested on the back of a giant tortoise and the tortoise rested on the back of an elephant...but at least there was widespread agreement.
To be fair, some people "enjoyed" this consensus a bit less than others. But I suppose it served them right for feeling entitled to make choices about things that used to be within the exclusive purview of the priestly class. As John Winthrop wisely said, "Your conscience you must keep, or it must be kept for you."

Of course, the larger problem is that for the first time in recorded history, "we have no agreement on what constitutes a fact." As William of Ockham recently noted on his blog:
[I]t is absurd to claim that I have scientific knowledge with respect to this or that conclusion by reason of the fact that you know principles which I accept on faith because you tell them to me.
All the same, there's a lot to be said for accepting things on faith, so long as it leads to the sort of widespread agreement we enjoyed before the Internets came along and ruined everything.
We used to be a nation with a broad consensus. If you had a religious question, you asked a religious leader. If you had a scientific question, you asked a scientist.
And if you wanted to know something about history, you asked a historian, instead of cobbling together a bunch of harebrained bullshit in order to flatter your own hopelessly confused prejudices.

Those were great days, indeed. And their like will not be here again.


Thursday, October 29, 2009

Data and Logic


Steve Levitt explains economics:

Our question, as noted above, is what is the cheapest, fastest way to quickly cool the Earth. Like every question we tackle in Freakonomics and SuperFreakonomics, we approach the question like economists, using data and logic to conclude that the answer to that question is geo-engineering....
And ocean acidification:
"Of course, ocean acidification is an important issue. Now, there are ways to deal with ocean acidification, right, it's actually, that's actually, we know exactly how to un-acidify the oceans, is to pour a bunch of base into it, so, so if that turns out to be an incredibly big problem, then we can deal with that."
(Photo by Rakka.)


Tuesday, February 24, 2009

Some Kind of Link


Since everyone knows that teenagers have no desire for sex unless they've been corrupted by some Pied Piper of sleaze, research into pop-music lyrics is Serious Business.

There's still no firm proof that raunchy music makes kids have sex, but a new study provides another suggestion that there's at least some kind of link between "degrading" songs and teenage sexual activity.
Once you assume that there's some kind of link between hearing about sex and having sex, confirming the theory is relatively light work: Over here, Britney Spears is singing about being a slave 4 U. Over there, a couple of teenagers are having awkward, fumbling sex. Game, set, match!

There are degrading sexual messages in TV shows. And commercials. And magazine ads. And comic books. And those morning radio shows where faux-populist hosts get the workforce's adrenaline flowing by degrading women, laughing at tragedy, and generally acting like a bunch of loudmouthed fucking assholes. For that matter, some children may be exposed to degrading sexual behavior and attitudes in their own homes. But teens' so-called music is obviously much more significant, speaking from a purely objective sociological standpoint. That's just common sense.

Because this is Science, the first thing we need is a clear definition of "degrading."
The researchers...determined how many of the 279 most popular songs in 2005 were "degrading" because they referred to sex that's "based only on physical characteristics" and features a "power differential" instead of being mutually consensual.
First off, sex that's "based only on physical characteristics" is not degrading in and of itself; it's acceptable, in this day and age, for men and women to have recreational sex with people with whom they share no other interests, and have no intention of dating, let alone marrying (assuming, for the sake of argument, that they belong to that lucky group of citizens who are allowed to marry).

This kind of sex is often degrading because we live in a society that sees it as degrading, for reasons having to do with misogyny and homophobia and good old-fashioned ressentiment.

Second, the whole problem with a power differential is that it can create consent (cf. the groupie phenomenon). In some cases, consent can even be seen as an additional degradation (cf. the woman in the clothing ad above).

So right off the bat, the definition of degradation is mired in the same psychosexual fever swamp as the lyrics it's intended to critique. People who have casual sex for no more transcendent reason than ordinary human lust are "degraded" (especially if they're women). But where there's consent, one needn't look too closely for a power differential. (If only our horny teens could get this through their skulls, somehow!)

Those are somewhat arcane objections, though. There are more basic problems here.
"Wait (The Whisper Song)" by the rap group known as Ying Yang Twins was deemed degrading, apparently because it included a reference to rough intercourse.
You can Google the lyrics, if you like, and decide for yourself whether it's the "reference to rough intercourse" that makes the song degrading. I see a larger problem, personally.
By contrast, the lyrics of the rap song "Baby I'm Back" by Baby Bash, including the lines "I wanna be stronger than we've ever been/I'm here to cater to you," was said to be not degrading.
Just for the record, this song is about a two-timin' man who wants to be allowed back into his jilted partner's...er...good graces, and it includes such tender blandishments as "please forgive me for being a rolling stone; please forgive me, let me polish it up like chrome," and "I was gone for a minute; now I'm back, let me hit it."

At this point, I think that we can all agree that this study has a couple of conceptual flaws. On the bright side, these problems are more than made up for by the high quality of the results:
[T]he findings don't prove that the music caused kids to have sex, acknowledged [Dr. Brian A.] Primack, who's an assistant professor of medicine and pediatrics at the University of Pittsburgh School of Medicine.

"The opposite could be true -- that people who have more sex then go out and seek music with degrading sexual messages," he said.
Well, at least we know that it's one or the other. It's a start.

And we also know that teenagers shouldn't be having sex, period, responsibly or otherwise. Which means that something must be done!!1
What to do? Laura Lindberg, senior research associate at the Guttmacher Institute in New York City, said that teens need to learn how to interpret and analyze the messages they see in the world around them.
It seems to me that this is what they're doing already, and that adult culture isn't contradicting their interpretation in any serious way. A feminist (or at least anti-misogynist) overhaul of our society could help with that problem; unfortunately, that would require a certain amount of acceptance of teenagers as autonomous sexual beings, among other concessions to reality that we find emotionally or politically or logically unacceptable. That leaves us with our longstanding two-pronged approach: spouting anti-sex pieties that are totally at odds with adult culture -- and which teenagers can see through like a gang sign thrown by Jay Leno -- and reinforcing the sexual stereotypes and double standards on which those pieties are based.

But even if we can't handle the issue in all its complexity, we can at least avoid oversimplifying it:
"[T]here's no silver bullet," [Lindberg] said. "If you get all teenagers to turn in their iPods, the teen pregnancy rate is not going to automatically decline."
I'm not so sure. If we take all their iPods away, and pregnancy rates drop, what other explanation could there possibly be?

And besides, how can we know until we try? That's the scientific method, after all.

(Illustration: Ad for Mr. Leggs Trousers, 1970. Via Found in Mom's Attic.)

Wednesday, October 01, 2008

Scary Stories


Chris Horner finds it amusing that members of Mensa -- who are supposed to be, like, all smart and shit -- have invited the archfiend James Hansen to be the keynote speaker at an upcoming synod, or convocation.

How can bright people believe, like the UN Secretary General, that computer model scenarios of the future are more frightening than Hollywood movies? Because they’re . . . real?

Well, apparently because they also accept observed, um, truths like “It is now firmly established that Earth’s global surface temperature is increasing and that human emissions of greenhouse gases (GHGs) are the primary cause of that global warming.”
Horner may not know much about climate modeling, but he knows what he likes. And he likes this:


Most people, when they attempt to win an argument by posting a graph, make some effort to explain what the graph measures, and how, and why, and so forth. Horner can be forgiven for this oversight, as he's writing exclusively for blue-ribbon experts in the field of applied neopaleoconservato-climatologographical discoverism.

As for me, I write for the Plain People of the Blogosphere, who may not have much book-larnin', but know how many apples makes four. Thus, I think I should point out that the "2002" on the bottom left represents the year 2002, while the "2008" on the right represents the year that we're all enjoying right now, viz., 2008.

What we have here, in other words, is a six-year snapshot of temperature anomalies and CO2. The latter is going up. The former are fluctuating, but trending downwards.

You may have heard it said that there are insufficient data to support AGW; after all, the instrumental temperature record covers scarcely more than a century. No such objection is thinkable in regards to six years' worth of data, though. This one graph, spanning a mere three-quarters of the Bush administration and ending with a global cold spell, solves all problems, answers all objections, and refutes, in absentia, any "computer modeling scenarios" that predict this trend won't persist over the next decade.

Regardless, the conclusions Horner has drawn from it are wrong. And I have a chart that proves it.


What more evidence could anyone want?

Horner's no alarmist, generally speaking, but he is sincerely worried that Mensa is in danger of being "hijacked" by "activist members," just like the American Meteorological Society (which has apparently been taken over by zealots who insist on looking at climate data collected prior to 2002). A scary story, indeed.

Just for the record, the World Meteorological Organization measures climate averages over 30-year periods, in order to eliminate natural year-to-year variation. The question is: Are they crazy, or just plain stupid?
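The arithmetic behind the 30-year convention is easy to demonstrate with a toy calculation. Below is a minimal sketch using synthetic data (not real temperature records, and not anyone's actual model): a steady warming trend with a periodic year-to-year wobble layered on top. A least-squares fit over a cherry-picked six-year window comes out negative, while the fit over the full series recovers the underlying warming.

```python
import math

def slope(ts, ys):
    """Ordinary least-squares slope of ys regressed on ts."""
    n = len(ts)
    t_mean = sum(ts) / n
    y_mean = sum(ys) / n
    cov = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, ys))
    var = sum((t - t_mean) ** 2 for t in ts)
    return cov / var

# Synthetic annual anomalies: 0.02 C/yr warming plus a six-year wobble.
years = list(range(30))
anoms = [0.02 * t + 0.15 * math.sin(2 * math.pi * t / 6) for t in years]

full_trend = slope(years, anoms)              # positive: the real signal
short_trend = slope(years[1:7], anoms[1:7])   # negative: a cherry-picked window

print(f"30-year trend: {full_trend:+.3f} C/yr")
print(f" 6-year trend: {short_trend:+.3f} C/yr")
```

On this made-up series the six-year fit shows "cooling" even though the full-series fit is unambiguously positive, which is the whole point of averaging over periods long enough to wash out natural variability.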

Thursday, June 26, 2008

Ice Cream or Oil?


CNBC has posted a helpful slideshow that allows concerned consumers to become better informed -- in a certain sense -- about the price of oil. Its title is revealing: "More Expensive by the Barrel: Ice Cream or Oil?"

As you click through the slides, you learn that Coca-Cola is currently a bit less expensive than light sweet crude, while Perrier comes in at a shocking $300 per barrel.

"Who would have thought water would burn a hole in your pocket?" the caption asks...a sobering question indeed, for those whose vehicles are powered by imported soda water.

Still, it's nothing compared to the horror of the Starbucks latte, at $954.24 per barrel. If our society used these lattes to manufacture and distribute and sell oil, instead of the other way around, we'd really have something to complain about.

And get this: Those hippies Ben and Jerry are always talking about saving the planet...but their ice cream costs $1609 per barrel! Typical liberals, eh? It turns out that perfume is much more expensive than oil, too. You have to admit, it kind of puts things in perspective.

In a better world, a slideshow like this one might ask readers to think about oil's role in creating and transporting and setting the prices of the nonessential goods to which it's being compared. It might even manage to discuss interesting issues like external costs and government subsidies. But things being as they are, the slideshow is simply a propaganda piece for Big Oil, and it absolutely drips with contempt for its readers.

It also fails to acknowledge the "mileage" you get from a barrel of, say, Tabasco sauce. Sure, it costs $6155...but that adds up to about 2,789 bottles, which I'd say is an ample, if not generous, supply for most households. I love Tabasco and use it in almost everything I cook; even so, I doubt I've gone through 20 bottles of it in my entire life.

By contrast, $6155 will get you about 1,540 gallons of gasoline at current prices. That's 110 full tanks if you've got a 14-gallon tank, or roughly a year's supply if you're filling up twice a week, as so many drivers do. Perhaps this would be a better basis for comparison, since more Americans buy gasoline by the gallon than crude oil by the barrel?
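The per-gallon arithmetic above is trivial to check. Here's a quick sketch; the $6,155-per-barrel Tabasco figure comes from the slideshow, the 42-gallon barrel is the standard US oil barrel, and the roughly $4-per-gallon mid-2008 pump price and 14-gallon tank are the assumptions used above.

```python
GALLONS_PER_BARREL = 42      # standard US oil barrel

tabasco_per_barrel = 6155.0  # CNBC's figure, in dollars
gas_per_gallon = 4.0         # rough mid-2008 US pump price (assumed)
tank_gallons = 14            # typical sedan fuel tank (assumed)

gas_gallons = tabasco_per_barrel / gas_per_gallon
full_tanks = gas_gallons / tank_gallons

print(f"${tabasco_per_barrel:,.0f} buys about {gas_gallons:,.0f} gallons of gas")
print(f"...or roughly {full_tanks:.0f} full {tank_gallons}-gallon tanks")
```

Which lands on the same "about 1,540 gallons, or 110 fill-ups" as the paragraph above.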

Then again, doing it that way wouldn't be nearly as much fun. You know what else costs more than oil? Chateau Mouton-Rothschild 1945. So quit complaining, ya goddamn schmendriks, and fill 'er up!

Incidentally, the figures in this slideshow come from John S. Herold, Inc., whose "client base is comprised of virtually every major oil company."

(Photo by Steve Brandon, from his set Nepean, Ottawa suburb of infinite excitement!)

Tuesday, May 06, 2008

The Art of Correct Reading


Carol Iannone takes a break from fretting over America's declining academic standards, and complains that biology's standards are too high:

Ben Stein's Expelled is entertaining and informative. Among the bracing clarifications the film provides is that Darwinian evolution is not even necessary for the study of modern biology and medicine; that Darwin and religion are at odds, as Harvard biologist Edward O. Wilson has written; that the Nazis did indeed take Darwinian science as inspiration; and that science is even more fanciful about the origins of life than religion could ever be (one anti-Darwinian hypothesizes that organic life may have begun on the backs of crystals).
Iannone once spent some time debating NR's Anthony Dick on evolution. The paragraph above gives you some sense of how little she learned from it.

The claim that evolution (we'll ignore the cunning adjective "Darwinian") "is not even necessary for the study of modern biology and medicine" is remarkable not because it's such a brazen lie, but because Iannone is so careful to designate this imaginary biology-without-evolution as "modern." This is as exquisite a critical distinction as I've seen, and I'm sure it'll help to advance Iannone's reputation as a brave thinker of thoughts about things.

Hitler did indeed take certain useful elements of his era's "Darwinian science" as inspiration, when he wasn't being equally inspired by Goethe, Herder, Schopenhauer, Fichte, Hans Horbiger, Henry Ford, Karl May, Napoleon, occultism, Christianity, mythology, and anything else that appealed to his magpie mind. In Mein Kampf, he sets forth his method quite clearly:
A man who possesses the art of correct reading will, in studying any book, magazine, or pamphlet, instinctively and immediately perceive everything which in his opinion is worth permanently remembering, either because it is suited to his purpose or generally worth knowing....
What's "generally worth knowing," from the standpoint of a power-hungry ideologue, is precisely that which serves the purpose of gaining and maintaining power. It's immaterial whether Jesus died on the cross or not, so long as you can score political points by blaming the Jews for it.

If he were alive today, Hitler might well embrace Intelligent Design; it's by no means incompatible with pseudoscientific racism, it's vague enough to leave room for Odin as well as YHWH, and it's prone to a type of wishful thinking and paranoia that could easily have resonated with his own (if we weren't intended by some supernatural intelligence to discriminate against the Jews, how come they were designed to look Jewish?).

Current scientific thinking on race wouldn't be any more suited to Hitler's purpose than it is to, say, Charles Murray's; as manufactured anti-elitist causes go, ID would probably have been at least as useful as Horbiger's Welteislehre was in its day. This idle speculation aside, it's clear enough that Hitler had little interest in any scientific theory (or historical account, or philosophy, or religious dogma, or art) that he couldn't use to prop up opinions he'd already formed.

Iannone's final point, about science being more "fanciful" than any religion, is both gratuitous and, ironically enough, irreligious. This is much more significant than any of her "attacks" on evolution, the basic facts of which hardly need defending.

Iannone would like you to think that she worships an omnipotent god. At the same time, she wants to treat certain (potentially divine) origins and mechanisms of life as too "fanciful" to have any chance of being correct. But unless she's a Biblical literalist of a type that's really pretty rare in her circles, she's free to view any and every fanciful thing as revelation, from hummingbird beaks to animated hypercubes to crystal-mediated abiogenesis. The Lord works in mysterious ways, or so I've heard.

Her crowd generally avoids this more humble and affirmative approach, though, lest they be reduced to cheering science from the sidelines for the insights it provides into the Great Chain of Being, instead of challenging or usurping whatever authority its relatively objective successes have earned.

In other words, what's being defended here is not God's omnipotence or wisdom, or some other spiritual doctrine, but the drab materialism of political domination, which requires the basso profundo of divine thundering to ground and dignify the mandrake shrieks of authoritarian dimwits like Iannone.

To assume that this is actually a clash between belief and unbelief, or reason and faith, is basically to concede the argument to the Right before it begins (which is, unfortunately, a tactic at which the Left has excelled for as long as I've been paying attention). It's not about religion or science; it's about the need to capture or marginalize opposing sources of authority. The typical left/atheist outrage is already factored into this game, and is in fact one of the main justifications for playing it.

Wednesday, December 19, 2007

A Constantly Modified Ideal


Wuxtry, wuxtry! Here's the latest news from the Pleistocene!

Like most people, we’re guided by the instinctive sense that a bigger nest is a happier nest. Though we know maxing out our ecological footprint might involve picking up some bad carbon karma, we feel somewhere deep in our guts that we need this house in order to be happy.
Since we feel this need for more square footage "somewhere deep in our guts," it must be the legacy of our hunter-gatherer ancestors' struggles in the Eemian interglacial era.
Most of us don’t need to worry about freezing or starving to death. Yet our happiness barometer continues to compare our living rooms and countertops and backyard barbecues with a constantly modified ideal. “We are victims of that evolutionary hunting strategy,” Rayo explained when I called him to discuss my real estate challenge. “There’s a difference between what’s natural and what's good.”
Given that the desire for an oversized home is "natural," we might reasonably expect it to be pretty much universal. However, the average American house is twice the size of the average house in Europe or Japan. I guess they've had more luck than we have in casting aside Pleistocene hunting strategies.

Interestingly, the architecture in "primitive" villages tends to be fairly uniform, and larger buildings are usually designed to fulfill communal functions. The scarcity of building materials probably has something to do with this, in many cases. But unless I'm mistaken, this approach is also typical of pre-capitalist societies, which have a somewhat different concept of surplus wealth than people do in, say, Palm Springs.

Furthermore, primitive building patterns tend to reflect tribal organization and belief. To depart dramatically from the standard size and construction would represent a break with tradition, and therefore, perhaps, a failure of socialization.

All things considered, we might question whether McMansions in suburban Ohio are really more "natural," in psychological terms, than tiny huts in the Brazilian rainforest. We might even question whether it's wise to use consumption patterns in 21st-century North America as the key to life in the Pleistocene, so that we can then use life in the Pleistocene to explain - or maybe even justify - consumption patterns in 21st-century North America.

The astonishing mystification to which this theorizing can give rise, even among people of good will, is on display in the article's closing paragraphs. The author has bought a house that's too big, thanks to his throbbing biological urges. In order to redeem himself, he intends to rent out the rooms he doesn't need, and form a commune of sorts:
Our acquisitive, status-hungry genes may wish for a life more grand, more private, more sweepingly elegant and expansively lonely. But scarcity will have relegated us to a life of conviviality and trust.
Where is the evidence that our "status-hungry genes" demand expansive loneliness? Where, for that matter, is the evidence that "a life of conviviality and trust" doesn't confer status (as well as reproductive benefits)?

Why, it's all around us. It's just a matter of looking in the right places.

(Photo: "A traditional Kunama village of Eritrea and Ethiopia," by Mutanga Mulambwa.)

Monday, November 26, 2007

Errors and Heresies


Dinesh D'Souza is troubled by a certain...laxity in modern thought, and goes in search of its parentage:

About a hundred years ago, two anti-religious bigots named John William Draper and Andrew Dickson White wrote books promoting the idea of an irreconcilable conflict between science and God. The books were full of facts that have now been totally discredited by scholars. But the myths produced by Draper and Dickson continue to be recycled. They are believed by many who consider themselves educated, and they even find their way into the textbooks. In this article I expose several of these myths, focusing especially on the Galileo case, since Galileo is routinely portrayed as a victim of religious persecution and a martyr to the cause of science.
Like pretty much everyone who's even marginally informed on this issue, I have little respect for the work of Draper or White. But it's not quite fair to call them "anti-religious," since Draper was a deist, and White an Episcopalian. Also, the conflict they described was between science and organized religion, not "science and God."

I'm sorry to say that D'Souza's account of the "Galileo myth" isn't very accurate, either. After conscientiously explaining that "the leading astronomers of the time were Jesuit priests," he assures us that
They were open to Galileo’s theory but told him the evidence for it was inconclusive. This was the view of the greatest astronomer of the age, Tyco [sic] Brahe.
D'Souza's clear implication is that Tycho counted and weighed Galileo's theory and found it wanting. The problem is, Galileo's theory was based on observations he'd made in 1610, with the aid of a new-fangled device known as the telescope; Tycho had been lying in the cold, cold ground for nine years by then.

Next, D'Souza claims that the Church didn't "dogmatically" oppose heliocentrism, but simply demanded a little more proof than Galileo was able to provide. From there, it's a short step to the incoherent position that "the Church should not have tried him at all," but nonetheless deserves credit for the tender mercy of putting him under house arrest for the rest of his life.

Finally, he says that "Galileo was neither charged nor convicted of heresy." Which is technically true, sort of: The heliocentric system was described as "formally heretical," and Galileo was therefore "vehemently suspected by this Holy Office of heresy." He was accordingly given the opportunity to "abjure, curse, and detest the above-mentioned errors and heresies and any other error and heresy contrary to the Catholic and Apostolic Roman Church." Which he did, perhaps because he found it preferable to the alternative.

Obviously, I'm not expecting anyone to be shocked that a column by D'Souza is full of serious errors. I am a bit curious, though, as to whether anyone believes that they're the product of stupidity or ignorance, rather than a fairly sophisticated sense of what he can get away with, given his audience's slavering appetite for lies.

(Illustration from Sidereus Nuncius by Galileo, 1610.)

Tuesday, October 09, 2007

Humorless Fanatics


America's kulturkampfers are normally a tough-minded, unemotional bunch. But remind them about the Christlike sufferings of Larry Summers - or, if you're in a hurry, wait for them to remind themselves - and they'll weep like a captured squonk.

For instance, the irrepressible George Leef is heartstricken that unlike Larry Summers, Iranian President Mahmoud Ahmadinejad was given the opportunity to say stupid things on an American college campus:

The New Republic points out the idiocy of embracing freedom of speech for a murderous dictator, but refusing to listen to a respected scholar.
It's probably my superficial, postmodern intoxication with the play of difference that causes me to believe there's a distinction to be made between "embracing freedom of speech" and "refusing to listen" to someone who's exercising it.

And between "refusing to listen" to a respected scholar, and disagreeing with what he says.

And between disagreeing with him, and providing compelling evidence that his "academic speculation" amounts to little more than bad-faith bullshit.

And between visiting a college in one's capacity as a foreign leader, and running it in one's capacity as its president.

Leef seems to understand that he's sunk himself in a dangerous rhetorical swamp, and accordingly attempts to pull himself out by his own hair, like Baron Munchausen:
Why are so many American professors humorless fanatics who think that everything they do has to revolve around pushing their political vision?
Beats me. As I understand it, Summers is a hero to conservatives for arguing, despite a lack of relevant expertise and evidence, that women are underrepresented in the sciences because they lack certain "intrinsic aptitudes." If that doesn't qualify as "pushing a political vision," it's hard to imagine what would. Even granting the conservatarian tendency to see informed disagreement with male-supremacist pseudoscience as neo-Lysenkoism, it's hard to see how Leef can claim with a straight face that people "refused to listen" to Summers.

Between that, and the fact that he feels personally affronted by the mere existence of Latina/o Studies, I'm thinking Leef may want to direct his search for humorless fanatics to the nearest mirror.

(Illustration via Oldtasty.)

Monday, September 24, 2007

An Attraction to Legacy


An article in The Boston Globe describes the changes that are supposedly afoot in the life sciences:

For half a century, the core concept in biology has been that every cell carries within its nucleus a full set of DNA, including genes. Each gene, in turn, holds coded instructions for assembling a particular protein, the stuff that keeps organisms chugging along.

As a result, genes were assigned an almost divine role in biological "dogma," thought to govern not only such physical characteristics as eye color or hair texture, but even much more complicated characteristics, such as behavior or psychology. Genes were assigned blame for illness. Genes were credited for robust health. Genes were said to be the source of the mutations that underlay evolution.

But the picture now emerging is more complicated, one in which illness, health, and evolutionary change appear to be the work of almost fantastical coordination between genes and swaths of DNA previously written off as junk.
With this in mind, let's peek in on Dr. Lonnie Aarssen, as he searches assiduously for the "Mommy gene":
“Only in recent times have women acquired significant control over their own fertility, and many are preferring not to be saddled with the burden of raising children," says Lonnie Aarssen, a Biology professor who specializes in reproductive ecology. "The question is whether this is just a result of economic factors and socio-cultural conditioning, as most analysts claim, or whether the choices that women are making about parenthood are influenced by genetic inheritance from maternal ancestors that were dominated by paternal ancestors.”
These women "prefer" not to be "saddled" with a "burden"...what else could this be but a clue to the rigors of daily life in the Pleistocene?
Dr. Aarssen suggests that because of inherited inclinations, many women when empowered by financial independence are driven to pursue leisure and other personal goals that distract from parenthood.

“The drive to leave a legacy through offspring can be side-tracked by an attraction to legacy through other things like career, fame, and fortune – distractions that, until recently, were only widely available to men”.
Enjoy it while it lasts, bitches, 'cause the times they are a-changin' back:
The women who leave the most descendants will be those with an intrinsic drive for motherhood. The ones who would rather forego parenthood in order to have a career, lavish vacations and leisurely lifestyles will of course leave no descendants at all. Over time those genetic traits that influence women away from motherhood will necessarily be ‘bred out.’
I suppose we could fret over the loaded language here (e.g., an "intrinsic drive" that some women would "rather forego"), but it'd be irresponsible given the gravity of the threat we face. If feminism is to survive, its adherents must learn to ignore "personal goals that distract from parenthood," as well as the "economic factors" that seem (to the untrained observer) to offer an alibi for childlessness.

Can these barren quasi-women learn to embrace the "genetic predisposition for mating and having children"? Or will they continue to heed their "genetic inheritance from maternal ancestors that were dominated by paternal ancestors," and consign themselves to the evolutionary ash-heap?

Only time will tell.

(Illustration by W.K. Haselden, 1906.)