1/22/2006

Reading your mind?

Quote from Slate
http://www.slate.com/id/2161936/


Full-Mental Nudity
The arrival of mind-reading machines.
By William Saletan

Years ago, Woody Allen used to joke that he'd been thrown out of college as a freshman for cheating on his metaphysics final. "I looked within the soul of the boy sitting next to me," he confessed.

Today, the joke is on us. Cameras follow your car, GPS tracks your cell phone, software monitors your Web surfing, X-rays explore your purse, and airport scanners see through your clothes. Now comes the final indignity: machines that look into your soul.

With the aid of functional magnetic resonance imaging, neuroscientists have been hard at work on Allen's fantasy. Under controlled conditions, they can tell from a brain scan which of two images you're looking at. They can tell whether you're thinking of a face, an animal, or a scene. They can even tell which finger you're about to move.

But those feats barely scratch the brain's surface. Any animal can perceive objects and move limbs. To plumb the soul, you need a metaphysician. John-Dylan Haynes, a brilliant researcher at Germany's Bernstein Center for Computational Neuroscience, is leading the way. His mission, according to the center, is to predict thoughts and behavior from fMRI scans.

Haynes, a former philosophy student, is going for the soul's jugular. He's trying to clarify the physical basis of free will. "Why do we shape intentions in this way or another way?" he wonders. "Your wishes, your desires, your goals, your plans—that's the core of your identity." The best place to look for that core is in the brain's medial prefrontal cortex, which, he points out, is "especially involved in the initiation of willed movements and their protection against interference."

To get a clear snapshot of free will, Haynes designed an experiment that would isolate it from other mental functions. No objects to interpret; no physical movements to anticipate or execute; no reasoning to perform. Participants were put in an fMRI machine and were told they would soon be shown the word "select," followed a few seconds later by two numbers. Their job was to covertly decide, when they saw the "select" cue, whether to add or subtract the unseen numbers. Then, they were to perform the chosen calculation and punch a button corresponding to the correct answer. The snapshot was taken right after the "select" cue, when they had nothing to do but choose addition or subtraction.
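To make the task concrete, here is a minimal sketch of one trial's logic in Python. The timing, the console interface, and the random stand-in for the participant's private choice are placeholder assumptions, not the actual paradigm code.

import operator
import random
import time

OPS = {"add": operator.add, "subtract": operator.sub}

def run_trial():
    print("select")                        # cue: covertly choose to add or subtract
    chosen = random.choice(list(OPS))      # stands in for the participant's private decision
    time.sleep(2)                          # delay: only the bare intention exists to be scanned
    a, b = random.randint(1, 9), random.randint(1, 9)
    result = OPS[chosen](a, b)             # the chosen calculation is performed only now
    print(f"numbers {a} and {b}: button pressed for {result}")
    return chosen, result                  # the pressed answer later reveals which choice was made

run_trial()

The crucial point is that the numbers appear only after the choice has been made, so a scan taken during the delay captures the intention and nothing else.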

Until this experiment, which was reported last month in Current Biology, nobody had ever tried to take a picture of free will. One reason is that fMRI is too crude to distinguish one abstract choice from another. It can only show which parts of the brain are demanding blood oxygen. That's too coarse to distinguish the configuration of cells that signifies addition from the configuration that signifies subtraction. So, Haynes used software to help the computer recognize complex patterns in the data. To dissect human thought, the computer had to emulate it.

Each participant took the test more than 250 times, choosing independently in each trial. The computer then looked at a sample of the scans, along with the final answers that revealed what choices had actually been made. It calculated a pattern and used this pattern to predict, from each participant's remaining scans, his or her decisions in the corresponding trials. Haynes checked the predictions—add or subtract—against the participants' answers. The computer got it right 71 percent of the time.
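For readers curious about the shape of that analysis, here is a minimal sketch in Python using scikit-learn. The synthetic "voxel patterns," the linear support-vector classifier, and the 50/50 train/test split are assumptions for illustration, not Haynes' actual data or pipeline; they simply show why single voxels are too coarse while the joint pattern can still be decoded.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 256, 50                  # roughly 250 trials per participant; voxel count assumed

# 0 = "add", 1 = "subtract", freely chosen on each trial
choices = rng.integers(0, 2, n_trials)

# Synthetic data: each intention adds a faint template to heavy noise, so no single
# voxel separates the two choices, but the pattern across voxels carries information.
templates = rng.normal(0, 1, (2, n_voxels))
scans = 0.3 * templates[choices] + rng.normal(0, 1, (n_trials, n_voxels))

# Learn the pattern from a sample of trials, then predict the held-out trials.
X_train, X_test, y_train, y_test = train_test_split(scans, choices, test_size=0.5, random_state=0)
classifier = SVC(kernel="linear").fit(X_train, y_train)
print(f"decoding accuracy: {classifier.score(X_test, y_test):.0%}")   # well above the 50% chance level

The accuracy on this toy data will not match the 71 percent reported; the point is the procedure itself: fit a pattern on some trials, score it on the rest.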

I know what you're thinking: Why would anyone want a machine to read his mind? But imagine being paralyzed, unable to walk, type, or speak. Imagine a helmet full of electrodes, or a chip implanted in your head, that lets your brain tell your computer which key to press. Those technologies are already here. And why endure the agony of mental hunt-and-peck? Why not design computers that, like a smart secretary, can discern and execute even abstract intentions? That's what Haynes has in mind. You want to open a folder or an e-mail, and your computer does it. Your wish is its command.

But if machines can read your mind when you want them to, they can also read it when you don't. And your will isn't necessarily the one they obey. Already, scans have been used to identify brain signatures of disgust, drug cravings, unconscious racism, and suppressed sexual arousal, not to mention psychopathy and propensity to kill.

Haynes understands the objection to these scans—he calls it "mental privacy"—but he buys only half of it. He doesn't like the idea of companies scanning job applicants for loyalty or scanning customers for reactions to products (an emerging practice known as neuromarketing). But where criminal justice is at stake, as in the case of lie detection, he's for using the technology. Ruling it out, he argues, would "deny the innocent people the ability to prove their innocence" and would "only protect the people who are guilty."

I hear what he's saying. I'd love to have put Khalid Sheikh Mohammed through an fMRI before Sept. 11, 2001, instead of waiting six years for his confession. And I wish we'd scanned Mohamed Atta's brain before he boarded that flight out of Boston. But what Haynes is saying—and exposing—is almost more terrifying than terrorism. The brain is becoming just another accessible body part, searchable for threats and evidence. We can sift through your belongings, pat you down, study your nude form through your clothes, inspect your body cavities, and, if necessary, peer into your mind.

FMRI is just the first stage. Electrodes, infrared spectroscopy, and subtler magnetic imaging are next. Scanners will shrink. Image resolution and pattern-recognition software will improve.

But don't count out free will. To make human choice predictable, you first have to constrain it so that it's not really free. That's why Haynes confined his participants to arithmetic, gave them only two options, and forbade them to change their minds. They could have wrecked his experiment by defying any of those conditions. So could you, if somebody came at you with a scanner or an electrode helmet. To look into your soul and get the right answer, science, too, has to cheat. Somewhere, Woody Allen is laughing. I can feel it.

...........................................

Enough of this metaphysical nonsense from Saletan.
fMRI (and neuroscience for that matter) is a materialist enterprise. I'm getting pretty sick of Saletan coming at us again and again with the same "we've found the soul, we've scanned choices, natural selection is shaped by free will... did I blow your mind?" routine.

No, dooood, you did not. The reason it sounds so freaky to scan "free will" or "the soul" is that you haven't... no one can and no one will. If these things exist at all, they are nonphysical, so for the purposes of science they can safely be ignored.

What you scan with an fMRI or an EEG, or measure with a key press, are the consequences of the physical events that constitute thought. Cognition is a material activity. Once you get that concept through your head, you can stop with the mysticism.

For instance, "thinking about faces" is a cognitive state instantiated in matter... thus, it has a chance to show up on an fMRI. The cognitive state we experience as "the intention to add or subtract" is also instantiated in matter. There is nothing transcendental about an fMRI picking that up, too, unless you willfully misinterpret what's going on. Who's to say a cognitive "intentional state" is uncaused, separate from the material world, or otherwise special or supernatural? Why can't an intentional state be caused just like anything else? The answer is "it can".

There's plenty of wonder to be had in the science itself... this is very cool stuff. We don't need Saletan muddying the waters with inappropriate, outdated shorthands borrowed from Cartesian dualism and medieval philosophy just to hook us.

--Mangar


:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

GokuRakuAn

Daruma Museum, Japan

:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::