A Self-Made Man
Woman Goes Undercover to Experience Life as a Man
March 6, 2007
This report originally ran on January 20, 2006.
Norah Vincent has lived as a man. She didn't undergo a sex change or radical hormone treatments. She simply went undercover. In an extraordinary feat of acting, disguise and guts, Vincent lived among men -- as a man -- for 18 months to see what life was like on the other side of the gender divide.
"This wasn't just a stunt. This was about learning. This is a human project. It was about finding something out about the human creature. ... And I learned it the best possible way because I went through it," Vincent told ABC's JuJu Chang.
Growing up in the Midwest with her actress mother, lawyer father and two older brothers, Vincent was a tomboy with a flair for the dramatic. She says she's still a tomboy, and a lesbian living in midtown Manhattan with her partner, Lisa.
At 5 feet, 10 inches and 155 pounds, Vincent passed as a medium-build man she called Ned. Her transformation began with a buzz cut, baggy men's clothes, and a too-small sports bra to flatten her breasts. She even wore a little padding in a jock strap. For the rest, she enlisted the help of makeup artist Ryan McWilliams, who created Ned's five-o'clock shadow.
Then there was the theatrical component. Vincent underwent months of training with Juilliard voice teacher Kate Maré to learn how to sound like a man. "Women have much stronger nasal resonances as a rule," Maré explained.
When all the pieces were put together -- hair, makeup, voice, posture and style -- the transformation was complete, and Norah Vincent became Ned Vincent.
Becoming One of the Guys
Vincent, a journalist, didn't take the project lightly. She estimates she put on Ned's whiskers and clothes about 150 times during her 18-month experiment. "I wanted to enter males' spheres of interest and ... see how men are with each other. I wanted to make friends with men. I wanted to know how male friendships work from the inside out," she told "20/20."
Vincent's first act as a newly minted male was to join a quintessential bastion of camaraderie -- a men's bowling team in a working-class Pennsylvania neighborhood. The only problem: She's a terrible bowler.
But the men didn't boot her off the team. "It's an amazing thing, because I think that shows you the generosity that they had," she said.
Her experience with these men overturned some of her long-held perceptions: that men are harsh and rejecting, and that women are warm and welcoming.
"I mean, it was just the most wonderful rush to get these guys' handshakes, and I felt comfortable, I mean as comfortable as I could feel, right away. They just took me in ... no questions asked," she said.
The team bowled together for nine months and gradually Vincent gained entrance to their inner sanctum. She found that all the cussing and good-natured ribbing is just how men often show affection for one another.
Near the end of the team's run, Vincent decided to reveal herself as a woman. Nervous about how the guys would react, she tested the waters with Jim, the guy she had become closest with.
Vincent took Jim out for a drink with her partner, Lisa, and told him she had something to say that was going to "blow his mind."
"I said the only thing that would blow my mind is if you told me that you were a girl and that she was a guy. And she goes, well, you're half right," Jim said.
Later, Jim told the rest of the teammates, who all took it well.
Jim said he thinks Vincent came into the experiment with some misconceptions about men. "I think she expected to find like a bunch of guys just talking about women's private parts and a bunch of racists and, you know. I think, kind of, that's what she came into this thinking," he said.
Vincent agreed. "They really showed me up as being the one who was really judgmental, because they were the ones who took me in, not knowing anything about me. They were the ones who made me their friend ... no judgments attached," Vincent said.
Sex: 'For a Man, It's an Urge'
Cracking the mystery of a "boys' night out" is one thing, but understanding the explicit world of a man's sexuality is quite another.
To gain an understanding of what some might consider the quintessential male experience, Vincent went to several strip clubs with a male friend. She describes the experience as hellish -- demeaning for the strippers and even worse for the men.
"I saw the men there. I saw the looks on their faces. This is not about appreciation of women, of course. It's not about appreciation of their own sexuality. It's about an urge and ... that's not always that pleasurable, really," she said.
Vincent said strip joints are about pure sex drive -- completely empty of any meaningful interaction, even when a woman is gyrating on your lap.
Even though Vincent is attracted to women, she said she was never aroused during her visits to the clubs. "I really ran smack up against the difference between male and female sexuality. It's that female sexuality is mental. ... For a man, it's an urge," she said.
"At its core, it's a bodily function. It's a necessity. It's such a powerful drive and I think because we [women] don't have testosterone in our systems, we don't understand how hard it is," she said.
Vincent even dabbled in the art of picking up women and agreed to wear a hidden camera for ABC News during her exploits.
She was quickly reminded that in this arena, it's women who have the power, she said.
"In fact, we sit there and we just with one word, 'no,' will crush someone," she said. "We don't have to do the part where you cross the room and you go up to a stranger that you've never met in the middle of a room full of people and say the first words. And those first words are so hard to say without sounding like a cheeseball or sounding like a jerk."
Vincent encountered some pretty cold shoulders in her attempts at the bar, but she did manage to go on about 30 dates with women as "Ned," mostly arranging them on the Internet.
Vincent said the dates were rarely fun and that the pressure of "Ned" having to prove himself was grueling. She was surprised that many women had no interest in a soft, vulnerable man.
"My prejudice was that the ideal man is a woman in a man's body. And I learned, no, that's really not. There are a lot of women out there who really want a manly man, and they want his stoicism," she said.
Three Weeks at a Monastery
Vincent didn't limit her exploration of masculinity to just friendships and sexuality. She said she found differences in every walk of life, including shopping for a new car at a dealership.
When she went in as Norah, the salesman's pitch quickly turned flirtatious; when she returned to the same salesman as Ned, the tone was all business and the talk was all about the car's performance.
In Vincent's final months as Ned, she managed to infiltrate all-male environments. A lapsed Catholic, Vincent thought it would be interesting to penetrate the cloistered inner world of a monastery. Ned managed to live there for three weeks as a trainee. The monks, Vincent said, were pious, smart men. But they were still men.
She said she witnessed a "desperate need for male intimacy and the lack of ability to give it" at the retreat. It was "really painful," she added.
Not only were the monks struggling to be open and intimate, Vincent said they were hostile to her feminine side. She said she was ostracized because of the monks' assumptions about her sexual orientation.
"Many of them thought I was gay, as one of them told me in confession. ... And I said, 'Well, yeah, but not in the way you think,'" Vincent said.
Vincent thought the perfect end to her 18-month saga would be to join a men-only therapy group, a place where guys tried to bond and show their emotions instead of hiding them.
Again, Vincent saw the men struggle with vulnerability. "They don't get to show the weakness, they don't get to show the affection, especially with each other. And so often all their emotions are shown in rage," she said.
Instead, Vincent said, the men talked about rage, often their rage toward women, and what they would do physically and violently toward women.
"A lot of this was blowing off steam. ... They would talk about fantasizing about chopping up their wives or something. It's not that they would ever do that, but it was a way to get out the blackest thoughts," she said.
Vincent began to empathize with the fear and stress men feel at always having to be the strong provider.
Once again, some group members thought Ned was gay, but nobody suspected Ned was a woman. After eight sessions, the group went on a back-country weekend retreat, but Vincent's 18 months as an impostor were closing in on her.
"The pressure of being someone that you're not and ... the fear of discovery and the deceit that it involves piles up and piles up. So, by the time I got around to doing this men's group, it was really reaching critical mass," she said.
"I was out in the woods with a bunch of guys who had rage issues about women and I was in drag ... and I thought, oh, God, you know, what am I doing," she added.
She continued her emotional descent, and a week later, checked in to a hospital with severe depression. Identity, she concluded, was not something to play around with.
"When you mess around with that, you really mess around with something that you need that helps you to function. And I found out that gender lives in your brain and is something much more than costume. And I really learned that the hard way," she said.
Vincent says she's healed now and glad to be rid of Ned. But her views about men have changed forever.
"Men are suffering. They have different problems than women have, but they don't have it better," she said. "They need our sympathy. They need our love, and maybe they need each other more than anything else. They need to be together."
Ironically, Vincent said, it took experiencing life as a man for her to appreciate being a woman. "I really like being a woman. ... I like it more now because I think it's more of a privilege."
January 21, 2009
My dead body being eaten by worms? That's disgusting
Scientific advances and cultural diversity mean that people's instinctive reactions are playing a bigger role in politics
Jackie Ashley
The Guardian, Monday 20 November 2006
This may sound more morbid than it was, but I was talking to my husband about how we would like to be finally disposed of, and it triggered an argument about that most modern of political phenomena, the yuk factor.
For me, cremation is the wholesome, hygienic solution. It is what progressive people go for automatically. Along with being pro-social-justice, pro-abortion-rights and anti-homophobia, favouring cremation is just what you expect these days. He, it turns out, would be happy to be buried. I react against burial with instinctive horror - yuk - all those worms and microbes and slow, Victorian-style squelch as the rain pelts down and the headstones discolour. He shrugs: so what?
So we have different yuk factors. Some go into spasms about spiders. Some keep rats or snakes as pets. Orthodox Jews or Muslims offered pork, or Hindus confronted by beef, will feel a spasm of disgust, just as many vegetarians will to any cooked flesh. Most of us find the idea of female circumcision utterly disgusting but clearly some African men don't.
The yuk factor has always had an entertainment value, too, from the campfire stories about the tribe down the valley who paint their faces with dung, to I'm a Celebrity, Get Me Out of Here, with its endless and highly successful formula of grub- and beetle-eating, sticking crawly things down your pants and being scuttled over as you sleep. Out in the jungle, David Gest isn't the only thing to make you cringe.
In politics, though it is little discussed, the yuk factor also has a history. The anti-hanging campaign was motivated in part by the feeling that it was simply disgusting, unspeakable, to take a living, healthy human being, slip a rope over his or her head and then break their neck, whatever crime had been committed. And you only have to scratch the surface of the abortion debate to come across gruesome images meant to produce a visceral, instinctive recoil.
But it seems to me that the yuk factor is becoming steadily more important. Partly it is the result of migration and multiculturalism, so that different instincts, which were once separated by thousands of miles, coexist in the same street. Halal butchers are one example. Another is the disgust that many traditional Muslims as well as Christians feel when they see sexually charged images glinting from advertising hoardings and magazine covers. A few decades ago, if you heard someone say "that's indecent", you would assume that here was an old-fashioned person reacting to the modern world. Now it can as easily be a politically charged assault on someone else's culture.
But it goes a lot wider than that. Science is hugely extending its challenge to our yuk-response. Take the harvesting of stem cells. Earlier this month a team at Newcastle University asked for permission to place human DNA into cow eggs, which had previously been scooped out, producing human-cow hybrids. The idea is to grow human stem cells, which may be vital for treating Alzheimer's, Parkinson's and stroke, without having to operate on women to take their eggs. The scientists see it as an efficient, humane shortcut and this is starting to happen all round the world.
A couple of years ago Chinese scientists in Shanghai successfully fused human cells with rabbit eggs, producing what were said to be the first human-animal chimeras successfully created. They were allowed to develop for several days in a dish before they were destroyed for their stem cells. In Nevada, sheep implanted with human stem cells before birth grew partially human inner organs; that was research aimed at curing diabetes. For some people, this is an utterly disgusting game with the essence of human identity, which will lead, if unchecked, to an age of biological monsters, eugenics and who knows what else. It is the ultimate yuk. For others, stem-cell therapy is the most exciting medical frontier, which will bring us better and healthier lives and end some of the health terrors all round us today.
More knowledge of human biology tends to bring more potential solutions to illnesses but also more yuk factor. The debate on the maximum time for abortions would not have developed as it has without the detailed and vivid womb images now available. Our growing knowledge of the brain and how consciousness happens gives us new insights into drug abuse, mental illness and disease, but it also makes many people queasy - the very idea of our precious inner thoughts becoming visible as mere electricity and heat. It's hardly controversial to say that these kinds of issue are going to become politically hotter all the time.
So how should we begin to include the yuk factor in political debate? Is it actually debatable at all? Nothing seems to produce angrier reactions than abortion, human cloning, stem-cell research or genetic modification of food. But we can find ways through, if we bear in mind a few basics. Respecting other people's "yuks" is important, because mostly we cannot avoid instant reactions. But that should be only the start.
Every time we recoil, we have to think about the consequences of banning something, and we have to remember that other people react differently. Modern open societies are not tribes which can thoughtlessly impose their taboos. To me, the yuk factor in hanging, or female circumcision, is about violence and oppression and non-negotiable. But yuk to stem cells has to be confronted by yuk to senile dementia and its humiliations. Yuk to euthanasia is fine. But there is another yuk, which is watching someone die slowly and painfully, full of fury that they cannot control their end. Let us have a thoughtful parity of disgust.
For we need to remember that the yuk factor is not the essence of what makes us human, but is almost infinitely malleable. Other societies found cannibalism and incest acceptable. Not long ago, most people found male homosexuality so disgusting they were happy for it to be a criminal offence. So it is possible, indeed likely, that current reactions against mixed-DNA technology to harvest stem cells will change too.
The argument doesn't end there. It never will end. Many religious people feel oppressed by a tide of arrogant rationalism. Those of us on the other side have to start engaging with a new emotionalism in politics. Disgust is a good warning. But, like any warning, it needs to be investigated, probed and challenged. Though, thus far, I haven't quite escaped my fear of treacly, wormy soil.
Hearts & Minds
Since Plato, scholars have drawn a clear distinction between thinking and feeling. Now science suggests that our emotions are what make thought possible.
By Jonah Lehrer April 29, 2007
Just over 50 years ago, a group of brash young scholars at an MIT symposium introduced a series of ideas that would forever alter the way we think about how we think.
In three groundbreaking papers, including one on grammar by a 27-year-old linguist named Noam Chomsky, the scholars ignited what is now known as the cognitive revolution, which was built on the radical notion that it is possible to study, with scientific precision, the actual processes of thought. The movement eventually freed psychology from the grip of behaviorism, a scientific movement popular in America that studied behavior as a proxy for understanding the mind. Cognitive psychology has fueled a generation of productive research, yielding deep insights into many aspects of thought, including memory, language, and perception.
Tomorrow, Harvard University is celebrating this intellectual achievement with a discussion featuring Chomsky and other luminaries of the revolution. But even as Harvard, and the field, celebrate the 50th anniversary of a true paradigm shift, another revolution is underway.
Ever since Plato, scholars have drawn a clear distinction between thinking and feeling. Cognitive psychology tended to reinforce this divide: emotions were seen as interfering with cognition; they were the antagonists of reason. Now, building on more than a decade of mounting work, researchers have discovered that it is impossible to understand how we think without understanding how we feel.
"Because we subscribed to this false ideal of rational, logical thought, we diminished the importance of everything else," said Marvin Minsky, a professor at MIT and pioneer of artificial intelligence. "Seeing our emotions as distinct from thinking was really quite disastrous."
This new scientific appreciation of emotion is profoundly altering the field. The top journals are now filled with research on the connections between emotion and cognition. New academic stars have emerged, such as Antonio Damasio of USC, Joseph LeDoux of NYU, and Joshua Greene, a rising scholar at Harvard. At the same time, the influx of neuroscientists into the field, armed with powerful brain-scanning technology, has underscored the thinking-feeling connection.
"When you look at the actual anatomy of the brain you quickly see that everything is connected," said Elizabeth Phelps, a cognitive neuroscientist at NYU. "The brain is a category buster."
The field has largely welcomed the new emotion studies, according to scientists. They have yielded discoveries that are widely acknowledged as important. And they have even generated enthusiasm among the leaders of the cognitive revolution, as emotion studies have helped ground cognitive psychology -- which has had a penchant for the abstract -- in the real world, uncovering important science behind everything from how people decide what to buy in a supermarket to how they make weighty moral decisions.
"People were coming up with all these lovely theories that don't relate to anything that's going on in the real world," said Jerome Bruner, a psychologist at NYU and luminary of the cognitive revolution who will speak at the Harvard symposium. "If we can get back to a sense of cognition that's more grounded in reality, then that's a good thing."
. . .
From its inception, the cognitive revolution was guided by a metaphor: the mind is like a computer. We are a set of software programs running on 3 pounds of neural hardware. And cognitive psychologists were interested in the software. The computer metaphor helped stimulate some crucial scientific breakthroughs. It led to the birth of artificial intelligence and helped make our inner life a subject suitable for science.
For the first time, cognitive psychologists were able to simulate aspects of human thought. At the seminal MIT symposium, held on Sept. 11, 1956, Herbert Simon and Allen Newell announced that they had invented a "thinking machine" -- basically a room full of vacuum tubes -- capable of solving difficult logical problems. (In one instance, the machine even improved on the work of Bertrand Russell.)
Over time, these simulations grew increasingly sophisticated. By "reverse-engineering" the mind, cognitive psychologists gained important insights into how some basic mental processes, like learning and memory, might actually function. Much of the work developing the field was done at the Harvard Center for Cognitive Studies, which was founded in 1960 by Bruner and George Miller, who is now an emeritus professor of psychology at Princeton.
Speaking at that same 1956 symposium, Miller described how, at any given moment, our working memory could contain only about seven items of information. According to Miller, the mind dealt with this limited "channel capacity" by constantly grouping our sensations into "chunks." This suggested that crucial aspects of cognition were done, without our awareness, by the unconscious brain.
But the computer metaphor was misleading, at least in one crucial respect. Computers don't have feelings. Feelings didn't fit into the preferred language of thought. Because our emotions weren't reducible to bits of information or logical structures, cognitive psychologists diminished their importance.
"They regarded emotions as an artifact of subjective experience, and thus not worthy of investigation," said Joseph LeDoux, a neuroscientist at NYU.
In part, this was a necessary omission. Behaviorists attacked cognitive psychology as lacking rigor. Because our inner mental processes couldn't be measured, the behaviorists, eager to expunge anything that smacked of Freud or introspection, disregarded them as irrelevant and unscientific. Although cognitive psychologists aggressively defended their approach -- Chomsky quipped that defining psychology as the science of behavior was like defining physics as the science of meter reading -- they were inevitably forced to focus on the facets of cognition they could best understand. At the time, emotions just seemed too mysterious.
"These were nerdy guys interested in the nerdy aspects of cognition," said Steven Pinker, a psychologist at Harvard and moderator of tomorrow's panel. "It's not that our emotions aren't interesting topics of study, but these weren't the topics that they were interested in." Instead, early cognitive psychologists focused on the features of mind that seemed most machine-like, such as the construction of grammatical sentences.
Antonio Damasio, a neuroscientist at USC, has played a pivotal role in challenging the old assumptions and establishing emotions as an important scientific subject. When Damasio first published his results in the early 1990s, most cognitive scientists assumed that emotions interfered with rational thought. A person without any emotions should be a better thinker, since their cortical computer could process information without any distractions.
But Damasio sought out patients who had suffered brain injuries that prevented them from perceiving their own feelings, and put this idea to the test. The lives of these patients quickly fell apart, he found, because they could not make effective decisions. Some made terrible investments and ended up bankrupt; most just spent hours deliberating over irrelevant details, such as where to eat lunch. These results suggest that proper thinking requires feeling. Pure reason is a disease.
Scientists are now finding more examples of emotional processing almost everywhere they look. A study led by Brian Knutson of Stanford University, published last January, demonstrated that our daily shopping decisions depend on the relative activity of various emotional brain regions. What we end up buying is largely dictated by these instant feelings, and not by some rational calculation.
In 2004, Harvard psychologist Joshua Greene used brain imaging to demonstrate that our emotions play an essential role in ordinary moral decision-making. Whenever we contemplate hurting someone else, our brain automatically generates a negative emotion. This visceral signal discourages violence. Greene's data builds on evidence suggesting that psychopaths suffer from a severe emotional disorder -- that they can't think properly because they can't feel properly.
"This lack of emotion is what causes the dangerous behavior," said James Blair, a cognitive psychologist at the National Institute of Mental Health.
. . .
This new science of emotion has brought a new conception of what it means to think, and, in some sense, a rediscovery of the unconscious. In the five decades since the cognitive revolution began, scientists have developed ways of measuring the brain that could not have been imagined at the time. Researchers can make maps of the brain at work, and literally monitor emotions as they unfold, measuring the interplay of feeling and thinking in colorful snapshots. Although we aren't aware of this mental activity -- much of it occurs unconsciously -- it plays a crucial role in governing all aspects of thought. The black box of the mind has been flung wide open.
The increasing use of sophisticated imaging is clearly the direction in which the field is moving, scientists say. And yet some cognitive psychologists worry that this "trend to integrate with neuroscience" means that some aspects of cognition will be neglected.
"Everybody is now looking at these very big mental processes, like attention or emotion," said Pinker. "But I think that one of the great things about the cognitive revolution is that it went all the way down to the detailed rules and algorithms used by the mind. I hope we don't lose that."
Pinker hopes the Harvard commemoration will lead people to reflect on the cognitive revolution, to think about "what it got right and what it got wrong."
The lasting influence of the cognitive revolution is apparent in the language used by neuroscientists when describing the mind. For example, the unconscious is often described as a massive computer, processing millions of bits of information per second. Emotions emerge from this activity. Feelings can be seen as responses to facts and sensations that exist beyond the tight horizon of awareness. They can also be thought of as messages from the unconscious, as conclusions it has reached after considering a wide range of information -- they are the necessary foundation of thought.
As Jonathan Haidt, a social psychologist at the University of Virginia, recently wrote, "It is only because our emotional brains work so well that our reasoning can work at all."
Since Plato, scholars have drawn a clear distinction between thinking and feeling. Now science suggests that our emotions are what make thought possible.
By Jonah Lehrer April 29, 2007
Just over 50 years ago, a group of brash young scholars at an MIT symposium introduced a series of ideas that would forever alter the way we think about how we think.
In three groundbreaking papers, including one on grammar by a 27-year-old linguist named Noam Chomsky, the scholars ignited what is now known as the cognitive revolution, which was built on the radical notion that it is possible to study, with scientific precision, the actual processes of thought. The movement eventually freed psychology from the grip of behaviorism, a scientific movement popular in America that studied behavior as a proxy for understanding the mind. Cognitive psychology has fueled a generation of productive research, yielding deep insights into many aspects of thought, including memory, language, and perception.
Tomorrow, Harvard University is celebrating this intellectual achievement with a discussion featuring Chomsky and other luminaries of the revolution. But even as Harvard, and the field, celebrate the 50th anniversary of a true paradigm shift, another revolution is underway.
Ever since Plato, scholars have drawn a clear distinction between thinking and feeling. Cognitive psychology tended to reinforce this divide: emotions were seen as interfering with cognition; they were the antagonists of reason. Now, building on more than a decade of mounting work, researchers have discovered that it is impossible to understand how we think without understanding how we feel.
"Because we subscribed to this false ideal of rational, logical thought, we diminished the importance of everything else," said Marvin Minsky, a professor at MIT and pioneer of artificial intelligence. "Seeing our emotions as distinct from thinking was really quite disastrous."
This new scientific appreciation of emotion is profoundly altering the field. The top journals are now filled with research on the connections between emotion and cognition. New academic stars have emerged, such as Antonio Damasio of USC, Joseph LeDoux of NYU, and Joshua Greene, a rising scholar at Harvard. At the same time, the influx of neuroscientists into the field, armed with powerful brain-scanning technology, has underscored the thinking-feeling connection.
"When you look at the actual anatomy of the brain you quickly see that everything is connected," said Elizabeth Phelps, a cognitive neuroscientist at NYU. "The brain is a category buster."
The field has largely welcomed the new emotion studies, according to scientists. They have yielded discoveries that are widely acknowledged as important. And they have even generated enthusiasm among the leaders of the cognitive revolution, as emotion studies have helped ground cognitive psychology -- which has had a penchant for the abstract -- in the real world, uncovering important science behind everything from how people decide what to buy in a supermarket to how they make weighty moral decisions.
"People were coming up with all these lovely theories that don't relate to anything that's going on in the real world," said Jerome Bruner, a psychologist at NYU and luminary of the cognitive revolution who will speak at the Harvard symposium. "If we can get back to a sense of cognition that's more grounded in reality, then that's a good thing."
. . .
From its inception, the cognitive revolution was guided by a metaphor: the mind is like a computer. We are a set of software programs running on 3 pounds of neural hardware. And cognitive psychologists were interested in the software. The computer metaphor helped stimulate some crucial scientific breakthroughs. It led to the birth of artificial intelligence and helped make our inner life a subject suitable for science.
For the first time, cognitive psychologists were able to simulate aspects of human thought. At the seminal MIT symposium, held on Sept. 11, 1956, Herbert Simon and Allen Newell announced that they had invented a "thinking machine" -- basically a room full of vacuum tubes -- capable of solving difficult logical problems. (In one instance, the machine even improved on the work of Bertrand Russell.)
Over time, these simulations grew increasingly sophisticated. By "reverse-engineering" the mind, cognitive psychologists gained important insights into how some basic mental processes, like learning and memory, might actually function. Much of the work developing the field was done at the Harvard Center for Cognitive Studies, which was founded in 1960 by Bruner and George Miller, who is now an emeritus professor of psychology at Princeton.
Speaking at that same 1956 symposium, Miller described how, at any given moment, our working memory could contain only about seven bits of information. According to Miller, the mind dealt with this limited "channel capacity" by constantly grouping our sensations into "chunks." This suggested that crucial aspects of cognition were done, without our awareness, by the unconscious brain.
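Miller's notion of chunking can be made concrete with a toy sketch: the same string of digits strains a roughly seven-item working memory when stored raw, but fits easily once regrouped into meaningful units. The digits and group size below are invented for illustration, not taken from Miller's paper.

```python
# Toy illustration of Miller's "chunking": regrouping raw items into
# larger meaningful units reduces the number of slots needed in a
# capacity-limited working memory.

def chunk(items, size):
    """Group a flat sequence into consecutive chunks of the given size."""
    return [items[i:i + size] for i in range(0, len(items), size)]

digits = "149217761066"        # 12 raw digits: over the ~7-item limit
chunks = chunk(digits, 4)      # ["1492", "1776", "1066"]

print(len(digits))             # 12 items to hold if stored raw
print(len(chunks))             # only 3 chunks if each group is meaningful
```

If each four-digit group is recognised as a familiar date, the memory load drops from twelve items to three, which is the kind of unconscious recoding Miller had in mind.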
But the computer metaphor was misleading, at least in one crucial respect. Computers don't have feelings. Feelings didn't fit into the preferred language of thought. Because our emotions weren't reducible to bits of information or logical structures, cognitive psychologists diminished their importance.
"They regarded emotions as an artifact of subjective experience, and thus not worthy of investigation," said Joseph LeDoux, a neuroscientist at NYU.
In part, this was a necessary omission. Behaviorists attacked cognitive psychology as lacking rigor. Because our inner mental processes couldn't be measured, the behaviorists, eager to expunge anything that smacked of Freud or introspection, disregarded them as irrelevant and unscientific. Although cognitive psychologists aggressively defended their approach -- Chomsky quipped that defining psychology as the science of behavior was like defining physics as the science of meter reading -- they were inevitably forced to focus on the facets of cognition they could best understand. At the time, emotions just seemed too mysterious.
"These were nerdy guys interested in the nerdy aspects of cognition," said Steven Pinker, a psychologist at Harvard and moderator of tomorrow's panel. "It's not that our emotions aren't interesting topics of study, but these weren't the topics that they were interested in." Instead, early cognitive psychologists focused on the features of mind that seemed most machine-like, such as the construction of grammatical sentences.
Antonio Damasio, a neuroscientist at USC, has played a pivotal role in challenging the old assumptions and establishing emotions as an important scientific subject. When Damasio first published his results in the early 1990s, most cognitive scientists assumed that emotions interfered with rational thought. A person without any emotions should be a better thinker, since their cortical computer could process information without any distractions.
But Damasio sought out patients who had suffered brain injuries that prevented them from perceiving their own feelings, and put this idea to the test. The lives of these patients quickly fell apart, he found, because they could not make effective decisions. Some made terrible investments and ended up bankrupt; most just spent hours deliberating over irrelevant details, such as where to eat lunch. These results suggest that proper thinking requires feeling. Pure reason is a disease.
Scientists are now finding more examples of emotional processing almost everywhere they look. A study led by Brian Knutson of Stanford University, published last January, demonstrated that our daily shopping decisions depend on the relative activity of various emotional brain regions. What we end up buying is largely dictated by these instant feelings, and not by some rational calculation.
In 2004, Harvard psychologist Joshua Greene used brain imaging to demonstrate that our emotions play an essential role in ordinary moral decision-making. Whenever we contemplate hurting someone else, our brain automatically generates a negative emotion. This visceral signal discourages violence. Greene's data builds on evidence suggesting that psychopaths suffer from a severe emotional disorder -- that they can't think properly because they can't feel properly.
"This lack of emotion is what causes the dangerous behavior," said James Blair, a cognitive psychologist at the National Institute of Mental Health.
. . .
This new science of emotion has brought a new conception of what it means to think, and, in some sense, a rediscovery of the unconscious. In the five decades since the cognitive revolution began, scientists have developed ways of measuring the brain that could not have been imagined at the time. Researchers can make maps of the brain at work, and literally monitor emotions as they unfold, measuring the interplay of feeling and thinking in colorful snapshots. Although we aren't aware of this mental activity -- much of it occurs unconsciously -- it plays a crucial role in governing all aspects of thought. The black box of the mind has been flung wide open.
The increasing use of sophisticated imaging is clearly the direction in which the field is moving, scientists say. And yet some cognitive psychologists worry that this "trend to integrate with neuroscience" means that some aspects of cognition will be neglected.
"Everybody is now looking at these very big mental processes, like attention or emotion," said Pinker. "But I think that one of the great things about the cognitive revolution is that it went all the way down to the detailed rules and algorithms used by the mind. I hope we don't lose that."
Pinker hopes the Harvard commemoration will lead people to reflect on the cognitive revolution, to think about "what it got right and what it got wrong."
The lasting influence of the cognitive revolution is apparent in the language used by neuroscientists when describing the mind. For example, the unconscious is often described as a massive computer, processing millions of bits of information per second. Emotions emerge from this activity. Feelings can be seen as responses to facts and sensations that exist beyond the tight horizon of awareness. They can also be thought of as messages from the unconscious, as conclusions it has reached after considering a wide range of information -- they are the necessary foundation of thought.
As Jonathan Haidt, a social psychologist at the University of Virginia, recently wrote, "It is only because our emotional brains work so well that our reasoning can work at all."
Damasio's Error
Aaron Sloman
In 1994 Antonio Damasio, a well-known neuroscientist, published his book Descartes' Error. He argued that emotions are needed for intelligence, and accused Descartes and many others of not grasping that. In 1996 Daniel Goleman published Emotional Intelligence: Why It Can Matter More than IQ, quoting Damasio with approval, as did Rosalind Picard a year later in her book Affective Computing.
Since then there has been a flood of publications and projects echoing Damasio's claim. Many researchers in artificial intelligence have become convinced that emotions are essential for intelligence, and they are now producing many computer models containing a module called emotion.
Before that, serious researchers had begun to argue that the study of emotions and affect had not been given its rightful place in psychology and cognitive science, but their claims were more moderate. For example, a journal called Cognition and Emotion was started in 1987. Even I had a paper in it in the first year.
Damasio's argument rested heavily on two examples. The first was that of Phineas Gage. In 1848, an accidental explosion of a charge he had set blew his tamping iron through his head, destroying the left frontal part of his brain:
He lived, but having previously been a capable and efficient foreman, one with a well-balanced mind, and who was looked on as a shrewd, smart businessman, he was now fitful, irreverent, and grossly profane, showing little deference for his fellows. He was also impatient and obstinate, yet capricious and vacillating, unable to settle on any of the plans he devised for future action. His friends said he was no longer Gage. (www.deakin.edu.au/hbs/GAGEPAGE/Pgstory.htm)
The second example was one of Damasio's patients, whom he refers to as Elliot. Following a brain tumour and a subsequent operation, Elliot suffered damage in the same general brain area as Gage (the left frontal lobe). Like Gage, he experienced a great change in personality. Elliot had been a successful family man, and successful in business. After his operation he became impulsive and lacking in self-discipline. He could not decide between options when making the decision was important but both options were equally good. He persevered on unimportant tasks while failing to recognise priorities. He lost all his business acumen and ended up impoverished, even losing his wife and family. He could no longer hold a steady job. Yet he did well on standard IQ tests. (See http://serendip.brynmawr.edu/bb/damasio)
Both patients appeared to retain high intelligence as measured by standard tests, but not as measured by their ability to behave sensibly. Both had also lost certain kinds of emotional reactions. What follows from these cases?
In a nutshell, here is the argument Damasio produced which many people in many academic disciplines enthusiastically accepted as valid:
Damage to frontal lobes impairs emotional capabilities.
Damage to frontal lobes impairs intelligence.
Therefore, emotions are required for intelligence.
The conclusion does not follow from the premises. (Whether the conclusion is true is a separate matter, which I'll come to.) Compare this argument proving that cars need functioning horns in order to start:
Damaging the battery stops the horn working in a car.
Damage to the battery prevents the car starting.
Therefore, a functioning horn is required for the car to start.
A moment's thought should have reminded Damasio's readers that two capabilities could presuppose some common mechanism, so that damaging the mechanism would damage both capabilities, without either capability being required for the other. For instance, even if both premises in the horn argument are true, you can damage the starter motor and leave the horn working, or damage the horn and leave the starter motor working.
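The structure of the fallacy can be made concrete with a toy dependency model of the car analogy: the battery is a shared dependency of both the horn and the starter, so damaging it disables both capabilities, yet neither capability depends on the other. The component names and model are illustrative only.

```python
# Toy model of Sloman's car analogy: a common mechanism (the battery)
# underlies two independent capabilities (horn, starter). Damaging the
# common mechanism impairs both, but neither requires the other.

class Car:
    def __init__(self):
        self.battery_ok = True
        self.horn_ok = True
        self.starter_ok = True

    def horn_works(self):
        return self.battery_ok and self.horn_ok

    def can_start(self):
        return self.battery_ok and self.starter_ok

# Damage the common mechanism: both premises of the argument hold.
car = Car()
car.battery_ok = False
print(car.horn_works())   # False
print(car.can_start())    # False

# Damage only the horn: the car still starts, so the conclusion
# "a functioning horn is required for starting" fails.
car = Car()
car.horn_ok = False
print(car.can_start())    # True
```

The same pattern maps directly onto the brain case: frontal damage could impair both emotional capabilities and intelligent behaviour via a shared mechanism, without either capability being required for the other.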
I first criticised Damasio's argument in two papers in 1998 and 1999 and have never seen these criticisms made by other authors. My criticisms were repeated in several subsequent publications. Nobody paid any attention to the criticism and even people who had read those papers continued to refer approvingly to Damasio's argument in their papers. Very intelligent people kept falling for the argument. For example, Susan Blackmore did not notice the fallacy when summarising Damasio's theories in her excellent recent book Consciousness: An Introduction (2003). (She has now informed me that she agrees that the argument used is fallacious.)
The best explanation I can offer for the surprising fact that so many intelligent people are fooled by an obviously invalid argument is sociological: they are part of a culture in which people want the conclusion to be true. There seems to be a widespread (though not universal) feeling, even among many scientists and philosophers, that intelligence, rationality, critical analysis and problem-solving powers are over-valued, and that they have defects that can be overcome by emotional mechanisms. This leads people to like Damasio's conclusion. They want it to be true. And this somehow causes them to accept as valid an argument for that conclusion, even though they would notice the flaw in a structurally similar argument for a different conclusion (such as in the car horn example). This is a general phenomenon. Consider, for instance, how many people on both sides of the evolution/creation debate, or both sides of the debate for and against computational theories of mind, tend to accept bad arguments for their side.
A research community with too much wishful thinking does not advance science. Instead of being wishful thinkers, scientists trying to understand the most complex information-processing system on the planet should learn how to think (at least some of the time) as designers of information-processing systems do.
To be fair, Damasio produced additional theoretical explanations of what is going on, so, in principle, even though the quoted argument is invalid, the conclusion might turn out to be true and explained by his theories. However, his theory of emotions as based on somatic markers (regulatory signals in the brain's representation of the body) is very closely related to the theory of William James, which regards emotions as a form of awareness of bodily changes. This sort of theory cannot account for the huge subset of socially important human emotions that involve rich semantic content not expressible in somatic markers (such as admiring someone's courage while being jealous of his wealth), nor for emotions that endure over a long period while bodily states come and go (such as obsessive ambition, infatuation, or long-term grief at the death of a loved one).
The key assumption, shared by both Damasio and many others whose theories are different in details, is that all choices depend on emotions, and especially choices where there are conflicting motives. If that were true it would support a conclusion that emotions are needed for at least intelligent conflict resolution.
Although I will not argue the point here, I think it is very obvious from the experience of many people (certainly my experience) that one can learn how to make decisions between conflicting motives in a totally calm, unemotional, even cold way, simply on the basis of having preferences or having learnt principles that one assents to. Many practical skills require learning which option is likely to be better. A lot of social learning provides conflict-resolution strategies for more subtle decisions, again without emotions having to be involved. Of course, one could make a terminological decision to label all preferences, policies, and principles emotions. But that would trivialise Damasio's conclusion.
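The point that conflict resolution need not involve emotion can be sketched as a plain preference ranking: selecting among conflicting motives by comparing learned scores, with no affective machinery anywhere in the loop. The motives and weights below are invented for illustration; this is a minimal sketch, not a claim about how brains actually do it.

```python
# Minimal sketch of unemotional conflict resolution: each motive
# carries a learned preference score, and selection is simply a
# comparison -- no "emotion" module is involved anywhere.

def choose(motives):
    """Pick the motive with the highest learned preference score."""
    return max(motives, key=lambda m: m["score"])

conflicting_motives = [
    {"action": "finish report", "score": 0.8},  # learned priority
    {"action": "answer email",  "score": 0.5},
    {"action": "take a break",  "score": 0.3},
]

best = choose(conflicting_motives)
print(best["action"])   # finish report
```

One could of course relabel the scores themselves as "emotions", but, as argued above, that terminological move would trivialise the conclusion rather than support it.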
So, let's start again: what are emotions, and how do they work? There are many ways to study emotions and other aspects of human minds. Reading plays, novels or poems will teach much about how people who have emotions, moods, attitudes, desires and so on think and behave, and how others react to them, because many writers are very shrewd observers. Studying ethology will teach you something about how emotions and other mental phenomena vary among different animals. Studying psychology will add extra detail concerning what can be triggered or measured in laboratories, and what correlates with what. Studying developmental psychology can teach you how the states and processes in infants differ from those in older children and adults. Studying neuroscience will teach you about the physiological brain mechanisms that help to produce and modulate mental states and processes. Studying therapy and counselling can teach you about ways in which things can go wrong and do harm, and about some ways of helping people. Studying philosophy with a good teacher may help you discern muddle and confusion in attempts to say what emotions are and how they differ from other mental states and processes.
There's another way that complements these: do some engineering design. Suppose you had to design animals (including humans) or robots capable of living in various kinds of environments, including environments containing intelligent systems. What sorts of information-processing mechanisms, including control mechanisms, would you need to include in the design, and how could you fit all the various mechanisms together to produce all the required functionality, including: perceiving, learning, acquiring new motives, enjoying some activities and states and disliking others, selecting between conflicting motives, planning, reacting to dangers and opportunities, communicating in various ways, reproducing and so on?
If we combine this design standpoint with the other ways to study mental phenomena, we can learn much about all sorts of mental processes: what they are, how they can vary, what they do, what produces them, whether they are essential or merely by-products of other things, how they can go wrong, and so on. The result could be both deep new insights about what we are, and important practical applications.
The design-based approach is not new: over the last half century, researchers in computational cognitive science and in artificial intelligence have been pursuing it. Because the work was so difficult, and because of pressures of competition for funding and other aspects of academic life (such as lack of time for study outside one's own specialism), as more people became involved, the research community became more fragmented, with each group investigating only a small subset of the larger whole, and talking only to members of that group.
Deep, narrowly focused research on very specific problems is a requirement for progress, but if everybody does only that, the results will be bad. People working on natural language without relating it to studies of perception, thinking, reasoning, and acting may miss out on important aspects of how natural languages work. Likewise, those who study only a small sub-problem in perception may miss ways in which the mechanisms they study need to be modified to fit into a larger system. The study of emotions also needs to be related to the total system.
We may be able to come up with clear, useful design-based concepts for describing what is happening in a certain class of complex information processing systems, if we study the architecture, mechanisms and forms of representations used in that type of system, and work out the states and processes that can be generated when the components interact with each other and the environment.
If the system is one that we had previously encountered and for which we already have a rich and useful pre-scientific vocabulary, then the new design-based concepts will not necessarily replace the old ones but may instead refine and extend them. For example, they might lead us to new sub-divisions and bring out deep similarities between previously apparently different cases.
This happened to our concepts of physical stuff (air, water, iron, copper, salt, carbon and so on) as we learnt more about the underlying architecture of matter and the various ways in which the atoms and sub-atomic particles could combine and interact. So we now define water as H2O and salt as NaCl rather than in terms of how they look, taste or feel, and we know that there are different isotopes of carbon with different numbers of neutrons.
As we increase our understanding of the architecture of mind (what the mechanisms are, how they are combined, how they interact), our concepts of mind (such as emotion, consciousness, learning, seeing) will also be refined and extended. In the meantime, muddle and confusion reign.
Aaron Sloman
In 1994 Antonio Damasio, a well known neuroscientist, published his book Descartes' Error . He argued that emotions are needed for intelligence, and accused Descartes and many others of not grasping that. In 1996 Daniel Goleman published Emotional Intelligence: Why I t Can Matter More than IQ , quoting Damasio with approval, as did Rosalind Picard a year later in her book Affective Computing .
Since then there has been a flood of publications and projects echoing Damasio's claim. Many researchers in artificial intelligence have become convinced that emotions are essential for intelligence, and they are now producing many computer models containing a module called emotion.
Before that, serious researchers had begun to argue that the study of emotions and affect had not been given its rightful place in psychology and cognitive science, but their claims were more moderate. For example, a journal called Cognition and Emotion was started in 1987. Even I had a paper in it in the first year.
Damasio's argument rested heavily on two examples. The first was of Phineas Gage. In 1848, an accidental explosion of a charge he had set blew his tamping iron through his head, destroying the left frontal part of his brain:
He lived, but having previously been a capable and efficient foreman, one with a well-balanced mind, and who was looked on as a shrewd smart business man, he was not fitful, irrelevant, and grossly profane, showing little deference for his fellows. He was also impatient and obstinate, yet capricious and vacillating, unable to settle on any of the plans he devised for future action. His friends said he was no longer Gage. (www.deakin.edu.au/hbs/GAGEPAGE/Pgstory.htm)
The second example was one of Damasio's patients, whom he refers to as Elliot. Following a brain tumour and subsequent operation, Elliott suffered damage in the same general brain area as Gage (left frontal lobe). Like Gage, he experienced a great change in personality. Elliot had been a successful family man, and successful in business. After his operation he became impulsive and lacking in self-discipline. He could not decide between options where making the decision was important but both options were equally good. He persevered on unimportant tasks while failing to recognise priorities. He had lost all his business acumen and ended up impoverished, even losing his wife and family. He could no longer hold a steady job. Yet he did will on standard IQ tests. (See http://serendip.brynmawr.edu/bb/damasio)
Both patients appeared to retain high intelligence as measured by standard tests, but not as measured by their ability to behave sensibly. Both had also lost certain kinds of emotional reactions. What follows from these cases?
In a nutshell, here is the argument Damasio produced which many people in many academic disciplines enthusiastically accepted as valid:
Damage to frontal lobes impairs emotional capabilities.Damage to frontal lobes impairs intelligence.Therefore Emotions are required for intelligence.
The conclusion does not follow from the premises. (Whether the conclusion is true is a separate matter, which I'll come to.) Compare this argument proving that cars need functioning horns in order to start:
Damaging the battery stops the horn working in a car.Damage to the battery prevents the car starting.Therefore a functioning horn is required for the car to start.
A moment's thought should have reminded Damasio's readers that two capabilities could presuppose some common mechanism, so that damaging the mechanism would damage both capabilities, without either capability being required for the other. For instance, even if both premises in the horn argument are true, you can damage the starter motor and leave the horn working, or damage the horn and leave the starter motor working.
I first criticised Damasio's argument in two papers in 1998 and 1999 and have never seen these criticisms made by other authors. My criticisms were repeated in several subsequent publications. Nobody paid any attention to the criticism and even people who had read those papers continued to refer approvingly to Damasio's argument in their papers. Very intelligent people kept falling for the argument. For example, Susan Blackmore did not notice the fallacy when summarising Damasio's theories in her excellent recent book Consciousness: An Introduction (2003). (She has now informed me that she agrees that the argument used is fallacious.)
The best explanation I can offer for the surprising fact that so many intelligent people are fooled by an obviously invalid argument is sociological: they are part of a culture in which people want the conclusion to be true. There seems to be a widespread (though not universal) feeling, even among many scientists and philosophers, that intelligence, rationality, critical analysis and problem-solving powers are over-valued, and that they have defects that can be overcome by emotional mechanisms. This leads people to like Damasio's conclusion. They want it to be true. And this somehow causes them to accept as valid an argument for that conclusion, even though they would notice the flaw in a structurally similar argument for a different conclusion (such as in the car horn example). This is a general phenomenon. Consider, for instance, how many people on both sides of the evolution/creation debate, or both sides of the debate for and against computational theories of mind, tend to accept bad arguments for their side.
A research community with too much wishful thinking does not advance science. Instead of being wishful thinkers, scientists trying to understand the most complex information-processing system on the planet should learn how to think (at least some of the time) as designers of information-processing systems do.
To be fair, Damasio produced additional theoretical explanations of what is going on, so, in principle, even though the quoted argument is invalid, the conclusion might turn out to be true and explained by his theories. However, his theory of emotions as based on somatic markers (regulatory signals in the brain's representation of the body) is very closely related to the theory of William James, which regards emotions as a form of awareness of bodily changes. This sort of theory is incapable of accounting for the huge subset of socially important emotions in humans which involve rich semantic content which would not be expressible within somatic markers (such as admiring someone's courage while being jealous of his wealth) and emotions that endure over a long period of time while bodily states come and go (such as obsessive ambition, infatuation, or long term grief at the death of a loved one).
The key assumption, shared by both Damasio and many others whose theories are different in details, is that all choices depend on emotions, and especially choices where there are conflicting motives. If that were true it would support a conclusion that emotions are needed for at least intelligent conflict resolution.
Although I will not argue the point here, I think it is very obvious from the experience of many people (certainly my experience) that one can learn how to make decisions between conflicting motives in a totally calm, unemotional, even cold way, simply on the basis of having preferences or having learnt principles that one assents to. Many practical skills require learning which option is likely to be better. A lot of social learning provides conflict-resolution strategies for more subtle decisions: again without emotions having to be involved. Of course, one could make a terminological decision to label all preferences, policies, and principles emotions. But that would trivialise Damasio's conclusion.
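As a purely illustrative sketch of the point above (all names and weights here are hypothetical, not drawn from any actual model), a choice between conflicting motives can be made by nothing more than learnt preference weights, with no emotional mechanism involved:

```python
# Illustrative sketch: resolving a conflict of motives using learnt
# preferences alone. The motives and weights are hypothetical examples.

def choose(options, preferences):
    """Pick the option whose learnt preference weight is highest."""
    return max(options, key=lambda option: preferences.get(option, 0.0))

# Two conflicting motives, ranked by previously learnt preferences.
preferences = {"keep promise": 0.9, "avoid inconvenience": 0.4}
decision = choose(["keep promise", "avoid inconvenience"], preferences)
print(decision)  # -> keep promise
```

Whether we then *call* such a stored preference an "emotion" is exactly the terminological decision at issue: the mechanism itself involves nothing more than comparison of stored values.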
So, let's start again: what are emotions, and how do they work? There are many ways to study emotions and other aspects of human minds. Reading plays, novels or poems will teach much about how people who have emotions, moods, attitudes, desires and so on think and behave, and how others react to them, because many writers are very shrewd observers. Studying ethology will teach you something about how emotions and other mental phenomena vary among different animals. Studying psychology will add extra detail concerning what can be triggered or measured in laboratories, and what correlates with what. Studying developmental psychology can teach you how the states and processes in infants differ from those in older children and adults. Studying neuroscience will teach you about the physiological brain mechanisms that help to produce and modulate mental states and processes. Studying therapy and counselling can teach you about ways in which things can go wrong and do harm, and about some ways of helping people. Studying philosophy with a good teacher may help you discern muddle and confusion in attempts to say what emotions are and how they differ from other mental states and processes.
There's another way that complements these: do some engineering design. Suppose you had to design animals (including humans) or robots capable of living in various kinds of environments, including environments containing intelligent systems. What sorts of information-processing mechanisms, including control mechanisms, would you need to include in the design, and how could you fit all the various mechanisms together to produce all the required functionality, including: perceiving, learning, acquiring new motives, enjoying some activities and states and disliking others, selecting between conflicting motives, planning, reacting to dangers and opportunities, communicating in various ways, reproducing and so on?
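To make the design question concrete, here is a deliberately minimal, hypothetical sketch of one way the components listed above might be fitted together. Every component name and priority value is an assumption for illustration; a realistic architecture would be enormously more complex:

```python
# Hypothetical minimal agent architecture: perception, motive
# generation, and conflict resolution fitted together. All component
# names and priority values are illustrative assumptions.

class Agent:
    def __init__(self):
        self.motives = []  # current goals, possibly conflicting

    def perceive(self, environment):
        # Reduce the raw environment to features the agent can use.
        return {"danger": environment.get("danger", False),
                "food": environment.get("food", False)}

    def generate_motives(self, percept):
        # New motives can be triggered by what is perceived.
        if percept["danger"]:
            self.motives.append(("flee", 1.0))
        if percept["food"]:
            self.motives.append(("eat", 0.5))

    def resolve_conflicts(self):
        # Select among conflicting motives by stored priority.
        if not self.motives:
            return "explore"
        return max(self.motives, key=lambda m: m[1])[0]

    def act(self, environment):
        percept = self.perceive(environment)
        self.generate_motives(percept)
        return self.resolve_conflicts()

agent = Agent()
print(agent.act({"danger": True, "food": True}))  # -> flee
```

Even a toy design like this forces questions the armchair theorist can avoid: where do new motives come from, how are conflicts detected, and what information does conflict resolution actually need?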
If we combine this design standpoint with the other ways to study mental phenomena, we can learn much about all sorts of mental processes: what they are, how they can vary, what they do, what produces them, whether they are essential or merely by-products of other things, how they can go wrong, and so on. The result could be both deep new insights about what we are, and important practical applications.
The design-based approach is not new: over the last half century, researchers in computational cognitive science and in artificial intelligence have been pursuing it. Because the work was so difficult, and because of pressures of competition for funding and other aspects of academic life (such as lack of time for study outside one's own specialism), as more people became involved, the research community became more fragmented, with each group investigating only a small subset of the larger whole, and talking only to members of that group.
Deep, narrowly focused research on very specific problems is a requirement for progress, but if everybody does only that, the results will be bad. People working on natural language without relating it to studies of perception, thinking, reasoning, and acting may miss out on important aspects of how natural languages work. Likewise those who study only a small sub-problem in perception may miss out on ways in which the mechanisms they study need to be modified to fit into a larger system. The study of emotions also needs to be related to the total system.
We may be able to come up with clear, useful design-based concepts for describing what is happening in a certain class of complex information processing systems, if we study the architecture, mechanisms and forms of representations used in that type of system, and work out the states and processes that can be generated when the components interact with each other and the environment.
If the system is one that we had previously encountered and for which we already have a rich and useful pre-scientific vocabulary, then the new design-based concepts will not necessarily replace the old ones but may instead refine and extend them. For example, they might lead us to new sub-divisions and bring out deep similarities between previously apparently different cases.
This happened to our concepts of physical stuff (air, water, iron, copper, salt, carbon and so on) as we learnt more about the underlying architecture of matter and the various ways in which the atoms and sub-atomic particles could combine and interact. So we now define water as H₂O and salt as NaCl rather than in terms of how they look, taste or feel, and we know that there are different isotopes of carbon with different numbers of neutrons.
As we increase our understanding of the architecture of mind (what the mechanisms are, how they are combined, how they interact), our concepts of mind (such as emotion, consciousness, learning, seeing) will also be refined and extended. In the meantime, muddle and confusion reign.