Tuesday, April 05, 2005

IP vs. "real" embodiment

In "The Embodiment of Meaning," N. Katherine Hayles, with whom I started my journey into embodiment, considers the differences between Herb Simon's symbolic information-processing (IP) account of embodiment and her own. She writes:
I applaud Herbert Simon's effort to put cognitive science in conversation with literary criticism. It is an effort I have made myself from time to time. And I would like to agree that his definition of meaning makes sense in a literary context. But I get nervous with the implication that, given this definition of meaning, computers can be said not only to generate meanings, but also to understand them. Simon's announced aim is to make available to literary critics a precise definition of meaning understood in operational terms. An unannounced aim, but one I think we are entitled to infer, is to define meaning in such a way as to advance his program of simulating human intelligence with computers. He writes, "Meanings flow from the intensions of people (or perhaps people and computers, a controversial issue)." If computers can generate meaning, then it follows (his parenthesis suggests) that computers possess intension. When a computer is programmed to achieve a goal, does it have intension toward that goal? Suppose we are willing to grant that proposition. Immediately another issue arises, for intension is only part of meaning; the other (and perhaps larger) part of meaning flows from understanding. Chance events may create juxtapositions that have meaning for observers, even though no intension was involved in producing the meaning. But meaning without understanding on someone's part is not meaning, for only when understanding occurs is it possible to say, "Oh, I see what it means."
She continues by tying meaning to emotion (and the endocrine system). In her view, since computers don't have endocrine systems, they can't have emotions, and therefore can't have meaning or embodied knowledge. I won't go that far. Computer input systems and networks, I'd suggest, can form a rudimentary endocrine-like system that might be argued to parallel ours, so one could argue that computers have some sort of embodied knowledge. What I will suggest is that, if they do, it's markedly different from ours.

Once "meaning" is introduced, the discussion becomes far more philosophical, and a set of questions demands answers: What does it mean to have meaning? What does it mean to "know"? What does it mean to be human? Then sentience must be tackled, and foundational beliefs about ontology and epistemology come into play. It's here that I stop arguing and simply state my own take.

I am a constructivist, but I allow for the agency of individuals as they negotiate both the recognized and the unrecognized influences of their environments (social and physical), within the constraints of their own genetic makeup. It's a merging of social, radical, and critical constructivism -- I want to have my cake and eat it too.
