Monday, October 06, 2008

minsky 4: consciousness

Overview of Chapter 4 of The Emotion Machine (summary, online draft, buy)

Consciousness is a suitcase word. It can mean such diverse things as a unifier, self-awareness, identity, an animator of the mind, a provider of meaning, or a detector of feelings. It refers to many different mental activities that don't have a single cause or origin.

We need a way to divide the mind into parts that is more meaningful than the crude "dumbbell" distinctions of folk psychology (two-part splits such as conscious vs. unconscious, premeditated vs. impulsive, etc.). For example, "unconscious" may cover many different states: information may be inaccessible for a variety of reasons, such as a simple failure to retrieve it, active censorship, or "sublimation" into a form that can't be recognised (to borrow Freud's terminology).

Minsky uses Plato's allegory of the cave (told through Socrates), with its shadows on the wall, to discuss a possible structure of our minds.

Imagine we have an A-Brain and a B-Brain. The A-Brain receives signals from the external world via organs such as eyes, ears, nose and skin, and can react to those signals by making our muscles move. The A-Brain has no sense of what the events mean.

The B-Brain receives and reacts to signals from the A-Brain. However, the B-Brain has no direct connection to the outer world, so it is like the prisoners in Plato's cave, who see only shadows on the wall. The B-Brain mistakes A's descriptions for real things.

For example, if B sees that A has got stuck repeating itself, it might suffice for B to instruct A to change its strategy. To acquire such skills the B-Brain may need a C-Brain to help (e.g. it's not always appropriate to stop repeating oneself, such as when repeatedly checking both ways while crossing a road).

Student: Would not this raise increasingly difficult questions, because each higher level would need to be smarter and wiser?
Minsky: No, C-Brain could act as a "manager" who has no special expertise about particular jobs but could still give "general" guidance, like: "If B's descriptions seem too vague, C tells it to use more specific details", etc.
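The layering described above can be sketched in code. This is a minimal toy illustration, not Minsky's own formulation; all class and method names here are invented for the sketch. The point it shows is structural: B sees only A's record of actions (never the world), and C needs no task expertise, only a generic rule about when B's advice should be overridden.

```python
# Toy sketch of the A/B/C-Brain layering (all names invented here).
# Each level sees only the level below it, never the world directly.

class ABrain:
    """Reacts to raw signals from the world; has no sense of what they mean."""
    def __init__(self):
        self.history = []               # recent actions, visible to the B-Brain

    def react(self, signal):
        action = f"reflex-to-{signal}"  # a crude stimulus-response mapping
        self.history.append(action)
        return action

class BBrain:
    """Watches only the A-Brain's record -- like Plato's prisoner,
    it mistakes A's descriptions for the real things."""
    def advise(self, a_brain):
        # If A keeps repeating itself, tell it to change strategy.
        recent = a_brain.history[-3:]
        if len(recent) == 3 and len(set(recent)) == 1:
            return "change strategy"
        return "carry on"

class CBrain:
    """A 'manager' with no task expertise, only general guidance."""
    def advise(self, b_advice, context):
        # Generic rule: don't interrupt repetition when it is vital
        # (e.g. repeatedly checking both ways while crossing a road).
        if b_advice == "change strategy" and context == "crossing road":
            return "ignore B; repetition is appropriate here"
        return b_advice

a, b, c = ABrain(), BBrain(), CBrain()
for signal in ["car", "car", "car"]:
    a.react(signal)
print(c.advise(b.advise(a), context="crossing road"))
```

Note how C's rule mentions nothing about cars or roads beyond matching the context label: it is "general" guidance in exactly the sense of Minsky's reply to the student.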

Minsky proposes six levels of mental processes: instinctive reactions, learned reactions, deliberative thinking, reflective thinking, self-reflective thinking, and self-conscious emotions.

The organism principle

Does your theory really need so many different levels? Are you sure that you can't make do with fewer of them? Indeed, why should we need any "levels" at all - instead of a single, big, cross connected network of resources?
The Organism Principle: When a system evolves to become more complex, this always involves a compromise. If its parts become too separate, then the system's abilities will be limited. But if there are too many interconnections, then each change in one part will disrupt many others.

Hence, our bodies are composed of distinctive separate parts we call "organs". This also applies to the brain organ. New design is built on top of old design: "... large parts of our brains work mainly to correct mistakes that other parts make ..."

Psychology is hard because each "law of thought" has exceptions. It will never be like physics, which has unified theories that work flawlessly.

Minsky's proposed solution to consciousness being a suitcase word: We must try to design - as opposed to define - machines that can do what human minds do. DESIGN not DEFINE

Consciousness seems mysterious because we exaggerate our perceptiveness. Most processes are hidden from us. We see things less as they are and more with a view to how they are used (e.g. a hammer, a ball). Our minds did not evolve to serve as instruments for observing themselves.

There are many suitcase words in psychology: attention, emotion, perception, consciousness, thinking, feeling, self, intelligence, pleasure, pain, happiness

Why do people, including scientists, look for a single concept, process or thing to explain multiple aspects of mind? They prefer one large problem rather than dozens or hundreds of smaller problems

Aaron Sloman:
"People are too impatient. They want a three-line definition of consciousness and a five-line proof that a computational system can or cannot have consciousness. And they want it today. They don't want to do the hard work of unraveling complex and muddled concepts that we already have, and exploring new variants that could emerge from precisely specified architectures for behaving systems"
How do we initiate what we call consciousness?

Most mental processes don't cause us to think or reflect about why or how they work. But when those low-level processes don't function well, or when they encounter obstacles, higher-level activities start up with these properties: self-models, serial processes, symbolic descriptions and recent memories. A trouble-detecting critic (T) watches the low levels and switches on this reflective machinery when something goes wrong.

If you reverse the trouble detector, then you have a consciousness detector, i.e. a part of our brain that sends signals to other parts, including our language system, which then invents words to describe this condition, such as: conscious, attentive, aware, alert, me, myself, deliberate, intentional, free will.
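The critic-and-reversal idea can be made concrete with a small sketch. Everything here is invented for illustration (the task set, the function names, the stand-in "reflection" dictionary); it only shows the shape of the argument: routine tasks pass silently, obstacles trigger the critic T, and the reversed detector labels exactly those moments with consciousness words.

```python
# Toy sketch of a trouble-detecting critic (T), and the same detector
# "reversed" to flag the condition we label "conscious" (names invented).

def low_level_process(task):
    # Routine processing: succeeds on familiar tasks, fails on obstacles.
    return task in {"walk", "grasp", "chew"}

def trouble_critic(task):
    """T fires when the low-level processes don't function well."""
    return not low_level_process(task)

def reflect(task):
    # The higher-level activities that start up when T fires:
    # self-models, serial processes, symbolic descriptions, recent memories.
    return {
        "self_model": f"model of me attempting '{task}'",
        "serial_process": [f"plan a step toward '{task}'"],
        "symbolic_description": f"'{task}' is blocked",
        "recent_memories": [task],
    }

def consciousness_detector(task):
    """The reversed critic: it signals when reflection has switched on,
    the condition the language system names with words like 'aware'."""
    if trouble_critic(task):
        reflect(task)
        return "conscious, attentive, aware"
    return "on autopilot"

print(consciousness_detector("walk"))        # routine -> "on autopilot"
print(consciousness_detector("untie knot"))  # obstacle -> reflection starts
```

Note that the detector adds no new machinery of its own: it just reports whether T fired, which matches the text's claim that "consciousness" names the switching-on of reflection rather than a separate thing.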

Our higher-level descriptions are mainly stable; they were formed previously. Hence it's an illusion to think we live in the present moment!

The Immanence Illusion: For most of the questions you would otherwise ask, some answers will have already arrived before the higher levels of your mind have had enough time to ask for them. Our Critics may recognise a problem and start retrieving the knowledge we need before our other processes have had time to ask questions about it.

Some philosophers regard explaining "subjective experience" as the hardest problem in psychology: the quality of deep blue, the sensation of middle C. Minsky argues that terms like experience or inner life refer to big suitcases of different phenomena. Our "insights" from inside our mind are frequently wrong. If consciousness means "awareness of our internal processes", then it doesn't live up to its reputation.

Minsky uses the word model in this book to mean "a mental representation that can be used to answer some questions about some other, more complex thing or idea". We have multiple models: professional, political, beliefs about abilities, ideas about social roles, moral and ethical views. Our thinking depends on (a) the quality of our models and (b) how well we choose which model to use in different situations.
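The two factors named above, model quality and model choice, can be separated cleanly in a sketch. This is a hypothetical illustration (the model names, the answers, and the situation-to-model mapping are all invented here): the `models` table stands for factor (a), and the chooser stands for factor (b).

```python
# Toy sketch: multiple mental models plus a chooser (names invented here).
# (a) the models themselves -- each answers questions from one perspective
models = {
    "professional": lambda q: "an answer drawn from work experience",
    "political":    lambda q: "an answer drawn from political beliefs",
    "moral":        lambda q: "an answer drawn from ethical views",
}

# (b) the chooser -- maps situations to the model most likely to fit
situation_to_model = {
    "job interview": "professional",
    "election":      "political",
    "dilemma":       "moral",
}

def answer(question, situation):
    # Thinking goes wrong in two distinct ways: a poor model in the
    # table above, or a poor choice of which model to consult here.
    name = situation_to_model.get(situation, "professional")  # crude default
    return name, models[name](question)

print(answer("what should I do?", "dilemma"))
```

Keeping (a) and (b) as separate structures mirrors the point in the text: even perfect models produce bad thinking if the chooser routinely picks the wrong one for the situation.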

"Free will" might mean "I have no model that explains how I made the choice I made"

The Cartesian Theater is the idea that our minds contain a central stage on which various actors perform while we (the self) watch and then make decisions. This popular idea is analysed and debunked. The spatial metaphor is deeply held and hard to abandon.

The idea that we live in the here and now, moving steadily into the future, is an illusion! "Real time" is a process of zigzagging through memories as we assess our progress on goals, hopes, plans and regrets!

Dennett and Kinsbourne (1992):
"... there is no single, definitive 'stream of consciousness', only a parallel stream of conflicting and continuously revised contents"
There are problems with thinking too much about how we think; being too self-aware would be very tedious! Minsky employs an amusing and enlightening dialogue with HAL to illustrate:
... interpreting those records is so tedious ... I often hear people say things like, "I'm trying to get in touch with myself." ... take my word for it, they would not like the result of accomplishing this


Anonymous said...

Well, it's fascinating but I don't imagine there's any way to be certain whether we've achieved consciousness in any system even if it exhibits behavior akin to our own intelligence. Fun stuff though. :-)

Anonymous said...

Whoops. I went back and read the post again. I see that my own first comment assumed that the post is about the development of artificial intelligence. I see that it really isn't. I guess what that shows more than anything is where my mind is at. ;-)

Bill Kerr said...

hi carl,

Well, the goal of building an AI that exhibits what we call consciousness is a Minsky goal, so that is relevant to the discussion.

If "inner life" were a single compact thing then it might warrant a compact explanation. Minsky is arguing that "inner life" is a big suitcase of different phenomena, each of which requires a different explanation. We can't explain them all yet, but breaking the problem down is the way to go.

He also makes the point that although each person can inspect their own mind from inside, those insights are frequently wrong, so they may seem special, but they are not.

So, we have this "thing" we call consciousness, but really it's a bunch of many different processes, not something belonging to a "Single-Self".

I think this puts the zombie argument (since zombies lacking qualia and sentience are logically conceivable, qualia and sentience are not fully explained by physical properties alone; from David Chalmers) in its place as not all that significant.