Chip with simple program for Toy

On Mar 23, "Bob Myers" <nospample...@address.invalid> wrote:
Exactly my point. We can't define consciousness
in an absolute way.

A mental state which includes self-awareness.
(mental states = brain states, I believe we concur)

Can you cleanly distinguish "consciousness" from "self-
awareness," though? If not, then the definition would collapse
down to "Consciousness is a mental state which includes
consciousness," and that surely isn't very helpful. What do
we mean by "consciousness" if not "self-awareness"?
Good point, you caught me in a bit of sloppy thinking.

All right then, we define self-awareness as that
set of brain states (determined by synaptic
activity), which accords with our commonly held
notions, derived from observed behavior. Then
we define 'consciousness' in some weaker form.
The distinction is not a big deal, technically speaking.

But still, it's a matter of semantics, not physics.
i.e. we can still define everything in terms of
neurons and synapses, without any of Kevin's circularities.

My argument shows that it is quite impossible to define
consciousness without referring to consciousness in the definition. It's
inherently a circular process.

False. Refer to neural activity - which is what
thinking/feeling/memory IS.

Yes, but the term "consciousness" seems to me to point to
the experiential aspect of that neural activity; I know that
I, myself, am conscious solely because I experience that state
in myself. And clearly not all neural activity equates to
consciousness -
Yes yes, but that's all fuzz. Just accept that your
"experiential aspect" has no independent existence,
it's an illusion.

We observe people (observe yourself, if you wish),
and correlate brain activity to 'experience'. Build a
database, and we're done.
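
Schematically, something like this (a toy sketch in Python; the
feature vectors and experience labels are invented, just to show the
shape of the thing):

  from collections import defaultdict

  # Brain-activity "features" mapped to reported experiences.
  db = defaultdict(list)

  def record(features, reported_experience):
      db[tuple(features)].append(reported_experience)

  def lookup(features):
      return db.get(tuple(features), ["no correlate on file"])

  record((0.8, 0.1), "awake, attending")   # invented feature values
  record((0.1, 0.9), "deep sleep")
  print(lookup((0.8, 0.1)))                # ['awake, attending']

'Experience' is then just whatever label the stored correlations
give back for a measured brain state.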

Of course there will be some degree of
arbitrariness, that's how it is with definitions.
It's like asking: how can we be sure an 'elephant'
is REALLY an elephant? The question is
moot. What happened was, Adam went around
the Garden of Eden with a notebook, and said
"that criitter looks like an elephant", and it stuck.

We can measure electrical signals of the brain and correlate
them with actions etc.

Right.
So what's the problem?

Can we correlate any such measurements with "consciousness"?
Getting back to an earlier question - if there's no problem
here, then what you're saying is that we have (through these
measurements) an objective test for consciousness in other
entities. Is this in fact your claim?
No, in the case of ET, that wouldn't apply. We
would just apply the Turing test - if it walks
like a duck and quacks like a duck...

Not satisfying, eh? But if mental states = brain
states, consciousness is an illusion
anyhow. So it doesn't make sense to say
"let's look for it in Klingons" ('it' doesn't exist);
you would need a behavioral test.

But we can empirically test for self-awareness
(which we might define as 'soul'). We know that chimps
are self-aware.

More precisely, we know that chimps behave in a
manner which appears consistent with our own experiences
of self-awareness, and on this we base an assumption that
they are, in fact, self-aware. However, since self-awareness
is itself a personal experience, we cannot directly observe it
or sense it for ourselves re the chimp (or any other entity) in
question.
Are you familiar with the mirror test for
self-awareness? We can define self-aware as
anything which passes that test. Then that is
'conscious', by definition.

Obviously, I choose to behave in the vast majority of cases
as if others are conscious, but I do acknowledge that this is
simply an assumption on my part.
It all depends on the definition... people get
hung up on semantics too often, as this thread demonstrates...

--
Rich
 
On Mar 25, 4:58 pm, "RichD" <r_delaney2...@yahoo.com> wrote:
On Mar 25, "The_Man" <me_so_hornee...@yahoo.com> wrote:

There's another consideration... in quantum
mechanics, the observer occupies a special
place - the wave function cannot collapse
(to a particular event) without one. But
'observer' is vague... presumably, it means
consciousness.

Presumably, it doesn't. The wave function collapses during
measurement, as best it is understood, because it
interacts with a classical system. "Consciousness"
doesn't have anything to do with it at all.
The measuring device is an inherently classical system (i.e., a
quantum state with a very high quantum number, so that it approaches
the predictions of classical mechanics - the so-called
"correspondence principle").

No, that view - "the classical measuring device
interacts with the quantum system" - is too
simplistic; it's introductory level, it doesn't capture
the essential weirdness.
So apparently Bohr's view was "simplistic".

There are experiments where no such classical
such as?

interaction exists, yet we still see the dualistic
quantum behavior; the only 'interaction' is
observation, which apparently occurs only in
the perception/consciousness of the (human) observer.
Please describe how "measurements" take place merely by "perception",
and without interaction with some apparatus.

Hence, consciousness has
some attribute which interacts with nature.

--
Rich
 
The_Man wrote:
On Mar 25, 4:58 pm, "RichD" <r_delaney2...@yahoo.com> wrote:

There are experiments where no such classical

such as?
Here's a pretty good description of the quantum bomb detector.

http://hunch.se/stuff/the%20quantum%20world.pdf


Bob
--

"Things should be described as simply as possible, but no simpler."

A. Einstein
 
"Daryl McCullough" <stevendaryl3016@yahoo.com> wrote in message
news:eu64ks0vqc@drn.newsguy.com...
I'm talking about the meaning of the words "Joe appears
to be conscious, but he doesn't really have any experiences."

What do those words mean? You may say that you have, through
introspection, direct knowledge of your own consciousness.
Fine. But what does it mean for *Joe* to have, or not have
that? He can't have *your* consciousness. The best he can
have is something *analogous* to your consciousness. But
you haven't specified which analogies count and which don't.
If you've been following this thread since it started, you're
aware that this is precisely what I've been saying. I only KNOW
of my own consciousness, through direct experience. I assume
that others are conscious, because they (a) claim to be, (b) are
constructed in much the same way I am, and (c) behave in the
way I would expect them to, based on what I know of myself.
But I cannot directly experience their consciousness (at least as
far as we know, Mr. Spock's "mind-melds" on "Star Trek"
notwithstanding!), so I acknowledge that this IS an assumption on
my part.

Yes, I agree. But in my opinion, the correct response is to
reformulate morality and ethics so that they don't rely on
the existence of some intrinsic, though undetectable and
indefinable, "essence". It's much better, in my opinion, to
formulate the question of how we morally treat systems that
exhibit "as if" consciousness.
That could very well be, and I think I would tend to go along
those lines as well - that "personhood," in any legal, ethical, or
moral sense, should be granted to any entity sufficiently advanced
to claim it for themselves. But I seriously doubt that such a position
is going to be widely accepted without objections from several
sides. In particular, a lot of religious thought is very tied to the
notion of human beings being something "special" in the cosmos,
and they're not likely to welcome other claimants to that position.
So it's good to get our arguments thought out now rather than
later, no?

I think that there *can't* be any evidence in favor or opposed,
because the concept of "real consciousness" doesn't have a determinate
meaning. You're trying to define a concept by generalization from
one example. There are infinitely many ways to generalize from one
example, and there are *no* criteria for preferring one way over
another way.
Agreed - but now we have to take a step back and confront
a slightly different question: if we can't come up with evidence
either way re "real consciousness," then on what
principle are we going to hang our claims re "personhood" for
these other entities?

Bob M.
 
"Daniel Mandic" <daniel_mandic@aon.at> wrote in message
news:46069f4b$0$25628$91cee783@newsreader02.highway.telekom.at...
Bob Myers wrote:

"conscious, complex system" to examine - the human brain (or mind, if
you're a dualist). As we've already covered, once an artificial


What has this to do with a dualist?

Daniel, in the context of philosophy, at least as it is
typically discussed in English, "dualism" is generally used to
refer to that school of thought which holds that "mind" and
"brain" are completely separate things - or, if you like, "soul"
and "body." More generally, it is the position that there is
something to the "self" that is beyond "mere" physical
structures and neurochemistry.

See:
http://en.wikipedia.org/wiki/Dualism_(philosophy_of_mind)

Bob M.
 
"RichD" <r_delaney2001@yahoo.com> wrote in message
news:1174865326.733685.32870@y80g2000hsf.googlegroups.com...
More precisely, we know that chimps behave in a
manner which appears consistent with our own experiences
of self-awareness, and on this we base an assumption that
they are, in fact, self-aware. However, since self-awareness
is itself a personal experience, we cannot directly observe it
or sense it for ourselves re the chimp (or any other entity) in
question.

Are you familiar with the mirror test for
self-awareness? We can define self-aware as
anything which passes that test. Then that is
'conscious', by definition.
Sure - but as you've already noted, we can define ourselves
out of just about any tough question. Using the mirror test
as completely sufficient for determining "self-awareness" is
not satisfying, simply because we DO have direct experience
of this thing called "consciousness" and from that have the
feeling that there IS something more to it than merely the
proper set of externally-observable behaviors. We may, in
feeling that, be deluding ourselves - but admitting that doesn't
get rid of the feeling that we haven't really come up with anything
when playing with definitions.

Bob M.
 
Bob Myers wrote:
"Daryl McCullough" <stevendaryl3016@yahoo.com> wrote in message
news:eu64ks0vqc@drn.newsguy.com...
I'm talking about the meaning of the words "Joe appears
to be conscious, but he doesn't really have any experiences."


That could very well be, and I think I would tend to go along
those lines as well - that "personhood," in any legal, ethical, or
moral sense, should be granted to any entity sufficiently advanced
to claim it for themselves. But I seriously doubt that such a position
is going to be widely accepted without objections from several
sides. In particular, a lot of religious thought is very tied to the
notion of human beings being something "special" in the cosmos,
and they're not likely to welcome other claimants to that position.
So it's good to get our arguments thought out now rather than
later, no?
They have dolls that will say "I'm hungry" and a few other phrases
when a cord is pulled. They could just as well program a phrase,
"I am alive." and make it battery operated.

So some robot which has been programmed to assert its personhood
from time to time is not enough to call an entity, because
"entity" is a word for a sentient creature. If it is not a
sentient creature, then it is not alive, and the words "I am alive"
don't make it a person. I think you are looking for evidence
that the entity has volition, which would be shown if the robot
asserted "I am alive" without being programmed to say that.
You didn't specify how it can be determined whether one is looking
at an entity rather than a programmable mechanical tool.

What the entity says will not be nearly so important as that you
can definitely say it is an entity. This uses a definition of
entity which would not include a sophisticated answering machine.
 
Bob Myers wrote:

Daniel, in the context of philosophy, at least as it is
typically discussed in English, "dualism" is generally used to
refer to that school of thought which holds that "mind" and
"brain" are completely separate things - or, if you like, "soul"
and "body." More generally, it is the position that there is
something to the "self" that is beyond "mere" physical
structures and neurochemistry.

See:
http://en.wikipedia.org/wiki/Dualism_(philosophy_of_mind)

Bob M.

Hi Bob!

O.k. I understand. But there is already a dualism in the brain: the
left and right brain halves. (The connection between them has a wider
diameter in women, by the way.)

I was just reading the brain theory of the gentleman from
NewsReader.Com :)


I think our brain is better at grasping its surroundings than at
explaining its own function. And for what... (to go mad, probably)



As far as I have learned, using mind and brain cannot be dualism. The
mind shifts from heart to brain, and the result is at least three-,
if not four-way (when breaking it down into pursuable logical steps).



Best regards,

Daniel Mandic
 
On Mar 24, 1:56 pm, stevendaryl3...@yahoo.com (Daryl McCullough)
wrote:
Kevin Aylward says...

We say that feelings aren't physical. Memories aren't physical.
Ideas aren't physical. Qualia aren't physical. Tangible objects are
physical, intangible objects are not physical. But that's all pure
bullshit created by people who didn't even know we had a physical
brain.

Oh dear. It's clear that you just haven't thought this thing through at all.
A machine doesn't have "experience".

On what basis do you say that? I can understand how one could use
introspection as evidence that he has experience, what counts as
evidence that something *doesn't* have experience?
What is the evidence that God doesn't exist?

In general, we rely upon positive evidence that something exists
before asserting it. It seems to me the appropriate question is:
What counts as evidence for experience (qualia)? Why is it that we
assume other humans and many animals have experience but plants do
not? (OK, some think some plants have experiences as well. I
haven't seen many complain about mowing the grass just because of
the pain it inflicts upon the grass.)
 
forbisgaryg@msn.com says...
stevendaryl3...@yahoo.com (Daryl McCullough) wrote:
Kevin Aylward says...

Oh dear. It's clear that you just haven't thought this thing through at all.
A machine doesn't have "experience".

On what basis do you say that? I can understand how one could use
introspection as evidence that he has experience, what counts as
evidence that something *doesn't* have experience?

What is the evidence that God doesn't exist?
I don't know what would count as evidence for or against
the existence of God. It depends on how you define "God".

In general, we rely upon positive evidence that something exists
before asserting it. It seems to me the appropriate question is:
What counts as evidence for experience (qualia)?
It depends on how you define qualia. I don't think it actually
has a very clear definition.

Why is it that we assume other humans and many animals have
experience but plants do not?
The way I feel about it is that it's simply a matter of the
sophistication of the response to changing conditions. Plants
respond in extremely simple ways to changing conditions: they
bend towards the sunlight, they put out roots to seek out water.
Animals have a much more sophisticated repertoire of responses
to the environment. I think that that's all that "having experience"
means.

--
Daryl McCullough
Ithaca, NY
 
On 2007-03-25, John Fields <jfields@austininstruments.com> wrote:

Other than physically moving the laser back and forth (toward and
away from the receiver) at an audio rate, just how would you propose
going about frequency modulating the laser?
Attach a mirror to a loudspeaker cone; it has less mass, and you get twice as
much modulation (but this is really phase modulation).

Hmmm... it's possible that a mirror attached to a piezo behind
a half silvered mirror could do a pretty good job of amplitude
modulating a laser.

Move the back mirror half a wavelength and the
interference goes from constructive to destructive.

Bye.
Jasen
 
On 26 Mar 2007 10:19:36 GMT, jasen <jasen@free.net.nz> wrote:

On 2007-03-25, John Fields <jfields@austininstruments.com> wrote:

Other than physically moving the laser back and forth (toward and
away from the receiver) at an audio rate, just how would you propose
going about frequency modulating the laser?

Attach a mirror to a loudspeaker cone; it has less mass, and you get twice as
much modulation (but this is really phase modulation).
---
No, it's FM. What you get reflected is light of a slightly
different wavelength because of Doppler shift, so it's a change in
frequency/color.

Also, if the rate of change and the amplitude of the mechanical
motion is the same in both systems the carrier deviation will be
identical.
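
To put rough numbers on it (my back-of-envelope figures): the Doppler
shift is df = 2v/lambda, with v the mirror velocity and the factor of
2 from the round trip. A peak mirror velocity of 1 mm/s at 633 nm
gives a peak deviation of about 3.2 kHz, so the deviation scales with
both the amplitude and the rate of the cone motion.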


--
JF
 
RichD wrote:
On Mar 23, "feedbackdroid" <feedbackdr...@yahoo.com> wrote:
As you said yourself, consciousness seems to
arise from a sufficiently complex neural network.
But there are degrees... a snail has a neural net,
with brain activity, should we call it conscious?
That doesn't appear too useful.
Maybe more useful than you surmise. The alternative
is that consciousness isn't simply a yes-no issue,
but has graded degrees, and so a snail does have
a certain degree of consciousness, albeit much
much less than chimps and humans.

Perhaps... but then you need a way to define, and
measure, this 'degree of consciousness'. Which
could be done, in terms of synaptic activity.

My point is, to define consciousness in a manner
to include snails, seems counter-productive; it
doesn't capture what we experience as humans.

--
Rich
Just what do we experience as humans that is different from what snails
experience? How do you know?
 
Bob Myers wrote:
"RichD" <r_delaney2001@yahoo.com> wrote in message
[...]
But we can empirically test for self-awareness
(which we might define as 'soul'). We know that chimps
are self-aware.

More precisely, we know that chimps behave in a
manner which appears consistent with our own experiences
of self-awareness, and on this we base an assumption that
they are, in fact, self-aware. However, since self-awareness
is itself a personal experience, we cannot directly observe it
or sense it for ourselves re the chimp (or any other entity) in
question. This gets us back to the "zombie" issue that Kevin
and I were discussing earlier (and which has certainly been
gone over time and time again by philosophers, for years!) -
can you distinguish between an entity which perfectly
simulates the OUTWARD behavior of a conscious entity
and yet is not truly conscious (it does not have the necessary
experience within itself), and one who IS truly conscious?
We may choose to believe that such "zombie" entities are
not possible, but I for one don't see a particularly strong reason
to believe this. I simply cannot know with any degree of
certainty whether or not any other entity is conscious.
Obviously, I choose to behave in the vast majority of cases
as if others are conscious, but I do acknowledge that this is
simply an assumption on my part.

Bob M.
But all animals show survival instinct. Chimps show a certain KIND of
self-awareness - they can groom themselves while looking in a
mirror - which is more "sophisticated" than most animals. But every mammal
shows some awareness of itself as being different from the pack of other
animals of like or different kinds that surround it (not meant to evoke
Creationist worries, sorry). And where does "Self" awareness come into
play here? Are humans the only creatures on the planet able to
experience Self? Most humans don't have that in a consistent way.
 
RichD wrote:
On Mar 23, "Bob Myers" <nospample...@address.invalid> wrote:
Exactly my point. We can't define consciousness
in an absolute way.
A mental state which includes self-awareness.
(mental states = brain states, I believe we concur)
Can you cleanly distinguish "consciousness" from "self-
awareness," though? If not, then the definition would collapse
down to "Consciousness is a mental state which includes
consciousness," and that surely isn't very helpful. What do
we mean by "consciousness" if not "self-awareness"?

Good point, you caught me in a bit of sloppy thinking.

All right then, we define self-awareness as that
set of brain states (determined by synaptic
activity), which accords with our commonly held
notions, derived from observed behavior. Then
we define 'consciousness' in some weaker form.
The distinction is not a big deal, technically speaking.

But still, it's a matter of semantics, not physics.
i.e. we can still define everything in terms of
neurons and synapses, without any of Kevin's circularities.
And yet, even synapses are not necessary. Individual cells show the
ability to maintain their own identity in ways that are far more complex
than what we assign to human behavior.

My argument shows that it is quite impossible to define
consciousness without referring to consciousness in the definition. It's
inherently a circular process.
False. Refer to neural activity - which is what
thinking/feeling/memory IS.
Yes, but the term "consciousness" seems to me to point to
the experiential aspect of that neural activity; I know that
I, myself, am conscious solely because I experience that state
in myself. And clearly not all neural activity equates to
consciousness -

Yes yes, but that's all fuzz. Just accept that your
"experiential aspect" has no independent existence,
it's an illusion.
But of an interesting kind. All language is based on it. Some speculate
that human-style awareness is based on the ability to have SOME kind of
language.

We observe people (observe yourself, if you wish),
and correlate brain activity to 'experience'. Build a
database, and we're done.

Of course there will be some degree of
arbitrariness, that's how it is with definitions.
It's like asking: how can we be sure an 'elephant'
is REALLY an elephant? The question is
moot. What happened was, Adam went around
the Garden of Eden with a notebook, and said
"that criitter looks like an elephant", and it stuck.

We can measure electrical signals of the brain and correlate
them with actions etc.
Right.
So what's the problem?
Can we correlate any such measurements with "consciousness"?
Getting back to an earlier question - if there's no problem
here, then what you're saying is that we have (through these
measurements) an objective test for consciousness in other
entities. Is this in fact your claim?

No, in the case of ET, that wouldn't apply. We
would just apply the Turing test - if it walks
like a duck and quacks like a duck...

Not satisfying, eh? But if mental states = brain
states, consciousness is an illusion
anyhow. So it doesn't make sense to say
"let's look for it in Klingons" ('it' doesn't exist);
you would need a behavioral test.
Why do you say consciousness is an illusion if it is based on brain-states?

But we can empirically test for self-awareness
(which we might define as 'soul'). We know that chimps
are self-aware.
More precisely, we know that chimps behave in a
manner which appears consistent with our own experiences
of self-awareness, and on this we base an assumption that
they are, in fact, self-aware. However, since self-awareness
is itself a personal experience, we cannot directly observe it
or sense it for ourselves re the chimp (or any other entity) in
question.

Are you familiar with the mirror test for
self-awareness? We can define self-aware as
anything which passes that test. Then that is
'conscious', by definition.

Obviously, I choose to behave in the vast majority of cases
as if others are conscious, but I do acknowledge that this is
simply an assumption on my part.

It all depends on the definition... people get
hung up on semantics too often, as this thread demonstrates...
And your assumptions are far more subtle than you realize.
 
Kevin Aylward wrote:
RichD wrote:
On Mar 22, "Kevin Aylward" <kevin_aylw...@ntlworld.com> wrote:
Please, explain "explain" without referring to itself. Like,
explain means to give an understanding... like, you know... when you
are aware of something that clicks in your brain... well, what do
you mean by "aware"... etc. It's all circular. Turtles all the way
down...
No.

Yes.

Formulate a Turing machine which models the data,
a la Kolmogorov complexity. This constitutes 'explanation'.
I believe this is, in fact, what the brain does (the details are
a bit mysterious).
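
Crudely, and only as a sketch, a compressor can stand in for the
(uncomputable) Kolmogorov measure:

  import os
  import zlib

  # Description length via compression - the usual practical proxy
  # for Kolmogorov complexity, which is itself uncomputable.
  def description_length(data: bytes) -> int:
      return len(zlib.compress(data, 9))

  regular = bytes(range(256)) * 40    # highly regular data
  noise = os.urandom(len(regular))    # incompressible data

  # The regular stream admits a short 'explanation'; the noise does not.
  print(description_length(regular), description_length(noise))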

Oh dear.

And exactly what has this got to do with qualia - that is, experience,
feelings, etc.?

Difficult, but no logic or semantic problems involving self-reference.

Agreed, again IF this all must ultimately refer back to this
notion of "consciousness" which we have yet to define.
Exactly my point. We can't define consciousness in an absolute way.
A mental state which includes self-awareness.
(mental states = brain states, I believe we concur)

Nope. You miss the whole argument.

Now try and define "aware" independently of consciousness.

Have you actually read my paper?

http://www.kevinaylward.co.uk/replicators/thehardproblem.html

MY point was that it is "consciousness" itself which we have
yet to adequately define or describe such that we can truly
speak of it in any meaningful manner. In any and all cases I've
been able to think of, at least, "consciousness" winds up
equating to "that which *I* experience in myself."
My argument shows that it is quite impossible to define
consciousness without referring to consciousness in the definition.
It's inherently a circular process.
False. Refer to neural activity - which is what
thinking/feeling/memory IS.

Oh... now explain how that physical measurement of electrical signals
transpires to us as qualia - that is, into something we *experience*.

Explain the words "experience", "perceive", etc. - all aspects of being aware
and conscious. It is indeed turtles all the way down. None of these words can
be defined without referring to consciousness, as they are all properties that
only a consciousness provides.

We can
measure electrical signals of the brain and correlate them with
actions etc.
Right.
So what's the problem?

What's the problem, you say. Oh dear...

err... like, is an electron aware of its own existence? Your comment here tells
me you haven't read pretty much anything on the problem of "conscious
experience".
Hindus would say yes.

Hagelin says that when the superstring field starts interacting with
itself, it shows properties of consciousness. From that perspective, it
IS consciousness ALL the way down...
 
Daryl McCullough wrote:
Bob Myers says...

...I would again point out
that we have no way of directly demonstrating "experience"
or "feelings" in ANYONE but ourselves - therefore, I am not
certain that they exist in anyone else. The rest of you may be
all zombies for all I know (which would raise troubling questions
about where you all came from, and why I'M here, but at
least it IS a possibility!).

I don't see how such a possibility has any meaning.
What would it *mean* for someone else to lack "feelings"
or "experience"?
We can identify people with extreme forms of this: they're called
autistic or sociopathic and so on. They lack the physical structures
associated with feelings and certain kinds of experience.

Obviously, when people talk about zombies, they are
distinguishing between "as if" mental properties and
"real" mental properties. We can certainly all agree
that other humans behave as if they had sensation,
emotions, awareness, etc. But supposedly that isn't
enough to show that they have "real" mental experience.
But what does that mean? What does "real" mean in this
case? Presumably, it means "Like mine". But what notion
of "likeness" is appropriate here? Of course, no two
brains are alike, so no other brain is like mine, and
no other mind works precisely like mine. But what
range of differences is allowable for mentality to
be considered "real"?

The further question is, why should anyone *care*
about the difference between "real" and "as if"
mentality?

Let me try an analogy. Suppose we're talking about
socks. Some philosopher has a theory that there are true
socks and there are pseudo-socks. This philosopher
doesn't yet have any physical test to distinguish
true socks from pseudo-socks, and he *also* doesn't
have any explanation for why anyone would care whether
they are wearing true socks or pseudo-socks. But he
insists that there is a property of "intrinsic sockness"
that is not reducible to the physical facts. Why would
such a theory of socks make any sense? Why is the
possibility of zombies any different from the possibility
of pseudo-socks?
By "zombie," you mean someone lacking "free will?"

But that's another definition issue.
 
"RichD" <r_delaney2001@yahoo.com> wrote:

Can you cleanly distinguish "consciousness" from "self-
awareness," though? If not, then the definition would collapse
down to "Consciousness is a mental state which includes
consciousness," and that surely isn't very helpful. What do
we mean by "consciousness" if not "self-awareness"?
Well, it seems to be a common theme to tie these together, and even to try
and define consciousness as something that happens when one becomes
self-aware, but I see no value in doing that.

If a device is not aware of itself, does that mean it's not conscious?

If the brain is a device which responds to sensory signals and generates
internal signals and ultimately external signals, then self-awareness
happens in such a system when the effects of the generated signals end up
feeding back to various sensory inputs. In other words, the brain
generates a signal to move the hand, the hand moves, and we see the hand
move, and we feel the contact when the hand touches something, etc.

The brain is a learning controller which shapes its behavior through
experience interacting with an environment. Learning what to expect from
the environment is how the brain becomes "aware" of the environment. It
just so happens that the brain and the body it controls are part of the
environment it is becoming aware of. As a result, as it becomes aware of
the environment, it's also becoming aware of itself.

But what would happen if such a learning machine was exposed to an
environment that didn't include itself? What happens if the learning
machine could sense some environment, but not affect anything that happened
in that environment? With this configuration, it would still have the
power to learn about what to expect from the environment, but it would have
no ability to sense itself, because none of its actions were a part of the
environment.

The machine would still be functioning normally, but now we would be forced
to say the machine had no self-awareness. Is the machine, which is still
functioning exactly the same as it did before, now to be called unconscious
simply because it has no self-awareness?
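
To make the setup concrete (a toy sketch of mine, not a brain model;
the class and numbers are invented):

  import random

  # The same learning machine, with and without its own actions
  # appearing in what it senses.
  class Learner:
      def __init__(self, senses_own_actions):
          self.senses_own_actions = senses_own_actions
          self.model = {}                  # observation -> count

      def step(self, world_state):
          action = random.choice((0, 1))
          if self.senses_own_actions:
              obs = (world_state, action)  # its acts are part of its world
          else:
              obs = (world_state,)         # it is absent from its world
          self.model[obs] = self.model.get(obs, 0) + 1

  embodied = Learner(True)     # can come to model itself
  spectator = Learner(False)   # functions identically, but no "self"
  for t in range(300):
      embodied.step(t % 3)
      spectator.step(t % 3)
  print(len(embodied.model), len(spectator.model))   # 6 vs. 3

Both machines run the same learning rule; only the one whose actions
show up in its observations ever gets a "self" into its model.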

I see no value in defining consciousness that way. But I see little value
in the word consciousness no matter how you look at it. In the end, the
word implies that humans are somehow special, but there's no real indication
that we are. There's currently no indication that what the brain is doing
is any more special than what any of our computers are already doing. The
only thing stopping people from believing that is the simple and obvious
fact that we don't yet have a robot that acts like a human. Other than
this important roadblock, all the data we have suggests strongly that
humans are just biological robots with a biological computer brain.

And this learning controller we have called a brain shapes our behavior
through simple reinforcement. All of our awareness is nothing more than a
large and complex set of learned reactions to our environment. Our
self-awareness is formed simply because we happen to be part of the
environment which we are learning to react to.
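
As a minimal sketch of what "shaped by simple reinforcement" means
mechanically (a toy of mine, not a claim about neural implementation):

  import random

  # Two states, two actions; reward only for action 1 in state 0.
  Q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}
  alpha, gamma, eps = 0.1, 0.9, 0.1   # learning rate, discount, exploration

  def step(state, action):
      reward = 1.0 if (state, action) == (0, 1) else 0.0
      return reward, (state + 1) % 2  # deterministic next state

  state = 0
  for _ in range(5000):
      if random.random() < eps:       # occasionally explore
          action = random.choice((0, 1))
      else:                           # otherwise use the learned reaction
          action = max((0, 1), key=lambda a: Q[(state, a)])
      reward, nxt = step(state, action)
      best_next = max(Q[(nxt, 0)], Q[(nxt, 1)])
      Q[(state, action)] += alpha * (reward + gamma * best_next
                                     - Q[(state, action)])
      state = nxt

  print(Q)   # the learned "reaction" favors action 1 in state 0

Nothing in the loop is anything but reward-driven bookkeeping, yet a
stable set of learned reactions comes out of it.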

But, if this is truly what we are (and I sure believe it), then we can see
that we already have robots that have learning controllers that are
self-aware. The only difference between them and us is that the brain has a
more advanced learning system which can hold a far larger set of learned
behaviors. But this makes human consciousness only a matter of degree over
the current robot and computer consciousness, and not a matter of missing
features. It makes human consciousness nothing special or magical - our
computers already have it as well.

Yes, but the term "consciousness" seems to me to point to
the experiential aspect of that neural activity; I know that
I, myself, am conscious solely because I experience that state
in myself.
Yes, it's the awareness of our internal neural activity - our own thoughts
and memories and other internal mental activity that makes all this so
magical to us. But we are equally self-aware of our external actions. We
are aware that we have reached out to grab something. We are aware that it
is "our" hand and "our" arm that is moving. This is also self awareness.

We are aware that the internal mental activity is "special" because of the
fact that this sensory data has no correlation with our other external
sensory signals. When we have a thought, we can't hear the brain running
with our ears. We can't feel our head vibrate if we hold our head with our
hands.

How do we know, when we see a dog bark and hear a dog bark, that those
two sensory signals are actually from a single source (the dog)? It's
because those sensory signals happen at the same time. They have a
temporal correlation and act as temporal predictors of each other. When we
hear a dog bark, we expect to be able to look around and see a dog barking
in sync with the sound. It's these temporal correlations in the sensory
signals that allow us to link them together as one object. This is how
sensory binding works. It's why visual signals and audio signals get bound
together as being part of one physical world.
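
A hedged sketch of that binding rule - two channels whose activity is
correlated in time get treated as one source (all signals synthetic,
numbers invented):

  import numpy as np

  # Two channels that fire in sync get bound as one source; an
  # uncorrelated channel does not.
  rng = np.random.default_rng(0)
  barks = (rng.random(2000) < 0.05).astype(float)   # the "dog"
  sight = barks + 0.1 * rng.standard_normal(2000)   # visual channel
  sound = barks + 0.1 * rng.standard_normal(2000)   # auditory channel
  other = rng.standard_normal(2000)                 # unrelated channel

  def bound_together(a, b, threshold=0.5):
      return np.corrcoef(a, b)[0, 1] > threshold

  print(bound_together(sight, sound))   # True  -> one object
  print(bound_together(sight, other))   # False -> separate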

But when we sense mental activity, what visual signal can the brain bind it
with? None. There is no correlation between our visual signals and our
mental thoughts - the brain doesn't give off light when it's working.
It doesn't make the head move when it's working. There is nothing we can
see with our eyes to let us know a person is having mental activity (not
until you use special tools to monitor brain activity). As such, the brain
doesn't bind our sensory perception of mental activity to our visual
sensory data, or to our auditory sensory data, or to any of our other
external sensory data. As a result, mental activity gets bound directly to
nothing. It seems to exist in its own domain, separate from everything we
think of as physical, because the brain doesn't normally get exposed to any
temporal correlations in the signals.

So, what does this lead to? It leads to our awareness of our own mental
activity. And it leads to the basic belief in humans that there is no
connection between our sensory awareness of mental activity and our
sensory awareness of everything physical. It creates this world view of a
split between physical and mental. And it's all because of the simple fact
that the brain does data binding by temporal correlations, and there
are no temporal correlations between what we call internal mental activity
and our other physical sensory signals.

And when people talk about being conscious, what are they making a
reference to most of the time? It's the existence of internal mental
activity which has no correlation to physical sensory data. People believe
their power to generate these internal mental signals is what makes them
"conscious".

But, this is all hogwash and illusion. Because in fact, all our physical
awareness is the same internal mental activity. When I look at my computer
and know that it's a computer I'm seeing in the sensory signals, this is
because the brain has correctly decoded the visual data as "computer
monitor". In other words, our ability to be aware of the external
environment is the same type of hardware at work that makes us aware of our
private internal environment. Our awareness of both worlds is all just
neural activity.

So, what do you want to define consciousness as? We can say it's the
result of a learning machine which has the power to bind sensory data like
the brain does. Or we could go further and say that the system must have
the power to generate internal signals separate from the sensory-driven
signals, and then also note there is no correlation between the two. But
in the end, it's all just fairly simple data processing going on here in
a reaction machine.

And clearly not all neural activity equates to
consciousness -
Not clear at all. All the evidence points to the idea that what we are
calling consciousness is nothing more than neural activity - not even
special complex neural activity - just a simple reaction machine.

It's nothing more special than a robot with a computer brain trying to talk
about its ability to control its own arms and legs and trying to say,
"clearly, my self awareness is more than the activity of transistors".
Clearly he would be wrong.

What we know is that humans believe something special is happening in
them. But as I explained above, we understand why people believe this. We
know why mental activity seems to be disconnected from the physical world.
It's simply because there are no temporal correlations in the sensory data
between the brain's mental activity and our normal external sensory signals.
This is the source of all the confusion over mental activity being
non-physical - it's the source of the inherent dualistic view humans believe
in. And it's the source of the confusion over consciousness - they believe
they are conscious because this dualistic sense of self exists and because
these internal mental signals exist.

But these internal mental signals are not special; they are the same types
of signals that the brain uses to represent all our awareness - our
awareness of the physical world as well as the awareness of the internal
mental world. And they are nothing more than neural activity. Stop the
neural activity, and we stop being aware - it's no different than turning a
video camera off. Turn it off, and it stops being aware. It stops being
able to "see".

There's just nothing special and magical happening in the human brain.
It's just a straightforward signal processing machine that learns by
experience.

--
Curt Welch http://CurtWelch.Com/
curt@kcwc.com http://NewsReader.Com/
 
"Stephen Harris" <cyberguard-1048@yahoo.com> wrote in message
news:ZMINh.3539$rO7.1094@newssvr25.news.prodigy.net...
They have dolls that will say "I'm hungry" and a few other phrases
when a cord is pulled. They could just as well program a phrase,
"I am alive." and make it battery operated.
Implicit throughout this discussion has been the notion that
"asserting personhood" requires at least Turing-level
conversational capability.

Bob M.
 
<forbisgaryg@msn.com> wrote in message
news:1174910947.139641.285440@y66g2000hsf.googlegroups.com...

What is the evidence that God doesn't exist?
A bit off-subject, but I would like to strongly recommend the following
books as extremely interesting reads, no matter which side of the
"God" question you're on:

"The God Delusion" by Richard Dawkins,

and

"God: The Failed Hypothesis" by Victor Stenger.

Keep an open mind, and you can get a LOT out of these two.
Up to and including, possibly, a whole new perspective on both
the truth and desirability of the sort of thinking we call "religious."

Bob M.
 
