Chip with simple program for Toy

On Fri, 30 Mar 2007 07:19:30 -0500, Mitchell Jones
<mjones@21cenlogic.com> Gave us:

When many plausible actions may be taken, there will be an extrapolated
sensory sequence, of which the individual is aware, associated with each
one.

Kai! Last of the Brunnen-G

HEHEhHEHhehehe!
 
On 3/30/07 5:42 AM, in article C23253B7.6CA52%salmonegg@sbcglobal.net,
"Salmon Egg" <salmonegg@sbcglobal.net> wrote:

About 13 years ago, I developed 12 proofs that Muhammad was a false prophet.
I noted brief summaries in the margin of a book I was reading.
Unfortunately, what was preeminently clear at the time has become very
obtuse now. Can anyone help me reconstruct the proofs?

I was just being a wise-ass. I am really amazed at how seriously people have
taken my post. Actually the proof showed that Muhammad was not any prophet
at all.

-- Fermez le Bush--about two years to go.
 
Bill Hobba wrote:
"RichD" <r_delaney2001@yahoo.com> wrote in message


Kevin, you have made claims about inertial frames of
reference, acceleration, and the equivalence principle,
containing 'fundamental, unresolved' logical problems.

GR contains no fundamental unresolved problems.
Err, you're kidding, right?

I don't have time to go into it right now, but trust me, there are quite a
few unresolved issues.

Sure, I am not an expert, http://www.kevinaylward.co.uk/gr/index.html, but I
have formally studied it enough to know some of the issues.


--
Kevin Aylward
ka@anasoftEXTRACT.co.uk
www.anasoft.co.uk
SuperSpice
 
rscan@nycap.rr.com wrote:
On Mar 30, 6:06 pm, "Glen M. Sizemore" <gmsizemo...@yahoo.com> wrote:

No, you are mistaken. This is not the hard problem.

I thought that the hard problem was, "Can a computer, robot, whatever,
be conscious, have feelings?"
That is not the hard problem.

The question as to why a human is conscious, has feelings, is entirely
different. In the one case, we question whether it is possible; in the
other case, we know it is possible and only ask how.
http://en.wikipedia.org/wiki/Hard_problem_of_consciousness, to wit,

Various formulations of the "hard problem":

- "Why should physical processing give rise to a rich inner life at all?"
- "How is it that some organisms are subjects of experience?"
- "Why does awareness of sensory information exist at all?"
- "Why do qualia exist?"
- "Why is there a subjective component to experience?"
- "Why aren't we philosophical zombies?"

In other words, how is it that a bunch of inanimate electrons and protons,
when arranged in a complex way, exhibit "feelings"?

or google on "The hard problem"

Am I missing something?

I don't know.

--
Kevin Aylward
ka@anasoftEXTRACT.co.uk
www.anasoft.co.uk
SuperSpice
 
Glen M. Sizemore wrote:
"Kevin Aylward" <kevin_aylward@ntlworld.com> wrote in message
news:pEdPh.21840$NK3.15810@newsfe6-win.ntli.net...
Glen M. Sizemore wrote:
snip

In the consciousness field, this is called the "Easy Problem".

Yes, well, the "Easy Problem" is hard enough. There is no
reason to go around making up new problems for oneself.

Actually, it is one of the other so-called "easy problems" that is
at the heart of the matter. This is the problem of how we come to
talk about subjective phenomena. This sets the stage for answering
questions concerning the physiological mediation of the behavioral
phenomena just described. Unfortunately, a physiological explanation
of this entails pretty much a physiological explanation of the
entirety of behavior. I agree, though, that the "hard-problem" is a
myth.

From your comments below, it seems that you misunderstand what the hard
problem actually is.

I don't think so. I can tell that you're pretty much an idiot, though.
Ahmm, very effective debating technique, personal insults?

What we feel when we introspect is our own behavior, and that
includes the behavior called "seeing," "hearing," "tasting" etc. It
simply makes no sense to ask something like: "Why is 'seeing green'
the way it is?" What other way could it be?

Yes, such questions are of little value, but such questions have
absolutely nothing to do with the hard problem.

On the contrary, they are closely related to Chalmers's discussion.


The Hard Problem is how come a bunch of inanimate electrons, protons
and neutrons club together and give us that feeling we get when we
get kicked in the balls. Electrons, protons and neutrons don't have
feelings, so why do we?

No, you are mistaken. This is not the hard problem.
Yep it is. Debate finished with you. See above as to why.

http://en.wikipedia.org/wiki/Hard_problem_of_consciousness, to wit,

Various formulations of the "hard problem":

- "Why should physical processing give rise to a rich inner life at all?"
- "How is it that some organisms are subjects of experience?"
- "Why does awareness of sensory information exist at all?"
- "Why do qualia exist?"
- "Why is there a subjective component to experience?"
- "Why aren't we philosophical zombies?"

In other words, how is it that a bunch of inanimate electrons and protons,
when arranged in a complex way, exhibit "feelings"?

or google on "The hard problem"

--
Kevin Aylward
ka@anasoftEXTRACT.co.uk
www.anasoft.co.uk
SuperSpice
 
"Kevin Aylward" <kevin_aylward@ntlworld.com> wrote in message
news:z2oPh.10910$F82.7650@newsfe4-win.ntli.net...
Glen M. Sizemore wrote:
"Kevin Aylward" <kevin_aylward@ntlworld.com> wrote in message
news:pEdPh.21840$NK3.15810@newsfe6-win.ntli.net...
Glen M. Sizemore wrote:
snip

In the consciousness field, this is called the "Easy Problem".

Yes, well, the "Easy Problem" is hard enough. There is no
reason to go around making up new problems for oneself.

Actually, it is one of the other so-called "easy problems" that is
at the heart of the matter. This is the problem of how we come to
talk about subjective phenomena. This sets the stage for answering
questions concerning the physiological mediation of the behavioral
phenomena just described. Unfortunately, a physiological explanation
of this entails pretty much a physiological explanation of the
entirety of behavior. I agree, though, that the "hard-problem" is a
myth.

From your comments below, it seems that you misunderstand what the hard
problem actually is.

I don't think so. I can tell that you're pretty much an idiot, though.

Ahmm, very effective debating technique, personal insults?
See below.

What we feel when we introspect is our own behavior, and that
includes the behavior called "seeing," "hearing," "tasting" etc. It
simply makes no sense to ask something like: "Why is 'seeing green'
the way it is?" What other way could it be?

Yes, such questions are of little value, but such questions have
absolutely nothing to do with the hard problem.

On the contrary, they are closely related to Chalmers's discussion.


The Hard Problem is how come a bunch of inanimate electrons, protons
and neutrons club together and give us that feeling we get when we
get kicked in the balls. Electrons, protons and neutrons don't have
feelings, so why do we?

No, you are mistaken. This is not the hard problem.

Yep it is. Debate finished with you. See above as to why.
Civil debate ended when you said: "From your comments below, it seems that
you misunderstand what the hard problem actually is."

http://en.wikipedia.org/wiki/Hard_problem_of_consciousness, to wit,

Various formulations of the "hard problem":

- "Why should physical processing give rise to a rich inner life at all?"
- "How is it that some organisms are subjects of experience?"
- "Why does awareness of sensory information exist at all?"
- "Why do qualia exist?"
- "Why is there a subjective component to experience?"
- "Why aren't we philosophical zombies?"

In other words, how is it that a bunch of inanimate electrons and protons,
when arranged in a complex way, exhibit "feelings"?

or google on "The hard problem"
I have read Chalmers's paper, dipshit. The problem is that you are
conflating reductionism with physicalism. One can be a physicalist but not
a reductionist.

--
Kevin Aylward
ka@anasoftEXTRACT.co.uk
www.anasoft.co.uk
SuperSpice
 
Daryl McCullough wrote:
Kevin Aylward says...

Daryl McCullough wrote:

Not all big name philosophers agree that it's real. For example,
Dennett http://www.imprint.co.uk/online/HP_dennett.html and

I have actually read this before. I was astounded. Astounded that
some respected name could be so de-facto wrong.

I don't think he's wrong.

The hard problem is that we *feel* pain. End of story.

That's only a hard problem because you refuse to say what
"feel" and "pain" mean.
I have already explained that pain cannot be explained, i.e. reduced to
anything less than pain. Pain is what you feel when you get kicked in the
balls.

The issue here is that you still cannot handle the concept of irreducible
axioms. There is no simpler explanation than the fact that a conscious
entity has pain. It can only be explained with reference to the assumption
that someone else feels pain in a similar way to oneself. The
irreducibility of pain does not mean that it doesn't exist, as you are
actually claiming.

Electrons don't feel pain. We do. Why? That's the hard problem.

And a feeling is...? A feeling is a "loose" causal relation
between stimuli and behavior. I say "loose" because, as I
explain below, rather than stimulus directly causing behavior,
the feelings are intermediaries, informational states containing
something of the stimulus that caused them, and something of the
behavior that it predisposes the "feeler" to perform.

And a "... feelings are intermediaries, informational states
containing..." means exactly what?

I'm just repeating the description of the two-step process:
Environmental conditions affect brain states.
Brain states affect behavior. Mental terms such
as "pain", "pleasure", "qualia", etc. are just
labels for particular brain state patterns.
There is no "just". These label don't tell us that pain hurts. Its new
informaton.

And yes, "pain hurts" is circular. That is why the problem exists.

If you could completely
disconnect pain from behavior, it wouldn't be pain anymore.

Suppose that some sophisticated brain surgery rewired your
brain so that the sensation of tasting sweetness and
the sensation of pain were switched. So tasting sweetness
causes pain, and stubbing your toe causes you to taste
sweetness. But the rewiring also made the corresponding
change in behavior, so that you tend to flinch when you
taste sweetness, but you tend to seek out pain (especially
painful ice-cream).

I would say that such a rewiring is *meaningless*. There
is no meaning to "the feeling of tasting sweetness" other
than the typical causes of that mental state and the typical
behaviors that result. There *is* no such thing as a feeling
divorced from stimulus and behaviour.

I disagree: the internal mental state (the electro-chemical signals)
is all there is that defines pain and sweetness. The external
behaviour is simply irrelevant.

Well, I think you're completely wrong about that.
Pardon?

If you have
a test tube full of neurons, you could induce them to fire, but
that firing doesn't indicate pain, or pleasure, or tasting sweetness,
or anything in particular until those neurons are connected up with
a more complete system that can act on those electrochemical signals.
This is now getting quite daft. You obviously understand little that I have
said.

The de-facto assumption is that I am discussing the complete complex system
of neurons wired up as a brain as a given. That is what the internal mental
state is. Maybe I should have said (the electro-chemical signals etc...). I
have already explained that consciousness is an emergent phenomenon that
arises when extremely complex systems interact.

We can be trained to respond behaviourally opposite to what is normally
expected, but this does not change the internal feeling.

I think that's completely wrong.
I know that you're wrong. It is trivially obvious that it is the internal
brain state that governs feelings. This internal brain state can be
measured externally in principle. That is, the distribution of chemicals
and electrical signals. The measurement of someone moving their arms
(external behaviour) is obviously only a second-level result of the
internal physical state. Feelings reside in the brain, so that is what is
important. External behaviour is essentially autonomous, not under direct
control of the brain for the most part. For example, I play guitar and
practise scale runs. Playing them fast relies on the conscious part of the
brain relinquishing thinking about where my fingers go; that has to be
delegated to slave systems that just carry out orders. Hence, the main
part of the brain system isn't actually aware of external body motion.
Maybe I should have said "minimal change of the internal feeling".

The point about *scientific* explanation is that it
has the power to unify diverse observations and to
predict future observations. Newton's law of gravity
unified the motion of the planets and that of a
dropped apple. It allowed people to calculate
and predict trajectories of cannon balls and rocket
ships. In contrast, a theory of "feelings" makes
no testable predictions. If you're wrong, or you're
right, it doesn't make *any* difference.

This is missing the point. The fact that we do not
have enough theory yet to make great predictions from
feelings is irrelevant.

You don't have observations that require explanation, either.
Tautologies (such as "Pain hurts") are *not* observations.
Yes they are.

Nearly everything about
an evolved creature---the hardness of its teeth, the composition of
its blood, the strength of its muscles, the acuity of its
vision---can be said to serve reproduction...

Indeed, *all* of the physical construction of an animal is
"designed" to serve replication. This is 101 evolution, mate.

Isn't that what I just said? If everything is in the service of
reproduction, then it's redundant to include that in the definition
of "emotion".
No, it is not redundant. It is used to explain why emotions are selfish.
To wit, feeling good about helping someone else is because of
self-interest. Rocks are not designed to serve our evolution, so the
definition rules them out also.

I think you need to do a bit of reading on Darwinian evolution.

http://www.kevinaylward.co.uk/replicators/replicatortheory.html

Are you suggesting that your own paper is the best source
to learn about Darwinian evolution?
Depends on what particular aspect of Darwinian evolution you want to know
about.

After a man has a vasectomy,
he continues to have emotions, but they no longer have anything
to do with reproduction. The love of one's children is no less
strong when the children are adopted.

Not actually so

Yes, actually so. I'm speaking from experience here.
If you are telling me that you are bringing up someone else's child and
*all* the feelings are the same as if it were your own, you are lying to me
or lying to yourself. There is no reasonable way that bringing up someone
else's child is the same. Yes, there is much political correctness to try
and put forward this view, but, quite frankly, it's bullshit.

but you completely miss the point. Traits are designed
(selected, varied, copied) by evolution to maximise their numbers.

Of course they are. That doesn't mean that emotions are
about maximizing offspring, any more than it means that
teeth are about maximizing offspring.
Oh dear, you don't understand evolution, or the mechanism of evolution,
at all.

The fundamental point of emotions is to maximise replication of genes.
Period. Evolution demands this. They do this by *statistical* methods:
they work by trying to maximise the owner's "interests". Maximising all of
one's interests is more statistically likely to maximise gene numbers.
It's irrelevant whether in every case an emotion directly increases
numbers. All that matters is that, on average, maximising one's interests,
rather than being neutral or minimising them, results in a statistical
advantage in replication numbers. That is, there are three options:
minimise, neutral, or maximise. Evolution will ensure that, even with only
a 0.1% advantage, the maximising variant is all you will see 10,000
generations later. It's a tautology: "that which is mostly observed is
that which replicates the most". Genes don't know what will maximise their
numbers, so they can take actions that don't maximise. That doesn't matter
if the average is still positive.

You need to understand the logic and the math.
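
As an aside, here is a minimal Python sketch of the compounding argument
(my illustration only, not anything from Kevin's paper; the function name
and the starting frequency are made up for demonstration): two replicator
variants, one with a 0.1% per-generation replication advantage, with the
frequencies renormalised each generation.

# Minimal sketch (illustration only): how a 0.1% per-generation
# replication advantage compounds over many generations.
def advantaged_frequency(generations, advantage=0.001, p0=0.5):
    """Relative frequency of the advantaged variant after n generations."""
    p = p0
    for _ in range(generations):
        favoured = p * (1 + advantage)     # advantaged variant replicates slightly more
        other = 1 - p                      # neutral variant just holds steady
        p = favoured / (favoured + other)  # renormalise back to frequencies
    return p

for g in (100, 1000, 10000):
    print(g, round(advantaged_frequency(g), 5))
# Prints roughly 0.525, 0.731, 0.99995: after 10,000 generations the
# advantaged variant is essentially all that remains, which is the
# "0.1% advantage is all you will see" claim above.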

Emotions are purely due to the brain. The brain is constructed from DNA,
which is a replicating replicator. The mathematics of replication, random
variation and selection of replicators *mandates* that such a replicator
*must* evolve to be selfish, *and* act in the direction that tries to
maximise the numbers of such replicators. No ifs or buts about it.

I explain the math as to why this must be in that paper
http://www.kevinaylward.co.uk/replicators/replicatortheory.html.

There is no choice if you accept evolution as true. The brain satisfies the
axioms, so it must satisfy the results. Emotion variations that were neutral
or negative to replication have to be weeded out by the selection process.
Life IS a maximisation process.

It's clear that your knowledge of evolution is limited; otherwise you
couldn't possibly query what the root purpose of emotions is. You need to
go through the axioms, and the math that derives the maximisation result.

Let me give an analogy. The military is presumably all about
winning wars. But part of winning wars involves bureaucracy,
and bureaucracy nowadays involves sending email. Does that mean that
email is about winning wars?
See above. To wit, the millions and millions of generations of selection,
replication and random variation undergone by the brain hardware mandate
that the brain is completely and optimally designed to maximise its own
genes. End of story. I am not going to debate this point any more. Again,
it's 101 human Darwinian evolution.

The fact that emotions contribute towards reproductive success
does *not* imply that "reproductive success" is part of the
definition of what it means to be an emotion.
I don't care about what is or is not implied, or how someone else might
believe what an emotion is. *I* am constructing a theory. I am *defining*
emotions. I can define anything I like.

To quote "nothing makes sense in biology without evolution".

As explained, emotions are so fundamental to maximising gene numbers that
a definition should reflect that fact. One simply cannot understand human
behaviour without being explicitly aware that all behaviour is designed for
the benefit of maximising gene numbers.

This is mathematical provable (see above paper). They do this on a
statistical basis.. All statements in evolution are statistical,
and based on "natural" selection. What's a pencil designed for?
Suppose I use it to stab someone in the heart, is it no longer
designed to be a pencil?

I don't know the history of pencils, but suppose that pencils
were first designed for use by merchants. Does that mean that
a pencil is about helping merchants? No, it's more general than
that.
That is why my definition of an emotion is general. It does not exclude
applications of emotions that are, in the short term, net detrimental to gene
numbers.

The other thing I don't like is the qualifier "conscious". What
extra information is that conveying?

Oh dear... it's specifically there to rule out hardness of teeth as an
emotion. Duh...

Then that's a very bad definition, if the only way it can
distinguish teeth from emotions is by invoking an undefined
notion of "consciousness".
Nonsense. Again, you don't understand an axiomatic approach to theories. I
have declared consciousness as a new axiom. That is, we take it as given
that "consciousness exists". All explanations resolve down to that axiom.

This is no different from any physics theory. For example, we take
"conservation of momentum" as an axiom. How do we verify this axiom? Well,
we measure velocities and masses and see if the axiom holds. But how do we
define the masses? Well, we have to look at the force needed to move them,
i.e. d(p)/dt; to wit, we measure mass by knowing momentum. So it's all
circular. We can't define entities in physics without being
self-referential.
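
To spell out the circularity being pointed at, here is the standard
Newtonian chain written explicitly (my addition, not something from the
post):

$$ p = m v, \qquad F \equiv \frac{dp}{dt} = m\,\frac{dv}{dt} = m a
\;\Longrightarrow\; m = \frac{F}{a} = \frac{1}{a}\,\frac{dp}{dt} $$

That is, mass is measured operationally through the rate of change of
momentum (for constant m), while momentum is itself defined in terms of
mass.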

So, the fact that we can only define consciousness by a circular
argument, e.g. consciousness is that which allows us to feel pain, while
pain is an attribute of consciousness, is not a serious flaw. All theories
have that problem.

I define emotions with reference to consciousness which is itself
defined by axiom.

Yes, I know. It's a very poor definition.
See above. The definition actually works. It allows me to "prove" that
emotions are selfish from the axioms of evolution.

I think that emotions are, like other feelings such as pain,
causal information representations that serve as an intermediary
between environmental stimulus and behavioral response.

Yes, they do this, but their function is strictly selfish, for the
purpose of maximising the interests of the emotion holder, as I
highlight in the paper.

To wit:

http://www.kevinaylward.co.uk/replicators/index.html

"Adaptability to all environments requires adaptable physical
action. Fully adaptable physical action requires programmable
emotions as emotions instigate physical action. Emotions (gene
traits) are programmed by morals (meme traits). Morals are
programmed by the environment. Emotions require consciousness."

The last sentence is not contributing anything to the explanation.
See above. Yep, it's circular, as is all of physics. You just don't know
enough physics to know what the problems are in physics.

I have been all thorough your arguments, extensively. None of them
explain conscious experience.

I haven't heard a good definition of what it would *mean* to explain
conscious experience.

consciousness = F(electrons, protons, quarks) ?

It would mean just the same as explaining conservation of momentum.

No, it would not. Conservation of momentum is an actual
observation. "Consciousness" is not.
Not at all. Conservation of momentum is actually a definition. You need
to study physics in more detail to see why this is so, as noted above. For
example, the speed of light is a constant by definition.

What evidence is there that qualia exist, above and beyond
the "causal informational intermediate representations" that
I talk about above?

Because *you* *feel* that kick in the balls.

What is the evidence that "feeling" is anything more than
causal relations between environment and behavior?
It hurts. How many times must this be said?

No, I'm not agreeing that consciousness has no observable
consequences. I think that consciousness is a sophisticated
type of behaviour.

So, then you now agree that consciousness can be measured with a
different value from non-consciousness; hence you must now agree
that Gödel is valid in application to consciousness, since
consciousness, in principle, is testable due to these differences.

Not at all. If I claim that perfectly round three-dimensional objects
are always spheres, I haven't made a testable claim. I'm just explaining how
I use the word "sphere" if it's an unfamiliar word.

When I say that conscious entities behave in certain ways when
placed in such-and-such circumstances, I'm explaining what I
mean by the phrase "conscious entity".

Apparently you now admit that you do. You state that you find zombies
objectionable, so you believe that conscious and non-conscious entities
behave differently.

I think that "conscious entity" simply *means* an entity capable
of certain sophisticated behaviors.

So, how do you distinguish the behaviour of
conscious entities and non-conscious entities?

Basically, a conscious entity is one exhibiting goal-directed
behavior that is affected by changes of the environment.
Oh? Why should consciousness demand goal-directed behaviour? This is just
another arbitrary assumption.

Obviously,
there is more to it than that, to put into words exactly how I
sort things into conscious and unconscious entities, but one thing's
for sure: I *never* need to consult any "qualia" in order to make
a judgement about it. I make a judgement about whether something
is conscious based on its behavior, and if I ever revise my judgement
the revision is because I learn something new about its behaviour.

How do you actually mark up or assign the behaviour to one or the
other?

Now, *that's* a good question. How do you human beings make such
judgements about other entities? The answer to that question is
worth pursuing. Unlike the "hard problem".

Clearly, the only way to put
the behaviour into their respective bins is whether or not an entity
has experiences/feelings/qualia. That is, conscious behaviour is
only *defined* de-facto, by the behaviour that is associated with an
entity that has feelings. What is that something?

I don't agree.
This makes absolutely no sense. Do you actually realise what you have said?

Look, we are discussing behaviour, both non-conscious and conscious.
Consciousness is, by assumption or definition, "that which has feelings".
That is what we are trying to distinguish, non-conscious behaviour and
conscious behaviour, as the fundamental point of the argument. Conscious
entities have feelings; non-conscious ones don't. Otherwise what are you
actually trying to debate?

How do you define non-conscious and conscious behaviour? How do we know
which behaviour to attribute to non-conscious entities and which to
conscious ones? How do we independently know which behaviour experiences
true feelings?

To say that conscious behaviour is that which is due to conscious
behaviour seems to be your only escape route, which of course explains
nothing.

That seems what *you* are saying. I'm not saying that.
Yes you are.


--
Kevin Aylward
ka@anasoftEXTRACT.co.uk
www.anasoft.co.uk
SuperSpice
 
Kevin Aylward says...

The issue here is that you still cannot handle the concept of irreducible
axioms.
I have no problem with irreducible axioms that have
non-tautological content.

There is no simpler explanation than the fact that a conscious
entity has pain. It can only be explained with reference to the assumption
that someone else feels pain in a similar way to oneself. The
irreducibility of pain does not mean that it doesn't exist, as you are
actually claiming.
I never claimed that pain doesn't exist. I just deny that
it is irreducible in the sense that you are talking about.

Electrons don't feel pain. We do. Why? That's the hard problem.
"Electrons don't feel pain" is *NOT* an observation that
needs to be explained. It is an *assumption*. The point of
a scientific theory is to explain our observations, not to
explain our assumptions. Unless you are talking about a
theory of human assumptions---in that case, "Humans do not believe
that electrons feel pain" is an observation about *humans*,
not about electrons.

That's actually the interesting observation that needs explanation:
Humans divide the world into conscious entities and non-conscious
entities. How do they do it, and why are these categories useful?

Note that this question does *not* involve any circularities of
the form "Pain hurts". We don't need to assume that consciousness
exists in order to explore why humans find it a useful concept,
any more than we need to assume that ghosts exist in order to
explore why humans have believed in them.

The reason it's useful to distinguish
conscious from unconscious entities is because for
conscious entities it is helpful to take *motivation*
into account: If you want to move a 10-ton elephant, you
can try offering it food some distance away. In contrast,
there is really nothing that can motivate a 10-ton boulder
to move where you want it.

Assuming that an entity is conscious allows you to predict
and in some cases control its behavior by taking into
account motivation. This allows a level of understanding
of an extremely complex system that could not be otherwise
understood.

*THAT's* what it means for a creature to be conscious, to me.
It's a category, in the same way that being alive is. There is
no magic spark of life. Things are called "alive" because they
behave in certain common ways: taking in energy from the environment
and using it to maintain their own bodies and to reproduce.

I'm just repeating the description of the two-step process:
Environmental conditions affect brain states.
Brain states affect behavior. Mental terms such
as "pain", "pleasure", "qualia", etc. are just
labels for particular brain state patterns.

There is no "just". These label don't tell us that pain hurts. Its new
informaton.
The *labels* are meaningless. What makes it "pain" as opposed to
pleasure is the causal connections involved in (1) producing
that state, and (2) producing behavior when in that state.

I do not agree that saying "Pain hurts" is new information.
Tautologies are not information (at least, not *empirical*
information).

And yes, "pain hurts" is circular. That is why the problem exists.
That's why it's a pseudo-problem.

If you could completely
disconnect pain from behavior, it wouldn't be pain anymore.

Suppose that some sophisticated brain surgery rewired your
brain so that the sensation of tasting sweetness and
the sensation of pain were switched. So tasting sweetness
causes pain, and stubbing your toe causes you to taste
sweetness. But the rewiring also made the corresponding
change in behavior, so that you tend to flinch when you
taste sweetness, but you tend to seek out pain (especially
painful ice-cream).

I would say that such a rewiring is *meaningless*. There
is no meaning to "the feeling of tasting sweetness" other
than the typical causes of that mental state and the typical
behaviors that result. There *is* no such thing as a feeling
divorced from stimulus and behaviour.

I disagree: the internal mental state (the electro-chemical signals)
is all there is that defines pain and sweetness. The external
behaviour is simply irrelevant.

Well, I think you're completely wrong about that.

Pardon?
I meant that you are completely wrong about the claim that
"external behavior is simply irrelevant". What makes a particular
electro-chemical signal "pain" as opposed to something else are
the causal connections describing how that signal typically arises
and how it typically influences behavior.

If you have
a test tube full of neurons, you could induce them to fire, but
that firing doesn't indicate pain, or pleasure, or tasting sweetness,
or anything in particular until those neurons are connected up with
a more complete system that can act on those electrochemical signals.

This is now getting quite daft. You obviously understand little that I have
said.
You have a tendency to dismiss all disagreement as due to stupidity.
That isn't very helpful. If it were really the case that everything
that you believed was so obvious that any intelligent person would
believe the same thing, then there would be no need for you to say
it.

The de-facto assumption is that I am discussing the complete complex system
of neurons wired up as a brain as a given. That is what the internal mental
state is.
Yes, and I'm disagreeing with that. What makes it pain is the way
that that system is typically connected with the rest of the world.

We can be trained to respond behaviourally opposite to what is normally
expected, but this does not change the internal feeling.

I think that's completely wrong.

I know that you're wrong. It is trivially obvious that it is the internal
brain state that governs feelings.
You need to keep straight when a claim is an empirical
observation, and when it is a theoretical observation.
You have a theory of "feelings", and according to this
theory, it is the internal brain state that governs
feelings. That's fine, but that's a *theoretical* claim.
If you want to make it an *empirical* claim, then you
have to give an empirical test for "feelings" that is
independent of brain states.

This internal brain state can be measured externally in principle.
But how do you externally measure "feelings"?

That is, the distribution of chemicals and electrical signals.
The measurement of someone moving their arms (external
behaviour) is obviously only a second-level result of the internal
physical state. Feelings reside in the brain, so that is what is important.
Fine, that sounds very reasonable. But my point is that it is only
meaningful to label a feeling *pain* (versus pleasure) because of the
role that that internal state plays in mediating the interaction between
the entity and the environment.

You don't have observations that require explanation, either.
Tautologies (such as "Pain hurts") are *not* observations.

Yes they are.
They aren't *empirical* observations.

Isn't that what I just said? If everything is in the service of
reproduction, then it's redundant to include that in the definition
of "emotion".

No it is not redundant. It is used to explain why emotions are selfish.
The fact that emotions are selfish is an empirical *observation*.
It shouldn't be lumped into the *definition* of "emotion" so that
it is tautologically true.

After a man has a vasectomy,
he continues to have emotions, but they no longer have anything
to do with reproduction. The love of one's children is no less
strong when the children are adopted.

Not actually so

Yes, actually so. I'm speaking from experience here.

If you are telling me that you are bringing up someone else's child and
*all* the feelings are the same as if it were your own,
you are lying to me or lying to yourself.
Oh, go jump in a lake. All four of my children are adopted, and
I love them very much. You're an idiot.

--
Daryl McCullough
Ithaca, NY
 
On Sat, 31 Mar 2007 05:45:46 GMT, Salmon Egg
<salmonegg@sbcglobal.net> wrote:

On 3/30/07 5:42 AM, in article C23253B7.6CA52%salmonegg@sbcglobal.net,
"Salmon Egg" <salmonegg@sbcglobal.net> wrote:

About 13 years ago, I developed 12 proofs that Muhammad was a false prophet.
I noted brief summaries in the margin of a book I was reading.
Unfortunately, what was preeminently clear at the time has become very
obtuse now. Can anyone help me reconstruct the proofs?


I was just being a wise-ass. I am really amazed at how seriously people have
taken my post. Actually the proof showed that Muhammad was not any prophet
at all.

-- Fermez le Bush--about two years to go.

Muhammad was indeed a prophet for a profit. His "revelations"
were astonishingly skewed to meet his immediate lusts and greed.
 
On Mar 24, 2:42 pm, c...@kcwc.com (Curt Welch) wrote:
"Kevin Aylward" <kevin_aylw...@ntlworld.com> wrote:
Consciousness can be defined in a non-circular way. It's brain activity.
No, no, no, no. Consciousness is phenomenal experience. If you want
to claim that our phenomenal experiences are caused by or just are a
function of brain activity, you are free to prove that. However, you
have yet to even GRASP what proving that might entail, let alone
making any progress on showing that it is.

If you choose to reject that definition, then you are the one that are
choosing to believe it can't be defined simply because you choose to reject
the only definition that fits the facts.
Dualism fits the facts. It promotes the mind as being non-physical.

You can't prove you are right, by
defining yourself to be right as an axiom of your beliefs. But that's
exactly what you have done. In which case, your paper says nothing more
than, I'm right because I say I'm right. And that level of stupidity is
why I couldn't bring myself to read past the title.
I'm surprised, since that sort of argument is all that I've ever seen
from you.

Brain activity is not something that can be studied and understood by
philosophers who choose to ignore everything physical as if it wasn't
important.
Since most philosophers are staunch physicalists, all you've done here
is show that you don't know anything about the real issues and
positions on consciousness.
 
On Mar 24, 3:56 pm, stevendaryl3...@yahoo.com (Daryl McCullough)
wrote:
Kevin Aylward says...

We say that feelings aren't physical. Memories aren't physical.
Ideas aren't physical. Qualia aren't physical. Tangible objects are
physical, intangible objects are not physical. But that's all pure
bull shit created by people that didn't even know we had a physical
brain.

Oh dear. It's clear that you just haven't thought this thing through at all.
A machine doesn't have "experience".

On what basis do you say that? I can understand how one could use
introspection as evidence that he has experience, but what counts as
evidence that something *doesn't* have experience?
Doesn't this question fall into the same sort of reasoning as some
would claim applies to asking for evidence that God doesn't exist?
 
On Mar 24, 6:10 pm, c...@kcwc.com (Curt Welch) wrote:
"Kevin Aylward" <kevin_aylw...@ntlworld.com> wrote:
Curt Welch wrote:
Look, mate, suppose I kick you in the balls. Now explain to me how
inanimate electrons and protons give you that experience that you
attribute as pain. What is experience? The electrical impulses
coincident with that kick in the balls simply do not explain the fact
that you don't like it. That it *hurts*.

Unfortunately for you, you can't present any evidence to support this
belief of yours.

I don't have to prove anything because it is you who is suggesting that
pain is not explained by the electrical impulses. I say it is. Prove me
wrong.
Why does he have to prove you wrong? You have no evidence beyond a
correlation to show that the brain is involved AT ALL. You have given
no evidence that electrical impulses explain the experience of pain.
And then you demand that he prove that your view CANNOT be correct,
even though you haven't supported YOUR claim? Give me a break.

Show me the scientific experiment to prove what you suggest. If you can't
show it to me, then there is no evidence to support your belief. It is,
just a belief.
Show me the scientific experiment to prove your claim. I'll guarantee
that it in no way TOUCHES the actual phenomenal experience. And in
doing so, I think that his point would be proven: electrical activity
cannot explain the phenomenal experience because we cannot find an
experiment that would show that the two of them have to go together,
and aren't just correlated.

Again, this is something, like a God, that you have simply chosen to
believe in. I don't. I believe physics explains everything we need to
know about what pain is.
Then, please, prove it.

But, I don't have to prove this. I don't have to be right, for you to be
wrong.
But if you aren't right, then why would you believe your own claim?
Why do you believe your claim if you can't prove your own claim?

It does exist in the behavior of electrons. You simply choose to believe
it doesn't even though there is no evidence to support this belief.

I've argued these points for years so don't tell me I've not thought about
them. You can find over 1,000 messages posted by me on these subjects in
cap.
And all of them prove exactly that: you HAVEN'T thought about them.
Instead, you've simply asserted them in the face of argumentation in
comments in the same manner as the most rabid theist.

You say we can have brain activity without consciousness. How can you
prove that? It's totally fucking impossible.
Put someone under anaesthetic. The brain still functions, or else
they'd die. Ain't no evidence of consciousness at that point in any
way.

There are drugs that make our memory stop working. They prevent us from
remembering what happened to us or what type of conscious experience we
were having 10 minutes in the past. Were they conscious at that time?
If they can't remember it, does that mean they weren't conscious? I think
most people would argue that they were conscious but that they simply
couldn't remember it.
So, let's take general anaesthetics. Do they knock me unconscious, or
do they just stop me from remembering that I WAS conscious while
they're cutting me open?

I prefer to believe the more likely answer, that they actually make me
unconscious. If you don't, then I hope you never actually become a
doctor [grin].

Also, what about sleep? Do you think that I'm awake all the time
while I'm asleep, but just don't remember it when I wake up?
 
On Mar 25, 7:33 am, stevendaryl3...@yahoo.com (Daryl McCullough)
wrote:
Bob Myers says...

...I would again point out
that we have no way of directly demonstrating "experience"
or "feelings" in ANYONE but ourselves - therefore, I am not
certain that they exist in anyone else. The rest of you may be
all zombies for all I know (which would raise troubling questions
about where you all came from, and why I'M here, but at
least it IS a possibility!).

I don't see how such a possibility has any meaning.
What would it *mean* for someone else to lack "feelings"
or "experience"?

Obviously, when people talk about zombies, they are
distinguishing between "as if" mental properties and
"real" mental properties. We can certainly all agree
that other humans behave as if they had sensation,
emotions, awareness, etc. But supposedly that isn't
enough to show that they have "real" mental experience.
But what does that mean? What does "real" mean in this
case? Presumably, it means "Like mine". But what notion
of "likeness" is appropriate here? Of course, no two
brains are alike, so no other brain is like mine, and
no other mind works precisely like mine. But what
range of differences is allowable for mentality to
be considered "real"?
The zombie example is clear on this: a case where there is no
experience at all. So: no inner speech, no inner reasoning, and no
phenomenal experiences. This is hard for us to imagine because almost
all of our knowledge and beliefs come from experience, so let me
clarify this slightly with an example: Imagine that all of your
experiences of colours come from a machine that pops up a set of text
in front of your eyes that says that the object is a particular
colour. The claimed experience of the zombie is something like
that ... except it doesn't even have the text.

Basically, for the zombie, all that happens is that the brain state
changes and the behaviour changes, with no inner states or experiences
occurring at all.


The further question is, why should anyone *care*
about the difference between "real" and "as if"
mentality?
Because what we want to explain when we want to explain mind is REAL
mentality. "As if" mentality is utterly meaningless when trying to
explain mind.

Let me try an analogy. Suppose we're talking about
socks. Some philosopher has a theory that there are true
socks and there are pseudo-socks. This philosopher
doesn't yet have any physical test to distinguish
true socks from pseudo-socks, and he *also* doesn't
have any explanation for why anyone would care whether
they are wearing true socks or pseudo-socks. But he
insists that there is a property of "intrinsic sockness"
that is not reducible to the physical facts. Why would
such a theory of socks make any sense? Why is the
possibility of zombies any different from the possibility
of pseudo-socks?
You have the case backwards. Here, the philosophers have put forward
the question of what socks are. The materialist has put forward a
solution to that question. The philosopher then points out that under
the materialist's theory, you could have socks and pseudo-socks and
not be able to tell the difference. Your reply here is then basically
saying: "Why do I care about some philosophical argument about pseudo-
socks?" To which the philosopher replies: "Because if your theory were
right and captured all there was to know about socks, you should be
able to tell the difference between socks and pseudo-socks." Sub in
consciousness above, and you should see why this is still an issue.
 
On Mar 26, 1:29 pm, stevendaryl3...@yahoo.com (Daryl McCullough)
wrote:
Kevin Aylward says...





Daryl McCullough wrote:
Kevin Aylward says...
All due respect here, but if someone actually wants to debate whether a
wind-up watch has "experience", to wit, consciousness, they will need to
do it without me. I don't have time to engage in such pointless debates.

If the debate is about experience, then how in the world is it
pointless to ask how you know that something doesn't have experience?
I'm not arguing that a wind up watch has experience, I'm asking you
on what basis are you saying one way or the other? What counts as
evidence on such a question?

To quote someone else from this thread: "It's clear that you just haven't
thought this thing through at all."
Two points here:

1) There is no reason to EXTEND the idea that it might possess
experience to the watch, thus it's ridiculous to use that as ANY sort
of point in a debate about experience.

2) If the reason I extend consciousness to other humans is similar
behaviour and a similar physical make-up, then the watch question is
itself answered.
 
On Mar 26, 11:36 pm, c...@kcwc.com (Curt Welch) wrote:
"Kevin Aylward" <kevin_aylw...@ntlworld.com> wrote:
Curt Welch wrote:
I've argued these points for years so don't tell me I've not thought
about them.

Clearly with no success.

That's true! Everyone that understands my points understood them before I
tried to argue them and everyone that fails to understand them still fails
to understand them as far as I know.
It's not that we fail to understand them. We understand them all too
well. It's just that they never prove anything like what you claim
they prove, because you simply ignore any of the issues that people
raise about them.
 
On Mar 28, 2:08 pm, stevendaryl3...@yahoo.com (Daryl McCullough)
wrote:
Kevin Aylward says...

I'm saying that you can give an evolutionary explanation for
*behavior*, because behavior affects survival and reproduction.
You can give an evolutionary explanation for brain structures
that give rise to those behaviors. So as long as you are identifying
consciousness with "the brain structures necessary to produce
such and such behavior" then I think it's fine to invoke
evolution as an explanation for consciousness. That's my preferred
approach. Consciousness is just another name for a sophisticated
process of modeling the world and acting on that model.

Consciousness is more. A kick in the balls *hurts*.

But we don't know whether that is "more" or not.

Just how is that accounted for by "a sophisticated process
of modeling the world and acting on that model"?

You haven't really said what it would *mean* to account
for it. We can certainly account for the fact that a kick
in the balls causes a person to grab his crotch, to scream,
to keel over, to strike back, to avoid situations in which
that sort of thing happens. What, exactly, are you thinking
is left to account for?
Why I feel that it hurts.

I could do all of those things without ever actually feeling pain.
And once I learn that a kick in the balls isn't a good thing, why
would I ever need to feel pain ever again, under your idea?
 
On Mar 28, 4:49 pm, "Glen M. Sizemore" <gmsizemo...@yahoo.com> wrote:
"Daryl McCullough" <stevendaryl3...@yahoo.com> wrote in message

news:eueeep0rmt@drn.newsguy.com...



Kevin Aylward says...

I'm saying that you can give an evolutionary explanation for
*behavior*, because behavior affects survival and reproduction.
You can give an evolutionary explanation for brain structures
that give rise to those behaviors. So as long as you are identifying
consciousness with "the brain structures necessary to produce
such and such behavior" then I think it's fine to invoke
evolution as an explanation for consciousness. That's my preferred
approach. Consciousness is just another name for a sophisticated
process of modeling the world and acting on that model.

Consciousness is more. A kick in the balls *hurts*.

But we don't know whether that is "more" or not.

Just how is that accounted for by "a sophisticated process
of modeling the world and acting on that model"?

You haven't really said what it would *mean* to account
for it. We can certainly account for the fact that a kick
in the balls causes a person to grab his crotch, to scream,
to keel over, to strike back, to avoid situations in which
that sort of thing happens. What, exactly, are you thinking
is left to account for?

I don't know what Kevin is thinking, but I'll tell you what is unaccounted
for: the verbal response "That hurts." Accounting for that is the key to
"self-awareness," and it is reasonably well understood by a few people.
The verbal response is irrelevant: I can produce that verbal statement
without pain being anywhere near present. In fact, I just did it now,
while typing. And again.

The fact that it's a verbal response ain't very interesting unless you
say what it's a response TO. And it's a response to a feeling of
pain. And that's what the problem here is, and that's what's not
accounted for.
 
On Mar 29, 2:14 pm, stevendaryl3...@yahoo.com (Daryl McCullough)
wrote:
Kevin Aylward says...



Daryl McCullough wrote:
We can certainly account for the fact that a kick
in the balls causes a person to grab his crotch, to scream,
to keel over, to strike back, to avoid situations in which
that sort of thing happens. What, exactly, are you thinking
is left to account for?

Yeah,...this is all getting pretty pointless.

I'm trying to understand the sense in which your "hard problem"
is *not* pointless. It's not clear to me that there is any
hard problem at all.

I mean, there is certainly a hard problem of how does the
human brain do what it does.

Err. I can fake all of that, without a kick in the
balls.

So what? I'm suggesting that once you've explained a causal
connection between (A) getting kicked in the crotch, and (B)
the behaviors that I described, then you've explained all there
is to explain about pain. The fact that the same behaviors can
arise in other circumstances doesn't affect this.
Sure it does: if I can fake those behaviours in those situations, it
means that you can't point at those situations and those behaviours
and say "That's pain." Because if the experience isn't there,
neither is the pain, as you've just admitted.

"Pain" is just
a name that we give for the causal relationship between
environmental stimuli and behaviors.
No, pain is the name we give to the phenomenal experience that occurs
due to certain environmental stimuli and causes certain behaviours to
occur.
 
On Mar 31, 3:14 am, "Glen M. Sizemore" <gmsizemo...@yahoo.com> wrote:
"Kevin Aylward" <kevin_aylw...@ntlworld.com> wrote in message
Yep it is. Debate finished with you. See above as to why.

Civil debate ended when you said: "From your comments below, it seems that
you misunderstand what the hard problem actually is."
Because, somehow, claiming that someone doesn't understand the issues
under debate is supposed to be the most heinous insult imaginable, no?

I wonder how I've managed to remain relatively civil to you all these
years, with all the "insults" you toss at me in much less polite
terms.
 
"Allan C Cybulskie" <allan_c_cybulskie@yahoo.ca> wrote in message
news:1175374355.966635.251410@n59g2000hsh.googlegroups.com...
On Mar 28, 4:49 pm, "Glen M. Sizemore" <gmsizemo...@yahoo.com> wrote:
"Daryl McCullough" <stevendaryl3...@yahoo.com> wrote in message

news:eueeep0rmt@drn.newsguy.com...



Kevin Aylward says...

I'm saying that you can give an evolutionary explanation for
*behavior*, because behavior affects survival and reproduction.
You can give an evolutionary explanation for brain structures
that give rise to those behaviors. So as long as you are identifying
consciousness with "the brain structures necessary to produce
such and such behavior" then I think it's fine to invoke
evolution as an explanation for consciousness. That's my preferred
approach. Consciousness is just another name for a sophisticated
process of modeling the world and acting on that model.

Consciousness is more. A kick in the balls *hurts*.

But we don't know whether that is "more" or not.

Just how is that accounted for by "a sophisticated process
of modeling the world and acting on that model"?

You haven't really said what it would *mean* to account
for it. We can certainly account for the fact that a kick
in the balls causes a person to grab his crotch, to scream,
to keel over, to strike back, to avoid situations in which
that sort of thing happens. What, exactly, are you thinking
is left to account for?

I don't know what Kevin is thinking, but I'll tell you what is
unaccounted
for: the verbal response "That hurts." Accounting for that is the key to
"self-awareness," and it is reasonably well understood by a few people.

The verbal response is irrelevant: I can produce that verbal statement
without pain being anywhere near present. In fact, I just did it now,
while typing. And again.
And that would require a different explanation than the one in which someone
is actually "reporting pain." It does not change the view that when we are
trained to "report pain" we are made "conscious of it." That is, our own
response to "painful stimuli" would not discriminatively control any
behavior if the contingencies that generate such discriminative behavior
were not arranged.

The fact that it's a verbal response ain't very interesting unless you
say what it's a response TO. And it's a response to a feeling of
pain. And that's what the problem here is, and that's what's not
accounted for.
No, it is a response to other responses.
 
