[comp.ai] Definitions are pointless

jeff@aipna.ed.ac.uk (Jeff Dalton) (12/18/88)

In article <1144@arctic.nprdc.arpa> meadors@nprdc.arpa (Tony Meadors) writes:
>
>   The issues of mental experience or consciousness are simply irrelevant
>   to the action approach to intelligence taken here.
>   This is one of the main strengths of focusing upon actions.

Whether or not it's a strength depends on what your interests are.
Someone who's interested in consciousness may not see it as an advantage
of this approach that it makes consciousness irrelevant.

But does it really matter, at this point, how we define intelligence?
For example: if we want to study consciousness, or if we just want to
make machines do things without bothering about the machine's
subjective experience (if any), we can go ahead without knowing or
caring exactly what "intelligence" means.

It may be that consciousness is necessary for certain degrees of
intelligence, or it may be that it never becomes necessary at all;
but right now we're not in a position to find out which is correct.
We don't know enough about it.  We can talk about it all we want,
but we might just be wrong.