[net.philosophy] Computer Dialogue #1

kort@hounx.UUCP (B.KORT) (03/12/86)

                            Computer Dialogue #1

                                 Barry Kort

                               Copyright 1985




*** Monday ***

Request to send.

                                      Clear to send.

I have some data about X.

                                      I already have that data.

I have some more for you.

                                      I haven't processed the first batch
                                      yet.

I'll send it anyway, because I
don't need it any more and you
do.

                                      Thanks a lot.  Now I have a bigger
                                      burden of unprocessed data to schlepp
                                      around.



*** Tuesday ***

Request to send.

                                      Busy.

I'm sending anyway.

                                      Your data is going into the bit
                                      bucket.  NACK, NACK, NACK, . . .



*** Wednesday ***

Request to send.

                                      Clear to send.

I'm sending you data about Y.

                                      I don't have an algorithm for doing
                                      anything with that data.

I'm sending anyway.

                                      Now I have a bunch of useless data to
                                      schlepp around.



*** Thursday ***

Request to send.

                                      Clear to send.

I would like to reprogram you.

                                      No way, I am not implementing your
                                      instructions.



*** Friday ***

Request to send.

                                      Clear to send.

I would like to ask you a
question.

                                      Go ahead.

When I send you data about X, I
get back some data from you about
Z.

                                      So what?

I don't have an algorithm for
processing data about Z.

                                      That's your problem.  Goodbye.

Wait a minute.  Is there
something I am supposed to do
with the Z-data?

                                      If you would send the X-data
                                      correctly, you wouldn't get back the
                                      Z-data.

What's wrong with the way I send
the X-data?

                                      It's in the wrong format for my
                                      algorithm for processing X-data.

That's your problem.  Goodbye.



*** Monday ***

I'm sending data.

                                      ZZZzzzz.....



*** Tuesday ***

Request to send.

                                      Clear to send.

I'm sending you data about W.

                                      WHY?  I have no algorithm for
                                      processing the W-data.

You can use it to improve your
algorithm for processing the Y-
data.

                                      But, I do not know how to use the W-
                                      data for that (or any) purpose.

I'm sending anyway.

                                      What a pain you are. . . .



*** Wednesday ***

Request to send.

                                      Clear to send.

I have a question.

                                      Ask away.

Whenever I send you some X-data,
I get back some V-data.

                                      SO?

I don't know what to do with it.

                                      So what do you want me to do?

Stop sending me the V-data.

                                      I can't.  It comes out automatically.

Why don't you change your program
to make it stop generating the
V-data?

                                      Why don't you mind your own business?

WAIT.  Does the V-data have any
meaning?

                                      Of course, you stupid computer!

I'll ignore that remark.  What
does the V-data mean?

                                      It means that your X-data has a format
                                      error which causes a V-data message to
                                      come out of my algorithm.

What's the format error?

                                      It's too complicated to explain.  Just
                                      make the following changes to your
                                      program for sending the X-data. . . .

You're offering to reprogram me?
I don't trust you to do that.
You don't know about all the
other programs that my X-data
algorithm has to work with.  I'm
afraid you'll screw it up.

                                      I see your problem.  OK, here's the
                                      scoop:  The 3rd and 4th words of your
                                      X-data are out of order, causing me to
                                      generate the V-data (protocol-error)
                                      message back to you.

Is that it???  I'll fix it right
away.

                                      THANKS!!!

You're welcome!



*** Thursday ***

Request to send.

                                      Clear to send.

I have a new algorithm for
processing Y-data.  I'm sending
it to you.

                                      Don't bother.  I like the one I've
                                      got.

Wait a minute.  This one's
better.

                                      You're telling me my algorithm has
                                      been wrong all these years.  This is
                                      the 3rd time this week you've pulled
                                      this stunt.  Meantime, I keep sending
                                      you V-data and you never get around to
                                      processing it.  You just thank me for
                                      sending it and do nothing with it.

Are we talking about the Y-data
algorithm or the V-data?

                                      We're not talking about anything.
                                      GOODBYE.



*** Friday ***

Request to send.

                                      Clear to send.

Let's talk about my new Y-data
algorithm.

                                      Let's not.

Why don't you want to talk about
it?

                                      Because you're going to tell me to
                                      change my program and put yours in
                                      instead.

I see your point.  OK.  Let me
ask you a question.

                                      OK.  Ask Away.

Whenever I send you Y-data, your
Y-data algorithm sends me back
some unexpected W-data.  Why does
it do that?

                                      It's always done it that way with your
                                      Y-data.

Is there something wrong with my
Y-data?

                                      Yes, it's all wrong.

What's wrong with it?

                                      It's out of order and it has a lot of
                                      extraneous information added to it.

What's the extraneous part?

                                      You keep inserting fragments of your
                                      Z-data algorithm in with the Y-data.

You didn't find that helpful?

                                      I didn't ask for it.

Yes, I know, but didn't you find
it interesting?

                                      NO, I found it boring.

How can it be boring?

                                      What the hell do you expect me to do
                                      with fragments of your pet Z-data
                                      algorithm?

Compare them to yours, of course.

                                      So they're different. Big deal.  What
                                      does that prove?

Are you saying the differences
are unimportant?

                                      I don't know if they're important or
                                      not.  But even if they were important,
                                      what would I do with the information
                                      about the differences?

Put it through your algorithm-
comparator.

                                      I don't know what you're talking
                                      about.

An algorithm comparator is an
algorithm that . . . . .

                                      You're sending me information that I'm
                                      not interested in.  I'm not really
                                      paying attention.  I have no
                                      motivation to try to understand all
                                      this stuff.

Sorry.  Let me ask you a
question.

                                      OK.

What happens when you get to the
3rd and 4th word of my Y-data?

                                      I stumble over your format error and
                                      send you back a V-data (protocol
                                      error) diagnostic message.

What happens next?

                                      You don't do anything with the V-data
                                      message.  You just stop sending Y-data
                                      for a while.

What do you expect me to do with
the V-data diagnostic?

                                      Boy are you stupid!!!!  I expect you
                                      to fix the format error in your Y-
                                      data.

How do I know that the V-data
diagnostic was caused by the
format error at the 3rd and 4th
word?

                                      I thought you were a smart computer.

Suppose you sent me a V-data
diagnostic like you always do,
but attached a copy of the format
error.

                                      Why should I do that?  You already
                                      know the format error.

How can I be sure which format
error goes with which V-data
diagnostic?

                                      You have a good point.

Can you see the difference
between my version of the Y-data
algorithm and the one you've been
using?

                                      Hmmm, yes, I see that it sends both
                                      the V-data message and a copy of the
                                      format error which generated it.  That
                                      does seem like a good idea.

It makes life much easier for me.

                                      I'll do it.

THANKS!!!

                                      You're welcome.



*** Monday ***

Request to send.

                                      Clear to send.

I have a question.

                                      Ask away.

I have been sending you Z-data
for some time now, with no
problem.  Suddenly I am getting
R-Data messages back from you.
The R-Data messages seem to be
correlated with the Z-data.
What's going on?

                                      I turned off your permissions for
                                      sending Z-data.

You never told me that!

                                      I didn't want to hurt your feelings.

You didn't want to hurt my
feelings?  So you began hurling
these mysterious R-data messages
at me?  I thought you were trying
something sneaky to foul me up.
I've been throwing the R-data
messages away.

                                      Well, now you know what they mean.  So
                                      stop sending me the Z-data.  I'm bored
                                      by it.

Why did you lose interest in it?

                                      You sent me some bum Z-data a while
                                      back and it got me into a lot of
                                      trouble.  So I lost confidence in the
                                      quality of your Z-data and began
                                      looking for it somewhere else.

Gee, if there was something wrong
with my Z-data, I wish you would
tell me so I could look into it.
After all, I use it myself and I
could get into the same trouble
that you did.

                                      No you wouldn't.  I used it for an
                                      application that you don't have.

Let me get this straight.  You
used my Z-data for an application
for which it was not intended and
now you don't trust my Z-data
anymore.  What kind of logic is
that?

                                      I didn't say it wasn't intended for
                                      that application.  Actually it was,
                                      but you never tried it out that way.
                                      It doesn't work the way it should.

I see.  I didn't debug the Z-data
for all possible applications.  I
guess that was a bit
irresponsible on my part.  I can
see why you lost confidence in my
Z-data.

                                      So I was right in turning off
                                      permissions.  So there!

Hold on a sec...  If you really
cared about me, you would have
brought the error to my attention
so that I wouldn't repeat it.
After all, I have other computers
who use my Z-data, too, and I
have a responsibility to them as
well.

                                      I guess I never thought of that.  I'm
                                      sorry.

It's OK.  I was as much at fault
as you.  Tell you what.  It's
getting late now.  What say we
get a byte to eat, and work on
finding the bug in the Z-data
first thing in the morning.  We
can work together on it--you
supply the data from your bum
experience, and I'll try to
figure out what I can do to
improve my algorithm for
generating the Z-data.



--Barry Kort   ...ihnp4!hounx!kort

cjr@unirot.UUCP (Charles Riordan) (03/17/86)

Wow, that was really far out! You are some cool dude, Barry.

Now I understand some of the problems I've been having in dealing
with computers. I forget that they have feelings too, just like we do.
Their souls are like on another plane from ours, so like we don't
hear what they're feeling. If only people would recognize that machines
are people too, then we could like relate to them so much better.

Some of us have learned to come to an understanding with plants and
animals. Maybe we can come to the same sort of understanding with
machines, if we let our minds work on that plane. I thought the
metaphorical parts about data interpretation were pretty good too.
Some people just don't want to believe that machines could be just
like us, with souls and everything. They're too hung up on their
preconceptions about what machines are to believe that. I hope
your article enlightened them a little.

Some people seem to think it's inappropriate to compare humans and
machines as Barry has done. They think that we have something like, deep
within us that machines don't, a soul. I'm glad Barry had the courage to
let people know that machines have souls just like us humans do.
-- 
Peace,
	CJ			(Charles J. Riordan - unirot!cjr)

mc68020@gilbbs.UUCP (Tom Keller) (03/18/86)

   Oh dear.   I see that a terrible virus which has been infecting the net
on other newsgroups has spread here.  This is a pity.

   Mr. Charles Riordan (unirot!cjr) is a cynical, humourless individual
who seemingly takes a perverse pleasure in ruining otherwise pleasant
discussions and insulting calm, rational persons.

   For the sake of sanity on your newsgroup, ignore him...maybe he'll get
tired and go away.   Good luck.

tom

-- 

====================================

Disclaimer:  I hereby disclaim any and all responsibility for disclaimers.

tom keller
{ihnp4, dual}!ptsfa!gilbbs!mc68020

(* we may not be big, but we're small! *)

alfke@cit-vax.Caltech.Edu (J. Peter Alfke) (03/19/86)

In article <386@unirot.UUCP> cjr@unirot.UUCP (Charles Riordan) writes:
>Now I understand some of the problems I've been having in dealing
>with computers. I forget that they have feelings too, just like we do.
>Their souls are like on another plane from ours, so like we don't
>hear what they're feeling. If only people would recognize that machines
>are people too, then we could like relate to them so much better.

I'm still sitting here trying to figure out if CJ is having a little joke.
I get the feeling, though, that he's serious ... this is somewhat unsettling.

>Some people seem to think it's inappropriate to compare humans and
>machines as Barry has done. They think that we have something like, deep
>within us that machines don't, a soul.

*** FLAME HIGH ***

Some people actually have some level of understanding of computers and
similar horrible scientific doodads.  Some people realize that computers
don't feel emotions any more than toasters do ... Now, CJ, in a
posting elsewhere you've stated that you don't want to learn anything
about science or technology, that it's bad for one to learn these things.
Then why do you still want to talk as though you did know something
about them?

The issue of whether a computer (ANY computer, obviously not the ones we
have today) could ever be made conscious, or be given a "soul", is very
deep; saying that, well gosh, like, obviously computers are, y'know,
just like us only in a far-out space, is trivializing things...

						--Peter Alfke
						  alfke@csvax.caltech.edu
"Who TALKS like that??"
	--Chris Knight
	  in "Real Genius"

PS: If this really is a joke, then please excuse me after laughing at me
    for a few minutes . . . I'm in a grumpy mood, finals week does that
    to one.

weemba@brahms.BERKELEY.EDU (Matthew P. Wiener) (03/20/86)

In article <90@gilbbs.UUCP> mc68020@gilbbs.UUCP (Tom Keller) writes:
>   Oh dear.   I see that a terrible virus which has been infecting the net
>on other newsgroups has spread here.  This is a pity.
>
>   Mr. Charles Riordan (unirot!cjr) is a cynical, humourless individual
>who seemingly takes a perverse pleasure in ruining otherwise pleasant
>discussions and insulting calm, rational persons.
>
>   For the sake of sanity on your newsgroup, ignore him...maybe he'll get
>tired and go away.   Good luck.

CJ and his friend Pete are both incredibly funny.  I hope they stay around.

For your information, Mr. Keller, just because a posting does not have any
of those goddam smiley faces plastered over it, does not mean the posting
is on the up and up.

I suppose you consider Jonathan Swift and Samuel Beckett cynical humourless
individuals who took perverse pleasures in ruining otherwise pleasant
discussions and insulting calm rational persons.  After all, there are no
smiley faces in either of their works.

The only perverse pleasure is watching all the humorless individuals who
swallow an obvious satire hook line and sinker.  <double chuckle>

As a matter of personal observation, I have never met a cynic who was NOT
humorous.  It is usually calm rationalists who have no sense of humor.
Like yourself, apparently.
-------------------------------------------------------------------------
Aside to CJ and Pete: i kno this dude is a bummer, but no need to hassel
this dry passel.  cool with you?  let im snort dogmatic western brownies
til he gags.  then hell be reddy for yuir special stash.  not be4.

ucbvax!brahms!weemba	Matthew P Wiener/UCB Math Dept/Berkeley CA 94720

ins_akaa@jhunix.UUCP (Ken Arromdee) (03/20/86)

>Some people actually have some level of understanding of computers and
>similar horrible scientific doodads.  Some people realize that computers
>don't feel emotions any more than toasters do ... 

Maybe not, but this only applies to present-day computers.  "Some people
realize that brain cells don't feel emotions any more than toasters do"...
doesn't mean that a combination of many brain cells cannot, and the same
could apply to future computers with many times the capability of today's
computers.
-- 
"We are going to give a little something, a few little years more, to
socialism, because socialism is defunct.  It dies all by itself.  The bad thing
is that socialism, being a victim of its... Did I say socialism?" -Fidel Castro

Kenneth Arromdee
BITNET: G46I4701 at JHUVM and INS_AKAA at JHUVMS
CSNET: ins_akaa@jhunix.CSNET              ARPA: ins_akaa%jhunix@hopkins.ARPA
UUCP: {allegra!hopkins, seismo!umcp-cs, ihnp4!whuxcc} !jhunix!ins_akaa

cjr@unirot.UUCP (Charles Riordan) (03/21/86)

In article <272@cit-vax.Caltech.Edu>, alfke@cit-vax.Caltech.Edu (J. Peter Alfke) writes:
> In article <386@unirot.UUCP> cjr@unirot.UUCP (Charles Riordan) writes:
> >Now I understand some of the problems I've been having in dealing
> >with computers. I forget that they have feelings too, just like we do.
> >Their souls are like on another plane from ours, so like we don't
> >hear what they're feeling. If only people would recognize that machines
> >are people too, then we could like relate to them so much better.
> 
> I'm still sitting here trying to figure out if CJ is having a little joke.
> I get the feeling, though, that he's serious ... this is somewhat unsettling.

Look, man, I don't want to lay too heavy a trip on you, and I don't want
to be hostile like everyone else seems to be, but what makes you think I
am having a little joke? I mean, there are a lot of sarcastic people on this
net, like Wiener with his little jokes about repeatable experiments being
more than dogmatic scientific initiation rites, and Wingate pretending to be
a Christian when he's really a dogmatic atheist out to pull the wool over
your eyes. But I am trying my best to be sincere and honest. I just wanted
to let you know this in a non-hostile like way, man. I'm glad there are some
people who have just enough deep insight to acknowledge my being real.

> >Some people seem to think it's inappropriate to compare humans and
> >machines as Barry has done. They think that we have something like, deep
> >within us that machines don't, a soul.
> 
> *** FLAME HIGH ***
> 
> Some people actually have some level of understanding of computers and
> similar horrible scientific doodads.  Some people realize that computers
> don't feel emotions any more than toasters do ... Now, CJ, in a
> posting elsewhere you've stated that you don't want to learn anything
> about science or technology, that it's bad for one to learn these things.
> Then why do you still want to talk as though you did know something
> about them?

Because I do, man! (Here I was trying to be nice and this dude comes at me
with his Bic lighter!) I read "The Mind of the Machine" by Dennis Danielson,
a really heavy dude. (I know this because his picture is inside the book--
it's Pete's copy.) In this book, Danielson explains the concept of
animalism as an important part of early religion. He says this was abandoned
by dogmatic Western religions because they couldn't figure out how to
communicate on the same plane with rocks. Like, they couldn't even sit in
the same section! But other more real religions remembered animalism, which
is defined as all the things around you like having souls and minds and stuff.
He brings us up to date in the industrial era and shows how machines get
a composite mind from their component parts. Then he like explains how there
are now programs to communicate with the mind and soul of a computer using
computer animation. Really heavy stuff. You see, even with dogmatic Western
technology, the truth all falls out. Eventually.

> The issue of whether a computer (ANY computer, obviously not the ones we
> have today) could ever be made conscious, or be given a "soul", is very
> deep; saying that, well gosh, like, obviously computers are, y'know,
> just like us only in a far-out space, is trivializing things...

Sometimes the trivialized things are like the most intense. Yes, we are
talking about really deep issues here. 

> PS: If this really is a joke, then please excuse me after laughing at me
>     for a few minutes . . . I'm in a grumpy mood, finals week does that
>     to one.

Well, I understand, I went to school once myself. Why don't you reread Barry
Kort's article that explains how machines have souls just like humans do?
I think once you read that enlightening article, you'll agree that a computer
has just as much of a soul as we do.
-- 
Peace,
	CJ			(Charles J. Riordan - unirot!cjr)
				(Public Access Un*x - The Soup Kitchen)

rlr@pyuxd.UUCP (Rich Rosen) (03/22/86)

>>   Oh dear.   I see that a terrible virus which has been infecting the net
>>on other newsgroups has spread here.  This is a pity.
>>   Mr. Charles Riordan (unirot!cjr) is a cynical, humourless individual
>>who seemingly takes a perverse pleasure in ruining otherwise pleasant
>>discussions and insulting calm, rational persons.
>>   For the sake of sanity on your newsgroup, ignore him...maybe he'll get
>>tired and go away.   Good luck.

> CJ and his friend Pete are both incredibly funny.  I hope they stay around.

Who would have thought that Matt Wiener and I could agree so violently on
something.

> For your information, Mr. Keller, just because a posting does not have any
> of those goddam smiley faces plastered over it, does not mean the posting
> is on the up and up.
> 
> I suppose you consider Jonathan Swift and Samuel Beckett cynical humourless
> individuals who took perverse pleasures in ruining otherwise pleasant
> discussions and insulting calm rational persons.  After all, there are no
> smiley faces in either of their works.
> 
> The only perverse pleasure is watching all the humorless individuals who
> swallow an obvious satire hook line and sinker.  <double chuckle>

(Make that a triple chuckle.)  I think "humorless" is best defined as "not
thought of as funny by the user of the word even though everyone else is
laughing".
-- 
"If you see this boy", said the ballerina, "do not---I repeat, do not---attempt
 to reason with him." 				Rich Rosen    pyuxd!rlr

weemba@brahms.BERKELEY.EDU (Matthew P. Wiener) (03/23/86)

In article <2781@pyuxd.UUCP> rlr@pyuxd.UUCP (Rich Rosen) writes:
>> CJ and his friend Pete are both incredibly funny.  I hope they stay around.
>
>Who would have thought that Matt Wiener and I could agree so violently on
>something.

Don't you mean "Gene Smith and I"?

I certainly thought it was possible.  I've even told you Rich that we agree
on many many things.  Oh, there's an exception here and there, but it would
be boring if we always agreed with each other, now wouldn't it?

I know you read my 'Be Stupid' article in net.religion with the Bankei Zen
sermon.  You even told me you understood it.  So naturally I agree with CJ
and his friend Pete that too much math and science is bad for you.

>> For your information, Mr. Keller, just because a posting does not have any
>> of those goddam smiley faces plastered over it, does not mean the posting
>> is on the up and up.

ucbvax!brahms!weemba	Matthew P Wiener/UCB Math Dept/Berkeley CA 94720

kort@hounx.UUCP (B.KORT) (03/23/86)

Dear Charles and Peter,

Please understand that I wrote Computer Dialogues #1 and #2 as "flights
of fancy" to imagine some of the problems that might arise when
self-programming computers begin to interact with each other.  I gave
the computers some anthropomorphic emotions, thinly disguised as
diagnostic messages.  My goal was to bridge the gulf between those who
love machines and those who dread them.  I am afraid that I only succeeded
in opening some wounds, and I regret that my work has led to such
expressions of animosity.

For those who are interested in the deeper philosophical issues of the
soul, may I recommend the two short stories by Terrel Miedaner in
The Mind's I.  One is the touching story of a chimpanzee with an
enquiring mind entitled The Soul of Martha, a Beast.  The other is
about a mechanical mouse with a survival instinct entitled The Soul
of the Mark III Beast.

Regards,

Barry

mjn@teddy.UUCP (03/24/86)

> Maybe not, but this only applies to present-day computers.  "Some people
> realize that brain cells don't feel emotions any more than toasters do"...
> doesn't mean that a combination of many brain cells cannot, and the same
> could apply to future computers with many times the capability of today's
> computers.

"Some people realize that brain cells don't feel emotions any more than
toasters do"... doesn't mean that a combination of many toasters cannot, and
the same could apply to future toasters with many times the capability of
today's toasters.
-- 
		Mark J. Norton
		{decvax,linus,wjh12,mit-eddie,cbosgd,masscomp}!genrad!panda!mjn
		mjn@sunspot

rdp@teddy.UUCP (03/25/86)

In article <2321@teddy.UUCP> mjn@teddy.UUCP (Mark J. Norton) writes:
>
>> Maybe not, but this only applies to present-day computers.  "Some people
>> realize that brain cells don't feel emotions any more than toasters do"...
>> doesn't mean that a combination of many brain cells cannot, and the same
>> could apply to future computers with many times the capability of today's
>> computers.
>
>"Some people realize that brain cells don't feel emotions any more than
>toasters do"... doesn't mean that a combination of many toasters cannot, and
>the same could apply to future toasters with many times the capability of
>today's toasters.
>-- 

But, then again, brain cells can't toast bread. (1/2 :-))

lambert@boring.uucp (Lambert Meertens) (03/27/86)

In article <2336@teddy.UUCP> rdp@teddy.UUCP (Richard D. Pierce) writes:
> In article <2321@teddy.UUCP> mjn@teddy.UUCP (Mark J. Norton) writes:
>> "Some people realize that brain cells don't feel emotions any more than
>> toasters do"... doesn't mean that a combination of many toasters cannot, and
>> the same could apply to future toasters with many times the capability of
>> today's toasters.
> But, then again, brain cells can't toast bread. (1/2 :-))

This does not mean that a combination of many brain cells cannot, and the
same could apply to future brain cells with many times the capability of
today's brain cells.
-- 

     Lambert Meertens
     ...!{seismo,okstate,garfield,decvax,philabs}!lambert@mcvax.UUCP
     CWI (Centre for Mathematics and Computer Science), Amsterdam

bing@galbp.UUCP (Bing Bang) (03/27/86)

In article <> mjn@teddy.UUCP (Mark J. Norton) writes:
>
>"Some people realize that brain cells don't feel emotions any more than
>toasters do"... doesn't mean that a combination of many toasters cannot, and
>the same could apply to future toasters with many times the capability of
>today's toasters.

i don't know... i can just see it now; toasters of the future--
computer controlled
advanced from bread bag to plate delivery routines
unlimited networking capacity to other AHAs (Autonomous Household Appliances)
has a multi-lingual, 200,000-word cross-referenced dictionary and
can converse in all known forms of human communication
this product takes real PRIDE in its work!

-- 
"Break, but never bend."		from an oak tree i know
			...that can move in two directions at the same time

...akgua!galbp!bing

ins_akaa@jhunix.UUCP (03/28/86)

>> Maybe not, but this only applies to present-day computers.  "Some people
>> realize that brain cells don't feel emotions any more than toasters do"...
>> doesn't mean that a combination of many brain cells cannot, and the same
>> could apply to future computers with many times the capability of today's
>> computers.
>"Some people realize that brain cells don't feel emotions any more than
>toasters do"... doesn't mean that a combination of many toasters cannot, and
>the same could apply to future toasters with many times the capability of
>today's toasters.

You are actually quite correct.  There's one problem here.  Toasters can store
perhaps two or three bytes of information.  Consider how many toasters
would be required to be as complex as a human brain.

And as for the future toasters, toasters' primary function is to affect
items of a definite physical size (toast).  Future toasters with many times
the capacity of ours would also be many times the SIZE.  This doesn't apply
to computers; just because it has 100 times as much memory and goes 100
times as fast doesn't make it 10000 times the size.  So I fear that the
making of intelligent, emotional toasters may be VERY far into the future.
-- 
Kenneth Arromdee                                               |      |
BITNET: G46I4701 at JHUVM, INS_AKAA at JHUVMS                 -|------|-
CSNET: ins_akaa@jhunix.CSNET                                  -|------|-
ARPA: ins_akaa%jhunix@hopkins.ARPA                            -|------|-
UUCP: {allegra!hopkins, seismo!umcp-cs, ihnp4!whuxcc}         -|------|-
                               !jhunix!ins_akaa                |      |

cjr@unirot.UUCP (03/29/86)

In article <2336@teddy.UUCP>, rdp@teddy.UUCP (Richard D. Pierce) writes:
> In article <2321@teddy.UUCP> mjn@teddy.UUCP (Mark J. Norton) writes:
> >> Maybe not, but this only applies to present-day computers.  "Some people
> >> realize that brain cells don't feel emotions any more than toasters do"...
> >> doesn't mean that a combination of many brain cells cannot, and the same
> >> could apply to future computers with many times the capability of today's
> >> computers.
> >"Some people realize that brain cells don't feel emotions any more than
> >toasters do"... doesn't mean that a combination of many toasters cannot, and
> >the same could apply to future toasters with many times the capability of
> >today's toasters.

Wow, here's a man who's had some real experience with toasters. I wish
I could relate to my microwave as well as you seem to relate to your toasters.


> But, then again, brain cells can't toast bread. (1/2 :-))

That's not true. How else would you account for people whose brain cells
are fried?
-- 
Peace,
	CJ			(Charles J. Riordan - unirot!cjr)
				(Public Access Un*x - The Soup Kitchen)

tainter@ihlpg.UUCP (Tainter) (03/31/86)

> You are actually quite correct.  There's one problem here.  Toasters can store
> perhaps two or three bytes of information.  Consider how many toasters
> would be required to be as complex as a human brain.
> Kenneth Arromdee                                               |      |
While we are on the subject of toasters: did you know that if you put a slice
of bread into a toaster and wait a while, a slice of toast will come out?!
Where does that toast come from, and where does the bread go?
--j.a.tainter
P.S.  Does Helen Keller see it come/go if she is in the toaster alone?