[sci.virtual-worlds] so-called cyberspace conferences

frerichs@ux1.cso.uiuc.edu (David J Frerichs) (12/13/90)

I originally wrote this for alt.cyberpunk, but I think it applies here just
as well if not more...

I must agree that these "cyberspace conferences" are quite useless.  They
consist of people discussing the implications of the VR representation of
cyberspace... quite a funny thing to talk about, considering that no two
groups of people agree on what cyberspace is, let alone how to represent it
using Virtual Reality.

What would be useful is a conference of researchers and theorists to try
to create some standards for VR and networked VR (my definition of a
cyberspace is any multiuser WAN).  I think many others agree with me that
the time for bullshit is over and the time to treat VR and its various
subcategories as a true field has come.
We need standardization and protocols... not fantasy.

[dfRERICHS
 University of Illinois, Urbana         Designing VR systems that work...
 Dept. of Computer Engineering
 IEEE/SigGraph                          Looking for cyberspace?
 frerichs@ux1.cso.uiuc.edu              Well stop your snivelin', son,
 frerichs@well.sf.ca.us                 you've been in here all along!     ]

cyberoid@milton.u.washington.edu (Robert Jacobson) (12/13/90)

This presumes, of course, that the field wants standards.  While
it is true that some of us are working toward that end, cooperation
in the field is by no means assured.  And that is while the field is
still occupied mostly by researchers.  What happens when the big
companies get in?  One would hope that the spirit of cooperation
which is so often expressed among gatherings of researchers in
the field (and which is more often than not sincerely held) would
become manifest soon and create an ethos of cooperation that can
be sustained when the technology reaches adulthood.
 
The conferences do serve a purpose: to open people's minds to the
possibilities inherent in virtual worlds technology.  Please note,
also, that our Lab is sponsoring an industry symposium intended
to have concrete results like those you ask for, David, and that
there have been more "serious" discussions at SIGCHI and SIGGRAPH.

However, a professional association or working group must
inevitably form.  The means for it are being assembled.  Be
patient and stop plunking down $500 if you've heard the general
rap already, IMHO.

Bob Jacobson
HIT Lab
(replacing moderator cap)

sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) (12/13/90)

In article <12911@milton.u.washington.edu> frerichs@ux1.cso.uiuc.edu (David J Frerichs) writes:
>
>I must agree that these "cyberspace conferences" are quite useless.  They
>
>What would be useful is a conference of researchers and theorists to try
>to create some standards for VR and networked VR (my definition of a
>cyberspace is any multiuser WAN).  I think many others agree with me that
>the time for bullshit is over and the time to treat VR and its various
>subcategories as a true field has come.
>We need standardization and protocols... not fantasy.

Hiya,

    I must disagree with your assessment. If you look at the literature
on how scientific fields progress, it is NOT time to define standards.
In fact, the kinds of meetings we are having now are exactly the kind we
should be having.

    According to Gaines (see ref at end), there are 6 stages in the
development of an information technology:
        Breakthrough - creative advance
        Replication - mimic the breakthrough and gain experience
        Empirical - rules of design based on experience
        Theoretical - underlying theories found
        Automation - theories predict experience and generate rules
        Maturity - routine use of technology

    The average time from Breakthrough to Maturity is 70 years!
Cyberspace and/or virtual reality is at the BREAKTHROUGH stage. What
is worse, the technology does not exist to implement and test most of
the ideas. It is going to be about 20-30 years before there are heavy
underlying theories. Less (3-10 years) for some empirical guidelines.

    I also disagree with your point about Cyberspace vs. Virtual
Reality. At the First Conference on Cyberspace we came up with a
reasonable Empirical distinction. Perhaps you should take a look at
the literature before you damn the conference. Science is not instant:
you do not name a field one year, then break it up into
compartmentalized subfields the next. Take a more realistic view of
science (read Thomas S. Kuhn; that ought to open your eyes).

    If you are really interested in what was accomplished at the
conference, I can send you a copy of my summary of the presentations.
It mentions the Cyberspace/VR difference.

        maurice

Gaines, B. R. "From Information Technology to Knowledge Technology."
Future Computing Systems, Vol. 2, No. 4. Oxford University Press.


-- 
Maurice Sharp MSc. Student (403) 220 7690
University of Calgary Computer Science Department
2500 University Drive N.W.            sharp@cpsc.UCalgary.CA
Calgary, Alberta, T2N 1N4             GEnie M.SHARP5

cgy@cs.brown.edu (Curtis Yarvin) (12/14/90)

In article <1990Dec13.093343.8402@cpsc.ucalgary.ca> sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) writes:
>
>In article <12911@milton.u.washington.edu> frerichs@ux1.cso.uiuc.edu (David J Frerichs) writes:
>>
>>I must agree that these "cyberspace conferences" are quite useless.  They
>>
>>What would be useful is a conference of researchers and theorists to try
>>to create some standards for VR and networked VR (my definition of a
>>cyberspace is any multiuser WAN).  I think many others agree with me that
>>the time for bullshit is over and the time to treat VR and its various
>>subcategories as a true field has come.
>>We need standardization and protocols... not fantasy.
>
>Hiya,
>
>    I must disagree with your assessment. If you look at the literature
>on how scientific fields progress, it is NOT time to define standards.
>In fact, the kinds of meetings we are having now are exactly the kind we
>should be having.
>
>    According to Gaines (see ref at end), there are 6 stages in the
>development of an information technology:
>        Breakthrough - creative advance
>        Replication - mimic the breakthrough and gain experience
>        Empirical - rules of design based on experience
>        Theoretical - underlying theories found
>        Automation - theories predict experience and generate rules
>        Maturity - routine use of technology
>
>    The average time from Breakthrough to Maturity is 70 years!
>Cyberspace and/or virtual reality is at the BREAKTHROUGH stage. What
>is worse, the technology does not exist to implement and test most of
>the ideas. It is going to be about 20-30 years before there are heavy
>underlying theories. Less (3-10 years) for some empirical guidelines.

We just got an off-scale reading on the good ol' bogometer here.  70 years?
"Information technology" itself (unless you define it as double-entry
bookkeeping & other heavy-duty paperwork) has been around for less than 50
years.  I don't know who this Gaines guy is (some freshwater professor of
sociology?), but his "stages" are so vague as to be entirely useless.
In fact, I don't even know what he means by "an" information technology.  So
let's try some guesses:

        (1) A new methodology of hardware design.
                This probably comes closest to Gaines's stage system.  Let's
take RISC as an example of a hardware breakthrough.  The first RISC machine
was the CDC 6600.  This was the result of a creative breakthrough by Seymour
Cray, around 1964.  It was replicated in the mid-1970s by Cocke and others
at IBM, culminating in the IBM 801.  The "Empirical" stage was pretty much
skipped; Patterson was coming up with some "theoretical" results in the
early 80s, and RISC microprocessors were beginning to be fabricated.  Right
now the technology is definitely at maturity.  All processors introduced in the
last two years incorporate RISC design concepts; although (like the 486 and
040) they may have complex instruction sets, they are highly pipelined and
lack vertical microcode.  Total time elapsed?  About 25 years.

        (2) A new methodology of software design.

                Software advances have not followed Gaines's model.  Rather,
they have moved on a slow but steady curve.  One important feature of the
curve is occasional penduluming: the gradual realization that compilers are
good, followed by the discovery that HLLs can be really slow if they're
badly designed (e.g. COBOL); the gradual realization that structured
programming cuts down on bugs, followed by the discovery that it could also
be quite anal (e.g. Pascal); the gradual acceptance of object-oriented
models, followed by the gradual ostracism of dynamic binding from those
models.

        (3) A new way of thinking about information processing.

                These have followed the exact opposite of Gaines's path.
Let's take symbolic AI as an example.  First a brilliant researcher
propounds an elegant theory that purports to explain everything.  But
extending the theory downward into the cold, smelly mud of reality proves
difficult; and it eventually sinks under its own weight.

Now where does VR fit into this classification?  It consists mostly of the
first and the third.  There are people trying to build it; and there are
people trying to concoct theories about it.  The examples presented above,
as I'm sure Frerichs is quite aware, indicate that we need a lot more
of the former, and a lot fewer of the latter.  One Eric Pepke is worth a
thousand Timothy Learys.  VR is going to happen, whether people theorize
about it or not; but we don't want to let the theorists suck grant money
away from the genuine researchers.

>Take a more realistic view of
>science (read Thomas S. Kuhn; that ought to open your eyes).

An excellent book - he takes great care to define his terms, which is the
most important thing in the philosophy business.  But it doesn't apply.
"Knowledge technology" is (gasp, shudder) not a science.  It's an art.
It is not based on hard data, and theories presented in it are not
falsifiable - which means they're meaningless.  Read Popper.
>
>Gaines, B. R. "From Information Technology to Knowledge Technology."
>Future Computing Systems, Vol. 2, No. 4. Oxford University Press.
>
>Maurice Sharp MSc. Student (403) 220 7690

Curtis

"I tried living in the real world
 Instead of a shell
 But I was bored before I even began." - The Smiths

sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) (12/15/90)

Hiya,

    A reply to the reply...

In article <12979@milton.u.washington.edu> cgy@cs.brown.edu (Curtis Yarvin) writes:
>
>In article <1990Dec13.093343.8402@cpsc.ucalgary.ca> sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) writes:
[stuff about ref to how information technology develops]
>
>We just got an off-scale reading on the good ol' bogometer here.  70 years?

Sorry, my mistake; the actual quote is (p. 392):

"The line of product innovation marks the practical availability of
the various stages of new technology, and it lags the line of
invention by 16 years, and in its turn is lagged by the line of low
cost products by 16 years. Thus there is a 16-year gap between
invention and significant application, and a 32-year gap between
invention and mass production"

I was looking at invention-to-next-invention time lines.
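
To make the lag arithmetic concrete, here is a minimal sketch (Python; the
helper name and the 1968 starting year are just hypothetical examples, not
figures from the paper):

    # Gaines's lag figures, as quoted above: significant application lags
    # invention by 16 years, and mass production by another 16.
    INVENTION_TO_APPLICATION = 16
    APPLICATION_TO_MASS_PRODUCTION = 16

    def lag_timeline(invention_year):
        """Return (significant application, mass production) years."""
        application = invention_year + INVENTION_TO_APPLICATION
        mass_production = application + APPLICATION_TO_MASS_PRODUCTION
        return application, mass_production

    # Hypothetical example: something invented in 1968 would see
    # significant application around 1984 and mass production around 2000.
    print(lag_timeline(1968))   # (1984, 2000)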

>"Information technology" itself (unless you define it as double-entry
>bookkeeping & other heavy-duty paperwork) has been around for less than 50
>years.  I don't know who this Gaines guy is (some freshwater professor of
>sociology?), but his "stages" are so vague as to be entirely useless.
>In fact, I don't even know what he means by "an" information technology.  So
>let's try some guesses:

First, yes, it is 50 years, with 1940 as the start time. Second, the
'Gaines guy', as you put it, is TOP in his field (Knowledge Acquisition)
and a recognized expert on the social effects of technology. Not exactly a
freshwater professor, unless you call a Ph.D. from Cambridge freshwater.
And as to vague stages, I am sure most of your stuff looks vague when
quoted. Try reading the paper; that is how research and science works.

As to the other stuff you presented: try reading the paper before
commenting on the vague terms. Perhaps the key factor I left out is
why things have followed cycles as described. Take your example of
hardware development.

If you plot the devices per chip versus year as a linear graph, you
get almost nothing from 1956 to 1980. Then it curves up like crazy.
This makes it look like there was one innovation. This is just not
true. If you plot it as a series of linear segments, taking each major
change in density as a breaking point, it clearly shows several
innovations: 1956-1959 0 to 1 device, 1959-1964 1 to 20 devices,
1964-1972 20 to 5,000 devices, 1972-1980 5,000 to 500,000 devices,
1980-1988 500,000 to 20,000,000 devices. A slight adjustment in point
of view, and bingo, the idea makes more sense.
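
As a rough illustration, here is a quick sketch (Python, using only the
figures above; nothing in it comes from the Gaines paper itself):

    # Devices per chip at the breaking points listed above.
    generations = [
        (1956, 0), (1959, 1), (1964, 20), (1972, 5000),
        (1980, 500000), (1988, 20000000),
    ]

    # On a single linear axis everything before 1980 is squashed flat, but
    # the growth factor within each segment shows a separate jump each time.
    for (y0, d0), (y1, d1) in zip(generations, generations[1:]):
        factor = "n/a" if d0 == 0 else "x%d" % (d1 // d0)
        print("%d-%d: %10d -> %12d devices  (%s)" % (y0, y1, d0, d1, factor))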

The bottom line is, if you take the stages given, you can show a
revolutionary change in every generation of computers (the 5 above,
now the 6th).

The paper even predicts cyberspace/VR ideas, before there was even a
conference!!

>falsifiable - which means they're meaningless.  Read Popper.

I have read Popper, and I like it :-) I am surprised that you did not
get a copy of the paper before criticising it. Read it; he presents the
argument in more detail, and more convincingly than I can.

The bottom line is it is still too early for hard theories of
cyberspace and VR. Wait for 10 or 15 years, then we may see some.
Until then, we will have to be happy with design principles. More an
art of cyberspace/VR creation than a science.

        maurice


-- 
Maurice Sharp MSc. Student (403) 220 7690
University of Calgary Computer Science Department
2500 University Drive N.W.            sharp@cpsc.UCalgary.CA
Calgary, Alberta, T2N 1N4             GEnie M.SHARP5

jwtlai@watcgl.waterloo.edu (Jim W Lai) (12/17/90)

In article <1990Dec15.073629.20435@cpsc.ucalgary.ca>
sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) writes:
>"The line of product innovation marks the practical availability of
>the various stages of new technology, and it lags the line of
>invention by 16 years, and in its turn is lagged by the line of low
>cost products by 16 years. Thus there is a 16-year gap between
>invention and significant application, and a 32-year gap between
>invention and mass production"

>The bottom line is it is still too early for hard theories of
>cyberspace and VR. Wait for 10 or 15 years, then we may see some.
>Until then, we will have to be happy with design principles. More an
>art of cyberspace/VR creation than a science.

My opinion is that these invention gaps apply better to hardware than to
software.  The windowed interface that we see on PCs today was pioneered
by Xerox PARC in the 1970s.  The mouse was invented in the late 1960s, I
believe.  User interaction is less easily developed because it is partially
an art as well as a developing science.  Software is not driven by economies
of scale, in that mass release (i.e., mass production) is readily possible
given a widely-used hardware platform.

I guess my point is that VR software/applications should be considered
distinct from a VR platform.  In the typical examples, the "software"
(signal processing) aspect is relatively trivial, e.g. HDTV.  Trying to
assign times to software and user-interface development is riskier than
doing so for hardware, partly because there is no simple measure.

cgy@cs.brown.edu (Curtis Yarvin) (12/18/90)

In article <1990Dec15.073629.20435@cpsc.ucalgary.ca> sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) writes:
>
>Hiya,
>
>    A reply to the reply...
>
>In article <12979@milton.u.washington.edu> cgy@cs.brown.edu (Curtis Yarvin) writes:
>>
>>In article <1990Dec13.093343.8402@cpsc.ucalgary.ca> sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) writes:
>[stuff about ref to how information technology develops]
>>
>>We just got an off-scale reading on the good ol' bogometer here.  70 years?
>
>Sorry, my mistake; the actual quote is (p. 392):
>
>"The line of product innovation marks the practical availability of
>the various stages of new technology, and it lags the line of
>invention by 16 years, and in its turn is lagged by the line of low
>cost products by 16 years. Thus there is a 16-year gap between
>invention and significant application, and a 32-year gap between
>invention and mass production"

Okay, that's better.  But I still want to know - where does he get his
numbers? (sorry, no time to read the paper)  Does he make a list of
revolutionary inventions, and average them, noting that the standard
deviation is sufficiently small to make his conclusions statistically
valid?  Or does he read some computer history books, say, "hmm, 16, 32, nice
round numbers, seems like things go about that speed?"  Is this science, or
just plain numerology?  Even great men are sucked into such traps, when
there are no facts around to work with.  Not sure if I have the astronomer
right, but I think Kepler spent many of his later years looking at the
numerological consequences of his theories.

>First, yes, it is 50 years, with 1940 as the start time. Second, the
>'Gaines guy', as you put it, is TOP in his field (Knowledge Acquisition)
>and a recognized expert on the social effects of technology. Not exactly a
>freshwater professor, unless you call a Ph.D. from Cambridge freshwater.
>And as to vague stages, I am sure most of your stuff looks vague when
>quoted. Try reading the paper; that is how research and science works.

As I say, I don't really have time to read the paper.  So I'll just strain
your patience by asking you to defend it.  As for the qualifications...
well, Joan Quigley is one of the top people in astrology these days.  But
would you buy a used car from her?  Your argument is ad hominem in the
reverse direction.

And sure, my stuff is vague.  Part of it is because I don't have time or
space to define all my terms.  But mostly it's because I'm trying to present
a meta-theory: that all theorizing about this subject is inherently bullshit.

>
>As to the other stuff you presented: try reading the paper before
>commenting on the vague terms. Perhaps the key factor I left out is
>why things have followed cycles as described. Take your example of
>hardware development.

no, No, NO!  This is NOT the scientific method.  There are two methods of
doing science: the inductive, and the deductive.  Since the 1800s, the
deductive method has been abandoned for everything except mathematically
provable sciences - that is to say, math and physics.  In all fields where
propositions cannot be stated in mathematical terms, it is necessary to use
the inductive method.  So we can't ask "why?", only "what" and "whether."

>
>If you plot the devices per chip versus year as a linear graph, you
>get almost nothing from 1956 to 1980. Then it curves up like crazy.
>This makes it look like there was one innovation. This is just not
>true. If you plot it as a series of linear segments, taking each major
>change in density as a breaking point, it clearly shows several
>innovations: 1956-1959 0 to 1 device, 1959-1964 1 to 20 devices,
>1964-1972 20 to 5,000 devices, 1972-1980 5,000 to 500,000 devices,
>1980-1988 500,000 to 20,000,000 devices. A slight adjustment in point
>of view, and bingo, the idea makes more sense.
>
>The bottom line is, if you take the stages given, you can show a
>revolutionary change in every generation of computers (the 5 above,
>now the 6th).
>

Bollocks.  Gaines is looking at things bass-ackwards; he's only seeing the
results, not the causes.  "0 to 1 device, 1 to 20 devices" is not an
innovation, or even a "breaking point."  "Development of an ultraviolet
photolithographer" is an innovation.  Now if you want to look at chip
densities... the history of increasing integration is the history of
developments in condensed matter physics, and chemical and mechanical
engineering.  These are fields which move extremely smoothly.  There hasn't
been just one innovation in ICs since 1956; there haven't been six.  More
like six thousand.  This is an inherently evolutionary field which has been
advancing at a continuously exponential rate for a long time, and shows no
signs of stopping.  Gaines's theory of "revolutionary change" bites the dust
on this one, and it makes me very suspicious of him; I know next to nothing
about condensed matter physics, but I am beginning to get this creeping
feeling that Gaines wouldn't know a chip if he sat on one.

>The paper even predicts cyberspace/VR ideas, before there was even a
>conference!!

So did Gibson.  And what does Gibson know about computers?  Nothing, by his
own admission.
>
>>falsifiable - which means they're meaningless.  Read Popper.
>
>I have read Popper, and I like it :-) I am surprised that you did not
>get a copy of the paper before criticising it. Read it; he presents the
>argument in more detail, and more convincingly than I can.

Sorry, no time.  Obviously I'm missing something - Gaines can't be that dumb.
But then again I'm in a terribly optimistic mood this morning...

>
>The bottom line is it is still too early for hard theories of
>cyberspace and VR. Wait for 10 or 15 years, then we may see some.
>Until then, we will have to be happy with design principles. More an
>art of cyberspace/VR creation than a science.

Well, I partially agree.  It's not a science.  But I don't think we'll ever
see any theories.  Maybe it's just my cold, cynical heart coming back to
haunt me...

>        maurice
>2500 University Drive N.W.            sharp@cpsc.UCalgary.CA

Curtis

"I tried living in the real world
 Instead of a shell
 But I was bored before I even began." - The Smiths

sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) (12/18/90)

In article <13119@milton.u.washington.edu> jwtlai@watcgl.waterloo.edu (Jim W Lai) writes:
>
>My opinion is that these invention gaps apply better to hardware than to
>software.  The windowed interface that we see on PCs today was pioneered
>by Xerox PARC in the 1970s.  The mouse was invented in the late 1960s, I
>believe.

Hiya,

    Yes, the first reasonably complete interface was created at PARC.
BUT the ideas and inventions occurred much earlier. An object-oriented,
direct-manipulation drawing system existed in the '60s! It used a
light pen instead of a mouse, but the ideas were the same. In reality,
you have supported my point. The Star user interface was actually
implemented a few years earlier by David Canfield Smith for his thesis
(in Smalltalk!).


        maurice


-- 
Maurice Sharp MSc. Student (403) 220 7690
University of Calgary Computer Science Department
2500 University Drive N.W.            sharp@cpsc.UCalgary.CA
Calgary, Alberta, T2N 1N4             GEnie M.SHARP5

pezely@cis.udel.edu (Daniel Pezely) (12/19/90)

cgy@cs.brown.edu (Curtis Yarvin) writes:
>sharp@cs-sun-fsd.cpsc.ucalgary.ca (Maurice Sharp) writes:
>>The paper even predicts cyberspace/VR ideas, before there was even a
>>conference!!
>
>So did Gibson.  And what does Gibson know about computers?  Nothing, by his
>own admission.

Folks, 

Cyberspace, VR, or whatever else you wish to call it, is not an invention
of any single person; it is what we all, ideally, want computers to be.
[That's we as in everyone, not just c.s., e.e., etc., people.]

The desire for tools that are actually useful, without interfering with
the task they were intended for, has been around for a very long time.

It's only now that the technology is ALMOST capable of doing what we want.

As interactive as some computer systems are today, they're still pretty
poor.  Even all of the VR systems that are displayed at conferences are
lame.  In many people's ideal systems, you shouldn't have to learn a
gesture language to move about in a virtual environment; if it's not
completely intuitive, then there's more to be done yet.

But it's closer to what we want than other things have been in the past!


Relating all this to what's on the subject line:
Researchers of all kinds are discovering new things stemming from
virtual realities, so there is a bit more to it than anyone could have
imagined, but then again, isn't that true of most technologies and
sciences?  

That is what the conferences are for: not implementation details of VR
systems, but applications, findings, and new ideas.  The VR frenzy at
SIGGRAPH or any of the other ACM conferences that might have VR stuff
deals with applications in that area.  A single, true-VR system will be
discussed in more groups and at more conferences than any of us can
imagine.  You'd be surprised at who is talking about it...


-Daniel

PS - Guys: pretty soon, you should start cross-posting to alt.flame  :-)