bzs@world.std.com (Barry Shein) (01/01/90)
It's a New Year and a New Decade and you're already sick of all the looking-back pieces on tv/radio/etc. That's not INFO-FUTURES' venue; we are only interested in looking back on the 90's, the rest of this century. Towards that end I invite (implore? beg for?) your predictions in various technologies for the coming end of the decade. To show I'm sincere, I'll post mine forthwith:

THE REST OF THE CENTURY
Barry Shein
Software Tool & Die

HARDWARE

THE DESKTOP

Personal computers and workstations will of course become the same thing (it's not clear they were ever terribly distinguishable, other than that next year's PC was this year's workstation.) The time it takes for the last generation or two of chips to reach cheaper workstations will shorten and finally approach zero. Most likely fast, cheap upgrades will be the rage: a little plug-in CPU board for $500 or less which converts your machine to the current chip will be expected, and it will probably become a hardware service checkoff item. This will be driven by governments demanding it as too-rapid obsolescence becomes an issue in procurements, and vendors will respond.

The big problem with this will be memory bandwidth. Some solution will be found, quite possibly by adding another storage hierarchy local to the CPU (e.g. 16MB "caches" on-board.) By the middle of the decade 100MIPS/25MFLOPS CPUs will be commonplace on the desktop, probably with 128MB memory standard and upgradeable to 1GB.

Alongside this we will see parallel workstations become standard. Four CPUs will be considered minimal for your desktop by the middle of the decade; by the end of the decade we may see single chips with the equivalent of 16 CPUs finding their way into desktop models. At that point 100+ CPUs will be a moderately fancy workstation, each CPU running at 100 or more MIPS.
I think somewhere up around 100..250 MIPS the low-end technologies will begin to falter in various ways (more speed-up will require exotic technologies, too expensive for mass-market demands.) The limiting factor may turn out to be that the memory bandwidth is just too expensive for "under $10K" systems. This is basically an economic and marketing observation more than a technological one. This "wall" is what will drive the interest in just putting many CPUs into the box; it's cheap and effective. Expect a lot of interest in asymmetric systems which have lumps of memory local to each CPU (possibly on-chip) and a big lump of shared memory. The point is that this will no longer be the playground of the technology pushers; these will be commodity items with commodity considerations. Of course, higher-end workstations will be available, but very few folks (comparatively) will be interested in them; they'll be mostly special purpose.

HDTV will help a lot; we'll standardize on 1Kx1K color monitors because it will be more important to standardize than to make software understand a zillion different display options. Again, fancier options will be available but the market will be small.

THE HIGHER END

At the high end (minis, mainframes, supers) these will mutate into something you won't think of as "computers"; you certainly won't log into them. They'll all be file servers, communications back-ends, information servers etc. and will run special-purpose OS's tuned to particular applications. They'll mostly be bought for their reliability and archiving capability, and for places where centralized control makes sense. For example, it would be too much of a nuisance to try to keep 1000 workstations up to date on the wire services; it'll be easier to just keep one system up to date and access it over the network. The supercomputer industry will run into massive economic problems simply because not enough sites are interested in their wares to keep them in business and competitive.
Supercomputers might become like supercolliders: the government(s) will commission (say) 16 of them, and private sales will dwindle to non-existent as the drop in demand causes prices to skyrocket to hundreds of millions each. In fact, it's almost like that today. This coming decade people will get over the idea that a big computer is big in size.

THE COMPUTING GESTALT

Computers will become less and less interesting in and of themselves. Computers will be seen as merely connection end-points (appliances, to use Steve Jobs' term) in massive, world-wide networks. It is the networks which people will become obsessed with: how to get services, new and amazing activities fostered by 50 million desktop computers hooked up together in hierarchical nets, global nets.

University academic computing centers will become service organizations to the university library systems (where they've belonged for years, but that's another matter.) The main computing facility will be a massive information server with on-line catalogs, texts, video, sound etc. In turn these collections will be linked to each other around the world. Access to these information servers will become a major source of income for universities as AT&T (&c) hook them up to the general wide-area networks and subscriptions are sold. Universities currently building such systems will leap ahead of universities which haven't. In fact, colleges which haven't built these will mostly go broke as the demographics shift against them. Better public libraries and the Library of Congress will also get into the act. The government will also derive significant income from selling on-line services; this will become significant in government revenues and be seen as almost as big a savior as the state lotteries.

Wide-area fiber-optic lines will be commonplace to all homes and businesses.
This will have been (surprisingly, to some) driven by AT&T (and others: phone companies, BOCs, MCI etc) entering the cable TV business as operators, wiping out most current operators. The programming, of course, will still be delivered by specialists, although small, specialty programming will be more commonplace. Interactive TV will finally start to be introduced by the middle to end of the decade. Given this, a major market for these networks will be calling up movies, music and other programming on demand (the demise of the home VCR.) It will be hard to find the workstation within the home entertainment system.

Businesses will order/sell almost everything on the screen by the end of the decade. This will give rise to interesting arbitrage opportunities as people search the nets for orders and vendors to match up with price mismatches (e.g. causing orders in progress to cancel by under-bidding before the order is filled.) That is, the general product markets will more and more resemble the stock markets, with fast price fluctuations and people making money merely by exploiting others' inability (lack of time, interest, equipment, capital, knowledge) to exploit imbalances within the system. Congress and other governments will investigate the effect of the global product markets becoming too efficient and probably make some of this illegal for a while, at least in some areas. There will also be all kinds of fights for control under the guise of "consumerism" (which will mostly be a thinly veiled attempt to keep small players out.)

Purchasing will start resembling an inverted auction more than the current methods: you will typically float out an offer to buy, say, a particular model refrigerator, to the nets and come back a few hours later to scan the responses.
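The inverted-auction purchasing described above — a buyer floats a request onto the nets and later scans the sellers' responses for the best price — can be sketched in a few lines. This is purely an illustration of the mechanism; the seller names, item names, and prices are all hypothetical, not anything from the post:

```python
# Minimal sketch of an inverted auction: the buyer broadcasts a request,
# each seller that carries the item responds with an asking price, and
# the buyer picks the lowest bid from the collected responses.

def collect_bids(request, sellers):
    """Gather (seller, price) responses from every seller whose
    catalog contains the requested item."""
    bids = []
    for seller, catalog in sellers.items():
        if request["item"] in catalog:
            bids.append((seller, catalog[request["item"]]))
    return bids

def best_bid(bids):
    """The buyer scans the responses and takes the lowest price;
    returns None if no seller responded."""
    return min(bids, key=lambda bid: bid[1]) if bids else None

# Hypothetical example: floating an offer for a refrigerator model.
sellers = {
    "acme-appliance": {"fridge-model-x": 899.0},
    "discount-net":   {"fridge-model-x": 849.0},
    "mega-mart":      {"toaster": 25.0},
}
request = {"item": "fridge-model-x"}
print(best_bid(collect_bids(request, sellers)))  # -> ('discount-net', 849.0)
```

The design choice worth noting is the inversion itself: instead of the seller posting a price and waiting for buyers (a conventional auction), the buyer posts the want and the sellers compete downward.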
Auto dealerships as we now know them may become almost non-existent over this sort of technology; auto makers will just set up demonstration centers around the country for test drives etc, and you'll go home and float your interests. This global market net will be what finally spurs general public interest in the home computer (beyond its intrinsic function in home entertainment centers.) There will be myriad ways to "get into business" with your keyboard; almost every family will have someone moonlighting (if not fully employed) in some way selling, buying, dealing on the nets. This will be a new route out of poverty for the quick and the bright. No one will know the color of your skin or how you're dressed over the nets. There is always enough market in specialty items that big corporations can't touch (e.g. collectibles, handicrafts) to create opportunities.

SOFTWARE

Software will start to become obsolete in the coming decade, at least in its current form. Software will become a service commodity rather than a tangible commodity. When you call up a spreadsheet or word processor or whatever, it will come whizzing in over the net from outside. Your workstation will hold mostly (personal) data and very little software. You will rely upon high-speed network connections to software development houses for your software. You will be charged on a usage basis, and it will be a lot cheaper than it costs now to buy packages, but the vendors will make more money (or about the same) because the audience will be huge and the resistance to spending a few bucks trying out a new package will be almost zero. All the vendors have to sell you on is typing the command which calls up their software (manuals etc will be on-line or separately available; no big problem, another business.) This will pretty much make moot the current piracy issue (for videotapes and other media as well as software.) Expect digital radio and entertainment experiments which put MIDI right into your home.
Parts of the software will require network attachment to function, so it won't be meaningful to copy something. This network part will be real function, not just security tricks. The major advantages will be instant bug fixes and enhancements being transparently incorporated, plus access to other on-line services tuned to the software. Other services like on-line training "films" etc will become big business, also sold on a per-usage basis. Almost the entire software industry will become a network services industry. The current "dataless" workstation is exactly backwards and will become a "software-less" workstation (other than the base operating system etc, whatever it takes to hook up; this will be, as today, part of the machine.) It will mainly hold and manipulate your private data.

SUMMARY

The 90's will become known as "The Telecommunications Decade".

POSTSCRIPT

The century will end with the AI community still claiming that new machines and novel operating systems have to be developed before they can make any progress.
---
        -Barry Shein
Software Tool & Die, Purveyors to the Trade         | bzs@world.std.com
1330 Beacon St, Brookline, MA 02146, (617) 739-0202 | {xylogics,uunet}world!bzs
jaap@hpuamsa.UUCP (Jaap Vegter AEO) (01/02/90)
re: happy new year and happy new decade....

Well, stating that a new decade started on Jan. 1 this year (1990) is incorrect, I think. The first decade ran from year 1 through year 10, and the second decade started when the year 11 came around. Likewise the current decade runs from Jan. 1 1981 through Dec. 31 1990; the next decade starts on Jan. 1 1991, and the next century starts on Jan. 1 2001.

All this assumes that the first year was year 1 and not year 0. This is a logical assumption, since the first day of the first month of every year is January (month 1, not month 0) 1 (and not 0). If one wants to maintain that the first year was 0 (instead of 1), then we should write today's date (January 2, 1990) in Year, Month, Day format as 1990 0 1 (January being month zero and yesterday, New Year's Day, being day zero).

Greetings and a happy new year,
Jaap Vegter, Hewlett-Packard Netherlands.
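The ordinal convention argued above (years counted from 1, no year 0, so each decade spans years 10*(n-1)+1 through 10*n) reduces to a one-line calculation. This sketch just illustrates that arithmetic; the function name is mine, not anything standard:

```python
def ordinal_decade(year):
    """Return the (first, last) year of the decade containing `year`,
    under the convention that year 1 began the first decade, so the
    n-th decade spans years 10*(n-1)+1 .. 10*n."""
    first = ((year - 1) // 10) * 10 + 1
    return (first, first + 9)

# Under this convention 1990 is still part of the 1981-1990 decade,
# and a new decade does not begin until 1991:
print(ordinal_decade(1990))  # -> (1981, 1990)
print(ordinal_decade(1991))  # -> (1991, 2000)
```

The same `(year - 1) // 10` floor division is what makes 2000 the last year of its century rather than the first of the next, which is the crux of the "century starts in 2001" argument.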
cosell@bbn.com (Bernie Cosell) (01/03/90)
jaap@hpuamsa.UUCP (Jaap Vegter AEO) writes:
}re: happy new year and happy new decade....
}Well, stating that a new decade started on Jan.1 this year (1990) is incorrect
}I think. The first decade ran from year 1 through year 10 and the second
}decade started when the year 11 came around. Likewise the current decade
}runs from Jan.1 1981 through Dec.31 1990. The next decade starts on Jan.1 1991
}and the next century starts on Jan.1 2001.

Geez, is this going to pop up on EVERY newsgroup? You are wrong. Decades are conventionally named *cardinally*, not *ordinally*. We did not just complete "the eighth decade of the twentieth century"; we just completed "the eighties" --- that is, the set of ten years (-> decade) which happen to be numbered in the form "198?". We can debate when we think the century/millennium ends, but there's no question about how decades are defined.

  /Bernie\
bakken@cs.arizona.edu (Dave Bakken) (01/03/90)
In article <9001010220.AA17906@world.std.com> bzs@world.std.com (Barry Shein) writes:
>
>Wide-area fiber-optic lines will be commonplace to all homes and
>businesses. This will have been (surprisingly, to some) driven by
>AT&T (and others, phone companies, BOCs, MCI etc) entering the Cable
>TV business as operators, wiping out most current operators.

How I wish!!! Think of all the work we could do at home in our 'jammies! But it's going to be 2010 until even 5% or 10% of American homes have the ``last mile'' installed with fiber optics. Or so I've read.
--
Dave Bakken                               Internet: bakken@cs.arizona.edu
721 Gould-Simpson Bldg                    UUCP: uunet!arizona!bakken
Dept of Computer Science; U of Arizona    Phone: +1 602 621 8372 (w)
Tucson, AZ 85721 USA                      FAX: +1 602 621 4246
ts@uwasa.fi (Timo Salmi LASK) (01/03/90)
In article <10410001@hpuamsa.UUCP> jaap@hpuamsa.UUCP (Jaap Vegter AEO) writes:
>re: happy new year and happy new decade....
>
>Well, stating that a new decade started on Jan.1 this year (1990) is incorrect
>I think. The first decade ran from year 1 through year 10 and the second

Please, please, and no offense meant, but is there any way of avoiding this subject and the finicking? This has been discussed ad nauseam at the beginning of each new decade (whenever it may be :-) in all conceivable media. It ceased to be fun or clever a long time ago.
...................................................................
Prof. Timo Salmi (Site 128.214.12.3)
School of Business Studies, University of Vaasa, SF-65101, Finland
Internet: ts@chyde.uwasa.fi  Funet: vakk::salmi  Bitnet: salmi@finfun
10e@hpcvia.CV.HP.COM (Steven_Tenney) (01/05/90)
>>re: happy new year and happy new decade....
>>
>>Well, stating that a new decade started on Jan.1 this year (1990) is incorrect
>>I think. The first decade ran from year 1 through year 10 and the second
>
>Please, please, and no offense meant, but is there any way of
>avoiding this subject and the finicking. This has been discussed ad
>nauseum at the beginning of each new decade (whenever it may be :-)
>in all conceivable media. It has ceased to be fun or clever a long
>time ago.

Yes, it really is splitting hairs! It really doesn't matter in the long run whether we call last Monday the start of the new decade or not. After all, we could be using the Jewish, Chinese or Moslem calendars instead of the Gregorian calendar. If English-speaking journalism (from Time mag to Tom Brokaw) is the benchmark (Heaven forbid!), then indeed we are in the new decade. If you talk to Arthur C. Clarke (who wrote 2001: A Space Odyssey), then you'd better wait a year. (By the way, Time mag spells Rumania with a u while most of the newspapers spell it Romania! More hair splitting!)
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Steve Tenney                  | "I spied three ships
Hewlett-Packard Corvallis, ORE| They were all sailin' my way.
10e@hpcvia.CV.HP.COM          | I asked the captain of the first
  _ _|***|__                  | ship what his name was and
 |_ _|||                      | how come he didn't drive a truck?
 ( ~~ ~~ )))                  | he said his name was Columbus
  \ == ///                    | an' I just said 'Good Luck!'"
   ||||\\\                    | -Bob Dylan
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++