[sci.electronics] I don't need HDTV!

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/15/90)

  It must be really frustrating trying to come up w/ an HDTV standard:  I mean
these guys (the ones making the standard) have to come up with the LAST WORD
in TV standards.  The standard has to be something we're willing to live with 
for AT LEAST the next 50 years!  Sure, 50 years from now, the things will be
cheap to make, but now they're going to be VERY expensive.  Especially if they
make no compromises so we'll still be happy with the standard in 50 years...

  But you know all that.

  I'd be perfectly happy to settle for the television picture tubes of today!
It's the signal that's so horrendous!  I mean:  If you've ever seen the
output of a laser disc player, it's awesome!  It shows what your picture tube
can do.  It's the low signal to noise ratio of the broadcast signals that we
receive that's horrendous!

  I commend the FCC for requiring that the new format signals be backwards
compatible, but I think that the standard could easily offer more than just
backwards compatibility.  Here's what I mean:

+----+--------------+----+    I've heard of one standard which would break
|    |              |    |  your picture into at least four signals:  First
| A  |      B/C     | D  |  a standard TV picture is sent (B).  This provides 
|    |              |    |  the requisite backwards compatibility.  Then 
|    |              |    |  image C is sent.  This image is interlaced between
|    |              |    |  the lines from picture B.  Third, zones A&D are
+----+--------------+----+  sent to provide HDTV users w/ a movie-box picture.
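(A rough sketch of the decomposition described above, in Python. The frame dimensions and panel widths here are made up for illustration -- they don't come from any actual proposal.)

```python
# Rough sketch of the layered-signal idea: signal B carries the even
# lines of the picture center (the backwards-compatible NTSC image),
# C the odd lines (interleaved extra vertical resolution), and A/D
# the side panels that widen the image to cinema shape.  All sizes
# are invented: a 960-line x 1408-pixel frame with 176-pixel panels.

def split_frame(frame, side=176):
    """Split one HD frame into the B, C, A, D component signals."""
    center = [row[side:-side] for row in frame]   # drop the side panels
    b = center[0::2]                              # even lines -> NTSC picture
    c = center[1::2]                              # odd lines -> extra detail
    a = [row[:side] for row in frame]             # left panel
    d = [row[-side:] for row in frame]            # right panel
    return b, c, a, d

def rebuild_hd(b, c, a, d):
    """An HDTV set re-interleaves C between B's lines, re-attaches A/D."""
    center = [line for pair in zip(b, c) for line in pair]
    return [la + row + ld for la, row, ld in zip(a, center, d)]

frame = [[(y, x) for x in range(1408)] for y in range(960)]
b, c, a, d = split_frame(frame)
assert rebuild_hd(b, c, a, d) == frame   # lossless for HD viewers
assert len(b) == 480                     # B alone is a plain 480-line picture
```

An old set tunes only B; a digital tuner adds C for a noise-free full-resolution center; a full HDTV set reassembles everything.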

  If a standard such as this were adopted (which I highly doubt it will be),
there would be an EASILY EXTRACTED HD IMAGE OF THE NORMAL PICTURE.  Ok, so
it's not exactly the same image (it's the lines in between the normal NTSC
lines), but it could easily be extracted.  And if images A,C,&D are sent
digitally w/ error correction, a (relatively) cheap tuner could be built
to extract the high-quality image C, convert it to analog, and display this
VIRTUALLY NOISE-FREE picture on your old TV!

  This way, consumers would have a choice of three levels of TV quality,
depending upon what they could afford:  They could use the cheap old NTSC,
or the crisp new digital NTSC, or the dramatic full-performance HDTV!

  I personally don't think the above standard will (or even should) be 
implemented as I have described it, but I highly encourage the HDTV Standards
Committee to use a format which includes an easily tuned digital encoding
of the current NTSC signal.

--Glenn

billd@fps.com (Bill Davidson) (03/16/90)

In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
>  I'd be perfectly happy to settle for the television picture tubes of today!
>It's the signal that's so horrendous!  I mean:  If you've ever seen the
>output of a laser disc player, it's awesome!  It shows what your picture tube
>can do.  It's the low signal to noise ratio of the broadcast signals that we
>receive that's horrendous!

I disagree completely.  Resolution is sickly compared to what it could
be.  Have you ever seen a high resolution screen?  Laser disks are as
good as it gets on a regular TV but the resolution is still pitiful.
The pixels are huge and there's no signal that can possibly fix that.
Maybe I'm spoiled by looking at 1000+ line computer screens so much
(I'd like HDTV to be even higher resolution but that would *really* be
expensive).

>  I commend the FCC for requiring that the new format signals be backwards
>compatible, but I think that the standard could easily offer more than just
>backwards compatibility.

[idea deleted --billd]

How long do we have to carry around the baggage of a standard that was
designed so long ago that it can't even get the colors right most of
the time?  Color was an add-on and the implementation suffered in order
to maintain compatibility with old black and white sets.  At some point
you have to say "enough is enough".  We can do so much better now.  We
know a lot more about video signals than we did when NTSC was designed.
Also, the frame rate is annoying.  It destroys resolution when
converting 24 frame/sec film to video due to frame mixing.  I want a
standard with at least 1000 lines and a 72Hz frame rate.  Wide screen
would be nice for films and square pixels would be nice for computer
graphics.  Why suffer with the old forever?  Dismissing it just because most
people won't be able to afford it is ridiculous.  Most people couldn't afford
pocket calculators when they first came out (or TV's, or cars or most
other major new technologies).  We need to define a standard that is
good and is doable and which can be foreseen to become cheap with time.
It doesn't have to be cheap now.  It would be nice if it was an
international standard as well so that video tapes and laserdisks will
work anywhere.  It took CD's 5-6 years to really break into the US market.
A lot of people thought they were ridiculous when they first became
available.  They were very expensive (both the players and the disks).
Now it's getting hard to find record stores that have more vinyl than
aluminum coated plastic.  It took laser video even longer (it's back
and gaining a lot of momentum right now).  HDTV will be the same story.
We have generations of people now who grew up watching TV and they are
getting more and more demanding of quality video.
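(Bill's frame-rate complaint is easy to illustrate. The sketch below assumes the usual 3:2 pulldown cadence for putting 24 frame/sec film onto ~60 field/sec video; the frame labels are invented.)

```python
# Why 24 frame/sec film is awkward at NTSC's ~60 fields/sec but clean
# at 72 Hz: with 3:2 pulldown, film frames alternately span 3 and 2
# video fields, so adjacent fields sometimes come from different film
# frames (the "frame mixing" complained about above); at 72 Hz every
# film frame is simply shown three times.

def pulldown_32(film_frames):
    """Map film frames to video fields using the 3:2 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

def repeat_3x(film_frames):
    """72 Hz display: each film frame shown exactly three times."""
    return [f for f in film_frames for _ in range(3)]

film = ['A', 'B', 'C', 'D']       # 4 film frames = 1/6 second
fields_60 = pulldown_32(film)     # 10 fields, as the 60/24 ratio demands
frames_72 = repeat_3x(film)       # 12 frames: the 72/24 ratio is exact

assert fields_60 == list('AAABBCCCDD')
assert frames_72 == list('AAABBBCCCDDD')
# Fields 2 and 3 pair an 'A' with a 'B' -- two different film frames
# woven into one displayed moment:
assert (fields_60[2], fields_60[3]) == ('A', 'B')
```

Because 72 is an integer multiple of 24, the cadence never mixes frames.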

--Bill

toddpw@tybalt.caltech.edu (Todd P. Whitesel) (03/16/90)

billd@fps.com (Bill Davidson) writes:

[ To Glenn, who said that NTSC pix are ok, it's the signal quality that bites ]

>I disagree completely.  Resolution is sickly compared to what it could
>be.  Have you ever seen a high resolution screen?  Laser disks are as
>good as it gets on a regular TV but the resolution is still pitiful.
>The pixels are huge and there's no signal that can possibly fix that.
>Maybe I'm spoiled by looking at 1000+ line computer screens so much
>(I'd like HDTV to be even higher resolution but that would *really* be
>expensive).

You're spoiled. Think about what HDTV is going to get used for: NOT for
computer graphics. It will be BROADCAST and nobody in his right mind is
going to need to broadcast 24 bit megapixel to the general public....

Don't ask broadcast television to transmit N pixels of RGB, because it won't.
It transmits real world images which do not have a pixel resolution, and does
a pretty good job if the signal is intact. These pictures also don't change
colors very fast the way graphics can, and NTSC was designed to exploit both
that and the inherent color resolution of our eyes. Once you get used to
megapixel displays, you really do notice the fringing on NTSC but IMHO it is a
small price to pay for the picture quality you get for the price in both
circuitry and broadcast power.

I agree with Glenn. The key is preserving the NTSC signal quality, improving
the color synchronization, and expanding the picture to cinema size. A
standard with these features will gain instant acceptance from its market
and is feasible to implement. Transmitting RGB quality pictures is expensive
and pointless because your picture quality will get nuked right away anyway.
The other nice thing about low resolution on NTSC is that it is robust against
noise. RGB video that normally requires shielded cable is not.

[ Glenn wrote this ]
>>  I commend the FCC for requiring that the new format signals be backwards
>>compatible, but I think that the standard could easily offer more than just
>>backwards compatibility.

>How long do we have to carry around the baggage of a standard that was
>designed so long ago that it can't even get the colors right most of
>the time?

That's because your TV's color decoder sucks or the signal was recorded
by a lousy camera. If you continue to insist that television deliver
workstation quality color graphics then I suggest you invent a new standard to
implement it, because HDTV will not be used for graphics by most of its
intended market.

> Color was an add-on and the implementation suffered in order
>to maintain compatibility with old black and white sets.

Oh? How would you have done it? You forget the installed base of black
and white TV sets, and the inherent cost difference which causes b&w sets
to still be manufactured and sold to this day. There is also transmission
bandwidth to consider.

> At some point
>you have to say "enough is enough".  We can do so much better now.  We
>know a lot more about video signals than we did when NTSC was designed.

Very true. But what you are suggesting is impractical overkill that gratifies
graphics purists and is engineering hell to the standards committee.

>Also, the frame rate is annoying.  It destroys resolution when
>converting 24 frame/sec film to video due to frame mixing.  I want a
>standard with at least 1000 lines and a 72Hz frame rate.  Wide screen
>would be nice for films and square pixels would be nice for computer
>graphics.

Your frame rate is a good idea, because the primary benefit of HDTV will be
movie quality video rentals and cable. I don't know if a full 1000 lines
would be required, and I somehow doubt it. 800 sounds ok for a wide picture
but I'll defer to the movie experts.

Where do you get square pixels? You're practically asking them to define an
exact display and graphics transmission standard along with it! Who's going to
use that, besides Pixar and Apple Computer?

> Why suffer with the old forever?

Because it addresses the realities of its intended use much more effectively
than what you're suggesting could ever hope to.

> Dismissing it just because most people
>won't be able to afford it is ridiculous.  Most people couldn't afford
>pocket calculators when they first came out (or TV's, or cars or most
>other major new technologies).

No, it's the installed base. You can't just tell them to buy new boxes because
they'll tell you to go screw. After all, if you talk them into paying you for a
whole new set of equipment, couldn't you just do it again in 20 years when this
standard is 'obsolete'?

> We need to define a standard that is
>good and is doable and which can be foreseen to become cheap with time.
>It doesn't have to be cheap now.

Yes it does. We want movie quality television, more or less cheap, and now.

> It would be nice if it was an
>international standard as well so that video tapes and laserdisks will
>work anywhere.  It took CD's 5-6 years to really break into the US market.
>A lot of people thought they were ridiculous when they first became
>available.  They were very expensive (both the players and the disks).
>Now it's getting hard to find record stores that have more vinyl than
>aluminum coated plastic.  It took laser video even longer (it's back
>and gaining a lot of momentum right now).  HDTV will be the same story.

No it won't. You just mentioned some storage media, which are displayed
through cables to a monitor nearby. These can have almost infinite quality
because the wires are shielded and the signal quality does not degrade.
As soon as you broadcast via radio, you have many engineering restrictions
on your picture quality; NTSC addresses them very well regardless of its age as
a standard. It does lack signal correction, but for many people the improved
quality and wider screen will make them happy campers.

>We have generations of people now who grew up watching TV and they are
>getting more and more demanding of quality video.

They can get it from a laser disk, or on their computer's monitor. In the world
of broadcast television it will end up more like Glenn's idea, because without
signal correction all the resolution in the world won't look much better than
NTSC anyway, especially when you start throwing noise and ghosts and other
radio phenomena at the signal.

The basic point I'm making is: there's a fundamental difference between your
monitor cable and a radio transmitter. You can't get the quality of a shielded
cable from a radio transmission unless you use very good error correction and
even then it takes a lot of power and bandwidth to transmit. I can see a few
special purpose channels to do this but for consumer television it is simply
not worth the cost.

What you want deserves its own standard. The quality you'd like to see cannot
be broadcast cost-effectively so why not declare it to be only for storage
media and then go all out on the picture quality: 24 bit color, 2Kx1K pixel,
CD-ROM or better error correction, etc..

But please do not ask anyone to figure out how to transmit it via radio. Its
time will come when cable and fiber optic networks become the standard means
of communication. Until then, 'NTSC on steroids' will be the better choice for
movies and television.

Todd Whitesel
toddpw @ tybalt.caltech.edu

bill@bilver.UUCP (Bill Vermillion) (03/16/90)

In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
 
>  It must be really frustrating trying to come up w/ an HDTV standard:  I mean
>these guys (the ones making the standard) have to come up with the LAST WORD
>in TV standards.  The standard has to be something we're willing to live with 
>for AT LEAST the next 50 years!  Sure, 50 years from now, the things will be
>cheap to make, but now they're going to be VERY expensive.  Especially if they
>make no compromises so we'll still be happy with the standard in 50 years...

We've already had current video standards for 50 years (though the color
portions a bit less than that) and it's time for a change.  If you consider
that the LP is virtually dead now, it only lasted 41.
 
>  I'd be perfectly happy to settle for the television picture tubes of today!
>It's the signal that's so horrendous!  I mean:  If you've ever seen the
>output of a laser disc player, it's awesome!  It shows what your picture tube
>can do.  It's the low signal to noise ratio of the broadcast signals that we
>receive that's horrendous!

The output of a laser player isn't awesome.  It's good, but you can surely see
the limits of NTSC video if you watch it on anything bigger than a 21" set.  I
have had a laser player for several years now.  The early discs are poor
compared to today's technology, but we are pretty much at the limits.  I
assume you are a recent disc convert.   S/N is only one problem.   Have you
ever noticed how the color is in little dots at the transition points, how the
resolution isn't as good as a clean 16mm print, let alone 35?
 
>  I commend the FCC for requiring that the new format signals be backwards
>compatible, but I think that the standard could easily offer more than just
>backwards compatibility.  .....

I spent years in broadcast, and have seen many changes in the FCC, and I am
not too particularly impressed with their performance in the past few years.
They seem to be a non-regulatory regulatory agency.  Part of the time it's
hands off, other times it's hands on for the wrong reasons.  They totally blew
the AM stereo standards by refusing to take a stand.

Years ago, here in Orlando, a disk jockey "locked" himself in a radio station
control room and played Sheb Wooley's "Monkey Fever" for 24 hours straight.
His name was Mark Fowler.  He didn't do much better when he was chairman of
the FCC.


-- 
Bill Vermillion - UUCP: uunet!tarpit!bilver!bill
                      : bill@bilver.UUCP

news@haddock.ima.isc.com (overhead) (03/16/90)

In article <530@bilver.UUCP> bill@bilver.UUCP (Bill Vermillion) writes:
>If you consider that the LP is virtually dead now, it only lasted 41.

I never liked LPs.  Even an audiophile LP has pops & clicks, even
on the first play.  It gets worse with each play.  A low-grade audio
cassette doesn't have such terrible distractions or sound degradation.
We could have dumped LPs long ago.  The industry kept pushing
vinyl.  They kept saying that it was "better".

A friend has a nice 25 inch monitor TV.  It was real expensive.
I saw some stuff on laser disk.  It was very impressive.  I saw
some stuff on video tape - super beta, VHS.  Less impressive, but
not generally distracting.  Cable TV had almost OK stations and
pretty bad ones.  Without cable, the distractions while watching
the show are almost as bad as commercials.

If over-the-air quality could be brought to laser disk standards
it would be an overnight success.  If it was cable-only, I'd
probably buy a better TV & spend the $30-$40 a month.  Heck, I
might even watch it.

If HDTV doesn't have some sort of correction system built in,
then it will be no better than cable as it is now.  Simply
increasing bandwidth on a noisy medium does not remove noise.

If you create un-expandable standards, they will be bad.  A rope
cut to exact length is always too short.  Technology will outstrip our
current ideas of what is easily "as much as anyone can afford".  If
you want to build a cheap TV with the new standard, you should be
able to ignore things - like error correction, or some of the
resolution, or one of the sound channels, or the closed captioned
channel...  On transmission, you have to be able to omit stuff,
like error correction, resolution, sound channels...
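(Stephen's layering idea in miniature, as a hedged Python sketch: a toy 3x repetition code stands in for whatever error correction a real standard would use. A cheap receiver ignores the redundancy; a better one majority-votes it.)

```python
# A layered standard lets a cheap receiver skip the error-correction
# data.  Here each bit is transmitted three times: a minimal set just
# reads the first copy, while a better set majority-votes all three
# copies and survives a flipped bit.

def encode(bits):
    return [b for b in bits for _ in range(3)]     # 3x repetition code

def decode_cheap(stream):
    return stream[0::3]                            # ignore the redundancy

def decode_robust(stream):
    triples = zip(stream[0::3], stream[1::3], stream[2::3])
    return [1 if sum(t) >= 2 else 0 for t in triples]

msg = [1, 0, 1, 1, 0]
stream = encode(msg)
stream[3] = 1 - stream[3]            # flip one transmitted bit (noise)

assert decode_robust(stream) == msg  # corrected by majority vote
assert decode_cheap(stream) != msg   # the cheap set shows the glitch
```

Same broadcast, two price points: exactly the omit-what-you-can't-afford structure argued for above.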

Stephen.
suitti@haddock.ima.isc.com

roy@phri.nyu.edu (Roy Smith) (03/17/90)

toddpw@tybalt.caltech.edu (Todd P. Whitesel) writes:
> square pixels would be nice for computer graphics.

	I think the thing to realize here is that there is very little in
common between broadcast TV and computer graphics.  First off, you have to
worry about bandwidth.  With 1024 x 1024 x 24 bit graphics at 72 Hz, you're
talking about 75 million pixels per second.  It may be fine to talk about
100 MHz video bandwidth on each of three coax cables in a lab, but how are
you going to transmit that over the air?  One of the absolute limits of
broadcasting is that bandwidth is a finite resource.  Unless you can change
the gravitational constant of the universe by "just doing it", you have to
live with that fact.
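(Roy's arithmetic, spelled out. The 2 bits/sec per Hz spectral efficiency below is an assumed round number, not from any standard; the 6 MHz NTSC channel width is real.)

```python
# An uncompressed 1024 x 1024, 24-bit, 72 Hz signal versus the 6 MHz
# channel an NTSC broadcast actually gets.

pixels_per_sec = 1024 * 1024 * 72          # ~75.5 million pixels/sec
raw_bits_per_sec = pixels_per_sec * 24     # ~1.8 Gbit/sec uncompressed

ntsc_channel_hz = 6_000_000                # one NTSC channel: 6 MHz
# Even assuming a generous 2 bits/sec per Hz, the raw stream needs
# roughly 150 NTSC channels' worth of spectrum:
channels_needed = raw_bits_per_sec / (2 * ntsc_channel_hz)

assert 75_000_000 < pixels_per_sec < 76_000_000
assert round(channels_needed) == 151
```

That ratio is the whole argument: without heavy compression, the numbers simply don't fit over the air.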

	Secondly, with interactive computer graphics, the ratio of sources
to displays is about 1:1.  With broadcast TV, there are many, many more
receivers than there are sources of material to watch.  Using square pixels
may make life easier for the ray tracer folks, but that's just tough on
them.  Is there any fundamental reason why you can't do ray tracing with
rectangular pixels?  If using rectangular pixels saves broadcast bandwidth,
or makes it easier to build a receiver, it's A Good Thing.
--
Roy Smith, Public Health Research Institute
455 First Avenue, New York, NY 10016
roy@alanine.phri.nyu.edu -OR- {att,philabs,cmcl2,rutgers,hombre}!phri!roy
"My karma ran over my dogma"

bowers@elxsi.dfrf.nasa.gov (Al Bowers) (03/17/90)

In article <530@bilver.UUCP> bill@bilver.UUCP (Bill Vermillion) writes:

>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

...stuff deleted...

>>  I'd be perfectly happy to settle for the television picture tubes of today!
>>It's the signal that's so horrendous!  I mean:  If you've ever seen the
>>output of a laser disc player, it's awesome!  It shows what your picture tube
>>can do.  It's the low signal to noise ratio of the broadcast signals that we
>>receive that's horrendous!

>The output of a laser player isn't awesome.  It's good, but you can surely see
>the limits of NTSC video if you watch it on anything bigger than a 21" set.  I
>have had a laser player for several years now.  The early discs are poor
>compared to today's technology, but we are pretty much at the limits.  I
>assume you are a recent disc convert.  S/N is only one problem.   Have you
>ever noticed how the color is in little dots at the transition points, how the
>resolution isn't as good as a clean 16mm print, let alone 35?

Just as a point of reference, laser disc has always been superior to
tape.  The fundamental reason for this is the digital nature of LDs.
Keep in mind that a good VHS resolution from a decade ago was 190
lines of horizontal resolution.  Improvements in signal conditioning and
processing that apply to the analog tape formats will apply just as
well to the analog portion of an LD signal once it has run through a
DAC (digital to analog converter).

Photo rag tests have shown that lenses for film are capable of up to
100 lines per millimeter resolution (the test I recall said a Leitz
50mm f/2.0 could get 102 l/mm in the center of the frame).  On the
basis of that I'd say 35mm is capable of in the range of 20000 lines
resolution across a frame from edge to edge.  16mm is probably half of
that.  The real limit in film is the film's ability to resolve,
probably in reality half of the above numbers: 10000 lines for 35mm and
5000 lines for 16mm.  So I'd say we still have an order of magnitude
to go yet.  But give me a clean signal and I can live with a lot less
resolution.  S/N is where video can really beat out film, especially
if it's digital.

I'd vote for a clean break from the current NTSC standard in the new
HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
resolution and I could be happy with that until I'm old and gray...

Just another opinion...

--
Albion H. Bowers  bowers@elxsi.dfrf.nasa.gov  ames!elxsi.dfrf.nasa.gov!bowers
         NASA Ames-Dryden Flight Research Facility, Edwards, CA
                  Aerodynamics: The ONLY way to fly!

                     Live to ski, ski to live...

bas+@andrew.cmu.edu (Bruce Sherwood) (03/17/90)

At the risk of stating the obvious:

Some of this discussion of HDTV implies that there will always be a need
for another generation of standards with even higher resolution.  That
isn't necessarily the case.  The human eye has limited resolution, and
higher resolution than that in the picture is literally useless, if you
are talking in terms of a "typical" screen size viewed from a "typical"
viewing distance.  Similarly, there must be an upper limit on useful
fidelity in color discrimination, beyond which the human eye just can't
see any improvement.
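(A back-of-the-envelope version of this, assuming the standard ~1 arcminute figure for normal visual acuity and an invented viewing setup: a 0.5 m tall screen watched from 2.5 m.)

```python
# How many scan lines the eye can actually resolve at a "typical"
# distance.  The 1-arcminute acuity figure is a standard textbook
# value; the screen size and distance are made-up examples.

import math

acuity_rad = math.radians(1 / 60)        # ~1 arcminute per resolvable line
screen_height_m = 0.5
viewing_dist_m = 2.5

angle = 2 * math.atan(screen_height_m / (2 * viewing_dist_m))
resolvable_lines = angle / acuity_rad

# Roughly 685 lines at this distance -- more than NTSC's ~480 visible
# lines, but resolution far beyond ~1000 lines buys nothing unless you
# sit closer or hang a much bigger screen.
assert 680 < resolvable_lines < 690
```

The useful ceiling moves with screen size and seating distance, which is exactly why "enough resolution" has an answer rather than growing forever.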

The analogy with audio is that a CD with frequency response out to 10
MHz would not sound better than one with frequency response out to 20
kHz, because the human ear can't hear the higher frequencies.

What we want in electronic products is high fidelity for both eye and
ear, but no more than that.  Unfortunately today's television and
computers are typically well below this threshold.

Bruce Sherwood

billd@fps.com (Bill Davidson) (03/17/90)

In article <1990Mar15.215645.20272@spectre.ccsf.caltech.edu> toddpw@tybalt.caltech.edu (Todd P. Whitesel) writes:
>That's because your TV's color decoder sucks or the signal was recorded
>by a lousy camera. If you continue to insist that television deliver
>workstation quality color graphics then I suggest you invent a new standard to
>implement it, because HDTV will not be used for graphics by most of its
>intended market.

I have a Sony XBR which I bought because it was the best I could find
(except the Proton which was slightly better but was also more
expensive and had fewer nifty feature-junky-pacifiers).  I think my TV
is close to being about as good as it gets.  As for signal purity, I've
seen NTSC composite video on many different computers and many
different monitors and resolutions.  NTSC just doesn't do color very
well compared to RGB.  It's inherent in the standard.

I wrote:
>> Color was an add-on and the implementation suffered in order
>>to maintain compatibility with old black and white sets.

>Oh? How would you have done it? You forget the installed base of black
>and white TV sets, and the inherent cost difference which causes b&w sets
>to still be manufactured and sold to this day. There is also transmission
>bandwidth to consider.

I probably would have done it the same way.  Believe it or not, I do
understand the installed base thing.  I just think you can only take
it so far.  Also, I haven't seen a B&W set bigger than 8 inches in a
store in about 5 years.  I haven't seen one bigger than 13 inches in
about 10 years.  Very few people buy them anymore.  If HDTV were
to approach the quality of RGB and have over 1000 lines, I think
that NTSC sets would go the same way eventually, even if NTSC
transmissions remained widely available.

>> At some point
>>you have to say "enough is enough".  We can do so much better now.  We
>>know a lot more about video signals than we did when NTSC was designed.

>Very true. But what you are suggesting is impractical overkill that gratifies
>graphics purists and is engineering hell to the standards committee.

So?  I think we need to make a standard that stretches things for now.
It will probably be restricted to cable and satellite for quite a while
but I don't see why that's a problem.  It's going to be expensive no
matter what we do.  I admit to being on the lunatic fringe of
resolution purists.  When I was a photographer, I preferred to shoot
practically everything in 4X5, mostly because of the resolution gain
(though somewhat for the greater focal-plane and parallax control).

>>Also, the frame rate is annoying.  It destroys resolution when
>>converting 24 frame/sec film to video due to frame mixing.  I want a
>>standard with at least 1000 lines and a 72Hz frame rate.  Wide screen
>>would be nice for films and square pixels would be nice for computer
>>graphics.

>Your frame rate is a good idea, because the primary benefit of HDTV will be
>movie quality video rentals and cable. I don't know if a full 1000 lines
>would be required, and I somehow doubt it. 800 sounds ok for a wide picture
>but I'll defer to the movie experts.

Yes, 800 would be a great improvement (more than double the current
useful resolution) but to me it's not enough, especially if you have a
large screen.  Have you ever seen a laser disk on a 32" Proton?  It's
sharp but the pixels seem golfball sized.  Larger screens are also
becoming more popular and this is increasing the need for more
resolution.

>Where do you get square pixels? You're practically asking them to define an
>exact display and graphics transmission standard along with it! Who's going to
>use that, besides Pixar and Apple Computer?

Yes I do want an exact display and graphics transmission standard built
into it.  Computer graphics is becoming more and more prevalent in
video.  To think that only two companies are going to care is
ignorant.  HDTV should be useful for a large number of applications
(medical imaging, computer graphics, movies, home video, video
conferencing).  It would be great if all these things had compatible
interfaces.  Right now we have conversions galore, all degrading
quality.  You could pull video into your computer
graphics application (or vice versa) without any conversion.  One
monitor could serve for a TV, closed circuit TV, a computer screen,
your Nintendo (TM) set and anything else.  Monitors could become an
interchangeable part for all of these systems.

>> Why suffer with the old forever?

>Because it addresses the realities of its intended use much more effectively
>than what you're suggesting could ever hope to.

I think you have a very limited view of its intended use.

>> Dismissing it just because most people
>>won't be able to afford it is ridiculous.  Most people couldn't afford
>>pocket calculators when they first came out (or TV's, or cars or most
>>other major new technologies).

>No, it's the installed base. You can't just tell them to buy new boxes because
>they'll tell you to go screw. After all, if you talk them into paying you for a
>whole new set of equipment, couldn't you just do it again in 20 years when this
>standard is 'obsolete'?

Who said we have to dump the old immediately?  It could be brought in
slowly over a period of 10-20 years, starting with cable.  Even when
we start radio transmissions, it could be done with just a few channels
at first.  In any case, cable is slowly taking over vast amounts of the
US.  In 20 years, it may be just about everywhere.

>> We need to define a standard that is
>>good and is doable and which can be foreseen to become cheap with time.
>>It doesn't have to be cheap now.

>Yes it does. We want movie quality television, more or less cheap, and now.

It won't happen.  It can't happen with NTSC.  Also, NTSC is not about
to become an international standard.  What about the Europeans?  It
can't happen NOW.  Nobody has made it.  I think it will be difficult
to widen the screen and impossible to change the frame rate and remain
compatible with old equipment.

>You just mentioned some storage media, which are displayed
>through cables to a monitor nearby. These can have almost infinite quality
>because the wires are shielded and the signal quality does not degrade.
>As soon as you broadcast via radio, you have many engineering restrictions
>on your picture quality; NTSC addresses them very well regardless of its age as
>a standard. It does lack signal correction, but for many people the improved
>quality and wider screen will make them happy campers.

You got me here.  I'm not much of an expert on radio transmissions
so I don't know all the potential problems with putting out a high
definition signal.  It just seems to me that it should be possible.
Maybe it will be expensive, but like I said before, almost all new
technologies are.

>>We have generations of people now who grew up watching TV and they are
>>getting more and more demanding of quality video.

>They can get it from a laser disk, or on their computer's monitor. In the world
>of broadcast television it will end up more like Glenn's idea, because without
>signal correction all the resolution in the world won't look much better than
>NTSC anyway, especially when you start throwing noise and ghosts and other
>radio phenomena at the signal.
[...]
>But please do not ask anyone to figure out how to transmit it via radio. Its
>time will come when cable and fiber optic networks become the standard means
>of communication. Until then, 'NTSC on steroids' will be the better choice for
>movies and television.

Sadly, it may turn out this way.  Too many people want it to work with
their old sets.  To me, the loss is much greater than the gain.
Fortunately, widespread fiber optic communications lines may not be
very far away.  Video conferencing may see to that.

--Bill

billd@fps.com (Bill Davidson) (03/17/90)

In article <1990Mar16.172343.10577@phri.nyu.edu> roy@phri.nyu.edu (Roy Smith) writes:
>toddpw@tybalt.caltech.edu (Todd P. Whitesel) writes:
[Todd didn't write this, I did --billd]
>> square pixels would be nice for computer graphics.

>	I think the thing to realize here is that there is very little in
>common between broadcast TV and computer graphics.  First off, you have to
>worry about bandwidth.  With 1024 x 1024 x 24 bit graphics at 72 Hz, you're
>talking about 75 million pixels per second.  It may be fine to talk about
>100 MHz video bandwidth on each of three coax cables in a lab, but how are
>you going to transmit that over the air?  One of the absolute limits of
>broadcasting is that bandwidth is a finite resource.  Unless you can change
>the gravitational constant of the universe by "just doing it", you have to
>live with that fact.

I never expected digital signals.  Even a lunatic resolution freak
like me knows that that's too much to hope for.  Bandwidth is a
problem.  How we deal with it is unknown.  We can compress the signal.
We can also have fewer channels and transmit different parts at
different frequencies.  Maybe we can't do it.  I'd sure like to
see it happen though.

>	Secondly, with interactive computer graphics, the ratio of sources
>to displays is about 1:1.  With broadcast TV, there are many, many more
>receivers than there are sources of material to watch.  Using square pixels
>may make life easier for the ray tracer folks, but that's just tough on
>them.  Is there any fundamental reason why you can't do ray tracing with
>rectangular pixels?  If using rectangular pixels saves broadcast bandwidth,
>or makes it easier to build a receiver, it's A Good Thing.

No it's not.  There is no problem for raytracing with rectangular
pixels (hell, elongated hexagonal pixels would work just fine).
Raytracing is not the part of computer graphics that has a problem with
rectangular pixels.  The problem is with geometric transformations.
Rectangular pixels cause a lot of extra work to be done to
scale the image and this has to be done every time you move anything
around on the screen.  This costs you a lot of cpu cycles which are
usually in very short supply when you are doing graphics animation.
This cost is very significant and unnecessary.  How do rectangular
pixels save broadcast bandwidth?  I thought that they just increased
horizontal resolution.  I don't need a resolution increase at this
cost.
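
Bill's per-vertex cost is easy to see in a sketch.  This is a hypothetical
illustration, not anyone's actual renderer; the pixel aspect ratio of 0.9 is
an assumed value:

```python
import math

PAR = 0.9   # assumed pixel aspect ratio (width/height); 1.0 means square pixels

def rotate(x, y, angle):
    """Rotate a point on a display with non-square pixels."""
    xs = x * PAR                          # correct into square coordinates (extra multiply)
    xr = xs * math.cos(angle) - y * math.sin(angle)
    yr = xs * math.sin(angle) + y * math.cos(angle)
    return xr / PAR, yr                   # correct back to pixel coordinates (extra divide)

# With PAR == 1.0 both corrections vanish.  With rectangular pixels they are
# paid on every vertex, every frame -- the cycles Bill is complaining about.
print(rotate(10.0, 0.0, math.pi / 2))
```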

--Bill Davidson

kucharsk@number6.Solbourne.COM (William Kucharski) (03/17/90)

In article <BOWERS.90Mar16105816@drynix.dfrf.nasa.gov> bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:
 >Just as a point of reference laser disc has always been superior to
 >tape.  The fundamental reason for this is the digital nature of LDs...

Sorry, but the video portion of laser videodiscs is analog; the only portion
which may be digital on any given LD is the audio...
--
===============================================================================
| ARPA:	kucharsk@Solbourne.COM	              |	William Kucharski             |
| uucp:	...!{boulder,sun,uunet}!stan!kucharsk |	Solbourne Computer, Inc.      |
= The opinions above are mine alone and NOT those of Solbourne Computer, Inc. =

schumach@convex.com (Richard A. Schumacher) (03/17/90)

All NTSC fans should try previewing the Philips digital
monitor (60 Hz frame, non-interlaced) with a video disc
as a source. It looks sickly. It beautifully shows all
the weaknesses of "NTSC on steroids" and is as good an
argument as I've seen for going to HDTV. 

To heck with backwards-compatibility to a 50-year-old 
technology!

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/17/90)

bill@bilver.UUCP (Bill Vermillion) writes:

>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
> 
>>  I'd be perfectly happy to settle for the television picture tubes of today!
>>It's the signal that's so horrendous!  

>The output of a laser player isn't awesome.  It's good...

HA!  I knew this discussion would get people going!
  My point was not that Laser Discs are awesome.  My point IS that
when *I* watch TV (not too often, mind you) it doesn't bother me that
I can't get 6" from the screen and still not see pixels:  If I got a 6'
screen, I would just sit 3 times further away from it than a 24" set.
  Sure, a few years after graduating from Caltech, I MIGHT be able to
afford a HDTV set.  But will it be WORTH it?  Will I want to spend
$5000 on a HDTV set, or would I rather spend $500 to plop a box on top
of my TV that decodes digital broadcasts, and gives me a noise-free
picture on my lower Res. monitor?  I'd go for the $500 box!

  And what of the MILLIONS of people who won't be able to afford the
HDTV sets?

  I say, there should be a middle-of-the-road solution in addition to HDTV.
There should be HDTV, but I DON'T NEED HDTV!

>I spent years in broadcast, and have seen many changes in the FCC, and I am
>not too particularly impressed with their performance in the past few years.

I can't argue w/ that!  In fact, where does the FCC claim to get the 
legal authority to regulate speech over the airwaves?  e.g. why will
a HAM who says F**K on the airwaves almost surely lose his license?  So
much for freedom of speech!  It's a form of government censorship.  I 
have nothing against CENSURE, but censorship by the gov. is WRONG.

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/17/90)

bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:

>>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

>Just as a point of reference laser disc has always been superior to
>tape.  The fundamental reason for this is the digital nature of LDs.

Uh, well, it's not exactly digital:  It's PWM (pulse width modulation).
This is the ANALOG recording of discrete samples.  Therefore the
recording is discrete, but I don't think it qualifies as digital.

>I'd vote for a clean break from the current NTSC standard in the new
>HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
>resolution and I could be happy with that until I'm old and gray...

>Just another opinion...

I hear you: GIVE ME.  But are you willing to PAY for it?  You sound like
those comp.sys.mac people who say the low cost Mac should have at least
a 68030 and 4 megs of RAM and 256 colors. :-P

Glenn

toddpw@tybalt.caltech.edu (Todd P. Whitesel) (03/17/90)

roy@phri.nyu.edu (Roy Smith) writes:

>toddpw@tybalt.caltech.edu (Todd P. Whitesel) writes:
>> square pixels would be nice for computer graphics.

I never said that! In fact your explanation of broadcast tradeoffs is better 
than the one I used to flame Bill (who did say it).

BTW, you can get square pixels with NTSC, you just have to get the vertical
and horizontal size right or mess with the dot clock. The dot clock is probably
the best way but there are often constraints on it that come from the system
itself, like coordination of the video shifter and the memory cycles. (Using
VRAMs neatly avoids this.)
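
Todd's dot-clock point can be made concrete.  The figures below are
approximations (active line time and visible line count vary a little from
source to source), but they show why square-pixel NTSC clocks land near
12 MHz:

```python
# Rough NTSC numbers -- approximations, not a spec.
line_rate = 15734.26          # NTSC horizontal scan rate (Hz)
active_line = 52.66e-6        # approximate active (visible) line time (s)
visible_lines = 480           # approximate visible scan lines

# For square pixels on a 4:3 display, horizontal pixel count = 4/3 * vertical.
h_pixels = visible_lines * 4 // 3          # 640
dot_clock = h_pixels / active_line         # pixels per second of active video

print(h_pixels, round(dot_clock / 1e6, 2)) # about 12.1-12.2 MHz
```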

Todd Whitesel
toddpw @ tybalt.caltech.edu

fff@mplex.UUCP (Fred Fierling) (03/17/90)

In article <BOWERS.90Mar16105816@drynix.dfrf.nasa.gov>, bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:
> 
> I'd vote for a clean break from the current NTSC standard in the new
> HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
> resolution and I could be happy with that until I'm old and gray...

This question comes up a lot doesn't it?  Do you scrap the old technology and
design a new, more refined and technically superior one, or do you compromise
your design to maintain backwards compatibility?

I hope the "clean break" approach wins out too.  What an engineer's nightmare
it would be to match up the side panels to the NTSC center.
-- 
Fred Fierling   uunet!van-bc!mplex!fff    Tel: 604 875-1461  Fax: 604 875-9029
Microplex Systems Ltd   265 East 1st Avenue   Vancouver, BC   V5T 1A7,  Canada

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/18/90)

fff@mplex.UUCP (Fred Fierling) writes:

>I hope the "clean break" approach wins out too.  What an engineer's nightmare
>it would be to match up the side panels to the NTSC center.

I heard that the FCC said they wouldn't approve the standard unless it's
backwards compatible...=-(

Glenn

dave@imax.com (Dave Martindale) (03/18/90)

In article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu> bas+@andrew.cmu.edu (Bruce Sherwood) writes:
>At the risk of stating the obvious:
>
>Some of this discussion of HDTV implies that there will always be a need
>for another generation of standards with even higher resolution.  That
>isn't necessarily the case.  The human eye has limited resolution, and
>higher resolution than that in the picture is literally useless, if you
>are talking in terms of a "typical" screen size viewed from a "typical"
>viewing distance.
>
>The analogy with audio is that a CD with frequency response out to 10
>MHz would not sound better than one with frequency response out to 20
>KHz, because the human ear can't hear the higher frequencies.

If you move closer to a loudspeaker, you don't need better frequency
response - your ear's limits are the same at any distance.  You can
make the same argument for colour and brightness resolution, but not
spatial resolution, in an image.  I.e. beyond a certain point, using
extra bits for brightness or colour resolution just doesn't produce
a noticeable improvement in the picture, no matter how close you get.

But the analogy is all wrong for resolution.  If you move closer to an
image, so it fills more of your field of view, you need better spatial
resolution.  And somebody will always want to sit closer than the
current standard is designed for, at least for the foreseeable future.
NTSC was designed for a viewing distance of 10 times the picture
height, HDTV for 3-4 times the picture height.  I want an image that
looks sharp from 2/3 the picture height - that gives me a 90 degree
field of view (with a 4:3 aspect ratio).
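
Dave's 90-degree figure checks out from simple geometry.  A quick sketch,
treating the screen as flat and the viewer as centered:

```python
import math

h = 1.0                 # picture height (arbitrary units)
width = 4.0 / 3.0 * h   # 4:3 aspect ratio
d = 2.0 / 3.0 * h       # viewing distance of 2/3 the picture height

# Horizontal field of view subtended by the screen.
fov = 2 * math.degrees(math.atan((width / 2) / d))
print(round(fov))       # 90 degrees, as claimed
```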

What you say would be true only if there were a "typical" screen size
and a "typical" viewing distance.

good@pixar.uucp (Veni, Vidi, Visa.) (03/19/90)

In article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu> bas+@andrew.cmu.edu (Bruce Sherwood) writes:
:
:Some of this discussion of HDTV implies that there will always be a need
:for another generation of standards with even higher resolution.  That
:isn't necessarily the case.  The human eye has limited resolution, and
:higher resolution than that in the picture is literally useless...
:
:The analogy with audio is that a CD with frequency response out to 10
:MHz would not sound better than one with frequency response out to 20
:KHz, because the human ear can't hear the higher frequencies.

Ooh!  I'll take this one, in two parts:

1) There will *always* be room for improvement in anything.

2) Your analogy with CD's is the right analogy for the wrong argument.  An
audio system with a 10MHz bandwidth *will* sound better than one with a
20KHz bandwidth, because your ears *can* hear out there!  This old 20KHz
nonsense is a figure for steady-state, sinusoidal response and was generated
using WWII-era equipment.  Later experiments show some people with 40KHz
or better response to sine waves.  But even that isn't the biggie.  Human
hearing is very sensitive to the rise time on transients.  The wider the
bandwidth, the steeper the slope.  The steeper the slope, the more like
the real sound.  A MegaHertz is adequate for mid-fi, but not for the
good stuff.  Can you tell that I'm a DC-to-daylight freak?  Just remember
that *real* music isn't bandwidth limited at all!  (Ok, except by the
air between the source and your ear -- ditto for your stereo!)

The biggest bill of goods sold to the audio world in the last decade is
that CD's represent the state of the art in sound reproduction.  They
don't.  Not even close.  Practically an also-ran.  They *are* the
state of the art in marketing, and provide a wonderfully convenient
bang for the buck for low to mid-fi.  They're a good value, not the
cat's meow in fidelity.

I agree that there is a theoretical limit to our eyes' resolution, but
you must take into account things like frame rate (see ShowScan) and
image size (see Imax), not to mention the near limitless need for better
audio, which current HDTV just barely addresses.  CD's can't touch a
good analog master tape, and video of any kind is way behind big,
fast motion picture projection.

That doesn't mean I don't *want* HDTV!  Bring it on!  I'll be happier,
but never satisfied.


		--Craig
		...{ucbvax,sun}!pixar!good

		Gun control yields tyranny.

siegman@sierra.Stanford.EDU (Anthony E. Siegman) (03/19/90)

>>  ... the thing to realize here is that there is very little in
>>  common between broadcast TV and computer graphics.

Well, maybe yes, maybe no.  But we'd sure like to be able to hang just
_one_ set of expensive high-resolution HDTV monitors from the ceiling
or side walls of our classrooms and be able to project live or cable
TV, video cassettes, computer graphics and animation, or anything else
in the way of images, with just one common standard instead of a whole
bunch of fancy and expensive interfaces to translate between different
formats.

[By the way, has anyone ever replaced the omnipresent overhead
projector with a TV camera on a little tripod looking down on the
table and projecting whatever you set under it onto classroom
monitors.  Once a classroom had good built-in monitors, you'd never
need the overhead projector and screen again.]

thant@horus.esd.sgi.com (Thant Tessman) (03/20/90)

In article <1990Mar17.022845.9450@spectre.ccsf.caltech.edu>,
gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

> 
> HA!  I knew this discussion would get people going!
>   My point was not that Laser Discs are awesome.  My point IS that
> when *I* watch TV (not too often, mind you) It doesn't bother me that
> I can't get 6" from the screen and still not see pixels:
[...]
> 
>   And what of the MILLIONS of people who won't be able to afford the
> HDTV sets?

The point of my original posting was that the FCC isn't even
giving people the chance to choose.

If new formats were allowed, the high fidelity nuts like me would 
be willing to support the HDTV industry until it became affordable 
for everyone, while leaving the old system in place until there was
no longer enough of a market to support it.

> 
> >I spent years in broadcast, and have seen many changes in the FCC, and I am
> >not too particularly impressed with their performance in the past few years.
> 
> I can't argue w/ that!  In fact, where does the FCC claim to get the 
> legal authority to regulate speech over the airwaves?  e.g. why will
> a HAM who says F**K on the airwaves almost surely lose his license?  So
> much for freedom of speech!  It's a form of government censorship.  I 
> have nothing against CENSURE, but censorship by the gov. is WRONG.

Wouldn't you consider regulating broadcast formats just as much 
censorship as regulating speech?

thant

bowers@elxsi.dfrf.nasa.gov (Al Bowers) (03/20/90)

In article <1990Mar17.025325.9827@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

>bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:

>>>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

...story of my infamous analog digital screwup deleted...

>>I'd vote for a clean break from the current NTSC standard in the new
>>HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
>>resolution and I could be happy with that until I'm old and gray...

>>Just another opinion...

>I hear you: GIVE ME.  But are you willing to PAY for it?  You sound like
>those comp.sys.mac people who say the low cost Mac should have at least
>a 68030 and 4 megs of RAM and 256 colors. :-P

Actually Glenn I have a real dislike for Macs and here is why: I
refuse to use any machine whose presupposition is that I don't know
anything about what it is I am trying to do.  This prevents me from
doing some of the wonderful things a Mac is capable of but it doesn't
bother me too much.  As to the implication that I would not be willing
to spend the money to get the performance, I'd like to point out that
if you'd read any of my previous postings you'd know that I firmly
believe that you get what you pay for.  My current camcorder is a
S-VHS-C and next time around I will buy the same format with HiFi
also.  My VCR is a VHS and has MTS which came out in the mid '80s (MTS
was only 12 months old when I bought it, only linear stereo but at the
time that was all that was available).  My next VCR will be the new
JVC S-VHS(-C) machine that will accept S-VHS, VHS, S-VHS-C and VHS-C,
it will cost me a pretty penny but it too will be worth it.

GIVE ME the option and let me decide how much I want to spend for what
features.  :-P yourself...

--
Albion H. Bowers  bowers@elxsi.dfrf.nasa.gov  ames!elxsi.dfrf.nasa.gov!bowers
         NASA Ames-Dryden Flight Research Facility, Edwards, CA
                  Aerodynamics: The ONLY way to fly!

                     Live to ski, ski to live...

sehat@iit (Sehat Sutardja) (03/20/90)

In article <1990Mar17.022845.9450@spectre.ccsf.caltech.edu>, gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
>   Sure, a few years after graduating from Caltech, I MIGHT be able to
> afford a HDTV set.  But will it be WORTH it?  Will I want to spend
> $5000 on a HDTV set, or would I rather spend $500 plop a box on the top
> of my TV that decodes digital broadcasts, and gives me a noise-free
> picture on my lower Res. monitor?  I'd go for the $500 box!
> 
Your wish might come true next year. For now, I can't say much about this.

>   And what of the MILLIONS of people who won't be able to afford the
> HDTV sets?
> 

This depends on how low the cost of a high resolution monitor would be. 
As far as the digital signal processing requirement goes, only a few million
transistors and some DRAM chips would be needed. You can pretty much guess
what the cost of electronics would be when they are used in consumer products.
For now, the major problem is to get a low cost monitor.

-- 
Sehat Sutarja,

{decwrl!sun}!imagen!iit!sehat	| Integrated Information Tech.
sutarja@janus.Berkeley.EDU	| Santa Clara, CA. (408)-727-1885

forbes@aries.scs.uiuc.edu (Jeff Forbes) (03/21/90)

Aren't all of the digitally mastered recordings sampled at ca. 45kHz?
Which would make the argument about CD response moot.

		Jeff

mikemc@mustang.ncr-fc.FtCollins.NCR.com (Mike McManus) (03/21/90)

In article <7322@celit.fps.com> billd@fps.com (Bill Davidson) writes:
>   How long do we have to carry around the baggage of a standard that was
>   designed so long ago that it can't even get the colors right most of
>   the time?  Color was an add-on and the implementation suffered in order
>   to maintain compatibility with old black and white sets.  At some point
>   you have to say "enough is enough".
>   ...
>   Why suffer with the old forever?  Just because most people
>   won't be able to afford it is ridiculous.  Most people couldn't afford
>   pocket calculators when they first came out (or TV's, or cars or most
>   other major new technologies).

While I basically agree with you Bill, the issue of backward compatibility is a
very sticky one.  Simply making the change and living with it is not as easy as
it sounds.  Your analogy to CD technology is not valid.  The introduction of
CD's did not make records disappear, and you could still play old records that
you had, even if buying old ones is harder nowadays.  Making the TV which you
currently have in your home unusable is quite another thing.  Unless
all you want to do is watch video tapes (not likely)...

Can you imagine someone decreeing that gasoline will no longer be produced, and
cars must run on some alternative power source?  Can you imagine millions of
people, who once owned a useful mode of transportation, reduced to having a
useless antique sitting in their driveway?  Can you imagine the public outrage
at such a thing?  I think it's very similar to the HDTV dilemma.  Yes, it may be
the *BEST* thing to do (in the long run), but who's going to convince the
*PUBLIC* that this is what they want?  No, you don't need to convince me, but
there are several million other folks out there that you *DO* need to convince.
--
Disclaimer: All spelling and/or grammer in this document are guaranteed to be
            correct; any exseptions is the is wurk uv intter-net deemuns.

Mike McManus (mikemc@ncr-fc.FtCollins.ncr.com)  
NCR Microelectronics                
2001 Danfield Ct.                   ncr-fc!mikemc@ncr-sd.sandiego.ncr.com, or
Ft. Collins,  Colorado              ncr-fc!mikemc@ccncsu.colostate.edu, or
(303) 223-5100   Ext. 360           uunet!ncrlnk!ncr-sd!ncr-fc!garage!mikemc
                                    

minich@a.cs.okstate.edu (MINICH ROBERT JOHN) (03/21/90)

From article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu>, by bas+@andrew.cmu.edu (Bruce Sherwood):
> The analogy with audio is that a CD with frequency response out to 10
> MHz would not sound better than one with frequency response out to 20
> KHz, because the human ear can't hear the higher frequencies.

  Well, it probably would sound a bit better. Consider this:

A 20KHz sample on CD looks something like this


 * * * * * * * * * *
* * * * * * * * * *

which is just a dumb square wave. Sure, it's at high enough of a pitch 
that most people wouldn't be able to discriminate between it and a pure sine 
of the same frequency, but what happens if, say, you have a 20,001Hz waveform? 
Then, 20KHz just isn't enough to provide a nice, "symmetric" waveform. Thus, 
you get a somewhat harsh sound. If I were really after a "human limits" sample, 
I'd bump the rate up to around 30KHz to minimize the distortion. (Assuming that 
a 40KHz sample is "wasteful".) 
  Since there are indeed people sensitive enough to these at-the-limits
conditions, we shouldn't write off any increase in the sampling frequency as
wasteful just because simple math says "you can't hear anything higher than..."
The truth is, we CAN hear the effects BELOW the maximum frequency. Here's
an analogy (and an excuse to post here): The human eye can only discern among
a limited number of colors, especially in small areas. The number is quite
small (on the order of 100s). So, should we abandon 24bit color displays since
we _shouldn't_ be able to tell the difference? Just because our hearing doesn't
necessarily scream at us "yuck" like our eyes do doesn't mean we should ignore
what does exist. I'll keep my 24bit color, thank you. And I'll also get the
SUPER-CD player that samples at a higher rate because _I_ can tell the diff.

Robert Minich
Oklahoma State University
minich@a.cs.okstate.edu

DISCLAIMER: One who takes forgotten floppies.

wte@sauron.Columbia.NCR.COM (Bill Eason) (03/21/90)

In article <1990Mar20.162041.4639@ux1.cso.uiuc.edu> forbes@aries.scs.uiuc.edu (Jeff Forbes) writes:
>
>Aren't all of the digitally mastered recordings sampled at ca. 45kHz?
>Which would make the argument about CD response moot.
>
>		Jeff

Enter here the Nyquist criterion which says that the sampling frequency
(44.1 kHz really) must be >= two times the highest analog frequency being
recorded.  Therefore, the highest frequency which can be accurately 
reproduced from a 44.1 kHz digital recording is 22.05 kHz, which is within 
the range of hearing of several folks posting here.  I suspect my audible
range reaches up there, too, since I can hear department store burglar
alarms and CRT flyback transformers.
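
The arithmetic behind the Nyquist point is short.  One detail worth adding:
a tone *above* the Nyquist limit doesn't just vanish, it aliases back down
into the audible band, which is why recorders low-pass filter before sampling:

```python
fs = 44100.0                 # CD sampling rate (Hz)
nyquist = fs / 2             # highest frequency a 44.1 kHz recording can carry
print(nyquist)               # 22050.0 Hz

# A tone between fs/2 and fs folds back to an image frequency of fs - f.
f_in = 23000.0
alias = fs - f_in
print(alias)                 # 21100.0 Hz -- audible garbage unless filtered out first
```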

-- 
Bill Eason   (803) 791-6348	...bill.eason@ncrcae.Columbia.NCR.COM
NCR Corporation
E&M Columbia     3325 Platt Springs Road     West Columbia, SC 29169

billd@fps.com (Bill Davidson) (03/21/90)

>In article <7322@celit.fps.com> billd@fps.com (Bill Davidson) writes:
>How long do we have to carry around the baggage of a standard that was
>designed so long ago that it can't even get the colors right most of
>the time?  Color was an add-on and the implementation suffered in order
>to maintain compatibility with old black and white sets.  At some point
>you have to say "enough is enough".
>Why suffer with the old forever?  Just because most people
>won't be able to afford it is ridiculous.  Most people couldn't afford
>pocket calculators when they first came out (or TV's, or cars or most
>other major new technologies).

In article <MIKEMC.90Mar20102012@mustang.ncr-fc.FtCollins.NCR.com> mikemc@mustang.ncr-fc.FtCollins.NCR.com (Mike McManus) writes:
>While I basically agree with you Bill, the issue of backward compatibility is a
>very sticky one.  Simply making the change and living with it is not as easy as
>it sounds.  Your analogy to CD technology is not valid.  The introduction of
>CD's did not make records disappear, and you could still play old records that
>you had, even if buying old ones is harder nowadays.  Making the TV which you
>currently have in your home unusable is quite another thing.  Unless
>all you want to do is watch video tapes (not likely)...

The point that you are missing is that HDTV's should be done in a
similar way to CD's.  You are right in saying that CD's did not make it
impossible to use vinyl records.  However, they are taking over the
market.  In ten years, you may find it very difficult to buy vinyl.
CD's are slowly killing the vinyl market.  HDTV should do the same
thing.  Both should be available for several years.  Eventually
everyone will get HDTV's and at some point eventually the vast majority
of people will have HDTV's and broadcasters can just dump the old
broacasts.  If HDTV and old TV have to share the same band space then
we'll have to suffer with fewer channels on both for a while.  For this
reason, I think cable should be the first market for HDTV since it has
a lot more control over bandwidth and channel placement.  Perhaps for
the first 5 years or so it should be available only on cable.

Another possibility that I'm not too sure about is a converter box.  It
might be possible to produce a relatively cheap converter box that
could read in an HDTV signal and cut the resolution down and convert to
NTSC and feed it into an old TV.  If it's cheap enough (say under or
around $100) and doesn't degrade the signal too much beyond what we
currently live with, not too many people will complain.

The general public won't be convinced that they need HDTV until they
see it.  Once they see it, they will want it.  NTSC is far below the
resolution of the human eye.  The difference with increased resolution
will be very noticeable to anyone with decent eyesight (>1000 lines and
72Hz would look so good by comparison it will make old TV's hard to
watch).

--Bill

forbes@aries.scs.uiuc.edu (Jeff Forbes) (03/21/90)

In article <2070@sauron.Columbia.NCR.COM> wte@sauron.UUCP (Bill Eason) writes:
>In article <1990Mar20.162041.4639@ux1.cso.uiuc.edu> forbes@aries.scs.uiuc.edu (Jeff Forbes) writes:
>>
>>Aren't all of the digitally mastered recordings sampled at ca. 45kHz?
>>Which would make the argument about CD response moot.
>>
>>		Jeff
>
>Enter here the Nyquist criterion which says that the sampling frequency
>(44.1 kHz really) must be >= two times the highest analog frequency being
>recorded.  Therefore, the highest frequency which can be accurately 
>reproduced from a 44.1 kHz digital recording is 22.05 kHz, which is within 
>the range of hearing of several folks posting here.  I suspect my audible
>range reaches up there, too, since I can hear department store burglar
>alarms and CRT flyback transformers.

I am thoroughly familiar with the Nyquist theorem. The point I was trying to
make was that a large fraction of the music today is digitally mastered at
44.1kHz, which would limit all media to a 22.05 kHz maximum frequency. A record
made from a digital master can have no better frequency response than that of
the master. Any higher frequencies heard are noise. I can hear TV flyback 
transformers as well, but that is only 15.75 kHz. I certainly can't hear it in
my multisync when it is at 35kHz. I have been experimenting with piezoelectric
transducers, and I can hear 17.5kHz in a noisy lab. With a quiet environment 
and more sound power I could probably hear above 18kHz. I do wonder what
percentage of the ADULT population can actually hear sine waves above 20kHz.

		Jeff

arnief@tekgvs.LABS.TEK.COM (Arnie Frisch) (03/21/90)

In article <2070@sauron.Columbia.NCR.COM> wte@sauron.UUCP (Bill Eason) writes:
>In article <1990Mar20.162041.4639@ux1.cso.uiuc.edu> forbes@aries.scs.uiuc.edu (Jeff Forbes) writes:
>Enter here the Nyquist criterion which says that the sampling frequency
>(44.1 kHz really) must be >= two times the highest analog frequency being
>recorded.  Therefore, the highest frequency which can be accurately 
>reproduced from a 44.1 kHz digital recording is 22.05 kHz, which is within 
>the range of hearing of several folks posting here.  I suspect my audible
>range reaches up there, too, since I can hear department store burglar
>alarms and CRT flyback transformers.

And do you and your friends have shaggy coats and waggly tails?

I don't know what frequency you think you are hearing, but I seriously
doubt that it's 22 kHz.  TV flybacks run at 15,750 Hz (approx), and
sometimes you can hear a subharmonic of higher frequency oscillators -
but as far as hearing 22kHz, that's strictly for the dogs and birds.

Arnold Frisch
Tektronix Laboratories

phil@pepsi.amd.com (Phil Ngai) (03/21/90)

In article <7438@celit.fps.com> billd@fps.com (Bill Davidson) writes:
|The general public won't be convinced that they need HDTV until they
|see it.  Once they see it, they will want it.  NTSC is far below the
|resolution of the human eye.  The difference with increased resolution

Sorry, dude. VHS is far below the resolution of NTSC, yet look at
its popularity. VHS is far below the resolution of Laserdiscs,
yet look at their relative market share.

A public which shuns Laserdiscs can not be expected to go whole
hog on HDTV. Don't assume that everyone is a rich yuppie like yourself.

A public which uses indoor antennas or cable TV just doesn't give
a hoot about video picture quality.

HDTV is doomed as a consumer format.


--
Phil Ngai, phil@amd.com		{uunet,decwrl,ucbvax}!amdcad!phil
Boycott the census! With the history of abuse census data has,
can you afford to trust the government?

sorka@ucscb.UCSC.EDU (Alan Waterman) (03/21/90)

In article <2070@sauron.Columbia.NCR.COM> wte@sauron.UUCP (Bill Eason) writes:

>Enter here the Nyquist criterion which says that the sampling frequency
>(44.1 kHz really) must be >= two times the highest analog frequency being
>recorded.  Therefore, the highest frequency which can be accurately 
>reproduced from a 44.1 kHz digital recording is 22.05 kHz, which is within 
>the range of hearing of several folks posting here.  I suspect my audible
>range reaches up there, too, since I can hear department store burglar
>alarms and CRT flyback transformers.
>Bill Eason   (803) 791-6348	...bill.eason@ncrcae.Columbia.NCR.COM

Dude!!!! That is really really dumb. It is not 44.1 KHz per channel. It is
22.05KHz per channel. Have you forgotten STEREO?????????

By your own reasoning, the sampling rate should be at least 88.2 KHz which
is exactly what I posted a few messages back.

toddpw@tybalt.caltech.edu (Todd P. Whitesel) (03/21/90)

mikemc@mustang.ncr-fc.FtCollins.NCR.com (Mike McManus) writes:

[ about HDTV as a 'clean break' from NTSC ]
>  Yes, it may be
>the *BEST* thing to do (in the long run), but who's going to convince the
>*PUBLIC* that this is what they want?  No, you don't need to convince me, but
>there are several million other folks out there that you *DO* need to convince.

You're not just convincing them it's a good idea.

You're convincing them to PAY for it.

Slightly different and infinitely harder.

Todd Whitesel
toddpw @ tybalt.caltech.edu

heath@shumv1.uucp (Heath Roberts) (03/21/90)

In article <7104@ucdavis.ucdavis.edu> (Alan Waterman) writes:
>
>>Enter here the Nyquist criterion which says that the sampling frequency
>>recorded.  Therefore, the highest frequency which can be accurately 
>>reproduced from a 44.1 kHz digital recording is 22.05 kHz, which is within 
>>the range of hearing of several folks posting here.  I suspect my audible
>
>Dude!!!! That is really really dumb. It is not 44.1 KHz per channel. It is
>22.05KHz per channel. Have you forgotten STEREO?????????
>
>By your own reasoning, the sampling rate should be at least 88.2 KHz which
>is exactly what I posted a few messages back.

The sampling rate is still 44.1KHz, and you get a frequency response of
22.05KHz. For more than one channel (stereo, 4-, 8-, 16-, 32-track
studio, etc.) you add more ADC's, each of which operates at the same
sampling frequency. Yes, you are taking 44,100 samples * the number of
channels per second, but it's kind of like saying you have two radios
receiving at 100MHz so collectively they're receiving 200MHz. It just
doesn't make sense.  Everything's happening at the single-channel sampling
rate, it's just in parallel.
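
Heath's point in miniature: interleaving two channels doubles the number of
values moving per second, but each channel's own sample rate is untouched.
A toy sketch (the silent sample data is just a placeholder):

```python
fs = 44100                  # samples per second, PER CHANNEL
left  = [0.0] * fs          # one second of left-channel samples
right = [0.0] * fs          # one second of right-channel samples

# The disc stores the two channels interleaved: L R L R ...
interleaved = [s for pair in zip(left, right) for s in pair]

print(len(interleaved))     # 88200 values pass by per second...
print(len(left))            # ...but each channel still has 44100 samples,
                            # so per-channel response still tops out at 22.05 kHz
```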


Heath Roberts
NCSU Computer and Technologies Theme Program
heath@shumv1.ncsu.edu

keith@csli.Stanford.EDU (Keith Nishihara) (03/21/90)

minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:

>From article by bas+@andrew.cmu.edu (Bruce Sherwood):
>> The analogy with audio is that a CD with frequency response out to 10
>> MHz would not sound better than one with frequency response out to 20
>> KHz, because the human ear can't hear the higher frequencies.

>  Well, it probably would sound a bit better. Consider this:

>A 20KHz sample on CD looks something like this

> * * * * * * * * * *
>* * * * * * * * * *

>which is just a dumb square wave. Sure, it's at high enough of a pitch 
>that most people wouldn't be able to discriminate between it and a pure sine 

I can't take this any more!  You don't just feed the samples through
an audio amplifier and see the square wave!  You put them through a
`reconstruction filter' which reconstructs the waveform.  An ideal
reconstruction filter, with a step-function low-pass frequency
response at 20kHz, will reconstruct the 20kHz waveform as a *perfect*
sine wave.  It will also reconstruct a 19.99kHz waveform *perfectly*,
notwithstanding the fact that there is a beat between the sample
frequency and the frequency represented (the sample points `walk'
slowly along the wave shape).
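
You can check this numerically.  The sketch below (Python with numpy;
the tone frequency and window length are illustrative choices) uses
FFT zero-padding as a stand-in for an ideal reconstruction filter --
exact here because the tone fits an integer number of cycles in the
window -- and shows that a near-Nyquist tone whose stored samples
"walk" along the waveform still comes back as a perfect sine:

```python
import numpy as np

FS = 44_100                       # CD sampling rate (Hz)
N = 4_410                         # analysis window: 0.1 s, so 100 Hz bins
F = 19_900                        # tone near Nyquist (bin-aligned: 1990 cycles)

n = np.arange(N)
samples = np.cos(2 * np.pi * F * n / FS + 0.3)   # the stored 44.1 kHz samples

# Ideal band-limited interpolation onto a 4x finer grid, done exactly via
# zero-padding in the frequency domain.
M = 4 * N
X = np.fft.rfft(samples)
Y = np.zeros(M // 2 + 1, dtype=complex)
Y[: X.size] = X
fine = np.fft.irfft(Y, n=M) * (M / N)            # rescale for the longer IFFT

# The reconstruction is a *perfect* sine, even though adjacent stored
# samples walk slowly along the waveform near the Nyquist limit.
ideal = np.cos(2 * np.pi * F * np.arange(M) / (4 * FS) + 0.3)
print(np.max(np.abs(fine - ideal)) < 1e-8)       # True
```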

So if the basilar membrane in your ear responds up to 20kHz, you
_will not hear_ the difference between a signal properly reconstructed
from 44.1kHz samples and one reconstructed from 20MHz samples!  Most
adults' hearing is far below this limit in any case (15kHz is
considered good; if you have often operated heavy machinery, fired a
gun, driven a car with the window open, or listened to loud music with
headphones on, 8 to 12kHz may be more like it!)

Before someone asks what if the original were not a sine wave:
recall that complex waveforms may be considered as a summation of sine
waveforms of different amplitudes and frequencies, so in a linear
system it is valid to think only in terms of the behaviour of the
individual sine wave components.

Of course, perfect reconstruction filters are hard to build, so a
44.1kHz sample rate permits reconstruction filters to have a finite
roll-off starting at 20kHz and being essentially fully cut at
22.05kHz (the Nyquist limit for a 44.1kHz sample rate), and *still*
reproduce all frequencies up to 20kHz *perfectly*.  Now if the filter
did not cut off frequencies above 22.05kHz, some of that 20kHz signal
would appear as a 24.1kHz signal (reflected in frequency about the
Nyquist frequency).  This would be undesirable.
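
That reflection about the Nyquist frequency is easy to demonstrate (a
sketch in Python with numpy; the 10 ms window is arbitrary): a 24.1kHz
tone sampled at 44.1kHz produces sample values identical to those of a
20kHz tone, since 44,100 - 24,100 = 20,000.

```python
import numpy as np

FS = 44_100
n = np.arange(441)                          # 10 ms of samples

# A 24.1 kHz tone lies above the 22.05 kHz Nyquist frequency, so when
# sampled at 44.1 kHz it is indistinguishable from its reflection about
# Nyquist: 44,100 - 24,100 = 20,000 Hz.
tone_24k1 = np.cos(2 * np.pi * 24_100 * n / FS)
tone_20k  = np.cos(2 * np.pi * 20_000 * n / FS)

print(np.allclose(tone_24k1, tone_20k))     # True: identical sample values
```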

Oversampling (no one sells CD players that don't `oversample' any
longer, do they?) permits some of the reconstruction filtering to be
done using a digital filter.  Consider 4x resampling: each sample is
replicated four times in a row at 176.4kHz.  A digital filter with a
cut-off frequency of 20kHz can be applied.  Now when reconstructing,
the analog filter still has to be flat to 20kHz, but need not be
fully cut until 88.2kHz, the Nyquist frequency for the 4x oversampled
signal.  Since the digital filter has ensured that there will be no
frequency components in the digital signal between 20kHz and
88.2kHz, a much lower Q filter may be used, which is much easier and
cheaper to design.
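
Here's a rough numerical sketch of that scheme (Python with numpy and
scipy assumed; the 10kHz test tone and 201-tap filter are just
illustrative choices, not anything from a real player): replicate each
sample 4x to 176.4kHz, then apply a digital 20kHz low-pass, and watch
the spectral image around 44.1kHz disappear.

```python
import numpy as np
from scipy.signal import firwin, lfilter

FS, F = 44_100, 10_000                       # CD rate and a test tone
n = np.arange(441 * 20)                      # 0.2 s; 10 kHz is bin-aligned
x = np.cos(2 * np.pi * F * n / FS)           # decoded 44.1 kHz samples

# 4x oversampling by sample replication: each sample held 4 times at 176.4 kHz.
up = np.repeat(x, 4)
FS4 = 4 * FS

# Digital reconstruction filter: FIR low-pass cut off at 20 kHz.  After it,
# the spectral images around 44.1 kHz (e.g. 44.1 - 10 = 34.1 kHz) are gone,
# so the analog filter only needs to be fully cut by 88.2 kHz.
taps = firwin(201, 20_000, fs=FS4)
y = lfilter(taps, 1.0, up)

# Compare one steady-state period (1764 samples -> 100 Hz bins) before/after.
raw = np.abs(np.fft.rfft(up[2000:2000 + 1764]))
flt = np.abs(np.fft.rfft(y[2000:2000 + 1764]))
bin_tone, bin_image = 100, 341               # 10 kHz and 34.1 kHz bins

print(raw[bin_image] / raw[bin_tone] > 0.2)  # True: strong image before
print(flt[bin_image] / flt[bin_tone] < 0.01) # True: image suppressed after
```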

Now what about those 18 bit players?  CDs only have 16 bit samples,
don't they?  But if you use oversampling and digital filtering, you
can `interpolate' between the original sample quantisation levels.
But what does it buy you?  The reconstructed signal is only as good
as the original digital material.  A good advertising gimmick, in my
opinion.  (And what about the precision and linearity of those 18 bit
D-A converters?)

Neil/.		Neil%teleos.com@ai.sri.com

Note that our mail feed  via  SRI  is  currently  dead,  so  that
flames,  questions  and  assertions  that  `my hearing is good to
37.496e29 MhZ -- medically verified' (you must be an alien)  will
be thrown into the bit bucket.

turk@media-lab.media.mit.edu (Matthew Turk) (03/22/90)

In article <5478@okstate.UUCP> minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:
>
>    Well, it probably would sound a bit better. Consider this:
>
>  A 20KHz sample on CD looks something like this
>
>   * * * * * * * * * *
>  * * * * * * * * * *
>
>  which is just a dumb square wave. Sure, it's at high enough of a pitch 
>  that most people wouldn't be able to discriminate between it and a pure sine
>  of the same requency, but what happens if, say, you have a 20,001Hz waveform? 
>  Then, 20KHz just isn't enough to provide a nice, "symetric" waveform. Thus, 
>  you get a somewhat harsh sound. If I were really after a "human limits" sample, 
>  I'd bump the rate up to around 30KHz to minimize the distortion. (Assuming that 
>  a 40KHz sample is "wasteful".) 
...
>  The truth is, we CAN hear the effects BELOW the maximum frequency. 


The problem is quite a bit better understood than you are assuming.
The aliasing you describe is eliminated by prefiltering the signal
with a lowpass filter.  Also, digital signals are not reproduced as
square waves.  If the human ear were indeed insensitive to signals
above 20kHz, then an ideal system would prefilter the signal (lowpass
at 20kHz), sample at 40kHz, then reconstruct the (filtered) analog
signal exactly to be amplified and sent to your speakers.
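
That prefilter step is what kills the alias at the source.  As a
sketch (Python with numpy/scipy; the 441kHz "analog" rate, tone
choices, and 401-tap filter are all just illustrative assumptions):
sample a signal containing a 30kHz component at 44.1kHz and it folds
down to 14.1kHz, but low-pass at 20kHz first and the fold-down never
happens.

```python
import numpy as np
from scipy.signal import firwin, lfilter

FSA = 441_000                                # stand-in for the "analog" world
n = np.arange(8_820)
analog = (np.cos(2 * np.pi * 10_000 * n / FSA)    # in-band content
          + np.cos(2 * np.pi * 30_000 * n / FSA)) # ultrasonic content

# Case 1: sample at 44.1 kHz with no prefilter -- the 30 kHz tone folds
# down to 44.1 - 30 = 14.1 kHz and contaminates the audio band.
bare = analog[1_000:1_000 + 4_410:10]        # 441 samples -> 100 Hz bins

# Case 2: low-pass at 20 kHz *before* sampling, as the ideal pipeline says.
clean = lfilter(firwin(401, 20_000, fs=FSA), 1.0, analog)[1_000:1_000 + 4_410:10]

mb, mc = np.abs(np.fft.rfft(bare)), np.abs(np.fft.rfft(clean))
tone, alias = 100, 141                       # 10 kHz and 14.1 kHz bins
print(mb[alias] / mb[tone] > 0.5)            # True: alias present
print(mc[alias] / mc[tone] < 0.01)           # True: alias removed by prefilter
```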

The real issues here are: (1) a perfect low-pass filter is not
realizable, so you either have to accept some aliasing or sample at a
higher rate; (2) the human frequency response isn't an ideal low-pass
system, so there's no clear and clean cutoff point.  ~20kHz is, I
believe, the -3dB point.  Since the CD sampling rate is 44.1kHz,
there's a little room for variation -- perfect filtering would fully
represent signals < 22.05kHz.  In real systems, we can definitely
avoid any noticeable aliasing, but this reduces the frequency
response.  Anyone know how tight the filters used in digital
recording are?

	Matthew

billd@fps.com (Bill Davidson) (03/22/90)

In article <29574@amdcad.AMD.COM> phil@pepsi.AMD.COM (Phil Ngai) writes:
>Sorry, dude. VHS is far below the resolution of NTSC, yet look at
>its popularity. VHS is far below the resolution of Laserdiscs,
>yet look at their relative market share.

It's a matter of marketing.  If HDTV is a good product and it's
marketed well, it will kill NTSC.  Laserdiscs were not marketed well
the first time around.  Also, they got killed by those stupid RCA
videodiscs, which were much cheaper.  They are doing better now that
the RCA discs have died and they are concentrating on appealing to
audio/videophiles.  The prices are now down to the consumer market
and the market appears to be growing.  Also, VHS is NTSC (or PAL or
SECAM for you Europeans).  It is just not that great a recording
system.  Another thing the public doesn't like about laserdiscs is
that you can't record on them.  I know a lot of people who are
waiting for writable laserdiscs.  It's sad when I burst their bubble
and tell them that it's not likely for quite a while: the only
writable laser discs we can make right now (or expect to in the near
future) are digital, low capacity and slow, which isn't up to the
task of showing feature-length movies (digital, high capacity and
fast could do it, but I think that's quite a ways off).

>A public which shuns Laserdiscs can not be expected to go whole
>hog on HDTV. Don't assume that everyone is a rich yuppie like yourself.

They're not exactly shunning them.  They are selling pretty well in San
Diego from what I can tell.  More and more stores are selling them.
The prices are getting decently low on the low end players and we even
have a few places that rent discs (and at prices which are comparable
to VHS rentals).  I'm also not a rich yuppie.  I wish I was.  I just
spend an inordinate amount of my income on quality A/V.  I don't go out
to shows and concerts much as a result.  The yuppies around here own
condos and houses.  I don't and probably won't for quite a while.

As I've said before, it will take time for HDTV to break the market.
I suspect at least 10 years from introduction to the point where it
dominates the market (an HDTV standard that can be generally agreed
on does not exist yet; Japanese HDTV notwithstanding).

--Bill

bowers@elxsi.dfrf.nasa.gov (Al Bowers) (03/22/90)

In article <5478@okstate.UUCP> minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:

>From article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu>, by bas+@andrew.cmu.edu (Bruce Sherwood):
>> The analogy with audio is that a CD with frequency response out to 10
>> MHz would not sound better than one with frequency response out to 20
>> KHz, because the human ear can't hear the higher frequencies.

>  Well, it probably would sound a bit better. Consider this:

>A 20KHz sample on CD looks something like this


> * * * * * * * * * *
>* * * * * * * * * *

Exactly!  Or maybe I would have described it as:
_ _ _ _ _ _ _ _ _ _ _
 _ _ _ _ _ _ _ _ _ _ 

>which is just a dumb square wave. Sure, it's at high enough of a pitch 
>that most people wouldn't be able to discriminate between it and a pure sine 
>of the same requency, but what happens if, say, you have a 20,001Hz waveform? 
>Then, 20KHz just isn't enough to provide a nice, "symetric" waveform. Thus, 
>you get a somewhat harsh sound. If I were really after "human limits" sample,
>I'd bump the rate up to around 30KHz to minimize the distortion.(Assuming that
>a 40KHz sample is "wasteful".)

I realize that this is a little out of place, but as an example: in
the aircraft industry we prefer to sample at a minimum of 5 times the
maximum frequency of interest, and we usually prefer 10 times the
maximum.  Now, I realize that this is far and away more than required
for decent (or maybe even exceptional) sound, but putting the cutoff
just above the limit of average human hearing seems a little rash.
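
One reason instrumentation people like 10x (my illustration, not part
of the aircraft practice described above) is that they often look at
raw samples joined by straight lines, with no reconstruction filter.
A sketch in Python (numpy assumed) of linear-interpolation error on a
unit sine at 2.5x versus 10x the signal frequency:

```python
import numpy as np

f = 1.0                                       # test sine, arbitrary units
t_fine = np.linspace(0.0, 8.0, 8_001)         # dense "truth" grid
truth = np.sin(2 * np.pi * f * t_fine)

def linear_interp_error(fs):
    """Worst-case error of straight-line interpolation between raw samples."""
    t = np.arange(0.0, 8.0 + 1e-9, 1.0 / fs)
    return np.max(np.abs(np.interp(t_fine, t, np.sin(2 * np.pi * f * t)) - truth))

# Near-Nyquist sampling needs a reconstruction filter to look right;
# 10x oversampling makes even naive connect-the-dots plots faithful.
print(linear_interp_error(2.5) > 0.3)         # True: badly distorted
print(linear_interp_error(10.0) < 0.06)       # True: visually faithful
```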

--
Albion H. Bowers  bowers@elxsi.dfrf.nasa.gov  ames!elxsi.dfrf.nasa.gov!bowers
         NASA Ames-Dryden Flight Research Facility, Edwards, CA
                  Aerodynamics: The ONLY way to fly!

                     Live to ski, ski to live...

kassover@jupiter.crd.ge.com (David Kassover) (03/22/90)

In article <BOWERS.90Mar21114921@drynix.dfrf.nasa.gov> bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:
...
>I realize that this is a little out of place but as an example we in
>the aircraft industry prefer to sample at a minimum of 5 times the
>maximum frequency of interest and we usually prefer 10 times the max
>frequency.
...
I also work in conjunction with the aircraft industry, and have a
devil of a time convincing the airframe and powerplant types that
sampling at more than their beloved 10x maximum frequency just
introduces sampling noise into the analysis (and therefore into
anything downstream of the sampler), as well as making it hard for me
to build hardware and software that can actually sample *and* do the
required calculations that fast 8-)

Remembering way back to High School Health, the generally accepted
*nominal* range of human hearing is 20Hz-20kHz.  I don't think
there's a problem with building audio components that are "flat"
(well, flat enough: +/- 0.5dB?) out to, say, 30kHz, and economical
for most of us to buy.  If someone wants to spend more than that and
get "better" frequency response, they may.

Assuming, of course, that the input signal contains meaningful
information at those high frequencies, anyway.  (NOT whether it
should, just whether it does).  The case of an audio frequency square wave
is somewhat misleading, since the mechanical components of the
system possess enough inertia to low-pass filter the signal.
(like speaker cones, eardrums, and ossicles)

Back to the point.  The standard that is adopted should be capable
of providing a *reasonable* benefit for most people, even those say
3.5 sigma out from "normal" or "average", and furthermore should
allow those 4.0+ sigma people to add on, at additional cost to
themselves.

I don't see why there is so much agony about making the people who
want video for traditional graphics and the people who want video for
more true-to-life images put up with the same standard.  I submit
that *most* people who do both kinds of activities will not need to
convert from one to the other often, and those who do will be able to
get hardware, software, or whatever to do it.  I submit that it is
not necessary to displease everyone, nor to provide the new
technology immediately to the man on the street for only $49.95 in
1990 dollars.

Before you all flame me for wasting bandwidth:

I am reading this through comp.graphics.  The article I am
following-up was posted to no less than 3 other newsgroups, none
of which I read.  IMHO, the proposed standard for HDTV is a reasonable piece of
information for posting here.  The argument, polemics, and other
such stuff is maybe better restricted to someplace else.

johna@gold.GVG.TEK.COM (John Abt) (03/22/90)

In article <5478@okstate.UUCP> minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:
>From article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu>, by bas+@andrew.cmu.edu (Bruce Sherwood):
>> The analogy with audio is that a CD with frequency response out to 10
>> MHz would not sound better than one with frequency response out to 20
>> KHz, because the human ear can't hear the higher frequencies.
>
>  Well, it probably would sound a bit better. Consider this:
>A 20KHz sample on CD looks something like this
> * * * * * * * * * *
>* * * * * * * * * *
>which is just a dumb square wave. Sure, it's at high enough of a pitch 
>that most people wouldn't be able to discriminate between it and a pure sine 
>of the same requency, but what happens if, say, you have a 20,001Hz waveform? 
>Then, 20KHz just isn't enough to provide a nice, "symetric" waveform. Thus, 
>you get a somewhat harsh sound. 

But the 20 KHz square wave is just a "dumb" sine wave after it goes
through the reconstruction filter.  And, as any student of Fourier
will tell you, the only thing that can change the periodic shape of a
sine wave is harmonics, the first possible one for a 20 KHz waveform
occurring at 40 KHz.  Nobody can hear 40 KHz.
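
A numerical check of the Fourier point (Python with numpy; the bin
labels in kHz are my illustrative mapping, and note that for an ideal
square wave the even harmonics vanish entirely, so the first overtone
actually present is the third, at 60 KHz -- even further beyond
hearing):

```python
import numpy as np

N = 1_000                                     # 20 cycles, 50 samples each
n = np.arange(N)
phase = 2 * np.pi * 20 * (n + 0.5) / N        # half-sample offset keeps
square = np.where(np.sin(phase) > 0, 1.0, -1.0)  # samples off the edges

mag = np.abs(np.fft.rfft(square)) / (N / 2)   # normalized spectrum
k = 20                                        # fundamental bin (think 20 kHz)

print(abs(mag[k] - 4 / np.pi) < 0.01)         # True: fundamental is ~4/pi
print(mag[2 * k] < 1e-10)                     # True: no 2nd harmonic (40 kHz)
print(abs(mag[3 * k] / mag[k] - 1 / 3) < 0.01)  # True: 3rd harmonic at ~1/3
```

So a 20 KHz square wave, band-limited to anything a human can hear, is
exactly its fundamental sine.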

>...... Here's
>an analogy (and an excuse to post here): The human eye can only discern between
>a limited amount of colors, especially in small areas. The number is quite
>small (on the order of 100s). So, should we abandon 24bit color displays since
>we _shouldn't_ be able to tell the difference? 

Bad analogy, because it's not always applicable; e.g., the eye is
extremely sensitive to correlated discontinuities in an image.  The
number of different colors that are discernible when separated by a
distinct line is far greater.  As a matter of fact, 10-bit RGB makes
better pictures.


- John Abt,  Grass Valley Group