[rec.video] I don't need HDTV!

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/15/90)

  It must be really frustrating trying to come up w/ an HDTV standard:  I mean
these guys (the ones making the standard) have to come up with the LAST WORD
in TV standards.  The standard has to be something we're willing to live with 
for AT LEAST the next 50 years!  Sure, 50 years from now, the things will be
cheap to make, but now they're going to be VERY expensive.  Especially if they
make no compromises so we'll still be happy with the standard in 50 years...

  But you know all that.

  I'd be perfectly happy to settle for the television picture tubes of today!
It's the signal that's so horrendous!  I mean: if you've ever seen the
output of a laser disc player, it's awesome!  It shows what your picture tube
can do.  It's the low signal-to-noise ratio of the broadcast signals we
receive that's horrendous!

  I commend the FCC for requiring that the new format signals be backwards
compatible, but I think that the standard could easily offer more than just
backwards compatibility.  Here's what I mean:

+----+--------------+----+    I've heard of one standard which would break
|    |              |    |  your picture into at least four signals:  First
| A  |      B/C     | D  |  a standard TV picture is sent (B).  This provides 
|    |              |    |  the requisite backwards compatibility.  Then 
|    |              |    |  image C is sent.  This image is interlaced between
|    |              |    |  the lines from picture B.  Third, zones A&D are
+----+--------------+----+  sent to provide HDTV users w/ a movie-box picture.

  If a standard such as this were adopted (which I highly doubt), there
would be an EASILY EXTRACTED HD IMAGE OF THE NORMAL PICTURE.  OK, so
it's not exactly the same image (it's the lines in between the normal NTSC
lines), but it could easily be extracted.  And if images A, C, & D are sent
digitally w/ error correction, a (relatively) cheap tuner could be built
to extract the high-quality image C, convert it to analog, and display this
VIRTUALLY NOISE-FREE picture on your old TV!
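
A back-of-the-envelope sketch in Python of how a receiver might stitch those
zones together (my own illustration only -- the zone names follow the diagram
above, but the function names, array layout, and dimensions are all made up):

```python
# Zone B is the ordinary NTSC picture, zone C carries the lines that fall
# between B's lines, and zones A and D are the side panels.  Each zone is
# modeled as a list of scanlines (lists of pixel values).

def interleave(b_lines, c_lines):
    """Merge B's scanlines with C's in-between lines, B first."""
    out = []
    for b, c in zip(b_lines, c_lines):
        out.append(b)
        out.append(c)
    return out

def assemble_frame(a_lines, b_lines, c_lines, d_lines):
    """Paste side panels A and D onto each line of the doubled center."""
    center = interleave(b_lines, c_lines)
    # A and D carry one strip per *HD* line, so they pair with the center.
    return [a + mid + d for a, mid, d in zip(a_lines, center, d_lines)]

# Toy frame: 2 NTSC lines, each 4 pixels wide, 1-pixel side panels.
B = [[1, 1, 1, 1], [2, 2, 2, 2]]   # standard picture (backwards compatible)
C = [[5, 5, 5, 5], [6, 6, 6, 6]]   # in-between lines
A = [[9]] * 4                      # left panel, one pixel per HD line
D = [[8]] * 4                      # right panel

frame = assemble_frame(A, B, C, D)
```

An old set would use B alone; a cheap decoder box would use B and C; only a
full HDTV set would need all four.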

  This way, consumers would have a choice of three levels of TV quality,
depending upon what they could afford:  They could use the cheap old NTSC,
or the crisp new digital NTSC, or the dramatic full-performance HDTV!

  I personally don't think the above standard will (or even should) be 
implemented as I have described it, but I highly encourage the HDTV Standards
Committee to use a format which includes an easily tuned digital encoding
of the current NTSC signal.

--Glenn

billd@fps.com (Bill Davidson) (03/16/90)

In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
>  I'd be perfectly happy to settle for the television picture tubes of today!
>It's the signal that's so horrendous!  I mean:  If you've ever seen the
>output of a laser disc player, it's awesome!  It shows what your picture tube
>can do.  It's the low signal to noise ratio of the broadcast signals that we
>receive that's horrendous!

I disagree completely.  Resolution is sickly compared to what it could
be.  Have you ever seen a high resolution screen?  Laser disks are as
good as it gets on a regular TV, but the resolution is still pitiful.
The pixels are huge and there's no signal that can possibly fix that.
Maybe I'm spoiled by looking at 1000+ line computer screens so much
(I'd like HDTV to be even higher resolution but that would *really* be
expensive).

>  I commend the FCC for requiring that the new format signals be backwards
>compatible, but I think that the standard could easily offer more than just
>backwards compatibility.

[idea deleted --billd]

How long do we have to carry around the baggage of a standard that was
designed so long ago that it can't even get the colors right most of
the time?  Color was an add-on and the implementation suffered in order
to maintain compatibility with old black and white sets.  At some point
you have to say "enough is enough".  We can do so much better now.  We
know a lot more about video signals than we did when NTSC was designed.
Also, the frame rate is annoying.  It destroys resolution when
converting 24 frame/sec film to video due to frame mixing.  I want a
standard with at least 1000 lines and a 72Hz frame rate.  Wide screen
would be nice for films and square pixels would be nice for computer
graphics.  Why suffer with the old forever?  Saying no just because most
people won't be able to afford it is ridiculous.  Most people couldn't afford
pocket calculators when they first came out (or TVs, or cars, or most
other major new technologies).  We need to define a standard that is
good and is doable and which can be foreseen to become cheap with time.
It doesn't have to be cheap now.  It would be nice if it were an
international standard as well, so that video tapes and laserdisks will
work anywhere.  It took CDs 5-6 years to really break into the US market.
A lot of people thought they were ridiculous when they first became
available.  They were very expensive (both the players and the disks).
Now it's getting hard to find record stores that have more vinyl than
aluminum-coated plastic.  It took laser video even longer (it's back
and gaining a lot of momentum right now).  HDTV will be the same story.
We have generations of people now who grew up watching TV and they are
getting more and more demanding of quality video.

--Bill

bill@bilver.UUCP (Bill Vermillion) (03/16/90)

In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
 
>  It must be really frustrating trying to come up w/ a HDTV standard:  I mean
>these guys (the ones making the standard) have to come up with the LAST WORD
>in TV standards.  The standard has to be something we're willing to live with 
>for AT LEAST the next 50 years!  Sure, 50 years from now, the things will be
>cheap to make, but now they're going to be VERY expensive.  Especially if they
>make no compromises so we'll still be happy with the standard in 50 years...

We've already had current video standards for 50 years (though the color
portions a bit less than that) and it's time for a change.  If you consider
that the LP is virtually dead now, it only lasted 41.
 
>  I'd be perfectly happy to settle for the television picture tubes of today!
>It's the signal that's so horrendous!  I mean:  If you've ever seen the
>output of a laser disc player, it's awesome!  It shows what your picture tube
>can do.  It's the low signal to noise ratio of the broadcast signals that we
>receive that's horrendous!

The output of a laser player isn't awesome.  It's good, but you can surely see
the limits of NTSC video if you watch it on anything bigger than a 21" set.  I
have had a laser player for several years now.  The early discs are poor
compared to today's technology, but we are pretty much at the limits.  I
assume you are a recent disc convert.  S/N is only one problem.  Have you
ever noticed how the color is in little dots at the transition points, or how
the resolution isn't as good as a clean 16mm print, let alone 35?
 
>  I commend the FCC for requiring that the new format signals be backwards
>compatible, but I think that the standard could easily offer more than just
>backwards compatibility.  .....

I spent years in broadcast, and have seen many changes in the FCC, and I am
not too particularly impressed with their performance in the past few years.
They seem to be a non-regulatory regulatory agency.  Part of the time it's
hands off, other times it's hands on for the wrong reasons.  They totally blew
the AM stereo standards by refusing to take a stand.

Years ago, here in Orlando, a disk jockey "locked" himself in a radio station
control room and played Sheb Wooley's "Monkey Fever" for 24 hours straight.
His name was Mark Fowler.  He didn't do much better when he was chairman of
the FCC.


-- 
Bill Vermillion - UUCP: uunet!tarpit!bilver!bill
                      : bill@bilver.UUCP

news@haddock.ima.isc.com (overhead) (03/16/90)

In article <530@bilver.UUCP> bill@bilver.UUCP (Bill Vermillion) writes:
>If you consider that the LP is virtually dead now, it only lasted 41.

I never liked LPs.  Even an audiophile LP has pops & clicks, even
on the first play.  It gets worse with each play.  A low grade audio
cassette doesn't have such terrible distractions or sound degradation.
We could have dumped LPs long ago.  The industry kept pushing
vinyl.  They kept saying that it was "better".

A friend has a nice 25 inch monitor TV.  It was real expensive.
I saw some stuff on laser disk.  It was very impressive.  I saw
some stuff on video tape - super beta, VHS.  Less impressive, but
not generally distracting.  Cable TV had almost OK stations and
pretty bad ones.  Without cable, the distractions while watching
the show are almost as bad as commercials.

If over-the-air quality could be brought to laser disk standards
it would be an overnight success.  If it were cable-only, I'd
probably buy a better TV & spend the $30-$40 a month.  Heck, I
might even watch it.

If HDTV doesn't have some sort of correction system built in,
then it will be no better than cable as it is now.  Simply
increasing bandwidth on a noisy medium does not remove noise.

If you create un-expandable standards, they will be bad.  The
rope cut to length is too short.  Technology will outstrip our
current ideas of what is easily "as much as anyone can afford".  If
you want to build a cheap TV with the new standard, you should be
able to ignore things - like error correction, or some of the
resolution, or one of the sound channels, or the closed-caption
channel...  On transmission, you have to be able to omit stuff,
like error correction, resolution, sound channels...

Stephen.
suitti@haddock.ima.isc.com

bowers@elxsi.dfrf.nasa.gov (Al Bowers) (03/17/90)

In article <530@bilver.UUCP> bill@bilver.UUCP (Bill Vermillion) writes:

>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

...stuff deleted...

>>  I'd be perfectly happy to settle for the television picture tubes of today!
>>It's the signal that's so horrendous!  I mean:  If you've ever seen the
>>output of a laser disc player, it's awesome!  It shows what your picture tube
>>can do.  It's the low signal to noise ratio of the broadcast signals that we
>>receive that's horrendous!

>The output of a laser player isn't awesome.  It's good, but you can surely see
>the limits of NTSC video if you watch it on anything bigger than a 21" set.  I
>have had a laser player for several years now.  The early discs are poor
>compared to today's technology, but we are pretty much at the limits.  I
>assume you are a recent disc convert.  S/N is only one problem.   Have you
>ever noticed how the color is in little dots at the transition points, how the
>resolution isn't as good as a clean 16mm print, let alone 35?

Just as a point of reference, laser disc has always been superior to
tape.  The fundamental reason for this is the digital nature of LDs.
Keep in mind that good VHS resolution from a decade ago was 190
lines of horizontal resolution; improvements in signal conditioning and
processing that apply to the analog tape formats will apply just as
well to the analog portion of an LD signal once it has run through a
DAC (digital to analog converter).

Photo rag tests have shown that lenses for film are capable of up to
100 lines per millimeter resolution (the test I recall said a Leitz
50mm f/2.0 could get 102 l/mm in the center of the frame).  On the
basis of that I'd say 35mm is capable of something in the range of 20000
lines resolution across a frame from edge to edge.  16mm is probably half
of that.  The real limit in film is the film's ability to resolve,
probably in reality half of the above numbers: 10000 lines for 35mm and
5000 lines for 16mm.  So I'd say we still have an order of magnitude
to go yet.  But give me a clean signal and I can live with a lot less
resolution.  S/N is where video can really beat out film, especially
if it's digital.

I'd vote for a clean break from the current NTSC standard in the new
HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
resolution and I could be happy with that until I'm old and gray...

Just another opinion...

--
Albion H. Bowers  bowers@elxsi.dfrf.nasa.gov  ames!elxsi.dfrf.nasa.gov!bowers
         NASA Ames-Dryden Flight Research Facility, Edwards, CA
                  Aerodynamics: The ONLY way to fly!

                     Live to ski, ski to live...

bas+@andrew.cmu.edu (Bruce Sherwood) (03/17/90)

At the risk of stating the obvious:

Some of this discussion of HDTV implies that there will always be a need
for another generation of standards with even higher resolution.  That
isn't necessarily the case.  The human eye has limited resolution, and
higher resolution than that in the picture is literally useless, if you
are talking in terms of a "typical" screen size viewed from a "typical"
viewing distance.  Similarly, there must be an upper limit on useful
fidelity in color discrimination, beyond which the human eye just can't
see any improvement.

The analogy with audio is that a CD with frequency response out to 10
MHz would not sound better than one with frequency response out to 20
kHz, because the human ear can't hear the higher frequencies.

What we want in electronic products is high fidelity for both eye and
ear, but no more than that.  Unfortunately today's television and
computers are typically well below this threshold.

Bruce Sherwood

kucharsk@number6.Solbourne.COM (William Kucharski) (03/17/90)

In article <BOWERS.90Mar16105816@drynix.dfrf.nasa.gov> bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:
 >Just as a point of reference laser disc has always been a superior to
 >tape.  The fundamental reason for this is the digital nature of LDs...

Sorry, but the video portion of laser videodiscs is analog; the only portion
which may be digital on any given LD is the audio...
--
===============================================================================
| ARPA:	kucharsk@Solbourne.COM	              |	William Kucharski             |
| uucp:	...!{boulder,sun,uunet}!stan!kucharsk |	Solbourne Computer, Inc.      |
= The opinions above are mine alone and NOT those of Solbourne Computer, Inc. =

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/17/90)

bill@bilver.UUCP (Bill Vermillion) writes:

>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
> 
>>  I'd be perfectly happy to settle for the telivision picture tubes of today!
>>It's the signal that's so horrendous!  

>The output of a laser player isn't awesome.  It's good...

HA!  I knew this discussion would get people going!
  My point was not that Laser Discs are awesome.  My point IS that
when *I* watch TV (not too often, mind you) it doesn't bother me that
I can't get 6" from the screen without seeing pixels:  If I got a 6'
screen, I would just sit 3 times further away from it than from a 24" set.
  Sure, a few years after graduating from Caltech, I MIGHT be able to
afford an HDTV set.  But will it be WORTH it?  Will I want to spend
$5000 on an HDTV set, or would I rather spend $500 to plop a box on top
of my TV that decodes digital broadcasts and gives me a noise-free
picture on my lower-res monitor?  I'd go for the $500 box!

  And what of the MILLIONS of people who won't be able to afford the
HDTV sets?

  I say, there should be a middle-of-the-road solution in addition to HDTV.
There should be HDTV, but I DON'T NEED HDTV!

>I spent years in broadcast, and have seen many changes in the FCC, and I am
>not too particularly impressed with their performance in the past few years.

I can't argue w/ that!  In fact, where does the FCC claim to get the 
legal authority to regulate speech over the airwaves?  e.g. why will
a HAM who says F**K on the airwaves almost surely lose his license?  So
much for freedom of speech!  It's a form of government censorship.  I 
have nothing against CENSURE, but censorship by the gov. is WRONG.

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/17/90)

bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:

>>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

>Just as a point of reference laser disc has always been a superior to
>tape.  The fundamental reason for this is the digital nature of LDs.

Uh, well, it's not exactly digital:  It's PWM (pulse width modulation),
the ANALOG recording of discrete samples.  The recording is therefore
discrete, but I don't think it qualifies as digital.

>I'd vote for a clean break from the current NTSC standard in the new
>HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
>resolution and I could be happy with that until I'm old and gray...

>Just another opinion...

I hear you: GIVE ME.  But are you willing to PAY for it?  You sound like
those comp.sys.mac people who say the low cost Mac should have at least
a 68030 and 4 megs of RAM and 256 colors. :-P

Glenn

fff@mplex.UUCP (Fred Fierling) (03/17/90)

In article <BOWERS.90Mar16105816@drynix.dfrf.nasa.gov>, bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:
> 
> I'd vote for a clean break from the current NTSC standard in the new
> HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
> resolution and I could be happy with that until I'm old and gray...

This question comes up a lot doesn't it?  Do you scrap the old technology and
design a new, more refined and technically superior one, or do you compromise
your design to maintain backwards compatibility?

I hope the "clean break" approach wins out too.  What an engineer's nightmare
it would be to match up the side panels to the NTSC center.
-- 
Fred Fierling   uunet!van-bc!mplex!fff    Tel: 604 875-1461  Fax: 604 875-9029
Microplex Systems Ltd   265 East 1st Avenue   Vancouver, BC   V5T 1A7,  Canada

gbrown@tybalt.caltech.edu (Glenn C. Brown) (03/18/90)

fff@mplex.UUCP (Fred Fierling) writes:

>I hope the "clean break" approach wins out too.  What an engineer's nightmare
>it would be to match up the side panels to the NTSC center.

I heard that the FCC said they wouldn't approve the standard unless it's
backwards compatible...=-(

Glenn

dave@imax.com (Dave Martindale) (03/18/90)

In article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu> bas+@andrew.cmu.edu (Bruce Sherwood) writes:
>At the risk of stating the obvious:
>
>Some of this discussion of HDTV implies that there will always be a need
>for another generation of standards with even higher resolution.  That
>isn't necessarily the case.  The human eye has limited resolution, and
>higher resolution than that in the picture is literally useless, if you
>are talking in terms of a "typical" screen size viewed from a "typical"
>viewing distance.
>
>The analogy with audio is that a CD with frequency response out to 10
>MHz would not sound better than one with frequency response out to 20
>KHz, because the human ear can't hear the higher frequencies.

If you move closer to a loudspeaker, you don't need better frequency
response - your ear's limits are the same at any distance.  You can
make the same argument for colour and brightness resolution, but not
spatial resolution, in an image.  I.e. beyond a certain point, using
extra bits for brightness or colour resolution just doesn't produce
a noticeable improvement in the picture, no matter how close you get.

But the analogy is all wrong for resolution.  If you move closer to an
image, so it fills more of your field of view, you need better spatial
resolution.  And somebody will always want to sit closer than the
current standard is designed for, at least for the foreseeable future.
NTSC was designed for a viewing distance of 10 times the picture
height, HDTV for 3-4 times the picture height.  I want an image that
looks sharp from 2/3 the picture height - that gives me a 90 degree
field of view (with a 4:3 aspect ratio).
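
The geometry behind those numbers is one line of trigonometry.  A sketch of
my own (not anything out of the standards documents):

```python
import math

def fov_degrees(aspect, distance_in_heights):
    """Horizontal field of view for a screen of the given aspect ratio
    (width/height) viewed from a distance measured in picture heights."""
    half_width = aspect / 2.0                    # in picture heights
    return 2.0 * math.degrees(math.atan(half_width / distance_in_heights))

print(fov_degrees(4 / 3, 10))     # NTSC's design distance: a narrow view
print(fov_degrees(4 / 3, 2 / 3))  # 2/3 of a picture height: 90 degrees
```

At NTSC's 10-picture-height design distance the same formula gives only
about 7.6 degrees, which is why NTSC can get away with so few lines.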

What you say would be true only if there was a "typical" screen size
and a "typical" viewing distance.

thant@horus.esd.sgi.com (Thant Tessman) (03/20/90)

In article <1990Mar17.022845.9450@spectre.ccsf.caltech.edu>,
gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

> 
> HA!  I knew this discussion would get people going!
>   My point was not that Laser Discs are awesome.  My point IS that
> when *I* watch TV (not too often, mind you) It doesn't bother me that
> I can't get 6" from the sceen and still not see pixels:
[...]
> 
>   And what of the MILLIONS of people who won't be able to afford the
> HDTV sets?

The point of my original posting was that the FCC isn't even
giving people the chance to choose.

If new formats were allowed, the high fidelity nuts like me would 
be willing to support the HDTV industry until it became affordable 
for everyone, while leaving the old system in place until there was
no longer enough of a market to support it.

> 
> >I spent years in broadcast, and have seen many changes in the FCC, and I am
> >not too particularly impressed with their performance in the past few years.
> 
> I can't argue w/ that!  In fact, where does the FCC claim to get the 
> legal authority to regulate speach over the airwaves?  e.g. why will
> a HAM who says F**K on the airwaves almost surely lose his license?  So
> much for freedom of speach!  It's a form of government censorship.  I 
> have nothing against CENSURE, but censorship by the gov. is WRONG.

Wouldn't you consider regulating broadcast formats just as much 
censorship as regulating speech?

thant

bowers@elxsi.dfrf.nasa.gov (Al Bowers) (03/20/90)

In article <1990Mar17.025325.9827@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

>bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:

>>>In article <1990Mar15.090214.9871@spectre.ccsf.caltech.edu> gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:

...story of my infamous analog digital screwup deleted...

>>I'd vote for a clean break from the current NTSC standard in the new
>>HDTV standard.  Give me anywhere from 500 to 1000 lines horizontal
>>resolution and I could be happy with that until I'm old and gray...

>>Just another opinion...

>I hear you: GIVE ME.  But are you willing to PAY for it?  You sound like
>those comp.sys.mac people who say the low cost Mac should have at least
>a 68030 and 4 megs of RAM and 256 colors. :-P

Actually, Glenn, I have a real dislike for Macs, and here is why: I
refuse to use any machine whose presupposition is that I don't know
anything about what it is I am trying to do.  This prevents me from
doing some of the wonderful things a Mac is capable of, but it doesn't
bother me too much.  As to the implication that I would not be willing
to spend the money to get the performance, I'd like to point out that
if you'd read any of my previous postings you'd know that I firmly
believe that you get what you pay for.  My current camcorder is an
S-VHS-C, and next time around I will buy the same format with HiFi
as well.  My VCR is a VHS with MTS, which came out in the mid '80s (MTS
was only 12 months old when I bought it; only linear stereo, but at the
time that was all that was available).  My next VCR will be the new
JVC S-VHS(-C) machine that will accept S-VHS, VHS, S-VHS-C and VHS-C;
it will cost me a pretty penny, but it too will be worth it.

GIVE ME the option and let me decide how much I want to spend for what
features.  :-P yourself...

--
Albion H. Bowers  bowers@elxsi.dfrf.nasa.gov  ames!elxsi.dfrf.nasa.gov!bowers
         NASA Ames-Dryden Flight Research Facility, Edwards, CA
                  Aerodynamics: The ONLY way to fly!

                     Live to ski, ski to live...

sehat@iit (Sehat Sutardja) (03/20/90)

In article <1990Mar17.022845.9450@spectre.ccsf.caltech.edu>, gbrown@tybalt.caltech.edu (Glenn C. Brown) writes:
>   Sure, a few years after graduating from Caltech, I MIGHT be able to
> afford a HDTV set.  But will it be WORTH it?  Will I want to spend
> $5000 on a HDTV set, or would I rather spend $500 plop a box on the top
> of my TV that decodes digital broadcasts, and gives me a noise-free
> picture on my lower Res. monitor?  I'd go for the $500 box!
> 
Your wish might come true next year. For now, I can't say much about this.

>   And what of the MILLIONS of people who won't be able to afford the
> HDTV sets?
> 

This depends on how low the cost of a high resolution monitor would be. 
As far as the digital signal processing requirement goes, only a few million 
transistors and some DRAM chips would be needed.  You can pretty much guess
what the cost of the electronics would be when they are used in consumer
products.  For now, the major problem is to get a low cost monitor.

-- 
Sehat Sutarja,

{decwrl!sun}!imagen!iit!sehat	| Integrated Information Tech.
sutarja@janus.Berkeley.EDU	| Santa Clara, CA. (408)-727-1885

minich@a.cs.okstate.edu (MINICH ROBERT JOHN) (03/21/90)

From article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu>, by bas+@andrew.cmu.edu (Bruce Sherwood):
> The analogy with audio is that a CD with frequency response out to 10
> MHz would not sound better than one with frequency response out to 20
> KHz, because the human ear can't hear the higher frequencies.

  Well, it probably would sound a bit better. Consider this:

A 20KHz sample on CD looks something like this


 * * * * * * * * * *
* * * * * * * * * *

which is just a dumb square wave.  Sure, it's at a high enough pitch 
that most people wouldn't be able to discriminate between it and a pure sine 
of the same frequency, but what happens if, say, you have a 20,001Hz waveform? 
Then, 20KHz just isn't enough to provide a nice, "symmetric" waveform.  Thus, 
you get a somewhat harsh sound.  If I were really after a "human limits" sample, 
I'd bump the rate up to around 30KHz to minimize the distortion.  (Assuming that 
a 40KHz sample is "wasteful".) 
  Since there are indeed people sensitive enough to these at-the-limits
conditions, we shouldn't write off any increase in the sampling frequency as
wasteful just because simple math says "you can't hear anything higher than..."
The truth is, we CAN hear the effects BELOW the maximum frequency.  Here's
an analogy (and an excuse to post here): The human eye can only discern a
limited number of colors, especially in small areas.  The number is quite
small (on the order of 100s).  So, should we abandon 24bit color displays since
we _shouldn't_ be able to tell the difference?  Just because our hearing doesn't
necessarily scream "yuck" at us like our eyes do doesn't mean we should ignore
what does exist.  I'll keep my 24bit color, thank you.  And I'll also get the
SUPER-CD player that samples at a higher rate, because _I_ can tell the diff.

Robert Minich
Oklahoma State University
minich@a.cs.okstate.edu

DISCLAIMER: One who takes forgotten floppies.

keith@csli.Stanford.EDU (Keith Nishihara) (03/21/90)

minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:

>From article by bas+@andrew.cmu.edu (Bruce Sherwood):
>> The analogy with audio is that a CD with frequency response out to 10
>> MHz would not sound better than one with frequency response out to 20
>> KHz, because the human ear can't hear the higher frequencies.

>  Well, it probably would sound a bit better. Consider this:

>A 20KHz sample on CD looks something like this

> * * * * * * * * * *
>* * * * * * * * * *

>which is just a dumb square wave. Sure, it's at high enough of a pitch 
>that most people wouldn't be able to discriminate between it and a pure sine 

I can't take this any more!  You  don't  just  feed  the  samples
through  an audio amplifier and see the square wave!  You put them
through  a  `reconstruction  filter'   which   reconstructs   the
waveform.   An  ideal reconstruction filter, with a step function
low pass frequency response at 20kHz will reconstruct  the  20kHz
waveform  as  a  *perfect* sine wave.  It will also reconstruct a
19.99 kHz waveform *perfectly*,  notwithstanding  the  fact  that
there  is  a  beat between the sample frequency and the frequency
represented (the sample  points  `walk'  slowly  along  the  wave
shape).

So if the basilar membrane in your ear responds up to  20kHz  you
_will not hear_  the  difference between a properly reconstructed
signal from 44.1kHz samples and a signal reconstructed from 20MHz
samples!   Most  adults'  hearing is far below this limit, in any
case (15kHz is considered good -- if  you  have  often:  operated
heavy  machinery, fired a gun, driven a car with the window open,
or listened to loud music with headphones on, 8 to 12 kHz  may be
more like it!)

Before someone asks what if the original were not  a  sine  wave:
recall that complex waveforms may be considered as a summation of
sine waveforms of different amplitudes and frequencies, so  in  a
linear system it is valid to think only in terms of the behaviour
of the individual sine wave components.

Of course, perfect reconstruction filters are hard to build, so a
44.1 kHz sample rate permits reconstruction filters to have a
finite roll off starting at 20kHz and being essentially fully cut
at 22.05kHz (the Nyquist limit for a 44.1 kHz sample rate), and *still*
reproduce all frequencies up to 20kHz *perfectly*.  Now if the filter
did not cut off frequencies above 22.05 kHz, some of that 20 kHz
signal would appear as a 24.1 kHz signal (reflected in frequency
about the Nyquist frequency).  This would be undesirable.
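
You can check the reconstruction claim numerically.  Here is a pure-Python
sketch of my own (textbook Whittaker-Shannon interpolation; the tone and
window size are arbitrary) that rebuilds a 19.99 kHz tone from 44.1 kHz
samples:

```python
import math

FS = 44100.0     # CD sample rate (Hz)
F = 19990.0      # test tone, just below the 22.05 kHz Nyquist limit

def tone(t):
    return math.sin(2.0 * math.pi * F * t)

def sinc(x):
    return 1.0 if x == 0.0 else math.sin(math.pi * x) / (math.pi * x)

# A window of samples around t = 0 (2K+1 of them).
K = 2000
samples = {n: tone(n / FS) for n in range(-K, K + 1)}

def reconstruct(t):
    """Whittaker-Shannon interpolation: weight every sample by a sinc."""
    return sum(v * sinc(FS * t - n) for n, v in samples.items())

# Evaluate *between* sample instants, where a naive connect-the-dots
# (or square-wave) picture of the samples would be badly wrong.
worst = max(abs(reconstruct((m + 0.5) / FS) - tone((m + 0.5) / FS))
            for m in range(-5, 5))
# worst comes out to a fraction of a percent of full scale: what comes
# back is the sine wave itself, not a square wave.
```

The same sum rebuilds a 1 kHz tone equally well; only content above the
Nyquist frequency is lost.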

Oversampling (no one sells CD players that don't `oversample' any
longer, do they?) permits some of the reconstruction filtering to
be done using a digital filter.  Consider  4x  resampling:   each
sample is replicated four times in a row at 176.4 kHz.  A digital
filter with a cut off frequency of 20 kHz can  be  applied.   Now
when  reconstructing,  the  analog filter still has to be flat to
20kHz, but need not be fully cut until 88.2 kHz, the Nyquist rate
for  the 4x oversampled signal.  Since the digital filter has en-
sured that there will be no frequency components in  the  digital
signal  between  20kHz  and 88.2kHz, a much lower Q filter may be
used, which is much easier and cheaper to design.
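
A sketch of that oversampling chain in Python (my own toy windowed-sinc
design, not any real player's filter; the tone, tap count, and lengths are
all arbitrary):

```python
import math

FS = 44100.0       # original sample rate
L = 4              # oversampling factor -> 176.4 kHz
F_TONE = 1000.0    # test tone well inside the audio band

# Original samples (amplitude 1.0 for simplicity).
N = 600
x = [math.sin(2.0 * math.pi * F_TONE * n / FS) for n in range(N)]

# Step 1: replicate each sample four times in a row, as described above.
held = []
for v in x:
    held.extend([v] * L)

# Step 2: digital lowpass at 20 kHz, running at 176.4 kHz.
# Textbook windowed-sinc FIR with a Hamming window.
TAPS = 101
FC = 20000.0 / (FS * L)          # normalized cutoff (cycles per sample)
M = TAPS - 1
h = []
for k in range(TAPS):
    t = k - M / 2.0
    ideal = 2.0 * FC if t == 0 else math.sin(2.0 * math.pi * FC * t) / (math.pi * t)
    window = 0.54 - 0.46 * math.cos(2.0 * math.pi * k / M)
    h.append(ideal * window)

def fir(signal, taps):
    out = []
    for i in range(len(taps), len(signal)):
        out.append(sum(taps[k] * signal[i - k] for k in range(len(taps))))
    return out

y = fir(held, h)

# y[j] aligns with input index j+51 (center tap), and the sample-and-hold
# adds about 1.5 samples of lag at the high rate, hence j+49.5 below.
worst = max(abs(y[j] - math.sin(2.0 * math.pi * F_TONE * (j + 49.5) / (FS * L)))
            for j in range(100, 300))
```

The filtered 176.4 kHz stream is, to within a small error, the original tone
resampled at 4x; everything between 20 kHz and 88.2 kHz has been removed
digitally, so the analog filter that follows can be slow and cheap.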

Now what about those 18 bit players?  CDs only have 16 bit samples,
don't they?  But if you use oversampling and digital filtering, you
can `interpolate' between the original samples and between quantisation
steps.  But what does it buy you?  The reconstructed signal is only as
good as the original digital material.  A good advertising gimmick, in
my opinion.  (What about the precision and linearity of those 18 bit
D-A converters?)

Neil/.		Neil%teleos.com@ai.sri.com

Note that our mail feed  via  SRI  is  currently  dead,  so  that
flames,  questions  and  assertions  that  `my hearing is good to
37.496e29 MhZ -- medically verified' (you must be an alien)  will
be thrown into the bit bucket.

turk@media-lab.media.mit.edu (Matthew Turk) (03/22/90)

In article <5478@okstate.UUCP> minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:
>
>    Well, it probably would sound a bit better. Consider this:
>
>  A 20KHz sample on CD looks something like this
>
>   * * * * * * * * * *
>  * * * * * * * * * *
>
>  which is just a dumb square wave. Sure, it's at high enough of a pitch 
>  that most people wouldn't be able to discriminate between it and a pure sine
>  of the same requency, but what happens if, say, you have a 20,001Hz waveform? 
>  Then, 20KHz just isn't enough to provide a nice, "symetric" waveform. Thus, 
>  you get a somewhat harsh sound. If I were really after a "human limits" sample, 
>  I'd bump the rate up to around 30KHz to minimize the distortion. (Assuming that 
>  a 40KHz sample is "wasteful".) 
...
>  The truth is, we CAN hear the effects BELOW the maximum frequency. 


The problem is quite a bit better understood than you are assuming.
The aliasing you describe is eliminated by prefiltering the signal
with a lowpass filter.  Also, digital signals are not reproduced as
square waves.  If the human ear were indeed insensitive to signals
above 20kHz, then an ideal system would prefilter the signal (lowpass
at 20kHz), sample at 40kHz, then reconstruct the (filtered) analog
signal exactly to be amplified and sent to your speakers.

The real issues here are: (1) a perfect low-pass filter is not
realizable, so you either have to accept some aliasing or filter at a
higher rate; (2) the human frequency response isn't an ideal low-pass
system, so there's no clear and clean cutoff point.  ~20kHz is, I
believe, the -3dB point.  Since the CD sampling rate is 44.1kHz,
there's a little room for variation -- perfect filtering would fully
represent signals < 22.05kHz.  In real systems, we can definitely
avoid any noticeable aliasing, but this reduces the frequency response.
Anyone know how tight the filters used in digital recording are?
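For what it's worth, the folding that the prefilter prevents can be demonstrated with a toy numerical experiment. Every value here (the 25 kHz input tone, the block size) is chosen purely for illustration:

```python
import numpy as np

fs = 44100.0
f_in = 25000.0     # above fs/2 = 22050 Hz: should have been filtered out
n = 4096

# sample the tone WITHOUT an anti-aliasing prefilter
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_in * t)

# the tone folds back to fs - f_in = 19100 Hz, inside the audio band
spec = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, d=1 / fs)
peak = freqs[np.argmax(spec)]
print(abs(peak - (fs - f_in)) < 25)   # alias lands near 19.1 kHz
```

Once that 19.1 kHz ghost is in the samples, no amount of downstream processing can tell it apart from a real 19.1 kHz tone, which is why the filtering has to happen before the sampler.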

	Matthew

bowers@elxsi.dfrf.nasa.gov (Al Bowers) (03/22/90)

In article <5478@okstate.UUCP> minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:

>From article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu>, by bas+@andrew.cmu.edu (Bruce Sherwood):
>> The analogy with audio is that a CD with frequency response out to 10
>> MHz would not sound better than one with frequency response out to 20
>> KHz, because the human ear can't hear the higher frequencies.

>  Well, it probably would sound a bit better. Consider this:

>A 20KHz sample on CD looks something like this


> * * * * * * * * * *
>* * * * * * * * * *

Exactly!  Or maybe I would have described it as:
_ _ _ _ _ _ _ _ _ _ _
 _ _ _ _ _ _ _ _ _ _ 

>which is just a dumb square wave. Sure, it's at a high enough pitch 
>that most people wouldn't be able to discriminate between it and a pure sine 
>of the same frequency, but what happens if, say, you have a 20,001Hz waveform? 
>Then, 20KHz just isn't enough to provide a nice, "symmetric" waveform. Thus, 
>you get a somewhat harsh sound. If I were really after a "human limits" sample,
>I'd bump the rate up to around 30KHz to minimize the distortion. (Assuming that
>a 40KHz sample is "wasteful".)

I realize that this is a little out of place but as an example we in
the aircraft industry prefer to sample at a minimum of 5 times the
maximum frequency of interest and we usually prefer 10 times the max
frequency.  Now I realize that this is far and away more than is
required for decent sound (or maybe even exceptional sound), but
having the cutoff just above the limit of average human hearing
seems a little rash.

--
Albion H. Bowers  bowers@elxsi.dfrf.nasa.gov  ames!elxsi.dfrf.nasa.gov!bowers
         NASA Ames-Dryden Flight Research Facility, Edwards, CA
                  Aerodynamics: The ONLY way to fly!

                     Live to ski, ski to live...

kassover@jupiter.crd.ge.com (David Kassover) (03/22/90)

In article <BOWERS.90Mar21114921@drynix.dfrf.nasa.gov> bowers@elxsi.dfrf.nasa.gov (Al Bowers) writes:
...
>I realize that this is a little out of place but as an example we in
>the aircraft industry prefer to sample at a minimum of 5 times the
>maximum frequency of interest and we usually prefer 10 times the max
>frequency.
...
I also work in conjunction with the aircraft industry, and have a
devil of a time convincing the airframe and powerplant types that
sampling at more than their beloved 10x maximum frequency just
introduces sampling noise into the analysis (and therefore
anything downstream of the sampler)  (As well as making it hard
for me to build hardware and software that can actually sample
*and* do the required calculations that fast 8-) )

Remembering way back to High School Health, the generally
accepted *nominal* range of human hearing is 20Hz-20kHz.  I don't
think there's a problem with building audio components that are
"flat" (well, flat enough: +- 0.5 dB?) out to, say, 30kHz.  And
economic for most of us to buy.  If someone wants to spend more
than that and get "better"  frequency response, they may.

Assuming, of course, that the input signal contains meaningful
information at those high frequencies, anyway.  (NOT whether it
should, just whether it does).  The case of an audio frequency square wave
is somewhat misleading, since the mechanical components of the
system possess enough inertia to low-pass filter the signal.
(like speaker cones, eardrums, and ossicles)

Back to the point.  The standard that is adopted should be
capable of providing a *reasonable* benefit for most people,
even fairly unusual ones (say, 3.5 sigma out from "normal" or
"average"), and furthermore should allow the 4.0+ sigma people
to add on, at additional cost to themselves.

I don't see why there is so much agony about forcing the people
who want video for traditional graphics and the people who want
video for more true-to-life images to put up with the same
standards.  I submit that *most* people who do both kinds of
activities will not need to convert from one to the other often,
and those who do will be able to get hardware, software, or
whatever to do it.  I submit that it is not necessary to
displease everyone, nor to provide the new technology immediately
to the man on the street for only $49.95 in 1990 dollars.

Before you all flame me for wasting bandwidth:

I am reading this through comp.graphics.  The article I am
following up was posted to no fewer than 3 other newsgroups, none
of which I read.  IMHO, the proposed standard for HDTV is a
reasonable piece of information for posting here.  The argument,
polemics, and other such stuff is perhaps better restricted to
someplace else.

johna@gold.GVG.TEK.COM (John Abt) (03/22/90)

In article <5478@okstate.UUCP> minich@a.cs.okstate.edu (MINICH ROBERT JOHN) writes:
>From article <sa0KhqO00Uh7M2R25C@andrew.cmu.edu>, by bas+@andrew.cmu.edu (Bruce Sherwood):
>> The analogy with audio is that a CD with frequency response out to 10
>> MHz would not sound better than one with frequency response out to 20
>> KHz, because the human ear can't hear the higher frequencies.
>
>  Well, it probably would sound a bit better. Consider this:
>A 20KHz sample on CD looks something like this
> * * * * * * * * * *
>* * * * * * * * * *
>which is just a dumb square wave. Sure, it's at a high enough pitch 
>that most people wouldn't be able to discriminate between it and a pure sine 
>of the same frequency, but what happens if, say, you have a 20,001Hz waveform? 
>Then, 20KHz just isn't enough to provide a nice, "symmetric" waveform. Thus, 
>you get a somewhat harsh sound. 

But the 20 KHz square wave is just a "dumb" sine wave after it goes through
the reconstruction filter.  And, as any student of Fourier will tell you,
the only things that can change the periodic shape of a sine wave are
harmonics, the first of which for a 20 KHz waveform occurs at 40 KHz.
(A square wave actually contains only odd harmonics, so its first one is
all the way up at 60 KHz.)  Nobody can hear 40 KHz.
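This Fourier argument is easy to check numerically. The sketch below is an idealisation for illustration only: the fine 1 MHz time grid and the brick-wall filter are stand-ins for a real reconstruction filter, not a model of one:

```python
import numpy as np

fs = 1_000_000        # fine 1 MHz simulation grid
f0 = 20_000           # 20 kHz square wave
n = 10_000            # 10 ms: exactly 200 full periods
t = np.arange(n) / fs
# half-sample phase offset keeps samples off the zero crossings
square = np.where(np.sin(2 * np.pi * f0 * t + np.pi / 50) >= 0, 1.0, -1.0)

# ideal brick-wall lowpass: discard everything above 40 kHz
spec = np.fft.rfft(square)
freqs = np.fft.rfftfreq(n, d=1 / fs)
spec[freqs > 40_000] = 0.0
filtered = np.fft.irfft(spec, n)

# a square wave has only odd harmonics (20, 60, 100 kHz, ...),
# so the filter leaves just the fundamental: a pure sine whose
# peak is the Fourier-series amplitude 4/pi, not the flat top 1.0
print(abs(np.max(filtered) - 4 / np.pi) < 0.01)
```

The filtered waveform is indistinguishable from a sine of amplitude 4/pi, which is the whole point: once every harmonic lies above audibility, the "dumb square wave" and a sine are the same signal to the ear.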

>...... Here's
>an analogy (and an excuse to post here): The human eye can only discern between
>a limited amount of colors, especially in small areas. The number is quite
>small (on the order of 100s). So, should we abandon 24bit color displays since
>we _shouldn't_ be able to tell the difference? 

Bad analogy, because it's not always applicable; e.g., the eye is extremely
sensitive to correlated discontinuities in an image.  The number of different
colors that are discernible when separated by a distinct line is far greater.
As a matter of fact, 10-bit RGB makes better pictures.
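The advantage of 10 bits shows up most clearly on smooth gradients, where quantisation produces exactly the correlated discontinuities mentioned above. A rough back-of-envelope sketch (a toy model with a made-up 1920-pixel ramp, not colorimetry):

```python
import numpy as np

# a smooth horizontal luminance ramp, 0..1 across 1920 pixels
ramp = np.linspace(0.0, 1.0, 1920)

def quantise(x, bits):
    """Round to the nearest of 2**bits levels (toy model)."""
    levels = 2 ** bits - 1
    return np.round(x * levels) / levels

# count the distinct flat "bands" the eye can lock onto
bands_8 = len(np.unique(quantise(ramp, 8)))
bands_10 = len(np.unique(quantise(ramp, 10)))
print(bands_8, bands_10)   # 256 vs 1024: 10-bit steps are 4x finer
```

Each band boundary is a long, perfectly correlated edge, so even though the step size is far below the side-by-side color discrimination threshold, the banding in the 8-bit ramp is visible and the 4x smaller 10-bit steps help.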


- John Abt,  Grass Valley Group