dave@onfcanim.UUCP (01/08/87)
This article is a reply to a question in rec.video, but it may be of interest to computer graphics people also, thus the double posting.

The information that follows comes from "Principles of Color Television", Knox McIlwain and Charles E. Dean, John Wiley and Sons, 1956. It is essentially a book about the process that produced the NTSC colour standard. However, the library containing the book is now 400 miles away, so there could be an error or two in what follows.

When the NTSC standard was being discussed, a number of things were already "cast in stone". There was already a black&white standard, and a large number of TV sets in consumer hands. The new colour standard was to be compatible with the black&white one, in the sense that B&W broadcasts would appear in B&W on colour sets, and colour transmissions would produce good-quality B&W pictures on B&W TV sets. The B&W standard specified a 60 Hz vertical frequency and a 15750 Hz horizontal frequency, giving 525 lines in two interlaced fields. The spacing of TV channels was 6 MHz, and the sound carrier was located 4.5 MHz above the picture carrier.

I won't discuss the reasoning behind the way that the colour signals were encoded into a luminance signal and two colour difference signals of different bandwidths. Here, it is enough to know that the two colour difference signals were to be transmitted as quadrature amplitude modulation of a colour subcarrier, which is added to the luminance and sync signals to produce a composite video signal.

Since the colour information occupies part of the frequency spectrum that is also taken up by fine detail in the B&W (luminance) signal, it is necessary to minimize the interference between them. It is an observed fact that for most TV images containing typical patterns of objects, the detail in the luminance signal is not spread evenly across the frequency spectrum, but occurs mostly at multiples of the horizontal scanning frequency.
In other words, most of the luminance information is found at frequencies that are integer multiples of the horizontal frequency. (To make things easier, let's call the horizontal frequency Fh.) For the same reasons, most of the energy in the colour component of the signal will be found at the subcarrier frequency (Fsc), plus integral multiples of Fh above and below it.

To separate the luminance and chrominance information as much as possible, we can make sure that the subcarrier frequency Fsc is an odd multiple of half Fh. Thus, for some integer k, Fsc = (2*k+1) * Fh / 2. If we do this, most of the energy of the luminance component will be found at frequencies of (2*N) * Fh/2, and most of the energy of the chrominance component will be at frequencies of (2*N+1) * Fh/2. The luminance and chrominance signals are "interleaved" in frequency space.

In general, this means that fine detail in the luminance signal, even though it generates frequencies near the colour subcarrier frequency, is not mistaken for colour information by the TV receiver. Only unusual images containing fine diagonal stripes, fine herringbone patterns, and other sorts of fine detail that is neither horizontal nor vertical may be misinterpreted. This is, in fact, why the suits that Johnny Carson is famous for wearing on occasion cause such bizarre effects - the fine black&white detail in the suit is being mistaken for colour information by the receiver. (Note: it is possible to avoid this problem by using a notch filter on the luminance signal ahead of the NTSC encoder - it just throws away the fine detail that might cause problems.)

When a colour signal is displayed on a B&W TV, the colour information shows up as bogus fine detail in the picture - the higher the saturation of the colour, the greater the amplitude of the false detail.
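The interleaving can be checked with a few lines of arithmetic. This is a sketch of my own, not from the book; it uses the B&W line rate and an arbitrary odd-multiple index just for illustration:

```python
# Illustrative sketch: verify that choosing Fsc = (2k+1) * Fh/2 places
# chroma energy exactly midway between the luma harmonics at N*Fh.
FH = 15750.0                 # B&W horizontal scan frequency, Hz
K = 227                      # illustrative index; (2*227+1) = 455
FSC = (2 * K + 1) * FH / 2   # subcarrier at an odd multiple of Fh/2

# Luminance energy clusters at N*Fh; chrominance at Fsc + n*Fh.
chroma_components = [FSC + n * FH for n in range(-5, 6)]

for c in chroma_components:
    nearest_luma = round(c / FH) * FH
    # Each chroma component sits exactly Fh/2 from the nearest luma harmonic.
    assert abs(abs(c - nearest_luma) - FH / 2) < 1e-6
print("every chroma component is", FH / 2, "Hz from the nearest luma harmonic")
```

With K = 227 the sketch happens to land on the 455*Fh/2 subcarrier discussed below, but the midpoint property holds for any integer k.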
By selecting the colour subcarrier to be an odd multiple of half the line frequency, you guarantee that any visible pattern in the image inverts polarity from one line to the next. Thus, highly coloured areas show up as being covered by an extremely fine "checkerboard" pattern. This checkerboard is far less visible than the vertical stripes you would get if Fsc were an even multiple of Fh/2. Since Fsc is locked to Fh in phase, the checkerboard pattern is perfectly stationary, which is far less visible than the drifting pattern that would result if Fsc were not locked to Fh. At normal viewing distance, the checkerboard just looks like uniform grey.

All of this so far has just explained why Fsc must be some odd multiple of Fh/2, and must be phase-locked to it (in practice, they are generated by dividing down the same oscillator). Now we have to pick Fsc itself.

Tests with viewers had shown that, given the available video bandwidth of 4.2 MHz (which could not be changed and still remain compatible with B&W), the ideal Fsc is somewhere around 3.6 MHz. It needs to be kept as high as possible to reduce the visibility of the "checkerboard" pattern displayed on B&W sets - the higher the frequency, the finer the pattern, and also the lower its amplitude due to the generally poor high-frequency response of consumer TVs. However, Fsc must be kept low enough that there is enough space between it and the 4.2 MHz cutoff to allow at least one of the colour difference signals to have a full upper sideband. With Fsc at 3.6 MHz, there is 0.6 MHz for the sideband, which allows the Q colour channel a bandwidth of 0.5 MHz - still pretty minimal.

So, we need an odd number J such that (J * 15750/2) = 3600000, approximately. The nearest odd integer is 457. However, all of these frequencies are going to be generated by dividing down a high-frequency oscillator, and 457 is a prime number.
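The search for J can be sketched in a couple of lines (again my own illustration, not from the book):

```python
# Hypothetical sketch of the selection described above: find the odd
# integer J whose J * Fh/2 lands closest to the ~3.6 MHz target.
FH = 15750.0                  # B&W line frequency, Hz
TARGET = 3.6e6                # desired subcarrier neighbourhood, Hz

odd_candidates = range(441, 473, 2)           # odd J values near the target
best_j = min(odd_candidates, key=lambda j: abs(j * FH / 2 - TARGET))
print(best_j, best_j * FH / 2)                # 457 -> 3,598,875 Hz

# 457 is prime, so (as explained next) the committee fell back to
# 455 = 5 * 7 * 13, which can be built from modulo-5, -7 and -13 dividers:
print(455 * FH / 2)                           # 3,583,125 Hz
```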
There was no cheap digital logic available to implement a modulo-457 divider, so picking 457 would make life difficult for the engineers, and they were the ones designing this standard. Now, 455 factors as 5*7*13, and modulo-13 dividers were possible with available analog techniques. So 455 was chosen. At this point, Fsc is 455 * 15750/2 = 3.583125 MHz.

However, there is yet another frequency relationship involved. Many TV sets use some of the same RF/IF circuitry for both sound and video, so there is going to be some interaction between the sound and colour subcarriers, producing a beat frequency at low amplitude. To minimize the visibility of this spurious signal, its frequency should also be an odd multiple of Fh/2. The sound carrier (Fs) should also be near 4.5 MHz, within the tolerances allowed by the B&W standard. So, Fs - Fsc = (4.5 - 3.583125) MHz = 916,875 Hz, which should be approximately equal to (L * 15750/2) for some odd integer L. The closest value is L = 117. So, now we have Fsc = 455*Fh/2 and Fs = (455+117)*Fh/2 = 572*Fh/2. With Fh = 15750, this gives an actual Fs of 4.504500 MHz, just 0.1% too high.

Instead of leaving well enough alone, the NTSC decided to tweak all of the frequencies downward to put the sound carrier back as close as possible to its nominal 4.5 MHz value. So, they calculated Fsc as 455/572 * Fs. This gives Fsc = 3,579,545.454545... Then they decided that they didn't like the repeating decimal, and so defined Fsc as 3,579,545 Hz *exactly*. All of the other frequencies are then defined by their relationship with Fsc:

	Fs = Fsc*572/455 = 4,499,999.43 Hz
	Fh = Fsc*2/455   = 15,734.264 Hz
	Fv = Fh*2/525    = 59.94 Hz

The new standards for Fh and Fv are 0.1% lower than the B&W standards. However, the tolerance on Fsc is +- 10 Hz, or about 3 ppm. Since Fh and Fv are obtained from Fsc, they now also have tolerances of 3 ppm, much tighter than the old standard.
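These final values can be reproduced with exact rational arithmetic. A short sketch of my own, using the ratios given above:

```python
# Sketch: derive the NTSC frequencies from the exactly-defined subcarrier,
# reproducing the values quoted above.
from fractions import Fraction

FSC = Fraction(3_579_545)            # colour subcarrier, Hz (exact by definition)

fs = FSC * Fraction(572, 455)        # sound carrier offset: Fs = Fsc * 572/455
fh = FSC * Fraction(2, 455)          # line frequency:       Fh = Fsc * 2/455
fv = fh * Fraction(2, 525)           # field rate:           Fv = Fh * 2/525

print(f"Fs = {float(fs):,.2f} Hz")   # ~4,499,999.43 Hz
print(f"Fh = {float(fh):,.3f} Hz")   # ~15,734.264 Hz
print(f"Fv = {float(fv):.2f} Hz")    # ~59.94 Hz
```

Using exact fractions rather than floats shows that Fh and Fv are now repeating decimals; "59.94 Hz" is itself only an approximation of 60 * 1000/1001.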
The new frequencies plus their tolerances fit within the tolerances of the old frequencies, so the standard is still safe.

Also, in the early days of B&W, the station's master oscillator was sometimes locked to the 60 Hz power line frequency, so that any hum bars on receivers would be stationary rather than crawling up or down the screen. With colour, this was no longer possible, since all of the frequencies must be crystal-controlled to 3 ppm tolerance. But locking to the local power line wasn't possible if you were broadcasting a network program anyway, so stations generally weren't dependent on this.

So there's the story of where the NTSC frequencies come from.

All of this brings up my favourite way of calibrating a frequency counter or similar time-measuring device: pick up a commercial broadcast TV station, and feed the demodulated video to a video sync generator that will genlock to an external signal. Measure the subcarrier out of the sync generator. (Or, just sample the subcarrier oscillator in your colour TV while receiving a network station.) The signal you get is 3,579,545 +- 10 Hz, virtually guaranteed. Check several networks to be sure, and adjust to minimize the average error if you like. (Avoid the local cable company's own channels - they may not be as careful as broadcasters are required to be.)

	Dave Martindale
dmc@videovax.Tek.COM (Donald M. Craig) (01/14/87)
I'm not sure how much people want to know about the origins of NTSC, but there was a time when it didn't look like there would be a compatible color system at all. The following information is excerpted from a paper by Walt Bundy, Chairman of the Specialist Group on NTSC Specifications of the Advanced Television Systems Committee.

Black and White TV and NTSC-1

The first National Television System Committee met during 1940 and 1941. The group's work resulted in the submission of "transmission standards for commercial television broadcasting" to the Federal Communications Commission (FCC) at a hearing on March 20, 1941. The FCC made the NTSC-1 recommendations the standards for "monochromatic transmission systems" in May 1941. Commercial television broadcasting began in the United States on July 1, 1941.

Color TV and FCC

During 1950, the FCC, after eight months of hearings and about 10,000 pages of testimony, selected the incompatible field sequential color TV system. The Commission's decision was challenged in the courts, with the US Supreme Court upholding the validity of the FCC decision on May 28, 1951. The challenges to the field sequential system at both the Commission and the court were based primarily on the fact that it was incompatible with the then-existing black and white tv sets. The Commission apparently felt that the talk about compatible color systems was just a way of delaying Commission action. On June 11, 1951, the FCC in Public Notice 656008 stated that the field sequential color tv system would stand until someone could come up with a better (but not necessarily compatible) system.

NTSC-2

The second National Television System Committee (NTSC-2) was started during 1950 by the Radio Television Manufacturers Association (RTMA), in hopes that the committee would be as effective as the committee of 1940-1941.
This group (often referred to as the RTMA group) did make comments to the FCC during the 1949-1950 color hearings, suggesting a series of tests for color systems. The NTSC/RTMA committee also set up an Ad Hoc group to study the then state-of-the-art. From this Ad Hoc group came the broad ideas of using the 1941 B&W standard for the brightness information and adding color (painting) information using a subcarrier.

With the release of the FCC Notice on June 11, 1951, the Chairman of the committee recommended that NTSC-2/RTMA be reorganized as a group to achieve the optimum standard for commercial color television. This reorganized NTSC-2 met 26 times between June 18, 1951 and September 1, 1953. The committee's only real goal was a compatible color tv system. NTSC-2 had a main committee of about 40 members with 10 panels or working groups. The writing of the present US color tv system came from the work of panel 13, which did the video, and panel 14, which did the synchronizing pulses. (Panel numbering started at 11.)

Sync and Burst

There were many suggestions to panel 14 on where to put the synchronizing burst, including: tip of sync, after sync with pedestal, after sync without pedestal, everywhere. Panel 14 liked tip of sync until the broadcasters explained that the operation of tv transmitters is not linear in the sync region. The next choice, after sync with a pedestal, was recommended for the first field test. The burst frequency was 3.898125 MHz. The burst after sync with pedestal caused problems for the then state-of-the-art B&W tv sets. The revised specifications for the field test called for burst after sync without pedestal and a burst frequency of 3.579545 MHz.
Operating with burst after sync without pedestal caused "brightening of the retrace", and the report of panel 15's Sub-Committee on Test Procedures (4-21-1952) called for the following as necessary for compatibility: "..., the signal is compatible if, (1) the horizontal sync pulse is widened to 8%, and (2) the burst pedestal is removed".

Color Video

To eliminate "brightening of the retrace", panel 13 then recommended a 'setup' of seven and a half percent, which would raise the black level of the picture above the burst and blanking levels. Setup was already in use in 1953 in the B&W standard, because any video that went near or below the blanking level would cause problems for some then state-of-the-art tv sets. (The elimination of setup is currently under discussion in various standards committees, here in 1987.)

Of the 27 B&W sets used for subcarrier compatibility testing at RCA labs (October 23, 1952), 5 sets had separate-carrier (IF) sound and 13 of the sets (all inter-carrier) used an IF frequency of 20 or 21 MHz. When using the 3.898125 MHz subcarrier, the separate-sound sets and some intercarrier sets had a beat signal of about 602 KHz in the video (4.5 MHz sound - 3.898125 MHz color = 601.875 KHz). When using the 3.58 MHz subcarrier, the beat frequency became 920 KHz, which made a finer pattern. The early color signal specifications had "Color Phase Alternation (CPA)", which was later dropped because of flicker problems.

NTSC and FCC

On February 2, 1953, NTSC-2 approved for publication a recommendation for transmission standards for color television. At the meeting on June 24, 1953, the full committee was aware that RCA was planning to jump the gun and file a petition for rule making. The next day, June 25, 1953, RCA and NBC did file a petition for rule making. The RCA-NBC color television system was in fact identical to the NTSC-2 approved recommendation. On July 8, 1953, Rosel Hyde, Chairman of the FCC, wrote to the NTSC, asking for field testing information.
At the full NTSC committee meeting of July 21, 1953, the NTSC petition was ready and a motion was passed sending the petition to the FCC. The NTSC petition was filed with the FCC on July 23, 1953.

Final Report

The FCC issued a Notice of Rule Making (Docket 10637, Amendment of the Commission's Rules Governing Color Television Transmissions) on August 7, 1953. NTSC-2 held its final meeting on September 1, 1953, and issued a final report.

From the RCA-NBC petition: "4. The color standards proposed in this Petition are technical signal specifications approved February 2, 1953, by outstanding engineers and scientists of the radio and television industry, including members of Petitioner's staff, through the National Television System Committee (NTSC). Petitioners know of no responsible engineer or scientist in the radio or television field who proposes adoption of any other color standards."

Well, as to the last part... Dana Griffin of Communication Measurements Labs and Paul Raibourn of station KTLA were in there to the last, fighting for a line/field sequential color system. The NTSC's answer was that line and field sequential color systems were fixed systems, whereas the NTSC specification could grow. As an example, from a report of A. V. Loughren, Chairman, Ad Hoc Subcommittee of NTSC, 8-12-53: "The committee concludes that a set of standards proposed for television broadcasting should have the expectation of long-term utility, thus satisfying the optimists of 1953 and presumably also the pessimists of 1973; in brief, the basic consideration is `Don't sell the future short'".

-- 
Don Craig	dmc@videovax.Tek.COM
Tektronix Television Systems	... tektronix!videovax!dmc