Sorry, I was unclear. Yes: the sync height should be 0.3 V and the video part 0.7 V, giving a 1 V p-p signal. So if the signal has lost amplitude in transmission and the sync height is only 0.2 V, the receiver's automatic gain control (AGC) can apply gain until the sync is back to 0.3 V; that lifts the gain of the whole signal, so the picture isn't gloomy.

Equally, once the sync is the right height, the colour burst - the reference phase for the colour decoder - should be the same height, and again an AGC can adjust the gain at the colour-burst frequency to achieve that. Otherwise the colours would be washed out (not enough colour burst) or glaring (too much colour burst).
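To make the two corrections concrete, here's a minimal sketch in Python (the function names, the blanking-at-0 V convention, and the crude peak-based measurements are my own for illustration; a real decoder would gate the sync and burst intervals properly and band-pass the burst rather than just taking peaks):

    import numpy as np

    def sync_agc(line, blanking=0.0, sync_target=0.3):
        # Measure the sync depth (blanking level down to the sync tip)
        # and scale the whole line so it comes out at the nominal 0.3 V.
        sync_depth = blanking - line.min()
        if sync_depth <= 0:
            return line                      # no sync visible; leave the line alone
        gain = sync_target / sync_depth      # e.g. 0.3 / 0.2 = 1.5x
        return blanking + (line - blanking) * gain   # lifts sync and picture together

    def burst_gain(line, burst_samples, burst_target=0.3):
        # Measure the colour burst's peak-to-peak swing on the back porch
        # and return the gain the chroma path should apply so the burst
        # comes out the same height as the sync.
        burst = line[burst_samples]
        measured = burst.max() - burst.min()
        if measured <= 0:
            return 1.0                       # no burst; don't touch the chroma gain
        return burst_target / measured

The first correction multiplies the whole composite signal; the second only scales the chroma path, which is why the saturation (washed out vs. glaring) can be fixed independently of the overall level.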
Broadcast engineers tend to be lazy: they say 'size' when they mean 'amplitude', because they're usually looking at lines on a waveform monitor (which is just a fancy oscilloscope characterised for viewing video signals).
But to be clear: I'm conflating two concepts here. What's used for broadcast video has little bearing on a computer monitor, in particular because computer signals were (except on 1980s home computers) never sent as composite video (i.e. including a colour burst). Almost all (prior to the HDMI era) use analogue RGB signals with separate syncs, and the RGB signals can't be gain-controlled automatically because there's no reference sent with them.
Neil