Author Topic: Is it possible to tweak the time base error corrector of a CRT TV?  (Read 504 times)
Binarix128
Member
*****
Offline

Gender: Male
220V AC 50Hz; 480i59.94 NTSC


GoL UCOUT2noI2R__jgPSJUjGRtA
Is it possible to tweak the time base error corrector of a CRT TV? « on: May 09, 2021, 12:16:36 AM » Author: Binarix128
I wonder if it's possible to decrease the tolerance of a CRT TV to time base errors in a video input, like the CRT monitors that were used in broadcast. I want that so I can see how screwed up the video output from a mechanically accessed medium like a VCR or a Video8/Hi8 handycam really is.

LED will never beat other lamps. :inc: :hps: :mv:

Medved
Member
*****
Offline

Gender: Male

Re: Is it possible to tweak the time base error corrector of a CRT TV? « Reply #1 on: May 09, 2021, 01:11:06 AM » Author: Medved
Quote from: Binarix128 on May 09, 2021, 12:16:36 AM
I wonder if it's possible to decrease the tolerance of a CRT TV to time base errors in a video input, like the CRT monitors that were used in broadcast. I want that so I can see how screwed up the video output from a mechanically accessed medium like a VCR or a Video8/Hi8 handycam really is.

If you mean the lock-in range of the synchronization, it is usually not possible, at least without losing the ability to sync at all.
The wide PLL lock-in range is primarily there to avoid the need for accurate components in the TV scan timing circuits, because those component tolerances are the main "consumer" of the tolerance band.
Some old vacuum tube TVs may have had a narrower sync range (that is why they needed user-accessible controls to adjust the free-running frequencies), but the more modern TVs all relied on a wide lock-in range.

Some analog CRT TVs did use crystal-controlled time bases (TVs with one single big video processing chip, a single crystal and otherwise very few components around it), but those have their parameters really fixed and cannot be tuned at all (except maybe by changing the crystal frequency).

But there is another aspect: even though the transport is mechanical, the tape speed and head drum tend to be locked together by the various tracking schemes, and all of that tends to be locked to an accurate crystal reference, so it is self-adjusting and the signal output is time-wise very accurate (assuming the tape isn't slipping, so the motion is smooth). Without locking the head spin to the tape advance, picture playback would not work at all. And once those lock-ins are there, locking everything to a crystal costs nothing, gets rid of all adjustments and avoids the need for the parts to be so precise.

No more selfballasted c***

Binarix128
Member
*****
Offline

Gender: Male
220V AC 50Hz; 480i59.94 NTSC


GoL UCOUT2noI2R__jgPSJUjGRtA
Re: Is it possible to tweak the time base error corrector of a CRT TV? « Reply #2 on: May 09, 2021, 02:00:53 PM » Author: Binarix128
I still want the TV to lock vertically; I just don't want it to lock horizontally, so the TV doesn't compensate for the horizontal "jitter". I wonder if that can be tweaked in the service menu, without changing components or turning the cans on the circuit board. Maybe by tweaking the RF AGC in the menu.

Even though mechanically accessed media are controlled by tracking pulses that adjust the speed to match a local oscillator reference, the output is far from being as perfect as the output of a digital video generator like a DVD player or an HDMI-to-AV box.

That's why it is ideal to use a time base corrector when capturing VHS: a huge time base error, a missed read or a "glitch" might kill the capture, causing the device to hang on a frozen frame, unable to lock to the signal again.

An AV-to-VGA/HDMI converter or capture device won't show the time base errors because it is entirely digital, and it is "hard wired" to allow a wide error range; if that error threshold is exceeded, the device just shows a blue screen.

Seems like the only option would be to get a broadcast-grade TV.

LED will never beat other lamps. :inc: :hps: :mv:

Medved
Member
*****
Offline

Gender: Male

Re: Is it possible to tweak the time base error corrector of a CRT TV? « Reply #3 on: May 09, 2021, 07:39:24 PM » Author: Medved
I don't think there would be any user menu adjustment for this. On all but the most modern analog TVs you may find internal analog components that could be tweaked, but the most modern ones used a single crystal for the whole picture processing time base, so there is nothing to tune at all.
You may have some luck with modern USB TV receiver dongles, where the dongle just digitizes the IF and the rest is processed in software (all the demodulation, sync separation, color decoding, audio demodulation, ...); some drivers allow quite extensive tweaking of all the parameters (as all the processing is in software anyway).
I would expect the same from the digitizers (same thing: the hardware just samples the signal at a few MHz and the rest is software). These digitizers would likely offer even wider tweaking possibilities, mainly because their main use is recovering analog recordings, which tend to have various distortions. And for the most frequent distortion types (uneven tape stretching, ...) the software will likely already contain some automatic corrections.


A classic CRT TV design uses a true PLL to lock horizontally and simple "advance by sync pulse" triggering for the vertical.
That usually leads to a wider sync range for the vertical, but a narrower one for the horizontal.
The reason is that the total accumulated noise causing jitter in a TV broadcast (= a signal generated with perfect timing, but received with significant noise) is proportional to the bandwidth. For the vertical, a simple low pass (units of kHz) reduces the noise enough that the sync stays satisfactory even when nothing is visible on the screen anymore. With the 15 kHz horizontal, similar direct sync would need the cut-off frequency set way higher (well above 150 kHz), letting more noise through and causing too much H jitter.
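A rough illustration of that vertical low-pass idea, as a toy Python sketch: the composite sync is fed to a leaky integrator, which the narrow line syncs barely charge while the broad vertical serrations push it over a threshold. The 1 MHz sampling rate, 50 µs time constant, 0.7 threshold and pulse widths are illustrative assumptions, not values from any real chassis.

```python
def separate_vsync(sync, dt=1e-6, tau=50e-6, threshold=0.7):
    """Return sample indices where the RC integrator exceeds threshold."""
    alpha = dt / (tau + dt)          # first-order low-pass coefficient
    y, hits = 0.0, []
    for i, s in enumerate(sync):
        y += alpha * (s - y)         # leaky integrator
        if y > threshold:
            hits.append(i)
    return hits

def make_field(lines_before=10, serrations=6, lines_after=10):
    """Synthetic composite sync at 1 MHz: 64-sample lines with 5-sample
    horizontal syncs, then broad 27-sample serrations on 32-sample
    half lines during the vertical interval."""
    w = []
    for _ in range(lines_before):
        w += [1.0] * 5 + [0.0] * 59          # normal ~4.7 us sync
    for _ in range(serrations):
        w += [1.0] * 27 + [0.0] * 5          # broad vertical serration
    for _ in range(lines_after):
        w += [1.0] * 5 + [0.0] * 59
    return w

wave = make_field()
hits = separate_vsync(wave)
vstart = 10 * 64                             # first serration sample
print(f"vsync region starts at sample {vstart}, detector fires at {hits[0]}")
```

The detector fires a couple of serrations into the vertical interval (the integrator needs a few broad pulses to charge), which is exactly why the vertical can tolerate so much noise: everything faster than the integrator time constant is averaged away.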
The way around that was to use a PLL. A PLL can generate the 15 kHz H frequency with a noise bandwidth of just a few tens of Hz (if the oscillator phase noise allows it), dictated by the PLL loop frequency response. But that same frequency response also dictates how far the signal may be from the free-running oscillator frequency and still lock.
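That trade-off can be shown with a toy per-line horizontal PLL in Python. The loop gains, the 300 Hz free-run offset and the 100 ns edge jitter are made-up illustrative numbers, not values from any real TV:

```python
import math, random

F_H = 15734.0            # NTSC horizontal line rate, Hz
LINE = 1.0 / F_H         # nominal line period, ~63.6 us

def run_hpll(kp, ki, nco_offset_hz, jitter_ns=100.0,
             n_lines=4000, seed=1):
    """Toy per-line horizontal PLL.  Sync edges arrive on an ideal grid
    with white timing jitter; the NCO free-runs nco_offset_hz away from
    the line rate.  Returns the RMS deviation of the NCO from the ideal
    grid over the second half of the run."""
    rng = random.Random(seed)
    nco_period = 1.0 / (F_H + nco_offset_hz)
    t_nco, devs = 0.0, []
    for n in range(1, n_lines + 1):
        ideal = n * LINE
        edge = ideal + rng.gauss(0.0, jitter_ns * 1e-9)
        t_nco += nco_period
        # A real sync separator only sees phase modulo one line:
        err = ((edge - t_nco + LINE / 2) % LINE) - LINE / 2
        t_nco += kp * err            # proportional phase nudge
        nco_period += ki * err       # integral frequency pull
        devs.append(ideal - t_nco)
    tail = devs[n_lines // 2:]
    return math.sqrt(sum(d * d for d in tail) / len(tail))

# Wide loop: pulls in the 300 Hz offset quickly, but lets more of the
# edge jitter through to the scan.  Narrow loop: much less jitter, but
# with the same offset it cycle-slips and never settles onto the grid
# within the run.
for name, kp, ki in (("wide", 0.3, 0.02), ("narrow", 0.02, 0.0002)):
    clean = run_hpll(kp, ki, nco_offset_hz=0.0)
    offset = run_hpll(kp, ki, nco_offset_hz=300.0)
    print(f"{name:6s} loop: jitter rms {clean*1e9:7.1f} ns, "
          f"with 300 Hz offset rms {offset*1e6:10.2f} us")
```

The loop bandwidth set by `kp`/`ki` plays both roles at once: it is the noise bandwidth (how much input jitter reaches the deflection) and the pull-in authority (how large a frequency error the loop can swallow without slipping).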
Early TVs had just a single, fixed PLL filter, which had to be a compromise between robustness against component tolerances and the noise bandwidth. So it usually required precise alignment, which became a problem for long-term stability. Even then, the noise bandwidth was rather wide.
Later ICs allowed the loop filter to be adaptive: while the PLL was not locked, it was set to a fast response, covering greater component tolerances. Once locked, the response was switched to slow, greatly reducing the bandwidth and therefore the noise, which improved picture quality and robustness against signal disturbances.
This allowed a very wide lock-in range, hence lousy but cheap component tolerances, while still offering good jitter performance. Or, with somewhat more accurate components, really very low jitter.
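A minimal sketch of that gear-switching idea, under the same toy assumptions as above (the gain pairs, the 1 µs lock threshold and the 100-line dwell are invented for illustration, not any real chipset's values): acquire with wide gains, then narrow the loop once the phase error has stayed small for a while.

```python
import math, random

F_H = 15734.0
LINE = 1.0 / F_H

WIDE = (0.3, 0.02)       # acquisition gains: big pull-in range
NARROW = (0.02, 0.0002)  # tracking gains: low noise bandwidth

def run_adaptive(nco_offset_hz=300.0, jitter_ns=100.0,
                 n_lines=4000, seed=1):
    """Per-line PLL that switches from WIDE to NARROW gains once a
    crude lock detector has seen 100 consecutive small phase errors."""
    rng = random.Random(seed)
    kp, ki = WIDE
    nco_period = 1.0 / (F_H + nco_offset_hz)
    t_nco, in_lock, switch_line = 0.0, 0, None
    devs = []
    for n in range(1, n_lines + 1):
        ideal = n * LINE
        edge = ideal + rng.gauss(0.0, jitter_ns * 1e-9)
        t_nco += nco_period
        err = ((edge - t_nco + LINE / 2) % LINE) - LINE / 2
        t_nco += kp * err
        nco_period += ki * err
        # lock detector: phase error stays under 1 us for 100 lines
        in_lock = in_lock + 1 if abs(err) < 1e-6 else 0
        if in_lock >= 100 and (kp, ki) == WIDE:
            kp, ki = NARROW          # locked: narrow the loop
            switch_line = n
        devs.append(ideal - t_nco)
    tail = devs[n_lines // 2:]
    rms = math.sqrt(sum(d * d for d in tail) / len(tail))
    return switch_line, rms

switch_line, rms = run_adaptive()
print(f"narrowed the loop at line {switch_line}, tail rms {rms*1e9:.1f} ns")
```

The wide gains swallow the 300 Hz free-run error in a few dozen lines, and after the switch the narrow gains hold the lock with far less jitter than either fixed setting could manage alone, which is the whole point of the adaptive filter.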
Many of the newest analog TVs used a similar technique for the V sync as well as for the color carrier, all in the form of one large picture processing chip (deflection pulse generation, color decoding) surrounded by just the power components, with a single crystal as the main time base for everything.

Some chips contained compensation for the time stretching distortion typical of VCRs, where H and the color carrier stay tied together but are allowed to be modulated together across the frame.
Or often the slow response was simply disabled when the set was switched to a VCR display mode.
These VCR modifications varied a lot among chipset makers, differing in how much deviation they tolerated. It was usually a compromise between picture quality with a not-so-distorted recording, and how severe the distortion could be while still playing back at least somehow.

No more selfballasted c***

© 2005-2022 Lighting-Gallery.net | SMF 2.0.19 | SMF © 2021, Simple Machines | Terms and Policies