Why do TVs have no problems with interlaced video?

Well, this is my question, as I am still not sure why it is so. Yes, the old CRT TVs used interlaced scanning natively, but I assume that contemporary LCD or (O)LED TVs display progressively, exactly like the LCD monitors driven by the video cards in computers.

Despite that, when I watch the videos taken with my rather old Sony CX-12 camcorder, which can record only 1080/60i, on my (modern) TV those videos are perfect, while watching them on my PC is a disaster. The problem is apparently the de-interlacing, which works perfectly on the TV but poorly on the PC. It is visible mainly during any horizontal movement or panning. And it has nothing to do with rolling-shutter skew, as the vertical lines do not lean to either side; instead, the upper part of the picture appears sharply bent to one side, and this bent area slowly moves upward. I tried MPC-HC and VLC media player, but both behave identically. No such problems when watching the video on the TV.

Regards,

Peter
 
A TV is designed for video. It has many systems to improve moving video.

A computer monitor is quite bad at showing video. Even if you have perfect 4K 60p video, the monitor can show less than 0.5K of detail when there is movement in the image.

60i needs proper de-interlacing. I use BOB (each field is shown as its own frame). It gives the best motion, but if there is little motion, FIELD BLEND gives sharper video.
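
To make the difference concrete, here is a minimal NumPy sketch of the two approaches. This is only my illustration of the idea, not any player's actual code; real de-interlacers interpolate far more cleverly:

```python
import numpy as np

def bob(woven):
    """BOB: turn one woven interlaced frame into two progressive frames.

    Each field becomes its own frame, with the missing lines filled in
    by simple line repetition. 60i in -> 60 frames per second out.
    Assumes an even number of lines.
    """
    top, bottom = woven[0::2], woven[1::2]
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

def field_blend(woven):
    """FIELD BLEND: average the two fields into a single frame.

    Full vertical detail where nothing moves, ghosting where it does.
    60i in -> 30 frames per second out.
    """
    top = woven[0::2].astype(np.uint16)
    bottom = woven[1::2].astype(np.uint16)
    avg = ((top + bottom) // 2).astype(woven.dtype)
    blended = np.empty_like(woven)
    blended[0::2] = avg
    blended[1::2] = avg
    return blended
```

BOB keeps all 60 motion samples per second, which is why panning stays fluid; FIELD BLEND halves the motion rate but keeps more vertical detail in static shots.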
 
TVs have built-in de-interlacing engines. In the USA, almost all stations broadcast in 1080 interlaced, except for ESPN, which broadcasts in 720 progressive. Search for a player that will de-interlace your video as it plays, or better yet, run your videos through software that will permanently de-interlace them (HandBrake or most video editing software).
 
Thanks, Vesku and Andrew. So you think the TVs have special de-interlacing engines built in. Why does no video card manufacturer provide something similar for computers? I tried all possible software methods, including BOB, but the results are incomparable with what the TV can do.

Regards,

Peter
 
What exactly is the issue or difference with BOB compared to the TV?
 
The video is very jumpy, even though my PC has an i7 processor (though a 3rd-generation one). With a better video card (currently I have an AMD Radeon HD 5700) it might be less jumpy, but the TV displays absolutely fluid video.

Peter
 
It should be smooth. 1080/60i is not hard to display properly.

It may be a field-order issue. What player are you using? Can you try VLC or PotPlayer? They have many different de-interlacing options.
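
A wrong field order is easy to picture: the two fields of each frame get shown in reversed temporal order, so motion steps backwards every other field. A tiny sketch of the effect, purely illustrative:

```python
import numpy as np

# Simulate an object moving right at 1 px per field; with 60i the two
# fields of one frame are captured 1/60 s apart.
positions = np.arange(8)                 # true position at each field time

# Correct field order: fields play as captured -> monotonic motion.
print(positions)                         # [0 1 2 3 4 5 6 7]

# Swapped field order: each frame's two fields are shown in reverse.
swapped = positions.reshape(-1, 2)[:, ::-1].ravel()
print(swapped)                           # [1 0 3 2 5 4 7 6] -> jerky judder
```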
 
I am using VLC, and the best results are with Yadif. Still far from what my TV delivers.

Peter
 
How do TVs de-interlace? Do they run at twice the frame rate and present each field as a frame, or do they use the more complex adaptive de-interlacing that only de-interlaces the parts of the frame that move? A shot of a static scene in i is the same as p.

Adobe Media Encoder seems to use adaptive de-interlacing with an i source and a p destination.

Most broadcasting uses 1080i, but this won't interlace a progressive source; it can't, as every line of each frame was captured at the same instant. 1080i simply carries the p material in an i framework that can be displayed with little damage. The TV could unpack this back to p if it's smart enough to see there is no time difference between the lines. In Australia all sports and multicam shows are shot in i for greater fluidity of motion; 1080p50/60 transmission would make interlaced capture obsolete.

This is called PsF, progressive segmented frame.
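
For what it's worth, the adaptive idea fits in a few lines of NumPy. This is a toy illustration, nothing like a real TV engine: keep the woven lines where the picture is static, interpolate where it moves. Note that for a PsF source the two fields agree everywhere, so the motion detector fires nowhere and the weave passes through losslessly:

```python
import numpy as np

def adaptive_deinterlace(woven, threshold=10):
    """Toy motion-adaptive de-interlacer for one woven two-field frame.

    Static areas keep the bottom-field lines (full detail, like weave);
    moving areas replace them with an average of the neighbouring
    top-field lines (no combing, like BOB). Assumes an even line count.
    """
    out = woven.astype(np.int16)
    top = out[0::2]                    # top-field lines, kept as-is
    bottom = out[1::2]                 # bottom-field lines, candidates

    # Estimate each bottom line from the top field: average of the
    # top-field line above and the one below (repeat the last line).
    above = top
    below = np.vstack([top[1:], top[-1:]])
    interp = (above + below) // 2

    # Where the real bottom line disagrees strongly with the estimate,
    # assume motion and use the estimate instead.
    moving = np.abs(bottom - interp) > threshold
    out[1::2] = np.where(moving, interp, bottom)
    return out.astype(np.uint8)
```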
 
Thanks again for the explanation. Well, you are a professional, no doubt. But my TV (a first-generation LG 55" OLED) can display absolutely flawlessly even the genuine interlaced video taken by my Sony CX-12 camcorder (1080/60i), even though it displays only 60Hz (progressive) images. So it has to have some very efficient de-interlacing engine built in. I assume that some special hardware processor/unit is doing it, because the CPUs TVs use are far less powerful than what PCs have. If so, my question is why a similar hardware circuit is not built into any video card used by computers. All the de-interlacing methods used on PCs (laptops) are software based, and their quality depends greatly on the speed of your computer. Actually, I do not know which matters more here, the CPU speed or the GPU speed, but I assume the former.

Regards,

Peter
 
I have never had any problems watching my GH2 60i videos on different computers. Motion is smooth with BOB de-interlacing. It looks much clearer and sharper on the TV, but it is smooth and fluid on both.
 
I've also had issues watching interlaced video. I used VLC with Yadif 2x, but my PC just wasn't fast enough, so it didn't work well. With an i7, however, it should work. There might be a problem with your graphics setup (GPU drivers, whether hardware acceleration is enabled in VLC, ...).

If you convert the video (with HandBrake or similar) to a progressive format (30p or 60p, with the same bitrate as your original footage), does it work any better?
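
If you'd rather script the conversion than click through HandBrake, a minimal sketch using ffmpeg's yadif filter could look like this. It assumes ffmpeg is installed and on the PATH, and the file names and the 24M bitrate are placeholders for your own values:

```python
import subprocess

# Convert 1080/60i footage to 1080/60p with ffmpeg's yadif filter.
# mode=1 ("send_field") emits one frame per field, so all 60 motion
# samples per second survive; mode=0 would yield 30p instead.
subprocess.run([
    "ffmpeg",
    "-i", "input_60i.mts",     # placeholder: your camcorder clip
    "-vf", "yadif=mode=1",     # bob-style de-interlace to 60p
    "-c:v", "libx264",
    "-b:v", "24M",             # placeholder: match your source bitrate
    "-c:a", "copy",            # leave the audio stream untouched
    "output_60p.mp4",          # placeholder: progressive result
], check=True)
```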
 
But my TV (a first-generation LG 55" OLED) can display absolutely flawlessly even the genuine interlaced video taken by my Sony CX-12 camcorder (1080/60i), even though it displays just 60Hz (progressive) images.
It could still be displaying each field as its own frame, 60 of them per second.
So, it has to have some very efficient de-interlacing engine built in. ... my question is why a similar hardware circuit is not built into any video card used by computers.
Yes, probably some dedicated hardware de-interlacer. TVs, unlike computers, need interlaced to look good, as sports and all the legacy programming are interlaced.

best

Peter
 
