Using an LCD HDTV as a PC monitor...

Martin127954

I am buying a 45-inch LCD HDTV soon. It has a native resolution of 1920 x 1080. Among the many signal inputs on the back is a DVI port.

I use a 1920 X 1200 LCD monitor for my PC already. I want to use the new TV in the living room as another PC monitor as well.

The PC will recognize the odd 1080 vertical resolution of the TV, right?

If anyone else has done this, a reply of experience would be greatly appreciated!

-Martin
 
The video card is not going to be the issue. It can display even higher resolutions.

I'm concerned about the HDTV talking with the PC to reveal the 1080 vertical resolution.

Unfortunately the online manual is vague about this.

I guess I am going to have to find a store that sells the TV and try it.
 
Several people have bought the new Sceptre 37" HDTV (1080p). This is a question-and-answer review from actual users who recently bought this newly released 1080p LCD TV. Simply put, when connecting via DVI, the resolution will be autosensed, and thus the display's default resolution is used (which for the Sceptre is 1920 x 1080).

http://www.avsforum.com/avs-vb/showthread.php?t=573173&page=1&pp=30

Copy protection / DRM is being mentioned for displaying HD content while running the forthcoming Windows Vista OS. There is also mention that future PC video cards will have HDMI ports rather than DVI ports for DRM reasons. So if you intend to buy any display, make sure that it has HDMI ports.
 
Thanks so much for providing the answer and the links.

I figured this was the case but did not want to assume. I was looking at the Sharp Aquos 45" LCD (LC-45GD4U). As far as I know, this is the biggest LCD with 1080p resolution other than a $28,000 60+ inch set. Do you know of any other new models coming out?

Also, my gut tells me to avoid plasma. Do you agree? Some say that it would be 4 years before I would notice a reduced picture quality but if I also use the TV for PC usage then I figure the burn-in issue could become a big deal.
 
I thought you meant that TV! I have been looking at it since it came out, but can in no way afford it. I think it was the first 1080p HDTV to come out. Anyway, it also depends (and mostly) on your video card. That TV takes DVI, if I'm not mistaken, so there should not be any problems with the TV, and they also advertise that it can be connected to a PC.

Now, your PC video card should be able to output the not-so-common 1920x1080 resolution, at least at 60Hz. Mind you, not just high resolutions, but this exact one. If it offers, say, 1600x1200 and then jumps straight to 1920x1200, you won't get 1:1 pixel mapping, and the picture will be distorted and blurry. If you can't do this with your card, try updating to the latest drivers. My Radeon 9800 Pro has been able to do this 1920x1080 resolution since I updated I don't know how long ago, but the original drivers couldn't do it.
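The "exact mode or nothing" logic above can be sketched in a few lines (the function and mode list are illustrative, not from any real driver API):

```python
# Illustrative sketch: a fixed-pixel panel only maps 1:1 when the card
# outputs its exact native mode; any other mode gets scaled, which blurs
# text on a PC desktop.
NATIVE = (1920, 1080)  # the TV's native panel resolution

def pick_mode(supported_modes, native=NATIVE):
    """Return the native mode if the card lists it, else None
    (in which case you would try newer drivers or a custom mode)."""
    return native if native in supported_modes else None

# A card that jumps from 1600x1200 straight to 1920x1200 misses the mark:
print(pick_mode([(1280, 1024), (1600, 1200), (1920, 1200)]))  # None
print(pick_mode([(1600, 1200), (1920, 1080), (1920, 1200)]))  # (1920, 1080)
```

The point is simply that "supports higher resolutions" is not enough; the mode list must contain the panel's native resolution itself.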

And finally, if you haven't bought it yet... at this time, if I had the money, I would definitely not buy plasma; the burn-in issues are worse than on a CRT. But I wouldn't buy anything yet anyway. Canon and Toshiba are coming out with SED TVs; if you haven't heard of it, it's the Holy Grail of TVs: the color and contrast (and overall picture) of a CRT in a flat panel! I would wait for these. (Again, if I had the money.)

Also, if you're interested, read my post above about DRM and copy protection the other person brought up.
 
Hi,

Just to clarify, DVI and HDMI are interchangeable for video. There are cables that transport (not convert) the video signal from an HDMI end to a DVI end effectively and efficiently, and vice versa. HDMI is just a glorified DVI port. The difference is convenience: HDMI also carries high-bandwidth digital audio, so while it has a higher overall bandwidth than DVI, for just the video data required for HDTV it is the same -- and by the same I mean completely interchangeable.
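As a quick sanity check on the "same video bandwidth" claim: 1080p60 with standard SMPTE blanking needs a 148.5 MHz pixel clock, which fits under single-link DVI's 165 MHz TMDS ceiling -- that is why a passive HDMI-to-DVI cable is enough:

```python
# 1080p60 pixel clock with standard SMPTE timing: 2200 x 1125 total pixels
# per frame (active 1920 x 1080 plus blanking), 60 frames per second.
H_TOTAL, V_TOTAL, REFRESH_HZ = 2200, 1125, 60
pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6
SINGLE_LINK_DVI_MHZ = 165  # single-link TMDS pixel-clock limit

print(pixel_clock_mhz)                         # 148.5
print(pixel_clock_mhz <= SINGLE_LINK_DVI_MHZ)  # True: one link suffices
```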

So actually what you mean is: buy a display that has HDMI OR DVI ports (HDMI is better, but DVI will work too) with HDCP (a copy-protected digital DVI or HDMI port). If your display or HDTV does not have HDCP-enabled digital ports, you won't be able to display digital HD content that is protected, and believe me, MANY movies are gonna be that way. HD-DVD, if it ever makes it, is gonna use HDCP exclusively for digital connections, and won't be able to output HD analog signals (like via component or RGB). It will downconvert any protected analog signals to 480p (doesn't that SUCK!). Word "on the street" is that Windows Vista will make use of this too, so your next-generation video card should have HDCP. And Blu-ray will most likely do the same as HD-DVD. (C'mon, at least give us analog HD!)

So, basically, in the future, we proletarians who don't have a fancy HDCP enabled display won't be able to watch most HD movies in blu-ray or HD-DVD unless we buy a new set! SCREW YOU HOLLYWOOD AND YOUR CRAPPY MOVIES! HAHA I'll only watch Japanese and European and Latin American movies now! You know, the ones that actually want their movies SEEN by people, rather than just SOLD.

Disclaimer: the use of the word "proletarians" was a joke. If you live in the US (like me) and feel like stoning me for being a "dirty commie" please bear that in mind, and don't.
 
Actually I was talking about that (Sharp) TV, and initially I thought that the other poster's link provided the conclusion, but it turns out it may not have. See, at first I thought that maybe the 1920x1080 res might just be a setting unsupported by PC video cards. At least that would explain why the Sharp LCD's manual from the web only lists STANDARD resolutions up to, I believe, 1280x... After reading the link in this thread about the other TV from Sceptre, I saw that that TV could indeed display 1920x1080 from a PC's DVI port. I checked ATI's site about my video card, and it also confirmed that supported resolution.

The trouble is that the Sharp manual does not explicitly show that res in its supported-resolution chart! Bummer! So I have gone to MicroCenter and Best Buy, and so far I can't get anyone who will take the time to hook a PC up to it and actually try 1920x1080 on it. So until I can find out for sure, I am starting to rule this TV out.

The resolution is VERY appealing, but mostly for the occasional use as a PC monitor. (Not too much of a gamer.) I plan to make my own TiVo-type system, and that would be the majority of the PC connection to the HDTV.

So if I can't utilize ALL of the Sharp's resolution, then I figure I may as well also consider a PLASMA in my pre-purchase research. Since the TV would not be a heavily used PC monitor, the burn-in issue is not too much of a worry. Also, the fact is that HDTV broadcasts really don't pump out the full resolution now anyway! It is usually 720p or 1080i; 1080p is not being done. Even watching a DVD is only 480 lines before upscaling. Also, many of the "HDTV" broadcasts are just lower-res signals upscaled to 1080i.

It will probably be many years before the highest HDTV resolution is even close to mainstream. So in that regard, maybe a new plasma is the better choice in my case. I have seen that the new Hitachis coming out this fall will have a life rating of 60,000 hours. That's many, many, many years of viewing. As long as I don't abuse the TV by falling asleep with it on, it should probably be fine.

...Still weighing all the facts....
 
Thanks for the very useful information.

For anyone interested in the 45" Sharp, it does have DVI & HDMI inputs.
 
Wait for next year... be patient. Aside from SEDs there are other technologies coming out, as well as a great number of 1920x1080 HDTVs that I'm sure will be able to display your PC's 1920x1080 natively. I am still unsure whether the Sharp actually doesn't, because I remember having looked for that information myself, you know, while dreaming. In those dreams I had the TV for exactly the same purpose as you. But I am happy for you and rooting for you. So my dreams now are about having, say, a 42"-50" 1080p SED TV connected to my PC. So I recommend waiting. Plasmas generally have lower resolution than LCDs. They are more limited, and burn-in is accentuated when connected to a PC.

Is there an online manual of the sharp where I could read the info you have? I would like to know for sure now. It seems to me that if the display and the card are on the same page (1920x1080 @ 60Hz output and 1080p input via DVI) they should be able to mingle well. I am 90% sure about that. Also, my ATI Radeon 9800pro's latest drivers allow me to force HDTV resolution, 1080i or 720p. I am 98% sure that would work if 1080p does not. 1080i (yes, that's interlaced) even works with my lowly CRT PC monitor via VGA connection. Which card do you have? If you get the ATI VGA to component adaptor, if your card allows it and you can force 1080i, you can output native 1080i res through the HD component inputs. That is 99.99999% sure (leaving some uncertainty always present in our quantum universe). If those inputs take 1080p, you can do the same without having to force interlaced.
 
Thanks again for all the good points.

It will actually be a couple of months before I buy a TV, but I don't think I'll be willing to wait for the SEDs to be introduced, perfected, and reduced in price. (I read that they are going to be somewhat limited in production initially and thus very costly, as most new technology is.)

About my video card: it is an ATI and it does support the 16:9 1920X1080p signal. You asked about the Sharp manual...here it is:
http://www.sharpusa.com/files/tel_man_LC45GD6U.pdf

About plasma display issues and myths stemming from the earlier stages of the technology: the burn-in and lifespan issues are really not true issues anymore. For example, Hitachi & Panasonic plasma panels now have a life of 60,000 hours, just like LCDs. What does this mean? It means that the estimated time to wear down to HALF of the display's original brightness level is 60,000 hours. That is an extremely long life for a TV.
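That rating translates easily into years of service; a rough back-of-the-envelope (the daily viewing hours are illustrative):

```python
# Years until half brightness, from the 60,000-hour rating mentioned above,
# for a few daily viewing habits.
RATED_HOURS = 60_000

def years_to_half_brightness(hours_per_day):
    return RATED_HOURS / (hours_per_day * 365)

for h in (4, 8, 12):
    print(h, "h/day ->", round(years_to_half_brightness(h), 1), "years")
# 4 h/day -> 41.1 years; 8 h/day -> 20.5 years; 12 h/day -> 13.7 years
```

Even at heavy use, the panel outlasts the useful life of most TVs.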

Also, the burn-in issue is largely a myth too. It is in fact possible to burn in a game menu or something like that after hours and hours of abusing the TV. BUT, it normally will go away after a day or so -- really. The REAL issue is something like always watching a 4:3 ratio signal for month after month. This should not be a big deal with ever-increasing wide-screen signals.

Read page 8: ftp://ftp.panasonic.com/pub/Panasonic/Drivers/PBTS/papers/Plasma-WP.pdf

See also: http://www.pioneerelectronics.com/pio/pe/images/portal/cit_3424/273087528Pioneer%20DTV%20White%20Paper%20-%20FINAL.pdf
 
Sorry, but that paper on the Pioneer website (how many Pioneer LCDs are there on the market?) is not too convincing. I can believe the burn-in healing itself (although a minor issue, still an issue), but the black levels being better than a "reference" CRT? I haven't heard anything like that, or better yet, seen anything like that with my own eyes. I bet the FD Trinitron-based (it is not a Sony) monitor I got for free from a friend, and which I am staring at right now, has better black levels than any $5000 plasma. They don't say exactly which monitors and displays they tested, and they are somewhat ambiguous in their claims. I read a much better conducted and much more professionally balanced test on http://www.extremetech.com that says otherwise.

Also, having just looked at the Pioneer USA website, it seems the link to that paper is everywhere, as if it were a desperate attempt at vindication for plasma. I know the benefits of plasma over LCD and vice versa, but for picture quality alone I don't think any plasma or LCD can beat a good CRT monitor. SEDs may or may not. Right now, for me, LCDs are the better compromise: higher resolution, and good-enough color and contrast. And no burn-in, even if burn-in is reversible.

Anyway the issue for me still would be cost and native resolution. I believe SEDs will probably be expensive, but I also read some arguments saying they will try to keep costs down, and that they can actually do it, in order to be competitive with the already established plasma and LCD technologies. That also makes sense to me. We'll see.
 
Yes, I meant making sure any display purchase one is contemplating has HDMI WITH HDCP. Of course, the various industries may decide on another port. (DisplayPort seems to be a replacement for VGA and DVI.)

http://www.cdrinfo.com/Sections/News/Details.aspx?NewsId=14831

http://www.cdrinfo.com/Sections/News/Details.aspx?NewsId=14770

DRM issues and implementation haven't been finalized. (For example, a US federal court shot down the Broadcast Flag requirement, but DRM advocates are resorting to Congress to pass DRM legislation.) What DRM equipment and connectors/ports will be allowed to carry protected DIGITAL HD signals? Because the DRM and connectivity issues haven't been settled, I've decided to wait before spending thousands on a nice TV and subsequently discovering that, because the TV didn't have the proper equipment, I am not allowed to watch the protected signal and instead end up watching a blank screen or a lower-resolution version. That's why, although people are falling in love with and spending $$$$ on these HD displays, they might be rudely surprised and furious if, in a couple of years, these expensive displays can't show DRM-protected video because they lack the proper DRM equipment, and the attitude of the DRM advocates is "We're so sorry, but hey, we'll let you watch a blank screen / lower-resolution version instead, which pushes you to buy a NEW DRM-enabled TV!!!!!".

So my approach/advice is:

Of course, you can still go ahead and buy that 45" display and gamble that you won't have to buy another TV in a couple of years, perhaps by buying some add-on device that lets you watch DRM-protected video -- but that's a gamble. Unless you really, absolutely, desperately NEED to have that 45" display,

it would be to your advantage to save your money and wait for these DRM issues to be settled. Let others be the guinea pigs and have to deal with these issues because they bought too early. In the meantime, technology gets better and prices keep falling. So in several years, you will most likely be able to afford a much better display at a cheaper price. (For perspective, a 20" analog LCD monitor cost > $5500 USD in the year 2000; a Dell 20" digital LCD monitor on sale for
 
You are right, I had forgotten completely about DisplayMate! WHAT THE HELL! Freakin' Hollywood at it again. This is another very good reason to wait for at least next year to buy a big screen flat panel. But still I don't see why they had to "develop" displaymate, what the hell is wrong with HDMI? They are even shoving HDCP down our throats and up our asses, isn't that enough?
 
But still, I take advice from objective, sensible, and professional testing. Determining what counts as objective, sensible, and professional seems to be subjective, though. If you need another reason to wait, read the DRM discussion above; we're just trying to help.

Hope everything works out.
 
I meant to say DisplayPort... It's just that right before, I was reading an article where they used DisplayMate testing for some monitors...
 
