Is digital output to an LCD noticeably better for viewing than analog output? Is it worth the upgrade?
 
Some LCDs handle analog better than others and can get close to the quality of DVI, and some video cards put out better analog than others. If you have a video card that puts out good analog and an LCD monitor that displays analog well, then you might just want to stick with it.

If you already have an LCD that supports DVI, and just need a new video card, it might be a worthwhile purchase.

If you need a whole new LCD monitor, and a new video card, it might not be worth the expense, but it depends on how good or bad your current combo is. If you do need both, then since almost all cards with DVI also have VGA, you can hook your new monitor up to the DVI, and your old monitor up to the VGA, and have twice as much work space.

--
 
I think you saved me some money. My analog card and analog LCD (Samsung SyncMaster 715V) seem pretty decent to me. I don't think it's worth the upgrade unless I redo the whole PC.
 
This may not be a popular comment, but I would think the upgrade of the most value would be to go to a CRT monitor. In my experience LCDs don't cut it for photo editing.

Ron
 
I agree. But where do you buy a good CRT monitor??? Here in Poland we can only choose between poor-quality (cheapest) CRT monitors and LCDs. I own a Sony CPD-E200 right now, but it's close to death – I've repaired it twice in three months... And now the sad news: no one produces good CRT monitors any more!!!!! So when the time comes I MUST (because I have no choice) buy an LCD... Thanks to all the "must be good because it's new" maniacs...
 
I have heard this many times before, that for critical work and colours, CRTs are better. I am colour blind for a start, so I really get mixed up with colours. I prefer the brightness of the LCD; once you get used to it, it's great. I don't think I'd ever go back to CRT.
 
Probably depends upon the monitor... I have a 19 in. Viewsonic VX900 (which takes both inputs) and an ATI video card that outputs both. I used to use the DVI, but had to change to analog when I hooked up my work laptop through a KVM switch (the laptop has only analog output, and I could not find a KVM switch that handled both – it was one or the other). I was really expecting to see some significant degradation, but to be honest I can't see any difference at all.
 
This may not be a popular comment, but I would think the upgrade of
the most value would be to go to a CRT monitor. In my experience
LCD's don't cut it for photo editing.

Ron
Like most things, this depends very much on the monitor and the display adapter driving it. My desktop machine has a Samsung 191T driven in digital mode from a Matrox G550. This runs at 1280 x 1024, and I think that it is VASTLY better for photo editing than the top-of-the-line 19" Nokia CRT that it replaced.

Now, my laptop has a 1400 x 1050 LCD screen, but that is almost totally useless for photo editing as the perceived image is VERY MUCH a function of the viewing angle, something which the Samsung is essentially independent of.

Harvey
 
Exactly what I wanted to know.
 
I have two Viewsonic VG710 monitors, and my video card has one digital and one analog output. I've spanned a photo across both monitors and there is no visible difference.
 
I have a home-built P4 desktop and a T40 laptop. Both have DVI outputs, but I choose to use the VGA cable to drive the new Dell 2005FPW monitor. I compared DVI and VGA but could not see a huge difference; the text was slightly sharper over DVI.

Two reasons to use VGA instead of DVI are:
1. The DVI-capable USB+KVM switches are both expensive and immature.

2. The DVI connection disables the color adjustments that my monitor calibration device (EyeOne Display 2) uses for better control.

In the near future, the USB+DVI KVM switches will mature and cost much less. Future monitor calibration software should also be able to adjust color levels digitally better than it can today.
--
Nelson
http://pbase.com/nelsonc
 
I kind of like the monitor I have.
 
For me DVI was leaps and bounds above analog. I have several monitors (Dell, LG and Iiyama), and on all of them DVI is better than analogue. I used high-quality analogue cables too - no 4 m unshielded things that end up giving you ghosting.

http://www.interfacebus.com/Design_Connector_Digital_Visual_Interface_DVI_Bus.html

shows the different types of DVI connector. From my understanding not all DVI connectors are the same; some (the DVI-I type) actually carry an analog signal alongside the digital one - correct me if I am wrong.
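[Editor's note: as a rough illustration of the single- vs. dual-link distinction mentioned on that page, the sketch below checks whether a display mode fits under single-link DVI's 165 MHz pixel-clock ceiling. The 25% blanking overhead is a crude assumption standing in for real CVT timing, so treat the numbers as ballpark only.]

```python
# Sketch: does a given mode fit on single-link DVI?
# Single-link DVI tops out at a 165 MHz pixel clock; dual-link adds a
# second set of TMDS pairs and roughly doubles that budget.
# The 25% blanking overhead below is an assumption, not real CVT timing.

SINGLE_LINK_MAX_HZ = 165_000_000

def pixel_clock(width, height, refresh_hz, blanking=0.25):
    """Approximate pixel clock: active pixels plus blanking overhead."""
    return width * height * refresh_hz * (1 + blanking)

for w, h, hz in [(1280, 1024, 60), (1600, 1200, 60), (2560, 1600, 60)]:
    clk = pixel_clock(w, h, hz)
    verdict = "single link OK" if clk <= SINGLE_LINK_MAX_HZ else "needs dual link"
    print(f"{w}x{h}@{hz}Hz: ~{clk / 1e6:.0f} MHz -> {verdict}")
```

Under these assumptions, 1280x1024 and 1600x1200 at 60 Hz fit comfortably on a single link, while something like 2560x1600 would need dual-link DVI.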

The card I have in this machine (a Dell Optiplex GX280) has a single dual-output connector that you attach a Y-cable to, so you get two monitors from the one connector on the graphics card. I've only seen this on Dell machines so far.

When I used analog I had 'softness' on the screen, in three vertical stripes about 1 cm wide; it was almost as if the monitor was out of focus. Admittedly this was a cheap (£400 a year ago) 19" LG.

YMMV but I hope that helps.
 
