I haven't used a tube monitor in years, so I can't compare the
state of the art in CRT vs. LCD, but I have the Dell 1900FP and love
it. The DVI connector provides a straight digital path from the
graphics card to the monitor.
I see details using DVI that are
lost when going digital to analog (CRT) or digital to analog to
digital (LCD).
There is not necessarily ANY information (detail) loss in going
from digital to analog. This follows from basic digital signal
processing (DSP) theory.
I think this is a very careless and inaccurate remark. Please
review your basic Fourier and Laplace transforms to further
understand this issue.
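
To make that concrete, here is a minimal sketch in Python (assuming
NumPy is available; the numbers and variable names are just for
illustration) of the Whittaker-Shannon result behind those transforms:
a signal bandlimited below the Nyquist frequency can be rebuilt from
its samples with essentially no loss in the digital-to-analog step
itself.

import numpy as np

fs = 100.0                           # sample rate in Hz
f0 = 3.0                             # signal frequency, well below fs/2
n = np.arange(-500, 500)             # sample indices
x = np.sin(2 * np.pi * f0 * n / fs)  # the stored "digital" samples

def reconstruct(t):
    # Ideal sinc interpolation: x(t) = sum over n of x[n] * sinc(fs*t - n)
    return np.sum(x * np.sinc(fs * t - n))

t_test = 0.01237                        # an arbitrary instant between samples
print(reconstruct(t_test))              # reconstructed analog value
print(np.sin(2 * np.pi * f0 * t_test))  # true analog value

The two printed values agree closely; the small residual comes only
from truncating the (ideally infinite) sum, not from the conversion
itself.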
If there is any loss due to digital-to-analog conversion, it is
likely negligible compared with the myriad of other factors (color
accuracy, dpi, refresh rates, price, etc.) affecting the choice of
display device.
What you said DOES apply to ADC (analog-to-digital conversion,
i.e., digital sampling of an analog signal). That is where
information is lost to sampling and aliasing occurs. Again, a
thorough understanding of time-to-frequency-domain transformations
is required, but for those without a background in advanced
integration and differential equations, the Nyquist frequency
provides a useful rule of thumb.
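
For anyone who wants to see that rule of thumb in action, here is a
rough sketch in Python (again assuming NumPy; the frequencies are
just made-up examples): sample a tone at less than twice its
frequency and the ADC output becomes indistinguishable from a
lower-frequency alias.

import numpy as np

fs = 10.0                  # sample rate in Hz
n = np.arange(20)          # sample indices
f_ok = 3.0                 # below fs/2 = 5 Hz: safe
f_alias = 13.0             # above fs/2: folds back down to 3 Hz
x_ok = np.sin(2 * np.pi * f_ok * n / fs)
x_alias = np.sin(2 * np.pi * f_alias * n / fs)

# 13 Hz sampled at 10 Hz lands on exactly the same sample values as
# 3 Hz, so the information that distinguished them is gone.
print(np.allclose(x_ok, x_alias))   # True

That is the information loss you were thinking of, and it happens on
the way into the digital domain, not on the way out.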
Even ADC probably poses few problems as far as detail is
concerned, thanks to the extremely cheap DSP chips that have been
developed in recent years. They provide ample sample resolution
and sample rates.
I would go with a CRT personally. They simply have much, much
better pictures, ESPECIALLY for the money. I really don't think
there is any comparison, objectively by the specs or subjectively,
if you truly want quality. If you want what is chic, though, go with
an LCD. Your video card is also important.
I use and recommend any Trinitron or similar high-resolution 19" or
larger tube with an ATI Radeon 9700 or other good video card.
Good luck!
KC