"Don Parker" wrote in message
...
Joe C wrote:
connector from the cable coming from the monitor. I don't understand
how that would work, however, since the monitor accepts ANALOG and the
DVI output of the ATI card I thought was digital. I must be missing
something.
You got it right! The adapter not only matches the plugs, but it converts
the digital to analog for your monitor - there is some degradation to the
signal, but for most applications you'll never notice....
No, you both have it wrong.
When a video card allows a DVI output to be adapted to a regular VGA/analog
output, it's simply matching form factors. The video card itself will emit
analog signals out the DVI port. All the adapter does is connect the
appropriate output wires on the DVI port to the appropriate output wires on a
VGA-compatible output.
The signal is no more degraded than it would otherwise be using regular
analog output to the monitor.
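To make the "form factor only" point concrete, here's a rough sketch (in Python, purely illustrative) of the wiring a passive adapter performs. The pin numbers are my best recollection of the DVI-I and VGA (DE-15) pinouts and should be checked against the actual specs before trusting them:

```python
# A passive DVI-I -> VGA adapter contains no conversion circuitry at all;
# it just re-routes the analog pins that the video card is already driving.
# Pin numbers below are assumptions based on common DVI-I/VGA pinout charts.
DVI_I_TO_VGA = {
    "C1": 1,   # analog red      -> VGA pin 1 (red)
    "C2": 2,   # analog green    -> VGA pin 2 (green)
    "C3": 3,   # analog blue     -> VGA pin 3 (blue)
    "C4": 13,  # horizontal sync -> VGA pin 13 (HSync)
    "8":  14,  # vertical sync   -> VGA pin 14 (VSync)
    "C5": 5,   # analog ground   -> VGA pin 5 (ground)
}

def route(dvi_pin: str) -> int:
    """Return the VGA pin a given DVI-I analog pin is wired straight to."""
    return DVI_I_TO_VGA[dvi_pin]

# The "adapter" never touches the signal itself - same analog waveform in,
# same analog waveform out - which is why there's no extra degradation.
print(route("C1"))  # analog red lands on VGA pin 1
```

The point of the sketch: every entry is just a wire, so the signal quality is whatever the card's analog output stage produces, exactly as if a plain VGA port were used.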
There are genuine DVI-to-analog converters, but there's no way you're going
to find one for $1.95 (nor is it likely that you need one).
Just for future reference, let the monitor warm up to room temp BEFORE
turning it on. Condensation can cause some fireworks in the high voltage
section (among other places, if a drop or two gets between the wrong pins of
an IC) - and your 22" CRT probably has in the neighborhood of 30,000 volts
just itching to burn its own path to ground!!
Since condensed water vapor is unlikely to contain any electrolytes, or much
in the way of any impurities at all for that matter, it's likely to be as
good an insulator as the air around the monitor.
Maybe if you live right next to an ocean or something, there might be enough
ambient salt to allow the water to conduct. But otherwise, while I think
it's always a good idea to let stuff come to ambient temperature and to
allow any condensation to evaporate, I wouldn't worry too much about getting
any short-circuits from condensation.
Pete