• DVI vs VGA
What advantages does DVI output have over VGA? I've hooked my computer up to HD TVs with VGA plenty of times before and the picture is always perfectly crisp - so I'm not sure what could possibly be improved by DVI. I figure there's got to be some sort of benefit because nowadays everything is using DVI and VGA is getting phased out.
DVI is digital. It allows for more bandwidth, less signal degradation, longer cables, and higher resolutions, and it means less work on the monitor's end, since there's no analog conversion step. It's basically making room for the newest innovations in technology. Same reason they started shipping PCIe x16 slots years ago, and we still haven't made GPUs that max out that bandwidth.
I think DVI is just a bit sharper than VGA at the moment.
Might have been a different problem, but in a previous dual-monitor setup with one DVI and one VGA, the VGA monitor would buzz with distortion like crazy. It happened on and off.
DVI is more advanced than VGA.

DVI bandwidth: 3.96 Gbit/s (single link), 7.92 Gbit/s (dual link). That's a fucking lot.

[editline]12:32AM[/editline]

I doubt VGA comes near that. I couldn't even get my display past 640x800 over VGA without it looking like shit.
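Those figures fall straight out of DVI's TMDS clock limits: each link is capped at a 165 MHz pixel clock carrying 24 bits per pixel, and a dual-link connection doubles that. A minimal Python sketch reproducing the quoted numbers (the clock ceiling and bit depth are from the DVI spec; the rest is just arithmetic):

[code]
# Reproduce the quoted DVI bandwidth figures from the TMDS clock limits.
SINGLE_LINK_CLOCK_HZ = 165e6  # single-link TMDS pixel clock ceiling
BITS_PER_PIXEL = 24           # 8 bits per color channel, 3 channels

single_link_gbps = SINGLE_LINK_CLOCK_HZ * BITS_PER_PIXEL / 1e9
dual_link_gbps = 2 * single_link_gbps  # dual link doubles the TMDS pairs

print(f"single link: {single_link_gbps:.2f} Gbit/s")  # -> 3.96
print(f"dual link:   {dual_link_gbps:.2f} Gbit/s")    # -> 7.92
[/code]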
[QUOTE=Unreliable;21498452]DVI is more advanced than VGA.

DVI bandwidth: 3.96 Gbit/s (single link), 7.92 Gbit/s (dual link). That's a fucking lot.

[editline]12:32AM[/editline]

I doubt VGA comes near that. I couldn't even get my display past 640x800 over VGA without it looking like shit.[/QUOTE]
I've had my laptop display games and movies at 1920x1080 on a flatscreen TV using VGA with no perceptible loss of quality. Not to say that DVI isn't better. I was just wondering if there was any current advantage to it.
From what I understand, DVI and HDMI have the same image quality (or bandwidth, if you like); the difference is that HDMI also transfers sound, whereas DVI is video only. If someone can confirm this, please do.
We're comparing DVI and VGA, not DVI and HDMI.
At 1920x1080 you should indeed see a difference between VGA and DVI. VGA is analog, DVI is digital. Past a certain resolution, picture quality over VGA begins to drop, somewhat sharply as well; I believe it's at about 1600x1200 where quality drops. With VGA it's just sending a picture of a frame; with DVI, it's sending a frame of pixels.

[QUOTE=Within;21498707]From what I understand, DVI and HDMI have the same image quality (or bandwidth, if you like); the difference is that HDMI also transfers sound, whereas DVI is video only. If someone can confirm this, please do.[/QUOTE]
HDMI is far more advanced than DVI. You can control a wide selection of devices through one device over HDMI, and with the new specification, 1.4, audio will transfer between all of the devices, so you can have, for example, a Blu-ray player directly connected to the TV, then the TV connected to the audio receiver, and still get the sound out of the receiver. You can also share an ethernet/wifi connection the exact same way. All of this is totally irrelevant to this discussion, however, as HDMI is neither DVI nor VGA.
DVI is basically HDMI, except that HDMI also sends audio signals.
DVI is digital, thus it's "lossless".
[QUOTE=Strikebango;21500013]DVI is basically HDMI, except that HDMI also sends audio signals.[/QUOTE]
HDMI is far more advanced than DVI. You can control a wide selection of devices through one device over HDMI, and with the new specification, 1.4, audio will transfer between all of the devices, so you can have, for example, a Blu-ray player directly connected to the TV, then the TV connected to the audio receiver, and still get the sound out of the receiver. You can also share an ethernet/wifi connection the exact same way. All of this is totally irrelevant to this discussion, however, as HDMI is neither DVI nor VGA.
[QUOTE=M_B;21500767]HDMI is far more advanced than DVI. You can control a wide selection of devices through one device over HDMI, and with the new specification, 1.4, audio will transfer between all of the devices, so you can have, for example, a Blu-ray player directly connected to the TV, then the TV connected to the audio receiver, and still get the sound out of the receiver. You can also share an ethernet/wifi connection the exact same way. All of this is totally irrelevant to this discussion, however, as HDMI is neither DVI nor VGA.[/QUOTE]
But the video signal in HDMI is identical to the DVI one, right?
I run a 1920x1080 monitor off a VGA connection; there is no visible quality loss at this res compared to DVI. Anyone who says there is has either a bad cable or a bad monitor, or is lying.
[QUOTE=thf;21500790]But the video signal in HDMI is identical to the DVI one, right?[/QUOTE]
Basically yes, of course.
VGA has a max resolution of 2048x1536@85Hz, while dual-link DVI maxes out at 2560x1600; I don't know the refresh rate. VGA theoretically has unlimited bandwidth but is limited by the graphics card to that resolution, and VGA also has all colors available. DVI is digital, so it has error correction and holds up better over distance. However, its bandwidth does have a limit, so resolution and refresh rate do too. DVI is also limited to 24-bit color, while it is usually 22-bit color on most TN panels.
I recently RMA'd my monitor and once I got it back, I started using VGA with it. I saw some really bad flickering and the colors seemed rather dull. I switched to DVI two days ago and it eliminated the flickering and the colors are much brighter (1680x1050). So I dunno.
[QUOTE=thf;21500790]But the video signal in HDMI is identical to the DVI one, right?[/QUOTE]
To an extent, yes.

[editline]11:07AM[/editline]

[QUOTE=Xera;21501391]I run a 1920x1080 monitor off a VGA connection; there is no visible quality loss at this res compared to DVI. Anyone who says there is has either a bad cable or a bad monitor, or is lying.[/QUOTE]
Or maybe you just have poor eyesight, or you've never connected it via DVI. I mean, is there really any reason you have it on VGA and not DVI, if you even do have a DVI cable?
[QUOTE=M_B;21506337]To an extent, yes.

[editline]11:07AM[/editline]

Or maybe you just have poor eyesight, or you've never connected it via DVI. I mean, is there really any reason you have it on VGA and not DVI, if you even do have a DVI cable?[/QUOTE]
My 21" CRT at 1600x1200 is nicer than my 22" LCD at 1680x1050. My LCD is also nicer on VGA than DVI; it can run 75 Hz at native resolution in VGA mode but only 60 Hz through DVI. There is zero difference in visual quality between VGA and DVI at resolutions below 1920x1200 on a good screen.

DVI still has room for improvement: single-link DVI can only handle 1920x1200@60Hz. That means that as 120 Hz monitors become more common, 1920x1200 is the highest resolution available even via dual-link DVI. VGA theoretically supports higher resolutions than even dual-link DVI, but the quality of the cable plays a greater role with VGA.
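Whether a given mode fits a DVI link boils down to its pixel clock staying under 165 MHz (single link) or 330 MHz (dual link). Here is a rough sketch of that check, assuming about 12% blanking overhead in the spirit of CVT reduced blanking (an assumption; real monitor timings vary):

[code]
# Rough check of which display modes fit single-link vs dual-link DVI.
SINGLE_LINK_MHZ = 165.0  # single-link TMDS clock ceiling
DUAL_LINK_MHZ = 330.0    # dual link doubles the ceiling
BLANKING = 1.12          # assumed ~12% blanking overhead (CVT-RB-ish)

def pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock for a mode, in MHz."""
    return width * height * refresh_hz * BLANKING / 1e6

for w, h, hz in [(1920, 1200, 60), (2560, 1600, 60), (1920, 1200, 120)]:
    clk = pixel_clock_mhz(w, h, hz)
    if clk <= SINGLE_LINK_MHZ:
        verdict = "fits single link"
    elif clk <= DUAL_LINK_MHZ:
        verdict = "needs dual link"
    else:
        verdict = "exceeds dual link"
    print(f"{w}x{h}@{hz}Hz: ~{clk:.0f} MHz -> {verdict}")
[/code]

Under those assumptions, 1920x1200@60Hz just squeezes into a single link, while 1920x1200@120Hz already eats most of dual link's headroom, which matches the limits described above.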
[QUOTE=M_B;21506337]Or maybe you just have poor eyesight, or you've never connected it via DVI. I mean, is there really any reason you have it on VGA and not DVI, if you even do have a DVI cable?[/QUOTE]
Uh, no. I have tested it with DVI. I would probably use it, except my PS3 is hooked up through it (HDMI to DVI) for HDCP support.
On both of the 19" LCDs I have, DVI gives a better quality image than VGA, especially on my older CHIMEI LCD; VGA on that causes very slight blurry patches on the screen, which can be adjusted by the screen's controls, but never totally eliminated. Using a DVI cable fixed this. I'm also able to run my main 19" Viewsonic at 75Hz on both DVI and VGA, but VGA seems brighter and crisper, and IMO looks better when running at non-native res (for old DOS games and such). I see no point in using VGA cable if I have a DVI cable, and since my graphics card doesn't have a VGA socket on it I'd need an adapter to run VGA, which makes using DVI simpler.
[QUOTE=Unreliable;21498452]DVI is more advanced than VGA.

DVI bandwidth: 3.96 Gbit/s (single link), 7.92 Gbit/s (dual link). That's a fucking lot.

[editline]12:32AM[/editline]

I doubt VGA comes near that. I couldn't even get my display past 640x800 over VGA without it looking like shit.[/QUOTE]
I'm rocking 2944x1440 over VGA, and shit's so crisp.

[editline]09:08PM[/editline]

[QUOTE=4RT1LL3RY;21503976]VGA has a max resolution of 2048x1536@85Hz, while dual-link DVI maxes out at 2560x1600; I don't know the refresh rate. VGA theoretically has unlimited bandwidth but is limited by the graphics card to that resolution, and VGA also has all colors available. DVI is digital, so it has error correction and holds up better over distance. However, its bandwidth does have a limit, so resolution and refresh rate do too. DVI is also limited to 24-bit color, while it is usually 22-bit color on most TN panels.[/QUOTE]
Kill whoever told you that.
So it looks like the consensus is that there's no consensus. I've been googling around, and every site I check is just as filled with contradictory information. This is weird: I've never seen such a blatantly undecided technical issue before.
[QUOTE=ButtsexV17;21510779]I'm rocking 2944x1440 over VGA, and shit's so crisp.

[editline]09:08PM[/editline]

Kill whoever told you that.[/QUOTE]
What kind of monitor(s) do you have?
I had to let a friend borrow my DVI cable at a recent LAN party, since he only brought his VGA cable and only my monitor had a VGA port, and I could tell the picture seemed rather fuzzy compared to DVI, where it's perfectly crisp.
Like I said in the previous thread, I noticed a vast colour improvement on my HDTV with a switch to HDMI. Also, I could run a thinner cable (due to a lesser shielding requirement), allowing me to stuff a Cat5e cable into the bundle in the wall.
Lesser shielding requirement? You don't really need much shielding with digital connections, aside from maybe keeping the cable from getting pinched and frayed.
Well, I certainly wouldn't like to run an unshielded cable alongside 10 m of mains flex, regardless of transmission method. There is a point where the SNR becomes so bad that even digital data gets corrupted.
DVI is less susceptible to signal interference, because it is, well, digital. If your VGA cable is improperly shielded, it causes weird lines and such on the screen. If you've got a shit DVI cable, you may never know it, unless it doesn't work, period.
I have a dual-VGA setup, and only one of the monitors will go to 1920x1080; the other errors out with 'Out of range', so I have to run it at 1680x1050.