Apparently they're noticeably different, but from Google I get mixed opinions, so I'm curious as to what FP has to say about the matter.
Okay, and what about them? DVI is digital, and VGA is analog. If you can, use DVI. VGA will work fine too as long as it's not some huge resolution.
most people here will say the difference is huge but it's negligible anywhere below 3 million pixels
What are you asking exactly? How well either displays images? Connections? Quality?
DVI is the best, hands down. Especially on larger resolutions.
DVI has more pins on the connector and is digital, so cable quality has little effect on signal quality.
VGA uses a 15-pin connector. VGA is analog and is more susceptible to interference because of that. The quality of a VGA cable has little impact until you get to high resolutions or long runs. Just about any VGA cable handles 1600x1200@100hz fine. The slightly thicker ones are what I use when I hook up to an HDTV or other high-resolution screen. I have a 6 foot medium-thickness cable that drives my 22" CRT at 1920x1440@85hz with no interference or scanline artifacts.
A good CRT monitor shows little to no interference with a solid connection, much like HDMI on a digital screen. You're best off using DVI or HDMI for any type of digital panel (LCD, plasma, etc.), and VGA only for CRTs and other analog screens.
[QUOTE=Demache;26488894]Okay, and what about them? DVI is digital, and VGA is analog. If you can, use DVI. VGA will work fine too as long as it's not some huge resolution.[/QUOTE]
As in, monitors look "better" with DVI as opposed to VGA.
Right now I'm using a VGA for my laptop and it's displaying at 1920x1080.
There's a bit of static on certain shades of colors, so basically, will using a DVI cord on the other computer I plan to have built by Christmas clear it up?
oh, and with DVI you can get away with less shielding.
[QUOTE=Mobon1;26488993]As in, monitors look "better" with DVI as opposed to VGA.
Right now I'm using a VGA for my laptop and it's displaying at 1920x1080.
There's a bit of static on certain shades of colors, so basically, will using a DVI cord on the other computer I plan to have built by Christmas clear it up?[/QUOTE]Yes, it should.
[QUOTE=Mobon1;26488993]As in, monitors look "better" with DVI as opposed to VGA.
Right now I'm using a VGA for my laptop and it's displaying at 1920x1080.
There's a bit of static on certain shades of colors, so basically, will using a DVI cord on the other computer I plan to have built by Christmas clear it up?[/QUOTE]
Most laptops don't have DVI ports, usually just VGA and HDMI. HDMI is just like DVI but with a different connector. You can use an adapter to go from DVI to HDMI without a problem as long as you don't need audio.
[QUOTE=4RT1LL3RY;26488932] I have a 6 foot long medium thickness cable that drives my 22" CRT at 1920x1440@85hz with no interference or scanline artifacts.
[/QUOTE]
I've got a thicker 4 foot cable driving mine at 2048x1536@60Hz and it looks great
[editline]4th December 2010[/editline]
[QUOTE=4RT1LL3RY;26489046]Most laptops don't have DVI ports, usually just VGA and HDMI. HDMI is just like DVI but with a different connector. You can use an adapter to go from DVI to HDMI without a problem as long as you don't need audio.[/QUOTE]
iirc hdmi is all digital, dvi is hybrid digital/analogue
[QUOTE=4RT1LL3RY;26489046]Most laptops don't have DVI ports, usually just VGA and HDMI. HDMI is just like DVI but with a different connector. You can use an adapter to go from DVI to HDMI without a problem as long as you don't need audio.[/QUOTE]
Well yeah, this laptop's a bit old, so basically I just wanted to know if something was generally crap about my monitor or if it was just the VGA cord I'm using at the moment.
[QUOTE=ButtsexV3;26489067]iirc hdmi is all digital, dvi is hybrid digital/analogue[/QUOTE]
But the digital part of DVI has an identical signal iirc
Also, since I have HDMI, would it make a difference if I used it over DVI?
I know you get audio, but I'm using separate speakers rather than the ones built into my monitor (as they probably sound shitty).
[QUOTE=thf;26489144]But the digital part of DVI has an identical signal iirc[/QUOTE]
I think it does but either way it's not just like it with a different connector.
VGA is an analog transmission of the data, which can degrade; the cord doesn't have enough bandwidth to handle very large resolutions perfectly clearly, and shitty cords will get you some interference. DVI is digital, so the signal doesn't degrade and there's no interference (DVI and HDMI are pretty much the same thing sans encryption).
VGA vs DVI was something I got caught up in and thought it was a big deal and I NEEDED DVI TO GET BETTER PICTURE QUALITY, but seriously it doesn't matter. Unless you're noticing your screen has analog artefacting or scanline problems, there's zero reason to switch. Large resolutions, above 1080p, will see anywhere from a slight to a somewhat noticeable difference in quality, but nothing SPECTACULAR AND MINDBLOWING.
Of course, DVI is the superior choice, so if you DO have a choice, go DVI; just don't run out and replace all of your VGA cords, because it doesn't matter. There's a reason we still use VGA today: if it were terrible and shitty and useless it would have gone the way of coax.
[editline]4th December 2010[/editline]
ofc if you're using a 3000x2000 resolution you'd better be using DVI
[QUOTE=ButtsexV3;26489178]I think it does but either way it's not just like it with a different connector.[/QUOTE]
I thought that DVI-I was analog and/or digital.
DVI-A was just analog, and isn't used for anything anymore as VGA is more convenient.
DVI-D was just digital and had the exact same signal spec as HDMI.
3000x2000@60hz isn't doable on a single dual-link DVI port, is it?
[QUOTE=4RT1LL3RY;26489328]I thought that DVI-I was analog and/or digital.
DVI-A was just analog, and isn't used for anything anymore as VGA is more convenient.
DVI-D was just digital and had the exact same signal spec as HDMI.
3000x2000@60hz isn't doable on a single dual-link DVI port, is it?[/QUOTE]
Not on one, but it could do it on 2 easily.
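You can sanity-check that with some back-of-envelope math. This sketch assumes a rough 12% blanking overhead (in the spirit of reduced-blanking timings; real CVT figures vary) and the DVI spec's 165 MHz TMDS pixel clock per link:

```python
# Rough check: can one dual-link DVI port drive 3000x2000 @ 60 Hz?
# Assumes ~12% blanking overhead (a reduced-blanking-style estimate)
# and DVI's 165 MHz max pixel clock per TMDS link.

SINGLE_LINK_MHZ = 165.0           # DVI spec limit per link
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.12):
    """Estimate the pixel clock needed, including blanking intervals."""
    active_pixels_per_sec = width * height * refresh_hz
    return active_pixels_per_sec * (1 + blanking_overhead) / 1e6

clk = pixel_clock_mhz(3000, 2000, 60)
print(f"needed: ~{clk:.0f} MHz, dual-link limit: {DUAL_LINK_MHZ:.0f} MHz")
print("fits on one dual-link port" if clk <= DUAL_LINK_MHZ else "needs two ports")
```

That works out to roughly 400 MHz needed against a 330 MHz dual-link ceiling, which is why one port won't cut it but two will.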
[QUOTE=Cheesemonkey;26489207]VGA is an analog transmission of the data, which can degrade; the cord doesn't have enough bandwidth to handle very large resolutions perfectly clearly, and shitty cords will get you some interference. DVI is digital, so the signal doesn't degrade and there's no interference (DVI and HDMI are pretty much the same thing sans encryption).
VGA vs DVI was something I got caught up in and thought it was a big deal and I NEEDED DVI TO GET BETTER PICTURE QUALITY, but seriously it doesn't matter. Unless you're noticing your screen has analog artefacting or scanline problems, there's zero reason to switch. Large resolutions, above 1080p, will see anywhere from a slight to a somewhat noticeable difference in quality, but nothing SPECTACULAR AND MINDBLOWING.
Of course, DVI is the superior choice, so if you DO have a choice, go DVI; just don't run out and replace all of your VGA cords, because it doesn't matter. There's a reason we still use VGA today: if it were terrible and shitty and useless it would have gone the way of coax.
[editline]4th December 2010[/editline]
ofc if you're using a 3000x2000 resolution you'd better be using DVI[/QUOTE]
We still use old-school coaxial on our TVs, though our cable company has it set up in hybrid analog/digital mode. I'm surprised that old coax cables have enough bandwidth to handle 80 analog channels, 80 digital channels (some in 1080), a few hundred subscriber digital channels, and internet.
[QUOTE=Demache;26489421]We still use old-school coaxial on our TVs, though our cable company has it set up in hybrid analog/digital mode. I'm surprised that old coax cables have enough bandwidth to handle 80 analog channels, 80 digital channels (some in 1080), a few hundred subscriber digital channels, and internet.[/QUOTE]
It's called compression.
Usually when you have that many channels and internet on one cable, they skimp on quality by overcompressing HD.
Fiber evvrryy day.
[QUOTE=>VLN<;26489535]It's called compression.
Usually when you have that many channels and internet on one cable, they skimp on quality by overcompressing HD.
Fiber evvrryy day.[/QUOTE]Still, that's a whole lot of content considering how old coax is. Then again, we had to upgrade the cabling in our house a few years back because our cable internet and digital box would fight each other for signal.
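For a sense of why coax can carry all that, here's a rough capacity estimate. The figures are assumptions, not from this thread: typical North American 6 MHz channel slots, 256-QAM at about 38.8 Mbit/s of payload per slot, and a ~750 MHz plant with downstream spectrum starting around 54 MHz:

```python
# Back-of-envelope: digital capacity of an old coax cable plant.
# Assumed typical figures: 6 MHz channel slots, 256-QAM at ~38.8 Mbit/s
# per slot, usable downstream spectrum ~54-750 MHz.

CHANNEL_WIDTH_MHZ = 6
QAM256_MBPS = 38.8              # approx payload of one 256-QAM slot
SPECTRUM_MHZ = 750 - 54         # usable downstream spectrum

slots = SPECTRUM_MHZ // CHANNEL_WIDTH_MHZ
total_mbps = slots * QAM256_MBPS
# Each slot holds ~10 compressed SD streams or ~2-3 HD streams,
# which is how hundreds of channels plus internet share one cable.
print(f"{slots} slots, ~{total_mbps / 1000:.1f} Gbit/s aggregate")
```

Even with some slots left analog, that's several gigabits per second of aggregate downstream capacity, so the channel count is less surprising than it looks.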