• Is 750W enough for 770 SLI?
    34 replies
My current rig is an i7 2600K @ 4.3 GHz, 8GB RAM, a GTX 580, and a Corsair TX750 V2 750W PSU, with 1 SSD and 2 HDDs. I was wondering: if I go 770 SLI (I plan to get one now and add a second down the line when I need it), would it work fine, or would I have to buy another PSU?
You'll be fine.
[QUOTE=Zerokateo;41679221]You'll be fine.[/QUOTE] Thanks for the short and sweet answer; hopefully more people can pitch in :D
I run a dual-580 rig on a 1000W PSU, and even in the most demanding games I pull maybe 650W from the wall. So dual 770s will likely still give you headroom.
You're good.
I have a 770 and a 2600K @ 4.4GHz, along with 5 hard drives and an SSD, and I'm only pulling 500-600W under full load. You'll be fine.
More than enough.
Thanks a lot guys! It seems the 770 is the best card in the $400-500 range ATM. Would you say the 7970 is better since I plan to use a 3-screen display (one for gaming, the other two for movies/forums/etc.)? Does VRAM affect that at all, or not really?
The 7970 is dated compared to the 770, and the 770 is slightly faster. I'd suggest going with the 770 instead, also because of CUDA and PhysX support.
[QUOTE=B!N4RY;41682312]The 7970 is dated compared to the 770, and the 770 is slightly faster. I'd suggest going with the 770 instead, also because of CUDA and PhysX support.[/QUOTE] Alrighty, ty =D
i7-2600k - 95W (probably 150W overclocked)
2 x GTX 770 - 460W
Motherboard + RAM - around 40W max
SSD - 5W max
HDDs - 34W max

Under maximum load you can expect to pull around 689W. You'd have around 60W to spare.
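The tally above is just addition; as a sanity check, it can be sketched like this (the per-component figures are the worst-case estimates from this post, not measured values):

```python
# Rough peak-draw estimate for the proposed 770 SLI build,
# using the worst-case (overclocked CPU) figures quoted above.
components = {
    "i7-2600K (overclocked)": 150,
    "2 x GTX 770": 460,
    "Motherboard + RAM": 40,
    "SSD": 5,
    "HDDs": 34,
}

psu_rating = 750
total = sum(components.values())
headroom = psu_rating - total

print(f"Estimated peak draw: {total}W")                 # 689W
print(f"Headroom on a {psu_rating}W PSU: {headroom}W")  # 61W
```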
Well, just to be safe I would go for a 1000W unit; the 750W can run it, but you never know.
I've heard, don't quote me, that if you plan to use a PSU for a few years, you need to factor in a certain % of power/efficiency lost every year, so you should always exceed your maximum power requirements by 100-200 watts or so.
I have a really solid PSU from Corsair, so I don't think that should be a problem. But I'm stuck between 760 SLI and a single 770/780. The only thing I'm worried about with SLI is the stuttering people talk about.
I've got a 590, and I've never really noticed any stuttering. I don't know if having the dual GPUs on a single card helps or not, though.
[QUOTE=Del91;41695432]I've heard, don't quote me, that if you plan to use a PSU for a few years, you need to factor in a certain % of power/efficiency lost every year, so you should always exceed your maximum power requirements by 100-200 watts or so.[/QUOTE] The good rule of thumb is to double the watts you need, because a PSU runs most efficiently at about 50% load; if you factor in loss, you're looking at about a 750W PSU for a roughly 400W (give or take) setup. I tested this once with a shit Cooler Master PSU in my old machine, and it's still running flawlessly after 6 years now. So either I have the best luck with PSUs, or it's a functional rule. [editline]3rd August 2013[/editline] [QUOTE=GiGaBiTe;41695038]i7-2600k - 95W (probably 150W overclocked), 2 x GTX 770 - 460W, Motherboard + RAM - around 40W max, SSD - 5W max, HDDs - 34W max. Under maximum load you can expect to pull around 689W. You'd have around 60W to spare.[/QUOTE] Yeah, he wants a 1000W PSU, and even that's pushing it. A 750W PSU will fail after a few years because of the % of power/efficiency lost every year, sped up by the fact that you're putting the PSU under near-max load constantly.
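The "double the watts" rule of thumb described above (sizing so the estimated draw sits near 50% of the PSU's rated capacity) can be sketched as follows; the function name and the 0.5 default are illustrative, not from any standard:

```python
# Sketch of the "size for ~50% load" rule of thumb from the post above.
def psu_for(estimated_draw_w: float, load_factor: float = 0.5) -> float:
    """Rated capacity at which estimated_draw_w sits at load_factor of capacity."""
    return estimated_draw_w / load_factor

print(psu_for(400))  # 800.0 -> roughly a 750-850W unit, per the rule of thumb
```

Whether this rule is actually necessary is disputed later in the thread; it trades cost for efficiency and headroom.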
[QUOTE=draugur;41701541]The good rule of thumb is to double the watts you need, because a PSU runs most efficiently at about 50% load; if you factor in loss, you're looking at about a 750W PSU for a roughly 400W (give or take) setup. I tested this once with a shit Cooler Master PSU in my old machine, and it's still running flawlessly after 6 years now. So either I have the best luck with PSUs, or it's a functional rule.[/QUOTE] Alright, thanks. I'm probably just gonna get one 770 for now and upgrade the PSU when I need to, or just upgrade to a whole new card.
[QUOTE=GiGaBiTe;41695038]i7-2600k - 95W (probably 150W overclocked), 2 x GTX 770 - 460W, Motherboard + RAM - around 40W max, SSD - 5W max, HDDs - 34W max. Under maximum load you can expect to pull around 689W. You'd have around 60W to spare.[/QUOTE] Two 770s using 460W alone? That seems quite high.
[QUOTE=draugur;41701541]Yeah, he wants a 1000W PSU, and even that's pushing it. A 750W PSU will fail after a few years because of the % of power/efficiency lost every year, sped up by the fact that you're putting the PSU under near-max load constantly.[/QUOTE] Uh, what. The reason (quality) PSUs fail over time is that the capacitors wear out, not that the unit as a whole degrades and "loses capacity" (which it doesn't). The entire PSU is solid state apart from the capacitors, which use coiled paper doped in a dielectric. The dielectric breaks down with age, but that hardly makes a dent in efficiency. When the dielectric gets old, the PSU can't regulate voltage or ripple current as well, and that's what causes machines with old PSUs to start malfunctioning. You can easily rejuvenate a tired PSU by replacing the caps with new ones (I've done it dozens of times with dead and aging PSUs to give them another 5 years of life). [QUOTE=Brt5470;41702190]Two 770s using 460W alone? That seems quite high.[/QUOTE] A GTX 770 has a 230W TDP. TDP estimates are usually conservative, though, and the device can often exceed them in certain situations. Intel and AMD processors often draw far more than their rated TDP (I've seen 95W Sandy Bridge chips pull up to 150W, and 125W-rated AMD Bulldozer chips pull up to 200W).
[QUOTE=GiGaBiTe;41707415]A GTX 770 has a 230W TDP. TDP estimates are usually conservative, though, and the device can often exceed them in certain situations. Intel and AMD processors often draw far more than their rated TDP (I've seen 95W Sandy Bridge chips pull up to 150W, and 125W-rated AMD Bulldozer chips pull up to 200W).[/QUOTE] I bring it up because my MSI GTX 580s are rated at 260W TDP each, and I've just never seen them get that high. I have a UPS that shows real-time power usage for my whole desk (monitors, external drives, chargers, lamps, audio equipment), and the highest I've ever seen it get was 720W with Crysis 3 maxed out completely. Subtract my monitor and that's like 600W, which, even with my 8 drives, chargers, and all my fans, seems quite low. All I'm getting at is that, from my experience, a video card's TDP seems to assume every part of it is running fully loaded, and I've yet to see that happen.
[QUOTE=Brt5470;41712303]I bring it up because my MSI GTX580's are rated for 260W TDP each. And I've just never seen them get that high. I have a UPS which shows realtime power usage for my whole desk (Monitors, external drives, chargers, lamps, audio equipment.) and the highest I've seen it get ever was 720Watts in Crysis 3 maxed out completely. And when you subtract my monitor that's like 600watts, which even with my 8 drives, chargers, all my fans, and stuff is quite low it seems. All I'm getting at is from my experiences TDP for video cards seem to be if every part of it was running fully loaded, but I've yet to experience that.[/QUOTE] I really doubt your monitor alone pulls 120W, unless you have a CRT.
[QUOTE=GiGaBiTe;41713070]I really doubt your monitor alone pulls 120W, unless you have a CRT.[/QUOTE] Uh, have you never heard of a 27-30" IPS? From what I remember, Brt uses a ZR30w, and yeah, it hits 150W from the wall at max brightness. Panels in that range can easily go beyond 100W.
[QUOTE=GiGaBiTe;41713070]I really doubt your monitor alone pulls 120W, unless you have a CRT.[/QUOTE] Like Kaabi said, I have a ZR30w, and it normally pulls 125W at average brightness. It's a big panel that needs to produce a ton of light.
You should be fine. But if I may suggest something: if you have money to spend, get a Seasonic 860W Platinum. That way you're set for many years, whatever you change in your PC.
Maybe get a warranty for whatever PSU you get, just in case.
[QUOTE=Brt5470;41714657]Like Kaabi said, I have a ZR30w, and it normally pulls 125W at average brightness. It's a big panel that needs to produce a ton of light.[/QUOTE] Lumens don't scale linearly with power usage; it must be a really inefficient panel, or it uses shit CCFLs.
[QUOTE=GiGaBiTe;41729518]lumens don't linearly scale with power usage, it must be a really inefficient panel or uses shit CCFLs.[/QUOTE] Perhaps, though all monitors in this class use this kind of power.
[QUOTE=GiGaBiTe;41729518]Lumens don't scale linearly with power usage; it must be a really inefficient panel, or it uses [b]shit CCFLs.[/b][/QUOTE] You clearly don't understand professional monitors. No pro ever wanted anything but CCFLs until the era of proper RGB or GB-LED backlighting, because it's impossible to get proper white reproduction when "white" LEDs are just blue LEDs with a yellow phosphor, and they're limited to sRGB at best, so you can't use them for print work, where you should be using AdobeRGB. Apple's Cinema Display used the same panel as all the other WQHD monitors, but with LED backlighting. It uses less power, and it's also unsuitable for the applications something like a Dell U3011 is suited to. Like Brt said above me, every monitor in that class, except for very recent models using systems like GB-LED (see the Dell U3014), uses that amount of power. It's not that the panels are inefficient; it's that they're not comparable to some $90 Acer 1080p TN piece of garbage.
Frankly, I don't need the wide-gamut CCFL option on this monitor; I'd have loved an sRGB version directly, as I don't like working in AdobeRGB. It's annoying to shoot in it but then have to stick to sRGB for slideshows or web use. If they'd had a white-LED version at the time, I'd have gotten it, since this one helps warm my room up, and considering I'm in Florida, that's not a good thing.
[QUOTE=Heinserver;41695369]Well, just to be safe I would go for a 1000W unit; the 750W can run it, but you never know.[/QUOTE] Definitely not. It's a bad idea to overcompensate on power in the realm of PSUs. Why not find the max TDP of all your components, add them together, and get the PSU closest above that number, OP? Don't use any of the online calculators where you put in your components; many of them make you overcompensate by a ton just because they want you to buy their more expensive PSUs.
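The sizing approach suggested above (sum the rated TDPs, then pick the smallest rating above the total) can be sketched like this; the component figures are the estimates from earlier in the thread, and the list of retail ratings is an illustrative assumption:

```python
# Sketch of "sum the TDPs, pick the closest PSU above that number".
tdps_w = {
    "i7-2600K (overclocked)": 150,
    "2 x GTX 770": 460,
    "Motherboard + RAM": 40,
    "SSD": 5,
    "HDDs": 34,
}
retail_ratings = [550, 650, 750, 850, 1000]  # typical retail tiers (assumed)

total = sum(tdps_w.values())                             # 689
choice = min(r for r in retail_ratings if r >= total)    # 750

print(f"Estimated max draw: {total}W -> pick a {choice}W PSU")
```

By this method, the OP's existing 750W unit is exactly the tier this approach would pick.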