Nvidia come out swinging for the PC at E3: “the PC is the most powerful gaming platform out there”
[QUOTE=Clavus;41032615]The biggest contribution of the new generation of consoles for PC gaming is that they'll drastically increase the minimum requirements for games, which is a good thing. 8GB of RAM being used optimally to create huge seamless worlds? Awesome.[/QUOTE]
It should be noted, though, that the 8GB RAM in the Xbox One and PS4 is shared between the GPU and whatever the rest of the system is doing. As a result, I think we're gonna see mostly higher resolution textures, and not really that big a jump in RAM utilization on the PC.
There's no such thing as future proofing your computer.
[QUOTE=Odellus;41038286]all of the crysis games already did
it doesn't make that big of an impact in performance[/QUOTE]
Just Crysis 1, and it didn't affect performance because all the other systems in the game were designed to keep memory usage under the 32-bit address limit
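For context (my own back-of-the-envelope numbers, nothing from Crytek): a 32-bit process can address at most 2^32 bytes = 4 GiB, and a plain 32-bit Windows build only gets around 2 GiB of that for itself, so every streaming and asset system has to be budgeted to stay under that ceiling. Rough sketch of what hitting the wall looks like:
[code]
// Hedged sketch: probing the 32-bit allocation ceiling. Assumes a 32-bit
// Windows build without the large-address-aware flag (~2 GiB of user address
// space); on a 64-bit build the loop just runs to the end. Example values,
// not Crytek code.
#include <cstdio>
#include <cstdlib>

int main() {
    std::printf("pointer size: %zu bits\n", sizeof(void *) * 8);

    // Ask for progressively larger single blocks; a 32-bit process
    // typically fails somewhere below 2048 MiB.
    for (std::size_t mib = 256; mib < 4096; mib += 256) {
        void *p = std::malloc(mib * 1024ull * 1024ull);
        if (!p) {
            std::printf("allocation failed at %zu MiB\n", mib);
            return 0;
        }
        std::free(p);
    }
    std::printf("never hit the ceiling (64-bit build?)\n");
    return 0;
}
[/code]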
[QUOTE=Dr.C;41042481]On the brightside, even if we can't run a game at max settings, it will still look great right?[/QUOTE]damn right. Crysis 2 looked beautiful even on the lowest preset and ran on my GTX260 + Phenom tri-core toaster perfectly smoothly
Well no shit your £600 GPU is more powerful than a £350 console.
[QUOTE=ZombieDawgs;41044816]Well no shit your £600 GPU is more powerful than a £350 console.[/QUOTE]
I think you mean the [url=http://www.scan.co.uk/products/2gb-msi-gtx-680-28nm-pcie-30-(x16)-6008mhz-gddr5-gpu-1006mhz-boost-1058mhz-cores-1536-plusfree-game]£330 GTX 680 GPU[/url] listed in the chart.
[QUOTE=pebkac;41032127]So, 14 years? How about no. Let's get realistic here, you are going to need some upgrades every 4 years or so if you want to play all the latest games. If not for other reasons it's simply because developers tend to drop support for older hardware because it would be kind of a hassle to make sure everything runs properly over 5+ generations of hardware, even though the old hardware is still powerful enough compared to consoles. Many games these days require DX10, some even DX11, just to run. So a machine with a core 2 duo and a radeon x1950 or geforce 7900 won't be any good today even if it was high end 7 years ago and is still on par with the consoles.[/QUOTE]
I'd definitely push the envelope past 4 years. I had a four-year-old laptop with middling hardware that still had no serious issues playing games. Sure, I had to turn details down to minimum, but considering I had to do that often even when it was new, that's not such a huge loss.
You could, imho, get 6 years minimum out of higher-end desktop hardware, even longer if you don't need to play on max details all the time.
I mean, just look at the HD 3000. The fact that an integrated GPU can run a lot of stuff while being massively weaker than plenty of older dedicated GPUs should be telling.
[URL]http://www.notebookcheck.net/Intel-HD-Graphics-3000.37948.0.html[/URL]
And that's two-year-old hardware, and on the very low end.
[quote]
But let me rephrase it: If you want to be able to play 99% of the games, 7 years old hardware will do (8800GTX, the first DX10 GPU, came out in 2006), but that would mean a very high end system at the time. If you want to play ALL games, more than 4 years old hardware won't suffice (Radeon 5870, the first DX11 GPU, came out in 2009).
If you have anything older than 7 years, you're going to be missing out on many games that require DX10.
[/quote]
A lot of games require DX10 on the software side (Vista and later) but are just fine with DX9 cards.
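To illustrate (generic API sketch, not from any particular game): one way this happens is a renderer built on the D3D11 runtime, which is why the box lists Vista/7 and "DX10", that asks the runtime for a downlevel feature level so it still runs on a DX9-class card:
[code]
// Hedged C++ sketch: creating a Direct3D 11 device that can fall back to
// DX9-class hardware via feature levels. The point is that the OS/runtime
// requirement (Vista/7 + the D3D11 runtime) is separate from the card's
// hardware generation. Generic example, not lifted from any shipped game.
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

bool CreateDeviceWithFallback(ID3D11Device **device,
                              ID3D11DeviceContext **context,
                              D3D_FEATURE_LEVEL *picked)
{
    // Listed newest-first; the runtime picks the highest level the GPU
    // supports, dropping down to 9_x on DirectX 9-era cards.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,  D3D_FEATURE_LEVEL_9_2,  D3D_FEATURE_LEVEL_9_1,
    };

    HRESULT hr = D3D11CreateDevice(
        nullptr,                       // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        levels, ARRAYSIZE(levels),
        D3D11_SDK_VERSION,
        device, picked, context);

    return SUCCEEDED(hr);
}
[/code]
The flipside is the DX10/DX11-exclusive games pebkac is talking about, which set 10_0 or 11_0 as the minimum and genuinely won't start on a DX9 card.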
[QUOTE=GreenDolphin;41044885]I think you mean the [url=http://www.scan.co.uk/products/2gb-msi-gtx-680-28nm-pcie-30-(x16)-6008mhz-gddr5-gpu-1006mhz-boost-1058mhz-cores-1536-plusfree-game]£330 GTX 680 GPU[/url] listed in the chart.[/QUOTE]
I don't think you get what I was trying to say, do you?
[QUOTE=pebkac;41043701]Yeah, I'm pretty sure you'd need to build a top of the line multi GPU system if you want[B] everything[/B] to run at 60fps at max settings 4 years into the future, and even that is questionable.
But let me rephrase it: If you want to be able to play 99% of the games, 7 years old hardware will do (8800GTX, the first DX10 GPU, came out in 2006), but that would mean a very high end system at the time. If you want to play ALL games, more than 4 years old hardware won't suffice (Radeon 5870, the first DX11 GPU, came out in 2009).
If you have anything older than 7 years, you're going to be missing out on many games that require DX10.
If you are one of those people who actually builds a top of the line system, you probably don't do it to be future proof so you can still play games on it 6 years later, you do it because you want to always have the best possible system available. Otherwise it makes way more sense to build a system that's half the price and replace it every 3 or 4 years or continually upgrade it when it becomes necessary.
So please, let's stop pretending that building a future proof PC that's going to last a whole console generation is actually a sensible thing to do. The reality is that you simply will need to upgrade every now and then.[/QUOTE]
You just misread what I wrote. I said "for 1080p 60 fps, yes, you need expensive hardware", but for playing at console standards (sub-60 fps) and at not-so-high resolutions, you don't need high-end stuff. Also, you never need multi-GPU; you can just get one powerful GPU, with no microstuttering as a bonus.
[QUOTE=Sgt-NiallR;41024643]I have a friend whose brother is your standard 13-year-old squeaker on XBL.
Most of the time he's an all right guy, but it's a little pathetic to see him try to compare ARMA3 to BF4.
It's a shame that people tend to think of PCs as either a downgrade or a solid equal to consoles. Especially when it can be so hard to persuade them otherwise.[/QUOTE]
I don't know how anyone can see the PC as a downgrade next to a console...
Which brings me to this: about a week ago I was with my friends, and they said consoles are better for games than PC, because the PC has all the other background stuff running and so on.
Sure, PCs do have background stuff running, but using that to compare them to consoles is pretty dumb, especially when console games are (or were) heavily optimized.
Too bad no one will listen.
Everyone I know, outside of a few friends, thinks you can't play games on a PC or that the PC has no good games, or they try to play Crysis on the shitty desktop their parents got in 2003, and when it doesn't run well, they think that's how PC gaming is supposed to be and shit-talk it all the time.
The reason the console market is so big is that consoles are a social tool for playing with friends, sober or drunk, on the couch. The PC is for playing "alone" with hundreds of other people (depends on the game), on a system you fully control.
[QUOTE=Clavus;41032615]The biggest contribution of the new generation of consoles for PC gaming is that they'll drastically increase the minimum requirements for games, which is a good thing. 8GB of RAM being used optimally to create huge seamless worlds? Awesome.[/QUOTE]
One major caveat I hope gets addressed is proper, optimized code. None of this Notch bullshit.
[URL="http://www.codinghorror.com/blog/2008/12/hardware-is-cheap-programmers-are-expensive.html"]Sadly[/URL]
This just made me remember that the PC expo is a thing and Star Citizen might make an appearance.
Color me excited as fuck
[editline]17th June 2013[/editline]
Oh wait, just checked: they didn't.
sad faic