Nvidia announces GeForce RTX 2070, 2080, and 2080 Ti, launching September 20
1. I didn't even mention consoles.
2. Yes they do, actually. Given how few games are PC exclusives nowadays, even those are limited by the bar set by console games. It just isn't profitable to push that bar anymore - especially since the most profitable games on the market right now are below average in terms of graphics (PUBG, Fortnite, Overwatch, etc.).
so we used DirectX 9 for like ten years because we just didn't like DX10 and not because the 360 was stuck on DX9?
Technically you're right, but PC exclusives are so rare these days. The majority of cross-platform games are developed for consoles (the biggest market) first (correct me if I'm wrong). This dictates the initially available technology. Everything fancy on the PC is a bonus, and devs today typically don't give the PC version as much attention technology-wise as they do the console versions.
Yeah, plenty if not most games look better on PC, but it's rare as fuck for a dev to go the extra mile and make a game really shine on PC specifically. Crysis 1 and The Witcher 3 are the only ones that come to mind off the top of my head. Most devs can barely be assed to optimize for PC in the first place.
Just you wait, it's only a matter of time before someone comes up with a raycoin that's mined specifically with ray-tracing operations.
No, my friend, this stuff is not as much of a conspiracy as you think! I've been running a GTX 550 Ti for a long time now, and for many years I thought I could get away with it on the lowest resolution and lowest settings... but the minimum hardware requirements for games have gotten steeper these past few years! I can't run games like Mankind Divided because my card is apparently below minimum reqs, and it just crashes erratically sometimes. I'm surprised you can get away with 1080p, really; my 550 Ti can barely chug along at the lowest resolution in some games.
Anyway, at some point you're gonna have to upgrade just to play some of the newer games when the time comes... you must succumb to the conspiracy eventually.
Snagged a 1080 Ti for $599; 10xx cards are just hitting rock bottom.
Can't think of where else to post this, and I hadn't seen it yet.
https://www.youtube.com/watch?v=2NsdM1VS5u8
If this is the absolute best it can do right now then I really can't justify the price. It definitely looks nice but not considerably better than traditional lighting. I'm sure that in the future the tech will shine but right now I don't see much point in dropping $1k on one.
I've decided to upgrade my home theater system instead
I can't justify paying $500+ for Crysis-mentality GPUs anymore, especially since games reached the point of diminishing returns for graphics years ago. It's been almost 7 years since BF3's release and that game still looks damn good.
The new tech certainly is cool, but personally I'd rather have more FPS to future-proof my investment instead of paying hundreds of dollars more for 20 FPS that I'm going to lose to ray tracing anyway. I mean shit, we still can't get a stable 120 FPS in modern games running at 1080p with the best card on the market today.
I wish game and GPU companies would put more focus on optimization and user experience (temps, power draw), because 30+ FPS drops are a lot more noticeable to me than nicer reflections in a street puddle.
I'm still wondering what the hell exactly DLSS does behind the scenes. Are they basically doing checkerboard rendering and then using neural networks to intelligently interpolate the missing pixels? The fact that it has to be specifically implemented into the game (as opposed to being able to force it at the driver level) and the advertised performance increase would suggest that it's possibly something along those lines.
I'm still skeptical about how good it'll actually look in practice; it can't really be as perfect as the cherry-picked examples they're showing right now. That said, even if it clearly falls short of native res + proper AA, it's still a welcome addition if you want to play at stupidly high resolutions like 4K.
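If it really is something checkerboard-flavored, the dumb non-neural version of the idea would look roughly like this (pure speculation on my part; actual DLSS presumably swaps the averaging step for a trained network fed extra inputs like motion vectors):

```python
import numpy as np

# Toy sketch of the checkerboard guess: render only half the pixels each
# frame, then reconstruct the skipped ones from their rendered neighbours.
# Here the "reconstruction" is plain neighbour averaging; the whole point
# of DLSS (if this guess is right) would be replacing this with a network
# that fills the gaps more intelligently.

def checkerboard_mask(h, w):
    """True where a pixel was actually rendered this frame."""
    yy, xx = np.indices((h, w))
    return (yy + xx) % 2 == 0

def reconstruct(half_frame, mask):
    """Fill skipped pixels from the average of their rendered neighbours.

    half_frame: (h, w) array with zeros at the skipped positions.
    """
    padded = np.pad(half_frame, 1)            # zero-pad the borders
    pmask = np.pad(mask, 1).astype(float)
    # Sum of the four axis-aligned neighbours, and how many were rendered.
    neigh_sum = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:])
    neigh_cnt = (pmask[:-2, 1:-1] + pmask[2:, 1:-1] +
                 pmask[1:-1, :-2] + pmask[1:-1, 2:])
    out = half_frame.copy()
    out[~mask] = neigh_sum[~mask] / np.maximum(neigh_cnt[~mask], 1)
    return out

# Usage: pretend we rendered a 4x4 frame at half rate.
mask = checkerboard_mask(4, 4)
frame = np.where(mask, 1.0, 0.0)              # rendered pixels are all 1.0
full = reconstruct(frame, mask)               # skipped pixels come back ~1.0
```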
I wish checkerboarding were more widespread on PC, so I hope DLSS takes off. There are a lot of games where, even with a high-end card, 1440p 60+ is easy but you can't really hit 120/144. Maybe with it, high resolution at high refresh rates will become possible on the 2060/2070 cards.
It could very well be a 2070 in that benchmark. The article states that it could be either, since the only numbers they have to go off of are the memory and effective speed, which are the same between both cards. I'm not sure why they'd bet it's a 2080 just based on the fact that the GPU score is essentially equal in the comparison.
Because the leap in tech isn't that big? It's not a massive change-up like the 900 series to the 1000 series; it's more like 800 to 900, which was a slight boost in efficiency/features.
At least I'll be able to pick up a 1080 Ti at some point for a lower price
They're still making a bold claim for the Pascal -> Turing change. 50% better performance per core? IF that claim is actually true, then theoretically it puts the 2070 on par with the 1080 Ti.
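Napkin math with the announced core counts, ignoring clocks and memory bandwidth entirely:

```python
# Back-of-the-envelope check of the "50% better per core" claim, using the
# public CUDA core counts for both cards. Clock speeds, memory bandwidth,
# and everything else are deliberately ignored here.
gtx_1080_ti_cores = 3584
rtx_2070_cores = 2304

per_core_uplift = 1.5                          # Nvidia's claimed Pascal -> Turing gain
effective_cores = rtx_2070_cores * per_core_uplift

print(effective_cores)                         # 3456.0
print(effective_cores / gtx_1080_ti_cores)     # ~0.96 -- roughly 1080 Ti territory
```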
That's clever marketing, technically it's true when you isolate performance to purely ray tracing, or perhaps in games that support their DLSS stuff.
They can make claims all day, but it's odd they didn't show off any real gameplay demos other than the RT stuff. Like, no shit RT is better on the RT card; what's the performance for regular gaming?
I mean, their demos that showed RTX weren't pre-renders...
https://www.youtube.com/watch?v=VK7lL3E2LVc
I mean with baked lighting instead of RT. I know it was live.
It's not about looking nice; it's about getting lighting that looks nice after a day's worth of real-time passes instead of two weeks of bakes that end with 'fuck it, go with pass 14, we're three days over milestone'.
SLI isn't a thing anymore, right?
It's "NVLink" now
That sounds just as retarded as this new RTX branding.
It’s not worth it anyway
Not even for stuff like VR?
AFAIK SLI/NVLink has never been compatible with VR. Nvidia/Oculus were touting a lot of stuff back before the CV1 came out about Nvidia working on one-GPU-per-eye rendering, but it never showed up. All I've heard is that SLI/dual-card setups cause unbearable stuttering and frame drops in VR applications. I can't think of a single period of time when multi-GPU setups were useful for anything other than compute applications or bragging rights.
What do you mean by baked lighting?