• Nvidia announces GeForce RTX 2070, 2080, and 2080 Ti, launching September 20
Depends entirely on how well the game you're playing manages frametime. A good example GN and Jay both used was F1 versus Sniper Elite: F1's hitches are like car wrecks in their severity, while Sniper Elite compensates very, very quickly for any hiccups, trying to get you back on track immediately. A game made in UE4 will have pretty clean transitioning as long as you don't just leave that shit at stock settings and bake (*cough* Capcom, *cough* Gearbox); stuff made in Criware or Unity tends to be jank as hell unless the team is really on their game.
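If you want a feel for what "compensating quickly" looks like in code, here's a toy sketch of the usual trick: clamping the per-frame delta so one bad frame can't snowball. Purely illustrative, not any engine's actual code.

```python
import time

# Toy illustration of hitch handling: clamp the per-frame delta so one
# slow frame doesn't lurch the simulation forward "car wreck" style.

MAX_DT = 1.0 / 30.0  # never advance the sim by more than a 30fps-sized step

def step_simulation(dt):
    pass  # stand-in for physics/animation/game logic

def run(num_frames=300):
    prev = time.perf_counter()
    for _ in range(num_frames):
        now = time.perf_counter()
        raw_dt, prev = now - prev, now
        # A 200ms hitch becomes one 33ms sim step; the game "loses" a bit
        # of wall-clock time but recovers smoothly instead of exploding.
        step_simulation(min(raw_dt, MAX_DT))

run()
```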
What? They tested it in that one article with the 1080 Ti vs 2080 Ti; it only squeezed out 20-30 frames more, which is garbage considering how much you're paying. $1K is ludicrous for such meh performance and hollow promises of future tech. If Nvidia can pull off getting devs to use DLSS or RT, that would be great, but I have a deep feeling it will be the next Nvidia tech that gets forgotten in a handful of years. You're way better off waiting for 7nm cards and getting the same performance leap the 10xx series had.
It's cool that you're spouting the most obvious fucking conclusion possible, one a majority of reviewers have already come to; meanwhile, in a large chunk of titles, 4K performance specifically for the Ti is much higher than the average spread. Did I say "omg run out and buy this card right now" anywhere in either post? No, no I did not.
No, but it sure as hell isn't "virtually doubled". It's a 20-40% performance increase, which is atrocious at that price point. And this is ignoring titles that have already been optimized for the card for benching. Stop being a dick.
4K is already a dead end in and of itself anyway. The data requirements alone make it virtually impossible to use in any meaningful way outside of very specific examples.
4K is going to be the standard in nonstreaming commercial video by the end of 2019.
That is fucking stupid.
I'm not the one ignoring specific across-the-board 4K benchmarks in favor of the aggregate total benchmarks in order to grab a fruit that's quite literally already lying on the ground and is butt-obvious. Meanwhile, in the real world, hitting a locked 90 at 4K/10-bit/1000-nit and 144-200 at 2K/8+2-bit/550+-nit is likely to be of salient value to a rather large chunk of people, considering their first three fab runs are already pre-sold out, so someone is obviously paying the completely lopsided premium already.
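For context on why those monitor spec combos are a big ask, here's some back-of-envelope math. The DP 1.4 payload figure is the published ~25.92 Gbit/s (32.4 Gbit/s raw minus 8b/10b encoding); this ignores blanking overhead, so real requirements run higher.

```python
# Rough uncompressed video bandwidth for a few resolution/refresh/bit-depth
# combos versus what a DisplayPort 1.4 link can actually carry.

def gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

DP14_PAYLOAD = 25.92  # Gbit/s after 8b/10b encoding overhead

print(f"4K 144Hz 10-bit: {gbps(3840, 2160, 144, 10):.1f} Gbit/s")  # ~35.8
print(f"4K  90Hz 10-bit: {gbps(3840, 2160,  90, 10):.1f} Gbit/s")  # ~22.4
print(f"2K 165Hz  8-bit: {gbps(2560, 1440, 165,  8):.1f} Gbit/s")  # ~14.6
print(f"DP 1.4 payload:  {DP14_PAYLOAD} Gbit/s")
```

Which is roughly why a locked 90 at 4K/10-bit fits a DP 1.4 link while 4K/144Hz/10-bit needs compression or chroma subsampling.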
I guess I am getting a 1080ti in a few weeks then. My 970 with its effective 3.5GB of vram is not cutting it anymore now that I threw in a 4k monitor.
http://i.imgur.com/iu9n61q.jpg These are the resolution stats according to Steam. A vast, vast minority of players are using 4K. They're throwing their money away.
Since it's their money and not yours, I'm assuming they're fine with it. As someone who's been running 4K/10-bit/550-nit for two years: my gaming has been fine, my game development has been fine, and my video watching has been fine.
You're missing the point. Using the "but they're good for 4K" argument is moot since so few users actually use 4K. For every other resolution, the price-to-performance ratio is just way too much on the 2000 cards. These cards sold out before ANY performance benchmarks were released. The flagship features of DLSS and ray tracing aren't even available yet. And from what we saw of ray tracing in the pre-release stage, it kills frames (though that is subject to change if the demos shown to the press just weren't optimized). With all of that, it is a waste of money for the vast majority of gamers, and this is only empowering Nvidia to be even more aggressive with their price hikes and rushed launches.
Nope. The person missing the point is you, and your armchair Twitter-activism-level logic doesn't mean anything. Here's why.

First, the initial three runs of all three SKUs have sold out. If 1080p were the de facto standard you claim it is, then they wouldn't have. Really that simple. Go back and read what I just wrote: I didn't say all available, I didn't say all three initial versions, I said three fab runs, including two runs that haven't even hit the streets, one of which isn't even fabbed yet. And those, as of right now, are still sold out. So apparently, despite the price and upgrade ceiling you say is the doom of the palantir, people don't care. Kinda like Apple: they don't care that the phone costs literally twice what it should based on name alone, they just want it. nVidia hasn't had competition for two gens and here we are; even professional reviewers aren't bothering to include AMD in the pipe. People don't care.

Secondly, 1.3% of 67-125 million users is 871K to 1.6 million people, so there are plenty of cards to be sold, and sold at apparently $1.2K a pop.

Thirdly, you're not going to compel Nvidia to do anything. They've done this every time they've had no competition, and this is the fourth time they've had an undisputed advantage over AMD. So not only is your argument based entirely on feels versus reals, your argument is circa the nVidia 5900, aka 2003, aka fifteen years ago, aka nVidia selling an overpriced leaf blower, and those sold out too. Meanwhile, double leverage for people that game at 4K apparently compels sales.
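For the math-inclined, that second point is easy to check (the 67-125 million Steam user totals are the post's own figures, not verified here):

```python
# Quick sanity check: 1.3% of the quoted Steam user base.
low, high = 67_000_000, 125_000_000
share = 0.013  # 4K share from the Steam hardware survey screenshot above

print(f"{low * share:,.0f} to {high * share:,.0f} potential 4K buyers")
# -> 871,000 to 1,625,000
```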
lmao wtf does this even mean. The numbers don't lie: 4K is not a large pool. People are acting really stupidly with their money for an upgrade that just isn't worth it. For fuck's sake, Linus didn't even come to a real conclusion on the cards because he feels they're unfinished and the launch was rushed (he didn't get drivers etc. until two days before the embargo lifted). The consensus is clear: while their 4K performance is great, that does not outweigh everything else. They're overpriced, and people are throwing their money at them. And I'm saying this is a problem, since we didn't even have numbers until yesterday; people bought them on hype alone, which is absolutely ridiculous. And the numbers we do have are saying that frankly it's just not worth it. Okay buddy, just ignore the fact that price per frame is just awful and well below last gen (even rivaling the Vega 64). My argument is based on actual, holistic statistics; you're cherry-picking the 4K results as if those are the only ones that matter. Not to mention that the 1080 Ti can match the 2080.
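To put "price per frame" in concrete terms: the prices below are launch MSRPs ($699 for the 1080 Ti, $1,199 for the 2080 Ti Founders Edition), but the fps numbers are placeholder assumptions to show the shape of the argument, NOT real benchmark results.

```python
# Dollars-per-frame at 4K with illustrative (assumed) framerates.
cards = {
    "GTX 1080 Ti": {"price": 699,  "fps_4k": 60},  # assumed baseline
    "RTX 2080 Ti": {"price": 1199, "fps_4k": 78},  # assumed +30% uplift
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['fps_4k']:.2f} per 4K frame")
# ~$11.65 vs ~$15.37 -- the uplift is real, the value proposition isn't.
```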
Nvidia espouses corporate bullshit, with super-inflated G-Sync prices, GeForce Experience, and anti-competitive developer relations, but this has been a long time coming. Raster rendering is awfully hacky for getting anything close to photorealism; I'm sure it makes graphics programming a pain in the ass to manually bolt on special effect after special effect to achieve what raytracing naturally accomplishes. I have the Predator X27 and will be upgrading my whole build from a GTX 1080/4930K system to a 2080 Ti + 9900K, so I'm really anticipating the raster rendering performance boost (looks like the 2080 Ti is about 90% faster than a 1080 at 4K, and the CPU upgrade should push that to over +100% performance in existing games, ignoring all the unused silicon) just as much as realtime raytracing (and DLSS); can't wait for it to catch on. AMD just needs to launch a card that also supports the DX12 raytracing extensions, because fuck Nvidia corporate and I hate having to support them to get something like this, but I've been waiting for this moment since first learning about raytracing over a decade ago.
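For anyone wondering what "naturally accomplishes" means: the core of raytracing is just intersection tests. Here's the textbook ray-sphere version as a minimal sketch; this is not Nvidia's RT core hardware or the DXR API, just the underlying idea that shadows and reflections are "more rays" rather than stacked raster-space hacks.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a normalized `direction` to the nearest hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic's a == 1 for a unit direction
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A shadow test is just another intersection: trace from the surface point
# toward the light and check whether anything is in the way.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```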
I would give it a few months; I imagine once DLSS sets in, the 2080 will see more of an increase over the 1080 Ti.
That's only if it can deliver on the promise. I'm waiting for actual tests rather than Nvidia-curated demos (which have fixed camera angles, meaning of course it works flawlessly). I 100% agree with Linus that people have bought these cards on a promise, and that promise isn't yet fulfilled. Raytracing isn't available in anything, and neither is DLSS. The launch was frankly rushed.
I don't understand Nvidia's angle here. Pascal and the mining boom edged AMD's Vega out of the market almost entirely. Now it seems like they're handing AMD a future victory by pricing their cards so high. All AMD needs to do is release Navi in Q1/Q2 2019 without ray tracing hardware and price it similarly to Turing at MSRP, or even lower if they're okay with whittling down profit margins for the sake of growing their user base. Maybe release some marketing insinuating that ray tracing isn't worth the price (because it clearly isn't). They'll regain mindshare almost immediately. Sure, Navi is just going to be another GCN refresh, so it won't blow any minds and will probably retain a lot of Vega's flaws, but I predict it'll be a boring, sensible choice for price-conscious consumers looking to upgrade from Maxwell or Pascal.
There's nothing to understand. They're literally passing the cost of developing the three new techs onto the consumer, and if it sticks, you can bet the 7nm slight-upgrade version will cost the same. This is what happens when there's no competition.
4K isn't even worth it when you factor in the number of gigs a single movie in 4K takes, in physical space or through streaming. Games are going to be even fucking worse.
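Ballpark math on that; the bitrates are typical published ranges (streaming 4K around 15-25 Mbit/s, UHD Blu-ray up to ~100 Mbit/s), not measurements of any particular title.

```python
# Rough storage footprint of a two-hour 4K movie at a few bitrates.
def movie_size_gb(bitrate_mbps, runtime_min):
    return bitrate_mbps * runtime_min * 60 / 8 / 1000  # Mbit/s -> GB

for label, mbps in [("4K stream @ 25 Mbit/s", 25), ("UHD Blu-ray @ 80 Mbit/s", 80)]:
    print(f"{label}: ~{movie_size_gb(mbps, 120):.0f} GB")
# -> ~22 GB streamed, ~72 GB on disc
```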
You can already run games in 4K; it doesn't magically inflate the game's size. But there are a lot of issues with scaling etc. when using 4K for everything else (HardwareCanucks did a video like a year ago about trying 4K for the first time). I really don't think it's as good as it's hyped up to be. For games, at least, we should really be focusing on ultrawide resolutions; getting extra stuff on the sides of the screen is a godsend.
I'd say 1440p is perfect, but I can't see the difference between 1440p and 4K. Also, I care 100x more about framerate than pixel count.
Weren't they going to do something interesting with Navi, though? I was hearing that it was going to be something similar to how they are sticking cores together on Threadripper. Of course, it was just rumors, so it really could just be a GCN refresh. Which would be utterly disappointing.
It's doubtful GCN can handle being chiplet-ized. In general, GCN as we know it needs to come to an end: either something totally new (Super-SIMD?) or a serious reworking of GCN (like slimming each pipeline down by ~30% and re-architecting hardware scheduling for an arbitrary number of SEs/"pipelines", then stacking two extra for the high-end GPU die). But Navi is probably just going to incorporate all the promised Vega features we didn't get (DSBR/TBR), plus some extras, into a Polaris-class die with GDDR6 memory. AMD should look at what Nvidia is doing with its monolithic-class dies and go a chiplet route: a group of little int8 chips connected to a stack of HBM2 on an interposer, connected via PCB to another package containing a more "classical" GCN GPU with HBM3, or something thereabouts.
What processor do you have, out of curiosity?
Microsoft is implementing ray tracing into DirectX; Nvidia's implementation is just a shallow gimmick for most of us that won't be used in a few years.
An old-ass one that bottlenecks the card: an i7-6700K, which can pull 120fps in most games but struggles in others. With Borderlands as an example, the usage on both GPU and CPU literally sits at idle, with temps reaching 40 on the CPU and 50 on the GPU.
RTX literally uses the DXR API. It's single-purpose hardware for DXR.
I fucked up by not mentioning it's the Pre-Sequel, and after double-checking, it didn't reach 100% usage. It's just that horribly unoptimized.
Borderlands 2 and TPS have shitty PhysX optimization.