Having used both, I've found ReLive is a hell of a lot better and more customizable. You can even trim videos and stuff within AMD's software and change settings in-game with the overlay.
You are welcome.
@Thread I am a bit disappointed that it's still just Vega with a die shrink and bumped clock speeds; if I'm not totally off the mark, those cards will again be efficiency nightmares.
Glad they could also upgrade the overall foundation of the GPU but my hype overall is somewhat limited.
Remember, this is what they've achieved with like a third of the R&D budget and manpower that Navi has had thrown at it.
Hmmm, you make a good point, I am vaguely optimistic then and I will await the independent benchmarks.
Yeah, they doubled the bus width and the ROPs to 128 because that's where Vega 1 was lacking. Not much of a boost in compute throughput, though: they dropped the CU count from 64 to 60 and raised the clocks, for a much smaller gain there.
Basically, Vega 1 was an unbalanced design that had weaknesses in certain areas, now they completely turned the situation on its head so it's overkill in those areas but not much of an improvement in others. They're quoting a ~30% performance increase on average, but I'm really interested to see how the best and worst case scenarios turn out. In theory, there could be areas where it's barely 10% faster than a Vega 64, but on the other hand there could be cases where it beats a 2080ti.
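Some napkin math on the compute side, assuming GCN's usual 64 shaders per CU and 2 FLOPs per clock (FMA); the clock figures below are ballpark boost numbers, not guaranteed sustained clocks:

```python
def tflops(cus, clock_mhz, shaders_per_cu=64, ops_per_clock=2):
    """Theoretical single-precision throughput in TFLOPS for a GCN-style GPU."""
    return cus * shaders_per_cu * ops_per_clock * clock_mhz * 1e6 / 1e12

vega64 = tflops(64, 1536)      # Vega 64, ~1536 MHz boost -> ~12.6 TFLOPS
radeon_vii = tflops(60, 1750)  # Radeon VII, ~1750 MHz boost -> ~13.4 TFLOPS

print(f"Vega 64:    {vega64:.1f} TFLOPS")
print(f"Radeon VII: {radeon_vii:.1f} TFLOPS")
print(f"Compute gain: {radeon_vii / vega64 - 1:.1%}")
```

So peak compute only goes up by mid-single-digit percentages, which is why any ~30% average gain would have to come from the doubled bandwidth and ROPs rather than raw shader throughput.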
Just as I thought, searching up the GPU online (on both Google and YouTube) turns up results every so often "slamming" the Radeon VII: asking why it exists, pointing out that you can currently get aftermarket RTX 2080s for cheaper, and claiming the upcoming 1100 series will make this card even more superfluous. Some even bring up the topic of ray tracing.
I'm personally hopeful for the card. $700 is steep as fuck, but I'm hoping this is just due to it being essentially the same as an NVIDIA founders edition release right now, and that the actual GPU and especially aftermarkets will bring that price down a good chunk.
Of course, NVIDIA could just answer back by dropping the price of the RTX 2080, but we'll see what happens as this year goes on.
“The performance is lousy and there’s nothing new,” Huang said. “[There’s] no ray tracing, no AI. It’s 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we’ll crush it. And if we turn on ray tracing we’ll crush it.”
Yea... right. You turn on DLSS and RTX in the 2.5 games which support them and get your FPS halved. I wouldn't call that "crushing the competitor's card".
Good, cause the only thing nVidia will respond to is money, and when they start losing it, their policies and pricing will change.
I like how he thinks RT is worth the upgrade when barely any software has supported it yet and so far the benches for it have been dogshit. No one wants to buy a 2080ti to run RT when they could just run 4k instead.
He says a few things about FreeSync:
1) "As you know, we invented the area of adaptive sync." -- blatantly false there.
2) "Freesync was never proven to work" -- the headline quote doesn't actually come up in the actual interview (:asshole blogger), but this is still blatantly false.
3) "The truth is most of the FreeSync monitors do not work." -- "most" is a nasty weasel word, but this is actually kinda true. Not AMD's fault like he's implying, but FreeSync being free (and AU Optronics/Samsung/LC having objectively terrible quality control) means there are a load of crappy monitors out there that have the FreeSync sticker but don't support their full stated refresh range.
That first thing was so blatantly false I'm surprised no one called him out on it.
DLSS is not RTX Ray Tracing, you have those mixed.
You are correct that it's just as rare, but it's different: it is a new form of anti-aliasing (AA) that has its own benefits and drawbacks.
He isn't wrong that DLSS is actually a legit benefit of RTX cards, and games that support it run better because it's a lot less intensive compared to other AA options.
DLSS has its own form of artifacts though, so it's not perfect either, but other AA methods are either way more performance-hungry or have other artifacts.
In short, games with DLSS run faster than the same game using comparable TAA, for example, so you get more FPS for the same graphical fidelity.
But DLSS isn't as widespread as TAA. It's better to hold off and see if it becomes standard than gamble. Remember when HairFX was supposed to be next-gen rendering for fur and hair? Only what, 5 games supported it after they showed it off.
I'm concerned that they've moved Zen 2's IMC to a separate die; that's something that will increase latency and hurt performance, particularly in games.
There are some theories about what's going on in there, but until we see some slides specific to Zen 2 on AM4 we know nothing for sure.
I think this reddit thread is a really good breakdown of the AMD CES.
I kind of feel bad for NVIDIA getting slammed for the wrong reasons. The problem isn't that the RTX line isn't much faster, for that was never the main selling point of the lineup. The real issue is that by the time RTX cards are finally able to literally shine, they will be rendered obsolete by the next generation. NVIDIA has done a poor job of getting developers on board the ray tracing train, and that's a bloody shame, because dodgy marketing aside, robust ray tracing for 350 dollars is such a fantastic thing in theory, and I can't wait to see the death of the visual cancer known as screen space reflections.