AMD’s CES 2019 Livestream
So (wait-for-benchmarks disclaimer aside), I'm wondering if the price of the Radeon VII partly has to do with 7nm yields, since it is such a new process. In the Tom's Hardware coverage there is this: "The Radeon Vega VII will be available February 7th. AMD will sell the card through third-party retailers as usual beginning on October 7, but the company will also sell the card directly, a new tactic for AMD, through its website beginning February 7." Unless that "October" is a typo, it would not surprise me to find the third-party retailer release coming in at a lower price.
Vega's launch was hit hard by the lack of AIB partners straight away; at least they have what looks to be a decent reference cooler this time.
The first chips to support a feature generally don't support it well. From what I'm hearing, if you're buying any of these first-gen RTX chips hoping to see decent RT performance out of them once it starts getting truly utilized in games, you're doing it wrong. These are more of a tech demo than anything, and serious ray-tracing chips will only come later. Like how my old 5770 technically supports DX11, but doesn't exactly do it with aplomb. And while Radeon VII is impressive, I was kinda hoping we'd see a lower-tier model for those of us who aren't looking for balls-to-the-wall performance, rather than the economy models still being on Polaris. Vega only having 1070 and 1080 equivalents was bad enough, but they're not even doing a 2070 equivalent this time.
https://www.anandtech.com/show/13829/amd-ryzen-3rd-generation-zen-2-pcie-4-eight-core If the 15% performance improvement in Cinebench translates to gaming performance in roughly the same way, it'll be a nice upgrade from the 1800X, even if they stick with 8 cores. That said, I hope they release a 12/16-core CPU as well, if only to fix the lack of symmetry in the die placement. That vacant spot is too distracting.
I think it's more of a small improvement over the Intel chip, along with slightly lower cost and a lot less power usage.
From that purported leak a few weeks ago, the rumor is that Ryzen 7 and 9 will have two 7nm dies, while Ryzen 3 and 5 will use the second die spot for a Vega iGPU. The tiers are now gonna be a core count thing, it seems:
R3: 6c/12t + Vega
R5: 8c/16t + Vega
R7: 6c/12t + 6c/12t (totaling 12c/24t)
R9: 8c/16t + 8c/16t (totaling 16c/32t)
Oh yeah, no doubt. I know it's very dated and any upgrade is a net gain. It's not usual for me to max it out anymore (though I don't play as many games as I used to), but I have seen FH4 push it a few times. But I absolutely refused to pay the hyper-inflated RAM prices, and 24 GB of DDR4 was a small fortune. Even downgrading to 16 GB wasn't cheap, but at least it's below the $100 mark now. Now it seems upgrading for less than $400 is actually a reasonable task.
That really depends what you're playing. Apart from Assassin's Creed and certain Total War or single-threaded titles, I'm seeing 60+ FPS on a 3770K and a 2070. For most heavily threaded titles, it's still not the CPU bottlenecking. I'll get 90+ FPS and about 70% utilization on average (Far Cry 5, Witcher 3, anything current-gen), so we're getting close. But it's still very fine for 4K and 1440p gaming this next year.

I've been keeping my eye on Ryzen forever, because I'm not even remotely about to go Intel with the value propositions AMD are throwing around. But I'm also not about to upgrade my CPU if my GPU is still the bottleneck on average. Couple that with games starting to adopt more and more "variable/adaptive" settings (mostly GPU, but also starting to include LOD and other CPU-heavy features) and you're looking at probably the most longevity of any CPU gen ever in the Sandy/Ivy bracket.

It's not the IPC that'll be the issue, it's the number of threads. My 3770K is doing fine due to hyper-threading allowing games to eat more threads, but a 2500K is probably bound to start having some severe microstuttering in certain games. It's kind of nuts that it's not lack of IPC that'll render Sandy Bridge obsolete, but choking on a lack of threads. I'm 100% sure that Sandy Bridge will live out this generation, but I'd love to see if it'll be able to do 30 FPS gaming in the next console gen.
For sure. I probably won't be truly itching until Ryzen 3xxx comes out, which gives me time to budget some money for it. It's usually only in games that I notice its limitations. Even with a half dozen VMs it's not a big issue, because they usually aren't under full load, though core count will be a big plus there. I'm not sad about the 2500K though. It will live on to replace my Athlon II ESXi box, which is actually showing its age.
So the Radeon 7 is a joke at that price. Ryzen 3 was just previewed. 70% of the presentation was just talking about nothing new, really. Epyc 2 was in there as well. Man, I sorta expect AMD to not deliver, but that Radeon 7 is quite a blow. 2-3 years later they bring out their top GPU, which is on the same level as Nvidia's last-gen top card (1080 Ti), and now Nvidia has even faster cards, so it only rivals the regular 2080. Cheaper variants of the Radeon 7 also aren't coming until mid-2019. The RTX 2080 actually sells for 699€ here. MSRP prices usually translate 1:1 to €, maybe with a little extra, so this card will be at least the same price as an RTX 2080, probably more, like 730€ or so. The 1 FPS more in those benchmarks was hilarious, and let's not forget that show benchmarks cherry-pick results.
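For what it's worth, that "1:1 plus a little extra" rule of thumb roughly checks out once you account for VAT, since US MSRPs exclude sales tax while EU shelf prices include it. A quick back-of-envelope in C, where the exchange rate and VAT figure are my own early-2019 ballpark assumptions, not anything from AMD:

```c
#include <stdio.h>

int main(void)
{
    /* Assumed figures (ballpark, not official): ~1.14 USD per EUR
     * in early 2019, and 19% VAT as a typical EU example. */
    double msrp_usd    = 699.0;
    double usd_per_eur = 1.14;
    double vat         = 0.19;

    double eur = msrp_usd / usd_per_eur * (1.0 + vat);
    printf("$%.0f MSRP -> ~%.0f EUR incl. VAT\n", msrp_usd, eur);
    return 0;
}
```

That lands at roughly 730€, which is right where my estimate above puts it.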
I have to ask you just what you expected. They have to catch up first before they can overtake. Not only that, but Nvidia is pushing DL technology hard as well, which is just another distraction to keep AMD running in circles, AND they're sort-of-kind-of adopting FreeSync. It's seriously foolish to expect anything but what we got on the GPU front, IMO. They just spent two generations catching up and refining their CPU tech, and are in a position to keep up with and possibly overtake on that market if they keep their pace going. It's kind of logical that the same effort on the GPU front couldn't go on at the same time due to means and manpower constraints. Their old lead left, and his GPU project flopped hard not even two years ago. This is honestly the best-case result we're looking at. I'm baffled that you expected more, TBH.
On the contrary, I think the Radeon VII is a big blow against Nvidia. Yes, it's only marginally more powerful than the 1080 Ti... but so is the 2080. Nvidia isn't that far ahead of them. The only card that undoubtedly overshadows the VII outside of Titan cards is the 2080 Ti, and that thing still costs a small fortune. On the topic of Titan, AMD appears to be pushing this as their Titan equivalent, in terms of a card that works amazingly for both gaming and work. And when you consider that Titans are even more absurdly expensive, something that can perform as well as a 1080 Ti/2080 for gaming at a far lower MSRP and potentially also power through professional workloads is a big deal.

I do think their conference was disappointing overall, with how many guests they brought on to speak about absolutely nothing (I mean, at least Microsoft and Ubisoft had sort of a point being there because "hey, we're development partners, and we can vouch for how great this stuff is working behind the scenes", but that eSports dude was just completely useless) and the lack of concrete details on Ryzen 3xxx, but as far as I'm concerned, Radeon VII knocked it out of the park. I think it's a far bigger return to form for AMD graphics - and Vega in particular - than anyone could've expected coming into this. It's the first sign in ages of AMD starting to hit back hard on both the CPU and GPU fronts at the same time.
I think the price of the VII is too much. Same price as the 2080 with fewer features and equal performance; there's literally no reason to buy it. If Nvidia releases the rumoured 11xx cards, it has even less appeal.
Dang, they didn't show a release date for 3rd-gen Ryzen. Should I wait, or stick to my plan of getting a Ryzen 3 2200G, keeping that for a bit, then upgrading to 3rd-gen Ryzen + a dedicated GPU? I kinda want a PC for March or a bit earlier.
I wasn't expecting much, but I guess even just a 2080 Ti equivalent was too much. That they just managed to catch up to not even the top-end card again is quite a letdown. If it's really that foolish and over the top, then at least I wasn't as delusional as a good chunk of AMD fans who got crushed by that price alone. And the 2080 selling at or under MSRP in some other regions doesn't really matter when a new Radeon 7 at release will likely not hit that $699 either, neither in my region nor the others. Unless that's only an Nvidia problem in those markets.

I would call them just on par in performance. Show benchmarks that claim 1 FPS more don't bode well for actual independent benchmarks, and the extra performance in Vulkan games is nothing new, but Vulkan games are also still quite rare. Either way, we need proper benchmarks that aren't slideshows by AMD.

Since you mention content creation and other work, please don't forget that Tensor Cores can be quite valuable for all kinds of machine learning, as can the RT cores for rendering; some software will soon make use of that outside of games, with Blender apparently getting faster rendering through it. They also recently announced NVENC improvements in RTX cards, coming soon to OBS as well. https://www.nvidia.com/en-us/geforce/news/geforce-rtx-streaming/ CUDA is still pretty popular for a lot of other work too, and you can probably throw OptiX in there; while not requiring RTX, it's built on RTX tech, which already gives neat things like an AI denoiser. I guess it depends on what area you work in, whether it's OpenCL-heavy work or not. Neither AMD nor Nvidia can cover every workload.

The 2080 has the same MSRP of $699, which kinda makes this harder. Radeon 7 being the same MSRP just kills what could have been a great deal otherwise.
And so will this. Kind of funny how quickly people forgot AMD used to be the undisputed king of hee-haw batshit sammich cards that took your entire EATX case, for triple the price of the Nvidia equivalent. That's also presuming you're going to be able to get one on day one. This is basically a very, very good use of the V20s that didn't make the wafer cut, overvolted to hell and back, plus good use of Samsung and Hynix getting spanked for jacking with memory prices. The fact that they don't have any AIB versions, everything is reference, and they didn't even mention TDP once - when that's been their thing since the DX10 days - is pretty telling. They also have a straight win in the HPC and render farm area because of the memory amount and throughput: you can get a $700 AMD or a $2,500 Nvidia, no-brainer there. But this card is gonna be hot, loud, and require a 650-700 W PSU minimum. It's still a kick in the balls to Jensen, given how slow Nvidia has been on yields and refreshes. Ask chipsnapper or 85.
Vulkan is going to have its own ray-tracing solution, and with AMD working closely with the Vulkan creators (Mantle eventually evolved into Vulkan), I wouldn't be surprised if AMD cards end up running ray tracing natively, so long as they natively support Vulkan.
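If that happens, it would presumably surface as a device extension like everything else in Vulkan. A rough sketch in C of how an app would check for one - using VK_NV_ray_tracing, the only ray-tracing extension actually shipping right now; the cross-vendor extension I'm hoping for would just be a different name string:

```c
/* Minimal sketch: asking a Vulkan device whether it exposes a ray-tracing
 * extension. Assumes `gpu` is a VkPhysicalDevice you've already obtained
 * via vkEnumeratePhysicalDevices. */
#include <vulkan/vulkan.h>
#include <stdlib.h>
#include <string.h>

int supports_ray_tracing(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, NULL);
    VkExtensionProperties *exts = malloc(count * sizeof *exts);
    vkEnumerateDeviceExtensionProperties(gpu, NULL, &count, exts);

    int found = 0;
    for (uint32_t i = 0; i < count; ++i) {
        /* "VK_NV_ray_tracing" is Nvidia's extension; a vendor-neutral
         * one would be detected the same way under its own name. */
        if (strcmp(exts[i].extensionName, "VK_NV_ray_tracing") == 0)
            found = 1;
    }
    free(exts);
    return found;
}
```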
AMD (along with Nvidia) has been working with 3dMark on their raytracing benchmark so we should hopefully see it soon. The GCN architecture is pretty damn good at compute as it is anyways, especially Vega.
Haven't paid much attention to the GPU front as I just picked up an ASUS Strix Vega 64, but it's good to see Zen 2 keeping up the pace. Was originally considering the 3600X to replace my 3570K (which, from 2013 until the last year or so, had run most games at 60 FPS in conjunction with my old GTX 770), but now the 3700X looks like a good opportunity for future-proofing.
The only exciting thing about the Vega card is the fact that it’s a fairly big chip being mass produced for the consumer market on 7nm. That bodes well for the future. And well, it shows that Vega 64 was always an unbalanced design.
I just fucking upgraded to an R5 2600. Can I at least use my AM4 MB with the new Ryzen 3xxx stuff? Also, /g/ is on fire.
Yeah, she mentioned it running on the same socket. AM4 is supported through next year, AFAIK. I upgraded to the R5 2600 myself a few months ago, and while I'm interested in seeing the benchmarks for the new Ryzen, the 2600 is such a damn good CPU I probably won't be upgrading either way. Was hoping for some mid-range card announcement, but oh well, guess I'll stick to Nvidia for now.
Unless the RVII ends up costing less than the 2080 when it hits the market, it'd be hard to choose that when I'm so used to the convenience of Shadowplay.
Radeon VII seems to be just a cut-down version of AMD's Vega 20 compute GPU, with 60 Compute Units enabled instead of the full 64 (the same salvage config as the Instinct MI50). If all 64 CUs are functional, they can sell it for $5,000 as a compute GPU (the MI60). If 1-4 CUs are faulty, they'll sell it as a gaming GPU under the Radeon VII brand for $700.
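In other words, classic die harvesting. A toy sketch in C of the salvage rule being described - the product names and prices are from the post above, while the actual screening criteria (clocks, voltage, which CUs failed) are made up here and of course not public:

```c
#include <stdio.h>

/* Illustrative only: a made-up binning rule matching the claim above.
 * Real binning also screens for clock/voltage targets, not just CU count. */
static const char *bin_vega20(int working_cus)
{
    if (working_cus == 64) return "Instinct MI60 ($5,000-class compute card)";
    if (working_cus >= 60) return "Radeon VII ($700, 60 of 64 CUs enabled)";
    return "rejected or held for further salvage";
}

int main(void)
{
    for (int cus = 58; cus <= 64; ++cus)
        printf("%2d functional CUs -> %s\n", cus, bin_vega20(cus));
    return 0;
}
```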
AMD has ReLive, or what's your point?
I was unaware they had anything similar
ReLive is a hell of a lot better and more customizable, I've found, after using both. You can even trim videos and stuff within AMD's software, and change settings in-game with the overlay.
You are welcome. @Thread I am a bit disappointed that it still is just Vega with a die shrink and bumped clock speeds; if I am not totally off the mark, those cards will again be efficiency nightmares. Glad they could also upgrade the overall foundation of the GPU, but my hype overall is somewhat limited.
Remember, this is what they've achieved with like a third of the R&D budget and manpower that Navi has had thrown at it.
Hmmm, you make a good point, I am vaguely optimistic then and I will await the independent benchmarks.