AMD’s Ryzen 3000 Series Announced: This is it, fellas
104 replies
They didn't really bin much for the previous Ryzen products. The -X variants got their higher clocks from higher voltages rather than having better silicon.
Oh yeah, for sure, I meant the 3800X having more in the way of overclocking headroom. My bad, I should have clarified.
I want to see the voltage/frequency curve for these. Hopefully it doesn't scale as exponentially as Zen 1's did.
AIUI the overclocking limits of Zen/Zen+ were mainly a result of the fabrication process, not the actual architecture. GF had worse parasitic capacitance or something at higher frequencies. Zen 2 is on a different node (7nm) at a different foundry (TSMC), so the electrical characteristics could be completely different. I wouldn't get my hopes up too high; TSMC does a lot of mobile chip fabrication, so I doubt 4GHz+ speeds were a huge priority for them, but it's entirely possible it could be better than Zen 1.
One thing that kinda gets me nervous is that even if AMD's comparison benchmarks are 100% real, they might be comparing Ryzen to post-security-patch Intel performance. This will give "independent" Shilltel reviewers an angle to say "LOOK, INTEL IS STILL THE BEST" without disclosing that their numbers come from a pre-patch OS. I hope that independent reviewers can account for that, and perhaps show how AMD compares to Intel on both pre- and post-patch OSs.
I'm not going to trust any Intel-vs-Zen 2 performance comparisons unless the Intel side specifies whether or not it has the various speculative-execution patches applied -- and which ones (no bullshitting by applying one Meltdown patch and claiming it's the full suite, for example). I kind of assume AMD is comparing their chips against Intel's post-patch performance, because every leg up helps and the justification is "well of course we're comparing against a CPU with its blatant security holes patched" -- you can make anything run faster if you disable all the safeties.
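For what it's worth, on Linux the kernel reports exactly which speculative-execution mitigations are active via sysfs (exposed since around kernel 4.15), which is the kind of disclosure I'd want attached to any benchmark. A minimal sketch for dumping it, assuming you're on a kernel new enough to have that directory:

```python
from pathlib import Path

def mitigation_status(sysfs="/sys/devices/system/cpu/vulnerabilities"):
    """Return {vulnerability_name: kernel-reported status} on Linux.

    Returns an empty dict on non-Linux systems or older kernels
    that don't expose this sysfs directory.
    """
    root = Path(sysfs)
    if not root.is_dir():
        return {}
    return {p.name: p.read_text().strip() for p in sorted(root.iterdir())}

# Print e.g. "meltdown: Mitigation: PTI", "spectre_v2: Mitigation: ..."
for name, status in mitigation_status().items():
    print(f"{name}: {status}")
```

Each file (meltdown, spectre_v1, spectre_v2, etc.) reports "Mitigation: ...", "Vulnerable", or "Not affected", so a reviewer could paste this output alongside their numbers.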
I know that I'm also a lot more informed about and interested in this topic than like 80% of consumers, so rip, but enthusiasts will care, and corps being pitched by AMD will care.
Honestly, all benchmarks should be done on a properly patched OS.
You might be able to say Intel beats AMD in home computing if you disable all prevention measures. But in server computing that's not really an option, and that's where I hope to see AMD dropkick Intel in their teeth.
I agree with you about all benchmarks being run on properly patched OSs, but Intel is no stranger to misleading marketing and borderline-smear campaigns. And I wouldn't be surprised if they (or tech reviewers partnered with them) pull some misleading stuff or find a very specific and uncommon edge case to try and save face.
I expect it. Intel's entire track record of "competing" with AMD is doing anticompetitive stuff to cripple or at least kneecap them. Ryzen is showing what happens when present-day Intel's forced to actually compete: desperate flailing and shitting themselves with 10% upclocked re-releases of last year's 14nm Core chips at inflated prices.
But this time there's now a culture of listening to independent bloggers/YouTubers as well as the top 25 tech mags/pundits (which Intel can quietly bribe a majority of) that Intel's disinfo has to contend with, and every time they try and pull one over and diminish AMD's stats or inflate their own, they Streisand the truth they're trying to cover up.
So tell me if I'm wrong here, but is it possible Intel and AMD literally aren't playing the same game anymore?
I've been rooting for the rise of AMD since I became convinced of the value and innovation of their last few years of products, but this doubt crept in just from how effortless their rise has seemed.
Basically, what I'm hearing in the industry is that CPUs are being relegated to fewer and fewer tasks as GPU technology is being explored. Apparently developers are finding that GPUs are actually better and stronger than CPUs at a great, GREAT many tasks, and I've heard the idea put forward (since Intel also started developing GPUs) that Intel has forfeited the CPU race in most respects, as CPUs in general will become a vestigial, boilerplate basic-task handler while all the REAL power will be on huge, advanced, beefy GPUs.
Therefore, is it possible Intel is sorta luring AMD into winning a pointless trophy just so they can put their chips on the REAL future of PCs and leave their old opponent an impotent joke with outdated hardware again?
I mean that almost makes sense except AMD's been making consumer GPUs for longer than Intel has. Intel would be abandoning their home turf for AMD's back yard, and while they've got the monopolistic might to sink fifty billion into GPU R&D and compete, AMD's already there and Navi is about to come out.
Intel's plan requires AMD to pull out of GPUs in order to go all-in on CPUs and fall for Intel's feint. And that's not quite what's happening so far.
I imagine that something like that happening is a very long way off. I'm not a computer engineer, but my understanding is that CPUs are very careful and precise about what they do and how they do it. GPUs don't have that ability; they aren't "smart" about what they do, they just do it really fast, in massive parallel batches. GPUs also don't implement a general-purpose instruction set, so getting modern software to treat a GPU as a central processor just wouldn't work without massive code rewrites at the very least.

For the switch you're talking about to happen, hardware and software manufacturers would have to do a synchronized about-face and abandon the industry-standard computing architecture that's been in place for over 40 years and drives ~90% of consumer software. I know that resting on the laurels of favorable statistics is a historically slippery slope, but asking the industry to unanimously pivot the very underpinnings of modern consumer computing away from CPUs and onto graphics processors is a little far-fetched when even two companies in the same field of silicon manufacturing (see AMD vs. Intel) can't cooperate even a little bit.
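The throughput-vs-latency split above can be sketched in a toy example (purely illustrative, with NumPy's vectorized operations standing in for a GPU's data-parallel style):

```python
import numpy as np

# "GPU-style" work: one uniform operation applied across a huge batch.
# GPUs win when every element gets the exact same treatment.
data = np.arange(1_000_000, dtype=np.float64)
batch_result = data * 2.0 + 1.0   # a single "kernel" over all elements

# "CPU-style" work: branchy, data-dependent control flow decided per
# element. This is where a latency-optimized core with branch
# prediction and big caches shines, and where GPUs stall.
def branchy_sum(values):
    total = 0.0
    for v in values:
        if int(v) % 3 == 0:       # data-dependent branch
            total += v
        else:
            total -= v / 2
    return total
```

The first pattern rewrites naturally onto a GPU; the second doesn't, which is roughly why general-purpose software can't just be pointed at a graphics card.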
I think a more likely conclusion to be gathered from all of this is that Intel got caught with their pants down, stagnating their consumer product research in favor of whatever-the-fuck-else they've been cooking up for the last several years while AMD kept R&D up and maybe got lucky a few times and is now poised to dethrone them with what looks to be a straight-up better product.
If this were true, Intel would've hastened its rollout of PCIe 4.0 and high PCIe lane counts. AMD is ahead of Intel on both, supporting PCIe 4.0 and up to 128/64 PCIe lanes (2P/1P socket).
In fact, Nvidia has even championed AMD CPUs, because they let Nvidia shove more GPUs and more bandwidth into a server.
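For context on why the bandwidth point matters, the per-direction numbers work out roughly like this (back-of-envelope math using the published transfer rates and 128b/130b encoding):

```python
def pcie_bandwidth_gbs(transfer_rate_gt, lanes):
    """Approx. usable GB/s in one direction for a PCIe link.

    transfer_rate_gt: GT/s per lane (8.0 for Gen3, 16.0 for Gen4).
    128/130 accounts for 128b/130b line encoding; /8 converts bits
    to bytes. Ignores protocol overhead (TLP headers, flow control).
    """
    return transfer_rate_gt * (128 / 130) / 8 * lanes

gen3_x16 = pcie_bandwidth_gbs(8.0, 16)    # ~15.75 GB/s
gen4_x16 = pcie_bandwidth_gbs(16.0, 16)   # ~31.51 GB/s
```

So a Gen4 x16 slot roughly doubles what a GPU can pull from host memory, and 128 lanes means you can hang a lot more accelerators off one socket without switches.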
Also every one of Intel's forays into GPGPU has failed so far, including the recently canceled Knights Hill.
That bit about AMD's GPUs is a good point. Yeah, I know it sure LOOKS like Intel is just getting SMOKED. I mean, I remember the fake 5GHz demo with the slyly hidden exotic cooling they brought out to trick people a year ago. I just didn't want to bandwagon without some sense of caution, let alone make any bad buys in the future based on apparent trends. Like, I'm set for a while graphically with my 1080 Ti... unless games in the future rely more and more heavily on RTX technology for ray tracing. Sometimes it's just not certain which way the wind is blowing.
What do you all think about this guy, besides his strange voice? (watch at 1.25x)
https://www.youtube.com/watch?v=zdCi_r_lm7o