• AMD's rumored 16-core Ryzen CPU may run at 3.1GHz to 3.6GHz
    13 replies, posted
[url]http://www.pcgamer.com/amds-rumored-16-core-ryzen-cpu-may-run-at-31ghz-to-36ghz[/url]
This would be a serious game changer if it holds.
[QUOTE=Sombrero;52008782]This would be a serious game changer if it holds.[/QUOTE] More cores does not equal more performance. Many applications don't even properly utilise four cores, let alone 16; single-threaded performance is more important.
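The diminishing-returns argument above can be sketched with Amdahl's law. This is an illustrative calculation, not anything from the article; the 50% parallel fraction is a made-up example:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallel fraction of the workload and n the core count.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A task that is only 50% parallelisable barely benefits past 4 cores:
for cores in (1, 2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.5, cores), 2))
# 1 1.0
# 2 1.33
# 4 1.6
# 8 1.78
# 16 1.88
```

Even with 16 cores, the speedup can never exceed 2x here, which is why single-threaded performance still dominates for poorly-parallelised software like most games.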
[QUOTE=Sombrero;52008782]This would be a serious game changer if it holds.[/QUOTE] Just like Bulldozer was...Oh wait
[QUOTE=Chryseus;52010210]single threaded performance is more important.[/QUOTE] For gaming* Most professional tools are optimized to scale with the number of cores your chip has. Video encoding, rendering, compiling, simulation, they all benefit from more cores. Not to mention this rivals a server chip
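The workloads named above (encoding, rendering, compiling, simulation) are largely embarrassingly parallel, so they scale with core count almost linearly. A minimal sketch of that pattern, with `encode_chunk` as a hypothetical stand-in for real per-chunk work:

```python
# Sketch: spreading independent chunks of work (e.g. video segments)
# across all available cores with a process pool.
from multiprocessing import Pool

def encode_chunk(chunk_id):
    # Stand-in for real encoding work: just keeps a core busy.
    return sum(i * i for i in range(100_000))

if __name__ == "__main__":
    with Pool() as pool:  # defaults to one worker process per core
        results = pool.map(encode_chunk, range(16))
    print(len(results))  # 16 chunks processed, spread across cores
```

On a 16-core chip the pool can run all 16 chunks at once, which is exactly why these tools benefit from core counts that do nothing for a single-threaded game.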
It's probably a server part and some guy saw it and leaked it to the media.
[QUOTE=AJ10017;52010227]For gaming* Most professional tools are optimized to scale with the number of cores your chip has. Video encoding, rendering, compiling, simulation, they all benefit from more cores. Not to mention this rivals a server chip[/QUOTE] Which is totally irrelevant for the vast majority of consumers. This is for servers, rendering farms, and DIY rendering at the lowest. But you can damn well bet that there will be a ton of misinformed "gamers" thinking 32 threads = more better
[QUOTE=Chryseus;52010210]More cores does not equal more performance, many applications don't even properly utilise 4 cores let alone 16, single threaded performance is more important.[/QUOTE] It's good for when you have a lot of shittily optimized processes running in the background, so the other cores can be freed up. Looking at you, battle.net
[QUOTE=Snowmew;52010281]Which is totally irrelevant for the vast majority of consumers. This is for servers, rendering farms, and DIY rendering at the lowest. But you can damn well bet that there will be a ton of misinformed "gamers" thinking 32 threads = more better[/QUOTE] It's actually for workstations, so it'll be useful for things like compiling and testing larger projects, datacenter simulations, and ordinary virtual machine setups. Usually 8 cores are enough there, but you can do something useful with 16 cores and 32 threads with simulated build and deployment systems. The ability to run a virtualized, disposable continuous deployment environment locally is pretty great, and it's something I miss a lot. Sure, you can install a few programs and deploy your shitty PHP app pretty quickly, but it's something else to deploy datacenter-scale applications and ensure their stability is testable on a local system. Or I guess games could start being not so shit and better utilize the cores we have available these days, and then perhaps Intel could stop being fucking retarded kids for once. [editline]25th March 2017[/editline] [QUOTE=Sombrero;52010396]It's good for when you have a lot of shittily optimized processes running in the background so the other cores can be freed up. Looking at you, battle.net[/QUOTE] That's based on the assumption that the OS running the process isn't using a completely simple and stupid scheduler.
[QUOTE=Snowmew;52010281]Which is totally irrelevant for the vast majority of consumers. This is for servers, rendering farms, and DIY rendering at the lowest. But you can damn well bet that there will be a ton of misinformed "gamers" thinking 32 threads = more better[/QUOTE] Then it's the consumers' fault? Nothing is stopping people from buying Xeon chips :why: The cost will drive uninformed 'gamers' away anyway; nobody is going to blow 600-900 dollars, or however much it costs, on a single component without researching it.
[QUOTE=AJ10017;52010456]The cost will drive uninformed 'gamers' away anyway, someone isn't going to blow 600-900 dollars or however much it costs on a single component without researching it.[/QUOTE] Oh, child, bless your heart...
[QUOTE=AJ10017;52010456]Then it's the consumers fault? Nothing is stopping people from buying Xeon chips :why: The cost will drive uninformed 'gamers' away anyway, someone isn't going to blow 600-900 dollars or however much it costs on a single component without researching it.[/QUOTE] Reminder that people bought GTX Titans, only for them to be obsoleted by the <whatever> Ti card early the following year. People do actually buy Xeon chips for a reason, too. They tend to have a lower TDP than the competing consumer chip from Intel, more cache, and, especially when they're older, a better price point.
If it's no more than $1200 I might consider it.
That would have been so great back when I was re-encoding videos to put on my phone.