• Intel is developing a desktop GPU to fight AMD, nVidia
    38 replies
https://www.forbes.com/sites/jasonevangelho/2018/04/11/intel-is-developing-a-desktop-gaming-gpu-to-fight-nvidia-amd/#67294a43578f
So this is why Raja Koduri left AMD after bombing Vega.
Intel is as bad as Nvidia when it comes to business practices so I can't say I'm thrilled.
The more competition the better. Since it's Intel, we're probably going to see a fierce battle between them, AMD, and Nvidia.
Yeah, first thing that came to mind was that they just want to tap into the crypto market.
Intel is way worse. Here's a history lesson for those interested. https://www.youtube.com/watch?v=osSMJRyxG0k
It'll be interesting to see if Intel's offering is even good.
Well, they're aiming to ship graphics cards by 2020 - given how fickle crypto-mining has been, I doubt that's the market they're trying to reach. It may be - if it is, then all power to them, but it seems like a huge risk if that's the primary market they're aiming for.
I wish there were more GPU and CPU manufacturers around. The fact that Intel has a borderline monopoly on CPUs is creepy and sad.
x86/64 licensing is a mess, IIRC, making it basically impossible for more companies to jump in.
This is definitely interesting. Excited to see what comes out of it.
Not impossible, just absolutely irritating because no one owns enough of the pieces to give a blanket license to anyone. Ideally we'd see the value of the platform and nationalise/internationalise it as an accepted standard.
Intel would never let that happen. We're more likely to all start using ARM processors like Apple is rumored to be planning than x86_64 becoming an open standard.
Yeah, that's why I said ideally. A shame really, imagine the developments we'd see if more than two companies were actively making strides to improve the product. (VIA doesn't count, they haven't done anything super market relevant in years.)
This makes Intel's Vega iGPUs even more confusing
x86-64 is Too Big To Fail at this point. Intel tried replacing it with IA-64/Itanium and look how that went. Microsoft tried to bring ARM to a semi-desktop context and look how that went. Apple used PowerPC CPUs for over a decade and finally caved to x86-64. The new consoles have hopped on board too. x86 is here to stay for a long-ass time.
How is that too big to fail tho? Also all of those were shit, so of course they failed.
So much of the world's software in so many fields runs on x86. It would be an incredible effort to switch away to something like ARM (which, from what I've heard, isn't even very suitable for desktop use.) Not to mention legacy software, backwards compatibility, old PC games, etc.
Wouldn't surprise me if it was extremely overpriced, in line with their trend of CPU pricing.
every single computer i've ever owned has had an intel chipset and by god they are some of the worst things i've ever used for gaming. i have to look up how a game runs on youtube before i even consider getting it. i know my way around many .ini files because of these things. however, i'm excited to see what this brings to the market altogether. maybe we can start seeing these GPUs inside laptops as well. although i think it'll be expensive and pretty much gone on arrival anyway if the crypto fad is still going.
Emulators unironically tend to work better for old PC games than running them natively on new x86 hardware, tbh. And it's true that x86 is dominant, but that doesn't make it too big to fail, really. Hell, a lot of software builds just fine on other architectures if it was written in accordance with standards and the necessary shit has been ported. I just don't really see what catastrophe you're referring to.
This won't be competition; locked-down devices don't compete, they control their users and prevent them from switching to something else.
You can complain about Intel having a monopoly, but AMD has been a non-starter for desktop gaming, with the exception of the new lineup of NUCs. The market is ripe for Intel to jump right in.
Yeah, almost no software for desktop use has been written in assembly since the Windows XP days, which makes things easy. We can see this in practice with Linux; tons of desktop Linux software compiles and runs on ARM platforms like the Raspberry Pi with no change to the source code.
That's the same opinion and experience I've had with AMD / ATI chips. All of my PCs that ran Intel, including this one, have run like a dream, no issues whatsoever. The few I ran with AMD / ATI chips were an unreliable, unstable nightmare that required constant babysitting, and were an overall dreadful experience. For that matter, I've had the same good/bad experience with NVidia / AMD. Never once had issues with an NVidia card, never once had an AMD / Radeon card that ran remotely smooth. Back on topic, I, too, am very interested to see what Intel brings to the table. I'm honestly expecting lackluster performance well above its equivalent NVidia price-point (to say nothing of being light-years above its equivalent AMD price-point), but I might be surprised. I hope I am - I'd like to see another company be able to compete with NVidia and get their prices in check - for as much as I like NVidia cards, I certainly don't like their pricing schemes.
AMD should be split into two companies. Nvidia should buy the CPU part, Intel should buy the GPU part. Don't kill me.
Why would you want two companies who are known to be predatory each pushing full stacks of exclusive bullshit?
This is literally the worst suggestion I've ever heard. Even if you don't like AMD, their very existence makes Intel and Nvidia better by making them compete.
Because I'm not too optimistic about AMD's future, for the exact same reason you mention: they are a much smaller company and really need a push. You also mention Samsung, and that is an interesting option too. But Samsung is so big, they could buy Nvidia and leave AMD for dessert.
At the end of the day, as long as modern-featured compilers exist for the chip and it supports the hardware required to run the programs, there won't be any problems. Unix systems in particular would have essentially zero problems switching to a new standard, since so much of the stack is completely open source.
I don't know why; AMD cards are good cards. They're not going to help your e-peen, but if you're on a budget you can build a decent midrange rig with AMD cards. I feel like the total ignorance of the middle market is due in part to these dumb benchmark tests, which are oftentimes loaded with vacuous, unneeded settings that you can turn off to suddenly get 10x the performance with little quality drop.