• AMD Finally Reveals How It Intends to Position Polaris 10 & 11 GPUs in the Market
[QUOTE] AMD has posted its first quarter 2016 financial results. Apart from a sequential and year-over-year decline in revenue, one paragraph in the press release particularly caught our eye: it sheds light on the market positioning of the upcoming Polaris chips. The target markets of the next-gen Polaris 10 and Polaris 11 GPUs have been broadly defined in previous reports, but this is the first time AMD has officially spelled out the situation. Confirming the rumors we’ve been hearing for a while, the chip maker revealed that Polaris 11 will power cards designed for low-end gaming PCs and thin-and-light notebooks, while Polaris 10 is aimed at the mainstream desktop segment and the high-end notebook market. To give you an idea of how powerful these ‘mainstream’ Polaris 10 GPUs could be: they are set to replace the high-end Radeon M300 parts based on the Tonga silicon. Reportedly, the Polaris 10 GPU, code-named “Ellesmere,” could feature 2304 stream processors across 36 CUs and support up to 8GB of GDDR5(X) memory on a 256-bit memory interface. Polaris 11, code-named “Baffin,” is to succeed the “Curacao” GPU, which powers various mid-range cards. It could feature 1024 stream processors over 16 CUs and is expected to support 4GB of GDDR5 memory on a 128-bit memory interface. [/QUOTE] [URL="http://techfrag.com/2016/04/22/amd-finally-confirms-target-markets-polaris-10-polaris-11-gpus/"]via techfrag[/URL]
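Quick sanity check on those rumored numbers. The stream-processor counts and bus widths are from the article above; the per-pin memory clocks are my own guesses based on typical 2016 GDDR5/GDDR5X speeds, so treat the bandwidth figures as ballpark only.
[code]
# Sanity check on the rumored Polaris specs quoted above.
# SP/CU counts and bus widths come from the article; the per-pin data
# rates (7 Gbps GDDR5, 10 Gbps GDDR5X) are assumed typical values.

def sp_per_cu(stream_processors, compute_units):
    """Stream processors per compute unit; GCN uses 64 per CU."""
    return stream_processors / compute_units

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width / 8) bytes * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(sp_per_cu(2304, 36))      # Polaris 10 -> 64.0, consistent with GCN
print(sp_per_cu(1024, 16))      # Polaris 11 -> 64.0, consistent with GCN
print(bandwidth_gb_s(256, 7))   # Polaris 10, GDDR5  -> 224.0 GB/s
print(bandwidth_gb_s(256, 10))  # Polaris 10, GDDR5X -> 320.0 GB/s
print(bandwidth_gb_s(128, 7))   # Polaris 11, GDDR5  -> 112.0 GB/s
[/code]
Both rumored configs work out to exactly 64 stream processors per CU, which at least matches how GCN compute units have always been organized.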
Polaris better kick some serious ass. They need to.
[QUOTE=shadowboy303;50188424]Polaris better kick some serious ass. They need to.[/QUOTE] So does Zen, but Intel's been a dick in recent years about locking out/disabling neat features on enthusiast CPUs (you want overclockability? Then no VT-d for you). Yet my old Q9550 could reliably handle VT-d while being clocked the fuck out of, as long as it was stable in general benchmarking. Seriously though, Intel needs a sucker punch, and Nvidia a stern talking-to; they're starting to act like Intel.
[QUOTE=shadowboy303;50188424]Polaris better kick some serious ass. They need to.[/QUOTE] Polaris 10 is reported to run Hitman at 60 FPS at 1440p (unknown settings, could be low; it's hard to tell the difference between settings in that game).
Hopefully these are the GPUs that Sony intends to add to the PS4.5. Not only would this give AMD the money it needs, it's also exactly the kind of thing needed to push the market forward and keep it competitive against Nvidia. The other thing is that I really, really want to know how these benchmark against my current R9 280X CrossFire setup. I'd really love to upgrade this year, but AMD hasn't produced anything reliable in the last two years. Also, with people mentioning that you can now multi-GPU with Nvidia and AMD cards together, that would be a really fun thing to try.
The one real advantage they have this go-round is asynchronous compute/render, and it appears they're expecting to leverage enough efficiency from it to hold up against nVidia's massive brute-force upgrade, and I do mean massive. However, it's kind of telling that they aren't even attempting to position anything against the upcoming Titan, and are already making efficiency and "value-per-watt" claims.
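To put a toy number on what async compute overlap can buy: if a frame's graphics and compute work can partially overlap instead of running back to back, frame time shrinks toward the larger of the two. The workload sizes and overlap fraction below are made up purely for illustration.
[code]
# Toy timing model of async compute; all numbers are made up.
graphics_ms = 12.0   # hypothetical graphics work per frame
compute_ms = 5.0     # hypothetical compute work per frame
overlap = 0.6        # assumed fraction of compute hidden under graphics

serial = graphics_ms + compute_ms                      # no async: back to back
overlapped = graphics_ms + compute_ms * (1 - overlap)  # partial overlap

print(f"serial: {serial:.1f} ms/frame ({1000 / serial:.0f} FPS)")
print(f"async:  {overlapped:.1f} ms/frame ({1000 / overlapped:.0f} FPS)")
# serial: 17.0 ms/frame (59 FPS)
# async:  14.0 ms/frame (71 FPS)
[/code]
Free-ish performance on paper, but only if the game actually submits compute work that can overlap.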
[QUOTE=27X;50188755]The one real advantage they have this go-round is asynchronous compute/render, and it appears they're expecting to leverage enough efficiency from it to hold up against nVidia's massive brute-force upgrade, and I do mean massive. However, it's kind of telling that they aren't even attempting to position anything against the upcoming Titan, and are already making efficiency and "value-per-watt" claims.[/QUOTE] Bet they'll go back to the "two GPUs on one card" strategy for their eventual high-end cards, just like they did back in the HD3xxx and HD4xxx series.
[QUOTE=27X;50188755]The one real advantage they have this go-round is asynchronous compute/render, and it appears they're expecting to leverage enough efficiency from it to hold up against nVidia's massive brute-force upgrade, and I do mean massive. However, it's kind of telling that they aren't even attempting to position anything against the upcoming Titan, and are already making efficiency and "value-per-watt" claims.[/QUOTE] Their high-end chips are codenamed Vega, but nothing else is really known about them so far.
[QUOTE=Van-man;50188785]Bet they'll go back to the "two GPUs on one card" strategy for their eventual high-end cards, just like they did back in the HD3xxx and HD4xxx series.[/QUOTE] Likely.
[QUOTE=27X;50188755]The one real advantage they have this go-round is asynchronous compute/render, and it appears they're expecting to leverage enough efficiency from it to hold up against nVidia's massive brute-force upgrade, and I do mean massive. However, it's kind of telling that they aren't even attempting to position anything against the upcoming Titan, and are already making efficiency and "value-per-watt" claims.[/QUOTE] You keep saying this but I haven't seen shit outside of marketing material and bullshit "leaks" from WCCFTech.
wait so when are the high end cards for big shit computers coming
Not for a while. Titan is likely to be an uncut or lightly cut GP100, so you aren't going to see those until the HPC market slows down, unless nVidia's yields are magically perfect (there's no reason to expect a brand-new process to have anything like magical yields, quite the opposite). And Vega is very likely to be a sammich card, which means it'll be a while before both are grubby-mitts ready.
[QUOTE=Turing;50188994]wait so when are the high end cards for big shit computers coming[/QUOTE] They're focusing on mid-range, because even though everyone's usually hyped for high-end, only a few can actually afford high-end cards. I'm borderline ready to toxx that their high-end cards will be dual mid-range GPUs on one card, that's how certain I am, especially since they've done it before.
Why has the GPU industry in the past year or so been so shit?
[QUOTE=The Baconator;50189626]Why has the GPU industry in the past year or so been so shit?[/QUOTE] Shitty yields.
[QUOTE=JoeSkylynx;50188506]Hopefully these are the GPUs that Sony intends to add to the PS4.5. Not only would this give AMD the money it needs, it's also exactly the kind of thing needed to push the market forward and keep it competitive against Nvidia. The other thing is that I really, really want to know how these benchmark against my current R9 280X CrossFire setup. I'd really love to upgrade this year, but AMD hasn't produced anything reliable in the last two years. Also, with people mentioning that you can now multi-GPU with Nvidia and AMD cards together, that would be a really fun thing to try.[/QUOTE] Neither Nvidia nor AMD has made any big steps in two years. The 390, 380X, and 380 are objectively better than their Nvidia counterparts, but the difference is small, so Nvidia keeps on steamrolling thanks to its superior marketing and its GameWorks crap. AMD doesn't make a lot of profit on its Xbox and PS4 chips, by the way. They are absolutely massive chips on a big process node, which means low yields per wafer, and since a console is cheap compared to a GPU, the chips bring in almost no profit. [editline]24th April 2016[/editline] [QUOTE=The Baconator;50189626]Why has the GPU industry in the past year or so been so shit?[/QUOTE] It's a very small market, not a lot of money in it. And the fabs are being hogged by Samsung and the like for making phone SoCs.
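Rough numbers on that yield point, using the textbook Poisson yield model Y = exp(-D0 * A). The defect density and die areas below are illustrative guesses, not real fab figures, but they show why a console-sized die hurts so much more than a small GPU.
[code]
# Crude Poisson yield model: why big dies on a new process hurt.
# Defect density (d0) and die areas are illustrative guesses only.
import math

def die_yield(area_mm2, d0_per_mm2):
    """Fraction of defect-free dies: Y = exp(-d0 * A)."""
    return math.exp(-d0_per_mm2 * area_mm2)

def dies_per_wafer(area_mm2, wafer_diameter_mm=300):
    """Very crude candidate count, ignoring edge loss and scribe lines."""
    return math.pi * (wafer_diameter_mm / 2) ** 2 / area_mm2

d0 = 0.002  # assumed defects per mm^2
for name, area in (("small GPU die", 120), ("console-class APU", 350)):
    good = dies_per_wafer(area) * die_yield(area, d0)
    print(f"{name:18s} ~{good:4.0f} good dies/wafer "
          f"(yield {die_yield(area, d0):.0%})")
# small GPU die      ~ 463 good dies/wafer (yield 79%)
# console-class APU  ~ 100 good dies/wafer (yield 50%)
[/code]
Triple the die area and you don't just get a third as many candidate dies per wafer; each one is also far more likely to catch a killer defect.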
[QUOTE=Van-man;50188459]So does Zen, but Intel's been a dick in recent years about locking out/disabling neat features on enthusiast CPUs (you want overclockability? Then no VT-d for you). Yet my old Q9550 could reliably handle VT-d while being clocked the fuck out of, as long as it was stable in general benchmarking. Seriously though, Intel needs a sucker punch, and Nvidia a stern talking-to; they're starting to act like Intel.[/QUOTE] That changed with the Haswell refresh and later. Also, the -E parts never had that restriction. [editline]24th April 2016[/editline] [QUOTE=taipan;50189698]Neither Nvidia nor AMD has made any big steps in two years. The 390, 380X, and 380 are objectively better than their Nvidia counterparts, but the difference is small, so Nvidia keeps on steamrolling thanks to its superior marketing and its GameWorks crap. AMD doesn't make a lot of profit on its Xbox and PS4 chips, by the way. They are absolutely massive chips on a big process node, which means low yields per wafer, and since a console is cheap compared to a GPU, the chips bring in almost no profit. [editline]24th April 2016[/editline] It's a very small market, not a lot of money in it. And the fabs are being hogged by Samsung and the like for making phone SoCs.[/QUOTE] The hardware may be better in some ways (though I'm sure there will be benchmarks favouring both sides), but they really need to win over developers too. I know people see GameWorks as evil, but if AMD is going to really succeed, they need to give developers a compelling reason to switch away from technology they already know; for a lot of people, open source alone isn't enough, though GPUOpen is certainly nice to see.
I really want AMD to succeed. I've always used AMD products because while they're not the strongest, I get a lot more mileage out of them for the price, and they're generally easier to upgrade (especially the fucking CPUs). They're also really open, with all their tech being open-sourced, and they generally offer solutions that work on both vendors' cards. Any time I hear a game has GameWorks, or it opens with one of those Nvidia splash screens, I immediately turn down everything involving tessellation (and settings in general), because I'm expecting either hamstrung tuning or poor optimization.
Nvidia are dicks with their closed software, whereas AMD is all open and shit. When Nvidia develops a tech it's only for them, but when AMD develops a tech it's open, so Nvidia can build on it too. It's kinda unfair.
[QUOTE=taipan;50189698]AMD doesn't make a lot of profit on its Xbox and PS4 chips, by the way. They are absolutely massive chips on a big process node, which means low yields per wafer, and since a console is cheap compared to a GPU, the chips bring in almost no profit.[/QUOTE] Why would AMD take that hit and not Sony/MS?
[QUOTE=Scot;50197345]Why would AMD take that hit and not Sony/MS?[/QUOTE] Because they're desperate and Sony/MS already sell the console at a loss.
[QUOTE=Levelog;50197872]Because they're desperate and Sony/MS already sell the console at a loss.[/QUOTE] Desperate to lose money on each chip sold?? Sony/MS can't just use a different vendor, so why would AMD sell at a loss? To keep the fab workers employed? (I know they had a wafer quota with GF at some point, but IIRC that was renegotiated, so it can't be that.) Doesn't make sense to me.
[QUOTE=GoDong-DK;50198233]Desperate to lose money on each chip sold?? Sony/MS can't just use a different vendor, so why would AMD sell at a loss? To keep the fab workers employed? (I know they had a wafer quota with GF at some point, but IIRC that was renegotiated, so it can't be that.) Doesn't make sense to me.[/QUOTE] No no, AMD doesn't sell at a loss, just at a shit profit. MS/Sony sell at a loss.