AMD declares all remaining pre-GCN GPUs & APUs as having reached peak performance optimization and h
24 replies
[url]http://www.anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy[/url]
[quote=Anandtech]Alongside today’s release of the new Radeon Software Crimson Edition driver set, AMD has published a new page on their driver site announcing that video cards based on the company’s pre-Graphics Core Next architectures have been moved to legacy status. This means that GPUs based on the company’s VLIW5 and VLIW4 architectures – the Evergreen and Northern Islands families – have been retired and will no longer be supported. All of AMD’s remaining supported GPUs are now based on various iterations of the Graphics Core Next architecture.
Overall this means that the entire Radeon HD 5000 and 6000 series have been retired. So have the Radeon HD 7000 to 7600 parts, and the Radeon HD 8000 to 8400 parts. AMD and their partners largely ceased selling pre-GCN video cards in 2012 as they were replaced with GCN-based 7000 series cards, so pre-GCN parts are now about 3 years removed from the market. However some lower-end OEM machines with the OEM-only 8000 series may only be 2 years old at this point.
In their announcement, AMD notes that their pre-GCN GPUs have “reached peak performance optimization” and that the retirement “enables us to dedicate valuable engineering resources to developing new features and enhancements for graphics products based on the GCN Architecture.” Furthermore AMD is not planning on any further driver releases for these cards – the announcement makes no mention of a security update support period – so today’s driver release is the final driver release for these cards.[/quote]
[QUOTE=wickedplayer494;49180455][URL]http://www.anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy[/URL][/QUOTE]
rip vliw.
I'm surprised they even went that route to begin with. Just because it makes sense for the architecture doesn't mean it's a good idea. I bet it means they can lay off all the engineers that supported it.
[quote]Overall this means that the entire Radeon HD 5000 and 6000 series have been retired[/quote]
Can't say that I'm mad about this. I have a 6870 and it's going to be 5 years old in the spring. It was only a matter of time. That's a long time on that hardware.
[QUOTE=Code3Response;49180474]Can't say that I'm mad about this. I have a 6870 and it's going to be 5 years old in the spring. It was only a matter of time. That's a long time on that hardware.[/QUOTE]
Same here. And it's a Sapphire, no less; way past its prime.
Holy shit, when they said GCN I thought they meant GameCube for a second. (obv the GameCube does not get driver updates)
That would be dedication.
Don't forget the new AMD Crimson drivers that came out today. Dropping old cards, bringing in a new UI, and expanding features: it's a pretty big change.
That seems pretty fast for some of those cards. The HD 6950 was still selling for ~$250 only 3 years ago and can still run games like Fallout 4. I know quite a lot of people that are still using them.
[QUOTE=sgman91;49180990]That seems pretty fast for some of those cards. The HD 6950 was still selling for ~$250 only 3 years ago and can still run games like Fallout 4.[/QUOTE]
Hasn't been produced for a long while, though - as a customer owning a 6950 I feel like it's fair enough, though obviously I'd rather have even longer support.
[QUOTE=GoDong-DK;49180999]Hasn't been produced for a long while, though - as a customer owning a 6950 I feel like it's fair enough, though obviously I'd rather have even longer support.[/QUOTE]
Seems too fast to me. I generally wait ~5 or so years to upgrade and haven't run into this issue before now. There are a ton of people still using these cards. Honestly, it makes me just want to go Nvidia next time.
[QUOTE=sgman91;49181009]Seems too fast to me. I generally wait ~5 or so years to upgrade and haven't run into this issue before now. There are a ton of people still using these cards. Honestly, it makes me just want to go Nvidia next time.[/QUOTE]
Nvidia has considerably better legacy support, to be honest. On their drivers they actively maintain several legacy branches and fix bugs that come up with newer OSes and software. I think part of what it comes down to is that AMD simply doesn't have the money to support their old cards, considering even the support for current cards is questionable.
List of Nvidia cards by legacy branch: [URL="http://www.nvidia.com/object/IO_32667.html"]http://www.nvidia.com/object/IO_32667.html[/URL], with active support for cards as far back as the GeForce 7 series, going back over 9 years.
[QUOTE=nintenman1;49181041]Nvidia has considerably better legacy support, to be honest. On their Linux drivers they actively maintain a legacy branch and fix bugs that come up with newer Xorg versions, with similar support in the Windows driver set, and occasional game fixes/optimizations. I think part of what it comes down to is that AMD simply doesn't have the money to support their old cards, considering even the support for current cards is questionable.
List of Nvidia cards by legacy branch: [URL="http://www.nvidia.com/object/IO_32667.html"]http://www.nvidia.com/object/IO_32667.html[/URL][/QUOTE]
I really feel like this is a short-sighted decision. Sure, they might save some money now, but, like I said, I know a whole lot of people with these cards who will now most likely not become returning customers.
[QUOTE=nintenman1;49181041]Nvidia has considerably better legacy support, to be honest. On their Linux drivers they actively maintain a legacy branch and fix bugs that come up with newer Xorg versions, with similar support in the Windows driver set, and occasional game fixes/optimizations. I think part of what it comes down to is that AMD simply doesn't have the money to support their old cards, considering even the support for current cards is questionable.
List of Nvidia cards by legacy branch: [URL="http://www.nvidia.com/object/IO_32667.html"]http://www.nvidia.com/object/IO_32667.html[/URL], with active support for cards as far back as the GeForce 7 series, going back over 9 years.[/QUOTE]
"Support" should be used lightly when they break the shit out of Fermi and earlier cards all the time.
[editline]24th November 2015[/editline]
[QUOTE=sgman91;49181066]I really feel like this is a short-sighted decision. Sure, they might save some money now, but, like I said, I know a whole lot of people with these cards who will now most likely not become returning customers.[/QUOTE]
Considering that the rule of thumb for older Nvidia cards is not to stay up to date anyway, since the latest drivers aren't the best-performing or most stable for them, they shouldn't make a big deal about it.
I've been rocking a 5770 for the past 5 years. Two days ago I ordered a whole new PC and the R9 390 I ordered just came in today. :v:
5670 will be 6 years old in the spring and doesn't support DX12. It combos pretty well with my i3 CPU, though; apparently Crysis 3 is really well-optimized, or not as demanding as people think.
[QUOTE=sgman91;49181066]I really feel like this is a short-sighted decision. Sure, they might save some money now, but, like I said, I know a whole lot of people with these cards who will now most likely not become returning customers.[/QUOTE]
Quite. But a lot of companies' foresight only goes out a few quarters, and doesn't consider customers who may feel slighted by the move: even if they don't immediately need to switch, they may pick up a competitor's product that they feel offers more long-term usability.
[QUOTE=Cronos Dage;49181085]5670 will be 6 years old in the spring and doesn't support DX12. Combos pretty well with my i3 CPU though, apparently Crysis 3 is really well-optimized or not as demanding as people think[/QUOTE]
The game scales really well across GPUs, honestly, as long as you have a reasonable CPU, and an i3 is more than enough. Crysis 3 was crunched all the way down to run at playable framerates on the Xbox 360 (ATI Xenos) and PS3 (RSX, based on the GeForce 7800) after all.
[QUOTE=sgman91;49181066]I really feel like this is a short-sighted decision. Sure, they might save some money now, but, like I said, I know a whole lot of people with these cards who will now most likely not become returning customers.[/QUOTE]
I don't really think it's that weird for them to remove support for VLIW4/5 cards.
They obviously cannot support all cards forever. Some cards just physically can't support the modern software paradigm - you can't make a Rage Pro run DirectX 9 shaders in hardware no matter how hard you try. When the time comes, you don't stop them from working, you just stop adding new features.
So the question becomes when, not if. And a reasonable answer would be "when the cards can no longer support the latest APIs."
Well, DirectX 12 and Vulkan hit soon, and VLIW4/5 can't support them. VLIW5 came out when Dx11 was new, so this isn't exactly a surprise.
So regardless of whether AMD "officially supports" it or not, my 6870 is never going to run Dx12 games. "Official support" for these cards would basically just be game-specific optimizations - making the card run new games faster than it currently does. (Dx12 is actually trying to eliminate this practice, or rather, move it onto the game developers to optimize their games at the low level, instead of making driver authors do it.)
And like others have said, Nvidia may claim to "officially support" cards a lot longer, but they don't actually do it. I've got a Kepler card (same timeframe as GCN1.0) and it's already missing out on some driver features. Plus the driver crashes surprisingly often on W10. No idea why.
I suppose my launch 5870 has seen some shit over the last 6 years; I still don't see myself replacing it for at least another year, though.
[QUOTE=nintenman1;49181087]Quite. But a lot of companies' foresight only goes out a few quarters, and doesn't consider customers who may feel slighted by the move: even if they don't immediately need to switch, they may pick up a competitor's product that they feel offers more long-term usability.
The game scales really well across GPUs, honestly, as long as you have a reasonable CPU, and an i3 is more than enough. Crysis 3 was crunched all the way down to run at playable framerates on the Xbox 360 (ATI Xenos) and PS3 (RSX, based on the GeForce 7800) after all.[/QUOTE]
All of the Crysis games scale extraordinarily well; I could run Crysis at about medium settings on a laptop with a Core 2 Duo and a 9600M GT.
So, does this mean I can keep my HD7850 a bit longer?
It's only 7000 to 7600, right?
[QUOTE=gman003-main;49181175]I don't really think it's that weird for them to remove support for VLIW4/5 cards.
They obviously cannot support all cards forever. Some cards just physically can't support the modern software paradigm - you can't make a Rage Pro run DirectX 9 shaders in hardware no matter how hard you try. When the time comes, you don't stop them from working, you just stop adding new features.
So the question becomes when, not if. And a reasonable answer would be "when the cards can no longer support the latest APIs."
Well, DirectX 12 and Vulkan hit soon, and VLIW4/5 can't support them. VLIW5 came out when Dx11 was new, so this isn't exactly a surprise.
So regardless of whether AMD "officially supports" it or not, my 6870 is never going to run Dx12 games. "Official support" for these cards would basically just be game-specific optimizations - making the card run new games faster than it currently does. (Dx12 is actually trying to eliminate this practice, or rather, move it onto the game developers to optimize their games at the low level, instead of making driver authors do it.)
And like others have said, Nvidia may claim to "officially support" cards a lot longer, but they don't actually do it. I've got a Kepler card (same timeframe as GCN1.0) and it's already missing out on some driver features. Plus the driver crashes surprisingly often on W10. No idea why.[/QUOTE]
I would probably go another year or two at least. Some of the cards can still sufficiently run modern games like Fallout 4. I have a 6950 and average about 40 fps with low settings. My point being that people with the card would easily be able to get another year out of it without needing to upgrade.
You can talk about technical reasons to drop support, but they don't line up with consumer needs. I know that there are definitely a lot of people who will be affected by this. Take Overwatch, for example: the 6950 will be able to run it just fine, but now there won't be any driver update if an issue arises.
[QUOTE=Burnout6010;49181324]So, does this mean I can keep my HD7850 a bit longer?
It's only 7000 to 7600, right?[/QUOTE]
Yes, the 7850 is GCN 1.0, and is thus included in support for 15.11.
[editline]24th November 2015[/editline]
[QUOTE=Levelog;49181069]"Support" should be used lightly when they break the shit out of Fermi and earlier cards all the time.
[/QUOTE]
Until recently, cards much older than Fermi (Tesla etc.) were supported on the mainline branch; they broke the older cards off into a legacy branch at the 340 series, and removing them from the mainline drivers lets them avoid "breaking" older cards with changes focused on modern GPUs. Fermi through Maxwell are much more similar design-wise than Tesla and prior, so it makes sense that they would be kept active on a recent mainline driver. Hell, the GeForce 400 series is over 5.5 years old at this point, and anyone still rolling one of those has current driver support; that's pretty good if you ask me. The GeForce 8 series (their first unified-shader architecture) still has first-rung legacy driver support, and is fully supported on Windows 10.
[QUOTE=Burnout6010;49181324]So, does this mean I can keep my HD7850 a bit longer?
It's only 7000 to 7600, right?[/QUOTE]
Yup, 7700+ and the R7/R9 2xx/3xx series of cards are GCN cards, so they're fine.
Keep in mind that AMD actually released their first version of the Crimson driver as the "final" driver for the now phased-out cards. It won't receive any further updates after this one, but it's still nice to have.
[url]http://support.amd.com/en-us/download/desktop/legacy?product=legacy3&os=Windows+10+-+64[/url]
[QUOTE=sgman91;49181369]I would probably go another year or two at least. Some of the cards can still sufficiently run modern games like Fallout 4. I have a 6950 and average about 40 fps with low settings. My point being that people with the card would easily be able to get another year out of it without needing to upgrade.[/QUOTE]
And you can keep your 6950 for probably a few more years without needing to upgrade. The dropping of support won't immediately trash performance - just look at benchmarks from games at launch (before game-specific optimizations) and with later drivers. The difference is rarely over 5%.
So maybe instead of 50fps you'll get 47fps. That's not exactly a "mandatory upgrade".
Your point about not getting fixes for issues is valid inasmuch as driver updates sometimes do fix issues, but quite often the fixes come on the game side instead, from patches. So if an issue affects most cards on an architecture, and the game is otherwise playable on low-end hardware, I would be quite surprised to not see a fix.
If you're budget-constrained, and can tolerate low settings and resolutions, I wouldn't abandon the 6950 until Dx12-only games start coming out. That's the point where it will really become obsolete, and that's an actual hardware problem.
[QUOTE=sgman91;49181369]I would probably go another year or two at least. Some of the cards can still sufficiently run modern games like Fallout 4. I have a 6950 and average about 40 fps with low settings. My point being that people with the card would easily be able to get another year out of it without needing to upgrade.
You can talk about technical reasons to drop support, but they don't line up with consumer needs. I know that there are definitely a lot of people who will be affected by this. Take Overwatch, for example: the 6950 will be able to run it just fine, but now there won't be any driver update if an issue arises.[/QUOTE]
You're overestimating the effect of these optimizations. They're nice but not having them doesn't mean you can't run the game (unless there's an actual bug preventing you from doing so).