• Nvidia CEO warns of “extraordinary, unusually turbulent, disappointing” Q4
https://arstechnica.com/gaming/2019/01/nvidia-ceo-warns-of-extraordinary-unusually-turbulent-disappointing-q4/
Probably because the 20** series doesn't bring anything significant to the table, RTX is a long way from entering the mainstream, and the 10** series is still powering through everything most gamers need. Hell, my 770 was handling everything I needed. Also, proprietary bullshit. RTX shouldn't be locked off from the rest of the industry.
The main reason is that Amazon/Google have started making their own data center chips, and Nvidia's upper management lied out their asses about the surplus from the bitcoin craze. On top of that, one of their main benefactors recently pulled out. It's all big-ass warning lights flashing over Nvidia's stock.
Basically, it's because NVIDIA's Q4 profits lack the massive surge that came from the mining days in previous quarters. They hoped the release of Turing would at least soften that blow to profits, but when it got out that most of the Turing line was just "Pascal but more expensive" in terms of performance, with a feature that cuts that performance in half for what is right now minimal visual gain, consumers didn't bite all that much. Those who did buy largely went for Pascal GPUs on the second-hand market (eBay), because NVIDIA decided to discontinue Pascal. Most consumers would rather wait until the tech matures or delivers a real performance improvement over what they already have, for the same price or less. Pascal was on the market for 3 years completely unchallenged, and Turing so far makes it look like NVIDIA was just sitting on their asses for most of those 3 years.
Oh I'm sorry, I thought RTX would let you "crush" the Radeon VII? Looks like it's not even doing great on its own~
Still, a GTX 1080 Ti is largely equivalent to an RTX 2080/GTX 1180, which puts it in the very next tier below the RTX 2080 Ti/GTX 1180 Ti, the leading non-Titan card on the market at the moment. The fact that AMD has an answer to that at all is pretty big. What I'm more concerned about is their lagging behind on everything beneath it. The original Vega 56/64 aren't all that compelling, and those are all they have to match the GTX 1070/1080. And anything below those is still on Polaris.
If the new video cards weren't so fucking expensive, they'd probably do better. The 2080 Ti is nearly twice as expensive as the 1080 Ti, but far from twice as fast. Not a good move either to completely discontinue the 10-series. They're trying to sell cards on the whole RTX tech, but it's honestly not fast enough to be seriously used yet.
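To put rough numbers on that, here's a quick back-of-the-envelope sketch. The launch prices (~$699 for the 1080 Ti, ~$1,199 for the 2080 Ti Founders Edition) and the ~35% average uplift are ballpark assumptions for illustration, not benchmark results:

```python
# Rough perf-per-dollar comparison for the point above.
# Prices and the ~35% uplift are assumed figures, not measured data.
cards = {
    "GTX 1080 Ti": {"price_usd": 699,  "relative_perf": 1.00},  # assumed launch MSRP
    "RTX 2080 Ti": {"price_usd": 1199, "relative_perf": 1.35},  # assumed FE price, ~35% avg gain
}

for name, c in cards.items():
    # Normalize to "relative performance per $1,000 spent"
    print(f"{name}: {c['relative_perf'] / c['price_usd'] * 1000:.2f} perf per $1,000")

price_ratio = cards["RTX 2080 Ti"]["price_usd"] / cards["GTX 1080 Ti"]["price_usd"]
perf_ratio = cards["RTX 2080 Ti"]["relative_perf"] / cards["GTX 1080 Ti"]["relative_perf"]
print(f"~{price_ratio:.2f}x the price for only ~{perf_ratio:.2f}x the performance")
```

Under those assumptions you're paying roughly 1.7x the money for roughly 1.35x the frames, which is the "not twice as fast" complaint in one line.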
Radeon VII might find moderate success when non-reference cards start to come out and prices drop. The reaction to Radeon VII was a mix of "YEAH GO AMD!" and "GODDAMN THAT'S EXPENSIVE, WHAT ARE YOU DOING?", so the sooner you can get it for cheaper, the better for AMD. As for Navi, that'll basically cover the mid-range from what we're hearing, with the top-end card rumored to either match or come close to a 2070. Essentially AMD has caught up, so it'll probably be another year before we see any real advancements in performance from either party. Still, it's crazy. We've had the same GPU horsepower for essentially 3 years, and it looks like we're shaping up to make this year the 4th. GPUs used to drastically improve every year or two, so I'm left wondering what exactly happened to cause this.
Vega fell on its face thanks to Sony's meddling and Nvidia became complacent after the massive success of Pascal.
Isn't the same kind of stuff happening with CPUs to a degree? I wonder if we're reaching the limits of Moore's law, or if companies just aren't trying.
Well, Sony and Microsoft are both heavily sponsoring Navi since it'll be in the PS5/Xbox Lockhart, so it'll at least set a new bar for NV and AMD to jump over.
Nvidia is doing a horrible job of convincing people to upgrade from the GTX 1000 series cards, which are the most common gaming GPUs out there. They're functionally the exact same, but bumped down a naming tier and with RTX added, which nobody really seems to use. Even the pricing is the same. The RTX 2070 is the same performance and price as the GTX 1080. The RTX 2080 is the same performance and price as the GTX 1080 Ti. The only reason to buy one of these is if you have a Maxwell or previous card and Pascal cards are out of stock.
I have a Maxwell card still (GTX 970), and I still don't see any reason to get an RTX GPU simply because the prices are so ludicrous and RTX is clearly still in its early stages. As such, if I'm to get a new GPU this year, it's either going to be a Radeon VII or Navi GPU, or maybe a used 1080ti if I can find one for cheap (unlikely given they aren't being produced anymore). Until then, my 970 is still serving me surprisingly well. I just expected to upgrade it by now is all, with a GPU that is actually a massive upgrade.
Can you elaborate on just what meddling was going on?
Sony's been involved in AMD's GPU business lately due to wanting PS5 to run on Zen/Navi. As to what "involved" means, I'm not entirely certain, but I imagine it has to do with funding its GPU R&D. There are all-but-confirmed indications that Vega ended up the disappointment that it was because Sony strangled its development budget and forced AMD to push it towards Navi instead.
Probably because nobody can fucking afford them. They're still pushing the high prices of the bitcoin rush and don't seem to understand it's over now.
Has nothing to do with competition other than the lack of it. nVidia oversold by almost an order of magnitude on mining and mining alone. Their data center and deep learning profits have been stable, and there's nothing competing from AMD; when AMD does enter the competition, their solution is basically a slightly cheaper sandwiched knock-off with the TDP blown literally through the power and temp roof, which is not an elegant answer in any playbook. They basically stalled on their R&D roadmap for a year based solely on mining profits. Considering people were taking out second mortgages to buy banks of 1080 Tis for over two years, nVidia had little financial incentive to do anything other than wait for manufacturing-side upgrades to fabs, especially after getting the nod from Nintendo. They just got greedy, and all of Jensen's talk about making sure gamers and universities, aka their core business, were supplied first was bullshit.

RTX is, in fact, a pretty big deal, on the same level as T&L was back in the day. AMD took a bath on being first with T&L and tessellation both; now both are industry and discipline standard (and nVidia had better versions of both, ironically enough), and ray tracing is coming hell or high water, so for nVidia to attempt to be first isn't really anything controversial or foolish. Where nVidia fucked up was charging the customer close to the actual cost instead of amortizing it over time and generations as has been done in the past. nVidia didn't stick to the standard model of shaving on margins and manufacturing; based on the volume of mining cards, they simply decided to gouge the shit out of consumers to get DXR standardization working and in place before AMD, and instead of rolling the capability out in enough silicon to test over generations, they bolted it wholesale onto an existing engineering model and made consumers pay for it. Again, simple greed, but someone was always going to take the hit on RT, and this time it was team green instead of team red.

AMD deliberately and by design fucked over Polaris to make sure Navi got the next batch of console wins to guarantee the sale of 30 million SKUs, which is great for console people a year or two from now, but has sucked for PC people for two years running and the next year minimum thanks to the lack of competition, as they've had zero answer to anything nVidia has done other than 'nVidia sucks, I hate them and won't buy it'. So AMD's sainthood-by-proxy is subjective at best.

Yeah, except Su already stated straight up that console Navi is a mid-range chip equivalent to an X70, so if you're expecting mad gains over current consoles that can't even do 4K without checkerboarding and random-sampling the frame buffer, you're going to be pretty sad, because 60/'4K-ish' under DX12 is still the general target. If AMD was the champion their fanboys purport, they'd have had a non-ray-tracing card just blasting frames out in the literal hundreds at 4K before Navi, and not only do they not have that now, Navi won't have it either unless they make another X99-style mega sandwich card for, ya know, the cheap cheap price of about 2-2.5K like they did two gens ago. nVidia got greedy, and now the market, much like with Blizzard and EA, cannot sustain their greed. Simple as. The notion that AMD is 'winning' simply by existing while nVidia slaps themselves vigorously with their own dick is specious as hell.
lol no. A 1080 Ti is better than a 2080. https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-Nvidia-GTX-1080-Ti/4026vs3918 It has more VRAM and (arguably) performs better.
This generation from Nvidia has been about as disappointing as the 7xx series. I'll be retiring my 2016 GTX 1070 as soon as the Radeon VII's released and moving on to a 1440p monitor.
gee i wonder why the cards aren't selling well https://files.facepunch.com/forum/upload/110524/56807f86-f455-4954-bd6e-6c9e2b67a653/image.png
Then what of Xbox? Lockhart is rumored to be a beefier One X.
Kinda miffed that Nintendo went with Nvidia for the Switch, especially with it only being a Tegra X1. That's one of a couple reasons why I thought the Switch could've used another year in the oven (then again, so could've the Wii U and especially the 3DS, in hindsight...). They'd been on AMD CPU/GPU chipsets since the GameCube, and I was kinda hoping we'd see something at least somewhat comparable in power to the PS4/Xbone, considering that it's a Gen9 system and yet isn't even as strong as the best of Gen8. At least the Wii handily beat the original Xbox, and likewise the Wii U vs. PS360, but the Switch is still between the Wii U and the base Xbone. Would've preferred a mobile Zen in the thing. What's this Lockhart/Anaconda business? Last I checked the new Xbox was going by Scarlett.
Memory cost, 4K not taking off like a lot of people said it would (if the Steam stats are to be believed, the percentage of PC players on 4K monitors actually went down last year), and the demands of VR having been met for the most part. There's no reason to push further except for ray tracing; notice how it was ray tracing and some algorithmic AA bullshit that just makes the picture look better in a cosmetic way. We're about to enter the uncanny valley of visual graphics, and no one wants to make the first jump into that.
I mean anything AMD could've put in it would've made it suffer both in cooling and battery life (and with that, performance), and Nvidia already had something perfectly suitable for them, so I honestly don't see why Nintendo would've gone a different route than they did.
Except the heaviest weighting in UserBenchmark reviews is "user experience", which is about as objective as smearing your CPU with thermal peanut butter and mayo before jamming a ceiling fan on it, cause two great tastes that taste great together. Meanwhile: the [H]ardOCP GeForce RTX 2080 Founders Edition review. A very clear winner, and it isn't the 1080. The price is the issue, not performance; a 25-40%, title-dependent upgrade does not justify doubling the price, nor should it ever. The corollary to that is you shouldn't expect a 100% increase for the same price either. ONE gen had that happen, and now people like Adored bray at the top of their lungs that every gen should almost double performance for equal price. Not happening, almost ever, nor should anyone expect it. Clickbait at best.

They roughly have the same white-paper specs at a higher clock with more memory; it depends entirely on whether they've learned the lesson that got them spanked versus Sony last time: selling a system for more money while delivering 45% less throughput. I'll assume they aren't dumb enough to make the same mistake twice.
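For the sake of the arithmetic: if you hold perf-per-dollar flat, a 25-40% uplift only supports a 25-40% price bump. A minimal sketch, assuming a $699 1080 Ti baseline and a ~$1,199 2080 Ti asking price (both illustrative figures, not numbers from the review above):

```python
# What asking price does a 25-40% gen-on-gen uplift support at flat perf-per-dollar?
# The $699 baseline and $1,199 asking price are assumed figures for illustration.
baseline_price = 699
actual_price = 1199

for uplift in (0.25, 0.40):
    justified = baseline_price * (1 + uplift)
    print(f"{uplift:.0%} faster -> ~${justified:.0f} at the same perf per dollar")

print(f"${actual_price} is a {actual_price / baseline_price - 1:.0%} price increase over ${baseline_price}")
```

Under those assumptions the uplift "earns" a card somewhere in the $870-$980 range, while the actual ask is roughly a 70% price increase, which is the gap people are reacting to.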
This news tanked Nvidia's stock again; not looking good for investors.
WHY ARENT PEOPLE BUYING OUR £1300 GRAPHICS CARDS
Could literally never forget. That card was amazing and lasted me years, and years, and years. The 8800 GT was good enough to run Crysis at launch, not at staggering settings, but it fucking ran.
Well yes. Because we're talking about the 1080Ti. https://i.imgur.com/0VHwlWO.png https://i.imgur.com/ka3hjFD.png https://i.imgur.com/aSJ5AMg.png taken from https://www.youtube.com/watch?v=8L47x0YN1fg
Oh boy, and right after TSMC announced that tens of thousands of wafers from the 16/12nm process that Nvidia uses had to be thrown away. Nvidia sure has it rough this year already lmao. https://www.pcgamesn.com/tsmc-chemical-manufacturing-fault-nvidia-gpus