• Fermi GeForce = GTX 4XX
    1,778 replies
[QUOTE=ShaRose_;21070233]Still can't admit he has a source, after all of this. Pretty sad.[/QUOTE] Maybe I haven't made myself clear. I don't follow his articles, so I don't know if he's a good source or not. I don't [i]care[/i] if he's a good source or not. I simply prefer to get my info from someone who doesn't get hard when he sees the color red. [editline]10:53PM[/editline] And since the latest insider info about graphics cards is not particularly important to me, I feel I can be choosy about where I read it.
[QUOTE=Roast Beast;21070655]Maybe I haven't made myself clear. I don't follow his articles, so I don't know if he's a good source or not. I don't [i]care[/i] if he's a good source or not. I simply prefer to get my info from someone who doesn't get hard when he sees the color red. [editline]10:53PM[/editline] And since the latest insider info about graphics cards is not particularly important to me, I feel I can be choosy about where I read it.[/QUOTE] Earlier, there was no real insider info. It was charlie, and nvidia's PR teams. And nvidia's PR seemed less accurate. At least charlie explained his ideas, and actually gave some technical knowledge on the subject on hand.
[QUOTE=Roast Beast;21070655]Maybe I haven't made myself clear. I don't follow his articles, so I don't know if he's a good source or not. I don't [i]care[/i] if he's a good source or not. I simply prefer to get my info from someone who doesn't get hard when he sees the color red. [/QUOTE] "This person cannot be correct/reliable because I dislike him"
[QUOTE=that1dude24;21071002]"This person cannot be correct/reliable because I dislike him"[/QUOTE] No, "I don't want to listen to this person because I dislike him". Nowhere did I say that he couldn't be correct or reliable. [editline]11:21PM[/editline] [QUOTE=ShaRose_;21070995]Earlier, there was no real insider info. It was charlie, and nvidia's PR teams. And nvidia's PR seemed less accurate. At least charlie explained his ideas, and actually gave some technical knowledge on the subject on hand.[/QUOTE] Insider info must be the wrong term, I just meant the latest news that isn't widely distributed.
[QUOTE=Roast Beast;21071097]Insider info must be the wrong term, I just meant the latest news that isn't widely distributed.[/QUOTE] Same thing applies, nvidia's PR went everywhere.
[QUOTE=acds;21060909]Well considering the temps, it's a good idea to get a decent case (not a no-name brand case with one puny fan in the front). It's not required, but a good airflow is a good idea with these cards (though I doubt anyone buying a 500€ card doesn't have a good case already).[/QUOTE] i doubt the temperature difference between an nvidia card and an ati card is really drastic enough for that, unless you already have a shitty case. [editline]12:28AM[/editline] [QUOTE=ShaRose_;21070233]Still can't admit he has a source, after all of this. Pretty sad. Also, XFX basically said that fermi was shit as well, and won't carry it. [url]http://www.legitreviews.com/news/7707/[/url][/QUOTE] there's absolutely no source for that statement other than "WE JUST RECEIVED CONFIRMATION!!!!" not even a screenshot of the confirmation in question. [editline]12:30AM[/editline] also as per the charlie debate, remember just a few months ago when he made an article saying that nVidia decided to drop out of the graphics card business completely? yeah
Wasn't there a rule against fanboyism somewhere? Also use the ignore lists guys.
TSMC basically screws everyone over, but the screwed don't have much choice since I guess TSMC has the biggest production scale of all the silicon companies. Anyways, Fermi's heat and power consumption may come from the architecture alone, but may also come from things like the properties of the wafer (quality, etc.). At least that's what I [b]SPECULATE[/b]. Finally, I think Charlie will kill himself if Nvidia does a heel-face turn and releases very cheap and powerful cards while ATi do their version of Fermi.
Nvidia won the contract for Nintendo's new handheld: [url]http://www.aussie-nintendo.com/news/19860/[/url] Just thought to post a link that is positive news for Nvidia, since the only thing that gets posted is "lololl nvidia is dead lolol".
[QUOTE=acds;21074974]Nvidia won the contract for Nintendo's new handheld: [url]http://www.aussie-nintendo.com/news/19860/[/url] Just thought to post a link that is positive news for Nvidia, since the only thing that gets posted is "lololl nvidia is dead lolol".[/QUOTE] they're anything but dead when it comes to mobile stuff ATi might have desktops down but that's all they seem to focus on, at least nvidia is branching out more with stuff like ion and tegra, and trying new ideas e.g physx
They are not trying physx, they are fucking enforcing it. You have ati as your main gpu and nvidia for physx? Well fuck you, we won't officially allow you to use it anymore.
I never said physx was good, I said they're trying new ideas. and I still can't believe you're all on about this. it's nvidia's idea, why do you all think they should just hand it to the competition? last I checked when they developed the atomic bomb they didn't go around handing out the plans. if it's so important, why haven't ATi developed their own version? as usual, nvidia thread just means everything runs down to red fanfaggotry
Not really nvidia's idea, they bought it from Ageia in the first place [editline]03:46PM[/editline] Physx is bound to die anyway
I realise it costs a lot more but it's just a really nice feeling knowing my 5970 still leads, even though it's a dual GPU card, all about feeling good about your purchase and what not. don't misconstrue that as fanboyism as it's nothing about ATI and all about the actual card i own, not the brand. if ati went and put out a 5990 i'd be pretty unhappy. oh wait god fucking damnit
ATI is trying new ideas... Eyefinity?
[QUOTE=thedekoykid;21075641]ATI is trying new ideas... Eyefinity?[/QUOTE] th2g
[QUOTE=reapaninja;21075159]they're anything but dead when it comes to mobile stuff ATi might have desktops down but that's all they seem to focus on, at least nvidia is branching out more with stuff like ion and tegra, and trying new ideas e.g physx[/QUOTE] They're anything but dead when it comes to anything. Loads of people still buy Nvidias, prebuilts often use Nvidia and most computer technicians will still suggest Nvidia cards (and Nvidia has like 64% of the market on Steam, ATI only 28%). Plus they have loads of sales to non-privates, and these contracts. And neither Eyefinity nor PhysX are original ideas, but originality doesn't really matter as long as you have exclusivity.
[QUOTE=acds;21075903]Nvidia has like 64% of the market on Steam, ATI only 28%). [/QUOTE] Nvidia 61.88% ATI 30.92%
[QUOTE=opaali;21075947]Nvidia 61.88% ATI 30.92%[/QUOTE] Yeah my fault, old data. But my point still stands.
[QUOTE=M_B;21072980]also as per the charlie debate, remember just a few months ago when he made an article saying that nVidia decided to drop out of the graphics card business completely? yeah[/QUOTE] Except that's not what he said at all. He said that he had received word from OEM insiders that Nvidia was cutting production of its GPU dies from the GTX260 upwards in an attempt to focus on Fermi and save costs, not that they were dropping out of the graphic card business completely. And he was partially right. [url]http://www.fudzilla.com/content/view/15919/1/[/url] Obviously he's biased as hell and you can't take everything he says seriously, but honestly a lot of his articles have quite a bit of accuracy in them. He's an anti-Nvidia lunatic, but he certainly has a lot of contacts and he does his research, even if it's all in the name of dragging Nvidia's name through the mud.
[QUOTE=reapaninja;21075376]I never said physx was good, I said they're trying new ideas. and I still can't believe you're all on about this. it's nvidia's idea, why do you all think they should just hand it to the competition? last I checked when they developed the atomic bomb they didn't go around handing out the plans. if it's so important, why haven't ATi developed their own version? as usual, nvidia thread just means everything runs down to red fanfaggotry[/QUOTE] They did try to do that thing with Havok, you know. Fuck it, we have DirectCompute and OpenCL now. All of that is doomed as soon as the first physics library for GPGPU pops its head out.
[QUOTE=reapaninja;21075376]I never said physx was good, I said they're trying new ideas. and I still can't believe you're all on about this. it's nvidia's idea, why do you all think they should just hand it to the competition? last I checked when they developed the atomic bomb they didn't go around handing out the plans. if it's so important, why haven't ATi developed their own version? as usual, nvidia thread just means everything runs down to red fanfaggotry[/QUOTE] OpenCL. Unlike nvidia, ati tries to use open standards that work everywhere, at least with regards to stuff that should be open. nvidia uses physx, which is proprietary and can only be used on nvidia cards. ati uses OpenCL, which works on both. nvidia uses 3D Vision, again proprietary and only usable on nvidia cards. ati puts out drivers that allow other companies to hook in and provide 3D (Bit Cauldron, whenever they release the things. At least it's open standards again.). [editline]11:20AM[/editline] [QUOTE=BmB;21076190]They did try to do that thing with Havok you know. Fuck it, we have directcompute and OpenCL now. All of that is doomed soon as the first physics library for GPGPU pops it's head out.[/QUOTE] That'd be Bullet, btw. They support OpenCL. And that other library, I can't remember the name.
[QUOTE=reapaninja;21075376]I never said physx was good, I said they're trying new ideas. and I still can't believe you're all on about this. it's nvidia's idea, why do you all think they should just hand it to the competition? last I checked when they developed the atomic bomb they didn't go around handing out the plans. if it's so important, why haven't ATi developed their own version? as usual, nvidia thread just means everything runs down to red fanfaggotry[/QUOTE] Dude did you honestly compare PhysX to the atomic bomb? Also I swear to god, people in this section have an ATI bias, that's pretty clear. You however, have a fucking anti-"imaginary ATI fanboy infestation" bias that is really starting to piss me off. Your constant baseless accusations of everyone being an ATI fanboy are borderline trolling and more annoying than anything else in the entire forum. I agree with you on quite a few points, but seriously why do you keep having to bring up "red fanfaggotry" and other assorted retarded terms for everyone arguing against you?
Pixelux, that's it. [url]http://www.pixeluxentertainment.com/[/url] [editline]11:22AM[/editline] [QUOTE=Shogoll;21076332]Dude did you honestly compare PhysX to the atomic bomb?[/QUOTE] No, he compared graphics card features to the atomic bomb. Quite the fanatic.
[QUOTE=reapaninja;21075376]If it's so important, why haven't ATi developed their own version?[/QUOTE] [url=http://www.tomshardware.com/news/nvidia-physx-ati,5764.html]Because they don't have to. It runs on ATI hardware just fine.[/url] In my opinion PhysX is just a gimmick; it's cool and all, but it really isn't necessary and it bogs down performance too much for it to be worth using. I'd also like to mention that I am not a very big fan of ATI. My worst experiences from them stem from their shitty drivers and even worse support for OpenGL/Linux. To me, Nvidia has always been putting out better drivers. They aren't perfect, per se, but they are definitely better than the drivers on the several ATI cards I've used.
I can't speak for Linux support but the Windows drivers for ATI are awesome now.
[QUOTE=ShaRose_;21076276][i]OpenCL. Unlike nvidia, ati tries to use open standards that work everywhere, at least with regards to stuff that should be.[/i] [/QUOTE] If you're saying Nvidia doesn't have support for OpenCL (1.0), you might have missed something. ([url]http://www.nvidia.com/object/cuda_opencl_new.html[/url]) If not, sorry for interrupting you.
I think he's saying that Nvidia promotes closed-platform lock-in and ATI promotes open standards.
If nvidia does one thing well, it's predicting future software and customer trends. (Ion and Tegra, anyone?)
[QUOTE=ShaRose_;21070047]I think you should stop being so pigheaded. The 1.7 you are referring to was in september, I'm referring to nvidia's A3 risk wafers, of which they ordered 9000. The ones being used NOW. Which, if current launch quantities are correct, is under 2 percent again! It's almost as if you refuse to read and instead are too [B]BIASED[/B] to understand what I'm saying.[/QUOTE] I'm sorry I don't keep track of every yield report and read every article that Charlie writes, because I don't care about yields and companies so long as the cards are put on the market. And I don't read Charlie's articles anymore because for about a month a while back he was blatantly shitting all over Nvidia, and I don't feel like reading "professional" articles by a person who does that. Yes, he's a great source, but trawling through that blatant bias is a chore. [QUOTE=Dark-Energy;21070079]And I didn't backpedal out my statement, did I? No. I just simply started trolling afterwards cause I thought it was funny and had nothing else to do. It's funny because you will probably find one little statement that appears, at first, that I backpedaled. Infact, I don't feel that I backpedalled at all. I had absolutely no intentions of doing that cause I was right. I bet you're going to find something that you misinterpreted as back pedalling, and then going off on a huge tangent about it. Just watch.[/QUOTE] [url=http://www.facepunch.com/showpost.php?p=20613372&postcount=177]Since you seem so livid about me doing it,[/url] [url=http://www.facepunch.com/showpost.php?p=20614709&postcount=188]I'll do it just to [/url][url=http://www.facepunch.com/showthread.php?t=888788&page=5]appease you.[/url]