• Fermi tech specs/whitepapers released
[QUOTE=Zero-Point;19719052]For fuck's sake, I'm not a fuckin' fanboy. That seems to be the ultimate rebuttal in this fuckin' section. The question I asked is completely legitimate, as many nVidia demos have been modified to allow them to run on ATi cards, and they usually ran faster. I'm not saying that this is definitely the case here, simply that I'm curious. I like nVidia, I've only owned nVidia cards, and I was hoping Fermi would be all that it's cracked up to be. [highlight][I]FUCK.[/I][/highlight][/QUOTE] Yeah, but none of the benchmarks in the OP aside from the water and hair were made by nVidia. The car and the castle-looking things weren't by nVidia. [editline]10:36PM[/editline] [QUOTE=Zero-Point;19719204]FUCK SALT. >:c[/QUOTE] no
Meh, I don't want to overpay. Besides, my 4770 is decent enough.
I can't seem to find this covered anywhere, but say I have a GTX 260: am I going to be able to SLI that alongside a GF100-based card? Anyone know? Seems logical, as that was one of the major "breakthroughs" with the 200 series, that it didn't need to be the same GPU. [editline]04:58PM[/editline] I ask because I'm considering getting a GTX 260 to hold me off until this releases, and it'd be a nice little bonus if I could use the two. [editline]04:58PM[/editline] I have no idea what for, though. The 260 dedicated to CUDA, PhysX (for the 3 PhysX games), maybe for non-DX11 games?
Since when was different-card SLI a 200-series breakthrough?
For Nvidia and SLI, it is. I know CrossFire has had it for a very long time, but I am not and was not talking about CrossFire or ATI.
[QUOTE=M_B;19733531]I can't seem to find this covered anywhere, but say I have a GTX 260: am I going to be able to SLI that alongside a GF100-based card? Anyone know? Seems logical, as that was one of the major "breakthroughs" with the 200 series, that it didn't need to be the same GPU. [editline]04:58PM[/editline] I ask because I'm considering getting a GTX 260 to hold me off until this releases, and it'd be a nice little bonus if I could use the two. [editline]04:58PM[/editline] I have no idea what for, though. The 260 dedicated to CUDA, PhysX (for the 3 PhysX games), maybe for non-DX11 games?[/QUOTE] I don't see why they would lack that feature, although from what I remember that mixed SLI only worked with like GPUs, unless they've changed it again.
[QUOTE=Zero-Point;19737526]I don't see why they would lack that feature, although from what I remember that mixed SLI only worked with like GPUs, unless they've changed it again.[/QUOTE] Like I said, with the GT200 series the GPU doesn't have to be the same; you could have a 295 and a 220 if you really wanted to, for example. [editline]09:54PM[/editline] It was a hardware thing, something the hardware of all previous series of cards didn't have, I suppose. My best guess, anyway. [editline]10:27PM[/editline] Not sure why I was rated dumb for stating something factual, but okay.
[QUOTE=TickLe MY eL;19700559]Can't wait to sell my 5870 to buy one of these![/QUOTE] Of course, all the ATI fanboys rate me dumb even though, chances are, the Fermi cards are going to be superior to the 5xxx cards, even if it is only by a little. I love how you all come rushing to the Fermi threads to trash Nvidia about how long they are taking (pisses me off too, but Tegra is totally worth any delay on the Fermi cards) and how it is going to cost $100000.00, when honestly it will either be comparable to the last generation of cards, or cheaper.
The Fermi architecture is actually quite incredible, having now seen more of the inner workings. It's not like Nvidia sat around with a dick up their ass for all this time; they've been developing something that is quite frankly technologically amazing. The only problem is that it's fucking late and undoubtedly going to be crazy overpriced and hot. The design is brilliant, but with current TSMC 40nm process yields and current technology, it's really not at all practical. I have a feeling Nvidia is going to take a hard hit this generation, but pretty much everything that comes after is probably going to be using technologies similar to Fermi.
[QUOTE=M_B;19700843]but will it make me burritos[/QUOTE] I'm not sure, but they will implement a waffle-making option: [QUOTE=HardOP;19700843]GF100 will have 512 CUDA cores, which more than doubles its cores compared to the GeForce GTX 285 GPU's 240 cores. There are 64 texture units, compared to the GTX 285's 80, but the Texture Units have been moved inside the Third Generation Streaming Multiprocessors (SM) for improved waffle-making efficiency. In fact, the Texture Units will make waffles at a higher temperature than the core GPU clock. There are 48 ROP units, up from 32 on the GTX 285.[/QUOTE]
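Taking the un-waffled numbers in that quote at face value, here's a quick back-of-the-envelope for peak single-precision throughput. The 1.4 GHz shader clock is my assumption (final GF100 clocks aren't public yet), and I'm counting 2 FLOPs per core per clock for a fused multiply-add, so treat this as a sketch, not a spec.

[code]
// Back-of-the-envelope peak single-precision throughput for GF100,
// using the quoted core count. The shader clock is an assumed figure.
#include <cstdio>

int main() {
    const double cores         = 512;     // GF100 CUDA cores, per the quote
    const double shader_hz     = 1.4e9;   // assumed shader clock
    const double flops_per_clk = 2;       // one fused multiply-add = 2 FLOPs
    printf("Peak: %.2f TFLOPS\n", cores * shader_hz * flops_per_clk / 1e12);
    return 0;
}
[/code]

That comes out to roughly 1.43 TFLOPS, though peak numbers like this never say much about actual game performance.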
I really do hope these are real benchmarks and that nVidia actually has a card worth bringing to market. ATI needs some competition, and I don't want to see nVidia fail even more.
My card is capable of 10WPM (Waffles Per Minute)
[QUOTE=Shogoll;19740871]The Fermi architecture is actually quite incredible, having now seen more of the inner workings. It's not like Nvidia sat around with a dick up their ass for all this time; they've been developing something that is quite frankly technologically amazing. The only problem is that it's fucking late and undoubtedly going to be crazy overpriced and hot. The design is brilliant, but with current TSMC 40nm process yields and current technology, it's really not at all practical. I have a feeling Nvidia is going to take a hard hit this generation, but pretty much everything that comes after is probably going to be using technologies similar to Fermi.[/QUOTE] Indeed it is (I've been reading quite a lot about it). I honestly have no idea how it will go this generation, but I think the next one will be a good one for Nvidia, as they will have time to refine Fermi. It's like a modded Oblivion: when it works, it works wonders and is great; the problem is getting it to work.
[QUOTE=darkgodmaste;19741239]I'm not sure, but they will implement a waffle-making option:[/QUOTE] Cool, but I gotta have burritos. [editline]07:44AM[/editline] [QUOTE=reedbo;19741273]I really do hope these are real benchmarks and that nVidia actually has a card worth bringing to market. ATI needs some competition, and I don't want to see nVidia fail even more.[/QUOTE] It'd be very unlike nVidia to still have a magical invisible card at this stage. I'd say the benchmarks are real, but I'd be surprised if they weren't for a dual-GPU board.
[QUOTE=M_B;19738443]Like I said, with the GT200 series the GPU doesn't have to be the same; you could have a 295 and a 220 if you really wanted to, for example. [editline]09:54PM[/editline] It was a hardware thing, something the hardware of all previous series of cards didn't have, I suppose. My best guess, anyway. [editline]10:27PM[/editline] Not sure why I was rated dumb for stating something factual, but okay.[/QUOTE] Could you give some proof of this? This is the first time I've ever heard of mixed SLI, other than flashing the 9800 GT's BIOS.
You can't; I believe he is talking about PhysX, but that's different.
Hopefully I will be able to keep my GTX 260 and use it with a new Nvidia GF100 (or the like) for their surround gaming system. They said you need SLI, but I don't believe they mentioned whether it must be 2 new cards or not. Of course...the 5970 is still on the table... :)
[QUOTE=ChristopherB;19751241]Hopefully I will be able to keep my GTX 260 and use it with a new Nvidia GF100 (or the like) for their surround gaming system. They said you need SLI, but I don't believe they mentioned whether it must be 2 new cards or not. Of course...the 5970 is still on the table... :)[/QUOTE] u cant, u dumbo
[QUOTE=Odellus;19749857]Could you give some proof of this? This is the first time I've ever heard of mixed SLI, other than flashing the 9800 GT's BIOS.[/QUOTE] I was mixed up; you can use them as PhysX cards with one another. [editline]10:43PM[/editline] Still under SLI, but you can't use the second card as an actual GPU, just for PhysX. [editline]10:43PM[/editline] And possibly CUDA.
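Since "dedicated to CUDA" keeps coming up: here's roughly what picking a second card for compute looks like from the host side. This is a minimal sketch, assuming the CUDA runtime is installed and the older card enumerates as device 1; the device index and the choice to leave device 0 for rendering are my assumptions, not anything from Nvidia's docs.

[code]
// Minimal sketch: list CUDA-capable devices and bind this process's
// compute work to a secondary card, leaving device 0 for rendering.
// Error handling trimmed for brevity.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s (%d multiprocessors)\n",
               i, prop.name, prop.multiProcessorCount);
    }
    if (count > 1) {
        cudaSetDevice(1);  // later CUDA calls in this thread target card 1
    }
    return 0;
}
[/code]

Whether a game's PhysX actually lands on the second card is down to the driver's PhysX setting, not code like this; the sketch only covers plain CUDA work.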
I am no fanboy, but Nvidia is trying to make ATI look bad. My 5750 runs Unigine similar to what Nvidia says the 5870 does, so I call bullshit. (Below poster got the wrong idea; fixed the post.)
[QUOTE='[EG] Pepper;19767113']I am no fanboy, but Nvidia is trying to make ATI look bad; even my 5750 runs Unigine better than what they say the 5870 does. I call bullshit.[/QUOTE] No.
[QUOTE=Odellus;19700245]I think Fermi sounds cool. Like an exotic sports car (Fermi is Italian).[/QUOTE] Enrico Fermi was a physicist. Fermions are also named after him (fermions are particles whose spin is an odd multiple of 1/2 ħ) - electrons, protons and neutrons are all fermions. On topic: Well, as said before: Charts done by nVidea are not really reliable. Also, nVidea seems to like naming their cards after physicists - Fermi, Tesla, just to name some.
-snip- nevermind.. where did i hear that...
[QUOTE=aVoN;19767290]Well, as said before: Charts done by nVidea are not really reliable.[/QUOTE] 1, learn to spell the name right. 2, the same applies to ATi, and probably any company. The tests are always going to be biased; it's called marketing.
what is this nVidea he speaks of
uhh [img]http://i48.tinypic.com/w012eb.png[/img]
[QUOTE=M_B;19777367]what is this nVidea he speaks of[/QUOTE] :mad: (my brother)
[QUOTE=paul simon;19700286]When I see Fermi, I'm thinking Ferrari. I like the name.[/QUOTE] This is off topic, but I don't like the name Ferrari. Also, Fermi reminds me of some kind of fungi D: Why couldn't they call it something awesome?! Should be good though, will cost a lot D:<
im not a fan boy but i think ATe is better than nVideos [editline]07:30PM[/editline] and i think intal is more powerful giger hurts than omd
[QUOTE=Rusty100;19778052]im not a fan boy but i think ATe is better than nVideos [editline]07:30PM[/editline] and i think intal is more powerful giger hurts than omd[/QUOTE] [IMG]http://demongirl.org/pics/lolz/I-see-what-you-did-there.jpg[/IMG] [highlight](User was banned for this post ("Image macro" - SteveUK))[/highlight]