NVIDIA GeForce GTX 560 Ti: second-generation Fermi for the $250 mainstream
172 replies
[QUOTE=reapaninja;27687437]you have pretty high standards if you think this is midrange, that's more like the 450/5770[/QUOTE]
Oh my bad, I didn't know the 450 existed :saddowns:
[QUOTE=Kybalt;27677762]so how does this compare to my 9800GTX+[/QUOTE]
a huge improvement over the 9800
Hell, this is over a 30% improvement over a 460. I would think it should wipe the floor with any non-Fermi NVIDIA card that ever existed.
[quote=mactrekkie;27689469]hell this is over %30 improvement over a 460, i would think it should wipe the floor with any non-fermi nvidia card that ever existed.[/quote]
GTX 295?
/caps
[QUOTE=garrynohome;27682932]Honestly, is a card like the 560 required to run basic Source engine games?
Also, does 40fps look smooth to you? When my games dip below 60fps (I do NOT run an fps monitor, I can just see it) I go insane. But most people tell me I'm just a strange case.[/QUOTE]
the eyes cannot see a difference past 20 FPS
[b]Forgive me i was wrong here[/b]
[editline]27th January 2011[/editline]
[QUOTE=thf;27689475]GTX 295?
/caps[/QUOTE]
duh
[QUOTE=ZombieWaffle;27689480]the eyes cannot see a difference past 20 FPS
[editline]27th January 2011[/editline]
duh[/QUOTE] I sure can... 20 FPS looks like a fucking slideshow to me.
[QUOTE=ZombieWaffle;27689480]the eyes cannot see a difference past 20 FPS
[editline]27th January 2011[/editline]
duh[/QUOTE]
Are you seriously saying you can't tell the difference past 20 FPS? You must be blind or something.
[QUOTE=Badal;27689493]Are you seriously saying you can't tell the difference past 20 FPS? You must be blind or something.[/QUOTE]
My information was wrong. My mistake.
It's actually 60. I'm not thinking straight.
I can't see a difference past 60 FPS on my current monitor, but I believe I could on a 120hz monitor.
[QUOTE=ZombieWaffle;27689540]My information was wrong. My mistake.
It's actually 60. I'm not thinking straight.[/QUOTE]
It's still wrong. As long as you have a monitor with a refresh rate over 60 you can see the extra frames.
[QUOTE=thf;27689475]GTX 295?
/caps[/QUOTE]
Oh I forgot about that :eng99:
clock is not how you're supposed to compare different cpus or gpus of different models. the amount of hz is only relevant in comparisons of the same model of cpu. architecture varies.
[editline]27th January 2011[/editline]
page late sorry
[QUOTE=ZombieWaffle;27689540]My information was wrong. My mistake.
It's actually 60. I'm not thinking straight.[/QUOTE]
No it's not 60 you fucking idiot.
[highlight](User was banned for this post ("Flaming" - SteveUK))[/highlight]
[QUOTE=Mattk50;27689771]clock is not how you're supposed to compare different cpus or gpus of different models. the amount of hz is only relevant in comparisons of the same model of cpu. architecture varies.
[editline]27th January 2011[/editline]
page late sorry[/QUOTE]
Hence retail 3.8 GHz Pentium 4s getting whomped by even a dual-core Atom. That's where the whole 'megahertz myth' comes from.
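The clock-vs-architecture point boils down to simple arithmetic: rough throughput is clock times instructions-per-clock, so a lower-clocked chip with better IPC can still win. A toy sketch (the IPC numbers here are made up for illustration, not measured values):

```python
# Rough model: throughput ~ clock (GHz) * instructions per clock (IPC).
# IPC figures below are illustrative only.
def throughput(clock_ghz, ipc):
    """Billions of instructions per second under this toy model."""
    return clock_ghz * ipc

pentium4 = throughput(3.8, 0.8)  # deep pipeline, high clock, low IPC
newer = throughput(2.5, 2.0)     # lower clock, much higher IPC

print(pentium4, newer)  # the higher-clocked chip loses
```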
It's 24 unless you're willing to tell me that you're experiencing slideshow hell in the cinema?
Case closed and kids stopped lying.
Well in games, without motion blur and all that stuff, it looks terrible.
[quote=bomimo;27690468]it's 24 unless you're willing to tell me that you're experiencing slideshow hell in the cinema?
Case closed and kids stopped lying.[/quote]
HaHAHAHAHAHAHA
more illiterate gold from the self-proclaimed know-it-all, Bomimo.
[QUOTE=thf;27690543]Well in games, without motion blur and all that stuff, it looks terrible.[/QUOTE]
motion blur looks worse to me. much worse. if you're running at lower than 60fps i can see why it may help, but at 120fps rather than 60 i can't see the reason for motion blur
What does the Ti mean?
[QUOTE=Flubadoo;27690970]What does the Ti mean?[/QUOTE]
Titanium. It's an old "brand" they used back around 2002, although the usage here is purely for marketing since it might as well just be called the GTX 560.
[editline]27th January 2011[/editline]
[QUOTE=Bomimo;27690468]It's 24 unless you're willing to tell me that you're experiencing slideshow hell in the cinema?
Case closed and kids stopped lying.[/QUOTE]
Hahahahaha. Watching a movie =/= playing a game.
[QUOTE=Fycix;27682878]I hate when I see "Oh that's a shitty card" type deals, because here I am playing Source Engine games on mid and getting 40 FPS with my shitty HD4200 and AMD Phenom.[/QUOTE]
Source games are not demanding at all
[QUOTE=ZombieWaffle;27689540]My information was wrong. My mistake.
It's actually 60. I'm not thinking straight.[/QUOTE]
No it isn't. Eyes don't have an "FPS cap".
The whole reason one would think they can't notice anything beyond 60fps is most likely that their monitor only supports a 60hz refresh rate.
[editline]27th January 2011[/editline]
[QUOTE=Bomimo;27690468]It's 24 unless you're willing to tell me that you're experiencing slideshow hell in the cinema?
Case closed and kids stopped lying.[/QUOTE]
please stop
[QUOTE=Bomimo;27690468]It's 24 unless you're willing to tell me that you're experiencing slideshow hell in the cinema?
Case closed and kids stopped lying.[/QUOTE]
lol
i guess this is the year to upgrade my computer!
[QUOTE=Odellus;27690729]HaHAHAHAHAHAHA
more illiterate gold from the self-proclaimed know-it-all, Bomimo.[/QUOTE]
Lol, more rushing headfirst in and calling everybody retards based on snippets by the "hardass" Odellus. How about you show the post where I state I know everything?
[QUOTE=Odellus;27683169]I get 40 FPS less than my friend during hordes in L4D.
I have a Q9400 @ 3.6 GHz with a GTX 570, he has an i7 920 @ 4 GHz and a GTX 470. He's even running a larger resolution (2048x1152) than me but the same (max, 16xQ CSAA) settings, including transparency anti aliasing set to 4x SSAA.
I'm only complaining because I run at 120 Hz so anything below 120 is quite noticeable. That 40 FPS is the difference between 100 FPS and 60 FPS (or 120 and 80), worst case scenario, where 60 looks plain terrible compared to 100 and up.[/QUOTE]
I was right *does a happeh dance*. In response to your comments about refresh rate and frame rate: you don't believe that crap about refresh rate being the limit of frames that can be displayed, do you?
[editline]27th January 2011[/editline]
[QUOTE=Badal;27689601]It's still wrong. As long as you have a monitor with a refresh rate over 60 you can see the extra frames.[/QUOTE]
I have a monitor (LCD) with a refresh rate of 60. You can show me CoD4 running with vsync on (60fps) and vsync off (160+ fps) and I can always tell the difference, so obviously the refresh rate isn't the cap on frames that can be displayed. Granted, you will get tearing, but it doesn't limit the frames displayed to 60 per second.
So, Frames Per Second becomes a whole different deal when it's games, according to you guys? Wouldn't that depend on a shitton of circumstances beyond just 24 frames delivered in a second?
AFAIK it also depends on how stable the delivery is. What good is 24 FPS if 16 of them appear in the first or last fifth of the second due to a stressed GPU? Which again varies for each game (how demanding they are). You can easily play some games well at a stable 25 FPS if the interval between each frame isn't borked to hell and the workload stays below the straining point.
Ah, what the fuck does reasoning and discussing anything do anyway. You guys are too smug to discuss anything with. Ok, classy gamers, 60 FPS or your eyes burn, fuck stable intervals and whatnot, but 24 FPS in the cinema is just fine and dandy. Maybe consider not just 60 frames a second but how they get distributed during that second. Or are you just so much cooler than me that it's inconceivable?
Oh, hey Smug Illiteracy thread.
[sp]25 evenly distributed FPS > 30 FPS mashed in there somewhere (which is where "my eyes hurt" applies)[/sp]
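The frame-pacing point can be made concrete: two captures can both deliver 24 frames in a second while one clusters most of them into the first fifth. A small sketch with invented timestamps:

```python
# Two hypothetical one-second captures, each delivering 24 frames,
# expressed as frame timestamps in milliseconds.
even = [i * 1000 / 24 for i in range(24)]    # evenly paced
bursty = [i * 10 for i in range(16)] + \
         [200 + i * 100 for i in range(8)]   # 16 frames in the first fifth

def worst_gap(timestamps):
    """Longest wait between consecutive frames, in ms."""
    return max(b - a for a, b in zip(timestamps, timestamps[1:]))

print(worst_gap(even))    # steady ~42 ms gaps
print(worst_gap(bursty))  # a 100 ms stall despite the same average FPS
```

Same "24 FPS" on a counter, but the bursty capture stutters, which is the point about stable intervals.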
[QUOTE=Bomimo;27695103]So. Frames Per Second becomes a whole different deal when it's games according to you guys? wouldn't that depend on a shitton of circumstances beyond just 24 Frames delivered in a second?
AFAIK it also depends on how stable the delivery is. What does 24 FPS help if 16 of them appear in the first fifth of the second due to stressed GPU. which again is individual for each game when that happens (how demanding they are). You can easily play some games well and stable at 25 FPS if the intervals between each frame isn't borked to hell and the workload still is below the straining point.
Ah, what the fuck does reasoning and discussing anything do anyway. You guys are too smug to discuss anything with anyway. Ok classy gamers, 60 FPS or your eyes burn, fuck stable intervals and whatnot, But 24 FPS in the Cinema's just fine and dandy. Get more bullshit guys.
Oh, hey Smug Illiteracy thread.[/QUOTE]
You're an idiot. In movies every frame is repeated THREE FUCKING TIMES! Even you can do 24x3. Also, go run CoD4 at 30fps. The game doesn't use any motion blurring techniques, so it doesn't look smooth to the eyes of a human being. Then put it at 60fps: looks smooth. So yes, games and movies do look different even when the framerate is the same for both.
Source: [url]http://en.wikipedia.org/wiki/Frame_rate[/url]
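For what it's worth, the repetition claim refers to multi-blade shutters in film projectors: each of the 24 frames is flashed two or three times, so the screen flickers at 48 or 72 Hz even though only 24 distinct images appear per second. The arithmetic:

```python
# Film projection: a triple-bladed shutter flashes each frame three times,
# raising the flicker rate without adding new images.
frames_per_second = 24
flashes_per_frame = 3
flicker_hz = frames_per_second * flashes_per_frame

print(flicker_hz)  # 72 flashes per second, still only 24 distinct images
```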
[QUOTE=garrynohome;27695153]You're an idiot. In movies every frame is repeated THREE FUCKING TIMES! Even you can do 24x3. Also go run cod4 at 30fps. The game doesn't use any motion blurring techniques thus it doesn't look smooth to the eyes of a human being. Then put it to 60fps, looks smooth. So yes, games and movies do look different even when the framerate is the same for both.
Sources: [url]http://en.wikipedia.org/wiki/Frame_rate[/url][/QUOTE]
Ok, true, I budge on that point. But I still call spoiled kids on 30 FPS being instasuck. It just isn't. CoD4 may suck at 30 FPS, but on the other hand I've got Crysis feeling fine until the low-to-mid 20s, and then countering that, Just Cause 2 and GTAIV are stupid turds anywhere near 45 FPS. It varies.
I could agree that a 30 FPS minimum is a good way to keep eyerape and stuttering at the door.
Finding better ways to open an argumentative post than "you're an idiot" would be nice too.
[QUOTE=Bomimo;27695408]Ok, true, I budge on that point. But I still call spoiled kids on 30 FPS being instasuck. It just isn't. CoD4 may suck at 30 FPS, but on the other hand I've got Crysis feeling fine until the low-to-mid 20s, and then countering that, Just Cause 2 and GTAIV are stupid turds anywhere near 45 FPS. It varies.
I could agree that a 30 FPS minimum is a good way to keep eyerape and stuttering at the door.
Finding better ways to open an argumentative post than "you're an idiot" would be nice too.[/QUOTE]
Crysis looks... OK below 30 at times. It's due to the game's use of motion blur.