OT: your friend does make an interesting point. If you buy a higher-resolution monitor, your graphics card obviously has to work that much harder to play the newer games.
So the higher the resolution, the shorter the life of your graphics card for 'maxing out' the top games of the time, but even so, I imagine this isn't as pronounced an issue as it would've been 2+ years ago.
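Rough numbers, for what it's worth. The assumption that GPU load scales roughly in proportion to pixel count is mine, and it's only a first-order approximation:
[code]
# Back-of-envelope: fill-rate cost scales roughly with pixel count.
resolutions = {
    "1280x1024": (1280, 1024),
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
    "1920x1200": (1920, 1200),
}

base = 1280 * 1024
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px, {pixels / base:.2f}x the work of 1280x1024")
[/code]
1920x1080 pushes about 58% more pixels than 1280x1024, so the same card has noticeably less headroom at the higher resolution.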
I would personally tell my friend to shut the fuck up and stick to his Xbox. His claims obviously have no backbone. He's hilarious nonetheless.
[QUOTE=BmB;23326842]1200p isn't technically inaccurate though[/QUOTE]
I didn't say it wasn't technically accurate. You could put a p after any fucking resolution your monitor can handle and it would be technically accurate, as they're ALL fucking progressive. The point of 1080p and 720p was to distinguish them from 1080i.
And my point is that adding it anyway shows those people don't understand the actual reason the p is there.
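For anyone who doesn't know the difference, the i/p thing is about how a frame is delivered, roughly like this (my own toy illustration, not any broadcast spec):
[code]
frame = [f"line {i}" for i in range(6)]

# Progressive (p): the whole frame arrives in one pass.
progressive = frame

# Interlaced (i): alternating fields of even and odd lines,
# each field arriving in a separate pass.
field_even = frame[0::2]  # lines 0, 2, 4
field_odd  = frame[1::2]  # lines 1, 3, 5

print(progressive)
print(field_even, field_odd)
[/code]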
[QUOTE=ADT;23326822]GTX460 is clearly a good card for playing (almost) every game at 1080p with settings maxed out.
(When I said almost, I meant don't expect it of Crysis, but who cares about a badly optimized game?)[/QUOTE]
That card can run Crysis maxed. I wish everyone would stop using Crysis as some kind of impossible benchmark. It runs fine.
[QUOTE=liquid_phase;23326797]Full HD is a television broadcast standard. Sure it means 1920 x 1080 in a computing context, but when you see someone write 1200p it shows they know fuck all about the topic. My vitriol isn't aimed at people using 1080p or 720p, it's at people who use random resolutions and append a p to those.[/QUOTE]
Ok. I thought you were moaning at the people saying 1080p/720p. Taking the height of a random resolution and adding p is stupid though.
[QUOTE=Wiggles;23316789]I run most games maxed at 1920x1080 on an ATI Radeon 4870.[/QUOTE]
What are the rest of your specs?
[QUOTE=Xera;23326628]You can only do that if your monitor has the option to disable its scaling too. And a lot don't.[/QUOTE]
I'm pretty sure the graphics driver adds black bars to the image and sends it to the monitor at 1280x1024 so the monitor doesn't attempt to rescale it.
What I meant wasn't disabling the scaling on the GPU, just setting it so it doesn't scale the image:
[img]http://www.shrani.si/f/d/IT/2unB7FH8/scaling.png[/img]
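For the curious, here's roughly what I think the "centered" option does. This is my own sketch of the idea, not ATI's or Nvidia's actual driver code:
[code]
def centered_placement(native_w, native_h, src_w, src_h):
    """Place a lower-res image 1:1 in the middle of the native frame,
    padding everything around it with black instead of stretching."""
    assert src_w <= native_w and src_h <= native_h
    x_off = (native_w - src_w) // 2
    y_off = (native_h - src_h) // 2
    return x_off, y_off

# e.g. a 1024x768 game on a 1280x1024 panel:
print(centered_placement(1280, 1024, 1024, 768))  # -> (128, 128)
[/code]
The monitor still receives a full 1280x1024 signal, so it never kicks in its own (usually blurry) scaler.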
The problem for me is that I have an awful computer, but I have a 22" monitor. So I can only really play games made before 2008 on it... at a max resolution of about 1024x768 when my monitor can support so much more. That's why when I put it into full screen it looks awful.
If this thread is now derailing into a general moan about display stuff, I'd like to vent my frustration at ATI for assuming I want to run my 1920 x 1080 resolution on my KURO at a smaller scale than 1:1 and thus have black bars surround it. Sucks that I have to change this every fucking time I rebuild my HTPC.
Hey, what happened to good ol' 800 x 600?
Fuck that shit, I run Crysis at 150 frames on my 320p.
Only thing that doesn't run maxed is Crysis.
[editline]01:28PM[/editline]
No AA.
Everything else can be maxed though.
Currently saving up for a 3D Nvidia Surround setup, which is a total of 5760x1080 pixels, or 3x 1080p screens.
If this looked shitty, it wouldn't be known to man as the ultimate gaming experience.
The more pixels you fit into a given area, the better it looks, and you can even reach a point where you can't make out individual pixels anymore, which is what very high-end professional screens that need such quality rely on.
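If you want actual numbers, pixel density is just the diagonal in pixels divided by the diagonal in inches. The panel sizes below are my guesses for illustration, not Tools's actual setup:
[code]
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch, from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed sizes, purely for comparison:
print(f'24" 1080p panel: {ppi(1920, 1080, 24):.0f} PPI')  # ~92 PPI
print(f'17" 1080p panel: {ppi(1920, 1080, 17):.0f} PPI')  # ~130 PPI, visibly sharper
[/code]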
[QUOTE=Tools;23328492]Currently saving up for a 3D Nvidia Surround setup, which is a total of 5760x1080 pixels, or 3x 1080p screens.
If this looked shitty, it wouldn't be known to man as the ultimate gaming experience.
The more pixels you fit into a given area, the better it looks, and you can even reach a point where you can't make out individual pixels anymore, which is what very high-end professional screens that need such quality rely on.[/QUOTE]
Try playing minesweeper on it
[QUOTE=AimlessGiant;23327334]That card can run Crysis maxed. I wish everyone would stop using Crysis as some kind of impossible benchmark. It runs fine.[/QUOTE]
It's still probably the most demanding thing you'll be running. So if it does Crysis fine it should do most everything fine.
[QUOTE=BmB;23329885]It's still probably the most demanding thing you'll be running. So if it does Crysis fine it should do most everything fine.[/QUOTE]
I've heard Metro 2033 is more demanding (because it's less optimized or something).
[QUOTE=Mindtwistah;23332608]I've heard Metro 2033 is more demanding (because it's less optimized or something).[/QUOTE]
Yeah, Metro is the new Crysis... until Crysis 2 comes out later this year, though.
[QUOTE=GammaFive;23315898]I enjoy my 1920x1080 monitor.[/QUOTE]
Me too, and especially with my GTX 470.
[QUOTE=Mindtwistah;23332608]I've heard Metro 2033 is more demanding (because it's less optimized or something).[/QUOTE]
Metro 2033 is well optimized for DX11 [B]hardware[/B], but not for DX9 [B]hardware[/B]. So if you have a current card, Metro 2033 [B]in DX9[/B] will run better than Crysis. If not, they'll be about the same.
[QUOTE=Mindtwistah;23332608]I've heard Metro 2033 is more demanding (because it's less optimized or something).[/QUOTE]
Oh yeah, forgot about that.
[QUOTE=BmB;23326842]1200p isn't technically inaccurate though.
Crysis isn't poorly optimized. It's pretty well done actually. Now compare ArmA II, which has a similar level of detail but performs twice as badly. That's badly optimized.
It just really is that demanding.[/QUOTE]
ArmA II renders more things than Crysis.
It's not a problem of bad optimization, it's a problem of "don't fucking render things behind that hill I can't see them".
I've got a 1050p monitor (:downs:), and things look good, even downscaled.
Well, just don't try 640*480.
[QUOTE=pikzen;23341889]ArmA II renders more things than Crysis.
It's not a problem of bad optimization, it's a problem of "don't fucking render things behind that hill I can't see them".[/QUOTE]
Uh, rendering things you can't see IS bad optimisation. ArmA II is a horribly optimised mess.
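To make "don't render things behind that hill" concrete, here's a toy line-of-sight cull over a 1D terrain profile. This is entirely my own illustration of the occlusion-culling idea; it has nothing to do with how ArmA's engine actually works:
[code]
def occluded(camera_h, terrain, obj_x, obj_h):
    """True if terrain blocks the straight line from the camera
    (at x=0, height camera_h) to an object at (obj_x, obj_h)."""
    for x in range(1, obj_x):
        # Height of the sight line at terrain sample x:
        line_h = camera_h + (obj_h - camera_h) * x / obj_x
        if terrain[x] > line_h:
            return True  # a hill pokes above the sight line
    return False

terrain = [0, 1, 2, 6, 2, 1, 0, 0, 0]   # a hill peaking at x=3
objects = [(8, 1), (8, 14)]             # (x, height): a crate and a tall tower

for x, h in objects:
    action = "cull" if occluded(1.8, terrain, x, h) else "render"
    print(f"object at x={x}, h={h}: {action}")
[/code]
The crate at ground level gets culled because the hill sits between it and the camera; the tower is tall enough to peek over it, so it still gets drawn.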
3840x1080 :h:
[QUOTE=Xera;23341975]Uh, rendering things you can't see IS bad optimisation. ArmA II is a horribly optimised mess.[/QUOTE]
You can change the rendering distance. It's not what I'd call badly optimized.
But yeah, they're lazy fucks that couldn't even change the default values.
I've got my 24" 1080p monitor here... With an NVIDIA GeForce 9500 GT...
And all my games. ALL OF THEM. Oblivion, Crysis (yes, I can run it on medium at about 20 FPS), TF2, etc. look perfectly fine. The trick is to click the little button in the video settings of your game that says "Resolution - 1920x1080". But even if you do play it in 720p, the game will still look... Like it's running in 720p. Like it does now.
So if you get the GTX460 (Which is leagues better than the 9500 GT I have!), you'll have no issues.
Stop absorbing any information your friend gives you.
In fact this probably belongs in [url=http://www.facepunch.com/showthread.php?t=956696]Computer illiterate people who think they know things V5 = I FLICK PSU VOLTAGE SWITCH[/url]
So in short, your friend is an idiot.
Even if you can't run at the full resolution, downscaling the res [i]and then running in a window[/i] will keep it nice and sharp.
I'm comfy at 1152x864.
I run TF2 at 720x480 :downs: