ARMS is Now Fully Playable at 60 FPS Locked [Nintendo Switch Emulation]
That’s basically what I’m going on about. It’s just massively exaggerated.
Framerate geeks have pretty much become the audiophiles of hardware communities.
Oh boy, I remember when everyone went "60fps is the new standard, 30fps sucks and is unplayable" like it was yesterday; now we're at "100fps+ is good, 60fps is choppy and fucking sucks".
It is to me, so why are you saying it isn't if it is?
To be fair, I wouldn't personally recommend it, but if they don't play too competitively, prefer visuals over performance, and are alright with that, I don't see why not. People get high-end hardware to make the best of their games, after all, whether it's a single-player or multiplayer experience, and if their ideal experience is playing something like CS:GO at 8K resolution and they don't mind frame dips, then more power to them.
Don't get me wrong, there's an objective advantage in visual clarity when playing at high refresh rates (I got a 144hz monitor for a reason, after all), but not everyone is in the whole sweaty 'I need 0 input lag and everyone to look chunky, so I'll play at 4:3 and make the game look like it's made of clay at 1000fps' competitive scene. And at the end of the day, despite being an objective advantage, it's not the deciding factor in whether someone will be good or bad at a game. It's all about what they're used to, how they adapt, and what they're comfortable with. It's all about having fun and playing in your comfort zone.
I agree for singleplayer games, or stuff that's more about eye candy than about super stable performance that doesn't hinder you (like, well, any singleplayer FPS game, Rage 2 for example). Not Doom, IMO, because the more fluidly that game plays, the better the experience, provided you don't make it look like dogshit.
In competitive games, however? No way. You're just setting yourself back a good few notches in favour of visual upgrades, which doesn't make sense in fast-paced FPS games where pretty graphics get you nothing, seeing as you're busy paying attention to enemies instead.
In Siege, for example, it's the difference between knowing what a corner peek did to you, and going "what the fuck was that?" because someone had a clear advantage over you and was way too fast for you to even see.
People who play competitive games won't have any fun if they're losing. And most won't look back once they get into 144hz territory, which btw isn't hard to reach in games like CSGO.
It's part of the deciding factor because it's the norm for most people into it. It isn't for games that can barely hold a casual multiplayer because they're yearly releases, but it is for games with established pro scenes.
I tried BO4 a while ago at 60 fps and it looked like a choppy mess. It reminded me of when I had a bad PC and wanted to play new games at very high settings but couldn't reach 60 fps. The jump from 60 to 100 or 144 is the equivalent.
In the end, 60fps is doable for games that don't require fast reaction times and don't have extremely fast-paced gameplay. Otherwise, it won't feel good at all.
I mean, when did 60fps become "the new standard"? Games (not all, of course) have been 60fps since basically the inception of gaming.
My point is that some people do like the eye candy even in competitive environments, and some aren't too bothered about losing that slight performance and gameplay advantage in exchange for a pretty presentation.
I won't declare myself a great player, but yeah, I have played Siege regularly in the past and reached decent ranks, within the Gold 1 to Platinum 2 range. The game also looks pretty, and I consider a pleasing aesthetic a nice addition to my playing experience. I play it on max settings, minus ambient occlusion and lens flare, with FXAA as my AA solution. I like to think I did fine for the most part, and most of my deaths and mistakes were down to my own lack of awareness or to being outsmarted or outskilled by my opponent. I never had any real moment where I could blame my performance or graphics settings, nor did I ever fail to see someone because eye-candy effects got in the way. I owe a lot of wins to map knowledge, good team communication and thinking outside the box, rather than to my reflexes and my framerate.
Losing in a competitive environment never feels great, sure, and the communities established around these games reinforce that. However, despite how vocal the competitive scene is, a large chunk of these games' playerbases is still casual, bar a few exceptions that have aged significantly and barely attract a new audience, like the Smash Melee community. These games still need to sell, especially the constantly evolving ones. No game dev wants to cut themselves off from a growing audience's money; they're not gonna make a bunch of people run to the store and buy a new monitor just because they'd otherwise feel handicapped. Pros codify the scene, but the casual base keeps it alive.
And as others mentioned before, this all depends on a person's adaptability. People tend to exaggerate how bad it is to go down from a higher refresh rate, mostly because they don't give themselves time to adjust; they play something for half an hour and declare it unplayable. I can play a console shooter at 30 or 60fps, go back to my PC at 100+fps, and do fine on both after playing for an extended period. A higher refresh rate is an advantage and a good boost to a player's experience, but it's not a necessary factor in a person's skill. A bunch of oldies in the arena shooter/Quake scene will tell you that their frags and pixel aim don't come from raw reaction time and seeing the opponent first, but from prediction and timing.
Once again, at the end of the day, it's all about what someone is comfortable with. A persistent and motivated person will always find a way to make something work despite any limitations, intentional or not, and while extra visual clarity from better hardware and performance is a boon, it's nowhere near as important as actually learning the game and getting used to it.
I'd argue that outside of certain genres, it's easy to get used to a lower framerate. I can't play first or third person shooters with a mouse at less than ~120 fps or else my input just feels like garbage. I can play Splatoon fine with Gyro, I can play a console game with a stick.
Pretty much any other genre is fine at 60. I always prefer higher, but a stable 60 is perfectly viable for everything else for me.
I can see the desire for a competitive fighter to also hit those high framerates, because it gives the developers double the frames to work with for move timing and the like, as well as more responsive input (assuming they aren't working with an input buffer, like many games do).
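Tangent for the curious: an input buffer is usually just a short rolling window of recent frames. A rough sketch of the idea in Python (the 8-frame window and all the names here are made up for illustration; real engines are more involved):

```python
from collections import deque

BUFFER_FRAMES = 8  # hypothetical window; real games tune this per move

class InputBuffer:
    """Toy input buffer: a button press stays 'usable' for a few frames,
    so a slightly-early input still comes out as the intended move."""

    def __init__(self, window=BUFFER_FRAMES):
        self.frames = deque(maxlen=window)  # one set of pressed buttons per frame

    def tick(self, pressed):
        """Call once per game frame with the buttons pressed that frame."""
        self.frames.append(set(pressed))

    def consume(self, button):
        """True if 'button' was pressed within the window; the press is then
        eaten so one physical press can't trigger the same move twice."""
        for frame in self.frames:
            if button in frame:
                frame.discard(button)
                return True
        return False
```

The framerate angle: that same 8-frame window is ~133ms at 60fps but only ~67ms at 120fps, so doubling the framerate halves the coarseness of every timing window.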
If you're playing a game at 60 on a 144hz monitor, OF COURSE it's going to look like a choppy mess: 60 doesn't divide 144 evenly, so you either get a fuckload of tearing or, with vsync, juddery frame pacing as frames alternate between two and three refreshes on screen. The game itself was no doubt running perfectly fine, but without proper frame sync there was probably just enough tearing to fuck with you. Or the game has generally bad frame pacing, which I'm not led to believe it does outside of a few edge-case rendering scenarios.
I'm on a 144hz display right now, and have been for about two years. Is 144hz nice? Yes. When a game hits it, it looks wonderful; it doesn't necessarily improve the quality of the gameplay, but it just looks great. Minimised motion blur is lovely, almost zero ghosting is great. But 72 (the next-best divisor, since 60 doesn't go evenly into 144) is perfectly fine too. As long as you're hitting divisors of your refresh rate, everything should feel good.
The only problem you should be having with 144hz is when a game suddenly locks itself to something like 60, as quite a few do for cinematic moments (looking at you, DOOM). That really is noticeable and fucks with you if you're playing at 144. But playing at 60 from the get-go? Should be fine, short of the tearing.
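To make the "divisors of your refresh rate" point concrete, here's a throwaway Python check (nothing game-specific, just the arithmetic) listing the frame caps that pace evenly on a display, meaning each game frame sits on screen for a whole number of refreshes. Note that 60 appears for 120Hz but not for 144Hz:

```python
def even_caps(refresh_hz):
    """Frame caps where every frame is held for a whole number of refreshes."""
    return [refresh_hz // n for n in range(1, refresh_hz + 1) if refresh_hz % n == 0]

print(even_caps(144))  # [144, 72, 48, 36, 24, 18, 16, 12, 9, 8, 6, 4, 3, 2, 1]
print(even_caps(120))  # [120, 60, 40, 30, 24, 20, 15, 12, 10, 8, 6, 5, 4, 3, 2, 1]
```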
This is why I'm not willing to get a 144Hz display. I might upgrade to a 120Hz display at some point, but if I were to go any higher, I'd wait for at least 180Hz to become commonplace, if not 240Hz. I still don't get why 144Hz became a thing in the first place; it's only a 20% jump up from 120Hz and it screws with the whole base of 60. At the very least, they should go in multiples of 30. And 150Hz was only six hertz higher...
I'll also add onto this by saying that playing at a stable lower fps is, by a fair margin, better than playing at a higher fps with frequent dips. If I can't get a game running smoothly at 60fps even at mid settings, I'd much rather crank up the settings and limit it to 30 to get a better-looking game and a stable framerate.
Is 60/144/whatever better than 30? Sure, but if I can't get it stable then give me lower fps any day.
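And if a game has no built-in cap, "limit it to 30" is easy enough to do by hand. A crude sketch (render_frame here is just a stand-in for whatever does the real work; proper limiters busy-wait the last millisecond or so instead of trusting sleep):

```python
import time

def run_capped(render_frame, fps_cap=30, frames=300):
    """Crude frame limiter: render, then sleep off the rest of the frame budget.
    Giving every frame the same budget is what makes the pacing feel stable."""
    budget = 1.0 / fps_cap
    deadline = time.perf_counter() + budget
    for _ in range(frames):
        render_frame()
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        deadline += budget  # fixed schedule: one slow frame doesn't drift the rest

run_capped(lambda: None)  # dummy workload, runs ~10 seconds at 30fps
```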
144 always did strike me as a weird refresh rate. I get why the panel manufacturers went for it; it's higher than 120, after all. But just... fucking... hit up 240. 144 is super weird to divide by: 144 / 72 / 36 is not normal.
If I had to guess, 144Hz may have been chosen because it's a multiple of 24 FPS (144 = 6 × 24), which is what movies (and bad AAA "cinematic" games) run at, and people might wanna use their monitors to watch stuff. Of course, 60 not going into 144Hz is less of a big deal with adaptive sync being a thing nowadays, or the fact that you can just set a 144Hz display to 120Hz to begin with.
Which is probably what I would do even if I did get a 144Hz monitor; I'd keep it at 120Hz if whatever I was doing didn't just use adaptive sync outright.
120Hz is just perfect IMO.
It divides neatly with all relevant framerates.
60, 30, 24
(fuck 25 & 50, you don't belong in this world)
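Spelled out, since the divisibility thing keeps coming up in this thread, the whole argument fits in a two-liner (plain arithmetic, no assumptions):

```python
for hz in (120, 144):
    print(hz, [fps for fps in (24, 25, 30, 50, 60) if hz % fps == 0])
# 120 [24, 30, 60]
# 144 [24]
```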
It's not their fault they'd been given flesh. They were called here by Europeans.
I bet you prefer superior American framerates, such as:
23.976
29.97
59.94
You broke the joke you clod.
Joke breaking aside: 1,198,800Hz is the optimal refresh rate for monitors, as it's evenly divisible by every common framerate (23.976, 24, 25, 29.97, 30, 50, 59.94, 60, 120, 144)
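Checking the homework with Python's exact rationals: treating 23.976/29.97/59.94 as the exact decimals listed, the least common multiple of all ten rates works out to 1,198,800. (Use the true NTSC fractions like 30000/1001 instead and it drops to a mere 360,000Hz.)

```python
from fractions import Fraction
from math import gcd, lcm  # multi-argument lcm/gcd need Python 3.9+

rates = ["23.976", "24", "25", "29.97", "30", "50", "59.94", "60", "120", "144"]
fracs = [Fraction(r) for r in rates]  # exact, e.g. 23.976 -> 2997/125

# LCM of fractions = lcm of the numerators / gcd of the denominators
result = Fraction(lcm(*(f.numerator for f in fracs)),
                  gcd(*(f.denominator for f in fracs)))
print(result)  # 1198800
assert all((result / f).denominator == 1 for f in fracs)  # every rate divides it
```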
What is even the point of those? Who thought it would be a good idea to chop literally 0.1% off of common framerates and make that a new standard?
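As I understand it, analog NTSC colour TV had to nudge the field rate down by a factor of 1000/1001 to keep the colour subcarrier from interfering with the audio carrier, and those ~0.1%-slower rates have haunted video standards ever since:

```python
for base in (24, 30, 60):
    print(f"{base} * 1000/1001 = {base * 1000 / 1001:.3f}")
# 24 * 1000/1001 = 23.976
# 30 * 1000/1001 = 29.970
# 60 * 1000/1001 = 59.940
```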
I've played on a 60hz monitor after being on a 144hz one (I'd sent it in for warranty over dead pixels), and it literally made me stop playing until the high refresh rate monitor came back.
60hz is so noticeable to me now that anything lower than 100 is absolutely jarring.
To me it's more of a desktop thing. I can get used to games at 60 fps, and 30 fps takes me a bit longer but it's possible, but a 60hz desktop just feels weird to me now and I can't go back.
Same. I play so many old games that need a specific framerate (typically 60) to function properly that I'm used to 60 fps games on my 144hz monitor.