Is this a common problem? Whenever I enter a forest or any similar area, my FPS just tanks; I average around 20 to 25 FPS. I have a 660 Ti and a pretty beefy CPU, and I can run Brokenfield 4 at 60+ FPS on ultra settings, and that game is broken as fuck. So logic dictates that Rust should run fine. I was just wondering if there is a broken setting right now that causes this, or any quick fixes, because I am loving the new branch of the game so far (besides the OP metal armor D:<)
Press F2 and turn everything off. Even makes the game look better.
Yeah, I have tried that, and I do somewhat agree that it looks better; that motion blur is abysmal. But I still get pretty crappy FPS. I am playing on the Good preset right now, and it makes me sad if I'm going to have to go lower to play.
Memory is also a problem; you need 8 GB to play until the memory leaks are fixed.
[QUOTE=klondike;46173367]Memory is also a problem, need 8gb to play until memory leaks are fixed.[/QUOTE]
I have 8 gigs D:! This is such a shame, maybe Rust just hates my computer :(
I have also been experiencing frame rate drops in forests, especially arctic ones since yesterday.
I think it's due to more trees being moved over to SpeedTree. I expect performance will improve once all the trees are moved over.
[QUOTE=seandewar5;46173485]I have also been experiencing frame rate drops in forests, especially arctic ones since yesterday.
I think it's due to more trees being moved over to SpeedTree. I expect performance will improve once all the trees are moved over.[/QUOTE]
That's what I was thinking too. I was just making sure this was happening to other people and that demons hadn't possessed my computer.
I have 3 GB of memory and I can still run the game decently on Fastest with everything off. I also get frame drops near forests, from 45-60 down to 25-40, so it's not horrible, but it's noticeable.
[QUOTE=Thor-axe;46174091]I have 3 GB of memory and I can still run the game decently on Fastest with everything off. I also get frame drops near forests, from 45-60 down to 25-40, so it's not horrible, but it's noticeable.[/QUOTE]
lol, 24 fps is never okay in a game like this, especially if I have to make it look like shit just to achieve that. I guess I will wait a few weeks for some optimization to happen, because it sounds like this is a pretty common problem.
I'm pretty sure that's because of the tree textures being borked at the moment, or there is some memory leak related to them.
[QUOTE=UrbanSasquatch;46174173]lol, 24 fps is never okay in a game like this, especially if I have to make it look like shit just to achieve that. I guess I will wait a few weeks for some optimization to happen, because it sounds like this is a pretty common problem.[/QUOTE]
Considering the frames and performance I used to get, it's actually pretty damn amazing, being able to play the game for real.
It appears the latest experimental dev patch improved the framerate in forests for me!
I think it's due to the patch that made the trees cheaper to animate that appeared on RustUpdates two hours ago.
Can anybody else who is on the Steam Rust experimental dev branch confirm?
[QUOTE=seandewar5;46174860]It appears the latest experimental dev patch improved the framerate in forests for me!
I think it's due to the patch that made the trees cheaper to animate that appeared on RustUpdates two hours ago.
Can anybody else who is on the Steam Rust experimental dev branch confirm?[/QUOTE]
I am at work right now but I will check later tonight. If that is true then that is wonderful!
I've just had a play... until I got eaten by a wolf. In a forest I get 9-12 fps; outside of forests, 30+ fps.
I don't know what the fuck is wrong with the trees, but they eat FPS like hell. Also, shadows, ambient occlusion and full reflections are really expensive. Without those, on the Fantastic preset, I get ~40 fps except when deep in a forest.
I have a laptop with a GTX 660 and an i7 3610 @ 2.3 GHz.
[QUOTE=UrbanSasquatch;46174173]lol 24 fps is not okay ever in a game like this.[/QUOTE]
Why's that? (Considering 24 fps is the industry standard for movies, and it was okay in every film I've ever watched.)
Play a game at 24 fps, then get back to us. Also, did you see The Hobbit in the 48 fps format? You could see more detail than you otherwise could.
[QUOTE=neil.hillman;46181097]Why's that? ... (considering 24 fps is the industry standard for movies, and was okay in every film I ever watched)?[/QUOTE]
Because games and movies are not the same thing. A movie depends heavily on static shots and points of focus, so even when the camera is moving around, your eyes are locked on one point. That said, films like The Hobbit have experimented with 48 fps, but 24 remains the standard.

In a video game, you are in a sense the camera, so a jerky frame rate is a lot more noticeable; you can never compare FPS in a movie to FPS in a game for that reason. Try a little experiment: take a camera from about five years ago (most smartphones shoot at around 40 to 60 fps now anyway), set it to record, spin in circles a few times, then review the footage; it's going to look choppy and shitty. There is a reason all games try to hit at least 30 fps, and 60 if they can. If 24 were good enough, all games would run at that, and you would get crucified today for shipping a game that runs sub-30 fps. I promise you; I work in the industry.
I appreciate the dream of 60 fps, and I'd love to have 30, but the reality is I can play Rust fine at 12-16 fps. Yeah, maybe it makes a difference in competitive first-person shooters, but for casual play in a game that is intended to be a survival game with FPS elements, I really don't think we NEED better rates.
We definitely need better than 16 FPS. I don't think the game will feel right at such low FPS rates.
I can't imagine an AAA game where a person with a good computer can't get 30 FPS. They have to optimize it first; after that, FPS should be decent on good PCs.
Getting the same ~20 fps with a GTX 780 in forests. I've been trying to figure out what's bottlenecking it: CPU usage is like 90%, RAM usage isn't that high, GPU usage is only around 40% when it's lagging like this, and VRAM usage is 2545 MB out of 3 GB. I have no idea how it can lag.
[QUOTE=TNOMCat;46189440]Getting the same ~20 fps with a GTX 780 in forests. I've been trying to figure out what's bottlenecking it: CPU usage is like 90%, RAM usage isn't that high, GPU usage is only around 40% when it's lagging like this, and VRAM usage is 2545 MB out of 3 GB. I have no idea how it can lag.[/QUOTE]
I have worked with Unity a bit and I can say they modded the shit out of it to make this game a thing. There is for sure some bad code in there; I can guarantee it will be fixed in time, they just need to find out what it is. My guess is it has something to do with the fact that all of the trees are dynamic now (they are not permanent objects in the world); they could be causing some crazy memory leaking problems if they aren't coded right. I can say that even in the last 4 days my performance has improved a ton, so they are on the right track for sure.
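Rust itself is Unity/C#, so this is only a conceptual Python sketch of the kind of problem being guessed at above: if dynamic objects like trees are spawned and destroyed constantly without being reused, allocations pile up every frame, while a simple object pool keeps the total allocation count flat. The `TreePool` class here is hypothetical, not anything from Rust's actual code:

```python
# Hypothetical sketch: reusing retired objects from a pool instead of
# allocating a fresh one every frame.
class TreePool:
    def __init__(self):
        self.free = []    # retired objects, available for reuse
        self.active = []  # objects currently "in the world"

    def spawn(self):
        # Reuse a retired object if one exists; allocate only as a last resort.
        tree = self.free.pop() if self.free else object()
        self.active.append(tree)
        return tree

    def despawn(self, tree):
        # Return the object to the pool instead of dropping it.
        self.active.remove(tree)
        self.free.append(tree)

pool = TreePool()
for _ in range(1000):  # simulate 1000 "frames" of spawn/despawn churn
    t = pool.spawn()
    pool.despawn(t)

# Only one object was ever allocated across 1000 cycles, not 1000.
print(len(pool.free) + len(pool.active))  # 1
```

Without the pool (a new object each cycle that's never released), the same loop would churn through a thousand allocations, which is the kind of behavior that shows up as memory growth and stutter.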
[QUOTE=oXYnary;46182280]Play a game at 24 fps, then get back to us. Also did you see The Hobbit in the 48fps format? You could see more detail than you otherwise could.[/QUOTE]
Hollywood isn't switching to 48 frames per second for a reason. In games, higher frame rates are definitely better at all times, because you need to see every little thing. In movies, 48 fps looks absolutely horrid; it makes things look sped up and off. I don't know the science behind it, but I'm sure you can find quite a few articles if you google it.
tl;dr Cinema and videogames aren't the same
EDIT: Here's an example of how I think this works. Imagine if, when you fired a weapon or swung a tool in a game, every frame and detail were captured bit by bit for maximum detail and then sped up to match the motion the tool is supposed to have. It would look awful and feel extremely unnatural; you have probably seen animations like what I am describing. In most animations that feel solid and look great, you'll notice there are only a limited number of positions in which you can see something as it moves. This reads as more realistic because of the way your eyes capture motion, especially with multiple things moving. For example: when shooting a bow, you cannot track the arrow in flight the entire time unless it is at a great distance (and probably silhouetted). If you are watching from close range, you will either just see it hit the target, or glimpse multiple parts of its flight as you try to track it with your eyes. I could be wrong on this; I don't know everything, lol. But maybe that's why it is the way it is.
The minimum threshold usually cited for the perception of fluid motion is around 18 fps.
I think glitched-out trees with a memory leak slowing down your computer is pretty much a no-brainer.
You guys think they aren't aware of this? This SpeedTree 6.0 they're using is either as much in development as Rust is, or they just haven't finished implementing it.
I am 99.9% sure this is one of the main things they are working on ATM.
[editline]9th October 2014[/editline]
I have an older CPU, a single GPU that rivals an R9 290, and 12 GB of DDR3.
CPU: AMD Phenom II X6 1055T (Thuban) @ 3.57 GHz
GPU: AMD Radeon HD 7950 3 GB
RAM: 12 GB DDR3
I turned graphics down from Fantastic to Good, and I get between 30-40 frames in open areas and 25-30 in forests or in heavily built-up or otherwise graphically intense areas.
If I turn everything up it's just a shit show, about as playable as a game of chess against Deep Blue.
Legacy is always maxed at 60 frames, unless an admin has gone ham and built a 120-story base near small rad and there are 50+ people running around; then it starts chopping a bit.