[QUOTE=Kopimi;28528283]That makes sense. I'm without much experience in optimizing things like this mainly because I've never made things like this. I just saw it taking up a ton of CPU and figured that was a bad thing. I would still like to limit the framerate, because like DarKSunrise said, I don't want my FPS to be in the thousands, with the game pushing out 700FPS while my monitor is only getting so many of them.
Thanks for the posts though, great advice :buddy:[/QUOTE]
If you don't have a dedicated render thread, you could just sleep for some milliseconds every time you loop and that'd fix it.
[editline]10th March 2011[/editline]
Actually, even if you [B]did[/B] have a dedicated render thread, that would fix the 100% CPU usage. I dunno what I was thinking.
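Something like this, as a rough sketch in plain C++ (Update/Render are placeholders for your real per-frame work, and the 16 ms budget is just an assumed ~60 FPS target):

[cpp]
#include <chrono>
#include <iostream>
#include <thread>

// Runs n fake frames, sleeping away the unused part of each frame's
// time budget; returns the total wall time spent.
std::chrono::milliseconds run_capped_frames(int n, std::chrono::milliseconds target)
{
    using clock = std::chrono::steady_clock;
    const auto begin = clock::now();
    for (int frame = 0; frame < n; ++frame)
    {
        const auto start = clock::now();
        // Update(); Render();  // the real per-frame work would go here
        const auto elapsed = clock::now() - start;
        if (elapsed < target)
            std::this_thread::sleep_for(target - elapsed); // hand the CPU back to the OS
    }
    return std::chrono::duration_cast<std::chrono::milliseconds>(clock::now() - begin);
}

int main()
{
    // ~60 updates per second -> a 16 ms budget per frame
    const auto total = run_capped_frames(5, std::chrono::milliseconds(16));
    std::cout << "5 frames took ~" << total.count() << " ms\n";
}
[/cpp]

sleep_for isn't precise (the OS can oversleep by a few milliseconds), but for "stop burning a whole core" it's plenty.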
Is it possible to restore the depth buffer in OpenGL after clearing it twice? I want to clear the depth buffer before I render every single object in the scene, but when I'm done I want the depth buffer for the entire scene, which is gone because of the clearing calls. :[
One way is to render everything twice... but that's slow.
Holy shit. NEVER pass a negative timestep to Farseer Physics. I'm too lazy to make a gif but I can assure you it is quite similar to Gmod's black hole effect :v:
[QUOTE=Dlaor-guy;28529005]Holy shit. NEVER pass a negative timestep to Farseer Physics. I'm too lazy to make a gif but I can assure you it is quite similar to Gmod's black hole effect :v:[/QUOTE]
Make it anyways. I'm interested.
Made a clone of jallen's gravispark (hope you don't mind, jallen :V )
Still get a large FPS drop when there are large amounts of particles, but it seems to work OK.
[img]http://img818.imageshack.us/img818/526/43481021.png[/img]
BTW I'm hoping to make this into a larger particle engine
[QUOTE=likesoursugar;28528794]Is it possible to restore the depth buffer in openGL after clearing it twise? I want to clear the depth buffer before I render every single object on the scene but when I'm done I want the depth buffer for the entire scene which is gone because of the clearing calls. ? :[
One way is to render everything twise.. but that's slow.[/QUOTE]
twice*
Sorry. You did it twice.
[QUOTE=Darwin226;28529854]twice*
Sorry. You did it twice.[/QUOTE]
heh
[QUOTE=Jallen;28527994]To expand on that, it depends on what type of game you are making.
If you're making a casual 2D RPG then your users might alt-tab, browse a bit, go on MSN or whatever. You don't want to sap their CPUs.
If you're making the next Crysis I doubt people will start bawwing over your game using up CPU power. They'll be bawwing when it's not making effective use of it.[/QUOTE]
I'd freeze or at least slow down (i.e. insert sleeps) the game while out of focus anyway.
[QUOTE=Kopimi;28528283]That makes sense. I'm without much experience in optimizing things like this mainly because I've never made things like this. I just saw it taking up a ton of CPU and figured that was a bad thing. I would still like to limit the framerate, because like DarKSunrise said, I don't want my FPS to be in the thousands, with the game pushing out 700FPS while my monitor is only getting so many of them.
Thanks for the posts though, great advice :buddy:[/QUOTE]
If you want to figure out how well it runs, just display the frame time (how long it takes to update and render a single frame) or the framerate somewhere.
Not saying you shouldn't implement vsync (or limit the framerate otherwise); the preferable thing might be to have it as an option.
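Converting between the two is just the reciprocal; a trivial sketch (the 0.016 s frame time is a made-up number):

[cpp]
#include <iostream>

// Frame time in seconds -> displayed FPS.
double fps_from_frame_time(double seconds)
{
    return seconds > 0.0 ? 1.0 / seconds : 0.0;
}

int main()
{
    // e.g. a frame that took 16 ms to update and render:
    std::cout << fps_from_frame_time(0.016) << " FPS\n"; // ~62.5
}
[/cpp]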
[img]http://filesmelt.com/dl/myprog.png[/img]
My latest work; this one actually does something and is useful!
edit: spelling is off I know that, will fix later
[QUOTE=Raven-Rave;28531434][img_thumb]http://filesmelt.com/dl/myprog.png[/img_thumb]
My latest work; this one actually does something and is useful![/QUOTE]
I like formulas
[QUOTE=likesoursugar;28528794]Is it possible to restore the depth buffer in openGL after clearing it twice? I want to clear the depth buffer before I render every single object on the scene but when I'm done I want the depth buffer for the entire scene which is gone because of the clearing calls.[/QUOTE]
Rather than clearing the depth buffer, you can set glDepthFunc(GL_ALWAYS) while drawing the objects. That way every fragment passes the depth test but still writes into the depth buffer. (Just calling glDisable(GL_DEPTH_TEST) won't work here, since disabling the test also disables depth writes.)
However, when you draw a "farther" object over a "nearer" one that way, it'll overwrite the values in the depth buffer too, which may not be the result you want. I don't think you can depth-cull the depth writes but not the color writes in a single pass (glDepthMask and glColorMask only toggle writes, they don't split the test), except maybe using a pixel shader instead of the fixed-function test.
The object serialisation for file IO in C# is absolutely awesome. I really want something like this in C++
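For the curious, a hand-rolled C++ sketch of the same idea (the Player struct is made up; Boost.Serialization does this properly, with versioning and nested objects):

[cpp]
#include <cassert>
#include <sstream>
#include <string>

// Hypothetical save-game record. C++ has no built-in [Serializable],
// so each type has to spell out its own read/write.
struct Player
{
    std::string name;
    int score = 0;

    void save(std::ostream& out) const { out << name << '\n' << score << '\n'; }
    void load(std::istream& in)        { std::getline(in, name); in >> score; }
};

int main()
{
    Player before{"Kopimi", 700};
    std::stringstream buffer;
    before.save(buffer);          // serialize to any std::ostream

    Player after;
    after.load(buffer);           // deserialize back out
    assert(after.name == "Kopimi" && after.score == 700);
}
[/cpp]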
Don't you just hate it when random zombies start chasing you down the street?
[media]http://www.youtube.com/watch?v=sBgcHtd5dLw[/media]
Gah, where's all my motivation at :colbert:
Decided to brush up on my C++ knowledge a bit, sat down and started work on a simple game engine.
Using SFML for creating a window. On top of that I have states, transitions, threading, events, and simple network code written into it.
I just need some sort of simple game to create in it and I'm all set.
I was thinking a TD, but if any of you have a better idea then I'm all for it, bearing in mind that I've never created a game before and the most I've done with C++ is console applications.
[img_thumb]http://ace.haxalot.com/pictures/random/zscreen/SS-2011-03-10_23.30.52.png[/img_thumb]
Or if you're more of a cornflower blue person
[img_thumb]http://ace.haxalot.com/pictures/random/zscreen/SS-2011-03-11_02.51.13.png[/img_thumb]
Do a tower defense game.
-snip-
Learning OpenGL with OpenGL SuperBible, had to compile GLTools myself.
[img_thumb]http://anyhub.net/file/25D0-superbible.png[/img_thumb]
I [i]overheard[/i] two people talking about wanting careers in video game design today, so I stopped them and asked if they knew how to program and they asked what that meant. :v:
Because you need to be able to program to be a [B]designer[/B].
Anyone else find it hard, when starting out on a project, to commit to VC because you're writing so much new code?
Full in-game intro/teaser for my 2D C++/SDL/OpenGL/GLSL/Lua/FMOD/FreeType/Boost extravaganza.
[media]http://www.youtube.com/watch?v=zUmzlxN5OnQ[/media]
Whole intro was scripted in Lua and then parsed by the game.
I'm working on a chat thingy in Love, got some basic networking stuff done.
[editline]10th March 2011[/editline]
Also, 8.8 earthquake in Japan. I feel sick.
There's a massive tsunami; fucking [b]buildings[/b] are being washed away.
.NET Expression trees are incredibly awesome. I've never used them before, but it took literally 10 minutes to implement compilation in Tangaraular Grapher with them
[img]http://ahb.me/25KT[/img]
Would be nice if there was a good tutorial on it?
[QUOTE=Vbits;28541245]Would be nice if there was a good tutorial on it?[/QUOTE]
You don't need one.
Would a quick how-to be out of the question as well?
[QUOTE=Vbits;28541598]Would a quick how-to be out of the question as well?[/QUOTE]
Expression Trees just let you build ASTs at runtime and compile them. Take this for example:
[csharp]
using System;
using System.Linq.Expressions;

// regular way
int x = 1 + 2 * 3;

// stupid tricks way: build the same 1 + 2 * 3 as an expression tree,
// then compile it into a delegate you can call
var find_x_expression = Expression.Add(
    Expression.Constant(1),
    Expression.Multiply(Expression.Constant(2), Expression.Constant(3)));
var find_x = Expression.Lambda<Func<int>>(find_x_expression).Compile();
int x2 = find_x(); // same value as x: 7
[/csharp]
[QUOTE=Tangara;28540950].NET Expression trees are incredibly awesome. I've never used them before, but it took literally 10 minutes to implement compilation in [b]Tangaraular Grapher[/b] with them
[img_thumb]http://ahb.me/25KT[/img_thumb][/QUOTE]
[img]http://media.giantbomb.com/uploads/2/29661/1095675-futurama_fry_looking_squint_super.jpg[/img]
Expression trees do look interesting. I made my old Brainfuck compiler emit the IL manually, so I might remake it using these.