• What are you working on?
    5,004 replies
[QUOTE=WTF Nuke;49639197]Also, when I imported things from Blender, it used the wrong axis for gravity so everything got pulled to the left rather than down.[/QUOTE] [QUOTE=Fourier;49639268]Unity also has problems with Blender (switched axis)[/QUOTE] This is trivial to solve, and most of the fleshed-out exporters in Blender have an axis conversion option. Blender itself is right-handed Z-up, iirc.
Yeah I know about that option, which is why I was so confused when changing it did nothing. This was a couple of years ago so maybe it got fixed along the way. But the lack of character controller is a real bummer. I remember I used a rigid body oval instead and whenever the character spawned they tipped over and fell, only able to roll around on one axis.
[QUOTE=WTF Nuke;49639881]Oh, that's a shame :/ Is there a nice alternative? Because from a developer perspective PhysX is just so so nice.[/QUOTE] [url]https://github.com/RandyGaul/qu3e[/url] [url]http://newtondynamics.com/forum/newton.php[/url] [url]http://www.ode.org/[/url] I worked a bit with a .NET binding of Newton, it was nice.
I'm working on a game for the GGJ; I've slept no more than an hour in the last 2 days. I'm surprised I'm still awake. :V We've got some problems working in Unreal, but I hope I can show you guys something cool tomorrow or the day after.
[QUOTE=WTF Nuke;49639954]Yeah I know about that option, which is why I was so confused when changing it did nothing. This was a couple of years ago so maybe it got fixed along the way. But the lack of character controller is a real bummer. I remember I used a rigid body oval instead and whenever the character spawned they tipped over and fell, only able to roll around on one axis.[/QUOTE] You can fix this by locking the rigidbody so it only rotates on one axis. [URL="https://gamedev.stackexchange.com/questions/22319/how-to-disable-y-axis-movement-in-the-bullet-physics-engine"]Here's a Stack Exchange question addressing the problem.[/URL] If I end up finding some of my old source code I can show you how I pulled off my own character controller using only rigidbodies (because documentation on how to use the character controller in Bullet is nowhere to be found).
[QUOTE=WTF Nuke;49639881]Oh, that's a shame :/ Is there a nice alternative? Because from a developer perspective PhysX is just so so nice.[/QUOTE] Isn't Havok free? or is that for something else.
[img]http://i.imgur.com/5kckMGH.png[/img] cabin interior for duck hunt simulator 2016
[QUOTE=thatbooisaspy;49640388]You can fix this with locking the rigidbody to only rotate in one axis. [URL="https://gamedev.stackexchange.com/questions/22319/how-to-disable-y-axis-movement-in-the-bullet-physics-engine"]Here's a stackoverflow problem addressing the problem.[/URL] If I end up finding some of my old source code I can show you how I pulled off my own character controlling only using rigidbodies (because the documentation on how to use the character controller in bullet is nowhere).[/QUOTE] But you run into other issues with a rigid character controller, like getting tossed around by other rigids, drifting down slopes, slowing down when going up slopes, stairs are a bitch, etc. [QUOTE=nomad1;49640463]Isn't Havok free? or is that for something else.[/QUOTE] It is, for commercial projects under $5k, but I remember comparing Havok's documentation with PhysX's and it paled in comparison. I think I'm just gonna stick with what I know for now; I don't really want to jump more hurdles.
[QUOTE=WTF Nuke;49640506]But you run into other issues with a rigid character controller, like getting tossed around by other rigids, drifting down slopes, slowing down when going up slopes, stairs are a bitch, etc. It is for commercial under $5k, but I remember trying to compare Havok's documentation with PhysX and it paled in comparison. I think I'm just gonna stick with what I know for now, don't really want to jump more hurdles.[/QUOTE] If people can manage to write good character controllers in Unity using rigidbodies (I did it once before), then there's no reason those same issues couldn't be solved in Bullet. And I agree with your point: if what you're doing right now isn't causing you much trouble, don't change it, because then you enter the realm of scope creep.
Given this: [code]
class Timer {
public:
    Timer() : beg_(clock_::now()) {}
    void reset() { beg_ = clock_::now(); }
    double elapsed() const {
        return std::chrono::duration_cast<second_>(clock_::now() - beg_).count();
    }
    void fps_wait() const {
        // elapsed_fps() is the same as elapsed() but in terms of fps_
        auto towait = fps_(1.0 - elapsed_fps());
        // Gotta use system_clock here, doesn't seem to work with high_resolution_clock
        std::this_thread::sleep_until(std::chrono::system_clock::now() + towait);
    }
private:
    typedef std::chrono::high_resolution_clock clock_;
    typedef std::chrono::duration<double, std::ratio<1> > second_;
    typedef std::chrono::duration<double, std::ratio<1, 30> > fps_;
    std::chrono::time_point<clock_> beg_;
};
[/code] And this: [code]
Timer t, ot;
unsigned char x = 0;
do {
    t.reset();
    // do something that takes time
    if (x == 0) {
        // print out the timer, ot (outer timer), and current system time.
        // That will show how long this iteration took, how long the last
        // 30 iterations took, and the current time.
        ot.reset();
        x = 30;
    }
    x--;
    t.fps_wait();
} while (1);
[/code] Why does each printout come ~1040ms apart, and not ~1000ms? 40ms is a lot of error margin for a clock that has nanosecond precision. If I change fps_ to be 60 or even 2, it will always be ~30-40ms over the target. Interestingly, if I set fps_ to 31 but don't change the loop counter, it is regularly between ~990-1000ms. What am I missing here? :/
Sounds like scheduling delays; the function only guarantees to sleep [B]at least[/B] that long. Try sleeping for less than the desired amount of time, and then busywait the rest.
[QUOTE=cartman300;49640723]Sounds like scheduling delays, the function can only guarantee to sleep [B]at least[/B] N amount of time. Try sleeping for less than the desired amount of time you want, and then busywait the rest.[/QUOTE] It seems the <chrono> library is very flawed. I've tested a lot of different variations, and the result is always off by a lot. I've had to resort to just sitting around in a while loop until the required time has elapsed... There has got to be a better way to do this than eating cpu cycles
I don't have much to show (it would still just be a cube), but I made some huge changes to my engine architecture :dance: I've always enjoyed Unity's OOP approach, and since it's what I've used the most (though UE4 does the same thing), it's what I copied. My "architecture" before used to be something along the lines of: [CODE]MAIN LOOP -> RENDERER -> BASEOBJECTS -> MODELOBJECT[/CODE] With this rewrite, the main loop is now centered around a state machine which handles all logic. It's implemented by the engine class, which also holds all of the systems. Systems are objects which update components such as graphics and physics; implementing that logic anywhere else (like in the components themselves) would be terribly inefficient and too much of a hassle. All entities are created through a factory which keeps a list of every entity created, so systems can search it for the components they need. For example, in my graphics system this is how I grab all of the entities with a MeshRenderer attached in order to draw them: [CODE] std::vector<MeshRenderer*> renderobjects = ObjectFactory::FindComponentsOfType<MeshRenderer>(); [/CODE] Component pointers can be grabbed akin to Unity: [CODE] test->GetComponent<MeshRenderer>(); [/CODE] I'm really happy with how this turned out; it's a huge improvement over my old system of hacked-together pieces of cardboard with "[I]I tried[/I]" drawn on them. Now I can sleep happily knowing I didn't make a monstrosity of my engine architecture, and it will be [I]really[/I] flexible now. There are some other small improvements, like making meshes and textures more OOP-oriented, but that also means I took a step back because there is no more instancing; I'll fix that soon now that I have components.
Here are the resources I used if you're interested: [URL="http://gamedevgeek.com/tutorials/managing-game-states-in-c/"]Managing Game States in C++[/URL] [URL="http://gameprogrammingpatterns.com/state.html"]State - Design Patterns Revisited[/URL] [URL="https://eliasdaler.wordpress.com/2015/08/10/using-lua-and-cpp-in-practice/"]Using Lua with C++ in practice. Part 1. Intro to ECS and basic principles[/URL] [URL="http://www.randygaul.net/2013/05/20/component-based-engine-design/"]Component Based Engine Design[/URL] Now to focus back on graphics; now that I've tackled deferred rendering, I want to take a crack at shadow maps and implementing image effects. More fun! :v: Edit: just went through the most painful hour of my programming life, debugging a serious issue in my design that would actually create an instance of a component each time GetComponent() was called. C++ casts are fun.
normals generated with a 5x5x5 kernel [t]http://i.imgur.com/sQVrYF8.png[/t] pixel art lighting [img]http://i.imgur.com/2zsRLpl.gif[/img] going to reimplement shadows soon
So I've been out of the game/simulation programming stuff for a few years. Decided yesterday to take a shot at using Game Maker (yeah, I know..) This is coming along nicely though. Maybe soon I'll get back into C++ [vid]http://webm.land/media/91Ny.webm[/vid] (Obviously some of this is just for debugging) ..It's not much.. but it's somethin
[QUOTE=DubsHelix;49641479]So I've been out of the game/simulation programming stuff for a few years. Decided yesterday to take a shot at using game maker (yeah I know..) This is coming along nicely though. Maybe soon I'll get back in to c++ -vid- (Obviously some of this is just for debugging) ..It's not much.. but it's somethin[/QUOTE] [img]http://i.imgur.com/AdqSnse.jpg[/img] That reminded me of one of my first Game Maker projects. GM has lost most of its reputation as a low-tier tool. Don't feel bad about it :) [editline]31st January 2016[/editline] [url=http://sandbox.yoyogames.com/games/29491-goo-the-demo]oh wow, it's still live![/url]
[QUOTE=thatbooisaspy;49641216]stuff [CODE] auto renderobjects = ObjectFactory::FindComponentsOfType<MeshRenderer>(); [/CODE] [/QUOTE] Auto is your friend!
[QUOTE=Sidneys1;49640987]It seems the <chrono> library is very flawed. I've tested a lot of different variations, and the result is always off by a lot. I've had to resort to just sitting around in a while loop until the required time has elapsed... There has got to be a better way to do this than eating cpu cycles[/QUOTE] Sleeping is very system-specific because it's up to the OS's scheduler to decide which thread gets to run on which processor and for how long. If you tell the OS that your thread can sleep for 10 milliseconds, it may not schedule your thread for another 20 milliseconds. There are platform-dependent ways of adjusting the fidelity of the scheduler, which would allow more precise sleeping. They're used by multimedia applications (and notably by some browsers). But it often affects the processor use of every program that's running, which consumes laptop batteries like hell and causes other issues. Spinning is the most accurate way of waiting for a specific time. edit: And using [i]sleep_for[/i] for sleeping a specific duration is better than [i]sleep_until[/i] because the latter waits until the clock has reached the given time point -- which may never happen if someone adjusts your system clock backwards.
my game jam game ended up being way better than I'd ever have imagined. It's a cross between Harvest Moon and a bullet hell shoot 'em up: during the day you farm crops and animals to make money to buy items from caravans that come by at set hours of the day, then at night you try to survive an onslaught of monsters. The more days you last, the longer the nights and the harder the enemies you fight. I'm trying to make the days fun, because at the moment it's pretty much just waiting for night to start so you can hit flying skulls with your axe while laughing at the randomly generated names of the caravan owners. [editline]31st January 2016[/editline] and I haven't slept in 40 hours, I went hard last night trying to implement the last of the main features, I'm so wrecked
[QUOTE=Sidneys1;49640987]It seems the <chrono> library is very flawed. I've tested a lot of different variations, and the result is always off by a lot. I've had to resort to just sitting around in a while loop until the required time has elapsed... There has got to be a better way to do this than eating cpu cycles[/QUOTE] Why would you sleep anyway? Your application should really never sleep, unless there's a really really good reason for it to do so.
[QUOTE=mastersrp;49642504]Why would you sleep anyway? Your application should really never sleep, unless there's a really really good reason for it to do so.[/QUOTE] Avoiding burning CPU time unnecessarily is important especially for interactive applications like games, people don't like it if their game eats 100% CPU while it's actually idling logically. I mean, would you like it if your browser constantly ate 100% CPU for instance?
[QUOTE=Sidneys1;49640987]It seems the <chrono> library is very flawed. I've tested a lot of different variations, and the result is always off by a lot. I've had to resort to just sitting around in a while loop until the required time has elapsed... There has got to be a better way to do this than eating cpu cycles[/QUOTE] Precision depends on the implementation, in Visual Studio the std::high_resolution_clock was just an alias for std::system_clock until very recently. There is no guarantee for nanosecond precision, only that it uses the most precise timer available.
[QUOTE=mastersrp;49642504]Why would you sleep anyway? Your application should really never sleep, unless there's a really really good reason for it to do so.[/QUOTE] Generally if I sleep it's because I need to give an external API time to handle something.
Hi all, nice to see so many familiar faces having last posted here >1 year ago! Just wanted to check in; I'm now at Uni studying Computer Science (and I could rant for hours about how they're teaching programming...) and spending a decent amount of time working on personal projects. I'm also now running programming tutorials for various people as well as a sort of home hardware repairs business. Project wise, most importantly, I now have a [URL="http://sevenspace.co.uk"]shiny new domain[/URL] which I'll be using as a portfolio and a platform from which to market my hardware/software services. Currently it looks like crap because since buying it I've been dumped with coursework from about 4 different disciplines so I've been a fair bit busier than usual. My other big project is a kind of [URL="https://github.com/jonnopon3000/ProtoGL"]"boilerplate engine"[/URL] written in JS w/ WebGL initially intended for game prototyping. It's in an extremely juvenile state and contains a lot of inconsistency due to being literally smashed together from various projects, but I have a clear goal for it and work on it whenever the time presents itself. At the moment, I use it solely for jams and it can get me up and running with a prototype in minutes as opposed to hour(s). Progress is slow but sure.
[QUOTE=ThePuska;49642178]edit: And using [i]sleep_for[/i] for sleeping a specific duration is better than [i]sleep_until[/i] because the latter waits until the clock has reached the given time point -- which may never happen if someone adjusts your system clock backwards.[/QUOTE] sleep_until, in my tests, would only sleep until within ~40ms of the specified time. sleep_for would either give me nothing but compiler errors related to its templating, or sleep for 0ms. I could not get sleep_for to work no matter how I configured it. Edit: Also, using Sleep(1) in my while loop caused 150 iterations to take >6000ms instead of the exactly 5000ms I get when just using an empty while loop. Edit again: Shit, uh, have some (old) content: [img]http://i.imgur.com/XqVsYIq.png[/img]
That feel when the world slips away... Using love2d, ignore my shitty texture mapping... [vid]http://richardbamford.io/dmp/2016-01-31_17-12-51.webm[/vid]
[QUOTE=Map in a box;49639178]physx should stop being used honestly, its a nice physics engine but nVidia is being really anticompetitive with it.[/QUOTE] The PhysX CPU solver is used in a lot of games. Pretty much every Unity game uses it, Arma 3 uses it, etc. Only the GPU solver is Nvidia-exclusive, hence why just a few sponsored games use that part. It was bad for standardizing GPU-accelerated physics in the past, but now that the new consoles have GPGPU capabilities and engines like UE4 use vendor-independent GPU physics, I don't think we have to worry about games going for PhysX-exclusive stuff any more.
[QUOTE=Clavus;49643674]PhysX CPU-solver is used in a lot of games. Pretty much every Unity game uses it, Arma 3 uses it, etc. Only the GPU-solver part is Nvidia-exclusive hence why just a few sponsored games use it. It sucked for standardizing GPU-accelerated physics in the past. But now the new consoles have GPGPU capabilities and engines like UE4 use vendor-independent GPU physics, I don't think we have to worry about games going for PhysX-exclusive stuff any more.[/QUOTE] Most things that use physx iirc include the GPU solver. I might be mistaken, though.
my normals didn't have negative values, I feel like a dip but I'm surprised it still worked that well. [t]http://i.imgur.com/ZZnsphj.png[/t] corrected
[QUOTE=polkm;49644189]my normals didn't have negative values, I feel like a dip but I'm suprised it still worked that well. [t]http://i.imgur.com/ZZnsphj.png[/t] corrected[/QUOTE] This makes it look insanely good.