• What are you working on? v67 - March 2017
    3,527 replies
Yeah, I had triangulation on, and trying without it doesn't seem to change anything. I don't have any winding culling on right now. What do you mean by not formatting the index data correctly? I'm trying with an .obj file, and without aiProcess_JoinIdenticalVertices the indices only refer to a portion of all the vertices. I can render the whole model with glDrawArrays. If I enable the flag then everything breaks: the indices refer to all the vertices, but I just draw a jumbled torso. [editline]13th March 2017[/editline] So I figured it out. The indices of each face are local to the current mesh, not global across the whole scene. So you have to add the number of vertices in all the preceding meshes to each index to get the actual index. This wasn't very well documented in my opinion, and it's unmentioned in the tutorial (and most other things I've seen).
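For anyone who hits the same thing, here's roughly what the fix looks like, as plain C++ with a hypothetical Mesh struct standing in for the data you pull out of each aiMesh (not my actual loader code):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical stand-in for what gets extracted from each aiMesh.
struct Mesh {
    std::size_t vertexCount;                 // aiMesh::mNumVertices
    std::vector<std::uint32_t> indices;      // flattened aiFace indices, local to this mesh
};

// Each mesh's indices start at 0, so when merging everything into one big
// vertex/index buffer, rebase every index by the number of vertices that
// came before its mesh.
std::vector<std::uint32_t> mergeIndices(const std::vector<Mesh>& meshes) {
    std::vector<std::uint32_t> merged;
    std::uint32_t baseVertex = 0;
    for (const Mesh& m : meshes) {
        for (std::uint32_t i : m.indices)
            merged.push_back(baseVertex + i);    // rebase into the global buffer
        baseVertex += static_cast<std::uint32_t>(m.vertexCount);
    }
    return merged;
}
```

Alternatively, if you keep the meshes separate, glDrawElementsBaseVertex lets the driver apply the per-mesh offset for you.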
[QUOTE=PelPix123;51949951]6 years later and i'm still working on my piano (the one with physically modeled resonance, dynamic release sounds and fully continuous pedaling). I'm trying to get soft pedal sounding right [url]http://picosong.com/pwXV/[/url] I'm really trying to create a usable solo piano. There are so so so so so many virtual pianos on the market, but I don't feel like any of them are really a solo piano. They're either studio recorded and super pristine for sitting in mixes or junked-sounding novelty pianos. There's no piano that sounds like when you go to a recital, and on top of my ultimate goal of providing an unparalleled playing experience for pianists, i want to fill that sound niche [editline]12th March 2017[/editline] i was really inspired by the sound of the piano in the nodame cantabile soundtrack. the samples are from piano in 162[/QUOTE] Sounds amazing, keep up the good work.
SDL2 Raytracer [t]https://puu.sh/uGRE4/e2bb66ee65.png[/t] with fog [t]https://puu.sh/uGRTA/c9e51e08dd.png[/t]
Hello again, Facepunch people. I wasn't here for a long time, but some of you might remember me (vote rainbow if you recognize me :dance:). I started working on a random game I planned. Actually, "planned" is a pretty strong word. Here's a screenshot of my latest achievement: panels dynamically created from XML instead of fixed. [IMG]http://i.cubeupload.com/1gMDr1.png[/IMG] Text is placeholder for now. Next up is some ability system, and maybe fixing NPCs (implementing "teams" or factions or something so they don't kill everyone including me :P)
[QUOTE=PelPix123;51949951]6 years later and i'm still working on my piano (the one with physically modeled resonance, dynamic release sounds and fully continuous pedaling). I'm trying to get soft pedal sounding right [url]http://picosong.com/pwXV/[/url] I'm really trying to create a usable solo piano. There are so so so so so many virtual pianos on the market, but I don't feel like any of them are really a solo piano. They're either studio recorded and super pristine for sitting in mixes or junked-sounding novelty pianos. There's no piano that sounds like when you go to a recital, and on top of my ultimate goal of providing an unparalleled playing experience for pianists, i want to fill that sound niche [editline]12th March 2017[/editline] i was really inspired by the sound of the piano in the nodame cantabile soundtrack. the samples are from piano in 162[/QUOTE] [url]https://www.youtube.com/watch?v=aT09uXPYmsQ[/url] tfw I once wanted to create a game like this, but modelling and programming and map editing etc all alone kinda sucked, mainly because I had to learn how to model just for that model
[QUOTE=0V3RR1D3;51940053]At first I was like this is waywo, not 3D modelling then I realised that it was acctualy realtime which was impressive enough. I live in wales so theres really no metro/underground here but I recently went on the tube in london and was so curious as to what it looks like deep in the tunnels and how it would feel to drive a train through it. I really want this game, any word on a demo or alpha release?[/QUOTE] Public beta testing will start in the summer (hopefully). It is realtime indeed! I've recorded some extra stuff from the same section of the level as displayed in the teaser: [vid]https://i.imgur.com/Zanjf7P.mp4[/vid] [vid]https://i.imgur.com/Zy8HnhM.mp4[/vid] I'm going to update the physics simulation a lot soon; you'll notice the wheel jitters a little while the brakes are holding it. The static friction constraint (a temporary constraint created to keep wheels in the static friction condition) is currently invalid and doesn't entirely hold the angle either, making the wheel slip a little due to random numerical uncertainty.
[QUOTE=BlackPhoenix;51955430]Public beta testing will start in the summer (hopefully). It is realtime indeed! I've recorded some extra stuff from the same section of the level as displayed in the teaser: [vid]https://i.imgur.com/Zanjf7P.mp4[/vid] [vid]https://i.imgur.com/Zy8HnhM.mp4[/vid] I'm going to update the physics simulation a lot soon, if you'll notice the wheel jitters a little while brakes are holding it. The static friction constraint (a temporary constraint created to keep wheels in static friction condition) is actually currently invalid and doesn't entirely keep the angle too (making the wheel slip a little due to random numerical/calculation uncertainity).[/QUOTE] I'd be more than happy to test the game earlier if you need someone, i'd be willing to pay fairly too :)
Animation interpolation part 4: (Electric Boogaloo)^2 edition. Problem: as probably anyone who's worked with animations before knows (except me), transitioning between two disjoint animations isn't ideal. In my current implementation I slerp between the two. This means that we have, say, 100 frames of mocap, a 0.2s slerp, and then the next 100 frames of mocap. Sounds fine on paper, but in reality it's super jarring, because you lose all the irregularity of the motion capture while you slerp. Solution: playing two disjoint mocap replays currently involves simply merging them together into one larger replay and adjusting the timestamps of the second replay; the existing interpolation code takes care of the entire thing. If we instead take the difference (position/rotation) between the start of the second replay and the end of the first, divide it by a number of frames, and then distribute it over the frames of the replays (in my case I just dump it on the frames of the first), then we'll smoothly transition between the two over n frames. This works alright. Hooray! This concludes today's mandatory hand animation update. If you would like to unsubscribe from hand animation facts, check back in a week or 3. Edit: [vid]https://zippy.gfycat.com/DefinitiveScientificIcelandgull.webm[/vid] (I've upgraded to the modern age with jifs; if it doesn't work let me know, as webms hate my browser)
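For the curious, the delta-distribution bit sketched out with plain floats standing in for the pose data (names hypothetical, this isn't the actual replay code):

```cpp
#include <cstddef>
#include <vector>

// Smoothly absorb the jump between the end of replay A and the start of
// replay B: take the pose difference, split it over the last nFrames of A,
// and accumulate it so A's final frame lands exactly on B's first frame.
void distributeTransition(std::vector<float>& a, float bStart, std::size_t nFrames) {
    if (a.empty() || nFrames == 0) return;
    const float delta = bStart - a.back();                 // total error to hide
    const float step  = delta / static_cast<float>(nFrames);
    std::size_t first = a.size() > nFrames ? a.size() - nFrames : 0;
    float accumulated = 0.0f;
    for (std::size_t i = first; i < a.size(); ++i) {
        accumulated += step;                               // each frame absorbs one slice
        a[i] += accumulated;
    }
}
```

For the rotation half you'd do the same distribution with per-frame quaternion deltas instead of additions, but the idea is identical.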
Clouds coming along nicely. The new lighting has been optimized a little more and overall it is running much better. I'm still wondering why every single raytracing tutorial or example for Unity has some sort of breakout for the render loop, because in all my experiments, early breakout = branches, and branches = a shader that runs like crap. [IMG]https://dl.dropboxusercontent.com/u/12024286/Dev%20stuff/fullspeed%20lighting%20sims.png[/IMG]
[QUOTE=phygon;51957498]Clouds coming along nicely. The new lighting has been optimized a little more and over-all it is running much better. I'm still wondering why every single raytracing tutorial or example for Unity has some sort of breakout for the render loop, because in all my experiments, early breakout = branches and branches = a shader that runs like crap [IMG]https://dl.dropboxusercontent.com/u/12024286/Dev%20stuff/fullspeed%20lighting%20sims.png[/IMG][/QUOTE] If I may offer some constructive criticism, I think the edges of the clouds should be more translucent. To me, the way the opacity falls off is jarringly sharp. I don't know how difficult it is to make a change like that, though. That aside, awesome work! Raytracing is way above my head, stuff's like magic to me. [editline]14th March 2017[/editline] [QUOTE=PelPix123;51949951]6 years later and i'm still working on my piano (the one with physically modeled resonance, dynamic release sounds and fully continuous pedaling). I'm trying to get soft pedal sounding right [url]http://picosong.com/pwXV/[/url] I'm really trying to create a usable solo piano. There are so so so so so many virtual pianos on the market, but I don't feel like any of them are really a solo piano. They're either studio recorded and super pristine for sitting in mixes or junked-sounding novelty pianos. There's no piano that sounds like when you go to a recital, and on top of my ultimate goal of providing an unparalleled playing experience for pianists, i want to fill that sound niche [editline]12th March 2017[/editline] i was really inspired by the sound of the piano in the nodame cantabile soundtrack. the samples are from piano in 162[/QUOTE] Your dedication to this project is inspiring, and it sounds better every time. Keep up the amazing work.
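If it helps, one common way to get a softer falloff (assuming the cloud shader has some density value to threshold, which I'm guessing at) is to run the density through a smoothstep instead of a hard cutoff. A plain-C++ sketch of the remap, with made-up threshold values:

```cpp
// Hermite smoothstep: 0 below edge0, 1 above edge1, smooth in between.
float smoothstep(float edge0, float edge1, float x) {
    float t = (x - edge0) / (edge1 - edge0);
    t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);   // clamp to [0, 1]
    return t * t * (3.0f - 2.0f * t);
}

// Feeding cloud density through this instead of a hard threshold gives the
// opacity a gradual falloff at the edges.  Both thresholds are made up.
float cloudOpacity(float density) {
    return smoothstep(0.1f, 0.4f, density);
}
```

Widening the gap between the two thresholds widens the translucent fringe.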
[QUOTE=Berkin;51957779]If I may offer some constructive criticism, I think the edges of the clouds should be more translucent. To me, the way the opacity falls off is jarringly sharp. I don't know how difficult it is to make a change like that, though. [/QUOTE] In general, or where they overlap? Where they overlap is an issue that I'm having a pretty hard time with. I don't want to render the lighting a ton of times, since even once isn't too performance-friendly. I think I'm going to try to make it so that it detects 2 points of light worth calculating if there's a cloud, then a gap, then a cloud behind it and blend them appropriately. I just need to find a way to avoid the extra calculations if a given section of cloud does not overlap without dragging the performance of the whole thing south. Anyway, thanks!
Working on screen-space reflections [t]http://i.imgur.com/pPuGJIm.png[/t] I'm getting close; the only issue now is that it seems to warp in the wrong direction. If I keep the camera level, then things mostly work. The issue, I think, is somewhere in my ray tracer, because the major guide I'm following is in HLSL, but D3D uses a flipped y axis, plus it's derived from GLSL work, which I think uses a different z axis direction to boot :suicide:
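If I've got it right, the conversions I keep having to double-check are these: GL-style NDC has y up and z in [-1,1], while D3D texture space has v growing downward and depth in [0,1]. Written out as plain C++ (not the actual shader code):

```cpp
struct Vec3 { float x, y, z; };

// Map GL-style NDC (x,y,z all in [-1,1], +y up) to D3D-style texture
// space (u,v in [0,1] with +v down, depth in [0,1]).
Vec3 glNdcToD3dTexcoord(Vec3 ndc) {
    return {
        ndc.x * 0.5f + 0.5f,     // u is the same in both conventions
        -ndc.y * 0.5f + 0.5f,    // v flips: +y up in NDC, +v down in D3D
        ndc.z * 0.5f + 0.5f      // remap GL's [-1,1] z to [0,1]
    };
}
```

When the guide's math already assumes D3D clip space, the z remap shouldn't be applied twice, which is exactly the kind of double-conversion that produces warping in one direction only.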
[QUOTE=phygon;51957498]Clouds coming along nicely. The new lighting has been optimized a little more and over-all it is running much better. I'm still wondering why every single raytracing tutorial or example for Unity has some sort of breakout for the render loop, because in all my experiments, early breakout = branches and branches = a shader that runs like crap [IMG]https://dl.dropboxusercontent.com/u/12024286/Dev%20stuff/fullspeed%20lighting%20sims.png[/IMG][/QUOTE] Branches should only be slower when the branch leads to more work. Since all threads in a block have to do the same task, whenever there's a branch, the block follows one path and blocks the progress on the other. Early-outs should be slightly faster since there might be a case where none of the threads have to do the task and the entire block can early out. But even if one thread has to do work, the early-out is already useless. It's mostly used to avoid using processing time on empty areas.
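As a toy cost model (made-up numbers; real warp scheduling is more involved), the rule of thumb looks like this: the group pays for every path any of its lanes takes, so one straggler keeps everyone busy.

```cpp
#include <algorithm>
#include <vector>

// Toy SIMT cost model: a warp/block pays the cost of any instruction that
// any of its lanes executes.  The early-out only saves time when *every*
// lane skips the work; otherwise the skipping lanes just sit masked.
int warpCost(const std::vector<bool>& laneDoesWork, int workCost, int exitTestCost) {
    bool anyWork = std::any_of(laneDoesWork.begin(), laneDoesWork.end(),
                               [](bool b) { return b; });
    return anyWork ? exitTestCost + workCost : exitTestCost;
}
```

So unless whole screen regions map to fully-empty warps, the branch mostly just adds the cost of the test.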
[QUOTE=FalconKrunch;51958140]Branches should only be slower when the branch leads to more work. Since all threads in a block have to do the same task, whenever there's a branch, the block follows one path and blocks the progress on the other. Early-outs should be slightly faster since there might be a case where none of the threads have to do the task and the entire block can early out. But even if one thread has to do work, the early-out is already useless. It's mostly used to avoid using processing time on empty areas.[/QUOTE] All I'm getting from attempts at early breakouts on empty areas is inhuman shader compile times and a shader that runs *slightly* worse.
I just tried out RenderDoc. My life has been changed. I had uniform buffer issues, and within minutes I figured out what was going wrong. Just being able to see what data/commands are actually sent is priceless. I feel much more confident now that what I've written actually works and isn't all coincidental.
[QUOTE=WTF Nuke;51958416]I just tried out render doc. My life has been changed. I had uniform buffer issues and within minutes I figured out what was going wrong. Just being able to see what actual data/commands are sent is priceless. I feel much more confident now that what I've written actually works and isn't all coincidental.[/QUOTE] Wait, had you not used it before? I'm surprised, you seem to do some fancy stuff and even when I was just getting started with OpenGL it was instrumental for debugging. I can't imagine using more advanced techniques without it!
You kind of learn really primitive ways to debug, sort of like printf debugging but with the framebuffer. It wasn't much fun.
Holy fuck my asset finally got approved on the asset store [t]http://i.imgur.com/6WAmRGz.png[/t] [url]https://www.assetstore.unity3d.com/en/#!/content/82778[/url]
[QUOTE=WTF Nuke;51960696]You kind of learn really primitive ways to debug, sort of like printf debugging but with the framebuffer. It wasn't much fun.[/QUOTE] OpenCL supports none of this fancypants RenderDoc nonsense; I prefer my debugging to be id ranges arbitrarily mapped to RGB colours (kill me)
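For the uninitiated, the mapping can be as dumb as slicing the id's bits into colour channels. One hypothetical version (plain C++, nothing OpenCL-specific):

```cpp
#include <cstdint>

struct Rgb { std::uint8_t r, g, b; };

// Debug-visualise an object/ray id by packing its low 24 bits into a
// colour.  Nearby ids get visibly different colours, and the id can be
// recovered from a screenshot pixel by reversing the packing.
Rgb idToColour(std::uint32_t id) {
    return {
        static_cast<std::uint8_t>( id        & 0xFF),
        static_cast<std::uint8_t>((id >> 8)  & 0xFF),
        static_cast<std::uint8_t>((id >> 16) & 0xFF)
    };
}
```

Ugly as sin on screen, but it means a colour picker doubles as a debugger.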
At least you could arbitrarily map your id ranges. I had to debug a cubemap and to this day I am still not sure how to render it one side at a time. In fact I'm still not sure how I got it working. Maybe it doesn't work and it's all just coincidence.
Does anyone know any good resources on implementing screen space reflections? Everything I can find ultimately links back to or is inspired from [URL="http://casual-effects.blogspot.ca/2014/08/screen-space-ray-tracing.html"]this[/URL] I've never done any raytracing/marching before really (only raypicking once upon a time a few years ago), so I'm pretty sure this is the element I'm getting stuck on. I want to get as good at that as I am at shadowmapping and fully grasp it.
Progress on emoji rendering! ☠️ 烙 Just gotta figure out why some are rotated by 90°. They look fine in the atlas :ohno: Also, working with surrogate pairs in C# sucks :( (that's why some are a box instead of the real emoji) [IMG]http://i.imgur.com/HHtpvba.png[/IMG] edit: ok, screw emojis for now. I'm moving on to better tooltips; that's what I'm actually missing. Right now it only shows what the action-bar button is bound to (and the name of the bound entity, in this case an "Ability"). [IMG]http://i63.tinypic.com/2r5fdok.jpg[/IMG] However, it's not as easy as just getting the description, as that is dynamic, like in the image below. Updating the tooltip every frame is pretty costly, as I don't know exactly how complex a damage calculation for an ability can become. Maybe I'll limit it to every 500ms, or just "OnShow". [IMG]http://i64.tinypic.com/23vggsy.png[/IMG] Text is a placeholder again (obviously, haha). And you can also see that I experimented a bit with having stuff get pushed by the bear swipe. But that didn't turn out too well when combined with a NavAgent :P edit 2: Anyone have any input on how to make that tooltip work? (as in, how you'd imagine it would work) Maybe I should just display the formula that's used in the ability itself without doing the calculation? "Deals 5x your current hp in damage" But that would make some things ambiguous, for example when you want to know if some buffs stack with each other or not... Or I could let the system react to events: for example, when a buff is added, all tooltips get recalculated. But what would I do when some things depend on time then? Ughhh, this stuff is getting pretty complicated, maybe I'm overthinking it haha
[QUOTE=WTF Nuke;51961823]At least you could arbitrarily map your id ranges. I had to debug a cubemap and to this day I am still not sure how to render it one side at a time. In fact I'm still not sure how I got it working. [b]Maybe it doesn't work and it's all just coincidence.[/b][/QUOTE] Pretty much my thought process whenever I code anything. [editline]15th March 2017[/editline] Eventually the coincidence becomes production, and that's when I stop touching the keyboard entirely.
[QUOTE=mastersrp;51963292]Pretty much my thought process whenever i code anything. [editline]15th March 2017[/editline] Eventually the coincidence becomes production and that's when I stop touching the keyboard entirely.[/QUOTE] This was my thought when I first ran my CUDA noise code. I had kernel timing/benchmarks in from the very beginning, and at first I was like "wow, this is too fast, I broke something"; then when the image output worked I was like "it's still broken, somehow". Speaking of said benchmarks: [t]http://i.imgur.com/egDoLqz.png[/t] I'm pretty proud of the project, though. I wrote up a report using MiKTeX (28 pages D:) that should hopefully be insurance for my grade, as our presentation was cut down to 9 minutes and my partner spent 6 minutes on his slides, leaving me 3 minutes to review the vast majority of the technical work we did. Like how I went through 3-4 iterations of each base noise algorithm to find the best one for the GPU, or how I spent way too long trying to make texture LUTs and texture objects work instead of just using managed memory. I'd link the repo, but I'd prefer to wait until I have it in a releasable state. I need to finish the rewrite to make it work with "Point" structs instead of using the thread indices as coords, along with doing a good chunk of cleanup, documentation, and creating a working CMakeLists.txt for the project. It's fast and feature-rich enough, though, that I do want to "release" it, since I think it could be useful. The presentations were also pretty interesting, lots of inspiring projects. There was a really cool one that made me want to finally look into neural nets, and having a GPU that can do half-precision stuff, I figure I might as well give it a shot and see how it works.
[QUOTE=Felheart;51962875]Progress on emoji rendering!         ☠️   烙      Just gotta figure out why some are rotated by 90° They look fine in the atlas :ohno: [...][/QUOTE] Facepunch destroys emoji when you edit posts. There are also lots and lots of them that don't fit into a single [I]char[/I] (or even two) in .NET, so it's probably a good idea to not use that type at all for (directly) manipulating text that's not purely Latin.
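The pair arithmetic itself is the same in any UTF-16 language (in .NET, char.ConvertToUtf32 does it for you). Sketched in C++ for illustration:

```cpp
#include <cstdint>

// Combine a UTF-16 surrogate pair into a Unicode code point.  High
// surrogates are 0xD800-0xDBFF, low surrogates 0xDC00-0xDFFF; together
// they encode code points above 0xFFFF, which is where most emoji live.
std::uint32_t combineSurrogates(std::uint16_t high, std::uint16_t low) {
    return 0x10000u
         + ((static_cast<std::uint32_t>(high) - 0xD800u) << 10)  // top 10 bits
         +  (static_cast<std::uint32_t>(low)  - 0xDC00u);        // bottom 10 bits
}
```

For example, 😀 (U+1F600) is stored as the pair 0xD83D 0xDE00, so iterating a string char-by-char sees two "characters" where the font atlas needs one.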
I'm in a Computer Architecture class finally learning about MIPS and related shit and I actually love it. I feel like I finally found the right major after three years.
[QUOTE=Adelle Zhu;51965736]I'm in a Computer Architecture class finally learning about MIPS and related shit and I actually love it. I feel like I finally found the right major after three years.[/QUOTE] I'm kinda hoping I feel the same next fall, when I kinda "restart" college at a smaller community college (in the CSE program, most likely). Hopefully, working for my meh-to-terrible salary for the next 6 months doesn't completely burn me out on programming. I really just want a degree at this point.
[QUOTE=paindoc;51966281]I'm kinda hoping I feel the same next fall, when I kinda "restart" college at a smaller community college (in the CSE program, most likely). Hopefully, working for my meh-to-terrible salary for the next 6 months doesn't completely burn me out on programming. I really just want a degree at this point.[/QUOTE] That's pretty much what I did. Three years ago I left New York for Utah State University as an Aerospace major and it was an unbelievably stupid decision. So I came back and now I'm at NYIT as a IT/Cybersecurity major.
[QUOTE=Adelle Zhu;51966317]That's pretty much what I did. Three years ago I left New York for Utah State University as an Aerospace major and it was an unbelievably stupid decision. So I came back and now I'm at NYIT as a IT/Cybersecurity major.[/QUOTE] Damn, coincidences are stacking - I'm leaving my uni after the aerospace program didn't accept me, and after I realized (while seeing aero engineers in my workplace) that Aerospace wasn't for me :v:
*dialup tone* Hello? Yes hello, is this the hand animation hotline? Oh good. I've been feeling a little lonely recently and I was.. do you have any minor improvements in hand animation that you can share with me? Oh really? That sounds great! What do you mean it probably works completely correctly this time, isn't that what you said the last two times? Sure sure, I understand that you found a new crippling problem where transitioning from idle to active animations was really janky because the idle animation just gradually slid over to the start of the cool finger snapping animation, and now instead all animation transition error is distributed across animations that are marked as 'exciting', which intelligently merges multiple animations together to make sure that you won't notice transition errors by masking them at the beginnings and ends of high motion animations, but frankly both this joke and sentence have gone on long enough. Here's a jif: [vid]https://gfycat.com/TastyZealousHoiho[/vid] Under the old system, the hand would have slid/slided/slidden over across to the start of the second hand animation because the error between two animations was always distributed at the end of one animation. Not anymore! Edit: Jifycat seems to have cut my jif off at the knees, here's the full vid: [media]https://www.youtube.com/watch?v=X-jV2mK2rEI[/media]