• What are you working on?
Here's something interesting: If you override [I].ToString()[/I] in a way that causes a [I]StackOverflowException[/I], and then try to examine that variable in the debugger (whether it appears in the "locals" window or in the mouse-over preview), the VS debugger will forcibly detach and program execution will continue. [editline]17th April 2016[/editline] Now I'll just have to limit my debug output so it still shows me all the data while skipping information it has already printed... [editline]17th April 2016[/editline] It's fixed: [code]MemberFormat<Kodex_Test.Notable>(A, B, c, e.E, RestoreCount, cscss.Value.Value.Value.Value.Value)
MemberNote<Kodex_Test.Notable>_2BB23B(A = ValueNote_189441B(1), B = ValueNote_1D364F7(2), c = ValueNote_6E8CAF(c), e.E = ValueNote_3E2F22E(5), RestoreCount = ValueNote_2FA83A4(0), cscss.Value.Value.Value.Value.Value = ValueNote_2CEA0CA(9))
First copy: MemberNote<Kodex_Test.Notable>_267E1D0(A = ValueNote_1A6F050(1), B = ValueNote_2DE72D6(2), c = ValueNote_1D2098A(c), e.E = ValueNote_6255DD(5), RestoreCount = ValueNote_1A6F050, cscss.Value.Value.Value.Value.Value = ValueNote_37504C8(9))
Second copy: MemberNote<Kodex_Test.Notable>_3A9F0C(A = ValueNote_20F9772(1), B = ValueNote_28C5305(2), c = ValueNote_2EEEB2E(c), e.E = ValueNote_266449E(5), RestoreCount = ValueNote_198698E(2), cscss.Value.Value.Value.Value.Value = ValueNote_25BB5FF(9))
Cyclic: MemberNote<Kodex_Test.Cyclic>_1A8C1FA(Reference = MemberNote<Kodex_Test.Cyclic>_265601D(Reference = MemberNote<Kodex_Test.Cyclic>_1A8C1FA))
Press any key . . .[/code] This [I]should[/I] work for just about any free-standing data now (as per the configured [I]NoteFormat[/I]s). It's only automatic for the basic built-in types for now, but I'll add a few more to the defaults. 
[editline]17th April 2016[/editline] Now with (slightly) more readable output: [code]Notable_2BB23B(A = int_189441B(1), B = float_1D364F7(2), c = char_6E8CAF(c), e.E = int_3E2F22E(5), RestoreCount = int_2FA83A4(0), cscss.Value.Value.Value.Value.Value = int_2CEA0CA(9))
First copy: Notable_267E1D0(A = int_1A6F050(1), B = float_2DE72D6(2), c = char_1D2098A(c), e.E = int_6255DD(5), RestoreCount = int_1A6F050, cscss.Value.Value.Value.Value.Value = int_37504C8(9))
Second copy: Notable_3A9F0C(A = int_20F9772(1), B = float_28C5305(2), c = char_2EEEB2E(c), e.E = int_266449E(5), RestoreCount = int_198698E(2), cscss.Value.Value.Value.Value.Value = int_25BB5FF(9))
Cyclic: Cyclic_1A8C1FA(Reference = Cyclic_265601D(Reference = Cyclic_1A8C1FA))[/code]
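The tagging idea here - every value gets a unique note id, so aliasing (like [I]RestoreCount[/I] sharing a note with [I]A[/I] in the first copy) and cycles both show up in the output - has a rough analogue in a few lines of Python. [I]describe[/I] below is a hypothetical sketch, not the actual C# [I]NoteFormat[/I] machinery:

```python
def describe(obj, seen=None):
    """Print a value with a per-object tag, so shared references and
    cycles are visible - loosely mimicking the note ids above."""
    if seen is None:
        seen = set()
    tag = f"{type(obj).__name__}_{id(obj):X}"
    if id(obj) in seen:
        return tag                      # seen before: emit the tag only
    seen.add(id(obj))
    if isinstance(obj, dict):           # dict stands in for an object's members
        inner = ", ".join(f"{k} = {describe(v, seen)}" for k, v in obj.items())
        return f"{tag}({inner})"
    return f"{tag}({obj!r})"
```

A cyclic dict (`c = {}; c["Reference"] = c`) prints as `dict_XXX(Reference = dict_XXX)`, which is the same shape as the Cyclic output above.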
It's been a looong time since I posted here (pong and snake, heh). Anyway, here goes: I've been remaking the old DOS game Alien Carnage / Halloween Harry for educational purposes. So far, I've got the player almost entirely finished, along with a basic camera system (which I will improve later), an entity system, level loading and projectiles with hit detection. [vid]http://a.pomf.cat/iwirtr.mp4[/vid] Here's some gameplay from the original: [media]https://www.youtube.com/watch?v=OP9XTszPcwQ[/media]
Been fiddling with WPF again, thought I'd make a less advanced note program. It's very basic and simple, but so useful (for me). [vid]https://dl.dropboxusercontent.com/u/65179543/ShareX/2016/04/2016-04-17_21-09-48.mp4[/vid] Again it is made in WPF. I'm happy about it, even though it is super simple. :v:
[vid]https://zippy.gfycat.com/SinfulAgileAxolotl.webm[/vid]
[video=youtube;FTwdXwDQtmU]http://www.youtube.com/watch?v=FTwdXwDQtmU[/video] So while working on my HLSL shaders to improve the quality of the fire/smoke generated, I mathed my math and it made some visual math on screen? :V Reminds me of water caustics, actually!
More fun with simulating the solar system. I added support for asteroids - these do not exert gravitational pull on other bodies (an optimisation; the total mass of the asteroid belt is about 4% of the Moon's), but gravity is applied to them. I dumped a whole bunch of asteroids into the orbit of Jupiter to see if I could replicate the formation of the Jupiter trojans. I also enabled asteroid -> body collisions (although they're not super accurate), which means that Jupiter will soak up asteroids that collide with it. Here's the starting condition. Jupiter is the outer planet highlighted in red, Earth is the inner planet highlighted in red: [IMG]https://dl.dropboxusercontent.com/u/9317774/starting.PNG[/IMG] Here's how it looks after something like a million years: [IMG]https://dl.dropboxusercontent.com/u/9317774/funwithasteroids.PNG[/IMG] Here's how it should look in real life: [IMG]https://upload.wikimedia.org/wikipedia/commons/f/f3/InnerSolarSystem-en.png[/IMG] The most interesting thing for me is that this actually manages to replicate, to some degree (although not super accurately), the Jupiter trojans and greeks in front of and behind Jupiter's orbit. I've also got another belt between Saturn and Jupiter - I have no idea if that exists in real life. There's no sign of the Hildas, though. Dumping lots of asteroids into the inner solar system takes far too long to clear out (and makes it too visually cluttered), so I've not done that. Edit: Turns out I accidentally gave all the planets circular orbits, which means fixing this and then simulating it all out again ;_;
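The one-way gravity optimisation (asteroids feel pull but exert none) boils down to only summing forces from the massive bodies. A minimal Euler-step sketch in Python; the dict-based bodies and the [I]step[/I]/[I]accel[/I] names are made up for illustration, not from the actual sim:

```python
import math

G = 6.674e-11  # gravitational constant, SI units

def accel(p, bodies):
    """Acceleration on p from every massive body except itself."""
    ax = ay = 0.0
    for b in bodies:
        if b is p:
            continue
        dx, dy = b["x"] - p["x"], b["y"] - p["y"]
        r = math.sqrt(dx * dx + dy * dy)
        a = G * b["m"] / (r * r)
        ax += a * dx / r
        ay += a * dy / r
    return ax, ay

def step(bodies, asteroids, dt):
    """One Euler step. Only `bodies` exert pull; asteroids feel
    gravity but exert none, so the force loop never reads their mass."""
    for p in bodies + asteroids:
        ax, ay = accel(p, bodies)
        p["vx"] += ax * dt
        p["vy"] += ay * dt
    for p in bodies + asteroids:
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt
```

The payoff is that the force calculation is O(asteroids × bodies) rather than O(n²) over everything, which is what makes dumping thousands of asteroids in feasible.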
[url]http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.23.6778&rep=rep1&type=pdf[/url] Decided to review this for CS class.
[QUOTE=Zelpa;50146440]I love how 99% of programming is just guessing. Do I need to ceil this value? Floor it? Maybe I should take away 1? Or maybe floor and then add 1? Gotta try every possibility.[/QUOTE] I can't imagine programming like this, sounds miserable.
Also I made a linguistic observation today: Computers can pass pretty well for humans in highly contextualized speech, provided the grammar is correct. For example: "the" has no context. This can be generated by computer or human, but is not impressive. "The cat" has a small amount of context, and with a trivial amount of programming can be generated by a computer. "The cat meows" has much more context; "meows" makes sense in the context of "cat", and with more programming (and the correct source material) can be generated by a computer. "The cat barks" makes little sense in any context: "cat" is a very concrete word and can be associated with many things in a rational universe, but not with barking. If put next to each other, a human would likely pick "the cat meows" as the phrase generated by a human. So if we want to create convincing computer-generated speech, we can either get really, really smart with our programming, or choose highly contextualized speech as source material. Philosophy is one such source. Philosophy often requires many paragraphs of context to reach some logical conclusion, and often relies on previously established axioms and proofs, meaning that a quote taken out of context - mid sentence, mid paragraph, mid chapter - such as: [quote]"[...] apply a priori to objects, the transcendental deduction of conceptions, and I distinguish it from the empirical deduction, which indicates the mode in which conception is obtained through experience and reflection thereon; consequently, does not concern itself with the right, but only with the fact of our obtaining conceptions in [...]"[/quote] has little more context, and therefore meaning, than a computer-generated quote: [quote][...] 
other kind of metaphysics, which are employed in philosophy can, according to conceptions by the aid neither of possible experience conditions of the act of keys to possible experiences, like the dull mind that will force us to keep ourselves free from the condition to the conditioned. Thus [...][/quote]
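The "more programming plus the right source material" approach is essentially a Markov chain: pick each next word only from words that actually followed the current word somewhere in the source, so the local context always looks plausible even when the whole has no meaning. A minimal bigram sketch in Python (hypothetical names, not from any poster's code):

```python
import random

def build_bigrams(text):
    """Map each word to every word that follows it in the source text."""
    words = text.split()
    table = {}
    for a, b in zip(words, words[1:]):
        table.setdefault(a, []).append(b)
    return table

def generate(table, start, n):
    """Walk the table: each next word is drawn only from words that
    actually followed the previous one in the source."""
    out = [start]
    while len(out) < n:
        followers = table.get(out[-1])
        if not followers:
            break               # dead end: the word never had a follower
        out.append(random.choice(followers))
    return " ".join(out)
```

Feed it a few chapters of Kant and the output reads a lot like the computer-generated quote above: locally grammatical, globally empty.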
[QUOTE=Kevin;50150167]Been fiddling with WPF again, thought I'd make a less advanced note program. It's very basic and simple, but so useful(for me). Again it is made in WPF. I'm happy about it, even though it is super simple. :v:[/QUOTE] I would love to use WPF in my project but I don't feel like dicking around with interoperability between C# and C++. If anyone has any experience with that though, let me know.
Working on an artificial neural network, trained through a genetic algorithm. It's been a lot of fun, and I've learned a lot from it already. Finding the right settings has been tricky, but I'm finally beginning to see success: [IMG]http://puu.sh/omwm6/f37cb6ab66.png[/IMG] This is the result of 6500 generations of a network with 1 input neuron, 2 layers of 10 hidden neurons, and 1 output neuron. The activation function is a parametric rectified linear unit: [CODE]x <  0 : y = 0.2 * x
x >= 0 : y = x[/CODE]The goal is to approximate a linear function. In this case, [ y = 4x+2 ]. Each generation consists of 75 neural networks. Each generation, a random number between -10 and 10 is chosen and used to test the fitness of the networks. They're all given a fitness value of 1 / abs(output - expected), and the 55 lowest-scoring networks are cut out of the gene pool. The 10 best scorers are cloned into the next generation, and then 65 recombinations of non-eliminated networks are added to the next generation. Every weight and every bias then has a 1% chance to mutate by adding or subtracting a random number between -0.05 and 0.05. The console only outputs once every 500 generations, and only displays outputs from 10 randomly-chosen networks, because 75 would take up far too much space. Initially I was trying to approximate sin(x) with it but I couldn't quite figure out the settings, so I went for a linear function instead just to make sure this network isn't totally broken. Tomorrow I'll revisit sinusoidals. If anyone here has worked on a neural network or genetic algorithm and has any tips, I'd appreciate any and all advice.
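The selection scheme described (75 networks, cut the 55 worst, clone the 10 best, add 65 recombinations, 1% per-gene mutation of ±0.05) can be sketched with a toy genome. Here the genome is just a slope and intercept approximating y = 4x + 2, standing in for the full weight set; all names are illustrative, not from the actual project:

```python
import random

POP, CUT, ELITE, CHILDREN = 75, 55, 10, 65  # numbers from the post
MUT_RATE, MUT_SIZE = 0.01, 0.05

def target(x):
    return 4 * x + 2

def fitness(genome, x):
    """1 / abs(error), as described; a perfect answer scores infinity."""
    a, b = genome
    err = abs((a * x + b) - target(x))
    return 1 / err if err else float("inf")

def evolve(generations, rng):
    pop = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(POP)]
    for _ in range(generations):
        x = rng.uniform(-10, 10)              # one random test point per generation
        pop.sort(key=lambda g: fitness(g, x), reverse=True)
        survivors = pop[:POP - CUT]           # cut the 55 worst from the gene pool
        nxt = [list(g) for g in pop[:ELITE]]  # clone the 10 best
        for _ in range(CHILDREN):             # 65 recombinations of survivors
            p1, p2 = rng.sample(survivors, 2)
            nxt.append([rng.choice(pair) for pair in zip(p1, p2)])
        for g in nxt:                         # every gene: 1% chance to mutate
            for i in range(len(g)):
                if rng.random() < MUT_RATE:
                    g[i] += rng.uniform(-MUT_SIZE, MUT_SIZE)
        pop = nxt
    return pop[0]
```

One caveat worth noting for the sin(x) attempt: testing fitness on a single random point per generation makes selection noisy, so averaging fitness over several test points usually stabilises convergence on harder targets.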
[QUOTE=DarkCarnage;50152991]I would love to use WPF in my project but I don't feel like dicking around with interoptibility between C# and C++. If anyone has any experience with that though let me know.[/QUOTE] I've got tons of experience in that field, just shoot me a message on steam or here or wherever.
Welp, Ludum Dare went by without a single hitch. This was my 15th competition and the first in which I've really enjoyed and benefited from the process of rapid development; ProtoGL really did make my life so much easier! [URL="http://ludumdare.com/compo/ludum-dare-35/?action=rate&uid=4856"]Feel free to find and play my game here (and feedback if that's your style)[/URL] And, enjoy the obligatory timelapse of the game's development over the weekend. [video=youtube;9CG55QGDRgw]https://www.youtube.com/watch?v=9CG55QGDRgw[/video]
I'm still slaving away at my entry, I slept like 6 hours in total. I feel like I'm putting in a shitton of work but never feel any closer to the goal and it literally makes it impossible for me to sleep. I might not even make it. So in case this is my last post, I dedicate all my belongings to tropsical guy. Enjoy the shitty integrated gfx 200€ laptop and 3 pairs of socks with holes, dear friend. Oh, also, the bananas are getting kinda old but should be safe to eat.
[QUOTE=Drury;50154250]I'm still slaving away at my entry, I slept like 6 hours in total. I feel like I'm putting in a shitton of work but never feel any closer to the goal and it literally makes it impossible for me to sleep. I might not even make it. So in case this is my last post, I dedicate all my belongings to tropsical guy. Enjoy the shitty integrated gfx 200€ laptop and 3 pairs of socks with holes, dear friend. Oh, also, the bananas are getting kinda old but should be safe to eat.[/QUOTE] Ludum Dare is a special circle of crazy. I think this time, over the 48 hours of the compo, I worked for about 25 hours (15 of those were [B]straight[/B] in pretty much a [I]single sitting[/I] :nope:) and slept a total of 10. I made significant engine improvements - splitting Entity geometry into a Shape component and altering entity management + rendering, mostly - and somehow concocted a functional shmup. As a major bonus, the code isn't a [I]complete[/I] mess; backporting the engine changes will be an absolute breeze. Good times! Good luck on your entry, sir. Take regular breaks, and don't feel too bad if you don't "make it" - during such a short timespan having anything at all is impressive as far as I'm concerned. And watch that scope creep! Make sure your to-do list is well maintained and be liberal with adding items as you go. I aim to take breaks of 10-15 minutes after every "altering batch" of changes - that is, every time I step away from my computer I like the game to be noticeably further along in some way (graphically, mechanically, etc) than the last time I stepped away. It keeps me going because it fights the feeling that the goal is drifting away from you, and it helps completion because the scope stays clearly defined and easy to manage.
[QUOTE=DarkCarnage;50152991]I would love to use WPF in my project but I don't feel like dicking around with interoptibility between C# and C++. If anyone has any experience with that though let me know.[/QUOTE] I am not very experienced, so I can't help you unfortunately. But WPF is great, it's really awesome but it is very frustrating when things don't work the way you want them to. :v:
[QUOTE=cartman300;50153695]I've got tons of experience in that field, just shoot me a message on steam or here or whereever.[/QUOTE] On that note: Is there something like a "for dummies" guide on referencing a native library in VC++ (VS Community 2015 Update 2)? (Specifically the FBX SDK in this case, but that's [I]probably[/I] fairly standard, with a .lib and a .dll, and a (ton of) header file(s) of course.) I'd like to make a managed wrapper that isn't terrible, but there's no C API. [editline]18th April 2016[/editline] [QUOTE=Kevin;50154378][...] But WPF is great, it's really awesome but it is very frustrating when things don't work the way you want them to. :v:[/QUOTE] I think one thing to remember is that it's a fairly complete very high-level framework. That means it's often easier to use the existing "building blocks" and the visual editor to make a new control rather than to create them from code and/or from scratch.
How do you people do Ludum Dare? Don't you have school or jobs? How do you get the time?
[QUOTE=proboardslol;50154735]How do you people do Ludum Dare? Don't you have school or jobs? How do you get the time?[/QUOTE] They're always on weekends. Plus I have no job so I can work on monday :v:
Working on an writing assignment.
[QUOTE=Hemishi;50154916]Working on an writing assignment.[/QUOTE] Oh
[QUOTE=Hemishi;50154916]Working on an writing assignment.[/QUOTE] I was just sick on my new carpet so I'm working on cleaning that up.
[QUOTE=maaatts;50154985]I was just sick on my new carpet so I'm working on cleaning that up.[/QUOTE] At least you're not sick anymore
[QUOTE=proboardslol;50154735]How do you people do Ludum Dare? Don't you have school or jobs? How do you get the time?[/QUOTE] I'm a uni student so I'm never busy :v:
Features added: phase filtering, double exposure interferometry Double exposure interferometry means that I first take a reference image, store it, and keep on recording. I then look at the phase difference between the reference image and the current image. The phase is directly dependent on the "z-depth". So if I take a metal plate, take a reference image, then deform it a tiny bit (like, micrometers), the phase will have shifted. The problem is that it will always be within -pi and +pi, so it "wraps" around (think of it as the phase modulo 2π, shifted into that range). [vid]https://dl.dropboxusercontent.com/u/17216535/ShareX/2016/04/2016-04-18_18-31-59.mp4[/vid] Here I deform a tiny metal plate with a uhh micrometer thingie (without the frame): [IMG]http://i.imgur.com/dhh1h9c.png[/IMG] The problem is that it's like really fucking sensitive. The deformations (~fringes) you can see in the lower left corner are the result of me just slightly touching the micrometer. The trippy shit in the other parts is air density variations. Each fringe is 532 nm of deflection. Here's [URL="http://i.imgur.com/El6UYSs.png"]an illustrative example[/URL] of what a phase wrapped image means. Here's my own rendition of what happens to the plate: [IMG]http://i.imgur.com/es2Fx0l.png[/IMG]
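The wrapping itself is easy to sketch. Here's a hypothetical Python version (not the actual processing pipeline), taking the post's figure of one fringe per 532 nm of deflection at face value:

```python
import math

def wrap(phase):
    """Wrap an arbitrary phase into (-pi, pi], like an interferogram:
    atan2 of the phasor components discards whole turns."""
    return math.atan2(math.sin(phase), math.cos(phase))

def fringe_count(deflection_m, wavelength_m=532e-9):
    """One fringe per wavelength of deflection (the figure quoted above)."""
    total_phase = 2 * math.pi * deflection_m / wavelength_m
    return round(total_phase / (2 * math.pi))
```

So a 5 µm push on the plate produces around 9 fringes, and recovering the true deflection from the wrapped image is exactly the phase-unwrapping problem: counting how many whole turns were discarded.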
[QUOTE=Number-41;50155702]Features added: phase filtering, double exposure interferometry [vid]https://dl.dropboxusercontent.com/u/17216535/ShareX/2016/04/2016-04-18_18-31-59.mp4[/vid] Here I deform a tiny metal plate with a uhh micrometer thingie (without the frame) [IMG]http://i.imgur.com/dhh1h9c.png[/IMG] The problem is that it's like really fucking sensitive. The deformations (~fringes) you can see in the lower left corner (also phase-wrapped because I haven't implemented that yet) are the result of me just slightly touching the micrometer. The trippy shit in the other parts is air density variations. Each fringe is 532 nm of deflection. Also my supervisor has a severe case of feature creep...[/QUOTE] i still have no idea what i'm looking at
[QUOTE=Number-41;50155702]Features added: phase filtering, double exposure interferometry [vid]https://dl.dropboxusercontent.com/u/17216535/ShareX/2016/04/2016-04-18_18-31-59.mp4[/vid] Here I deform a tiny metal plate with a uhh micrometer thingie (without the frame) [IMG]http://i.imgur.com/dhh1h9c.png[/IMG] The problem is that it's like really fucking sensitive. The deformations (~fringes) you can see in the lower left corner (also phase-wrapped because I haven't implemented that yet) are the result of me just slightly touching the micrometer. The trippy shit in the other parts is air density variations. Each fringe is 532 nm of deflection. Also my supervisor has a severe case of feature creep... (Actual pictures incoming as soon as my POC phone uploads them)[/QUOTE] I kind of see what this thing is: it's like a depth camera with a modulus applied (phase = periodic). Something like this, but more severe: [IMG]http://i46.tinypic.com/2hwekxd.jpg[/IMG]
The thing that's getting me is that it can detect and reconstruct [i]changes in air density[/i] - no wonder Microsoft can see our heartbeats! :v:
Well yeah, the Kinect sort of operates on the same principle, but it uses IR light, so my guess is that it will only be able to detect deformations of >700nm (which I think is sufficient for heart beats :v:, although that's only part of a huge problem). I'm wondering if this technique is actually used for fluid dynamics measurements... [Editline]18th April[/editline] It [URL="http://www.acs.psu.edu/drussell/Publications/Hologram-AJP.pdf"]is[/URL]! [Editline]18th April[/editline] As cartman300 pointed out, the Kinect does not work with interferometry, but somewhere in the process you do end up with a phase-wrapped image.
[QUOTE=Number-41;50155702]Features added: phase filtering, double exposure interferometry Double exposure interferometry means that I first take a reference image, store it, and keep on recording. I then look at the phase difference between the reference image and the current image. The phase is directly dependent on the "z-depth". So if I take a metal plate, take a reference image, then deform it a tiny bit (like, micrometers), the phase will have shifted. The problem is that it will always be within -pi and +pi, so it "wraps" around (think of it as modulo pi). [vid]https://dl.dropboxusercontent.com/u/17216535/ShareX/2016/04/2016-04-18_18-31-59.mp4[/vid] Here I deform a tiny metal plate with a uhh micrometer thingie (without the frame) [IMG]http://i.imgur.com/dhh1h9c.png[/IMG] The problem is that it's like really fucking sensitive. The deformations (~fringes) you can see in the lower left corner are the result of me just slightly touching the micrometer. The trippy shit in the other parts is air density variations. Each fringe is 532 nm of deflection. Here's [URL="http://i.imgur.com/El6UYSs.png"]an illustrative example[/URL] of what a phase wrapped image means.[/QUOTE] I'm beginning to think you're just making things up and posting intimidating charts and graphs to keep anyone from calling your bluff :v: