• What are you working on? v67 - March 2017
    3,527 replies
TIL that when forcing GLFW to render to a window using both OpenGL and Vulkan at the same time, the behaviour can be surprisingly intuitive. At least on my GPU it just looks like Z fighting between the two images.
[QUOTE=JohnnyOnFlame;52220014]Wrote a multi-threaded github network spider. Here's a scraped output displayed on ORA-LITE, number of stars as color: [t]http://i.imgur.com/2ATT7Lk.png[/t][/QUOTE] Any chance I can get the output as a pairwise txt file or CSV? I'm working on a very large-scale 3D network renderer :)
[QUOTE=BAZ;52230805]Any chance I can get the output as a pairwise txt file or CSV? I'm working on a very large-scale 3D network renderer :)[/QUOTE]

[quote=Github ToS]Short version: You agree to these Terms of Service, plus this Section G, when using any of GitHub's APIs (Application Provider Interface), including use of the API through a third party product that accesses GitHub.

1. Limitation of Liability for API Use
You understand and agree that GitHub is not liable for any direct, indirect, incidental, special, consequential or exemplary damages, including but not limited to damages for loss of profits, goodwill, use, data or other intangible losses (even if GitHub has been advised of the possibility of such damages), resulting from your use of the API or third-party products that access data via the API.

2. No Abuse or Overuse of the API
Abuse or excessively frequent requests to GitHub via the API may result in the temporary or permanent suspension of your account's access to the API. GitHub, in our sole discretion, will determine abuse or excessive usage of the API. We will make a reasonable attempt to warn you via email prior to suspension. You may not share API tokens to exceed GitHub's rate limitations. You may not use the API to download data or Content from GitHub for spamming purposes, including for the purposes of selling GitHub users' personal information, such as to recruiters, headhunters, and job boards. All use of the GitHub API is subject to these Terms of Service and the GitHub Privacy Statement. GitHub may offer subscription-based access to our API for those Users who require high-throughput access or access that would result in resale of GitHub's Service.

3. GitHub May Terminate Your Use of the API
We reserve the right at any time to modify or discontinue, temporarily or permanently, your access to the API or any part of it with or without notice.[/quote]

Seems to me that I'm allowed to share crawled data for non-spam reasons, as long as it isn't infringing on privacy. I'll PM you a few scraped outputs. They're in JSON format and you'll have to turn them into pairs yourself.

[editline]15th May 2017[/editline]

Also send me screenshots, please :D
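For what it's worth, flattening that JSON into a pairwise CSV is only a few lines of Python. This is a sketch assuming each record maps a user to the list of users they follow; the real field layout of the scraped dumps may differ:

```python
import csv
import json

def json_to_pairs(json_path, csv_path):
    """Flatten assumed {user: [followed, ...]} records into source,target rows."""
    with open(json_path) as f:
        data = json.load(f)  # assumed shape: {"user": ["followed1", ...], ...}
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["source", "target"])
        for user, follows in data.items():
            for target in follows:
                writer.writerow([user, target])
```

Most graph tools (Gephi, ORA, Graphia) will ingest a source,target edge list like this directly.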
Using JohnnyOnFlame's spider data: [img]http://i.cubeupload.com/wY34DQ.png[/img] [img]http://i.cubeupload.com/LEs0JX.png[/img] [img]http://i.cubeupload.com/0cLsWk.png[/img] Tsk tsk, only one follow, Johnny. I made this quick video too: [url]https://drive.google.com/file/d/0B8TARMxvrY9XRFZxSWJrTGI0ejA/view?usp=sharing[/url]

[editline]15th May 2017[/editline]

Ignore the low FPS values too, I was running in debug, which slaughters performance.
[QUOTE=BAZ;52231193]<fucking cool stuff>[/QUOTE] Mind if I use any of these in my report? [editline]15th May 2017[/editline] Also yea, I don't follow many people, should probably change that sometime.
[QUOTE=JohnnyOnFlame;52231290]Mind if I use any of these in my report?[/QUOTE] did i just do your homework for you? Sure you can use it, please reference our company Kajeka Ltd, and our software Graphia! Here's the same graph MCL clustered: [img]http://i.cubeupload.com/UPiHDo.png[/img]
[QUOTE=BAZ;52231345]did i just do your homework for you? Sure you can use it, please reference our company Kajeka Ltd, and our software Graphia! Here's the same graph MCL clustered: [img]http://i.cubeupload.com/UPiHDo.png[/img][/QUOTE] I had quite a few graphs plotted using ORA, but nothing this fancy really. If anything, you supercharged it :v:. Thanks.
Oh I forgot to mention, two-way edges have a red stripe!
I'm thinking about using [url=https://kajeka.com/]Miru[/url]/[url=http://kajeka.com/graphia/about]Graphia[/url] to generate some screenshots and stuff. ORA's just too damn slow and I can't get Gephi to cooperate. Am I able to export generated tables and screenshots with the preview edition? If not, do you guys have any "100% affordable [read: free]" means to get this into students' hands? Even if this means having some hard limits. (ORA's limits are hitting too damn close to my networks; one of the networks I sent ya is within 200 nodes of the 2k cap.)
I rewrote my ray bounce code to minimize redundant ray casting and speed up baking. To make it easier to debug I visualized the individual rays which turned out pretty neat in itself: [img]http://puu.sh/vSh5G.gif[/img] In the end I managed to speed up baking by about 20x.
[img_thumb]http://i.imgur.com/8jULxRc.png[/img_thumb] Since the last time I posted, I've added networked block placement as well as world replication, meaning that when you join a server the map is compressed and sent in chunks before being reassembled on the client side. Additionally, blocks placed/destroyed by other players will be reflected on the server and other clients. I also added a HUD and started on a weapon system. At the moment you can damage players, but that is only shown on the HUD, without any effects. Up next I intend to implement a model system that will allow you to create models in a voxel editor and save them to a file, so you can easily replace the standard models with your own. This will be part of a larger "voxel editor" game mode for making maps, models, and probably special entities later on. Before that I will probably implement a basic teams system as well as a death/respawn system, so the game can actually be played beyond pretend-shooting at each other and building stuff.

[editline]16th May 2017[/editline]

I'll be sure to include a video next time to better show what I've been working on, but in the meantime here's this: [img_thumb]http://i.imgur.com/B8faP1o.png[/img_thumb]
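The compress-and-chunk step for map replication can be sketched in a few lines. This is an illustration only, not the project's actual code: zlib stands in for whatever codec the engine really uses, and the 1 KB chunk size is an arbitrary pick.

```python
import zlib

CHUNK_SIZE = 1024  # arbitrary; real packet budgets depend on the transport

def pack_world(voxel_bytes: bytes) -> list:
    """Server side: compress the raw voxel data and split it into send-sized chunks."""
    compressed = zlib.compress(voxel_bytes, level=9)
    return [compressed[i:i + CHUNK_SIZE]
            for i in range(0, len(compressed), CHUNK_SIZE)]

def unpack_world(chunks: list) -> bytes:
    """Client side: reassemble the received chunks and decompress."""
    return zlib.decompress(b"".join(chunks))
```

Voxel data tends to have long runs of identical block IDs, so even a generic deflate pass usually shrinks it dramatically before it hits the wire.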
2 puzzles, first one is to get through the lasers, second is to rotate the layers until all lasers line up with the holes. Both are randomly generated puzzles. [IMG]https://i.gyazo.com/feca576facd22674b4c728a78fed4bf1.gif[/IMG] [IMG]https://i.gyazo.com/f2ea2632de7eb0a0245ec43e41f00a5b.gif[/IMG]
[QUOTE=fewes;52237805]I rewrote my ray bounce code to minimize redundant ray casting and speed up baking. To make it easier to debug I visualized the individual rays which turned out pretty neat in itself: [img]http://puu.sh/vSh5G.gif[/img] In the end I managed to speed up baking by about 20x.[/QUOTE] That is so fucking cool. Have you experimented with any dynamic light sources yet? Maybe you could make something raytrace in a compute shader and have it be ultra speedy to calculate, that would be really cool for character torches :)
I feel quite bad. Haven't really coded anything in months, but worse than that, I don't really have any ideas or projects I want to work on. Since it is completely a hobby, without something that grabs my interest I find it much more difficult to focus. Fair enough, I had a difficult year work-wise with some very tough exams, but now that they are over I just find myself not caring about working on anything. It doesn't help that I just upgraded my desktop and laptop and have to pretty much piece together my work environment from scratch. But that is just an excuse. Anyway, I really just wanted to bitch; I'm sure something will get me back to work eventually. If nothing else, this thread is always a great source of inspiration.
fml [IMG]https://puu.sh/vSI7n/d30b2c19e7.png[/IMG] This suddenly happens on my laptop machine, after pulling changes that were absolutely fine on my desktop machine and also completely unrelated to logical device creation - or so I thought, seems like I was wrong.
[IMG]http://i.imgur.com/e1Xm4MT.png[/IMG]
[QUOTE=Fourier;52239903][IMG]http://i.imgur.com/e1Xm4MT.png[/IMG][/QUOTE] Cool, are you making a sort of SpaceBuild style game?
[QUOTE=JWki;52239865]fml [IMG]https://puu.sh/vSI7n/d30b2c19e7.png[/IMG] This suddenly happens on my laptop machine, after pulling changes that were absolutely fine on my desktop machine and also completely unrelated to logical device creation - or so I thought, seems like I was wrong.[/QUOTE] Question: how do you have your validation layers set up? I had my own callbacks and such written, but they rarely worked. Setting VK_LAYER_PATH to the layers I built locally and VK_INSTANCE_LAYERS to the core stuff ended up giving me tons of information that my callbacks were missing. I can't quite figure out why that is, either, but I'd prefer not to rely on environment variables to enable/disable debugging. I've gotten similarly fucky debugging: it only happened after I updated the drivers on my work computer's K1200, so I just rolled back. The workstation cards kinda act weird, though.

I've been forcing myself to go back and read about data structures and algorithms, and I'm still shocked at how far I've gone without them. I've been trying to implement the sorting algorithms as I go, too. Comb sort is the neatest I've learned about so far, and Skip sort is by far the one I am having the hardest time implementing.

Also, I officially applied to the university I'll be attending next fall. Unfortunately, despite my experience (kinda) in-industry, I still have to take the base prerequisites. I [I]think[/I] this community college uses C# instead of Java and I really hope that's true, because I have a mostly unjustified disgust of Java.
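For anyone curious, comb sort is just bubble sort with a gap that shrinks each pass, which moves small values near the end ("turtles") forward quickly. A minimal toy sketch, not taken from any particular implementation:

```python
def comb_sort(items):
    """Bubble sort with a shrinking comparison gap; returns a new sorted list."""
    items = list(items)
    gap = len(items)
    shrink = 1.3  # the commonly cited shrink factor
    swapped = True
    while gap > 1 or swapped:
        gap = max(1, int(gap / shrink))
        swapped = False
        for i in range(len(items) - gap):
            if items[i] > items[i + gap]:
                # compare elements `gap` apart instead of adjacent ones
                items[i], items[i + gap] = items[i + gap], items[i]
                swapped = True
    return items
```

Once the gap shrinks to 1 it degenerates into plain bubble sort, but by then the list is nearly ordered, so the final passes are cheap.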
[QUOTE=paindoc;52240780]question: How do you have your validation layers setup? I had my own callbacks and such written, but they rarely worked. Setting VK_LAYER_PATH to the layers I built locally and VK_INSTANCE_LAYERS to the core stuff ended up giving me tons of information that my callbacks were missing. I can't quite figure out why that is, either, but I'd prefer to not rely on environment variables to enable/disable debugging. I've gotten similarly fucky debugging though: it only happened after I updated the drivers on my work computer's K1200 though, so I just rolled back. The workstation cards kinda act weird, though. I've been forcing myself to go back and read about data structures and algorithms, and I'm still shocked at how far I've gone without them. I've been trying to implement the sorting algorithms as I go, too. Comb sort is the neatest I've learned about so far, and Skip sort is by far the one I am having the hardest time implementing. Also officially applied to the university I'll be attending next fall. Unfortunately, despite my experience (kinda) in-industry I still have to take the base prerequisites. I [I]think[/I] this community college uses C# instead of Java and I really hope that's true because I have a mostly unjustified disgust of Java[/QUOTE] I just use what comes with the LunarG SDK pretty much, I have no environment variables set besides VK_SDK_PATH pointing to the most recent SDK installation folder, and VULKAN_SDK which does the same - I think one of the two is written by my build tools to be able to find the SDK but I'm not actually sure if that's the case. In code, I just do the bare minimum of what vulkan-tutorial.com does in the initial chapter really. It seems to give me all of the output you'd expect for a very basic validation layer (no API dump or anything). The issue I have isn't present on my desktop at all which is interesting. 
I seem to have fucked up something that's only a fuck-up on my NVIDIA driver but not my AMD driver. Regarding data structures and algorithms, I feel that's the most important thing to learn as a programmer, and I still have to look up way too much in that regard. It's also one of the lecture paths I wish I had put more effort into at uni. They seem less funky than fancy design patterns, but I find them to be so much more important and useful. Sure, nobody needs to keep a dozen sorting algorithms in their head, but knowing that they exist and in what situations they can be applied is definitely something that makes you a better programmer. Also, there's no unjustified disgust towards Java; it's always completely justified.
1 month from now I'll graduate with a 3-year BSc in something that I lost interest in, am bad at, and that is completely unrelated to programming. I've managed to re-enroll, so next fall I'll try my hand at a second major, this time in applied computer science. I'm already technically halfway done and can finish a 4-year BSc in 3 more years (or a 3-year in 2). Although I hadn't taken any classes in the field since early 2012, I've been studying on my own time ever since: learning C++, OpenGL (from the fixed-function pipeline up to 4.4), building and using external libraries, and, most important IMO, source control. There is so much more that I want to learn. Probably the biggest reason my longest-running project has been going for nearly 3 years is that I frequently find out about different and better ways of doing things, involving techniques I've never learned of before. My only concern is that I don't burn out this time around.
[QUOTE=Karmah;52241511]1 month from now I'll graduate with a 3-year BSc in something that I lost interest in, am bad at, and that is completely unrelated to programming. I've managed to re-enroll, so next fall I'll try my hand at a second major, this time in applied computer science. I'm already technically halfway done and can finish a 4-year BSc in 3 more years (or a 3-year in 2). Although I hadn't taken any classes in the field since early 2012, I've been studying on my own time ever since: learning C++, OpenGL (from the fixed-function pipeline up to 4.4), building and using external libraries, and, most important IMO, source control. There is so much more that I want to learn. Probably the biggest reason my longest-running project has been going for nearly 3 years is that I frequently find out about different and better ways of doing things, involving techniques I've never learned of before. My only concern is that I don't burn out, I'm going to have a very very busy schedule as I also will be moving out of the city, meaning I have to drive in to attend lectures as well as work :suicide:[/QUOTE] Be really careful with work/school stuff: I only took one course alongside work when I did this, and even at ~30hrs a week it was pretty tough to manage. The mindset change between work and school was sometimes a challenge, too, because I had gotten out of the habit of simple things like turning homework in and double-checking deadlines (despite doing the work). It's also really easy to get burnt out on personal projects when you're this busy; and if work involves programming, and school is programming, going home to program more really won't feel appealing (in my experience). Careful time management will be the key. How many credit-hours per quarter/semester do you plan on taking?
[quote]Probably the biggest reason my longest-running project has been going for nearly 3 years is that I frequently find out about different and better ways of doing things, involving techniques I've never learned of before.[/quote] Despite the (oft-misquoted) thing about premature optimization, I enjoy learning about these new things so much, too, that I often find myself delaying progress on content for the sake of trying the new thing I just learned. Doesn't happen at work, though. I'm just trying to get this damn app done :v
[QUOTE=Legend286;52239596]That is so fucking cool. Have you experimented with any dynamic light sources yet? Maybe you could make something raytrace in a compute shader and have it be ultra speedy to calculate, that would be really cool for character torches :)[/QUOTE] My guess is that there wouldn't be a simple way to do this. Ray tracing is something better suited to CPUs than a GPU, since it involves a decent amount of calculation (but who knows, maybe it's fine in 2D?). I remember learning about a radiosity method that breaks a 3D scene into patches, calculates the light going from each patch to every other patch, and then computes the steady state (by solving a big linear system). [url]https://groups.csail.mit.edu/graphics/classes/6.837/F03/lectures/18_radiosity.pdf[/url] I'd imagine that'd be more amenable to GPUs, but I dunno what you'd do about faking sun shafts.
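To make the "big linear system" concrete: with emission E, reflectance rho, and form factors F, each patch's radiosity satisfies B = E + rho * (F B), which you can solve by fixed-point iteration instead of inverting the matrix. A pure-Python toy sketch (the numbers in any real use would come from actual scene geometry):

```python
def radiosity(E, rho, F, iters=100):
    """Fixed-point iteration for B[i] = E[i] + rho[i] * sum_j F[i][j] * B[j].

    E: per-patch emission, rho: per-patch reflectance (< 1),
    F: form-factor matrix (row i = fraction of patch i's view taken by patch j).
    Converges when the rows of diag(rho) @ F have spectral radius < 1.
    """
    n = len(E)
    B = list(E)  # start from the emitted light only
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B
```

Each iteration is an independent dot product per patch, which is exactly the kind of work that maps well onto a GPU.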
[QUOTE=Ziks;52240070]Cool, are you making a sort of SpaceBuild style game?[/QUOTE] It's more of a survival game, but yes, now that you say it, the electric system might have some similarities to SpaceBuild :happy:. Though it's probably been 10 years since I played SpaceBuild/Garry's Mod, hmm... Also, another, clearer picture: [IMG]https://i2.wp.com/floatlands.net/wp-content/uploads/2017/05/electric_system_lowpoly_floatlands.png[/IMG]
[QUOTE=DoctorSalt;52242026]My guess is that there wouldn't be a simple way to do this. Ray tracing is something better for CPUs to do than a GPU since it involves a decent amount of calculation (but who knows, maybe it's fine in 2D?). I remember learning about a Radiosity method that breaks a 3D scene in patches and calculates the light going from a patch to any other patch and calculating the steady state for that (by solving a big linear equation). [url]https://groups.csail.mit.edu/graphics/classes/6.837/F03/lectures/18_radiosity.pdf[/url] I'd imagine that'd be more amenable to GPUs but I dunno what you'd do about faking sun shafts.[/QUOTE] Ray tracing is much better suited for GPUs since it's extremely parallelizable.
[QUOTE=Darwin226;52245562]Ray tracing is much better suited for GPUs since it's extremely parallelizable.[/QUOTE] It's seemingly one of the best applications of GPGPU stuff (and/or improved rendering techniques), and it's one of the neatest demos that comes with the CUDA runtime. That being said, I'm not at all sure how I'd do it with OpenGL. Compute shaders are ideal, I imagine, but how would you get data about the scene geometry (for intersection tests) to the shader? That might cause some expensive memory transfers if you have frequently updated geometry. I imagine register pressure will come up, too. And/or thread divergence. Both were big problems that I experienced when I tried to get my GPU noise stuff going, because while that was also an embarrassingly parallel problem, most of the base material to draw from is written with CPUs in mind.

[editline]edited[/editline]

Oh, I see the intended application now. I could see that being fairly simple, since it's "2D" geometry mostly and doesn't seem to change often.
[QUOTE=paindoc;52246223]It's seemingly one of the best applications of GPGPU stuff (and/or improved rendering techniques), and it's one of the neatest demos that comes with the CUDA runtime. That being said, I'm not at all sure how I'd do it with OpenGL. Compute shaders are ideal, I imagine, but how would you get data about the scene geometry (for intersection tests) to the shader? That might cause some expensive memory transfers if you have frequently updated geometry. I imagine register pressure will come up, too. And/or thread divergence. Both were big problems that I experienced when I tried to get my GPU noise stuff going, because while that was also an embarrassingly parallel problem, most of the base material to draw from is written with CPUs in mind.

[editline]edited[/editline]

Oh, I see the intended application now. I could see that being fairly simple, since it's "2D" geometry mostly and doesn't seem to change often.[/QUOTE] I mean, you have to transfer your geometry to the GPU with any kind of rendering. This is even cheaper, since you can describe things like perfect spheres with just a few values instead of transferring the whole mesh.
[QUOTE=Darwin226;52246508]I mean, you have to transfer your geometry to the GPU with any kind of rendering. This is even cheaper, since you can describe things like perfect spheres with just a few values instead of transferring the whole mesh.[/QUOTE] Oh, right. You can do functionally defined modelling this way. Duh.
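That "few values" point in action: an analytic sphere is just a center and a radius, and the standard ray-sphere test fits in a handful of lines. Here is a pure-Python stand-in for what each GPU thread would compute (assuming the ray direction is normalized):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None on a miss.

    origin/direction/center are (x, y, z) tuples; direction must be unit length,
    so the quadratic's leading coefficient a is 1 and can be dropped.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two intersection roots
    return t if t > 0 else None  # hits behind the origin don't count
```

Four floats per sphere versus thousands of triangle vertices for a tessellated mesh is exactly why spheres are the classic first primitive in GPU ray tracers.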
I've been working with Python for months now and I'm moving out of Flask and into pure Python scripting. I still have yet to find a definition of __init__() that makes sense to me and makes me want to use it. Can someone perhaps relate it to Java?
[QUOTE=Adelle Zhu;52247530]I've been working with Python for months now and I'm moving out of Flask and into pure Python scripting. I still have yet to find a definition of __init__() that makes sense to me and makes me want to use it. Can someone perhaps relate it to Java?[/QUOTE] __init__ is kind of like a constructor: it's a good place to initialize your object's attributes.

[code]
class Thing:
    this_is_a_static_class_variable = 0

    def __init__(self):
        self.this_is_local_to_this_instance = 1
[/code]
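To put it directly in Java terms: __init__ is the constructor body, and self plays the role of this. A small side-by-side sketch (the Point class is just an illustration):

```python
# Java:                           Python:
#   class Point {
#       int x, y;
#       Point(int x, int y) {     #   def __init__(self, x, y):
#           this.x = x;           #       self.x = x
#           this.y = y;           #       self.y = y
#       }
#   }

class Point:
    def __init__(self, x, y):
        # Runs right after the object is allocated, exactly like a
        # Java constructor body; 'self' is the new instance.
        self.x = x
        self.y = y

p = Point(3, 4)  # calling the class invokes __init__ with the new object as self
```

The one real difference from Java: attributes don't need to be declared ahead of time, so assigning them inside __init__ is what gives every instance its fields.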
I rewrote a large part of my portal asset and it now runs ~even faster~ so that's cool. Particularly the physics, which have been totally rewritten. I also had to do a bunch of dirty shit with reflection to get Unity to be a good boy, which was a pain for auto-setup for users. I do have an issue that I can't seem to solve, primarily because I cannot reproduce it on my machine: realtime lighting is really fucked up on some people's machines. I'm rendering utilizing camera.rect, which effectively makes the camera render to only part of its render destination (this is used primarily for minimaps and the like). Does anybody know what could cause this? It happens with both forward and deferred rendering. [IMG]http://i.imgur.com/n9Z2mUN.png[/IMG]