Messing around waiting for the Scriptable Render Pipeline.
https://my.mixtape.moe/nevrgi.mp4
I played Kingdom Come: Deliverance for like 20 hours straight, so Wednesday was not a productive day. In the next 24 hours I have to do triangle picking on a sphere in OpenGL, implement Naive Bayes in MATLAB, and, worst of all, explain the math in a LaTeX-style paper.
Turns out my weird flickering issue was due to buggy int/uint casting in my shaders. It was not a tremendous feeling to discover that hours of being stuck came down to casting between types in an order that didn't translate well in the generated SPIR-V.
Now getting scenes like this though:
https://i.imgur.com/uUk2ZM5.png
to-do is:
fix the textures for roughness and metallic-ness(?) being all fucked up
improve the performance of the compute shaders, since the two heaviest ones take up to 4ms a piece (D:)
shadows!
skyboxes and global lighting
Still performing quite well for 4096 lights though!
Unity does have a great C# raycast library you can dig into to get an overview before you make your own from scratch. The theory is the same.
https://docs.unity3d.com/ScriptReference/Plane.Raycast.html
https://docs.unity3d.com/ScriptReference/Physics.Raycast.html
You could also do something like Raytracing: https://github.com/LWJGL/lwjgl3-wiki/wiki/2.6.1.-Ray-tracing-with-OpenGL-Compute-Shaders-%28Part-I%29
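Since the theory really is the same everywhere: the core of something like Plane.Raycast is a single dot-product test. A minimal sketch of ray-plane intersection (the types and names here are my own, not Unity's):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray-plane intersection: plane given by its normal and any point on it.
// On a hit, writes the distance t along the ray and returns true.
bool raycastPlane(Vec3 origin, Vec3 dir, Vec3 planeNormal, Vec3 planePoint, float &t) {
    float denom = dot(planeNormal, dir);
    if (std::fabs(denom) < 1e-6f) return false;  // ray parallel to plane
    t = dot(planeNormal, sub(planePoint, origin)) / denom;
    return t >= 0.0f;                            // only count hits in front of the ray
}
```

Triangle picking is the same idea plus a containment test on the hit point (or Möller-Trumbore, which folds both steps together).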
https://files.facepunch.com/forum/upload/1857/1a40a2a5-453f-4fa5-b9fb-0adbca1dd933/image.png
https://files.facepunch.com/forum/upload/1857/a54c809a-8842-4c2c-b743-02c1b09e9134/image.png
Screenshots of our upcoming game Mosaic that we are currently in crunch mode trying to get ready for Day of the Devs @ GDC
Haven't posted anything from this in a good while.
I just spent a bunch of time making my terminal look better because I apparently value my time very little
https://files.facepunch.com/forum/upload/111926/d2a799ad-36f2-45b1-867d-dcbd08739da3/image.png
Thanks, I'll look at that! I'll admit I've only had a brief introduction to Unity. I generally get a little put off by graphical editors when it comes to code, but from what I understand Unity can be scripted in C#, while Unreal uses these things called Blueprints.
Ah, both of them involve a good amount of programming. You can code in C++ with Unreal and in C# with Unity. The Blueprints in Unreal are mostly for non-programmers. Unity doesn't provide an out-of-the-box visual scripting solution yet, but it has many plugins for it.
I also recommend SFML2 for C++ since object oriented programming is the new thing of the century ;)
Very nice my friend. I like the animation very much.
I can't fucking stand app developers who don't make a web version of their app. I don't want to plan an entire vacation on a 6" screen, I want to use a laptop and a spreadsheet
I'm so happy that Nsight supports OpenGL shader editing. I recall not being able to do it last year, or maybe the year before.
It took an hour and a half of debugging plus online research, but I was able to track down a source of light flickering during my bloom pass. It turned out my BRDF specular lighting logic had an edge case that produced isolated one-off pixels that were EXTREMELY bright while none of their neighbors were. Those pixels then got blurred to generate bloom, and for a single frame a wide blob of light would flicker into existence.
I was able to lock in on the pixel using the pixel history tool, watched its value change as I isolated parts of the specular lighting equation, and found I just needed to clamp/saturate my dot products of (normal, lightDirection) and (normal, viewingDirection).
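For anyone hitting the same fireflies: the fix amounts to saturating N·L and N·V before they feed the specular math, since an unclamped negative dot product can flip signs in a denominator and explode a single pixel. A minimal sketch (the visibility-style term here is illustrative, not the poster's exact shader):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Clamp to [0,1], like HLSL/GLSL saturate().
static float saturate(float x) { return std::min(std::max(x, 0.0f), 1.0f); }

// Illustrative specular visibility term: with raw (possibly negative) dot
// products, the denominator can get tiny or change sign and produce huge
// single-pixel values; saturating first keeps the result bounded in [0,1].
float specularVisibility(float NdotL, float NdotV) {
    float nl = saturate(NdotL);
    float nv = saturate(NdotV);
    return nl * nv / std::max(nl + nv, 1e-4f);
}
```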
It works!
https://files.facepunch.com/forum/upload/133270/c5a04fff-833e-4de9-86d9-ea1f196f617a/image.png
THICC
Looks awesome, how is movement going to work? It's point and click? Free movement? Sequences?
https://files.facepunch.com/forum/upload/133252/9d254b0f-44ba-44b7-a45e-b626f3ec44ce/asteroids.png
Pulling data from the NASA API to make a little asteroid visualiser. Asteroid sizes are proportional to each other, but obvs Earth and the Sun are random sizes. The distances are all to-scale, though!
Point and click on PC. Working on the controller movement design.
So I just realized that you guys had recommended physics engines when I was talking about raycasting. Maybe we had a misunderstanding, but I was talking about the rendering method, not collision detection. I'm writing a software renderer.
Did a photo-op for the Apertus Axiom Beta camera today (fully open source 4K cinema camera). My office-mate is developing an SDI module for it, and is going to hold a presentation at the German ministry of education, and he needed some pretty photos for his poster. I will try to get involved in the project via google summer of code if I can.
https://drluke.space/public/apertus/apertus_axiom_beta_25.JPG
Setup: https://my.mixtape.moe/qoqtbr.jpg
Anyone know of any newer-ish techniques for dealing with reflections in real time graphics?
In my last engine I was using a 3 part solution for reflections in my scene: screen-space reflections, falling back on pre-computed parallax corrected cubemaps, falling back on the skybox cubemap.
That pipeline worked, but it relies on manually placed environment maps and on there being a skybox texture. I would like to ditch the skybox altogether in favor of a sky shader.
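For reference, the parallax correction in that middle fallback is cheap: intersect the reflection ray with the probe's proxy volume and re-aim the cubemap fetch from the probe's capture position. A sketch of the AABB-proxy version (my own names, following Lagarde's 2012 technique):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Parallax-corrected cubemap lookup direction: intersect the reflection ray
// from the shaded point with the probe's proxy AABB, then point the cubemap
// fetch from the probe's capture position at that intersection.
Vec3 parallaxCorrect(Vec3 pos, Vec3 reflDir, Vec3 boxMin, Vec3 boxMax, Vec3 probePos) {
    // Slab test: per-axis distance to the AABB faces along reflDir.
    // (IEEE division by zero gives +/-inf, which min/max handles fine.)
    float tx = std::max((boxMax.x - pos.x) / reflDir.x, (boxMin.x - pos.x) / reflDir.x);
    float ty = std::max((boxMax.y - pos.y) / reflDir.y, (boxMin.y - pos.y) / reflDir.y);
    float tz = std::max((boxMax.z - pos.z) / reflDir.z, (boxMin.z - pos.z) / reflDir.z);
    float dist = std::min(std::min(tx, ty), tz);  // exit point of the box
    Vec3 hit = { pos.x + reflDir.x * dist, pos.y + reflDir.y * dist, pos.z + reflDir.z * dist };
    // Un-normalized is fine for a cubemap fetch.
    return { hit.x - probePos.x, hit.y - probePos.y, hit.z - probePos.z };
}
```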
They're pretty bog standard, why would you have to port them? I'm sure you could find an open source premade version of it if you did not want to implement it yourself.
Ah nah, I meant I did it before and could reuse my old code, just looking to see if anyone knows any better techniques. IIRC parallax-corrected cubemaps have been around for a while (2012?)
I've managed to fix up my Global Illumination model. I'm still using Radiance Hints, but I basically completely reimplemented it from scratch instead of porting my old engine's one.
For starters, I now properly encode/decode the spherical harmonic values. My previous implementation got this wrong, and it resulted in a chain of bad math trying to amplify the result.
https://i.imgur.com/NoYyniE.png
https://i.imgur.com/4mHmnUn.png
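The encode/decode pair is the easy part to get subtly wrong: the injection and evaluation sides have to agree on the exact same basis constants. A minimal 2-band (4-coefficient) sketch of the kind of round trip a radiance-hints implementation depends on (my own illustration, not the poster's code):

```cpp
#include <cassert>
#include <cmath>

struct SH4 { float c[4]; };

// Standard SH basis constants for bands l=0 and l=1.
static const float Y0 = 0.282095f;  // sqrt(1 / (4*pi))
static const float Y1 = 0.488603f;  // sqrt(3 / (4*pi))

// Encode: project a single directional radiance sample onto the basis.
// (dx, dy, dz) must be a unit direction.
SH4 shEncode(float dx, float dy, float dz, float intensity) {
    return { { Y0 * intensity,
               Y1 * dy * intensity,
               Y1 * dz * intensity,
               Y1 * dx * intensity } };
}

// Decode: dot the coefficients with the basis evaluated in a direction.
float shEvaluate(const SH4 &sh, float dx, float dy, float dz) {
    return sh.c[0] * Y0 + sh.c[1] * Y1 * dy + sh.c[2] * Y1 * dz + sh.c[3] * Y1 * dx;
}
```

If either side uses mismatched constants (or a different channel order), you get exactly the kind of dim, wrong result that tempts you to bolt on an amplification factor.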
Secondly, to optimize, I shrunk the volumetric texture holding this light-bounce data down to fit the nearer half of the viewing frustum. Too big a 3D texture really hurts performance, and too small starts to look bad. Previously I tried to mitigate this by cascading them (like directional shadow maps), but the performance hit was too big, and at the time I couldn't blend them nicely (though I had also messed up the encoding, as noted above). This time, I've set the 3D texture to clamp-to-border with a linear filter.
This means that further away objects can still get some (probably wrong) indirect lighting, but it will suffice for outdoor scenes with objects far away.
(linked images to not plug up the page too much)
Scene from far away versus same scene but with GI disabled versus same scene but with actual gi region filled
I also redid screen-space reflections using more physically based math, but SSR on its own looks bad without something to fall back on, which I will work on in the future.
https://files.facepunch.com/forum/upload/106992/0c023196-4a82-4cf8-b27c-3d7a7418ce11/pomf.mkv
Finally got around to implementing a shitty OBJ loader. It doesn't read textures from the file, only extracts vertex data and calculates normals.
https://www.youtube.com/watch?v=Hyvi4qH-iZQ
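The vertex-extraction core of an OBJ loader really is small. A minimal sketch that handles only `v` positions and triangulated `f` faces, ignoring textures and normals like the loader above (my own illustration):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

struct ObjMesh {
    std::vector<float> positions;   // x,y,z triples
    std::vector<unsigned> indices;  // zero-based triangle indices
};

// Parse only "v x y z" and triangulated "f a b c" records. Face entries can
// look like "a/b/c" (position/texcoord/normal), so read up to the first '/'.
ObjMesh parseObj(const std::string &text) {
    ObjMesh mesh;
    std::istringstream lines(text);
    std::string line;
    while (std::getline(lines, line)) {
        std::istringstream ls(line);
        std::string tag;
        ls >> tag;
        if (tag == "v") {
            float x, y, z;
            ls >> x >> y >> z;
            mesh.positions.insert(mesh.positions.end(), {x, y, z});
        } else if (tag == "f") {
            std::string vert;
            while (ls >> vert)  // OBJ indices are 1-based; convert to 0-based
                mesh.indices.push_back(std::stoul(vert.substr(0, vert.find('/'))) - 1);
        }
    }
    return mesh;
}
```

Real files also need negative (relative) indices and quad faces handled, but this covers the common triangulated-export case.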
If you start trying to load MTL files, it's really nice to support the PBR "extension" to that format for more varied texture maps: Exocortex | Extending Wavefront MTL for Physically
PBR is fun! I've still broken mine a bit: my specular reflections are way too shiny, and my roughness maps are still acting oddly. It's crazy how much better it all looks compared to my old attempts, though, and I've not even touched shadows or AO or bloom/HDR yet either lol
This is my next step, now that I have a couple thousand dynamic lights rendering. I'm going to be making some more improvements though, to get my dynamic light count even higher and to use a better sorting method. Could you use compute shaders to offload some of your calculations? Time and time again, AMD has found that most devs leave the compute capabilities of GPUs drastically underutilized. In many cases using them can be super beneficial, since the data is already where it needs to be and you're just processing it a bit.
Not sure how to handle the synchronization in OpenGL. In Vulkan it's a bit "easier", if only because you get such explicit control over it: memory barriers to protect resources, and semaphores so the device can signal precisely when it's safe for the onscreen rendering pass to start reading what the compute pass generated.
Good news everyone! MoltenVK is now open-source. This means that you can now develop graphics applications using Vulkan and get them working on all three major platforms without having to pay any licensing fees. I've used MoltenVK in the past, and it's really quite nice. Downsides are:
Shader binary code has to be converted before being used by the MoltenVK library, still
No geometry shader support - iirc Metal just doesn't support these at all
Tessellation control and tessellation evaluation shaders not currently supported, but are at least WIP
There's also work to get Vulkan running over DX12, too! I'm ridiculously excited, because the Vulkan Portability Initiative is also well underway, with the goal of nearly entirely replacing GLES! For once, being one of (if not the only) dedicated Vulkan devs actively posting in here feels less like a dumb choice, lol. This has mostly been led by Nvidia and Valve it seems, so it shouldn't lose steam anytime soon
Article summarizing the Khronos press release here: https://www.phoronix.com/scan.php?page=article&item=vulkan-on-mac&num=1
now someone please hire me or at least help me be treated less like a disposable code monkey at my current job ;~;
I've never looked into compute shaders yet, so idk how they can be used. I wanna avoid feature creep this time around and just get the essentials implemented and well documented first. This whole thing is just for fun, but I'm hoping to finish the "alpha" version in the next couple of weeks and pull the changes over to my main branch.
Work for me, I'll pay you 5 dollars an hour to deal with support tickets containing strongly-worded messages from angry Russians
Let's swap: you can deal with FORTRAN models written by NASA scientists in the '70s after the Apollo missions, and legacy code that tries to delete passed operators and causes a memory leak
I used to have whiskey in my desk lol
A lot of interesting ways, actually. There are a number of examples of using them to generate irradiance maps and BRDF LUTs at runtime, instead of using offline tools ahead of time. Compute shaders are a little different conceptually though, and some of the challenges you have to tackle in writing performant compute code for GPUs really highlights just how fucking weird they (and the SIMT model) are compared to conventional CPU programming paradigms and modes of thought.
I don't feature creep, I just creep along slowly as shit since I enjoy finding the best way to do things and really refining my code and such
In case anyone who reads these posts is interested, I just found out there is an augmented version of the Global Illumination model I was following.
I was using Radiance Hints, a technique published in 2011. In 2014 the same people released a newer technique called Radiance Caching using Chrominance Compression. From what I can tell so far (only lightly skimmed through it), it works off of radiance hints but gives better results and is more efficient.
Decided not to use .NET after all (if only C# had C++ templates!). So now I'm trying pure C++ with OpenGL + GLFW. Can't seem to get the basic triangle working though; I don't have a physical Linux machine to dev on, so I'm running a Hyper-V VM =P