This is probably more abstract than the usual Facepunch modelling thread, but I’m teaching myself to model visualizers for concert motion graphics and I made this cool thing with Blender:
If you look closely, the orb reacts to different frequency ranges. It pulses in size with the low-frequency kick, gets "wavier" with midrange content, and high frequencies like hi-hats and cymbals make it "spikier". This was a Blender render, but my next venture is getting visualizers like this running in real time in Unreal Engine. I think it might be possible with morph targets, since you could drive the morph target weights from the audio each frame. Does anyone have experience with morph targets?
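For anyone curious, the band-splitting part is the same idea whether you bake keyframes in Blender or feed morph target weights in Unreal: FFT a short audio chunk, sum the energy in a few frequency ranges, and use those three numbers as your "pump", "wave", and "spike" amounts. Here's a minimal sketch of that step in Python with NumPy (the band cutoffs and the three-band split are just my assumptions, not anything engine-specific):

```python
import numpy as np

def band_energies(samples, sample_rate,
                  bands=((20, 150), (150, 2000), (2000, 12000))):
    """Return the RMS spectral energy of a mono audio chunk in each band.

    bands are (low_hz, high_hz) pairs -- here roughly kick / mids / hats,
    chosen arbitrarily for illustration.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    energies = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        energies.append(float(np.sqrt(np.mean(spectrum[mask] ** 2))))
    return energies

# Demo: a 60 Hz "kick" sine should light up the low band
# far more than the mid or high bands.
rate = 44100
t = np.arange(2048) / rate
low, mid, high = band_energies(np.sin(2 * np.pi * 60.0 * t), rate)
```

From there, each frame you'd map `low` to the orb's scale, `mid` to the wave displacement strength, and `high` to the spike amount (after some smoothing so it doesn't flicker).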
Do any of you mess around with modelling visualizers and getting things to react to music? If you do please share! Do you have any tips or tricks?