Metal Gear Solid 5 animations

I’m researching Metal Gear Solid 5 animations. They have a unique and interesting system. I was able to read all position/rotation data for all frames/bones and unpack the rotations to proper quaternions. They were packed in a weird way I’d never seen before.
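For reference, the most common bit-packed quaternion scheme in games is “smallest three”: drop the largest component (it can be recovered from the unit-length constraint) and quantize the other three. MGSV’s actual layout is different and isn’t shown here, so the 32-bit layout, 10-bit fields, and component order below are an assumed example of the general technique, not the game’s format:

```python
import math

def unpack_smallest_three(packed: int) -> tuple:
    """Unpack a 32-bit 'smallest three' quaternion: the top 2 bits pick
    which component was dropped, three 10-bit fields hold the rest.
    Illustrative layout only -- NOT the MGSV packing."""
    scale = 1.0 / math.sqrt(2.0)  # remaining components lie in [-1/sqrt(2), 1/sqrt(2)]
    largest_index = packed >> 30
    comps = []
    for i in range(3):
        field = (packed >> (i * 10)) & 0x3FF        # 10-bit unsigned value
        comps.append((field / 1023.0 * 2.0 - 1.0) * scale)
    # Recover the dropped component from |q| = 1
    largest = math.sqrt(max(0.0, 1.0 - sum(c * c for c in comps)))
    comps.insert(largest_index, largest)
    return tuple(comps)  # component order (x, y, z, w) is format-dependent
```

The dropped component is always reconstructed as non-negative; packers exploit the fact that q and −q describe the same rotation, so they can always negate the quaternion before packing.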

Animations are currently exported to SMD, with IK helpers to create constraints.

Tool beta published here:
It works on all human (Ground Zeroes & Phantom Pain) in-game (MTAR) skeletal animations now.

This video was recorded during the quaternion research. Changing the sign of one component made Snake (and everyone else) walk in a strange way. I also have a video of finding Paz this way; it’s pathetic.

It’s awesome seeing people doing modding research on MGSV. Keep it up!

This is another video demonstrating that they have some kind of physics affecting the animation. You can see that when random data is applied to the model origin’s Y, they start jumping up and down, but their legs act naturally. This made me think Euphoria is used here, but I have no proof of that, so maybe they have their own kind of system.

Looks like simple IK to me.

Pretty much. He’s yanking the root around, judging by what he’s describing.

OP, can you elaborate a bit on their animation system? You said it’s interesting; I’d like to hear more about what makes it special.

I might get the terms wrong, but this is how it looks to me. Before this I reversed animations for three other games: Overwatch and Black Ops III (each with its own system) and BioShock Infinite (which uses Morpheme). In all of them, rotations/translations are described for every bone from the root to the fingers, all behaving like a simple rigged skeleton.

Now what I see here is different.

For example, here the position and rotation of Snake’s palm are defined independently in the animation. In this video I’m moving the Y of his left palm a bit up and down. I’m changing the value recorded inside the animation, and this is the only value I’m changing. But you can see how his elbow moves according to his body physics. The elbow position is not defined in the animation; it seems it’s calculated on the fly based on the positions/rotations of his shoulder and palm.
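An elbow derived on the fly from the shoulder and palm alone is exactly what a two-bone analytic IK step produces. A minimal in-plane sketch using the law of cosines (bone lengths and angle conventions are illustrative, not taken from the game):

```python
import math

def two_bone_ik(upper_len: float, lower_len: float, target_dist: float) -> tuple:
    """Given upper-arm and forearm lengths and the shoulder-to-palm
    distance, return (shoulder_angle, elbow_angle) in radians.
    elbow_angle is the interior angle between the two bones
    (pi = arm fully straight)."""
    # Clamp so an out-of-reach target just straightens the arm
    d = max(abs(upper_len - lower_len), min(upper_len + lower_len, target_dist))
    cos_elbow = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Angle between the upper arm and the shoulder->palm axis
    cos_shoulder = (upper_len**2 + d**2 - lower_len**2) / (2 * upper_len * d)
    shoulder = math.acos(max(-1.0, min(1.0, cos_shoulder)))
    return shoulder, elbow
```

The one remaining degree of freedom — which way the elbow points around the shoulder→palm axis — is what the pole vector (or the arm-roll track mentioned below) controls.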

There is an animation track for the arm, but the only thing it does is rotate the whole arm around. It can’t change the positions/rotations of the shoulder or palm. Look at this:

Well, it looks like they actually baked IK data INTO the animation, which is unusual; as I’ve been told, most games apply IK AFTER the animation. Let’s see if it’s possible to convert these strange animations into something usable.

This new video shows it even better. Here I changed the position track for the waist, so it’s clear how 4 IK nodes are baked into the animation itself. His hands/feet try to stay in the same position while his whole body flies back.

I still can’t understand how they solve the IK bones, and without that the animations can’t be exported.


One of the first Snake animations extracted (to SMD). Right now it’s exported with the arms and legs separated from the hands/feet, and after export we have to create 4 IK constraints to make it work (because SMD cannot hold IK constraints). There are two solutions: build a software IK solver into the tool, or write a script for Maya/Blender/Max to create the constraints during export.
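For the first option (a software IK solver in the tool), a simple iterative solver such as FABRIK is one possibility. A minimal, dependency-free sketch — the joint layout and tolerances are arbitrary, and this is not the tool’s actual code:

```python
import math

def _toward(a, b, length):
    """Point at distance `length` from a in the direction of b
    (assumes a and b are distinct points)."""
    t = length / math.dist(a, b)
    return [a[i] + (b[i] - a[i]) * t for i in range(len(a))]

def fabrik(joints, target, tol=1e-4, max_iters=32):
    """Drag a bone chain toward `target` while preserving bone lengths."""
    lengths = [math.dist(a, b) for a, b in zip(joints, joints[1:])]
    pts = [list(p) for p in joints]
    root = list(pts[0])
    if math.dist(root, target) > sum(lengths):
        # Target out of reach: stretch the chain straight toward it
        for i in range(1, len(pts)):
            pts[i] = _toward(pts[i - 1], target, lengths[i - 1])
        return pts
    for _ in range(max_iters):
        # Backward pass: pin the end effector to the target, walk to root
        pts[-1] = list(target)
        for i in range(len(pts) - 2, -1, -1):
            pts[i] = _toward(pts[i + 1], pts[i], lengths[i])
        # Forward pass: pin the root back, walk to the end effector
        pts[0] = root
        for i in range(1, len(pts)):
            pts[i] = _toward(pts[i - 1], pts[i], lengths[i - 1])
        if math.dist(pts[-1], target) < tol:
            break
    return pts
```

A solver like this only yields joint positions; converting them back into per-bone rotations for SMD is an extra step, which is partly why delegating the IK to Maya/Blender/Max constraints is attractive.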


I presume that would be a big pain in the ass to make exported anims compatible with SFM rigs?

Also, what about facial animations?

Before you can do any of that, the rig needs the same number of bones, with the same names, as the animation, or it won’t play correctly.

I haven’t checked them yet, but the way rotations/positions are recorded seems to be the same for all animation types.

You can use a technique such as puppeting in the SFM.

I have done something similar with my Daz Animations Resource Pack (warning: lewd jigglebones), which collates animations from Skyrim, Mass Effect, Dark Messiah of Might and Magic, SBPR, and a few other sources.

The skeletons of the source and target are not required to be the same, but a script does need to be put together that interfaces between them.

If you can get the animations into SMD, with a reference of some sort (doesn’t even need materials, literally just a mesh for reference), and get it compiled, then I can put together a basic puppeteering script that you guys and gals can use to puppet onto Valve Biped. Then you can adjust the script to fit whatever skeleton you want.
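The core of such an interface script is just a bone-name map from the source skeleton to the target one, applied over the animation’s channels. A tiny sketch — the source bone names here are invented placeholders, not the actual MGSV names, and only the ValveBiped names are real Source-engine bones:

```python
# Source-bone -> ValveBiped mapping (source names are hypothetical)
BONE_MAP = {
    "hand_l": "ValveBiped.Bip01_L_Hand",
    "hand_r": "ValveBiped.Bip01_R_Hand",
    "foot_l": "ValveBiped.Bip01_L_Foot",
    "foot_r": "ValveBiped.Bip01_R_Foot",
}

def retarget_channels(channels):
    """channels: {source_bone_name: frame_data}. Drop unmapped bones
    and rename the rest to the target skeleton."""
    return {BONE_MAP[bone]: frames
            for bone, frames in channels.items() if bone in BONE_MAP}
```

A real puppeteering script also has to compensate for differing rest poses and bone orientations, but the mapping pass above is where it starts.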

Over the last few days I was fighting a problem where palm/feet positions were a little off. You can see in the climbing animation that one of his palms detaches from the arm because of a wrong position (the IK solver can’t solve it because the palm is too far away). Now I’ve got it all fixed:


Red is his exact pose dumped directly from game memory.
Yellow is my extracted animation.
You can see that at the beginning and at the end his pose fits exactly now.

The yellow bones sticking out of his shoulders/thighs are his actual arms/legs, not connected to the IK nodes.

The first test with a model. Some vertices in his right hand are weighted wrong (there was a small mistake in a bone name).
You can see the pole vectors for his arms. They are connected manually for now, but there must be a way to script it.
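One common way to script pole-vector placement from a known pose is to push the elbow’s perpendicular offset from the shoulder→wrist axis outward. A generic sketch (assumes the arm is bent — i.e., the elbow is not collinear with shoulder and wrist — and the push distance is arbitrary):

```python
import math

def pole_target(shoulder, elbow, wrist, distance=0.5):
    """Place a pole target along the elbow's bend direction.
    All inputs are 3D points; assumes a non-straight arm."""
    axis = [w - s for s, w in zip(shoulder, wrist)]       # chain axis
    to_elbow = [e - s for s, e in zip(shoulder, elbow)]
    t = (sum(a * b for a, b in zip(axis, to_elbow))
         / sum(a * a for a in axis))                      # projection factor
    proj = [s + a * t for s, a in zip(shoulder, axis)]    # elbow projected onto axis
    perp = [e - p for e, p in zip(elbow, proj)]           # bend direction
    n = math.sqrt(sum(p * p for p in perp))               # zero if arm is straight
    return [e + p / n * distance for e, p in zip(elbow, perp)]
```

Placing the target this way reproduces the bend plane of the dumped pose, so the IK solver picks the same elbow direction the game did.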

And now the whole body.

Very nice work id-daemon!!!

Thanks. As you can see, there are still problems with the animated model. Soon I will publish the test tool, and I hope the experienced animators here will be able to solve them.

Today I ran a test on a big package of 2,400 animations. It’s all unpacked!

Though it looks like not all of them went smoothly. We have problems when his feet are posed backwards. As for the strange effects in transitions between animations, ignore them; they happen because the animations aren’t meant to play back-to-back. Other than that, the tool is ready to be published.



Feet rotation fixed. If anyone wants, I can record a new video.