Thanks for that! I always like using the GUI more than the plain DataTool.
Does anyone know if the latest DataTool works with the latest OW? Just asking because it looks like both were released on the same day, but since it's stuck on initializing, I'm not sure it works.
An updated DataTool has not been released yet. There will be one soon, we're a bit behind on releases.
So much less bother when you don't play the game... unless you want new DLC, just don't update and rip whenever you want!
I've been working on tools. I have models and animations loading in reasonably well.
https://github.com/mayasombra/maya_ow_tools
An updated DataTool has been released
Release Notes:
Added support for Overwatch 1.32.0.1.54011
Added support for Overwatch 1.32.0.1.54052 (latest)
Toolchain: https://ci.appveyor.com/api/buildjobs/u7a6qn5swn01dehn/artifacts/dist%2Ftoolchain-release.zip
TankView: https://ci.appveyor.com/api/buildjobs/u7a6qn5swn01dehn/artifacts/TankView%2Fbin%2FTankView.zip
Other Artifacts: https://ci.appveyor.com/project/yukimono/owlib/builds/21577054/artifacts
You do know that the original Maya importer got updated about a month ago, right? Just making sure.
I didn't. I've been working on this for a few months now, since I wasn't able to contact the original author...
I contacted him about a year ago asking whether or not he was going to update it, and he said no and that I could do it myself. And here we are...
Well, let's see what we can do, then!
His recent commits are a subset of what I've done. It's basically updating some of the reader code to account for the new versioning. His focus seems to be more on maps, where I've been interested in playing with the models.
My code improves on the existing code by importing the bones correctly so they are usable with SEAnim tools. I've identified the bug in the Blender import and told their maintainers, but they didn't seem interested in my fix. So at least Maya users won't have to do the goofy "delete the skeleton, load the SMD, rebind the skeleton" steps. I also improved the performance of the Maya importer; it was taking minutes to load a model, and now takes < 10 seconds, more comparable with the Blender importer.
I have a few commits sitting on my machine I'd like to push. I started building out different ShaderFX graphs for the different Overwatch shaders. I've gotten a workable 44 shader, and am implementing the more common ones. I'm currently writing some tooling to get a catalog of all the different types of shaders that exist and what inputs they use. That will make it easier to build out the skeletons of the shaders and then have people who are better at art make each shader look its best. I'm not particularly good at that part.
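To make the cataloging idea concrete, here's a rough sketch of the grouping pass; the (shader_id, inputs) input format is an assumption for illustration, not the tool's actual data model:

```python
from collections import defaultdict

def catalog_shaders(materials):
    """Group texture-input types by shader ID.

    materials: iterable of (shader_id, [input_type, ...]) pairs,
    e.g. scraped from the extracted material definitions.
    Returns {shader_id: set of every input type seen for that shader}.
    """
    catalog = defaultdict(set)
    for shader_id, inputs in materials:
        catalog[shader_id].update(inputs)
    return dict(catalog)
```

Once every material has been folded in, each shader's entry is the union of inputs it has ever been seen with, which is exactly what you need to stub out a skeleton graph per shader type.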
Any news on getting a working update out? I have NO CLUE how to use CMD and no matter how hard I try, I can't figure out how to use it. I have never been good with code, and I have been doing animations with the models in Blender, but with the new update I haven't been able to download the characters I need. Like I said, I have been trying really hard, watching YouTube tutorials and reading nearly every post on these past four threads. Please don't ask me what I am doing wrong or anything like that, because I don't know. I've tried everything and I really need help!
A little more information would be nice. What update? What are you trying to do? What are you having trouble doing specifically?
So, you want to use CMD to download character models?
Thank you very much for your work!
I checked out both your and Kjasi's updated scripts, and while I also appreciate his work, it has a few shortcomings, namely the slow performance and the still randomly connected textures. I was hoping that he would address the latter if he ever updated his script :|
Anyway, I've been playing around with your script and added support for Redshift materials. It's still a work in progress and a bit of a hack job, but it works so far.
If you're interested in adding support for Redshift, I can send you what I wrote. If you have an Nvidia card, you can also download the Redshift trial here to see for yourself what those materials look like.
Your work on a shader catalogue sounds interesting and I'm looking forward to seeing it! But to be honest, I'm not sure all that work is necessary. From my experience with the OW models, there are so many special cases and setups that it's almost always a better idea to just let the artist deal with those.
The way I'm dealing with this in my Redshift script is to cover all use cases for the diffuse, spec, normal, transparency and emission maps. Every other map that is associated with the material but not explicitly dealt with by the script will be collected in a layeredTexture, which is then connected to a dead slot of the material. The artist is left with a basic working material, and if they're dealing with a more exotic material, the unused textures will be waiting inside the layeredTexture node for them to inspect.
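Roughly, that routing logic can be sketched in plain Python like this (the type names and helper are illustrative; the real script of course builds Maya shading nodes):

```python
# Map types handled explicitly; everything else is destined for the
# layeredTexture fallback. These names are placeholders, not the
# script's actual identifiers.
HANDLED = {"Diffuse", "Spec", "Normal", "Transparency", "Emission"}

def partition_textures(textures):
    """Split (type, path) pairs into explicit material slots and
    leftovers to be collected in the layeredTexture node."""
    slots, leftovers = {}, []
    for ttype, path in textures:
        if ttype in HANDLED and ttype not in slots:
            slots[ttype] = path
        else:
            leftovers.append(path)
    return slots, leftovers
```

Anything that lands in the leftovers list would get wired into the layeredTexture on the material's dead slot for the artist to inspect later.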
Again, thank you very much for your work so far! I hope you continue development on this and eventually get around to map imports :D
But I'm really grateful for what we already have. Working with OW assets in Maya has been made a lot more convenient!
On a side note, a while ago I wrote a thumbnail generator for OBJ and FBX files. I've now added support for OWMDL files as well, which worked really well! I'll be expanding on that and adding an option to make renders instead of playblasts. When it's done, I'll post a link to it here.
Please do send me your work! I've downloaded the Redshift trial and it looks very promising.
I've been debating supporting multiple renderers for the materials. I used Stingray initially because that's what was suggested in the original code, but Arnold seems to have more capability to match what the models need. Having a third potential option makes it worthwhile to make the renderer configurable.
I'll get started on maps after I get the option UI going so the importer can be configurable. Maps have a lot more complexity and would benefit from having selectable options.
I'll post more about the shader grouping idea I have tomorrow.
Redshift is what Blizzard uses for their OW shorts, so it only makes sense that it works well with these materials =)
Yeah, I think supporting third-party renderers is the way to go. Personally, I see no point in using ShaderFX or Stingray other than maybe for viewport presentation.
The more I work on the Redshift script, the messier things are getting. Without knowing what kind of shader I'm dealing with, the correct use of a texture needs to be determined by checking what other textures have been used so far. Since they don't seem to be listed in any particular order, this leads to bits of code in every step checking for one thing or another.
So yeah, having some shader identifier would make things much easier. I guess a workaround for now could be to just sort the keys beforehand so that some textures are handled before others.
Here's my Redshift code so far. I didn't get around to dealing with refractive objects yet and I'm still in the process of identifying special cases.
It takes the same inputs as your Stingray function and it also returns the shading group, so you can just put a switch somewhere to use this function instead.
Your script assigns the wrong UV set to the file nodes with this, though, so I commented out lines 195-201 in import_owmdl.py to fix it. I'm sure you have a better understanding of why that happens and how to fix it properly :X
Here's a list of things it does differently, compared to your Stingray setup:
I pulled the texture-type dictionary from the Blender script. Using strings rather than those numbers makes it easier to figure out what's what. I also tag the file nodes with those strings to help with troubleshooting. The dictionary doesn't need to live inside the Redshift script, of course; I would suggest this approach for all the other materials as well.
If a key is not defined in the dictionary, it will be treated as "Unknown". The associated file node will be marked as Unknown and also have that unassigned key in its name. The idea is to expand the dictionary as these nodes pop up.
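In code, that fallback is just a dictionary lookup with a default; the keys and names below are placeholders, not the actual dictionary from the Blender script:

```python
# Placeholder texture-type dictionary. The real mapping comes from
# the Blender script and translates the numeric keys in the material
# data into readable names.
TEXTURE_TYPES = {
    2903569922: "AlbedoMap",
    378934698: "NormalMap",
}

def texture_type(key):
    """Readable name for a texture key, or 'Unknown' to flag keys
    that should be added to the dictionary."""
    return TEXTURE_TYPES.get(key, "Unknown")
```

Tagging the file node with the returned string (including "Unknown" plus the raw key) is then enough to spot unmapped textures at a glance.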
Any unused or unknown texture is connected to a layeredTexture, which is then connected to the material's (unused) third subsurface scattering layer.
I'm not using the material name (mname) that's passed to the function. Instead, I'm generating an ID based on the first and last few characters of every texture used in a given material. This way I avoid generating duplicate materials that would be identical in every way except the name.
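A rough sketch of that ID scheme (the helper name and character counts are my own choices, not the script's):

```python
def material_id(texture_paths, ends=4):
    """Derive a stable material ID from the textures a material uses,
    so two materials with identical texture sets collapse to one name.

    Takes the first and last `ends` characters of each texture's base
    name; sorting first makes the ID independent of texture order.
    """
    parts = []
    for path in sorted(texture_paths):
        name = path.rsplit("/", 1)[-1].rsplit(".", 1)[0]
        parts.append(name[:ends] + name[-ends:])
    return "MAT_" + "_".join(parts)
```

Because the paths are sorted before hashing them into the name, the same texture set always yields the same ID regardless of the order the textures were listed in.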
I'm not a programmer, but I think I know what an artist wants, so I designed the script around that. Also, this is a very noobish question, but how can I update changes to, say, import_owmat.py without restarting Maya? Saving out the script, unloading the plugin or re-importing import_owmat.py doesn't do the trick :|
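Edit: importlib.reload seems to pick up the saved file without restarting Maya. A minimal sketch, using a throwaway stub module in place of import_owmat:

```python
import importlib
import pathlib
import sys
import tempfile

sys.dont_write_bytecode = True  # skip .pyc caching for this demo

# A throwaway module on disk, standing in for import_owmat.py.
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "owmat_stub.py").write_text("VERSION = 1\n")
sys.path.insert(0, str(tmp))

import owmat_stub
print(owmat_stub.VERSION)  # 1

# "Save your edits", then reload instead of restarting Maya.
(tmp / "owmat_stub.py").write_text("VERSION = 2  # edited\n")
importlib.reload(owmat_stub)
print(owmat_stub.VERSION)  # 2
```

Inside Maya it would just be `importlib.reload(import_owmat)` in the script editor after saving the file.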
Cool, thanks for the code.
What seems to be the problem here?
https://files.facepunch.com/forum/upload/271535/8a1ba30d-696c-4fa2-9f92-74377488d685/Capture.PNG
I used:
datatool "ow location" extract-unlocks "output location" "*|skin=(leagueteam=none)"
and it did the first few (Reaper, Tracer, Torb, Mercy, Hanzo) without a problem, but with Reinhardt that error pops up. I tried it with Ana and currently Ashe, and they're extracting properly. What would cause this? Also, should it be extracting them in random order? I figured it would start with Ana, but it started with Reaper.
I just tested and I got this error as well, I'll get it looked into.
Also, the order it extracts them in is irrelevant; it's not alphabetical, but it's not random either. It's just the order they are defined in.
Ahhh, okay. Also, maybe unrelated to the tool itself, but I noticed that some Mercy GUI images, specifically the ones we see while in a game, are blacked out.
Charity, Zhuque and Sugar Plum are the ones I've noticed so far.
https://files.facepunch.com/forum/upload/271535/13570140-8b6a-4845-a9fa-33c736abc640/anoither.PNG
But the others, like Fortune, seem to be fine.
https://files.facepunch.com/forum/upload/271535/60047fb5-39e4-4387-97ba-393cdbdf84ba/nopr.PNG
Is that a known bug?
I believe after a certain update those icons changed to actually have two frames/layers. If you open one up in Photoshop/GIMP/etc., you'll find the blacked-out version and the original icon.
Good to know!
@Js41637
I extracted all models and Orisa was the only other one that gave me the same error, in case you were wondering.
Thank you! I'll continue working on the Redshift bit then and keep looking for weird special cases.
I don't know anything about this. If it hasn't been posted here, then we probably don't know; we don't really use the tool that much, to be honest.
OK, thanks. I fixed the initial crash, but it was just the first of many; there were a lot more further down. Blizzard have fucked something up with the data of these heroes. It'll take a bit longer to look into.
Is it possible to extract the animations for Overwatch abilities?
Cool. I'm away from my PC this week, so I can't do much integration work on Redshift. I was able to install Redshift on my Mac, but it crashes rather badly when I even instantiate a shader in script. I was hoping it'd be a little better behaved, but oh well.
I'm going to shift gears a bit and work on reading the raw shaders from the dump. I've already figured out their encoding, and am trying to run them in a sandbox.
More interesting to you, I'm also merging Kjasi's last work and trying to load maps into Maya.
That is great news! Don't burn yourself out on the map support, though. And don't worry about the Redshift integration, as it's working fine as it is. I just need to get around to dealing with some of the special cases, but that doesn't require any changes on your end, I don't think.
I was going to make a feature request and while testing that idea, I noticed something odd about your script, which I think should be addressed: the imported models have a scale that is inconsistent with other conventions or importers.
A very common convention in Maya is that one unit equals one centimeter. Even models that were exported from game files usually have a realistic scale. They might be off by a factor of 100 or something, but the numbers are usually very sensible. So, for example, a character might import into Maya with a height of 1.7 or 170 units. That has been the case with Overwatch models as well; Mercy, for example, is about 1.68 units tall when imported with Kjasi's script, and the same is true with the Blender script. However, your models are off by a factor of roughly 0.3937. I couldn't figure out where that discrepancy is coming from, but I think maintaining that convention is important, especially when the scale is off by such an odd number.
Now about that feature request :X
It would be nice if the user could define a global scale offset for the importer. I like to work with 1 unit = 1 cm, which means scaling these models up by a factor of 100. There are a ton of potential issues that can come with scaling, especially when the skinned meshes have a scale of 1 (since their joints are being scaled) while others have a scale of 100.
I tried adding a variable called userScale and multiplying it with the positions of the bones and vertices inside import_owmdl, as well as with the bone radius value. This worked fine and had the desired effect of making the imported models the correct size without applying any scaling to anything. So it would be easy to implement, and since you already have an options menu, that setting could just go in there. Applying that scale to the object coordinates is probably all that needs to be done to make it work with the map importer as well.
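A sketch of what I mean (the helper and constant names are mine, not the script's):

```python
# User-defined global import scale, applied to raw positions as they
# are read, so the transforms on the imported nodes stay at scale 1.
USER_SCALE = 100.0  # 1 unit = 1 cm convention

def scaled(pos, user_scale=USER_SCALE):
    """Scale a raw (x, y, z) position tuple at import time."""
    return tuple(c * user_scale for c in pos)

# e.g. scaled((1.0, 0.5, 0.25)) -> (100.0, 50.0, 25.0)
```

Applying the factor to the vertex, bone and radius values as they're read avoids ever putting a non-unit scale on the skinned meshes themselves.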
Anyway, inside import_owmdl there are only three lines to adjust, if you'd consider implementing it.
The scaling discrepancy is because of the SEAnim files, which have their default units in inches rather than centimeters. The adjustment factor you mention converts from centimeters to inches (1 in / 2.54 cm ≈ 0.3937) so that the animations play correctly. Now that I have an options screen, I agree it's easy enough to make configurable. The original import didn't import the reference poses correctly, so it never got as far as making the animations work, and this wasn't an issue previously. The reason I adjusted the models was so I could use the existing SETools plugin without modification.
I think the proper solution would be to have the OW exporter tools create the SEAnim files with the correct units so this conflict wouldn't exist, but I am willing to make the Maya plugin more configurable as a compromise.
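For reference, the arithmetic behind that 0.3937 factor, as a quick sanity check:

```python
CM_PER_INCH = 2.54

# SEAnim data is effectively in inches, so model positions get
# multiplied by 1/2.54 (inches per centimeter) to match the
# animation skeletons.
INCH_PER_CM = 1 / CM_PER_INCH
print(round(INCH_PER_CM, 4))  # 0.3937
```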
Ah, I see! If making the necessary modifications is too much work or too messy, maybe you should just leave it as it is. Maybe nobody else cares about it anyway. I could always just write a script for myself to scale and freeze everything after the fact.
I wanted to clarify this comment for the benefit of the folks doing the work on the OW tool, and leave this for future reference.
There's a Blender plugin that is recommended for importing SEAnim files. That plugin seems to have originally been targeted for importing data from Call of Duty (CoD).
The original developer notes that they are adjusting the scaling directly in lieu of building a scaling option.
io_anim_seanim/import_seanim.py at master · SE2Dev/io_anim_seanim
This means that a transform of 1 unit length in the model doesn't correspond to 1 unit length in the animation.
The Overwatch import process attempts to deal with that by multiplying by 2.54, so that the division in the SEAnim plugin gets canceled out, and the units between SEAnim and OWMDL match. This is done in
https://github.com/overtools/OWLib/blob/master/TankLib/ExportFormats/SEAnim.cs#L121
So, when we come to the Maya tools: the Maya SEAnim importer doesn't have this scaling behavior, so the animation skeletons Maya sees are 2.54x bigger than the models. In Maya, I resolved this by setting the import units to inches, importing the model, and then flipping back to centimeters. This effectively multiplies the model by 2.54, making it match up with the skeleton. As someone who is a programmer by trade and a hobby artist, I wasn't particularly worried about what the numbers were, just that the data was correct. The comments from @ellowas indicate that it'd be nice to have units that map well to metric; I would like to see that happen as well.
I've opened a bug against the io_anim_seanim project to see if they'd make the importer configurable. If that happens, the hack in SEAnim.cs could be reverted and everything would be much cleaner.
@zingballyhoo any thoughts?
When will DataTool work with the Year of the Pig OW update? :(
Do you need to close the Blizzard launcher when working with the OW data? Keys, please.