Workshop

So, how is S&Box going to handle addon distribution? Is it still going to use the Steam Workshop, or something different, like a Toybox-style system perhaps?

From what I have read, they are not using the Steam Workshop but are making their own system for it. I do not know if this is true, as I have not seen @garry confirm it yet.


I hope S&Box will still use the workshop. What would be a better alternative? The workshop is as good as it can get, I believe, especially because it is so nicely integrated with Steam.

I mean, if they have full control over the workshop then they can deal with issues that arise themselves instead of having to wait for Steam to get involved, and they would be able to prevent certain things from being uploaded if that becomes an issue.


I think they aren’t using it since they want to release s&box on platforms other than Steam too.

IMO the Steam Workshop is far from a perfect solution, or even a “good” one. In my experience it has been buggy at best and infuriating at worst. Filtering sometimes works and sometimes doesn’t. Discoverability is poor. There’s no way to do community spotlights, and it has been challenging to moderate in the past. It’s cluttered with junk which, because of poor categorization, overwhelms genuinely good contributions; half of the workshop right now is abandoned server content packs.

A custom solution would be able to sidestep these issues. You might not even have to give up browsing the workshop on the web if there’s a way to log in to your S&box account from the Facepunch website.


If anyone thinks the workshop is good, try uploading and then updating (or worse, maintaining) a Garry’s Mod addon.

Git is the way to go. Maybe not GitHub, because we’d need space for large files, but maybe a facepunchGit or something like that?


Git isn’t very good at versioning large binary files, in my experience at least. It works, but it doesn’t do meaningful diffing on them, and it starts to really slow down as files get large.

Had a case where I had a repo with a few dozen 20-50 MB binary files, and doing a git status on it took about 5-10 seconds. It was hardly usable. Sure, for content distribution it might be fine, since you can do a shallow clone, but when diffing starts to matter git really doesn’t play well with binary files.

For binary content diffing there are other tools/algorithms/protocols that work fairly well, e.g. rsync et al. with their rolling-checksum deltas. It should be possible to adapt an existing algorithm to diff the content between the player’s storage and a new, updated version.

If nothing else, doing a per-file diff with hash comparisons would go a very long way.

Maybe we should have a special hybrid that uses different algorithms for code and binary files? Something like the sketch below.
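
Rough sketch of what that hybrid could look like, in Python. Nothing here is a real s&box or workshop API; the extension list, directory layout, and function names are all made up for illustration. Per-file SHA-256 hashes decide whether a file changed at all, and the file extension decides whether a text diff or a whole-blob replacement makes sense:

```python
import difflib
import hashlib
from pathlib import Path

# Hypothetical split: these extensions get text diffs, everything else
# is treated as an opaque binary blob.
TEXT_EXTENSIONS = {".lua", ".cs", ".txt", ".json"}

def hash_file(path: Path) -> str:
    """SHA-256 of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def manifest(root: Path) -> dict[str, str]:
    """Map each file path (relative to root) to its content hash."""
    return {
        str(p.relative_to(root)): hash_file(p)
        for p in root.rglob("*")
        if p.is_file()
    }

def plan_update(installed: Path, latest: Path) -> None:
    """Compare two addon trees and print a per-file update plan."""
    old, new = manifest(installed), manifest(latest)

    for rel in sorted(old.keys() - new.keys()):
        print(f"delete  {rel}")

    for rel in sorted(new):
        if old.get(rel) == new[rel]:
            continue  # hashes match: nothing to transfer
        if Path(rel).suffix in TEXT_EXTENSIONS:
            # Code/text: a line-based delta is small and human-readable.
            old_lines = (
                (installed / rel).read_text().splitlines() if rel in old else []
            )
            new_lines = (latest / rel).read_text().splitlines()
            delta = list(difflib.unified_diff(old_lines, new_lines, lineterm=""))
            print(f"patch   {rel} ({len(delta)} diff lines)")
        else:
            # Binary: no useful line diff, just fetch the new blob.
            # (An rsync-style rolling checksum could shrink this further.)
            print(f"replace {rel}")

# Example with made-up paths:
# plan_update(Path("addons/installed/my_addon"), Path("addons/cache/my_addon"))
```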

The workshop or a Facepunch alternative is the only way I can see working well. I don’t know why we’re even mentioning git here; it’s a bit dumb for this usage.

It’s easier to maintain and update addons when the workshop pulls them from your git repo and automatically compiles/saves/zips/compresses them, instead of you having to do all that nonsense manually.

I think what you mean is having a sort of git integration for content creators, so their addon auto-updates when they create a new release. That would be nice and makes a lot more sense.
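
The packaging half of that wouldn’t need to be complicated, either. A rough Python sketch, assuming a hypothetical setup where a webhook from the git host tells the workshop about a new release tag (the repo URL, tag, and output paths are all made up):

```python
import shutil
import subprocess
import tempfile
from pathlib import Path

def addon_name(repo_url: str) -> str:
    """Derive an addon name from its repo URL (made-up convention)."""
    return repo_url.rstrip("/").removesuffix(".git").rsplit("/", 1)[-1]

def package_release(repo_url: str, tag: str, out_dir: Path) -> Path:
    """Fetch one tagged release of an addon repo and zip it for distribution."""
    out_dir.mkdir(parents=True, exist_ok=True)
    with tempfile.TemporaryDirectory() as tmp:
        checkout = Path(tmp) / "addon"
        # Shallow clone of just the release tag: no history, so it's fast.
        subprocess.run(
            ["git", "clone", "--depth", "1", "--branch", tag,
             repo_url, str(checkout)],
            check=True,
        )
        # Don't ship git metadata with the addon.
        shutil.rmtree(checkout / ".git")
        # Produce e.g. builds/my_addon-v1.2.zip for the workshop to serve.
        archive = shutil.make_archive(
            str(out_dir / f"{addon_name(repo_url)}-{tag}"),
            "zip",
            root_dir=checkout,
        )
    return Path(archive)

# Example with a made-up repo and tag:
# package_release("https://github.com/someone/my_addon.git", "v1.2", Path("builds"))
```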


That’s what I mean, but I don’t entirely understand source control and continuous integration, etc. Thanks!


That’s pretty much what Git LFS is: a thin extension over git for versioning large files. It does require a git server that supports LFS, and IIRC there are size caps for LFS storage on GitHub/GitLab, though I might be misremembering.
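
For anyone who hasn’t tried it, the setup in an addon repo is just a few commands (the file patterns here are only examples):

```
git lfs install                        # one-time: enable the LFS filters
git lfs track "*.png" "*.wav" "*.fbx"  # route heavy binary formats through LFS
git add .gitattributes                 # the tracking rules live in this file
git commit -m "Track binary assets with Git LFS"
```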

That could be a great solution, but maybe the workshop itself (and its ‘GitHub/GitLab’ variant) should be hosted by Facepunch to allow bigger size limits.