Server list should show size of content to be downloaded

I don’t think anyone actually hosts servers crucial to a game’s function out of their own home or office, at least not these days.

There are tons of readily available solutions for automatically deploying and managing servers, potentially in multiple locations in the world, all at a fairly reasonable cost. AWS, Azure, Google Cloud, you name it. They also come with the added benefit of much better reliability guarantees, stability, backups, support and scalability.

A distributed system of 5-20 servers will probably be able to serve players all over the world with reasonable latency and at a reasonable price - assuming something like $100-$200 per server per month (which gives you a very beefy machine these days). For a company like Facepunch this probably isn’t an issue, especially considering the potential revenue from S&box.

EDIT:

For example, there’s a nifty calculator for AWS at https://calculator.aws/#/createCalculator

Putting in:
CPUs: 32
Memory: 64GiB
Storage: 500GiB
Number of Instances: 10

That puts you at around $5,000 per month for the whole thing. While out of reach for most individual users, this is well within the budget of a company with even moderate revenue, let alone one like Facepunch. And these servers are probably overkill anyway.
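
As a quick sanity check on those numbers (all figures are the rough estimates above, not real AWS pricing):

```python
# All numbers are the post's own rough estimates, not current AWS quotes.
instances = 10
monthly_total = 5_000                      # USD/month for the whole fleet
per_instance = monthly_total / instances   # ~$500/month per 32-vCPU, 64 GiB box
per_year = monthly_total * 12              # ~$60,000/year all-in
print(f"${per_instance:,.0f}/mo per instance, ${per_year:,} per year total")
```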

Never use the word ‘overkill’ when referring to the stability and performance of a sandbox game’s servers. Besides, company-managed gameservers are a bad idea for many reasons, the biggest being that when something goes wrong (think EA servers, GTA servers, Ubisoft servers, or any big company’s servers), all servers go down, not just one. Leaving it player-hostable is the best that can be done. (And again, as with microtransactions, making it too easy to host a server is bad.)

I like the idea of showing the server’s content download size if possible. It would be nice if it was possible to offload content downloads to a webserver or something though.

Is FastDL in Source 2? I’m guessing not, but it would be nice.

I would approve of being able to freely change your masterlist.
Automatically using a basic Valve masterlist if a server can’t connect to Facepunch should be enough though.

Maybe make it torrent-based.

(If there’s a player near you who has some addon, you download it from them instead of the server. Then you download the checksum from the server to make sure there was no manipulation. You can download each addon from a different player.)
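
A minimal sketch of that verification step, assuming SHA-256 checksums (the function name and paths are made up for illustration, not any real S&box API):

```python
import hashlib
from pathlib import Path

def verify_addon(path: Path, expected_sha256: str) -> bool:
    """Check an addon fetched from a peer against the checksum the
    trusted game server published for it."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Hash in 1 MiB chunks so big addons don't have to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

# Hypothetical usage: the checksum arrives over the trusted server
# connection, the file itself from whichever peer happened to have it.
# if not verify_addon(Path("downloads/some_addon.pak"), server_checksum):
#     ...  # throw it away and try a different peer
```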

There are good reasons why cloud computing is such a big deal these days. Hosting your own servers is not a trivial task, doesn’t scale well, has a lot of nuances and overall is a headache. Even very big companies are hesitant to host their own infrastructure unless they absolutely need it.

Also, not sure what this has to do with hosting a gameserver. The discussion was about hosting the master servers for the game list and whatnot. Obviously, people should be left free to host their own gameservers. I think that’s a no-brainer.

Ah, I might have misunderstood you. Sorry about that. Master servers should be cloud-hosted for sure. Torrent trackers for downloading addons from other players when joining a server should be there too. Perhaps Git repos for server collections as well (though maybe with size limits, or just addon IDs). Where to host the addons, though?

I don’t want to ever be forced into downloading server content from a tracker, personally.
For those who choose to go out of their way to opt in, that seems fine.

Loading/updating from repos directly when joining the game would be cool.
It’d be even easier for servers to list what content they have from repos.

I think Garry wanted his own workshop solution, but I’d like to also have Steam workshop access.
A custom workshop might also better interact with the server finder/browser to provide better results.

Torrents would be way faster and easier on the servers, making hosting cheaper and letting network traffic go toward networking the game instead of serving you Hatsune Miku models.

(You download an addon from another player who has it instead of from the server, and then you download the addon’s checksum from the server or the workshop.)

Edit: I’m only talking about a solution for replacing FastDL. Not for regular workshop downloads.

Torrent addon downloads would bring a lot of unwanted privacy concerns and potential problems, maybe even security risks. I can see many players wanting a feature to opt out of seeding their content, which would make this entire system pretty pointless.

What security risks? That’s what the checksum is for. And why on earth would anyone want to opt out of seeding? Most people barely touch their upload bandwidth anyway.

One issue with torrents is that the legality of it all is a bit dubious. A lot of people, especially in rural areas (rural America, Australia, etc.), don’t have a lot of bandwidth, and on top of that have severe monthly data caps. Besides that, transparently using players’ bandwidth for content distribution without their explicit consent is a shady practice anyway. I certainly wouldn’t want S&box hogging my bandwidth just so other people can download some addon.

If you’re hosting a server you’ll probably want a decent uplink anyway. Modern VPS providers usually give you a bandwidth cap of around 1-20 TB/mo., and most have a 1 Gbit uplink (if not more). I have a server with a 20 TB bandwidth cap and a 1 Gbit uplink for around $15/mo. I think that’s fairly affordable.

If that’s the case, then why are there so many problems with FastDL in GMod, if it’s just downloading content from a server?

Also, downloading directly from the server practically kills playing on far-away foreign servers, where download speeds are gonna be really bad.

Maybe regional proxy/forward/CDN servers that just send far-away content and network traffic through a high-speed lane? I don’t really know much about networking though.

It’s an old system that wasn’t built for distributing massive amounts of content. One issue is that it uses a separate HTTP request for every file, and I’m not sure it even parallelizes downloads. Setting up an HTTP connection is a fairly slow process, and it’s likely Source 1 doesn’t reuse TCP connections across downloads the way HTTP/1.1 keep-alive does.

If you’re downloading a lot of small files one at a time, the overhead of setting up a new TCP connection and then doing the whole HTTP request/response shebang quickly becomes comparable to the size of the file itself.

Using HTTP/2 would likely solve this issue while keeping a low barrier to entry for server administrators. It’s just a matter of the FastDL mechanism being really old and not suited for downloading a few thousand files of varying sizes.
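
To illustrate the connection-reuse point (a sketch only; the host and file names are made up, and it shows HTTP/1.1 keep-alive rather than anything Source actually does):

```python
import requests

BASE = "https://fastdl.example.com/maps"          # hypothetical content mirror
files = [f"chunk_{i:03}.bsp.bz2" for i in range(200)]

# FastDL-style: every file pays for a fresh TCP (and TLS) handshake.
for name in files:
    requests.get(f"{BASE}/{name}")

# Keep-alive style: one Session reuses the underlying connection, so
# each file only costs a request/response round trip.
with requests.Session() as session:
    for name in files:
        session.get(f"{BASE}/{name}")
```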

The issue of slow uplinks to far-away servers isn’t necessarily solved by torrenting either, if most of the players on that server are far away from you as well.

Regional CDNs are the way to go. A server administrator can fairly straightforwardly set up mirrors of their content in a few key locations in the world to give most players reasonable DL speeds, and for people who don’t want to bother with that, you might as well use Facepunch’s workshop solution.

EDIT: Another problem with the lack of TCP reuse is that TCP is fairly slow to start up. Congestion control means your download speed doesn’t jump to your maximum capacity immediately. It’ll take a few seconds, in some cases dozens of seconds, to utilize your bandwidth to full capacity on a fresh TCP connection.
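
Some back-of-the-envelope numbers for that ramp-up, assuming textbook slow start (window doubling every round trip) and made-up but plausible figures:

```python
# How many round trips until slow start can cover a 1 Gbit/s link?
# Assumptions: 10-segment initial window, 1500-byte segments, 100 ms RTT.
link_bps = 1_000_000_000
segment_bytes = 1500
rtt_s = 0.100

cwnd = 10   # congestion window, in segments
rtts = 0
while cwnd * segment_bytes * 8 / rtt_s < link_bps:
    cwnd *= 2    # slow start roughly doubles the window each RTT
    rtts += 1

print(f"~{rtts} round trips (~{rtts * rtt_s:.1f}s) before the window covers the link")
# ...and every brand-new connection pays this ramp-up again.
```

That comes out to roughly 10 round trips, about a second at 100 ms RTT, and proportionally worse on higher-latency links.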

Thank you, now I understand everything. I think there should be a guide for setting up an S&box-specific CDN for server admins though. (Or even a Facepunch-provided FastDL2 server and FastDL2 mirror server, as executables?) That way you could quickly set up your mirrors and just hand them a “workshop” collection ID, hassle-free.

I think there should be a guide for setting up an S&box-specific CDN for server admins though.

I think that’s an excellent topic for a guide. Thankfully there are a lot of options available, from existing off-the-shelf solutions to rolling your own.

In the case of a simple HTTP/2 content server, it’s as easy as throwing nginx on your server and telling it to serve content from a directory. You could use rsync or a custom solution (potentially provided by Facepunch directly) to keep the mirrors up to date with 5-10 minute delays at worst. All of this is fairly basic stuff, and with the advent of dockerized applications it can be made even easier for server administrators to set up. Might even be as easy as just running a script or two.
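
For the “keep the mirrors up to date” half, a minimal sketch (it assumes rsync is installed and the origin box is reachable over SSH; the hostname and paths are invented):

```python
import subprocess
import time

# Hypothetical origin host and the local directory nginx serves from.
ORIGIN = "content@origin.example.com:/srv/sbox-content/"
MIRROR_DIR = "/srv/sbox-content/"

while True:
    # Pull only changed files; --delete drops anything removed upstream.
    subprocess.run(
        ["rsync", "--archive", "--compress", "--delete", ORIGIN, MIRROR_DIR],
        check=False,  # a failed sync just gets retried on the next cycle
    )
    time.sleep(300)  # re-sync every 5 minutes, matching the delay above
```

A cron job would do the same thing; the point is just that the mirror side is a handful of lines, not a bespoke service.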

That’s why I want it to be in a guide. It’s easy when you know it exists, but most people won’t think of it. Even (game)server admins.

Of course. This isn’t obvious stuff for someone who isn’t familiar with content distribution. I’m just trying to reassure you that you don’t need to reinvent the wheel, and that these things can be “guidified” fairly well :)

We’d have to see what options Facepunch gives us for content distribution initially and go from there, I think. If their solution is adequate for 90% of users there might not even be a need for custom solutions. It would still be nice to give server owners some other options though; not everyone wants to rely on third-party services.

To be honest, I think the better way to do discovery is not having a server list at all. S&box will have more than GMod’s four gamemodes, and considering everything is just an addon, most servers are basically going to be separate games. It seems Garry is thinking along those lines too.

What would be nice is having everything up-front, like downloading all the assets before you even try to connect to the server, like the custom games in Dota 2 for example. There’s no reason the client can’t just download addons from the workshop only. You’d be able to easily see what addons a server uses and how large they’d be. (Which would also be better for addon creators.)

Definitely, and also the option to subscribe to the addons before joining, for faster downloads compared to GMod.