Why can't Gmod just download a zip of all the required files with FastDL?

It seems pretty clear that the reason FastDL is so unreliable in its speeds is pure connection latency with the HTTP server. Every time it needs to download a file, it connects to the server, establishes the connection, downloads the file (even if it’s 1 kB), and closes the connection. For 600+ Lua files, that connection overhead alone can add up to five minutes; if FastDL only needed to connect once, download one file, and close one connection, almost all of that would vanish. Then GMod could extract the files on the client side and process them just like it does now. Not to mention that zipping the files could cut the total size in half, depending on the file types.
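To put rough numbers on that claim: a fresh TCP and HTTP handshake per file costs a sizeable fraction of a second, so 600 tiny files spend nearly all their time connecting instead of transferring. A back-of-the-envelope sketch in Python (both timing constants are assumptions, not measurements):

[code]
# Illustrative math only: per-file connection overhead vs. one archive.
# Both timing constants are assumptions for a typical remote HTTP server.
FILES = 600
HANDSHAKE_S = 0.5   # assumed connect/teardown cost per request
TRANSFER_S = 0.01   # assumed transfer time for a ~1 kB Lua file

per_file_total = FILES * (HANDSHAKE_S + TRANSFER_S)
one_zip_total = HANDSHAKE_S + FILES * TRANSFER_S  # one connection, one file

print(f"600 separate requests: ~{per_file_total / 60:.1f} min")
print(f"one zipped download:   ~{one_zip_total:.1f} s")
[/code]

With those assumed figures, the per-file scheme burns about five minutes of pure overhead, while the single download pays the handshake once.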

It’s not that hard, garry, we know you can do it.

Rate agree if you agree with me.

You do know that Source comes with native support for bzip2, right?

Even better than your standard zip.

Well the idea of it is cool.

Okay, but GMod still doesn’t take advantage of this feature.

Bzip, zip, doesn’t matter. If it comes in one file, it will be 2x better than bzipping/zipping every single file like we currently do.

Okay, how does it all get into one file? And how does it come out of that one file?

Step one) Zip all the required files to download

Step two) Client downloads the zip

Step three) Client extracts the zip, uses the files inside.

Not that hard.
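A minimal sketch of those three steps with Python’s standard zipfile module (the folder names and the pack name are made up for illustration):

[code]
import zipfile
from pathlib import Path

# Step one) Server side: zip all the required files to download.
def build_pack(content_root: str, pack_path: str) -> None:
    root = Path(content_root)
    with zipfile.ZipFile(pack_path, "w", zipfile.ZIP_DEFLATED) as pack:
        for f in root.rglob("*"):
            if f.is_file():
                # Store paths relative to the content root so the
                # client can recreate the same directory layout.
                pack.write(f, f.relative_to(root))

# Step three) Client side: extract the zip and use the files inside.
def unpack(pack_path: str, dest_root: str) -> None:
    with zipfile.ZipFile(pack_path) as pack:
        pack.extractall(dest_root)

build_pack("serverfiles", "content.zip")      # step one
# ... step two: client fetches content.zip over HTTP ...
unpack("content.zip", "garrysmod/download")   # step three
[/code]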

So this will be done automatically?

Do you automatically upload the FastDL files to your server? No. So, no, not automatically. But if it were automatic, it would be pretty awesome.

Alright, so now we’re creating a greater chance of error when zipping, and the client needs more time to unzip the package.

There was a module thread in the Lua releases about turning files into .zip and back, if you (or the server) had the module. In another thread, aVoN mentioned he was making a module for servers to AUTO-bzip everything and host the sv_downloadurl from the server itself.

But until then, to bzip a lot of files at once…

http://files.getdropbox.com/u/1170861/bzip.zip

Extract its folder, put your files into the compress folder, then run ‘bzip’… Your original files will not be in the folder anymore, so only put COPIES into compress, as each will have been turned into a bzip when it’s done. While ‘bzip’ is running, do NOT close the DOS-ish window that comes up; it will close itself when it’s done. Just let it finish…

Enjoy…
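If the link above ever dies, the same trick is a few lines of Python with the standard bz2 module; a rough equivalent, assuming a compress folder next to the script (note it deletes the originals, just like the batch file, so only feed it copies):

[code]
import bz2
from pathlib import Path

# Compress every file in ./compress to name.ext.bz2 (the naming FastDL
# expects), then delete the original, mirroring the batch file above.
for f in Path("compress").iterdir():
    if f.is_file() and f.suffix != ".bz2":
        data = f.read_bytes()
        f.with_name(f.name + ".bz2").write_bytes(bz2.compress(data, 9))
        f.unlink()
[/code]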

There’d need to be some standard directory layout. It’d be a fucking mess if there wasn’t. D;

Like a file that lists where each thing goes. But some files would have the same name.

And bzip=better.

Fffail thread.

It’s very possible and we should have it.

It’s annoying to piss away 10 minutes at the loading screen to download the many thousands of files that the average server has.

It should also be possible to raise the cap on direct content download speed, so you don’t have to set up a separate web server for sv_downloadurl.

Which means the server would start lagging, fail, and then everyone would leave, resulting in a lower-quality server overall.

Yes it does; I’m using it right now, as a matter of fact. 30 MB file? No problem: bzip it down to 10 MB, add a FastDL server to the mix, and voilà! People download it in seconds.

Having one giant file means increased fragmentation and a bigger load on both the server and the client, since everything comes down as one massive file. The way it works at the moment, each file speedily gets to the client, and once they have it, they have it.

Why?

The reason Source has a default cap on how fast the user can download content is that if clients were allowed to download at max speed, the entire server could lag, especially for people running listen servers.

How?
Are you sure you know what you are talking about?

This does not solve the “open connection, download file, close connection” problem.

Back to the topic: for your idea, the server would need to generate a new zip file every time a client connects and requests a new file. That can take ages, especially if you need to download maps, materials, and sounds on the MB scale.

Also: Apache is quite fast for fast download. On my server, people downloaded all their shit very fast. It also depends on the connection speed of your server.

The only thing I can think of to solve this issue is Garry implementing fast download in GMod so that it opens several parallel connections at once.
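For illustration, this is roughly what parallel connections would buy: the per-request latency still exists, but it gets paid eight files at a time instead of one. A sketch with Python’s standard library (the URL and file list are hypothetical):

[code]
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Hypothetical FastDL base URL and file list; in GMod these would come
# from sv_downloadurl and the server's resource list.
BASE = "http://fastdl.example.com/garrysmod/"
files = ["lua/a.lua", "lua/b.lua", "materials/m.vmt"]  # ... 600+ entries

def fetch(rel_path: str) -> bytes:
    with urllib.request.urlopen(BASE + rel_path) as resp:
        return resp.read()

# Eight simultaneous connections instead of one at a time.
with ThreadPoolExecutor(max_workers=8) as pool:
    blobs = list(pool.map(fetch, files))
[/code]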


Create one archive containing all required files (maps, materials, sounds, Lua): 30 MB (mostly the map is huge).
Change one Lua file (< 1 kB).
The server needs to regenerate that archive (a few seconds).
Everyone downloads 30 MB again, because less than 1 kB of it changed.

Definitely a No-Go.
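For what it’s worth, the usual fix for the “1 kB changed, re-download 30 MB” problem is a manifest of per-file hashes, so clients only re-fetch what actually changed; that is effectively the granularity the current one-file-at-a-time scheme already gives you. A rough sketch, purely hypothetical and not anything GMod ships:

[code]
import hashlib
from pathlib import Path

def manifest(root: str) -> dict:
    """Map each relative file path to a hash of its contents."""
    return {
        str(f.relative_to(root)): hashlib.sha1(f.read_bytes()).hexdigest()
        for f in Path(root).rglob("*") if f.is_file()
    }

# Client compares the server's manifest against its own copy and
# re-downloads only the entries whose hashes differ.
def changed(server: dict, local: dict) -> list:
    return [path for path, h in server.items() if local.get(path) != h]
[/code]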


Yes, hosting the sv_downloadurl from the GMod server itself is my plan.
For now, I have successfully run a web server inside my GMod server. Now I need to optimize it (read a file and send it in 100 kB chunks instead of reading all 30 MB first, with the lag that causes, before sending it).
bzip2 compression is also planned. Every requested file will be bzipped.
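In spirit, the chunked sending he describes looks like this (a generic Python sketch over a plain socket, not his actual GMod Lua code):

[code]
import bz2

CHUNK = 100 * 1024  # 100 kB, as described above

def send_file(path: str, sock) -> None:
    """Stream a file in 100 kB chunks instead of loading 30 MB at once."""
    compressor = bz2.BZ2Compressor()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            sock.sendall(compressor.compress(chunk))
    sock.sendall(compressor.flush())  # flush any buffered compressed bytes
[/code]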

I will work on this at the weekend again.

Most listen servers run on shitty home connections to begin with; it’s not going to matter.