It seems pretty clear that the main reason FastDL is so unreliable in its speeds is pure connection overhead with the HTTP server. Every time it needs to download a file, it establishes a connection, downloads the file (even if it’s 1 KB), and closes the connection again. For 600+ Lua files, the connection time alone can add up to five minutes; it would be far faster if FastDL only needed to connect once, download one file, and close one connection. GMod could then extract the files on the client side and process them just like it does now. Not to mention that zipping the files could cut the total size in half, depending on the file types.
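The per-file overhead can be sketched like this. A hedged illustration in Python (not the actual engine code, and the host and file names are made up): the first function mimics what FastDL effectively does now, the second reuses a single connection (HTTP keep-alive) so the handshake is only paid once.

```python
import http.client

# Hypothetical file list, purely for illustration.
FILES = ["lua/init.lua", "lua/cl_hud.lua", "lua/sh_config.lua"]

def fetch_one_per_connection(host, paths):
    """What FastDL effectively does now: a fresh TCP connection per file."""
    bodies = []
    for path in paths:
        conn = http.client.HTTPConnection(host)
        conn.request("GET", "/" + path)
        bodies.append(conn.getresponse().read())
        conn.close()  # the handshake cost is paid again for the next file
    return bodies

def fetch_reusing_connection(host, paths):
    """One connection reused for every file (HTTP keep-alive)."""
    conn = http.client.HTTPConnection(host)
    bodies = []
    for path in paths:
        conn.request("GET", "/" + path)
        bodies.append(conn.getresponse().read())  # read fully before reusing
    conn.close()
    return bodies
```

With 600+ tiny files, the second version saves one TCP handshake per file, which is exactly the latency the post is complaining about.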
There was a module thread in the Lua releases section about turning files into .zip and back, if you (or the server) had the module. In another thread, aVoN mentioned he was making a module that lets servers auto-bzip everything and host the sv_downloadurl from the server itself.
Extract its folder, put COPIES of your files into the compress folder, then run ‘bzip’. Your files won’t be in the folder anymore afterwards (they’ll have been turned into bzips), which is why you should only put copies in. While ‘bzip’ is running, DO NOT close the DOS-style window that comes up; it’ll close itself when it’s done. Just let it finish…
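What that batch tool does can be sketched in a few lines of Python with the standard `bz2` module (this is an assumption about the tool’s behavior, not its actual source): compress every file in a folder into a `<name>.bz2` copy, the naming scheme Source’s sv_downloadurl looks for.

```python
import bz2
import os

def bzip_folder(src_dir, dst_dir):
    """Compress every file in src_dir into dst_dir as <name>.bz2,
    leaving the originals in src_dir untouched."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue
        with open(path, "rb") as f:
            data = f.read()
        with open(os.path.join(dst_dir, name + ".bz2"), "wb") as out:
            out.write(bz2.compress(data))
```

Note this version reads from copies rather than replacing the originals, sidestepping the "your files will not be in the folder anymore" surprise.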
Which means the server would start lagging, then fail, and everyone would leave, resulting in a lower-quality server overall.
Yes it does; I’m using it right now, as a matter of fact. 30 MB file? No problem. Bzip it down to 10 MB, add a FastDL server to the mix, and voilà: people download it in seconds.
Having one giant file means increased fragmentation and a bigger load on both the server and the client, since it would all be one massive download. The way it works at the moment, each file speedily gets to the client, and once they have it, they have it.
The reason Source has a default cap on how fast the user can download content is that letting clients download at max speed could lag the entire server, especially for people running listen servers.
This does not stop the “open connection, download file, close connection” problem.
On topic: for your idea, the server would need to generate a new zip file every time a client connects and requests files. That can take ages, especially if you need to download maps, materials, and sounds on the MB scale.
Also: Apache is quite fast for fast download. On my server, people downloaded everything very quickly. It also depends on your server’s connection speed.
The only thing I can think of to solve this issue is Garry implementing fast download in GMod with multiple parallel connections.
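The parallel-connections idea can be sketched with a thread pool, each worker holding its own HTTP connection. This is a hedged Python illustration of the concept, not GMod code (the engine would do this natively):

```python
import concurrent.futures
import http.client

def _download(host, path):
    """Fetch one file over its own connection."""
    conn = http.client.HTTPConnection(host)
    conn.request("GET", "/" + path)
    body = conn.getresponse().read()
    conn.close()
    return path, body

def download_parallel(host, paths, workers=4):
    """Fetch many files over several simultaneous connections, so the
    per-file connection latency overlaps instead of adding up serially."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(lambda p: _download(host, p), paths))
```

With four workers, four handshakes are always in flight at once, so 600 small files cost roughly a quarter of the serial wait.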
1. Create a file containing all required files (maps, materials, sounds, Lua): 30 MB (mostly the map is huge).
2. Change one Lua file (< 1 kB).
3. The server needs to regenerate that file (a few seconds).
4. Everyone downloads 30 MB again, because < 1 kB of it changed.

Definitely a no-go.
Yes, this is my plan.
For now, I have successfully run a webserver inside my GMod server. Next I need to optimize it to read a file and send it in 100 kB chunks, instead of reading all 30 MB into memory first (causing lag) before sending anything.
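The chunked-read optimization described there looks roughly like this. The poster’s server is presumably Lua inside GMod; this is a Python sketch of the same idea, not their code:

```python
def iter_file_chunks(path, chunk_size=100 * 1024):
    """Yield a file 100 kB at a time, so a 30 MB download never has to
    sit fully in memory (and never stalls the server while being read)."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk
```

The server would write each yielded chunk to the socket as it arrives, letting the game loop run between chunks instead of freezing for one giant read.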
bzip2 compression is also planned: every requested file will be bzipped.
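Compressing on every request would re-pay the bzip2 cost each time, so a natural refinement (my assumption, not something the poster committed to) is to cache the compressed bytes keyed by the file’s modification time. A minimal Python sketch:

```python
import bz2
import os

_bz2_cache = {}

def bzipped(path):
    """Return the bzip2-compressed bytes of a requested file, compressing
    once per file version (path + mtime) instead of once per request."""
    key = (path, os.path.getmtime(path))
    if key not in _bz2_cache:
        with open(path, "rb") as f:
            _bz2_cache[key] = bz2.compress(f.read())
    return _bz2_cache[key]
```

When a Lua file changes, its mtime changes, the old cache entry is simply never hit again, and the new version gets compressed on first request.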