Hi, I'm going to buy a dedicated server and I have no idea what I need.
CPU: Intel Atom 230/D425 x64
1 core
2 threads
1.6 GHz
RAM: 2 GB
Bandwidth: 100 Mbps
Question: Do I need 1 Gbps bandwidth?
You'll squeeze about 10 players on it with that CPU. The connection is fine.
[QUOTE=12voltsdc;39391286]You'll squeeze about 10 players on it with that CPU. The connection is fine, 100 Mbps is 1GBPS.[/QUOTE]
1024 Mbps = 1 Gbps
Yeah, looks good for about 10-15 players I'd say.
[QUOTE=BlackVoid;39392580]1024 Mbps = 1 Gbps[/QUOTE]
Or if you want to get even more accurate, 1000 Mbps = 1 Gbps, and 1024 Mibps = 1 Gibps
[QUOTE=Handmade;39396437]Or if you want to get even more accurate, 1000 Mbps = 1 Gbps[/QUOTE]
[IMG]http://gyazo.com/195db1f44c08f069f1f933ea85b43366.png[/IMG]
Google disagrees.
[editline]29th January 2013[/editline]
On topic: I am assuming you are colocating at a datacenter / renting because of the port speed, but you should probably upgrade, as you're wasting your money on that CPU and memory.
SRCDS won't need a port speed over 100 Mbps unless you're getting DoS/DDoSed, and even then the port speed won't help you if the attack is over 300 Mbps.
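To put rough numbers on that (Python used just as a calculator): assuming each player consumes on the order of 30 KB/s of upload, which is a common srcds "rate" setting — actual usage varies with tickrate, gamemode and addons — a 100 Mbps port is nowhere near saturated:

```python
# Back-of-the-envelope srcds bandwidth check.
# The 30 KB/s per-player figure is an assumption (a typical srcds
# "rate" value); real traffic depends on tickrate and gamemode.
BYTES_PER_PLAYER_PER_SEC = 30_000
PORT_SPEED_MBPS = 100

def port_utilisation(players):
    """Fraction of the port used by `players` clients (upload only)."""
    bits_per_sec = players * BYTES_PER_PLAYER_PER_SEC * 8
    return bits_per_sec / (PORT_SPEED_MBPS * 1_000_000)

print(f"10 players: {port_utilisation(10):.1%} of a 100 Mbps port")
print(f"64 players: {port_utilisation(64):.1%} of a 100 Mbps port")
# 10 players: 2.4% of a 100 Mbps port
# 64 players: 15.4% of a 100 Mbps port
```

Even a full 64-slot server under these assumptions uses only a fraction of a 100 Mbps line, which is why the port speed only matters under attack traffic.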
That processor is utterly terrible and you're going to have severe lag on it. If this isn't free, I wouldn't invest in it; testing is the best way to find out. Also, GMod isn't multithreaded (it only uses one core at a time).
2 GB of RAM should be fine. I have never seen SRCDS use more than 2 GB of RAM at any time, but your Windows overhead might max it out. You won't be able to host more than one server without running out of memory to allocate.
So in the end, you can probably host a single server at a 10-20 tickrate with 5-10 players, but this really depends on a lot of things; the only sure way to find out is to [b]do tests yourself[/b]. Depending on what you're hosting, you might be able to push a tickrate of 10 with 25 players. If it isn't free, [b]don't invest in it[/b].
Thanks
[QUOTE=zzaacckk;39396759][IMG]http://gyazo.com/195db1f44c08f069f1f933ea85b43366.png[/IMG]
Google disagrees.[/QUOTE]
The prefix Giga- denotes a factor of 10^9, period. That is its definition. Yes, it is used commonly to represent a factor of 2^30, but this is incorrect. It has been so popularized, though, that it is *arguably* acceptable usage. Depending on how rigorous the context in which it is being used is, it may be considered acceptable, or just plain incorrect. (In this case it is incorrect, as we are talking about an internet connection. More about that in my last paragraph.)
The prefix Gibi- denotes a factor of 2^30. That is its definition. The entire point of these binary prefixes was to eliminate the confusion that comes from using the decimal prefixes to denote both factors, for example Giga- representing both 10^9 and 2^30. With the binary prefixes, Giga- would (correctly) represent ONLY 10^9 and Gibi- would represent 2^30.
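To make the two factors concrete (Python used just as a calculator), here is the difference between the decimal and binary prefixes as defined above:

```python
# SI (decimal) vs IEC (binary) prefixes, per the definitions above.
GIGA = 10**9    # SI giga-:  1 Gb  = 1,000,000,000 bits
GIBI = 2**30    # IEC gibi-: 1 Gib = 1,073,741,824 bits

print(GIBI - GIGA)           # 73741824 -- gibi is larger
print(f"{GIBI / GIGA:.4f}")  # 1.0737, i.e. ~7.4% bigger
```

That ~7.4% gap is exactly the confusion the binary prefixes were introduced to eliminate.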
This is all before getting into JEDEC standards, in which the decimal prefixes ARE used in place of their binary counterparts, but only to represent storage capacities. (Simplified)
Google is all well and good, and I use it all the time, but I also always cross-reference it with more rigorous reference material.
Finally, just to get the point across in one sentence, 1 Gbps in terms of data connections is WIDELY considered to be 1000 Mbps, or 1,000,000 Kbps, or 1,000,000,000 bits per second. This is as specified in the current IEEE 802.3 standards. You can see more about the standard here: [url]http://www.javvin.com/protocolEthernet.html[/url] Unfortunately, as far as I can tell, the actual IEEE standards documents need to be paid for to view, and that page cites the standard from 2002 :(
EDIT: Aha! I found the current standards. You can view them here: [url]http://standards.ieee.org/about/get/802/802.3.html[/url] More specifically, you'll want to get section three.