Is there a maximum size for a table in Lua? I’m asking because I’m trying to integrate my server with a MySQL database, and instead of loading some things on the fly I’d like to load whole tables on server startup to have instant access to the data. But some of these tables, such as the ones containing player data, might get quite big over time. Is this going to affect the server’s performance, or eventually make the tables too big for Lua to handle?
I took a look at my server’s current SQLite database, and one of its tables has 30,000 records in it, yet the database file is only 5 MB. Would it be safe to, for example, load the entire table into a Lua table on server startup?
I’m trying to make a system where people can log in with their Steam ID on a web page and run some commands on the server. The web page would send a command to the server containing the user’s ID from the database, and then I’d like to retrieve info about the player on the server (for example, check whether the player has access to the command).
The problem is that if I do it with mysqloo using callbacks, I can’t send an rcon response to the command after all the information is processed, because the query is asynchronous. If I had all this data loaded into a Lua table, I could just have functions that return the required information right away and send the response.
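To make the mismatch concrete, here is a minimal pure-Lua simulation of the callback shape. `queryAsync`, `handleRconCommand`, and the SQL are hypothetical stand-ins for mysqloo’s query/`onSuccess`/`start` flow; no real database is involved:

```lua
-- Minimal simulation of why a callback-based query cannot feed a
-- synchronous rcon reply: the result arrives after we have returned.
local pending = {}

local function queryAsync(sql, onSuccess)
  -- A real driver would send the SQL off and return immediately;
  -- here we just remember the callback for "later".
  pending[#pending + 1] = onSuccess
end

local function handleRconCommand(steamID)
  local reply = nil
  queryAsync("SELECT access FROM players WHERE steamid = ?", function(rows)
    reply = rows.access  -- runs on a later tick, long after we returned
  end)
  return reply  -- still nil: the callback has not fired yet
end

print(handleRconCommand("STEAM_0:1:123")) --> nil

-- Some ticks later the "driver" delivers the rows, but by then the rcon
-- command that asked for them has already been answered.
for _, cb in ipairs(pending) do cb({ access = true }) end
```

This is exactly why the synchronous accessors only work once the data is already sitting in a Lua table.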
I think I have just found a better solution. I might make the server only load data for online players, and as far as the web panel goes, make it send an rcon command to the server when a user logs in, and then load that user’s data on the server. This way I will only have the data I actually need loaded, while still having instant access to it. This might be a bit more complicated than just loading everything on startup, but I think it’s a better solution in the long run.
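The load-on-demand idea above can be sketched as a plain-Lua cache keyed by SteamID. All names here (`PlayerCache`, `CachePlayer`, the `allowed` field) are hypothetical; in practice `CachePlayer` would be called from the mysqloo callback once the player’s row arrives, and `UncachePlayer` from a disconnect hook:

```lua
-- Per-player cache, keyed by SteamID. Populated when a player connects
-- (or when the web panel's rcon command comes in), dropped on disconnect.
local PlayerCache = {}

-- Called from the database callback once the player's row has loaded.
function CachePlayer(steamID, data)
  PlayerCache[steamID] = data
end

-- Synchronous accessors: once cached, no database round-trip is needed.
function GetPlayerData(steamID)
  return PlayerCache[steamID]
end

function HasAccess(steamID, command)
  local data = PlayerCache[steamID]
  return data ~= nil and data.allowed ~= nil and data.allowed[command] == true
end

-- Free the memory again when the player leaves.
function UncachePlayer(steamID)
  PlayerCache[steamID] = nil
end
```

With this shape, the rcon handler can call `HasAccess` and reply immediately, because the slow asynchronous part happened earlier, at login time.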
However, even though I’m probably not going to deal with such big tables anymore, I’d still like to know: would using big tables (for example with 30K rows) in Lua impact the server’s performance? Just in case I need this for something else in the future…
While there is no hard limit on the size of a table, one should keep in mind that Lua tables are indexed using a red-black tree structure (google it for more information if you’re really interested). The larger the table becomes, the longer it will take to write values to or read values from it in this structure. In some cases it may be more efficient to break your data up into multiple tables that group values that are used together.
I don’t believe this is correct. My understanding is that Lua tables are implemented as a two-part data structure: dense positive integer keys are stored in a plain array and are therefore indexed by a simple pointer addition and dereference, which is O(1). All other keys (non-integer, non-positive, and sparse integer keys) are stored in a hash part implemented as a chained scatter table, which keeps key-value pairs in a flat array indexed by the hash of the key. Collisions are resolved by storing, alongside each entry, a pointer to the next element with the same hash (essentially a linked list of colliding elements). In the worst case, where all elements collide, lookup is O(n); however, the expected number of collisions per element is about one on average, and at most two, so the average complexity is O(1).
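The hybrid layout described above can be observed from plain Lua: sequential integer keys land in the array part, while string, negative, and sparse keys go to the hash part, and both read back in O(1) on average. A small sketch (the key names are made up; nothing here depends on GMod):

```lua
local t = {}

-- Dense integer keys 1..n are stored in the table's array part.
for i = 1, 100000 do
  t[i] = i * 2
end

-- String and negative keys go into the hash part of the same table.
t["steamid:76561198000000000"] = { kills = 10, deaths = 3 }
t[-5] = "negative keys also live in the hash part"

-- Both kinds of lookup are constant time on average,
-- regardless of how many elements the table holds.
assert(t[99999] == 199998)
assert(t["steamid:76561198000000000"].kills == 10)
print(#t) --> 100000 (the length of the dense array part)
```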
The implementation of Lua tables is such that, even with a huge number of elements, lookup is as quick as possible.
This man is correct. It’s perfectly fine to have tables that hold a large amount of data. You can find out how much memory Lua is using by calling collectgarbage("count") (which returns the total memory in use by Lua, in kilobytes) in the appropriate locations before/after inserting data into a table.
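As a concrete sketch, runnable in any stock Lua interpreter (the 30,000-row shape mirrors the table size mentioned earlier; the field names are made up):

```lua
-- Settle the heap first so the before/after delta is meaningful.
collectgarbage("collect")
local before = collectgarbage("count")  -- total Lua memory, in KB

-- Simulate loading ~30k player rows into a Lua table.
local t = {}
for i = 1, 30000 do
  t[i] = { id = i, name = "player" .. i }
end

local after = collectgarbage("count")
print(string.format("30k rows grew Lua memory by roughly %.0f KB", after - before))
```

The delta gives you a realistic idea of what caching a whole database table actually costs in RAM, which for tables of this size is usually modest.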