SQL limit?

Title isn’t as informative as it should be but I couldn’t think of a better one.
So I was wondering: since I have a quest system, every player could end up with a really big table. Would this affect anything?
Or at least, is there a limit to what I can insert into a column? Since util.TableToJSON will be used to convert what could be a big table for every individual character, will this affect the server’s performance in any way, or somehow screw with the SQL database?

Most SQL databases can easily handle millions of rows without a major performance hit, assuming your queries are written properly. You’ll run into issues with util.TableToJSON before SQL ever becomes a problem.

What do you mean by issues? Taking longer than usual or just not working at all?

Since you want to convert it into JSON, I presume you want to send it to the client. Depending on the size of the resulting string, you might have to split it into multiple parts.
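Something like this rough sketch would do for the splitting - the "quest_sync" string and the 32 KB chunk size are just placeholders, and the net library’s per-message limit is somewhere around 64 KB, so it’s best to stay well under it:

if SERVER then
    util.AddNetworkString("quest_sync")

    local CHUNK_SIZE = 32768 -- stay well below the per-message limit

    -- Send a (potentially huge) quest table to one player in parts.
    local function SendQuestTable(ply, tbl)
        local json = util.TableToJSON(tbl)
        local total = math.ceil(#json / CHUNK_SIZE)

        for i = 1, total do
            local chunk = string.sub(json, (i - 1) * CHUNK_SIZE + 1, i * CHUNK_SIZE)

            net.Start("quest_sync")
                net.WriteUInt(i, 16)        -- part index
                net.WriteUInt(total, 16)    -- total number of parts
                net.WriteUInt(#chunk, 32)   -- size of this part in bytes
                net.WriteData(chunk, #chunk)
            net.Send(ply)
        end
    end
end

if CLIENT then
    local parts = {}

    -- Collect the parts and decode once the last one has arrived.
    net.Receive("quest_sync", function()
        local i     = net.ReadUInt(16)
        local total = net.ReadUInt(16)
        local size  = net.ReadUInt(32)
        parts[i] = net.ReadData(size)

        if table.Count(parts) == total then
            local quests = util.JSONToTable(table.concat(parts))
            parts = {}
            -- do something with quests here
        end
    end)
end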

Well, converting it to JSON is simply to store it in the database. The table is sent via the net library whenever it’s updated, which I now assume isn’t a great idea?

Why would you want to convert to JSON to store it in a database? That pretty much ruins the entire point. As far as I know, you shouldn’t use the net functions for sending tables.

How else can you store tables into a database?

By storing each field in its own column, like you would with any relational database.

That won’t work, since quests are constantly made, removed or edited. Having columns for every quest would mean constantly modifying the database’s columns, not to mention all the quest vars. It’s just a ton easier having it in one column, isn’t it?

You mean rows, right?
Yeah, you’d have to change them whenever you add/edit quests, but do you really change content constantly? Or do you mean players accepting quests?
Which still wouldn’t be a problem. Even a cheap database server can run dozens of typical queries per second.

And for quest vars, just make a (SQL) table like
quest_id | player_id | var | value

[editline]6th August 2015[/editline]

Or

quest_instance_id | var | value

if you store quest instances.
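
With GMod’s built-in SQLite, a minimal sketch of the first variant could look like this (table and column names are just for illustration):

-- Create the quest variable table once.
sql.Query([[
    CREATE TABLE IF NOT EXISTS quest_vars (
        quest_id  INTEGER NOT NULL,
        player_id TEXT    NOT NULL,
        var       TEXT    NOT NULL,
        value     TEXT,
        PRIMARY KEY (quest_id, player_id, var)
    );
]])

-- Insert or overwrite a single variable for one player's quest.
local function SetQuestVar(questID, playerID, var, value)
    sql.Query(string.format(
        "REPLACE INTO quest_vars (quest_id, player_id, var, value) VALUES (%d, %s, %s, %s);",
        questID, sql.SQLStr(playerID), sql.SQLStr(var), sql.SQLStr(tostring(value))
    ))
end

-- Read a variable back (nil if it was never set).
local function GetQuestVar(questID, playerID, var)
    return sql.QueryValue(string.format(
        "SELECT value FROM quest_vars WHERE quest_id = %d AND player_id = %s AND var = %s;",
        questID, sql.SQLStr(playerID), sql.SQLStr(var)
    ))
end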

It’s a framework that I hope to make public, designed to make editing as easy as possible. To answer your question: yes, quests will be constantly added/edited. Players accepting a quest has nothing to do with the database. But I just want to know: is it really necessary to go through all the stress of making these edits? Will converting it to JSON every once in a while actually affect performance?

In practice you’re unlikely to notice a performance hit, as the amount of data involved should be really small.

But if you’re making a framework, any developer with database experience who comes across it will cringe at the sight of JSON strings in the database. It’s never the right thing to do.

If you’re just gonna be storing lots of IDs and JSON documents, the database adds no value here - you might as well store them as files.

I’m still not entirely sure what you’re trying to do here. Could you post an example table/JSON document? When and how will it be created and updated? That should clear things up a bit.

If you’re going to jam all your encoded data into a single column, why bother with a database at all? Unless you’re using a NoSQL / non-relational solution, it goes against the design and will lead to the DB taking up much more space on disk than it should.

Is it necessary to do it differently? No, and we can’t make you.

Should you do it this way? Absolutely not.

Most of the work done by your quests will be in a script anyway, so a TES-like (Elder Scrolls-style) approach would probably be the easiest:

Your quests:
quest_uid | quest_name

Quest stages:
quest_uid | quest_stage_id | quest_desc | is_failure

Each player’s tracked quests:
player_uid | quest_uid | current_stage | completion

Quest variables:
player_uid | quest_uid | varname | varval

So to get each player’s quest list, you would join the quests, quest_stages and player_quests tables on the quest uid and the current stage, which would let you get the name of the quest, the current stage and its description, as well as whether they have failed the quest.

You can even make use of the completion field to filter: 0 = unstarted, 1 = started, 2 = finished.

Fetching any number of quest variables is simple as well - just run a select query against quest_variables.
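
A rough sketch of those two queries through GMod’s sql library, assuming the table and column names above (with player_uid being a SteamID string):

-- One query for the whole quest list: name, current stage, stage description,
-- failure flag and completion state.
local function GetPlayerQuestList(playerUID)
    return sql.Query(string.format([[
        SELECT q.quest_name,
               pq.current_stage,
               qs.quest_desc,
               qs.is_failure,
               pq.completion
        FROM player_quests pq
        JOIN quests q        ON q.quest_uid = pq.quest_uid
        JOIN quest_stages qs ON qs.quest_uid = pq.quest_uid
                            AND qs.quest_stage_id = pq.current_stage
        WHERE pq.player_uid = %s
          -- completion: 0 = unstarted, 1 = started, 2 = finished
          AND pq.completion > 0;
    ]], sql.SQLStr(playerUID)))
end

-- And the quest variables for one quest.
local function GetQuestVars(playerUID, questUID)
    return sql.Query(string.format(
        "SELECT varname, varval FROM quest_variables WHERE player_uid = %s AND quest_uid = %d;",
        sql.SQLStr(playerUID), questUID
    ))
end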




player.Quests table structure:

player.Quests[quest_id] = {
    Name = str,                            -- quest display name
    XP = int,                              -- XP reward
    Objectives = {"whatever", "whatever"}, -- list of objective descriptions
    Completed = bool                       -- whether the quest is done
}



That is the structure. It’s converted and stored as a JSON string, and it gets converted back to a table when characters are loaded. A hook is called whenever the table is modified, and for now, whenever it’s called, I send the table over the net library. I may have to change this.
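
Roughly, the save/load currently looks like this (the characters table and quests column here are simplified placeholders):

-- Current approach: the whole quest table goes into a single column as JSON.
local function SaveQuests(ply, charID)
    local json = sql.SQLStr(util.TableToJSON(ply.Quests or {}))
    sql.Query("UPDATE characters SET quests = " .. json .. " WHERE id = " .. charID .. ";")
end

-- And back out again when the character is loaded.
local function LoadQuests(ply, charID)
    local json = sql.QueryValue("SELECT quests FROM characters WHERE id = " .. charID .. ";")
    ply.Quests = json and util.JSONToTable(json) or {}
end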

[editline]6th August 2015[/editline]

Wow, I never gave any thought to doing it this way. Thank you, I’ll change the format. I always try to fit everything into one database to make it easy when developing, which is why I ended up going down this route. It actually won’t be that complicated doing it this way, though I assume all the tables will be flooded. Just to clear it up, there’s no limit to how much I can insert into the table?

Storing your quests as properties of the player is messy - after all, things like quest name and XP have nothing to do with players, so you’re making lots of redundant copies. While it’s technically a waste of space, the bigger problem for a GMod server is that you’ll have to update all these copies whenever you change the quest. You only want to relate things to players that actually are related to them logically, like progression and state variables.

You still have to keep those consistent with the code, of course. For example, if you rename a quest variable in the code, and players who started the quest before still have the old name stored in the DB, things are going to break.

You’re probably getting the idea that keeping your quest system’s data consistent is something you kinda need to put a lot of thought into :v:

You mean database table, right? There are many situations where trying to put all your data in one is not a good idea. In case you weren’t aware, you can access multiple tables in one query (and that is both the easiest and best way of using the schema above): read up on joins.

Don’t worry about it. The “millions of rows” figure above wasn’t hyperbole - your average PC can handle literally that many without any slowdowns.

Databases on GMod servers never even come close - if there’s a performance issue it’s almost always fixable with a rather simple change to your database schema and/or code.

Thanks for clearing this up man. I’ll fix this formatting issue ASAP. And thanks for the corrections :slight_smile:
I’ll mark it as solved, thanks for all the advice from everyone who replied.