Concerns/questions over the C# security model in S&Box

I assume this isn’t something many people are concerned about, but as someone who’s heavily security conscious, I’m very interested in the precautions taken to limit the vulnerability surface of the C# programming layer. I’m fairly confident garry and co. have thought this through well, but it would be nice to receive some actual confirmation.

A while back I researched the various security hardening methods available for .NET Framework and .NET Core, and my findings have been somewhat… concerning, to say the least. Here they are in no particular order:

  1. AppDomains are deprecated and virtually all Microsoft documentation suggests they shouldn’t be used anymore and don’t offer a strong security guarantee. There have also been mentions of performance penalties associated with using the .NET access control system. In .NET Core the whole subsystem is entirely removed and there is no way to instantiate separate AppDomains.

  2. It is fairly hard to verify compiled .NET assemblies because they might contain unsafe code, which can do pretty much anything. This can be limited to an extent by using PEVerify (the Microsoft tool for verifying the safety of .NET assemblies), but this method also seems fairly limited to me because it inspects an assembly after the fact rather than restricting the available operations at the language level. The other way I can see this being implemented is an AST-level verifier that checks that the code contains no unsafe blocks or other operations that might expose a client to a vulnerability, but this also feels fragile to me.

  3. Microsoft has explicitly stated in various sources (if I remember correctly) that .NET does not offer strong sandboxing guarantees and shouldn’t be used in scenarios where untrusted or semi-trusted code is being run. This puts yet another level of concern on top of the points listed above. This also seems to be the reason why they chose to stop supporting and updating the AppDomain and security access control subsystems.

  4. C# and .NET are not without vulnerabilities. Humans are fallible, and I have no doubt that .NET is a fairly secure runtime environment, simply because it has been tested ad nauseam by millions of people. Still, given the sheer size of the .NET code base, much of it written in C++, there are likely many more vulnerabilities waiting to be discovered. Most are probably fairly benign, like buffer overruns causing an immediate crash or some other unwanted behaviour of limited scope, but anything that can read or alter unintended sections of memory can cause a lot of havoc.

  5. Reflection is a fairly important feature of C# that, unfortunately, has the potential to wreak a lot of havoc as well, in particular if access is gained to methods that are not supposed to be exposed to the user code. I can imagine this is limited by not loading restricted standard assemblies, but it would be nice to have some confirmation.
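To make point 2 above concrete, an AST-level verifier could be sketched roughly like this. This is purely illustrative, using the Roslyn NuGet package (Microsoft.CodeAnalysis.CSharp); the class name and the exact set of rejected constructs are my own invention, not anything Facepunch has described:

```csharp
// Hypothetical sketch of an AST-level check for 'unsafe' constructs,
// built on the Roslyn NuGet package Microsoft.CodeAnalysis.CSharp.
using System.Linq;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

static class UnsafeScanner
{
    // Returns true if the source contains constructs a restrictive
    // host compiler would likely reject. A real verifier would also
    // inspect 'unsafe' modifier tokens, stackalloc, P/Invoke, etc.
    public static bool ContainsUnsafeConstructs(string source)
    {
        var root = CSharpSyntaxTree.ParseText(source).GetRoot();
        return root.DescendantNodes().Any(n =>
            n is UnsafeStatementSyntax   // unsafe { ... }
         || n is PointerTypeSyntax       // int* p
         || n is FixedStatementSyntax);  // fixed (...) { ... }
    }
}
```

As the post says, this kind of check only works if the host controls compilation; it says nothing about a DLL compiled elsewhere.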

This is pretty much everything I can think of off the top of my head. I can only hope that S&box does not place an excessive vulnerability burden on its players, and I hope that Facepunch has given this whole topic a reasonable amount of planning and forethought. It would be really nice to have that confirmation, though.

8 Likes

I am pretty sure Facepunch has thought about that. I wouldn’t worry about it. You should be able to do anything on the server side (I assume), but not on the client side.


1 Like

The reason I’m concerned is that I’ve personally been researching this exact topic and my findings have been disappointing. Sure, you can insert access checks, perform static code analysis and run existing verification tools on the compiled assemblies, but ultimately CIL was not designed with security first, and it doesn’t even retain information about which code is safe or unsafe - many of its operations can be used to perform unsafe functions just as well as safe ones.

There are even documented cases where seemingly safe code (no external libraries, no ‘unsafe’ blocks, etc.) could cause an arbitrary memory write or perform remote code execution. Just take a look at the list of published CVEs for .NET - https://www.cvedetails.com/vulnerability-list/vendor_id-26/product_id-2002/Microsoft-.net-Framework.html

And these are just the documented ones. A lot of them allow arbitrary code execution. When using a security-second language like C#, which does not treat sandboxing code as an explicit first priority, there are bound to be corner cases and vulnerabilities.

Just briefly skimming through the list of CVEs, there is even a vulnerability allowing RCE by loading a malformed font. Software security is a very complicated field, and it isn’t sufficient to just restrict a bunch of operations. JIT miscompilations, accidental vulnerabilities in the runtime, missed access checks and so on pose a very real risk of security issues.

1 Like

The CVEs you listed were for .NET Framework, not for .NET Core. Judging from the commit history, s&box is using .NET 5.0. I think RCE vulnerabilities are likely to be non-existent. I am not sure what form of sandboxing the game is using, but it should definitely be safe. IIRC Garry talked about disabling certain methods, like reading/writing files, and I am pretty sure reflection, unsafe code contexts and other dangerous operations will be disabled in clientside code. I don’t really see any other way to exploit it.

1 Like

Yeah, my bad about the CVE list.

Still, just because you don’t see an obvious way to exploit something doesn’t mean there aren’t ways to do it. There was a case around 2015 where you could cause an RCE in Source 1 because there was no buffer overrun check when loading textures. There were very real instances of this bug being used to actually infect players.

Even security-hardened products that have been audited by security experts over and over are not safe from this - let me remind you of the Heartbleed bug that happened in 2014 in OpenSSL - one of the most heavily audited pieces of software in existence.

Software security is not easy. In fact, it is exceptionally hard.

Please, don’t be lulled into a false sense of security (no pun intended) just because you don’t see a way a certain piece of software can be exploited. Someone will try to poke holes in a product. Someone will find those holes if they try hard enough and have a real incentive to do so. Especially since .NET Core is entirely open source and the runtime is free for anyone to inspect.

I don’t claim to be a security expert myself by any means, but I have seen enough cases where seemingly innocuous code or libraries or runtimes have had such bizarre and completely unobvious vulnerabilities that it is honestly scary.

To provide some actual substance to my claims, check out these links and resources:

Some of these are more benign than others. Some might not be applicable to S&Box at all. The point I’m trying to illustrate, however, is that these vulnerabilities are very real and there are likely more to be discovered.

Of particular interest is this issue, and the comment by one of .NET’s security developers:

Here is another comment by a .NET Core developer stating they explicitly don’t support untrusted code sandboxing:

Another reply suggesting that .NET features shouldn’t be relied on for sandboxing:

And more…

When even lead developers behind .NET Core state that all previous attempts at providing security boundaries within a process have been defeated or insufficient, this makes you wonder just how safe this is.

I don’t want to stir controversy or claim “c# bad” or anything of the sort. I just want some peace of mind that these things aren’t simply hand-waved away as “oh, it’ll never happen” or “.NET is secure by default”.

EDIT:

For reference, I don’t actually know of any ways to mitigate the potential risks. In fact, I’m tempted to just say “let’s bite the bullet and hope for the best”. My goal here is to raise awareness about security overall, and especially in the context of running untrusted C# code.

In practice it is unlikely that a widespread security issue will plague S&Box, assuming that .NET Core is kept updated. Still, this doesn’t mean that precautions shouldn’t be taken, or that you shouldn’t at least give this some consideration.

The only other real option is to use WASM - but this is hardly a usable option for the specific goal that S&Box set out to achieve.

3 Likes

They control the whole runtime. They can blacklist APIs. It doesn’t need AppDomains or the .NET CAS. They can block any unsafe code, any call to unmanaged code, block pointer usage, etc. It’s very efficient and widely used.

It’s not hard at all. I did it too. Space Engineers does it too. Space Engineers even loads C# mods from workshop automatically when you join a server.

Most addons will never need reflection. Only server-side addons would need it, if at all. See the end of the post.

Let’s analyze each of your links.

https://github.com/dotnet/runtime/issues/39296

This is about an XML vulnerability that is only exploitable under special circumstances and is mostly only relevant for ASP.NET Core.

https://github.com/dotnet/runtime/issues/25806

I don’t see what the issue here is. Aside from that, as I said, unsafe calls can be restricted.

https://github.com/dotnet/runtime/issues/27807

Once again, unrelated - I stated why above. Same reason as for AppDomains.

https://github.com/dotnet/runtime/issues/28342

Involves only zip files. The attacker has limited control, and it would probably be hard to abuse, if at all.

https://stackoverflow.com/questions/58468896/sandboxing-in-net-core

Irrelevant for the same reason as all of your AppDomain-related links.

.NET basically works like WASM: it’s a VM too and has bytecode too. There is no reason why WASM should be any more secure.

Yet you talk as if you are one. Posting a ton of irrelevant links to scare people. I don’t know what your goal is here.

It’s really easy. You only allow namespaces that are whitelisted.
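For illustration, a namespace whitelist can be as simple as an exact-match lookup. This is a hypothetical sketch; the class, the API and the namespace list are my own invention, not anything Facepunch has described:

```csharp
// Hypothetical namespace whitelist. Entries are matched exactly:
// allowing "System" does NOT implicitly allow "System.Reflection"
// or "System.IO".
using System;
using System.Linq;

static class NamespaceWhitelist
{
    static readonly string[] Allowed =
    {
        "System",
        "System.Collections.Generic",
        "System.Linq",
    };

    public static bool IsAllowed(string ns) => Allowed.Contains(ns);
}
```

A real host would enforce a check like this at compile time, and/or by walking the type references of the compiled assembly so that a shipped DLL can’t smuggle in a reference the compiler never saw.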

Yeah, let’s talk about Heartbleed from 2014 that is completely irrelevant to the topic.

So you are the only smart guy out there and no one at Facepunch ever thought about how secure this can be? Don’t you think they came up with a solution already?

Also, these restrictions seem to be planned for the client side only. Garry said that you could do anything on the server side. Many games allow server-side plugin installation and almost none have any restrictions on the server side, so that would be normal.

/close-request

1 Like

Let’s analyze each of your points, to show how bullshit they are.

If you could read, you’d see that I explicitly said

Some of these are more benign than others. Some might not be applicable to S&Box at all. The point I’m trying to illustrate, however, is that these vulnerabilities are very real and there are likely more to be discovered.

People are fallible and they write shitty, insecure code. Just because you refuse to acknowledge that fact doesn’t mean it’s not true.

It’s not hard at all. I did it too. Space Engineers does it too. Space Engineers even loads C# mods from workshop automatically when you join a server.

Are you a security expert or a pen tester? Just because Space Engineers did it doesn’t mean it’s safe.

.NET basically works like WASM, its a VM too and has bytecode too. There is no reason why WASM should be any more secure.

WASM was designed as a security-first runtime with sandboxing in mind. The instruction set is far more restricted and cannot express unsafe operations at all. It is also much simpler.

.NET Core developers have explicitly stated that sandboxing is a non-goal for .NET Core.

So you are the only smart guy out there and no one at Facepunch ever thought about how secure this can be? Don’t you think they came up with a solution already?

Where did I imply that? This isn’t targeted directly at Facepunch, this is simply thinking aloud.
People like you are exactly the reason why I’m concerned about security. These things shouldn’t be hand-waved away.

I’m fairly confident garry and co. have well-thought this through, but it would be nice to receive some actual confirmation.

Yet you talk like if you are one. Posting a ton of irrelevant links to scare people. Don’t know what your goal is here.

Yeah, let’s talk about Heartbleed from 2014 that is completely irrelevant to the topic.

How is it irrelevant? It perfectly demonstrates how a seemingly innocuous missed check can lead to a vulnerability that caused such panic in 2014. Just because we’re in 2020 now doesn’t mean that all of our software is suddenly more secure. It really isn’t.

My goal here is to find out what measures were put into place, because:

  1. I am curious,
  2. I want some peace of mind that this was an actual thing that was heavily considered.

This is just some really paranoid post, nothing else.

There is nothing paranoid about not wanting a random server’s untrusted code to be able to wreak havoc on my game. All it takes is a single RCE.

I suggest you do actual research about the kinds of vulnerabilities that can exist in software before you hand-wave it away.

1 Like

Yeah, when the available namespaces / instructions are restricted by default it doesn’t matter.

They have been doing this since the game was released years ago. I am not aware of any incident. It is purely based on static code analysis / namespace & instruction restriction.

They can restrict instructions as they have control over the runtime. It can also be done with static code analysis, as you can’t hide MSIL instructions.

I am saying this because it should be obvious that it is not hand-waved away. Sure, the multi-million-dollar company Facepunch totally forgot about this.

Let this be Facepunch’s concern. If you have so many concerns, don’t buy / play the game.

2 Likes

Debate is good, guys, but try to keep the hostility down.

This is an interesting read and you don’t want it muddied by looking salty.

1 Like

Yes, I agree. Was kinda too salty. But my point stands.

3 Likes

Yeah, when the available namespaces / instructions are restricted by default it doesn’t matter.

This doesn’t matter if you can break out of the VM and execute arbitrary code. All it takes for the RCE to occur is a single buffer overrun/JIT miscompilation/missed bounds checks/etc.

It can also be done with static code analysis, as you can’t hide MSIL instructions.

MSIL doesn’t distinguish between safe and unsafe instructions. The verification happens at the compilation stage. If you ship a DLL, you can’t analyze the source code. Analyzing MSIL for safety is a non-trivial task - just like analyzing any low-level bytecode. PEVerify exists for this particular purpose.
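To show why bytecode-level analysis is non-trivial, here is a deliberately naive sketch that scans a compiled assembly’s method bodies for the calli opcode (0x29) using System.Reflection.Metadata. The class name is my own; as the comment notes, a raw byte scan like this is not a real verifier:

```csharp
// Naive sketch: scan method bodies in a managed PE for the 'calli'
// opcode (0x29). Illustrative only - a real verifier must decode the
// IL instruction stream properly, because a raw byte scan can
// false-positive on operand bytes that happen to equal 0x29.
using System.IO;
using System.Reflection.Metadata;
using System.Reflection.PortableExecutable;

static class IlScanner
{
    const byte CalliOpcode = 0x29;

    public static bool MightContainCalli(string assemblyPath)
    {
        using var stream = File.OpenRead(assemblyPath);
        using var pe = new PEReader(stream);
        var metadata = pe.GetMetadataReader();

        foreach (var handle in metadata.MethodDefinitions)
        {
            var method = metadata.GetMethodDefinition(handle);
            if (method.RelativeVirtualAddress == 0)
                continue; // abstract or extern method: no IL body

            var body = pe.GetMethodBody(method.RelativeVirtualAddress);
            foreach (byte b in body.GetILContent())
                if (b == CalliOpcode)
                    return true; // possibly calli (see caveat above)
        }
        return false;
    }
}
```

The gap between this sketch and a correct instruction decoder is exactly the kind of subtlety the post is pointing at.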

I am saying this because it should be obvious that it is not hand-waved away. Sure, multi million dollar company Facepunch totally forgot about this.

There are many documented cases of multi-million (and multi-billion, for that matter) dollar companies completely disregarding security. I’m not saying Facepunch is or will be guilty of this, but this is a non-argument.

Let this be Facepunch’s concern. If you have so many concerns, dont buy / play the game.

I enjoy this project and I’m enthusiastic about it. Like I said, I don’t want to raise panic or imply that this means that the project is somehow doomed. Not at all.

I don’t see what’s wrong with raising a legitimate question about a project that you’re passionate about.

EDIT:

Regardless, this discussion has devolved into a general debate about security rather than security in the context of S&box. My concerns still stand and I’m still curious as to what the stance is on the more obscure types of vulnerabilities.

I’m not here to argue about what is and what isn’t secure or insecure code. I’ve made my conclusions about shitty code long ago.

I’ve also concluded that properly sandboxing code is a very hard problem, as illustrated by the catastrophic failure of projects that tried to integrate Flash, Java and C# into the Web. They all died because they were horribly insecure. This is simply a consequence of running untrusted code on user devices. It’s a hard problem.

1 Like

Show me a single example of this in .NET or similar.

This is just wrong. I don’t know where you got this from. You can analyze MSIL and be safe. As I said, Space Engineers is a good example. There isn’t much you can do if all dangerous instructions are blocked, such as calli or reading arbitrary memory addresses. Aside from that, since they control the compilation process too, they can just block any unsafe call as well.

Yes, I am sure garry just thought “let’s make clients load C# addons and hope there won’t be a security issue”.

Yet you are only trying to cause panic here, posting unrelated links and making it look like .NET is the most unsecure thing ever.

It did not. This is still about .NET and “security”.

I’m not going to go over all of your replies, because:

a) You have a point, and I can’t find the resources I based my initial conclusions on a few months ago (some obscure issue on GitHub illustrating bytecode unsafety without the use of calli, but I might be misremembering).
b) You’re misinterpreting my initial concerns.
c) I don’t think we’ll come to an agreement on this. That’s fine.

But:

Yet you are only trying to cause panic here, posting unrelated links and making it look like .NET is the most unsecure thing ever.

I didn’t intend to imply that. I had a question/concern and was curious about it, especially since I researched this exact topic a few months back, and the only guidance related to sandboxing .NET Core code was advice from devs to “not do it.” That’s hardly reassuring.

I don’t see what the harm is in asking a bunch of questions and receiving a simple answer in the vein of “yes, these things have been covered.” You don’t have to go on a personal crusade as if I’d attacked you directly rather than just posted a message on a forum.

And, yes, the gist of this is that some people are more concerned about security than others. If you’re not concerned - that’s fine. Doesn’t mean I can’t be.

It really doesn’t matter what you do, vulnerabilities will exist in all contexts. If Lua were used instead of C#, you’d still be dealing with vulnerabilities that lead to code execution. No application is 100% secure; even packets travelling between the server and client can be attacked. Googling a bunch of CVEs really doesn’t mean anything - there are CVEs for web browser DOM elements, does that mean we can’t use those DOM elements?

4 Likes

I’m really worried about this. The restrictions should also apply to the server side, because otherwise Workshop addons could easily infect users starting singleplayer or hosting a local server.
This would also mean that backdoors wouldn’t only give adversaries admin rights, but potentially access to the server or the players, if they use a Workshop addon.

Besides that, I really hate Lua (it embodies everything that is said negatively about JavaScript), and at first I would have thought that S&Box would run with JavaScript/TypeScript, a much more modern, better, widely supported and easily sandboxable language. C# is also completely fine, but I have never heard of sandboxing C# code.

The whole praise for JavaScript is completely subjective; there are reasons why Unity dropped it in favour of C#, for example. Infections will exist on any platform.

Maybe they could introduce a safe mode which is enabled by default on servers, applying the same restrictions as the client.

1 Like

Of course C# is more efficient than JavaScript, but these are two different worlds. Unity compiles to an executable, which can do practically everything anyway. Here, it should only be a scoped scripting language that shouldn’t harm your computer just because you downloaded an infected addon.

And don’t get me wrong: I think C# is a better choice here and is generally better to code with, but I really don’t want addons to be allowed to do anything on the server-side.

This sounds like a really good idea - like a default safe mode so nothing bad can run without you knowing, and if you want more advanced plugins and such, you can disable it at your own risk.

Sounds like a good idea at first, but this could create compatibility issues, and then addon devs would have to make sure their code runs on both systems.
The same restrictions should always apply on the server side! No mod should be able to read or modify anything outside the game.