Linux: custom module SSDK stuff not working

Hi, I’ve been trying to get a CBaseServer pointer in a Linux module using the SSDK13 headers (vtable hooks) and linking against tier and vstdlib, but the pointers I obtain are useless, and I can’t understand why.
So, some code:


#include <Lua/Interface.h>
#include <dlfcn.h> // for dlopen/dlsym
#include "SSDK.h"
#include "GaSignatures.h"

#define ENGINE_LIB "engine.so"

GMOD_MODULE_OPEN() {
	...

	auto engineHandle = dlopen(ENGINE_LIB, RTLD_LAZY);
	auto engineFactory = dlsym(engineHandle, "CreateInterface");
	
	auto cbaseServerPtr = SigScan(engineFactory, GASIG_CBASESERVER);
	if (!cbaseServerPtr) {
		printf("CBaseServer signature is not found!\n");
		return 0;
	}
	
	IServer* iServer;
	V_memcpy(&iServer, static_cast< char* >(cbaseServerPtr) + 12, sizeof(char*));
	auto baseServer = static_cast<CBaseServer*>(iServer);
	auto gameServer = static_cast<CGameServer*>(iServer);
	
	// Testing, getting empty map name
	printf( "Current server map - '%s'\n", baseServer->GetMapName());
	
	// Testing broadcasting (should crash the server on loading process)
	baseServer->BroadcastPrintf("Hello clients!\n");
	
	...

	return 0;
}

GMOD_MODULE_CLOSE() {
}

I’m getting an empty map name string, and BroadcastPrintf and the other functions don’t work either.
But the same code works perfectly on Windows:


void* cbaseServerPtr = UTIL_FindSignatureInRunTime( GASIG_CBASESERVER_G, "engine.dll" );
if ( !cbaseServerPtr ) {
	printf( "[GlobalAdmin] Unable to find CBaseServer pointer! [%p]\n", cbaseServerPtr );
	return 0;
}

IServer* iServer;
V_memcpy( &iServer, static_cast< char* >(cbaseServerPtr) + 8, sizeof(char*) );
GAGlobals::g_pBaseServer = static_cast<CBaseServer*>(iServer);
GAGlobals::g_pGameServer = static_cast<CGameServer*>(iServer);

printf( "Current server map - '%s'\n", GAGlobals::g_pBaseServer->GetMapName());

Does anyone know where the problem could be? I’m compiling with gcc -m32 -fPIC:


$ ldd <...>/garrysmod/lua/bin/gmsv_testmodule_linux.dll
        linux-gate.so.1 (0xf76f7000)
        libdl.so.2 => /lib/i386-linux-gnu/i686/cmov/libdl.so.2 (0xf76d2000)
        libvstdlib.so => not found
        libtier0.so => not found
        vphysics.so => not found
        libstdc++.so.6 => /usr/lib32/libstdc++.so.6 (0xf75df000)
        libm.so.6 => /lib/i386-linux-gnu/i686/cmov/libm.so.6 (0xf7599000)
        libc.so.6 => /lib/i386-linux-gnu/i686/cmov/libc.so.6 (0xf73ec000)
        /lib/ld-linux.so.2 (0xf76fa000)
        libgcc_s.so.1 => /lib/i386-linux-gnu/libgcc_s.so.1 (0xf73ce000)


VTable hooks and headers code: http://pastebin.com/kCV0i21p

Are you using the same signature for both Windows and Linux? They will have different signatures, so keep that in mind.

No, I use different signatures for linux and windows.

Do not use linux.


(User was banned for this post ("Shitposting" - Bradyns))

The offsets for the functions in the vtable will be the same on both Windows and Linux if they come before the virtual destructor; for functions after it, the Linux offsets are one higher than the Windows ones, because GCC emits two vtable slots for a virtual destructor while MSVC emits one. Could this be the issue?

Changing the offset gives very weird results. With offsets of -2 and -1 I get ‘(null)’ for the map string and ‘-499547808’ from GetUDPPort(); with offsets 0–1 and 3 I get an empty map name and a UDP port of zero; with offset 2, GetUDPPort() crashes the server with a segmentation fault and the map name is ‘▒▒’; and offset 4 doesn’t crash, but gives the same map result as offset 2…