That’s a bit ignorant; the integrated GPU definitely matters, not so much which model it is as the fact that the system may default to it.
Like elix said.
Not related to your problem, but it bothers me slightly: turn vsync on.
[editline]4th February 2014[/editline]
And no, vsync doesn’t cause frame drops; that is a myth.
Think of it as a little man in your GPU who holds the next image to be displayed; his name is Back Buffer. If this little man is patient and has vsync turned on, he waits until your monitor finishes refreshing before handing the next image over to his good friend Frame Buffer, who then draws the image to the screen.
Now, if Back Buffer is not patient and has vsync turned off, he forces images onto Frame Buffer as fast as he possibly can. Frame Buffer can’t keep up with this, so he gets angry at Back Buffer and starts tearing up images to spite him. Because of this, their once beautiful friendship turns into hatred for one another.
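If you want the analogy without the little men, here’s a toy sketch of it: with vsync the swap waits for the refresh to finish, without it the swap lands mid-scanout and the monitor shows the top of one frame and the bottom of the next (a tear). Everything here (the 4-line “screen”, the swap point) is made up for illustration; real GPUs do this in hardware, not in a loop like this.

```python
LINES = 4  # rows our toy "monitor" scans out per refresh

def scan_out(front, back, vsync, swap_at_line=2):
    """Return the rows the monitor actually displays during one refresh.

    With vsync on, the swap is held until after scanout, so every row
    comes from the same frame. With vsync off, the buffers swap
    mid-scanout and the remaining rows come from the newer frame.
    """
    shown = []
    for row in range(LINES):
        if not vsync and row == swap_at_line:
            front = back  # impatient swap: rest of the refresh uses the new frame
        shown.append(front[row])
    return shown

frame_a = ["A"] * LINES  # frame currently being displayed
frame_b = ["B"] * LINES  # next frame, waiting in the back buffer

print(scan_out(frame_a, frame_b, vsync=True))   # whole frame from A: no tear
print(scan_out(frame_a, frame_b, vsync=False))  # top from A, bottom from B: torn
```

Run it and the vsync=False case prints a frame stitched from two different images, which is exactly what a tear line on screen is.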
So next time you go to turn off vsync, ask yourself: do I want to destroy a beautiful friendship?
Edit: oh it’s a server.