Getting a game to run well across different platforms, graphics cards, amounts of RAM, etc., is an art.
For this reason, among many other things, a game must be as compact as possible: textures (lower resolution, better performance) and polygons (low-poly beats high-poly).
Scripts play an important role here too, since they gobble up FPS like nothing else.
This forces the author of the game to be very careful when writing scripts (a sketch follows below):
- Declare variables in the right place at the right time, and limit their scope to what is strictly necessary.
- Don't keep calling a function for a value you don't need at that point in the game, or that you will never use again.
- Avoid, as far as possible, having one script constantly call another.
- Debug the code as thoroughly as you can.
- Destroy every entity you no longer need.
A poorly crafted script can slow a game down.
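To make this concrete, here is a minimal sketch in GameGuru-style Lua. The script name perf_pickup, the range constant, and the pickup scenario are invented for illustration; g_Entity, g_PlayerPosX/g_PlayerPosZ and Destroy(e) are standard GG scripting globals and commands, and the _init/_main pair follows GG's naming convention (_main runs every frame).

-- perf_pickup.lua -- hypothetical pickup script showing the habits above.
-- GG calls perf_pickup_init(e) once and perf_pickup_main(e) every frame,
-- passing the entity number 'e'.

local CHECK_RANGE = 200  -- script-level constant: set once, not rebuilt every frame

function perf_pickup_init(e)
	-- do expensive one-time setup here, not in _main (which runs every frame)
end

function perf_pickup_main(e)
	-- locals keep the scope tight; they vanish when the call ends
	local dx = g_Entity[e].x - g_PlayerPosX
	local dz = g_Entity[e].z - g_PlayerPosZ

	-- compare squared distances so no math.sqrt is paid every frame
	if dx * dx + dz * dz < CHECK_RANGE * CHECK_RANGE then
		-- ...award the pickup, play a sound, etc...
		-- the entity has done its job: destroy it so it stops eating frame time
		Destroy(e)
	end
end

The same idea covers cross-script chatter: instead of script B calling a function in script A every frame, let A write the result to a shared global when it changes (GG scripts share a single Lua state, so globals are visible everywhere) and let B simply read it.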
As for making a standalone GG build use a specific graphics card when it doesn't include a menu for that, tbh I don't know. I do know that in test mode you can see which card GG is using by pressing F11, and that Nvidia lets you configure which of its GPUs GG uses; I don't know about other cards.
Laptop: Lenovo - Intel(R) Celeron(R) CPU 1005M @ 1.90GHz
OS: Windows 10 (64-bit) - RAM: 4 GB - HDD: 283 GB - Video card: Intel(R) HD Graphics
CPU mark: 10396.6
2D graphics mark: 947.9
3D graphics mark: 8310.9
Memory mark: 2584.8
Disk mark: 1146.3
PassMark rating: 3662.4