Grumbleshadersgrumble…

As I’ve been tinkering far too much with the visual niceties of late, I decided to make a deliberate and concerted effort to concentrate on gameplay for a change.

I disabled and reverted all the hacks I’d put in around the Scene Context Manager, and allowed it to once again spawn a Hive station and numerous NPC vessels.

Which then reminded me that I had a performance issue to deal with.

Normally, performance ‘can wait’ – optimise only when you need to is a golden development rule. I write code to be efficient in the first place (i.e. performance-aware), but only worry about squeezing out CPU cycles and the like when the need arises.

Well, in this instance performance ‘can’t wait’. As soon as the Hive station appears on screen, whumph… the frame rate drops from smooth to chug… 15fps. The irksome thing is, I’m not yet doing anything amazing rendering-wise. So what is this mystery?

Using the (very nice, thank you) diagnostic CPU profiler tools now embedded into Visual Studio 2015, I quickly learn that the culprit is my age-old ‘just make it work’ code for bringing shader support into 666, whereby I’m binding uniforms ‘on the fly’ every frame, whenever an object is rendered and uses a shader. This binding is expensive, as it requires looking up a shader uniform by name (ouch), which means hitting the shader and therefore accessing the GPU (more ouch). So far I’ve gotten away with murder here, purely because I’ve not done anything too demanding.
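To illustrate the cost, here’s a minimal sketch of that ‘look it up every frame’ pattern against plain OpenGL/GLSL – the function and uniform names are illustrative placeholders, not 666’s actual code:

    // Illustrative only - assumes an OpenGL loader header (GLEW, glad, etc.) is included.
    // Every draw pays for string-based uniform lookups before the values are uploaded.
    void RenderWithShader(GLuint program, const float* modelViewProj, const float* tintColour)
    {
        glUseProgram(program);

        // glGetUniformLocation asks the driver where each uniform lives in the
        // linked program, by name - doing this per object, per frame, adds up quickly.
        GLint mvpLoc  = glGetUniformLocation(program, "u_ModelViewProj");
        GLint tintLoc = glGetUniformLocation(program, "u_TintColour");

        glUniformMatrix4fv(mvpLoc, 1, GL_FALSE, modelViewProj);
        glUniform4fv(tintLoc, 1, tintColour);

        // ... bind buffers and issue the draw call ...
    }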

Time to fix this up!

I’d already planned an object to handle this long ago – GFXShader – and estimated 60 hours of dev effort, so this was not going to be a simple thing to do. The aim of GFXShader is to avoid having to do these expensive run-time dynamic variable lookups. The issue with shader variables (GLSL uniforms in this instance) is that you don’t know where they actually are until after the GPU has compiled and linked them together. So you always have to ‘look them up’ – and the only way to do that meaningfully is by their variable name, which means a string lookup. So, instead of doing this every single time, GFXShader does it once after the shader has been linked, by parsing the shader programs (vertex and fragment) and creating an array of the variables. The issue is, a variable can be (and is) any type… float, int, vector, matrix… even custom types. So each of these ‘bindings’ has to know its type, and also allow the client (the rendering engine) to pass known-typed values across to GFXShader, so it can then push them to the GPU when needed. None of that is as easy as it may sound 🙂
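To give a rough idea of the shape of such a cache, here’s a sketch of the general technique – not the real GFXShader, and where the post describes parsing the shader source itself, this sketch leans on OpenGL’s active-uniform query to build the same catalogue:

    // Minimal sketch, not 666's actual code. Assumes an OpenGL loader header is included.
    // After linking, the active uniforms are enumerated once; the render loop then uses
    // typed setters against cached locations instead of per-frame string queries to the driver.
    #include <string>
    #include <unordered_map>

    class ShaderBindingCache
    {
    public:
        void Build(GLuint program)
        {
            m_program = program;
            GLint count = 0;
            glGetProgramiv(program, GL_ACTIVE_UNIFORMS, &count);

            for (GLint i = 0; i < count; ++i)
            {
                char    name[256];
                GLsizei length = 0;
                GLint   size   = 0;
                GLenum  type   = 0;   // GL_FLOAT, GL_FLOAT_VEC4, GL_FLOAT_MAT4, ...
                glGetActiveUniform(program, i, sizeof(name), &length, &size, &type, name);

                Binding b;
                b.location = glGetUniformLocation(program, name); // the string lookup happens once, here
                b.type     = type;
                m_bindings[name] = b;
            }
        }

        // Typed setters: values go straight to the cached location.
        void SetMat4(const std::string& name, const float* m) const
        {
            auto it = m_bindings.find(name);
            if (it != m_bindings.end())
                glUniformMatrix4fv(it->second.location, 1, GL_FALSE, m);
        }

        void SetVec4(const std::string& name, const float* v) const
        {
            auto it = m_bindings.find(name);
            if (it != m_bindings.end())
                glUniform4fv(it->second.location, 1, v);
        }

    private:
        struct Binding { GLint location = -1; GLenum type = 0; };

        GLuint m_program = 0;
        std::unordered_map<std::string, Binding> m_bindings;
    };

A fuller implementation would hand the client a handle or index rather than a string key, but even this much reduces the per-frame cost to a hash-map lookup instead of a round trip through the GL driver.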

Creating GFXShader and relocating all the existing GFX API code for shaders was fairly straightforward, as was creating the shader parsing to get the uniforms out. The major, major pain was then upgrading all of 666’s Material system and Dominium’s game objects to talk to the GFXShader instead of directly pouncing on the shaders via the GFX API. Ouch. That took some doing! But, overall, only 12 hours of dev effort logged, so not as bad as I first thought so long ago 😉

But, it’s all done, and 666 is all the better for it. There are still optimisations and indeed ‘better ways’ to handle certain parts which I’ll get to later, but for now, I’m happy.

Did it improve performance?

Well. A bit. Not as much as it should have… that earlier 15fps is now 22fps. More annoyingly, if I disable the station render it only creeps up to 35fps. Totally unacceptable, especially considering there are only 15 NPC vessels in the scene.

So, it’s back to the magnifying glass, the profiler, and a lot of patient digging…

PS. Please bear in mind, those 15/22fps performance marks are running in debug mode, with loads of diagnostic code baked into the engine… but nevertheless, it’s inexcusably poor for the workload thrown at the game engine itself.
