Something about the decal implementation I wrote for Gunbritt.
As I began I was quite befuddled: I didn’t have a good idea of how to project an image onto (possibly) several other models. I looked at a few articles around the internet which described how to do it using a camera, but I wanted something simpler. After some contemplating I began breaking the problem down into smaller pieces. To begin with, I decided to limit myself to the deferred-rendered models, which meant I didn’t have to draw onto multiple models separately. I also realized that by using a simple bounding box as the in-world representation of the decal, rendering it would always generate the correct pixels (plus a few unnecessary ones).
This meant I could sample the position texture and check whether each pixel’s world position was inside or outside the bounding box. I could also cull pixels using the dot product between the surface normal and the projector’s forward vector, which removes back-facing surfaces and stretched-out decals.
This is basically just `if (pixel inside bounds) multi = 1; else multi = 0;` — but without branching.
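Here is a minimal Python sketch of that branchless test, standing in for the actual pixel-shader code. All the names (`step`, `decal_multiplier`, `half_extents`, `projector_forward`) are illustrative assumptions on my part, not from the real implementation:

```python
def step(edge, x):
    # HLSL-style step(): returns 1.0 when x >= edge, else 0.0.
    return 1.0 if x >= edge else 0.0

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def decal_multiplier(local_pos, half_extents, surface_normal, projector_forward):
    # local_pos: the G-buffer world position transformed into the decal
    # box's local space, so the inside-test is a per-axis comparison.
    multi = 1.0
    for p, h in zip(local_pos, half_extents):
        multi *= step(abs(p), h)  # 1.0 when |p| <= h, else 0.0
    # Cull surfaces facing away from the projector: a surface the decal
    # should land on has a normal pointing against the projector forward.
    multi *= step(0.0, -dot(surface_normal, projector_forward))
    return multi
```

In a real shader you would likely use a threshold somewhat above zero on the dot product, to also reject the nearly perpendicular surfaces that produce stretched decals.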
After that I simply normalized the x/y distance of the pixel from the bounding box center (the bounding box faces along the projection direction), and used that to sample the decal texture, which is then output to the already existing deferred diffuse texture.
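That normalization step is small enough to sketch directly. Again a hedged Python stand-in for shader code, with the box assumed to face along its local z axis:

```python
def decal_uv(local_pos, half_extents):
    # The box projects along its local z axis, so x/y span the decal image.
    # Remap each axis from [-half_extent, +half_extent] to the [0, 1]
    # texture coordinate range.
    u = local_pos[0] / (2.0 * half_extents[0]) + 0.5
    v = local_pos[1] / (2.0 * half_extents[1]) + 0.5
    return (u, v)
```

The resulting UV is then used to sample the decal texture, and the sampled color (scaled by the multiplier from the inside-test) is blended into the diffuse render target.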
And now with some randomized rotation…
A few notes:
- This implementation is obviously only for deferred rendering. Given how simple it is, I’m sure it can be made to work with forward rendering though.
- During the rendering of the bounding boxes (‘projectors’), depth writing must be disabled. Else things will get wacky : ).
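For reference, with a D3D11-style API (an assumption on my part — the post doesn’t name one), the depth state for the projector pass could look something like this:

```cpp
// Assumed D3D11 setup: keep depth TESTING on so projectors behind walls
// are still occluded, but turn depth WRITING off so the boxes don't
// stamp themselves into the depth buffer.
D3D11_DEPTH_STENCIL_DESC desc = {};
desc.DepthEnable    = TRUE;                        // still test against depth
desc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ZERO; // but never write it
desc.DepthFunc      = D3D11_COMPARISON_LESS_EQUAL;

ID3D11DepthStencilState* depthState = nullptr;
device->CreateDepthStencilState(&desc, &depthState);
context->OMSetDepthStencilState(depthState, 0);
```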