Monday 20 April 2015

"That's not right. It looks pretty, but it's not right."

Oh, shaders. How I love you so. And I mean that in an 'abused housewife still loves her husband' kind of way. I think this, once again, confirms my suspicion that I am a masochist.

So last week I began my journey into the perilous realms of Terra Unity Morte, specifically the land of custom shaders. My plan was to create a God Ray Post-Processing shader, and it was a fairly straightforward one.

The Plan

Step 1. Create a new render texture and render the scene onto it.
Step 2. Iterate through each pixel. If the pixel's alpha is greater than 0 (i.e. an object exists there), draw the pixel as black. (Occlusion Pass)
Step 3. Draw lighting shade onto every other pixel, ignoring the occluders. (Lighting Pass)
Step 4. Apply a God Ray effect. (Effect Pass)
Step 5. Display the material with the render texture on a screen-sized quad, placed in front of the scene with alpha blending. (Scene Render)
Step 6. Pretty!
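For reference, steps 1 to 4 map onto Unity's usual post-processing pattern: chain the passes together with Graphics.Blit(). A minimal sketch of what I was aiming for (hypothetical material and pass indices), assuming render textures are actually available:

```csharp
using UnityEngine;

// Attach to the camera. Assumes godRayMaterial wraps a multi-pass
// ShaderLab shader: pass 0 = occlusion, 1 = lighting, 2 = god rays.
public class GodRayEffect : MonoBehaviour
{
    public Material godRayMaterial;

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        RenderTexture occlusion = RenderTexture.GetTemporary(src.width, src.height);
        RenderTexture lit = RenderTexture.GetTemporary(src.width, src.height);

        Graphics.Blit(src, occlusion, godRayMaterial, 0); // Occlusion Pass
        Graphics.Blit(occlusion, lit, godRayMaterial, 1); // Lighting Pass
        Graphics.Blit(lit, dest, godRayMaterial, 2);      // Effect Pass

        RenderTexture.ReleaseTemporary(occlusion);
        RenderTexture.ReleaseTemporary(lit);
    }
}
```

Each Blit() draws a full-screen quad with the given material pass, which is exactly what steps 1 to 5 describe doing by hand.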



Unfortunately, at this point old man Unity came along and looked over my plan.
"Nice plan you've got here," it noted. "It'd be a shame if..."

Unity's answer to my plan

Step 1. Disable render textures in Unity 4's free version (which I am using).
Step 2. Set shaders to outright ignore alpha values by default, drawing to the RGB channels only.
Step 3. Allow multiple passes in a single shader, but make texture samples in later passes refer to the newly rendered texture, with the previous passes' effects already baked in.
Step 4. Ensure most (if not all) Post-Processing effects, including God Rays, require multiple render textures, not just one.
Step 5. Make updating a quad's position every frame to keep it in front of the camera cause issues with physics (for some reason).


Satisfied with its work, Unity wandered away, leaving me to weep over what remained of my plan.



So yes, there were issues.

Render textures not being available was the biggest one, since they are the core ingredient of any Post-Processing effect. I eventually settled on a CPU-based alternative, ReadPixels(), which reads every pixel currently on screen back from the framebuffer and dumps the data into a 2D texture. If that sounds like an extremely expensive operation to you, that's because it is. I am essentially taking a screenshot after every frame, manipulating the screenshot, then re-rendering the edited screenshot into the game. Good God.
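For the curious, the workaround looks something like this (a sketch, with hypothetical names; the script has to live on the camera for OnPostRender to fire):

```csharp
using UnityEngine;

// Grabs the rendered frame into a Texture2D every frame.
// Horribly expensive: a full GPU-to-CPU readback per frame.
public class ScreenGrab : MonoBehaviour
{
    Texture2D frame;

    void Start()
    {
        frame = new Texture2D(Screen.width, Screen.height, TextureFormat.ARGB32, false);
    }

    void OnPostRender() // runs after this camera finishes rendering
    {
        // Copy every on-screen pixel into the texture...
        frame.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        frame.Apply(); // ...then upload the copy back to the GPU
    }
}
```

The resulting texture can then be fed to a material on that screen-sized quad and manipulated from there.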

The second biggest one is the frankly labyrinthine system that is ShaderLab. In the spirit of all things Unity, this system attempts to be 'User Friendly' by automatically performing several functions for you. That's great, except that I now have no idea what is actually happening when the scene is rendered. It was only after hours of research that I discovered shaders only render RGB by default, completely ignoring the alpha channel. Any transparency can only be done via alpha blending, and alpha-inclusive renders can only be performed under certain settings with the right flags declared. Overall, the thing is so unintuitive I feel like Theseus wandering the Minotaur's maze. Except I don't have Ariadne guiding me with a piece of string.
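As an example of the flags in question: before a ShaderLab pass will respect alpha at all, you end up declaring something like this (a minimal fixed-function sketch, with a hypothetical shader name):

```shaderlab
Shader "Custom/AlphaAware" {
    Properties {
        _MainTex ("Base (RGB) Alpha (A)", 2D) = "white" {}
    }
    SubShader {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
        Pass {
            Blend SrcAlpha OneMinusSrcAlpha // alpha blending must be asked for
            ColorMask RGBA                  // write the alpha channel, not just RGB
            ZWrite Off
            SetTexture [_MainTex] { combine texture }
        }
    }
}
```

Leave out the Blend line or the render queue tag and the alpha channel is silently discarded, which is exactly the behaviour that cost me those hours.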


Still, I have managed to get somewhere. It's far from what I want, but I am currently up to step 4 in my once-foolproof plan. It's very slow going since I have no idea what will actually be rendered 90% of the time, but that's less to do with Unity and more to do with shader coding in general. I'm hoping that I can get to at least a passable rendition of the effect I want, at which point I will declare victory and look the other way.

And to think, all this could have been avoided if we had used Unity 5, where render textures are available in the free version.



Some very incorrect (but nonetheless pretty) render results





