Testing Volumes
Volumes? Not as easy as I thought.
Volumes are a complicated subject, but they can deliver outstanding artistic results when used correctly. Volumetric shaders are a relatively new practice in games, as until recently we have been unable to pull any of this off in real time, especially on consumer-grade hardware. Volume shaders generate volumes through the use of 2D textures and a lot of math, and they rely on volume textures: textures that store information about 3D space rather than the 2D space we are used to in modern pipelines. While graphics APIs have supported volume textures for a while, not all game engines have native support for them yet, which is unfortunate, as writing a custom engine just to facilitate some simple tests would be far too much work. For this reason, in my tests I decided to create and use 'pseudo volume textures': regular 2D textures that can be used in any engine. I chose Unreal Engine 4 as I am used to this software. These will not be as cheap as true hardware-supported volume textures, but they are a great start for building knowledge in volumetrics. We will be looking further into the realms of real-time rendering soon, but first, let's look into how these are created and what they might look like.
To understand what a volume shader and a volume texture are, let's first look at how they are created. A volume shader is, in the simplest terms, like a CT scan in reverse. Instead of a machine taking slices of a 3-dimensional object to look at what is inside it, a volume shader takes 2-dimensional slices of information and constructs a 3-dimensional shape for external viewing. Building a shape by projection rather than simulation means that instead of rendering an enormous number of particles and performing masses of physical calculations, the information is pre-baked, in a sense, and can simply be projected onto a cube. This can reduce an incredibly intricate visual to only six polygons and a complex texture. Very smart indeed. But there is still more to explore here: the definition and creation of volume textures. This can be a bit complicated, so I will use an image to demonstrate the practice. There are some similarities to other kinds of textures, so hopefully that will help illuminate things.
The closest thing to volume textures that most artists will be familiar with is flipbook, or subUV, animation textures. A flipbook is simply a collection of 2D frames laid out in a series. Flipbooks are actually very similar to volume textures: the only 'difference' is that for a flipbook we treat time as the 3rd dimension, while for a volume texture we treat another spatial dimension, typically the Z axis, as the 3rd dimension. This distinction is somewhat arbitrary, as both are just another dimension in the data.
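To make the flipbook comparison concrete, here is a minimal Python sketch of the tile lookup: mapping a normalized 3D coordinate onto the 2D UV of the matching slice on the sheet. The function name and layout are my own for illustration; inside UE4 this job is done by the engine's PseudoVolumeTexture helper, and a flipbook would do the exact same arithmetic with time in place of w.

```python
def pseudo_volume_uv(u, v, w, xy_frames):
    """Map a normalized 3D coordinate (u, v, w in [0, 1)) to the 2D UV
    of the matching slice on an xy_frames x xy_frames sheet.
    The w axis selects which tile (frame) to sample, exactly as a
    flipbook would select a frame by time instead."""
    num_frames = xy_frames * xy_frames
    frame = min(int(w * num_frames), num_frames - 1)  # slice index along Z
    tile_x = frame % xy_frames   # column of the tile in the sheet
    tile_y = frame // xy_frames  # row of the tile in the sheet
    # Squeeze (u, v) into that tile's share of the sheet.
    sheet_u = (tile_x + u) / xy_frames
    sheet_v = (tile_y + v) / xy_frames
    return sheet_u, sheet_v
```

A real shader would also blend between the two nearest slices for smooth results along w; this sketch snaps to the nearest tile for clarity.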
For this part of the project, I wanted to try to implement a basic volume shader in a UE4 project. To do this, I first had to author a custom volume texture sheet. I wanted to keep things simple, so I decided to make a volume slice of a sphere. This works much the same way as an MRI machine: the shape is sampled slice by slice along one axis, and each slice is projected onto the sheet in order.
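The slicing itself can be sketched in a few lines of Python. This is not the tool I used to author the sheet, just an illustration of the idea under simple assumptions: a unit sphere, a binary inside/outside value per pixel, and slices laid out row by row on the sheet.

```python
def sphere_slice_sheet(xy_frames=4, res=16):
    """Build a pseudo volume texture sheet of a unit sphere:
    xy_frames * xy_frames slices, each res x res pixels, laid out
    in a grid. A pixel is 1.0 inside the sphere and 0.0 outside."""
    num_frames = xy_frames * xy_frames
    size = xy_frames * res
    sheet = [[0.0] * size for _ in range(size)]
    for frame in range(num_frames):
        # Centre of this slice along Z, remapped to [-1, 1].
        z = (frame + 0.5) / num_frames * 2.0 - 1.0
        ox, oy = (frame % xy_frames) * res, (frame // xy_frames) * res
        for py in range(res):
            y = (py + 0.5) / res * 2.0 - 1.0
            for px in range(res):
                x = (px + 0.5) / res * 2.0 - 1.0
                if x * x + y * y + z * z <= 1.0:
                    sheet[oy + py][ox + px] = 1.0
    return sheet
```

Slices near the middle of the w range cover most of their tile, while slices near the ends shrink to small discs, which is exactly the pattern you see on the finished sheet below.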
This is the completed texture sheet. As hinted at before, this is probably the simplest volume texture sheet you could produce while still maintaining relative accuracy in the shape's form. This image will dictate the final form of the volume. The first thing to do with it is to create a new material within UE4 that will act as the shader.
Here is the finished material and the volume it creates. Now, I realise there are a few stages missing here, but I thought it would be far easier to show the final image before going into the complicated stuff, so that there would be an outline for the rest of this post to follow. This material only really requires two pieces of information: the opacity and the emissive colour. Getting them, however, is easier said than done. This is where the lack of native volume support in the engine gets really annoying: we have to write our own custom material node.
The code is fairly simple; essentially, we just need something that lets us carry out a few formulas. These are needed not only for the shader to take form, but also for it to react to light properly. To do this, we must set up the parameters for custom ray marching within the volume. The algorithms we need to account for are set out as follows:
These allow an artist to calculate the values they need automatically from arbitrary inputs, such as Constant4Vectors for colour and alpha as well as simple float ranges. Writing out the code took a lot longer than expected, but it works fine now that it is done. I have thought about streamlining the programming a little, but if it isn't broken, I won't fix it. The completed code can be found below --
// Inputs expected from the Custom node: Tex, TexSampler, CurPos, XYFrames,
// StepSize, MaxSteps, Jitter, Density, LightVector, LightColor, ShadowSteps,
// ShadowThreshold, ShadowDensity, AmbientDensity, SkyColor.
float numFrames = XYFrames * XYFrames;
float accumdist = 0;
float curdensity = 0;
float transmittance = 1;

// March direction in the volume's local space, pre-scaled by the step size.
float3 localcamvec = normalize( mul(Parameters.CameraVector, Primitive.WorldToLocal) ) * StepSize;
float shadowstepsize = 1 / ShadowSteps;
LightVector *= shadowstepsize * 0.5;
ShadowDensity *= shadowstepsize;
Density *= StepSize;
float3 lightenergy = 0;
float shadowthresh = -log(ShadowThreshold) / ShadowDensity;

// Jitter the start position with per-pixel noise to hide slice banding.
int3 randpos = int3(Parameters.SvPosition.xy, View.StateFrameIndexMod8);
float rand = float(Rand3DPCG16(randpos).x) / 0xffff;
CurPos += localcamvec * rand * Jitter;

for (int i = 0; i < MaxSteps; i++)
{
    float cursample = PseudoVolumeTexture(Tex, TexSampler, CurPos, XYFrames, numFrames).r;

    // Sample light absorption and scattering.
    if (cursample > 0.001)
    {
        // Secondary march toward the light to accumulate shadowing density.
        float3 lpos = CurPos;
        float shadowdist = 0;
        for (int s = 0; s < ShadowSteps; s++)
        {
            lpos += LightVector;
            float lsample = PseudoVolumeTexture(Tex, TexSampler, saturate(lpos), XYFrames, numFrames).r;
            // Stop once the ray is fully shadowed or leaves the unit cube.
            float3 shadowboxtest = floor( 0.5 + ( abs( 0.5 - lpos ) ) );
            float exitshadowbox = shadowboxtest.x + shadowboxtest.y + shadowboxtest.z;
            if (shadowdist > shadowthresh || exitshadowbox >= 1) break;
            shadowdist += lsample;
        }

        // Beer-Lambert absorption for this step, composited front to back.
        curdensity = 1 - exp(-cursample * Density);
        lightenergy += exp(-shadowdist * ShadowDensity) * curdensity * transmittance * LightColor;
        transmittance *= 1 - curdensity;

        #if 1
        // Sky lighting: three fixed taps above the sample point.
        shadowdist = 0;
        lpos = CurPos + float3(0, 0, 0.025);
        float lsample = PseudoVolumeTexture(Tex, TexSampler, saturate(lpos), XYFrames, numFrames).r;
        shadowdist += lsample;
        lpos = CurPos + float3(0, 0, 0.05);
        lsample = PseudoVolumeTexture(Tex, TexSampler, saturate(lpos), XYFrames, numFrames).r;
        shadowdist += lsample;
        lpos = CurPos + float3(0, 0, 0.15);
        lsample = PseudoVolumeTexture(Tex, TexSampler, saturate(lpos), XYFrames, numFrames).r;
        shadowdist += lsample;
        lightenergy += exp(-shadowdist * AmbientDensity) * curdensity * SkyColor * transmittance;
        #endif
    }
    CurPos -= localcamvec;
}
return float4(lightenergy, transmittance);
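Stripped of shadowing, sky light, and colour, the heart of that loop is just front-to-back compositing with Beer-Lambert absorption. Here is that core restated as a small Python sketch (my own simplification, not a drop-in for the HLSL): each step turns its density sample into an opacity, adds the light that slab contributes, and scales down the transmittance that survives past it.

```python
import math

def march_density(samples, step_size, density):
    """Front-to-back compositing over a list of density samples,
    mirroring the core of the HLSL loop (lighting omitted): each step
    converts its sample to an opacity via Beer-Lambert absorption,
    then scales the surviving transmittance."""
    transmittance = 1.0
    energy = 0.0
    for s in samples:
        cur_density = 1.0 - math.exp(-s * density * step_size)
        energy += cur_density * transmittance   # light this slab contributes
        transmittance *= 1.0 - cur_density      # light that survives past it
    return energy, transmittance
```

A handy sanity check on this form: with a constant light term of 1, the accumulated energy and the remaining transmittance always sum to 1, which is why the shader can return them together in a single float4 for emissive colour and opacity.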
So, the hard part is over. Let's get artistic with values! Taking the time to input proper values for the light colours is an incredibly important step toward a believable volume. I am sure this will become far more apparent once I can feed in some more realistic-looking shapes for the volume. Perhaps I will do that with some math too, something built on noise? Maybe in the next post. All that's left to do here is to fill in the node network with some constants and values.
At the top of this post you can find the finished product of this work -- feel free to copy and paste the code into your own custom material node! Stage one, Volumes -- COMPLETE!!