Graphics Runner<br />
Delusions of graphics programming and other ramblings<br />
Kyle Hayward<br />
<br />
<b style="color: orange;">Optix.NET: a managed wrapper for Nvidia Optix</b> (2012-02-02)<br />
Today I released an open source project I’ve been working on for the last week: Optix.NET, a lightweight wrapper around Nvidia’s Optix GPU ray-tracing library. Since there aren’t any .NET wrappers around, I figured I’d go ahead and make one myself. The project started out as a curiosity about managed wrappers and a bit of a learning experience working with C++/CLI. It’s still in the alpha stage of development, so there may be some bugs; if you have any suggestions, feel free to drop me a line. The math library is pretty sparse at the moment, as I’ve only implemented the functions I needed.<br />
<br />
Optix.NET may head in the direction of CUDAfy, where you can write your Optix programs in-line with your C#. The current downside of Optix.NET is that you cannot share structs/classes with your Optix programs, as you can when working with the original C/C++ library.<br />
<br />
The Optix.NET SDK also comes with a (at the moment very) basic demo framework for creating Optix applications, including a basic OBJ model loader and a simple camera.<br />
<br />
As I touched on in my last post on Instant Radiosity, the general flow of Optix is:<br />
<ul>
<li>Create a context <ul>
<li>This is similar to a D3D Device. </li>
</ul>
</li>
<li>Create material programs <ul>
<li>These will run when there is an intersection and are akin to pixel shaders. </li>
</ul>
</li>
<li>Create intersection programs <ul>
<li>These are responsible for performing ray-geometry intersection. </li>
</ul>
</li>
<li>Create the main entry program / ray-generation program <ul>
<li>These will launch eye rays in a typical pinhole camera ray-tracer </li>
</ul>
</li>
<li>Load geometry data and create a scene hierarchy </li>
<li>Perform ray-tracing and display results. </li>
</ul>
To get a very good introduction to Optix I recommend following the programming guide and quick start guide that come with the Optix SDK. Let’s get to it then.<br />
<br />
This small tutorial will walk through the steps of Sample 6 in the Optix.NET SDK and create a simple program that ray-traces a cow and shades it with its interpolated normals.<br />
<br />
<b style="color: orange;">Creating the Optix Context</b>
<span style="color: orange;"><br /></span><br />
<pre class="mycodeSmall">Context = <span style="color: blue;">new </span><span style="color: #2b91af;">Context</span>();
Context.RayTypeCount = 1;
Context.EntryPointCount = 1;</pre>
<br />
Here we create our rendering context :-). We also set the ray type count, which tells Optix how many different types of rays will be traversing the scene (e.g. eye rays, indirect rays, shadow rays, etc.). EntryPointCount sets the number of main entry programs there will be.<br />
<br />
<span style="color: orange;"><b>Creating the material</b></span>
<span style="color: orange;"><br /></span><br />
<pre class="mycodeSmall"><span style="color: #2b91af;">Material </span>material = <span style="color: blue;">new </span><span style="color: #2b91af;">Material</span>( Context );
material.Programs[ 0 ] = <span style="color: blue;">new </span><span style="color: #2b91af;">SurfaceProgram</span>( Context, <span style="color: #2b91af;">RayHitType</span>.Closest, shaderPath, <span style="color: #a31515;">"closest_hit_radiance" </span>);</pre>
<br />
This creates the material that the geometry will use, assigns it a SurfaceProgram (similar to a pixel shader), and tells Optix to run this shader on the closest ray-geometry intersection so that there is proper depth sorting.<br />
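The CUDA side of this material, the <code>closest_hit_radiance</code> program, isn’t shown in this post (it lives in the sample’s .cu shader file). Since this sample shades the cow with its interpolated normals, the heart of that program is just a remap of the surface normal into a color. Here is a plain C++ sketch of that mapping, packed BGRA to match the UByte4 output buffer; the struct and function names are mine, not from the SDK:

```cpp
#include <cstdint>
#include <cmath>

// 8-bit BGRA pixel, matching the UByte4 output buffer drawn with GL_BGRA.
struct Color4 { uint8_t b, g, r, a; };

// Remap a normal from [-1, 1] per component into [0, 255], the classic
// "shade by normal" visualization a closest-hit program would compute.
Color4 normalToColor(float nx, float ny, float nz)
{
    // normalize defensively in case interpolation denormalized the vector
    float len = std::sqrt(nx * nx + ny * ny + nz * nz);
    nx /= len; ny /= len; nz /= len;

    auto pack = [](float c) {
        return static_cast<uint8_t>((c * 0.5f + 0.5f) * 255.0f + 0.5f);
    };
    return Color4{ pack(nz), pack(ny), pack(nx), 255 };
}
```

A normal pointing straight along +x maps to a pure-red pixel, which is why the cow render below shows the familiar rainbow of surface orientations.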
<br />
<b style="color: orange;">Creating geometry</b><br />
<br />
Next the geometry is loaded. For brevity’s sake that part is omitted, but I’ll show the important part: how you get your geometry into Optix.<br />
<br />
First we create geometry buffers, similar to vertex and index buffers in D3D, and fill them with the positions, normals, texture coordinates, and triangle indices.<br />
<br />
<pre class="mycodeSmall"><span style="color: green;">//create buffer descriptions</span>
<span style="color: #2b91af;">BufferDesc </span>vDesc = <span style="color: blue;">new </span><span style="color: #2b91af;">BufferDesc</span>() { Width = (<span style="color: blue;">uint</span>)mVertices.Count, Format = <span style="color: #2b91af;">Format</span>.Float3, Type = <span style="color: #2b91af;">BufferType</span>.Input };
<span style="color: #2b91af;">BufferDesc </span>nDesc = <span style="color: blue;">new </span><span style="color: #2b91af;">BufferDesc</span>() { Width = (<span style="color: blue;">uint</span>)mNormals.Count, Format = <span style="color: #2b91af;">Format</span>.Float3, Type = <span style="color: #2b91af;">BufferType</span>.Input };
<span style="color: #2b91af;">BufferDesc </span>tcDesc = <span style="color: blue;">new </span><span style="color: #2b91af;">BufferDesc</span>(){ Width = (<span style="color: blue;">uint</span>)mTexcoords.Count, Format = <span style="color: #2b91af;">Format</span>.Float2, Type = <span style="color: #2b91af;">BufferType</span>.Input };
<span style="color: #2b91af;">BufferDesc </span>iDesc = <span style="color: blue;">new </span><span style="color: #2b91af;">BufferDesc</span>() { Width = (<span style="color: blue;">uint</span>)mIndices.Count, Format = <span style="color: #2b91af;">Format</span>.Int3, Type = <span style="color: #2b91af;">BufferType</span>.Input };
<span style="color: green;">// Create the buffers to hold our geometry data</span>
Optix.<span style="color: #2b91af;">Buffer </span>vBuffer = <span style="color: blue;">new </span>Optix.<span style="color: #2b91af;">Buffer</span>( Context, <span style="color: blue;">ref </span>vDesc );
Optix.<span style="color: #2b91af;">Buffer </span>nBuffer = <span style="color: blue;">new </span>Optix.<span style="color: #2b91af;">Buffer</span>( Context, <span style="color: blue;">ref </span>nDesc );
Optix.<span style="color: #2b91af;">Buffer </span>tcBuffer = <span style="color: blue;">new </span>Optix.<span style="color: #2b91af;">Buffer</span>( Context, <span style="color: blue;">ref </span>tcDesc );
Optix.<span style="color: #2b91af;">Buffer </span>iBuffer = <span style="color: blue;">new </span>Optix.<span style="color: #2b91af;">Buffer</span>( Context, <span style="color: blue;">ref </span>iDesc );
vBuffer.SetData<<span style="color: #2b91af;">Vector3</span>>( mVertices.ToArray() );
nBuffer.SetData<<span style="color: #2b91af;">Vector3</span>>( mNormals.ToArray() );
tcBuffer.SetData<<span style="color: #2b91af;">Vector2</span>>( mTexcoords.ToArray() );
iBuffer.SetData<<span style="color: #2b91af;">Int3</span>>( mIndices.ToArray() );</pre>
<br />
Next we create a Geometry node that will tell Optix what intersection programs to use, how many primitives our geometry has, and creates shader variables to hold the geometry buffers.<br />
<br />
<pre class="mycodeSmall"><span style="color: green;">//create a geometry node and set the buffers</span>
<span style="color: #2b91af;">Geometry </span>geometry = <span style="color: blue;">new </span><span style="color: #2b91af;">Geometry</span>( Context );
geometry.IntersectionProgram = <span style="color: blue;">new </span><span style="color: #2b91af;">Program</span>( Context, IntersectionProgPath, IntersectionProgName );
geometry.BoundingBoxProgram = <span style="color: blue;">new </span><span style="color: #2b91af;">Program</span>( Context, BoundingBoxProgPath, BoundingBoxProgName );
geometry.PrimitiveCount = (<span style="color: blue;">uint</span>)mIndices.Count;
geometry[ <span style="color: #a31515;">"vertex_buffer" </span>].Set( vBuffer );
geometry[ <span style="color: #a31515;">"normal_buffer" </span>].Set( nBuffer );
geometry[ <span style="color: #a31515;">"texcoord_buffer" </span>].Set( tcBuffer );
geometry[ <span style="color: #a31515;">"index_buffer" </span>].Set( iBuffer );</pre>
<br />
Now we create a GeometryInstance that pairs a Geometry node with a Material (that we created earlier ).<br />
<br />
<pre class="mycodeSmall"><span style="color: green;">//create a geometry instance</span>
<span style="color: #2b91af;">GeometryInstance </span>instance = <span style="color: blue;">new </span><span style="color: #2b91af;">GeometryInstance</span>( Context );
instance.Geometry = geometry;
instance.AddMaterial( material );
<span style="color: green;">//create an acceleration structure for the geometry</span>
<span style="color: #2b91af;">Acceleration </span>accel = <span style="color: blue;">new </span><span style="color: #2b91af;">Acceleration</span>( Context, <span style="color: #2b91af;">AccelBuilder</span>.Sbvh, <span style="color: #2b91af;">AccelTraverser</span>.Bvh );
accel.VertexBufferName = <span style="color: #a31515;">"vertex_buffer"</span>;
accel.IndexBufferName = <span style="color: #a31515;">"index_buffer"</span>;</pre>
<br />
We then create an Acceleration structure ( or Bounding Volume Hierarchy ): a spatial data structure that optimizes the ray traversal of the geometry. Here we create the Acceleration node with a Split BVH builder and a BVH traverser, which informs Optix how the BVH should be built and traversed. We also give the Acceleration structure the names of the vertex and index buffers so that it can use that data to optimize the building of the Split BVH (assigning the buffer names is only required with the Sbvh and TriangleKdTree AccelBuilders ). <br />
<br />
Next we create a top-level node to hold our hierarchy. We give it the acceleration structure and the geometry instance. Optix will use this top-level node to begin its scene traversal.<br />
<br />
<pre class="mycodeSmall"><span style="color: green;">//now attach the instance and accel to the geometry group</span>
<span style="color: #2b91af;">GeometryGroup</span> GeoGroup = <span style="color: blue;">new </span><span style="color: #2b91af;">GeometryGroup</span>( Context );
GeoGroup.Acceleration = accel;
GeoGroup.AddChild( instance );</pre>
<br />
<b style="color: orange;">Create ray generation program</b><br />
<br />
Now we create our main entry ray-generation program and set it on the Context. It will be responsible for creating the pinhole camera rays.<br />
<br />
<pre class="mycodeSmall"><span style="color: #2b91af;">Program </span>rayGen = <span style="color: blue;">new </span><span style="color: #2b91af;">Program</span>( Context, rayGenPath, <span style="color: #a31515;">"pinhole_camera" </span>);
Context.SetRayGenerationProgram( 0, rayGen );</pre>
<br />
<b style="color: orange;">Create the output buffer and compile the Optix scene</b><br />
<br />
Finally, we create our output buffer, making sure to define its format and type. The BufferType in Optix defines how the buffer will be used. The BufferTypes are: Input, Output, InputOutput, and Local. The first three are self-explanatory. Local sets up the buffer to live entirely on the GPU, which is a huge performance win in multi-GPU setups as it doesn’t require copying the buffer from GPU memory to main memory after every launch. Local buffers are typically used for intermediate results ( such as accumulation buffers for iterative GI ).<br />
<br />
<pre class="mycodeSmall"><span style="color: #2b91af;">BufferDesc </span>desc = <span style="color: blue;">new </span><span style="color: #2b91af;">BufferDesc</span>() { Width = (<span style="color: blue;">uint</span>)Width, Height = (<span style="color: blue;">uint</span>)Height, Format = <span style="color: #2b91af;">Format</span>.UByte4, Type = <span style="color: #2b91af;">BufferType</span>.Output };
OutputBuffer = <span style="color: blue;">new </span>Optix.<span style="color: #2b91af;">Buffer</span>( Context, <span style="color: blue;">ref </span>desc );</pre>
<br />
Now we set up shader variables that will hold our top-level GeometryGroup and OutputBuffer. Then we compile Optix (this validates that our node layout and programs are correct) and build the acceleration tree, which only needs to be done at initialization or when the geometry changes.<br />
<br />
<pre class="mycodeSmall">Context[ <span style="color: #a31515;">"top_object" </span>].Set( model.GeoGroup );
Context[ <span style="color: #a31515;">"output_buffer" </span>].Set( OutputBuffer );
Context.Compile();
Context.BuildAccelTree();</pre>
<br />
<b style="color: orange;">Ray-tracing and displaying results</b><br />
<br />
To ray-trace the scene we call Launch, giving it our 2D launch dimensions and the index of our main entry program (zero).<br />
<br />
<pre class="mycodeSmall">Context.Launch( 0, Width, Height );
</pre>
<br />
And to display the results, we map the output buffer to get a pointer to its data and hand it to OpenGL’s glDrawPixels:<br />
<br />
<pre class="mycodeSmall"><span style="color: #2b91af;">BufferStream </span>stream = OutputBuffer.Map();
<span style="color: #2b91af;">Gl</span>.glDrawPixels( Width, Height, <span style="color: #2b91af;">Gl</span>.GL_BGRA, <span style="color: #2b91af;">Gl</span>.GL_UNSIGNED_BYTE, stream.DataPointer );
OutputBuffer.Unmap();</pre>
<br />
Results:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkWD1YKa7QMaxqetERg946YfOvAPDmsoJ5fzu-TMd8Hk2OVtO7z9J8ACkPG4GlbgcxQLDBQG9cqIwQ4GxnbTEXrfHBbU2zcl3NQv4GnuSxHxSWGVoY9OzviUI8ngbEyNzpTVxVSfVf3q2U/s1600-h/cow%25255B8%25255D.png"><img alt="cow" height="354" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRhv8kzJ0pFxeGu8xRItVfijdDjexWaxrqXaH9DShJ4FoGK-eejE6u84CgOuEK4Qd2NrGYv9LtQP9CiQZKxqc7MGmWnNvZCGeg-XnR2IfPtdN6QtsKyWDpWmg1pX6ix_NsLuvnUYOYrOaN/?imgmax=800" style="display: inline;" title="cow" width="449" /></a><br />
<br />
And that’s pretty much it: a fairly simple program for doing GPU ray-tracing :-).<br />
<br />
The current source, samples, and built executables are freely downloadable. Currently I’ve got 5 samples that mimic the Optix SDK samples, and I’ll continue to add more to test functionality and eye candy.<br />
<br />
You can download the current release and source here:<br />
<a href="http://optixdotnet.codeplex.com/">http://optixdotnet.codeplex.com/</a><br />
<br />
Or get the source directly with Mercurial here:<br />
<a href="https://hg01.codeplex.com/optixdotnet">https://hg01.codeplex.com/optixdotnet</a><br />
<br />
<b style="color: orange;">New Prey 2 E3 Trailer</b> (2011-06-04)<br />
Bethesda released a new trailer for Prey 2. I'm not gonna lie, it looks awesome.<br />
<br />
<iframe allowfullscreen="" frameborder="0" height="349" src="http://www.youtube.com/embed/5h2TkpFEsn8" width="560"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com0tag:blogger.com,1999:blog-1023441640234597436.post-41333939689873791622011-04-07T15:10:00.001-04:002011-07-28T11:55:52.250-04:00Prey 2 Concept and ScreenshotBethesda posted a screenshot and concept piece from Prey 2 today: <a href="http://bethblog.com/index.php/2011/04/07/sneak-peek-at-prey-2/">http://bethblog.com/index.php/2011/04/07/sneak-peek-at-prey-2/</a><br />
<br />
<div class="separator" style="clear: both; text-align: center;"><a href="http://farm6.static.flickr.com/5069/5597986810_79201b61d8_b.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="209" src="http://farm6.static.flickr.com/5069/5597986810_79201b61d8_b.jpg" width="480" /></a></div><br />
<div class="separator" style="clear: both; text-align: center;"><a href="http://farm6.static.flickr.com/5187/5598199064_67d204a3d7_b.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="294" src="http://farm6.static.flickr.com/5187/5598199064_67d204a3d7_b.jpg" width="480" /></a></div><br />
<b style="color: orange;">Instant Radiosity using Optix and Deferred Rendering</b> (2011-03-28)<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizOT2b3BfhIe-m4mJdIXIaGiOtirpv_PZlEFp3e8dpGxQdGyx3zwPhWEIVn4V_cry8vG8xrMweE3ghyphenhyphengG6e2iSDjiC7ihpFDXnOmwFqwwOC3i_qhyphenhyphenM4rFR50vF5dLU_GCNuihDNIxM3foc/s1600-h/cornell_gi_0%5B5%5D.png"><img alt="cornell_gi_0" border="0" height="364" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtH8RSL0ofGNFJm1ymxvTqQEoe_I5XErs8vg9SluTGgFtOl0eRzrt5X1518a64XeFyZX5U4xr3plY37f189-ecQvJnpS6vvRdMclzpQAGFVMDrcyJSiqQ4VjCfQEhDG5QNm5X6uEeP0rl4/?imgmax=800" style="background-image: none; border-bottom-width: 0px; border-left-width: 0px; border-right-width: 0px; border-top-width: 0px; display: inline; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="cornell_gi_0" width="484" /></a><br />
<br />
This comes a little later than I wanted; I hadn’t factored in Crysis 2 taking up as much of my time as it did last week :-)<br />
<br />
I’ve been using Nvidia’s <a href="http://www.nvidia.com/object/optix.html" target="_blank">Optix raytracing API</a> for quite some time, and decided that a good introduction to Optix and what it can do for you would be to use it in an Instant Radiosity demo. The demo is fairly large, so I won’t cover all of it here (that would make for entirely too long a post), just the main parts.<br />
<br />
<b><span style="color: #f79646;">Instant Radiosity</span></b><br />
<br />
Instant Radiosity is a global illumination algorithm that approximates the diffuse radiance of a scene by placing many virtual point lights that act as indirect light. The algorithm is fairly simple: for each light in the scene you cast N photon rays into the scene. At each intersection the photon either bounces, casting another ray, or, through Russian Roulette, is killed off. At each of the intersections you create a Virtual Point Light (VPL) with the same radiance value as the photon. Once you have these VPLs you render them as you would any other light source.<br />
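The photon loop just described can be sketched in C++. This is my own simplified, scalar version with a fixed survival probability; the demo’s actual code traces full RGB radiance on the GPU:

```cpp
#include <vector>

// One photon path in scalar form: each bounce multiplies the photon's power
// by the surface albedo and deposits a VPL with that power. Russian Roulette
// then kills the photon with probability (1 - survive); survivors are scaled
// by 1/survive so the expected contribution per bounce is unchanged (unbiased).
// `rng` supplies one uniform [0,1) draw per bounce, making the path
// deterministic for testing; its length caps the number of bounces.
std::vector<float> tracePhotonPowers(float lightPower, float albedo,
                                     float survive,
                                     const std::vector<float>& rng)
{
    std::vector<float> vplPowers;
    float power = lightPower;
    for (float u : rng) {
        power *= albedo;            // photon loses energy at the bounce
        vplPowers.push_back(power); // deposit a VPL at this hit point
        if (u >= survive)           // Russian Roulette: photon is killed off
            break;
        power /= survive;           // survivor compensates for killed paths
    }
    return vplPowers;
}
```

With survive = 0.5, half the photons die at each bounce on average, and each survivor’s power is doubled, so the expected energy deposited at bounce k stays exactly lightPower · albedo^k.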
<br />
One optimization the demo makes is to divide the scene into a regular grid. For each grid voxel, we find all the VPLs in the voxel and merge them into a single new VPL that represents them. Voxels that don’t contain any VPLs are skipped. This dramatically reduces the number of VPLs we need to render, trading indirect-lighting accuracy for speed. The following couple of shots demonstrate the idea. The image on the left shows the VPLs as calculated from our Optix program. The image on the right shows the merged VPLs.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhkdS0natlQUrslbRuso-qTKV_UmX3pCSRRUUSFw7a7zGgILmYfYcTGhzpa1SBMEkWJjF9IJNvSm37n3XdKmbRUDzCQ6gDSzN5igXTCHEsofbJdLKCwYogp9JdUn-rcggrC4bz5OYBsO2HQ/s1600-h/scattered_vpl8.png"><img alt="scattered_vpl" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiuhyphenhyphenDZFCgpIZFX6tfjkD9OhjgHMZtJlIu4tNPXOOcu44fP7Ya6oMuHLcZPmppjpjQU5XLoKkAS54nKq5GdVXMGiRYcfoNOIomNwc9J_Uqch5TpnVnOfRLvNvl0Rz12pkoyJJSPeLb4mBPp/?imgmax=800" style="display: inline;" title="scattered_vpl" width="260" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgeOkivyNNLiPQ1rfsvU-7lfY2AaQMqWzRGzKTY-3Rf415K9Zh6iHyU23CEJ3Q-DI4GxuSGHurkiDbPd3lFir9yymMUv2fiG_AmTa2SsBqzvuUzcewityGaHY9yTULw1P_0U4305BmyUvjz/s1600-h/grid_vpl3.png"><img alt="grid_vpl" border="0" height="180" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiw5uZOpJpbZmNO71agNImvoEsAGErhPTHzRK_ma41N-hJMOJVtYWg_jhFnu8oWhJ8cxeApH3pKCPa41h6Ut6AWJelNMK6kIztpbrzzRNdiI8kk6L45h9iD1DJmll9TmSRo4Vx0uCni_e8r/?imgmax=800" style="background-image: none; border-bottom-width: 0px; border-left-width: 0px; border-right-width: 0px; border-top-width: 0px; display: inline; padding-left: 0px; padding-right: 0px; padding-top: 0px;" title="grid_vpl" width="260" /></a><br />
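The merge itself can be sketched in one dimension (my own simplification; the demo uses a 3D grid over the scene’s bounding box): map each VPL’s position to a uniform cell and accumulate its radiance there.

```cpp
#include <vector>
#include <cstddef>

// Map a 1D position in [boxMin, boxMax) onto one of `cells` uniform voxels.
int voxelIndex(float p, float boxMin, float boxMax, int cells)
{
    float t = (p - boxMin) / (boxMax - boxMin); // normalize to [0,1)
    int i = static_cast<int>(t * cells);
    if (i < 0) i = 0;                           // clamp boundary cases
    if (i >= cells) i = cells - 1;
    return i;
}

// Accumulate every VPL's radiance into its voxel; VPLs landing in the same
// cell collapse into one merged light, which is the demo's main optimization.
void mergeVPLs(const std::vector<float>& positions,
               const std::vector<float>& radiances,
               float boxMin, float boxMax,
               std::vector<float>& grid)
{
    for (std::size_t k = 0; k < positions.size(); ++k)
        grid[voxelIndex(positions[k], boxMin, boxMax,
                        static_cast<int>(grid.size()))] += radiances[k];
}
```

Cells whose accumulated radiance stays at zero are simply skipped at render time, which is where the big reduction in light count comes from.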
<br />
<b><span style="color: #f79646;">Optix</span></b><br />
<br />
Optix is Nvidia’s ray tracing API that runs on Nvidia GPUs (G80 and up). Giving a full overview of Optix could take many blog posts, so I won’t go into that much depth here. There are a couple of SIGGRAPH presentations that give a good overview:<br />
<br />
<a href="http://nvidia.fullviewmedia.com/siggraph2010/04-dev-austin-robison.html" target="_blank">http://nvidia.fullviewmedia.com/siggraph2010/04-dev-austin-robison.html</a><br />
<a href="http://graphics.cs.williams.edu/papers/OptiXSIGGRAPH10/" target="_blank">http://graphics.cs.williams.edu/papers/OptiXSIGGRAPH10/</a><br />
<br />
To create an Optix program you essentially need two things: a ray-generation program and a material program ( essentially a shader ) that gets called when a ray intersects geometry. The ray-generation program does exactly what it sounds like: it generates rays, and it is called once for each pixel of your launch dimensions. Rays cast by your ray-generation program traverse the scene looking for intersections; once a ray intersects geometry, its material program is called. The material program is responsible for, say, shading in a classic ray tracer, or any other computation you want to perform. In our case we’ll use it to create our Virtual Point Lights. So let’s get down to business.<br />
<br />
Here we have the ray-generation program that will cast rays from a light. In the case of our Cornell box room, we have an area light at the ceiling and we need to cast photons from it.<br />
<br />
<pre class="mycode">RT_PROGRAM <span style="color: blue;">void </span>instant_radiosity()
{
<span style="color: green;"> //get our random seed
</span><span style="color: green;"> </span><span style="color: blue;">uint2 </span>seed = seed_buffer[ launch_index ];
<span style="color: green;"> </span><span style="color: green;">//create a random photon direction
</span><span style="color: green;"> </span><span style="color: blue;">float2 </span>raySeed = make_float2( ( (<span style="color: blue;">float</span>)launch_index.x + rnd( seed.x ) ) / (<span style="color: blue;">float</span>)launch_dim.x,
( (<span style="color: blue;">float</span>)launch_index.y + rnd( seed.y ) ) / (<span style="color: blue;">float</span>)launch_dim.y );
<span style="color: green;"> </span><span style="color: blue;">float3 </span>origin = Light.position;
<span style="color: green;"> </span><span style="color: blue;">float3 </span>direction = generateHemisphereLightPhoton( raySeed, Light.direction );
<span style="color: green;"> </span><span style="color: green;">//create our ray
</span><span style="color: green;"> </span>optix::Ray ray(origin, direction, radiance_ray_type, scene_epsilon );
<span style="color: green;"> </span><span style="color: green;">//create our ray data packet and launch a ray
</span><span style="color: green;"> </span>PerRayData_radiance prd;
<span style="color: green;"> </span>prd.radiance = Light.color * Light.intensity * IndirectIntensity;
<span style="color: green;"> </span>prd.bounce = 0;
<span style="color: green;"> </span>prd.seed = seed;
<span style="color: green;"> </span>prd.index = ( launch_index.y * launch_dim.x + launch_index.x ) * MaxBounces;
<span style="color: green;"> </span>rtTrace( top_object, ray, prd );
}</pre><br />
So here we cast a randomly oriented ray from a hemisphere oriented about the direction of the light. Once we have our ray, we set up a ray data packet that will collect data as the ray traverses the scene. To cast the ray we call rtTrace, providing the ray and its data packet.<br />
<br />
Next we have our material program. This program is called when a ray hits the closest piece of geometry along its path. It is responsible for updating the ray data packet, placing a VPL, and deciding whether to cast another ray recursively if we’re under the maximum number of bounces.<br />
<br />
<pre class="mycode">RT_PROGRAM <span style="color: blue;">void </span>closest_hit_radiosity()
{
<span style="color: green;"> </span><span style="color: green;">//convert the geometry's normal to world space
</span><span style="color: green;"> </span><span style="color: green;">//RT_OBJECT_TO_WORLD is an Optix provided transformation
</span><span style="color: green;"> </span><span style="color: blue;">float3 </span>world_shading_normal = normalize( rtTransformNormal( RT_OBJECT_TO_WORLD, shading_normal ) );
<span style="color: green;"> </span><span style="color: blue;">float3 </span>world_geometric_normal = normalize( rtTransformNormal( RT_OBJECT_TO_WORLD, geometric_normal ) );
<span style="color: green;"> </span><span style="color: blue;">float3 </span>ffnormal = faceforward( world_shading_normal, -ray.direction, world_geometric_normal );
<span style="color: green;"> </span><span style="color: green;">//calculate the hitpoint of the ray
</span><span style="color: green;"> </span><span style="color: blue;">float3 </span>hit_point = ray.origin + t_hit * ray.direction;
<span style="color: green;"> </span><span style="color: green;">//sample the texture for the geometry
</span><span style="color: green;"> </span><span style="color: blue;">float3 </span>Kd = norm_rgb( <span style="color: blue;">tex2D</span>( diffuseTex, texcoord.x, texcoord.y ) );
<span style="color: green;"> </span>Kd = pow3f( Kd, 2.2f ); <span style="color: green;">//convert to linear space
</span><span style="color: green;"> </span>Kd *= make_float3( diffuseColor ); <span style="color: green;">//multiply the diffuse material color
</span><span style="color: green;"> </span>prd_radiance.radiance = Kd * prd_radiance.radiance; <span style="color: green;">//calculate the ray's new radiance value
</span><span style="color: green;"> </span><span style="color: green;">// We hit a diffuse surface; record a VPL at this hit
</span><span style="color: green;"> </span><span style="color: blue;">if</span>( prd_radiance.bounce >= 0 ) {
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;">//offset the light a bit from the hit point
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">float3 </span>lightPos = ray.origin + ( t_hit - 0.1f ) * ray.direction;
<span style="color: green;"> </span><span style="color: green;"> </span>VirtualPointLight& vpl = output_vpls[ prd_radiance.index + prd_radiance.bounce ];
<span style="color: green;"> </span><span style="color: green;"> </span>vpl.position = lightPos;
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;">//the light's intensity is divided equally among the photons. Each photon starts out with an intensity
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;">//equal to the light. So here we must divide by the number of photons cast from the light.
</span><span style="color: green;"> </span><span style="color: green;"> </span>vpl.radiance = prd_radiance.radiance * 1.0f / ( launch_dim.x * launch_dim.y );
<span style="color: green;"> </span>}
<span style="color: green;"> </span><span style="color: green;">//if we're less than the max number of bounces shoot another ray
</span><span style="color: green;"> </span><span style="color: green;">//we could also implement Russian Roulette here so that we would have a less biased solution
</span><span style="color: green;"> </span>prd_radiance.bounce++;
<span style="color: green;"> </span><span style="color: blue;">if </span>( prd_radiance.bounce >= MaxBounces )
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">return</span>;
<span style="color: green;"> </span><span style="color: green;">//here we "rotate" the seeds in order to have a little more variance
</span><span style="color: green;"> </span>prd_radiance.seed.x = prd_radiance.seed.x ^ prd_radiance.bounce;
<span style="color: green;"> </span>prd_radiance.seed.y = prd_radiance.seed.y ^ prd_radiance.bounce;
<span style="color: green;"> </span><span style="color: blue;">float2 </span>seed_direction = make_float2( ( (<span style="color: blue;">float</span>)launch_index.x + rnd( prd_radiance.seed.x ) ) / (<span style="color: blue;">float</span>)launch_dim.x,
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>( (<span style="color: blue;">float</span>)launch_index.y + rnd( prd_radiance.seed.y ) ) / (<span style="color: blue;">float</span>)launch_dim.y );
<span style="color: green;"> </span><span style="color: green;">//generate a new ray in the hemisphere oriented to the surface
</span><span style="color: green;"> </span><span style="color: blue;">float3 </span>new_ray_dir = generateHemisphereLightPhoton( seed_direction, ffnormal );
<span style="color: green;"> </span><span style="color: green;">//cast a new ray into the scene
</span><span style="color: green;"> </span>optix::Ray new_ray( hit_point, new_ray_dir, radiance_ray_type, scene_epsilon );
<span style="color: green;"> </span>rtTrace(top_object, new_ray, prd_radiance);
}</pre><br />
With both of these programs created, we launch our Optix program to generate the VPLs. When it’s done running, we gather all the VPLs into a grid, merging lights that fall in the same voxel. Once the VPLs are merged, we add them to the deferred renderer.<br />
<br />
<pre class="mycode"><span style="color: green;">//run our optix program
</span>mContext->launch( 0, SqrtNumVPLs, SqrtNumVPLs );
<span style="color: green;">//get a pointer to the GPU buffer of virtual point lights.
</span>VirtualPointLight* lights = <span style="color: blue;">static_cast</span>< VirtualPointLight* >( mContext[<span style="color: #a31515;">"output_vpls"</span>]->getBuffer()->map() );
<span style="color: green;">//the following block merges the scattered vpls into a structured grid of vpls
//this helps dramatically reduce the number of vpls we need in the scene
</span><span style="color: blue;">if</span>( mMergeVPLs )
{
<span style="color: green;"> </span><span style="color: green;">//Here we traverse over the VPLs and we merge all the lights that are in a cell
</span><span style="color: green;"> </span><span style="color: blue;">for</span>( <span style="color: blue;">int </span>i = 0; i < TotalVPLs; ++i )
<span style="color: green;"> </span>{
<span style="color: green;"> </span><span style="color: green;"> </span>optix::Aabb node = mBoundingBox;
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;">//start with the root cell and recursively traverse the grid to find the cell this vpl belongs to
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">int </span>index = 0;
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">if</span>( FindCellIndex( mBoundingBox, -1, mVoxelExtent, lights[ i ].position, index ) )
<span style="color: green;"> </span><span style="color: green;"> </span>{
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;">//make sure we found a valid cell
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>assert( index >= mFirstLeafIndex );
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;">//subtract the first leaf index to find the zero based index of the vpl
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>index -= mFirstLeafIndex;
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">float3</span>& light = mVPLs[ index ];
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>light += lights[ i ].radiance;
<span style="color: green;"> </span><span style="color: green;"> </span>}
<span style="color: green;"> </span>}
<span style="color: green;"> </span><span style="color: green;">//once the VPLs have been merged, add them to the renderer as indirect lights
</span><span style="color: green;"> </span><span style="color: blue;">int </span>numLights = 0;
<span style="color: green;"> </span><span style="color: blue;">int </span>lastIndex = -1;
<span style="color: green;"> </span><span style="color: blue;">for</span>( <span style="color: blue;">int </span>i = 0; i < mVPLs.size(); ++i )
<span style="color: green;"> </span>{
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">const float3</span>& vpl = mVPLs[i];
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">if</span>( dot( vpl, vpl ) <= 0.0f )
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">continue</span>;
<span style="color: green;"> </span><span style="color: green;"> </span>numLights++;
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: blue;">float3 </span>radiance = vpl;
<span style="color: green;"> </span><span style="color: green;"> </span>D3DXVECTOR3 pos = *(D3DXVECTOR3*)&mVoxels[i].center();
<span style="color: green;"> </span><span style="color: green;"> </span>Light light = { LIGHT_POINT, <span style="color: green;">//type
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>GetColorValue(radiance.x, radiance.y, radiance.z, 1.0f), <span style="color: green;">//diffuse
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>pos, <span style="color: green;">//pos
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>Vector3Zero, <span style="color: green;">//direction
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>1.0f <span style="color: green;">//intensity
</span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;"> </span>};
<span style="color: green;"> </span><span style="color: green;"> </span>renderer->AddIndirectLight( light );
<span style="color: green;"> </span><span style="color: green;"> </span><span style="color: green;">//also add as a light source so we can visualize the VPLs
</span><span style="color: green;"> </span><span style="color: green;"> </span>LightSource lightSource;
<span style="color: green;"> </span><span style="color: green;"> </span>lightSource.light = light;
<span style="color: green;"> </span><span style="color: green;"> </span>lightSource.Model = mLightModel;
<span style="color: green;"> </span><span style="color: green;"> </span>renderer->AddLightSource( lightSource );
<span style="color: green;"> </span>}
}
<span style="color: green;">//release the VPL buffer now that we are done reading from it
</span>mContext[<span style="color: #a31515;">"output_vpls"</span>]->getBuffer()->unmap();</pre><br />
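The FindCellIndex call above recursively walks the spatial hierarchy to find the leaf cell a VPL falls into. As a simplified sketch (my own flattened uniform-grid version, not the demo's actual octree code — the names and resolution are stand-ins), the same lookup could be written as:

```cpp
#include <cassert>
#include <cmath>

struct Float3 { float x, y, z; };
struct Aabb   { Float3 min, max; };

// Map a world-space position to a flat cell index inside a uniform grid
// that subdivides the scene bounds into res^3 voxels. Returns false when
// the position lies outside the bounds (mirroring FindCellIndex failing).
bool FindCellIndexUniform( const Aabb& bounds, int res, const Float3& p, int& index )
{
    const float ex = bounds.max.x - bounds.min.x;
    const float ey = bounds.max.y - bounds.min.y;
    const float ez = bounds.max.z - bounds.min.z;

    const int ix = (int)std::floor( ( p.x - bounds.min.x ) / ex * res );
    const int iy = (int)std::floor( ( p.y - bounds.min.y ) / ey * res );
    const int iz = (int)std::floor( ( p.z - bounds.min.z ) / ez * res );

    if( ix < 0 || ix >= res || iy < 0 || iy >= res || iz < 0 || iz >= res )
        return false;

    index = ( iz * res + iy ) * res + ix;
    return true;
}
```

Each VPL's radiance is then accumulated into the cell it maps to, just as in the merge loop above, so many scattered VPLs collapse into at most one light per occupied cell.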
Now for some eye candy. The first set is the typical Cornell box + dragon. In the Instant Radiosity shot you can see the light bleeding from the green and red walls onto the floor, the dragon, and the box.<br />
<br />
Direct lighting:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7a7NBxSx3nrEm7kIDo4taI4EY8TjSmWzt8ltZTszV1wGxT7AfgnR6DFsi6bfdhbHo7SKIUoE1g9292_sRCZ0OjHsJDXCemrHlqWxiEJxv9IUx8B3FUDdYJNZdQ6ghgGc-mGDfLwE2fpYN/s1600-h/cornell_dl_0%5B4%5D.png"><img alt="cornell_dl_0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEheF1FMhMiqmRCdUKKYv0Yd8WJ9rLQjONdHgBPkNYYhr5h7TG3TG6zRaq7vnhOnzuTtGu-gU9k2vbYte6PfD3xHJfgGJ7HhWGggpmnUB9f1DVEpbQKaQXKu1SV8HjusyIpXFMJo_LeL0f5l/?imgmax=800" style="display: inline;" title="cornell_dl_0" width="480" /></a><br />
<br />
Direct Lighting + Indirect VPLs:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgKaAfPvvmFhE1OUF6feHrXt7_d3CHQaq4mpuIiF5UKXfzlHfwq2SDUd-iOl9bgmPVnJFwQs1W12LFwFJA-OFaXmYDQV2dgRJyjafqUsQ22BCjsTv4YTiDOmA-u3sODSs9448CZ1I7Q1BWg/s1600-h/cornell_gi_0%5B10%5D.png"><img alt="cornell_gi_0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOrp5HQGZ8PG_flBiHmSzchYYBcfDzQHiVzHxtRt9Ok-9x6cniuf36Xv4vGbaWrQ3oVsHYx_ZQ5sEexA_ug5kM1EYrLNljzYKwN96Olf5HzrRpjUlR9u85ng9suM-3v84MMSas6HbhVBqP/?imgmax=800" style="display: inline;" title="cornell_gi_0" width="480" /></a><br />
<br />
Direct Lighting:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiLtbj9J9TlKH4rcHSYbuoXYAgztZrGhOT31SrsX8ukxZ5w_5fzZPwZ9OdE6-8UVPVCZ-5KCASbYU9eum0njEf7acsDdlzoF3lilIz-uAIw4_2puaXNkLGgicZYz68xVX69UQ_JgDSNKMLJ/s1600-h/cornell_dl_1%5B4%5D.png"><img alt="cornell_dl_1" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgabbFvNk2vSga1yKqTMLLGMhmWEVXkegI6Y2LRpKWzg8_wYAetCCEV99gGxBZQVa0FZyjYGQmlcFAUuuIolfpfoNZzn96bA_LWoiKIGr1MvsszHITGkezeQmKaqOijxEd-Byn9P0y_fwrZ/?imgmax=800" style="display: inline;" title="cornell_dl_1" width="480" /></a><br />
<br />
Direct Lighting + Indirect VPLs:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEhDpN5YKIAI1b_7DTo9NjO5dWjyie_MJWSr3tr5B2oYZR7ESHBxQUcsC9-zQarnxlAYI77QfFIvAt8RhmHGRDF4D22c0bLgbjgYV9Z24wMB7C_SxomWJAHvPc_FccWHbV8n0ROKJDRRyK/s1600-h/cornell_gi_1%5B4%5D.png"><img alt="cornell_gi_1" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgUOwnype6kSrKeiH-McpghJ6xaRgcEiC7uXbp95A_CFGSCV7hOd-i5mCq7tKDOxIce1fDjKrsinqtxB3AI025BYb4qPWPwGHw-prNo6Eh8cDcpFGHv4Hvk74X1p50107RvhrQVmxVfOIlk/?imgmax=800" style="display: inline;" title="cornell_gi_1" width="480" /></a><br />
<br />
The next set is from the Sponza scene. Here too you can see the red light bounced from the draperies onto the floor, and the indirect light filling in the shadows.<br />
<br />
Direct Lighting:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhuSEKcIC7VxBXqZqVvhyrbwJ4neJEkZ8mcPEK885RBB0QnS7W_6EU0JhHPORDxo5tTt6YSaz2wNbw0trGNc4hzoju7imWNKUwYxvF7HyRpSFtznzxnyCMByXb1PC2T5lrtYYTT4jdPUYMS/s1600-h/sponza_dl_0%5B4%5D.png"><img alt="sponza_dl_0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhzPHEh2tnEqt9RvYh4mAf7vm3gLVAZGWWAI7mCRQ1vtvqb9Ps_cvS6LHS-A-S82_79cDAUvenURVNSULs5iZQvuGVoaZQr8wPwn4iqxIFXfSScIWO321URe0II8Je9NryjlbJSCCepeIEe/?imgmax=800" style="display: inline;" title="sponza_dl_0" width="480" /></a><br />
<br />
Direct Lighting + Indirect VPLs:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRmKIzy15AHSmR5kXu3a7y_A28tCfxf-yEzLs8SU5q1u4b6POONniLoN526x-Gy3EreNxvz6RPt0wIs-1meUI0abvMHxVJmEMTGPhma-pj5FhKlWkKllf_M3A5PdOdMQIcK_qG7jrHu2Gt/s1600-h/sponza_gi_0%5B4%5D.png"><img alt="sponza_gi_0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjO8uvbDYsK0Q6s78bpbdhgGoQj0MKBdB37fdGESDwznRK0wp0svcsif7-iOBqq4DUKz8sR8L8mldQ3n7xFd66-Sw6aGsYEN_YMSIiCA8da4Cl3NR7hYJp6SiPrj3qfG6RC0f32jtZmGFMd/?imgmax=800" style="display: inline;" title="sponza_gi_0" width="480" /></a><br />
<br />
Direct Lighting:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiywR8nyF4Frs1PCPrpNyRrD6mKzeDOZyLtw3pGDecMpVCMYFpLVWWWCDOpbBkOyGcSCaJp-ZM0yDQM197Me4Gj3w4GK9xgsgnUjbg-NG8qs_mEkZ1UWC69Mf8H_OjRaAyHlEh9Rc347ZjN/s1600-h/sponza_nobounce%5B4%5D.png"><img alt="sponza_nobounce" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMsb2jlucHQukFk8524-HhkPNLPsY438O6CJZoCg56s30VBKaxUotWGhBRTXx_FnlGOSR2yAvLEJHXJCybtHWDwZUG7TGYqNypWMZIw4uoHnavNwclllZOHus1GRk2AuWCswwzuk0x8Kx-/?imgmax=800" style="display: inline;" title="sponza_nobounce" width="480" /></a><br />
<br />
Direct Lighting + Indirect VPLs:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkUwuO_1WrFEDEQczDqbXngqYiSoqHiC1hSP4K8rlUmRrtAYa7pxOIu7BAmDhguifVH78DQbVanoQnpDHTTUMUrfHJu8FOJaCi6jy6Ehtg4SiuWnG8jvxN-fcq9NHoMu4a9jlpkT8P_ajt/s1600-h/spona_bounce%5B4%5D.png"><img alt="spona_bounce" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMp3EjKqRUJwMjyD0B9YVwDFq5TvA1qK5zaxH8QyEwZKIGI4NV8tMwE_7Sfg_oHEnI0tnDh7BHnzeDyQmU3vRPS3oh_8UmhAMytsogKGW8LC72XCNtQd-dtYDN0m5sYytiih2uZkkqyGaZ/?imgmax=800" style="display: inline;" title="spona_bounce" width="480" /></a><br />
<br />
<br />
Direct Lighting:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4d1md6kf2gZuG6xo1QfpXNpgmMdDtU9EC4KHsJoxhqdvbQZCG8z3DiOfvzRsgBgiYRFBnydFbxQyHfEWuosoC86ULweg4Xn17bnnpXqNHOBYCKYfe2VFfZ_YTpT3433oWbxP7GSI7hsUv/s1600-h/sponza_dl%5B5%5D.png"><img alt="sponza_dl" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRDqKYkEBzbgDqeNecvLaiatDnt9GEo4gm8G_LyCYDWEIoSPcfJU_yNq_Lc4YFIH9pBOMhgODj_9W06b13C5hpZ6nj2mHUrElU5HtqifhuWCKnHmk6wHjf8vlq60azo6nEas7fGBVQkD-K/?imgmax=800" style="display: inline;" title="sponza_dl" width="480" /></a><br />
<br />
Direct Lighting + Indirect VPLs:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDkIzThIhhAnvG3-9yZJb5m0wlu2dUeP-j2gdSCf2-Oyg_db6v2jPRrfpVZBlmgeugDhEXj0uhuJM-i0bP6x8AaqcKpjOTgEIWLYDudyoUnrgnnk86rz3t3WvOaY7r_LAk-9Uuy_CpoeK9/s1600-h/spona_il%5B4%5D.png"><img alt="spona_il" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgsix4LlABcvXRNlpRDlRHk0ezp4qQqFba5AaLZs3XPvZtKzgiw9kBOGhbwFlqRIbwgJPMsVh80ERDbsMfz7qKFvEkEGpNTRssuB3-nFdEmiM3JU7suXjDBdwoDL2ErSX9uzuHkU8louglZ/?imgmax=800" style="display: inline;" title="spona_il" width="480" /></a><br />
<br />
<br />
<span style="color: #f79646;"><b>Notes</b></span><br />
<br />
To build the demo you’ll need Boost 1.43 or later. To run it you’ll need an Nvidia 8800-series card or later ( anything Compute 1.0 compliant ).<br />
<br />
Files of interest are in the Demo project: OptixEntity.cpp and InstantRadiosity.cu.<br />
<br />
Controls:<br />
Show VPLs : L<br />
Toggle GI : I<br />
Toggle Merge VPLs : M<br />
<br />
<br />
<span style="color: #f79646;"><b>Download:</b></span><br />
<br />
Sorry for requiring two download links, but SkyDrive limits file sizes to 50 MB.<br />
<a href="http://cid-b80a3031b5bfa52b.office.live.com/self.aspx/Public/OptixInstantRadiosity%5E_Part1.zip">OptixInstantRadiosity Part 1</a> - Code<br />
<a href="http://cid-b80a3031b5bfa52b.office.live.com/self.aspx/Public/OptixInstantRadiosity%5E_Part2.zip">OptixInstantRadiosity Part 2</a> - AssetsKyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com9tag:blogger.com,1999:blog-1023441640234597436.post-15040954583406464672011-03-19T16:29:00.002-04:002011-03-19T16:31:18.800-04:00Instant Radiosity with OptixI've been working on a new sample for the past few days, Instant Radiosity using Optix and DirectX. I should have a writeup and sample in the coming week.<br />
<br />
Here’s a few shots with the obligatory cornell and sponza scenes.<br />
<br />
Direct Lighting <br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_1Et4xXSInWYvbD-oUzr0s0thP7wqSRODmyEmgmbWOOgZN-2FdKzSwt-Bj8tgiEjXhkFFNI3ZVnRxWawV912P1s-e5G1d2iijfBT5VvKRAbAs27ma4f_EA4wB4KDeBG7da_Ym34S1kzXf/s1600-h/cornell_dl_0%5B8%5D.png"><img alt="cornell_dl_0" height="361" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgphD9mTVlmvQlG7Fo3fSdsidGMbG_ohy4PKY-lFdpkGRw5a6GeYNAOjnswmMI2kQ_3gzXPB_Jb4UXA2op8GEA18EJe5hEk2kapvpGgcuYbfcWxA4zYMZv_RZZpbFGRkh3396qRdzU1jFIz/?imgmax=800" style="display: inline;" title="cornell_dl_0" width="480" /></a><br />
<br />
Direct + Indirect VPLs<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEik5qQbnnzXyg4W7vHf2ld3ysZ1gokN7aHcTWtUPISnKwuqy3-lBdnKu_9uUw12l4MG_68QpHjdQi0wk-fOLNNw_NpP0ruXp6KwXcMMbf-kpQQKV5Ch7yHKBnImGLSNn18UW1B_vKx7xhV8/s1600-h/cornell_gi_0%5B4%5D.png"><img alt="cornell_gi_0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiI3pyGcgHy_L_VCelFVdeVXqiMCrZGxD6iX_AdFK_INTX4clyhHcXzq6OMmQpm1oBcLQ5cB9q7Maq0opvvmpY-TuID1mGzPYYAs9Dkw4eSjZhal8gJ2AESJs1C9pv0i3tmSP-dJw6xSvXp/?imgmax=800" style="display: inline;" title="cornell_gi_0" width="480" /></a><br />
<br />
Direct Lighting <br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj31_deBuLWJb_QsIkk7mnVU9Fd5L1TTVrCr1hrSpJDrrhsuZzkdtx-7x5I755tMKIhyphenhyphenPla3E3Mu7S7uZaE6gbGfJKfhyO_rPRvA_aP6LovmIFGJG3RgHdCJ2bK-9OLwIqxf-OGuz850YHR/s1600-h/sponza_dl_0%5B4%5D.png"><img alt="sponza_dl_0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhj1elZyW2VSwlvRyIlG2Q3C8osu7EVMRXVN3qnJjYqpQ3-Rkd5dyJHx8LJiCSO1bnv5awthY71SnzrR_mZWtuZx5yQijMloNT6u7RBIkHmq4B_Z8sxgjUy5jK-6GxTnISMHe3a23ashpql/?imgmax=800" style="display: inline;" title="sponza_dl_0" width="480" /></a><br />
<br />
Direct + Indirect VPLs<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6iWoZeBW1dM2XaiMONeJrOWPNpxy5kQrh5scELx8x8gICs9M4mwqfuqgTZbWyBXfSEUmHtInGXFMMdlyrCo9UdpGUd53a13S0aMqjYfiJDQKT9i2zl-5TBQEpUAIldkEsI35MrxNyCEvJ/s1600-h/sponza_gi_0%5B4%5D.png"><img alt="sponza_gi_0" height="360" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxiil1oAnpAuAzg6OOYVjStzOLD0-xAKZb7gGsg-hUvYRhFaUsOAT8hZzmUC3MRCiErEvMThiQWNvp3v5ZBsCRmMaHRNLS_G5JNMYjcHNcCxUSBaxoWOzj6FmyeIYesSa4rGSW4YgoK5Wr/?imgmax=800" style="display: inline;" title="sponza_gi_0" width="480" /></a>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com1tag:blogger.com,1999:blog-1023441640234597436.post-43390357191626572332011-03-15T20:11:00.002-04:002011-07-28T11:57:03.589-04:00Prey 2 Teaser TrailerHuman Head has been keeping me busy since I've been working there and I can finally say why: <b style="color: orange;">Prey 2</b>. The game was announced on Monday :) Here's the Prey 2 teaser trailer:<br />
<br />
<br />
<object classid="clsid:D27CDB6E-AE6D-11cf-96B8-444553540000" height="418" id="VideoPlayerLg51782" width="480"><param name="movie" value="http://www.g4tv.com/lv3/51782" /><param name="allowScriptAccess" value="always" /><param name="allowFullScreen" value="true" /><embed src="http://www.g4tv.com/lv3/51782" type="application/x-shockwave-flash" name="VideoPlayer" width="480" height="382" allowScriptAccess="always" allowFullScreen="true" /></object><br />
<div style="color: #ff9b00; font-family: Arial,sans-serif; font-size: 12px; margin: 0pt; text-align: center; width: 480px;"><a href="http://www.g4tv.com/games/xbox-360/index" style="color: #ff9b00;" target="_blank">Xbox 360 Games</a> - <a href="http://www.g4tv.com/e32011" style="color: #ff9b00;" target="_blank">E3 2011</a> - <a href="http://www.g4tv.com/games/xbox-360/41320/prey-2" style="color: #ff9b00;" target="_blank">Prey 2</a></div><br />
<a href="http://www.g4tv.com/videos/51782/Prey-2-Debut-Trailer---Exclusive-Premiere/">http://www.g4tv.com/videos/51782/Prey-2-Debut-Trailer---Exclusive-Premiere/</a>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com6tag:blogger.com,1999:blog-1023441640234597436.post-28329621334546601922010-08-03T04:23:00.004-04:002015-09-04T10:25:17.808-04:00Animating Water Using Flow Maps<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjAQIrUUWEBHG6cDF-ro50Kbp7I-SUQ-K-H51j47gUQMWmbpCL-3DgXHQ-q0p9fWWx2n9wE6qFInwlByn9QbEWplpNTVpo7eVH9uoVQNUMaLphp3jFS2jTpbuGwA6JmIlYevKgyQgXz942N/s1600-h/flow10.jpg"><img alt="flow" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjDK5S5BH_kgdmnPLyQmeIb91PcbE6w4wNLvw9VI8ARqDXb-ZK_RWrkcWp80PEqd-iA54OnPz1AO38mUtOUcptHYha3P23x_VNnOv6cpJDR18LP6Siv_vKW1ChuGBwMg5zyaGRnS1sCBhZF/?imgmax=800" height="418" style="display: block; float: none; margin-left: auto; margin-right: auto;" title="flow" width="540" /></a> <br />
<br />
Last week I attended SIGGRAPH 2010, and among the many good presentations, Valve gave a talk on the simple water shader they implemented for Left 4 Dead 2 and Portal 2. So on the plane ride back from LA, I whipped up this little sample from what I could remember of the talk. Edit: You can find the talk here: <a href="http://advances.realtimerendering.com/s2010/index.html">http://advances.realtimerendering.com/s2010/index.html</a><br />
<br />
The standard technique for animated water is scrolling normal maps, as I’ve previously written about. The problem with this is that it looks unnatural, as water does not uniformly move in one direction. So Valve came up with the idea of using flow maps ( based on a flow-visualization paper from the mid ’90s ). The basic idea is to create a 2D texture, mapped over your water surface, that stores the direction you want the water to flow at each point, with each pixel representing a flow vector. This gives you varying velocity ( the length of the flow vector ) and varying flow direction ( the color of the flow vector ). You then use this flow map to perturb the texture coordinates of the normal maps instead of scrolling them. Let’s get to work :)<br />
<br />
<span style="color: #ff8040; font-size: small;"><b>The Flow Map</b></span><br />
<br />
First we need to create a flow map. Here’s what I came up with in a couple of minutes in Photoshop, designed around the same column-and-dragon scene as the previous water sample. Note that the flow is greatly exaggerated to demonstrate the effect.<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_2R-78OqEszA5FNb8l-tKTFxIcu06LLbwvu6Q3kSmejBbmy58uBP4C5K2OpIu6WxpvCWiUGKNTm4z-csl12GxmmQJzsBykTGTc-eOaoma-9oSkN48qAVYPGNG3zbnxwsAvsNgqyCXK7sA/s1600-h/flowmap4.png"><img alt="flowmap" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhpUoswArdLyR11SO_WRzm7g3GB-Wu0QGo6F45SwqRW_NeSL2hf7XsvPquoGdAeoxNod9y3LcKsTO2CkArKQHUZdSv-yzgZZNppZNTq6Yv4Iol7dyDcux9rB5B8ecJ2eptMQveyWnCur1JE/?imgmax=800" height="480" style="display: block; float: none; margin-left: auto; margin-right: auto;" title="flowmap" width="480" /></a>
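A flow map doesn't have to be hand-painted; you can also generate one procedurally. As a hedged sketch (my own example, not part of the demo), here's C++ that encodes a circular flow field around the texture center into the [0,1] range the shader later decodes with `* 2.0f - 1.0f`:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Build a res x res flow map whose vectors swirl around the center.
// Each texel stores a 2D flow direction remapped from [-1,1] to [0,1].
std::vector<float> BuildSwirlFlowMap( int res )
{
    std::vector<float> texels( res * res * 2 );
    for( int y = 0; y < res; ++y )
    {
        for( int x = 0; x < res; ++x )
        {
            // texel position relative to the center, in [-1,1]
            float px = ( x + 0.5f ) / res * 2.0f - 1.0f;
            float py = ( y + 0.5f ) / res * 2.0f - 1.0f;

            // perpendicular of the radial direction -> circular flow;
            // clamp the length so flow speed never exceeds 1
            float fx = -py;
            float fy =  px;
            float len = std::sqrt( fx * fx + fy * fy );
            if( len > 1.0f ) { fx /= len; fy /= len; }

            // remap [-1,1] back to [0,1] for storage in the texture
            texels[ ( y * res + x ) * 2 + 0 ] = fx * 0.5f + 0.5f;
            texels[ ( y * res + x ) * 2 + 1 ] = fy * 0.5f + 0.5f;
        }
    }
    return texels;
}
```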
<span style="color: #ff8040; font-size: small;"><b>Using The Flow Map</b></span><br />
<br />
Now we need to use the flow map to alter the water normal maps. We do this by taking the texture coordinate of the current water pixel and offsetting it along the flow vector from the flow map, scaled by a time offset; the water is then rendered as in the previous water sample. But there’s a problem with this: after a while the texture coordinates become so distorted that the normal maps stretch and show nasty filtering artifacts. To solve this we limit the distortion by periodically resetting the time offset. That fixes the over-distortion, but now the water visibly resets every X seconds. So we introduce a second layer, offset from the first by half a time cycle, and cross-fade between them: while one layer is fading out and beginning to reset, the other is fading in to take its place. Here’s a diagram to visualize this phase-in phase-out of the 2 layers.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhLmbav8mYZhX3oeq4ojD4Kuln3nEbTLfoCSQTWwNipc7CahIKLwZIoVTJh3Gp6jc2xHC9wQBJbc86PZxa_tSBDK5h-4I6hD89QGJHlE1-snW0CZsJFhbVgocQDYFTUt7SQ1LnzEu9rpx7I/s1600-h/graph7.jpg"><img alt="graph" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKRbhGwoclgcJXkBX8VT9iaFRcbwWKqUAFUjhLBkVodN5xzb9IaeYzmWa4i8aQM2wf-HyD1H98vdy9d-YobaYmUa21XN73gAHJfDNp0S6T95zS-vD4Who4ZCRRGOuwWfPvNsM1iFpNSZUh/?imgmax=800" height="236" style="display: block; float: none; margin-left: auto; margin-right: auto;" title="graph" width="540" /></a> <br />
<br />
The graph illustrates that over a cycle running from 0 to 1, we want a layer to be fully weighted at the mid-point of the cycle, and fully faded out at 0 and 1. Let’s see the code:<br />
<pre class="mycode"><span style="color: green;">//get and uncompress the flow vector for this pixel
</span><span style="color: blue;">float2 </span>flowmap = <span style="color: blue;">tex2D</span>( FlowMapS, tex0 ).rg * 2.0f - 1.0f;
<span style="color: blue;">float </span>phase0 = FlowMapOffset0;
<span style="color: blue;">float </span>phase1 = FlowMapOffset1;
<span style="color: green;">// Sample normal map.
</span><span style="color: blue;">float3 </span>normalT0 = <span style="color: blue;">tex2D</span>(WaveMapS0, ( tex0 * TexScale ) + flowmap * phase0 );
<span style="color: blue;">float3 </span>normalT1 = <span style="color: blue;">tex2D</span>(WaveMapS1, ( tex0 * TexScale ) + flowmap * phase1 );
<span style="color: blue;">float </span>flowLerp = ( <span style="color: blue;">abs</span>( HalfCycle - FlowMapOffset0 ) / HalfCycle );
<span style="color: blue;">float3 </span>offset = lerp( normalT0, normalT1, flowLerp );</pre>
In the code above, HalfCycle would be 0.5 if our cycle runs from 0 to 1. We unpack the flow vector (it is stored in [0,1] and we need it in [-1,1]), fetch the two normals using the flow vector, and then lerp between them based on the cycle time. This however leads to a subtle pulsing effect; I couldn’t really notice it when the water was rendered, but I included the fix for completeness. To fix the pulsing, we perturb the flow cycle at each pixel using a noise map.<br />
<pre class="mycode"><span style="color: green;">//get and uncompress the flow vector for this pixel
</span><span style="color: blue;">float2 </span>flowmap = <span style="color: blue;">tex2D</span>( FlowMapS, tex0 ).rg * 2.0f - 1.0f;
<span style="color: blue;">float </span>cycleOffset = <span style="color: blue;">tex2D</span>( NoiseMapS, tex0 ).r;
<span style="color: blue;">float </span>phase0 = cycleOffset * .5f + FlowMapOffset0;
<span style="color: blue;">float </span>phase1 = cycleOffset * .5f + FlowMapOffset1;
<span style="color: green;">// Sample normal map.
</span><span style="color: blue;">float3 </span>normalT0 = <span style="color: blue;">tex2D</span>(WaveMapS0, ( tex0 * TexScale ) + flowmap * phase0 );
<span style="color: blue;">float3 </span>normalT1 = <span style="color: blue;">tex2D</span>(WaveMapS1, ( tex0 * TexScale ) + flowmap * phase1 );
<span style="color: blue;">float </span>flowLerp = ( <span style="color: blue;">abs</span>( HalfCycle - FlowMapOffset0 ) / HalfCycle );
<span style="color: blue;">float3 </span>offset = lerp( normalT0, normalT1, flowLerp );</pre>
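The two phase offsets FlowMapOffset0/1 are driven from the application side. Here's my reconstruction of a minimal per-frame update (the flow speed constant is a guess, and the real demo may do this differently): the layers stay half a cycle apart, and each wraps back to the start of the cycle so the texture coordinates never distort beyond one cycle's worth of flow.

```cpp
#include <cassert>
#include <cmath>

// Advance the two flow-map phases each frame; upload offset0/offset1
// as FlowMapOffset0/FlowMapOffset1 and halfCycle as HalfCycle.
struct FlowCycle
{
    float cycle     = 1.0f;   // full cycle length
    float halfCycle = 0.5f;   // = cycle / 2
    float flowSpeed = 0.05f;  // assumed: cycles per second
    float offset0   = 0.0f;   // layer 0 phase
    float offset1   = 0.5f;   // layer 1 phase, half a cycle ahead

    void Update( float dt )
    {
        offset0 += flowSpeed * dt;
        offset1 += flowSpeed * dt;
        // wrap each layer independently; the cross-fade in the shader
        // hides the reset because the other layer is fully weighted then
        if( offset0 >= cycle ) offset0 -= cycle;
        if( offset1 >= cycle ) offset1 -= cycle;
    }
};
```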
And that’s pretty much it. I’ll update the post/source when the slides are posted from SIGGRAPH in case I left anything out. Video time!<br />
<br />
<div align="center">
<object height="340" width="560"><param name="movie" value="http://www.youtube.com/v/VlGYImcuQE4&hl=en_US&fs=1?color1=0x3a3a3a&color2=0x999999"></param>
<param name="allowFullScreen" value="true"></param>
<param name="allowscriptaccess" value="always"></param>
<embed src="http://www.youtube.com/v/VlGYImcuQE4&hl=en_US&fs=1?color1=0x3a3a3a&color2=0x999999" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="560" height="340"></embed></object></div>
<br />
Source/Demo: <a href="https://onedrive.live.com/download?cid=B80A3031B5BFA52B&resid=B80A3031B5BFA52B%21246&authkey=AOZzjISSdDuad44">WaterFlow Demo</a>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com48tag:blogger.com,1999:blog-1023441640234597436.post-81742559040045315522010-05-06T13:00:00.008-04:002019-11-13T04:43:11.578-05:00Volume Rendering 202: Shadows and TranslucencyFinally, here is the last sample on volume rendering. It’s only taking me a year to get around to finishing it. Is anyone even visiting this page anymore? I’d better post this for my sanity anyhow.<br />
Last time I left you with some basic optimizations, one being a pseudo empty-space skipping. But as I noted, the volumes need to be sorted for it to work completely. We sort the sub-volumes back to front with respect to their distance to the camera. This ensures that we have a smooth framerate no matter what angle the camera is at. As a speedup, we only re-sort the volumes once the camera has moved 45 degrees since the last sort.<br />
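As a sketch of that sort-and-cache logic (types simplified, names my own; the 45-degree threshold comes straight from the text):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

struct Vec3 { float x, y, z; };

static float Dot( const Vec3& a, const Vec3& b ) { return a.x*b.x + a.y*b.y + a.z*b.z; }

struct SubVolume { Vec3 center; };

// Sort subvolumes back to front (largest camera distance first), but only
// when the (normalized) view direction has swung more than 45 degrees
// since the last sort.
void SortSubVolumes( std::vector<SubVolume>& vols, const Vec3& camPos,
                     const Vec3& camDir, Vec3& lastSortDir, bool& sortedOnce )
{
    const float cos45 = 0.70710678f;
    if( sortedOnce && Dot( camDir, lastSortDir ) > cos45 )
        return; // camera hasn't moved enough; keep the cached order

    std::sort( vols.begin(), vols.end(),
        [&]( const SubVolume& a, const SubVolume& b )
        {
            Vec3 da{ a.center.x - camPos.x, a.center.y - camPos.y, a.center.z - camPos.z };
            Vec3 db{ b.center.x - camPos.x, b.center.y - camPos.y, b.center.z - camPos.z };
            return Dot( da, da ) > Dot( db, db ); // farthest drawn first
        } );

    lastSortDir = camDir;
    sortedOnce  = true;
}
```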
So now our subvolumes are sorted w.r.t. the camera. But we still get alpha-blending artifacts because, depending on the view, the pixels of the subvolumes are not drawn in the correct order. To fix this we draw a depth-only prepass, ensuring we only shade the pixels that will contribute to the final image.<br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiLAPgmNOM9JZLN-XeKVk60I6Ie8sOFR3RJcbjHRihgxXGEhRCAzwDETuYIGvzoEynzHeSnAmfbEpql4-Qa6f2fk7ZpjRUi9bky9WNaKUZ0GgXXmVyvkxzbfXqimBR9Tbz7TJJfUYEJZBIY/s1600-h/alpha_errors%5B9%5D.png"><img alt="alpha_errors" height="197" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrrovdmQpVTRdGTk0yeRNL5VCwNT_0XTIExH4zJc5_qYWPLF9CLj2ynF590pdmNZeeY38Rs2_3QOG0fdmlND-KO9SG41d3b70EQ6GAQgYX4CinfDDshVWbh-QrzgVMTJymheKWmvyDeIJz/?imgmax=800" style="display: inline;" title="alpha_errors" width="199" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgZaQ1Zc4OaHsZASnjzggmpywDGhF5U0KSAyJuywOPi_2231lBJxJhHdJQT_5zTuHJ5VonqnXTEglCwe5kpZtxPbtFAOrNN2knKfpwVhy_oOgLuXh6gS8hnnWhlsgM7ug9MCY5fkVWC1ER/s1600-h/no_alpha_errors%5B6%5D.png"><img alt="no_alpha_errors" height="197" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi9CNZGjGoIkfghiw_2CtzclXn8oMe2yBJ_MhItNmDt9mJ_dlDcqavj75dysFTgqDytKd688gIG3jIEZc30c9iVO1IVgtkyJKkebs9OVLoOWP0jd5-e2FRbPK860HINYGSLKdQnTwf9qH7V/?imgmax=800" style="display: inline;" title="no_alpha_errors" width="199" /></a> <br />
Left: no depth prepass. Right: depth prepass<br />
<br />
<b><span style="color: #ff8000;">Translucency</span></b><br />
<br />
The first sample includes an approximated translucency. It is far from realistic, but it gives fairly good results. The idea is very similar to shadow mapping: compare the current pixel’s depth to that of the depth map, then either use the difference to look up into a texture or apply an exponential falloff in the shader (the sample does the latter).<br />
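That falloff boils down to a few lines; here's a hedged sketch (the extinction coefficient is made up, and the real shader lives in the demo source). The light-space depth map gives the depth where light entered the volume, the current pixel gives where it exits, and the difference is the thickness the light traveled through:

```cpp
#include <cassert>
#include <cmath>

// Approximate transmittance through a volume: compare the pixel's depth
// in light space against the depth map (front faces as seen from the
// light) and attenuate exponentially with the traversed thickness.
float Transmittance( float pixelLightDepth, float depthMapValue, float sigma )
{
    float thickness = pixelLightDepth - depthMapValue;
    if( thickness <= 0.0f )
        return 1.0f;                  // in front of the volume: unoccluded
    return std::exp( -sigma * thickness );
}
```

Larger sigma makes the volume more opaque; thicker parts of the volume darken faster, which is what produces the translucent look in the shot below.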
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiy6vwaYNqJyMPe5_AlBz0WiHU0oPRnHCAKYiNOVnyfgYb0GR1ee0Jv8m4pObVbFoBRfafEW7QrH67mOeZLpESDqOhsq3-tJtJRz3e9P1Q8KEme9o5sMwKIO74qwkxfcNXwk_SIRgi-0-Lg/s1600-h/translucency%5B4%5D.png"><img alt="translucency" height="308" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhtmS69TxSrhuPnp7XSJJrH4ezTWLOfBLMcAtLu2qecBzEu5KPKyv8mqRYVG2Q2fZp3ViBDdwHyMljj4OfSle_SpRliPQ4-_8yvNMm0jwvs3piF_V51yu7s9xGZ2GROr9CrRonKn-vVl-vP/?imgmax=800" style="display: inline;" title="translucency" width="404" /></a> <br />
<br />
<b><span style="color: #ff8000;">Shadows</span></b><br />
<br />
There isn’t much to say here. The sample below uses variance shadow mapping.<br />
<br />
<b> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5EC1tV8oHezPQycfq9xgnvk6BSsTSf-AFr6m-5LUQkpte8i3v2WobLkSf3BKRjFa8NmhIO4DtYpFRzFAS2iUtiqyTVNk7lmVrxI76z1w9SBCR1ZvkCXwolWA_XwPn_mV7nRZMxesOkPeF/s1600-h/shadow0%5B5%5D.png"><img alt="shadow0" height="316" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvFVcgJw5OJZx-ERC8f6oFLsaZH5ZKZBui5MX0mw0eTDK3zExrcHzpYxJMmficZ_jkjMEKtPbsZS-ceIt3mzPegzb8TxgPzzHvUUj38BFG8PXuQQAlkPdmeO2MVDofNYjn3Nko-ShosVNH/?imgmax=800" style="display: inline;" title="shadow0" width="404" /></a> </b><br />
<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgjZ9Unrh1vbuJwH2TTAOID0LX5HEiQLxmYi0vJgz206Hh1YJuXeSPfFM6ZOdfBUikW5SZKWbdi3b0kYN6g9gdxhRBVc_TEYXSaTL7PmcF1lIQtVQiV8qceVMKqQoVooxhaZGeWvhVd5KC2/s1600-h/shadow1%5B4%5D.png"><img alt="shadow1" height="397" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjeqmFDlfOMUEmFOK7YMVvjrR0oBKhi05rAG-Dod3muu8CEUt0aYLb4F7GH1_u7zaT5boBIpc9NNR8or9QoRs_p0Xcfl8VYjSW6VEykBtGxed9Hh65c2SZMTWz_crJiqZxoXmlZBATGmnYy/?imgmax=800" style="display: inline;" title="shadow1" width="404" /></a> <br />
<div class="wlWriterEditableSmartContent" id="scid:5737277B-5D6D-4f48-ABFC-DD9C333F4C5D:14f4c2da-c8ff-4d61-bc9b-f9bcbe527cba" style="display: inline; float: none; margin: 0px; padding: 0px;">
<div id="b93d9547-b511-447a-a9d0-6fecec5c0d2b" style="display: inline; margin: 0px; padding: 0px;">
<br />
<br />
<div>
<iframe allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen="" frameborder="0" height="315" src="https://www.youtube.com/embed/FFLzYKEjcGA" width="560"></iframe>
</div>
</div>
</div>
<br />
Well, there it is. Anticlimactic, wasn't it?<br />
<br />
<iframe frameborder="0" height="120" scrolling="no" src="https://onedrive.live.com/embed?cid=B80A3031B5BFA52B&resid=B80A3031B5BFA52B%21241&authkey=ACBIfz0QzPT95zE" width="98"></iframe>
<iframe frameborder="0" height="120" scrolling="no" src="https://onedrive.live.com/embed?cid=B80A3031B5BFA52B&resid=B80A3031B5BFA52B%21240&authkey=AEsA3_NToQEocXM" width="98"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com6tag:blogger.com,1999:blog-1023441640234597436.post-65223357329842799832010-05-05T22:15:00.004-04:002010-06-15T11:51:08.903-04:00Ground control to Major Tom<p>Wow, it’s been over a year since the last post on volume rendering! I must sound like a broken record. Anyhow, I’ve had time to fix a couple of bugs with the last installment in the past couple of weeks and it should be coming online pretty soon.</p><p>So why have I been absent lately? Last spring I was recruited to work on American Sign Language teaching software for Purdue University. The project ranged from database implementation, to layered skeletal animation with additive blending support and facial animation, to creating a language and compiler for ASL scripts (Antlr was amazing for this). Also, our paper was accepted at SIGGRAPH in the Education section.</p><p>On top of that I accepted a job at Human Head Interactive in January as a tech programmer ( these ramblings actually paid off :) ). I’m really excited to be working with some smart and talented people. We have some cool rendering tech – thanks to our <a href="http://graphicrants.blogspot.com/" target="_blank">lead graphics programmer</a> – and pretty slick game play ideas.<br />
</p><p><span style="font-style: italic;">Also, sorry to anyone who has commented on a post and it hasn't been posted, I've been spammed by bots for awhile now.</span><br />
</p>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com0tag:blogger.com,1999:blog-1023441640234597436.post-22934239582634402452010-04-10T01:38:00.012-04:002010-04-10T01:50:43.185-04:00Water for your monies?I got an email a couple of weeks ago from someone ( <a href="http://maximinus.fr/missile.escape.html">Maximinus</a> ) who actually put the water shader from the water game component to good use. Here's a description of the game on the xbox indie marketplace:<br /><br /><span style="font-style: italic;">Missile Escape for Xbox Indies is simple : go flying, evade many</span><br /><span style="font-style: italic;">missiles and unlock new fighters along the way ! Warning : </span><span style="border-bottom: 1px dashed rgb(0, 102, 204); cursor: pointer; font-style: italic;" class="yshortcuts" id="lw_1270878010_0">Fighter<br />pilot</span><span style="font-style: italic;"> spirit required.</span><br /><br /><object height="320" width="440"><param name="movie" value="http://www.youtube.com/v/EeCAmJ_il6I&hl=en_US&fs=1&rel=0"><param name="allowFullScreen" value="true"><param name="allowscriptaccess" value="always"><embed src="http://www.youtube.com/v/EeCAmJ_il6I&hl=en_US&fs=1&rel=0" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" height="320" width="440"></embed></object>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com1tag:blogger.com,1999:blog-1023441640234597436.post-84094452551331741542009-08-27T02:45:00.003-04:002009-08-27T02:49:54.690-04:00Particle Spectrum AnimationReally cool video I found last night. 
Uses Particular and Adobe After Effects.<br /><br /><object width="450" height="253"><param name="allowfullscreen" value="true"><param name="allowscriptaccess" value="always"><param name="movie" value="http://vimeo.com/moogaloop.swf?clip_id=6045312&server=vimeo.com&show_title=1&show_byline=1&show_portrait=0&color=00ADEF&fullscreen=1"><embed src="http://vimeo.com/moogaloop.swf?clip_id=6045312&server=vimeo.com&show_title=1&show_byline=1&show_portrait=0&color=00ADEF&fullscreen=1" type="application/x-shockwave-flash" allowfullscreen="true" allowscriptaccess="always" width="450" height="253"></embed></object>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com4tag:blogger.com,1999:blog-1023441640234597436.post-60444070683296845812009-07-30T18:04:00.003-04:002009-07-30T21:15:02.687-04:00Water in Your Browser<p>Recently I’ve been playing around with O3D. If you don’t know, O3D is Google’s new browser graphics API. It enables you to develop 3d interactive applications that run inside a browser window (and quite easily mind you). In fact it rivals XNA on getting an app up and running quickly.</p> <p>And on that note, I’ve ported the water sample to O3D (minus the camera animation). Besides a bug I encountered with the sample content converter (and promptly fixed by one of the o3d developers), it was a relatively painless conversion. All that was required was to setup a scene in max, apply materials, export to collada and convert to the o3d format. Setting up the render targets also took minimal effort :). 
The shaders, for the most part, remained untouched.</p> <p>Click the picture to have a go!</p> <p><a href="http://www.kylehayward.com/o3d/WaterScene.html"><img style="display: inline;" title="waterscene" alt="waterscene" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhcZ9YClu_Pzg3Uf6PLj0zjp7WqBtIW4KDCEVCU9x94pm2dVm9Zdas1qqOqioEXbOk8wqZXEzlI8mnO6RK0aOWM6FoyuHw9vPHJGCZL26P6chFUBXWg9k3WBlWcpXIa0Bjg1V8fd5w_ryuU/?imgmax=800" height="254" width="429" /></a> </p> <p>If you’re interested in the max file or source you can get both here:</p> <iframe style="border: 1px solid rgb(221, 229, 233); margin: 3px; padding: 0px; background-color: rgb(255, 255, 255); width: 240px; height: 26px;" marginheight="0" src="http://cid-b80a3031b5bfa52b.skydrive.live.com/embedrow.aspx/Public/WaterSceneO3D.zip" marginwidth="0" frameborder="0" scrolling="no"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com9tag:blogger.com,1999:blog-1023441640234597436.post-34098923372438922212009-06-19T00:58:00.009-04:002009-06-19T01:27:49.428-04:00Non-photorealistic SSGII recently got back from my vacation to the Mediterranean and finally have had time to post this video.<br /><br />For one of my final projects last semester I implemented a deferred renderer with SSAO and simplified global illumination (SSGI). But I wanted to have a dream-like result, so I over emphasized the color bleeding and used the light accumulation buffer plus emissive color of objects; and lots of blurring. The result looks pretty neat I think, not at all accurate lighting but cool none the less. 
This one was done in C++/DirectX instead of the usual XNA fare.<br /><br /><a href="http://www.youtube.com/watch?v=YAnG2PB1qiI&fmt=22">Large HD video on youtube</a><br /><br /><object height="295" width="480"><param name="movie" value="http://www.youtube.com/v/YAnG2PB1qiI&hl=en&fs=1&rel=0&color1=0x3a3a3a&color2=0x999999&hd=1"><param name="allowFullScreen" value="true"><param name="allowscriptaccess" value="always"><embed src="http://www.youtube.com/v/YAnG2PB1qiI&hl=en&fs=1&rel=0&color1=0x3a3a3a&color2=0x999999&hd=1" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" height="295" width="480"></embed></object>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com10tag:blogger.com,1999:blog-1023441640234597436.post-30050170709773452642009-04-23T03:05:00.003-04:002009-04-23T03:25:14.152-04:00Volume Rendering Update, DirectX blogWow, I can't believe it's almost May already. I also can't believe my last post was back in February! I've had the next installment in the volume rendering series basically done since January. I just have to put some finishing touches on it and write the post. I've just been very busy finishing my last semester, research, searching for a job, and considering graduate school.<br /><br />I've been working on a deferred renderer and delving into screen space ambient occlusion and global illumination for one of my final projects. 
I'll probably post some images of the final result when it's done.<br /><br /><span class="largefont">Ysaneya </span>has a <span style="font-weight: bold;">very </span>interesting post on <a href="http://www.gamedev.net/community/forums/mod/journal/journal.asp?jn=263350&reply_id=3432126">deferred rendering and instant radiosity</a>.<br /><br />Also, if you haven't noticed, the DirectX team has a new blog: <a href="http://blogs.msdn.com/DirectX/">http://blogs.msdn.com/DirectX/</a>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com2tag:blogger.com,1999:blog-1023441640234597436.post-17652810817073237482009-02-04T22:42:00.007-05:002019-11-13T04:39:03.652-05:00Volume Rendering 201: OptimizationsThe discussion on hand this time is optimizing the performance of the volume renderer. I’ll cover a few of these optimizations and provide a rough implementation of one of them. <br />
<strong><span style="color: #ff6633;">Cache Efficiency and Memory Access</span></strong><br />
<strong><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiMZ0hFatGVKdiio6VhUA3WKBqXpWLa_S2DJzH7MwA_wUfo0T2MFR7FoOHxDdsP5DdVL3oug-HYFqQ2t8-VCQCuKGUMh5GBEKOGfqidIqbNDS0_6ZxgD1m_s0DS5f15f9eNiku1iqkyLdb5/s1600-h/memory5.jpg"><img alt="memory" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtRNKdGUipnlF9bthqnRfu4r48-Qy38tA0-cutwMSLj7e5TVGbZogLNPEq4_dHeDVlCF3k_1RuWU4klfn5vFuRY6ABsYT5AHmBOlQsto7UtqcgoOInaz6_CO5QTAVU5iJ7hGdBSM5rDCbk/?imgmax=800" height="240" style="display: inline;" title="memory" width="402"></a></strong><br />
Picture from <em>Real-time Volume Graphics.</em> <br />
Currently we load our volume data into memory in a linear layout. However, a ray cast through the volume is unlikely to access neighboring memory locations as it steps from voxel to voxel. We can improve cache efficiency by swizzling the data into a block-based layout. With the data stored in blocks, the GPU is more likely to have neighboring voxels in the cache as we walk through the volume, which leads to better memory performance. <br />
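To make the idea concrete, here is a minimal sketch of a block-based index (written in Python rather than the sample's C#, with an assumed 4&#179;-voxel brick size; none of these names come from the original code):

```python
# Hypothetical helper (not from the post): flat index of a voxel in a
# block-based ("swizzled") layout, using an assumed 4x4x4 brick size.
def brick_index(x, y, z, dim, block=4):
    """Flat index of voxel (x, y, z) in a dim^3 volume stored as
    contiguous block x block x block bricks."""
    bricks_per_axis = dim // block
    bx, by, bz = x // block, y // block, z // block   # which brick
    lx, ly, lz = x % block, y % block, z % block      # offset inside brick
    brick = (bz * bricks_per_axis + by) * bricks_per_axis + bx
    local = (lz * block + ly) * block + lx
    return brick * block ** 3 + local

# A ray stepping diagonally from voxel (0,0,0) to (1,1,1) in an 8^3 volume:
linear_gap = (1 * 8 + 1) * 8 + 1                                  # 73 apart
swizzled_gap = brick_index(1, 1, 1, 8) - brick_index(0, 0, 0, 8)  # 21 apart
print(linear_gap, swizzled_gap)   # prints "73 21"
```

The diagonal neighbor that was 73 elements away in the linear layout is only 21 away inside its brick, so consecutive ray samples are far more likely to share cache lines.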
<strong><span style="color: #ff6633;">Empty-Space Leaping</span></strong> <br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjIX6eeA96raGn81brAzJuiHoMWg2TYPMBfRou_pVnyR2jcs0vfdKWFVeALltb0Lz4vWV169Ve8Dp1wN5sZqSOeRubidoXtmqaJZMZssMTIneAK8v2fR0SbdL5SVpbpMf6lNYYgikS1xPCN/s1600-h/esl04.png"><img alt="esl0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiZjEfjoUxaPgPeRb40icxD66gGLkV_jl0KbuV13IBquFss3suj93thMmmMBWjxw7gF98uQ7p55bjy6C1q1NBfv6TOTkb6M8tADQJoA5au4cWCh3uLX6YjbYH6ojuipjAh3gKyu4bOY39ro/?imgmax=800" height="296" style="display: inline;" title="esl0" width="394"></a><br />
In the previous samples we ray-cast against the entire bounding volume of the data set, even where every sample along the way had zero alpha. We can skip these samples altogether and only render the parts of the volume that have non-zero alpha in the transfer function. More on this in a bit. <br />
<strong><span style="color: #ff6633;">Occlusion Culling</span></strong> <br />
If we render the volume in a block-based fashion as above, and sort the blocks from front to back, we can use occlusion queries to determine which blocks are completely occluded by blocks in front of them. There are quite a few tutorials on occlusion queries on the net, including this one at <a href="http://www.ziggyware.com/readarticle.php?article_id=234" target="_blank">ziggyware</a>. <br />
<strong><span style="color: #ff6633;">Deferred Shading</span></strong> <br />
We can also boost performance by deferring the shading calculations. Instead of shading every voxel during the ray-casting, we output only depth and color information into off-screen buffers. Then we render a full-screen quad, use the depth information to calculate normals in screen space, and compute the shading from those. Normals calculated this way also tend to be smoother and have fewer artifacts than gradients precomputed from the volume and stored in a 3D texture. We save memory as well, since the 3D texture only needs to hold the isovalues, not the normals. <br />
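The screen-space normal calculation can be sketched as follows (a Python illustration of the math, not the post's shader; the tuple helpers stand in for HLSL vector intrinsics):

```python
# Minimal sketch of deferred normal reconstruction: the normal at a pixel
# is the normalized cross product of the screen-space position
# derivatives (ddx/ddy in shader terms).
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / l, v[1] / l, v[2] / l)

def screen_space_normal(pos, x, y):
    """pos[y][x] holds the position reconstructed from the depth buffer
    for each pixel; finite differences give the surface tangents."""
    ddx = sub(pos[y][x + 1], pos[y][x])
    ddy = sub(pos[y + 1][x], pos[y][x])
    return normalize(cross(ddx, ddy))

# Positions sampled from the flat plane z = 0 yield a normal along +z:
plane = [[(x * 0.1, y * 0.1, 0.0) for x in range(4)] for y in range(4)]
print(screen_space_normal(plane, 1, 1))   # approximately (0, 0, 1)
```

In a real deferred pass the same differences come for free from the `ddx`/`ddy` shader instructions while sampling the depth buffer.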
<strong><span style="color: #ff6633;">Image Downscaling</span></strong> <br />
If the data we are rendering is low frequency (e.g. volumetric fog), we can render the volume into an off-screen buffer that is half the size of the window. Then we can up-scale this image during a final pass. This method is also included in the sample. <br />
<strong><span style="color: #ff6633;">Implementing Empty-Space Leaping</span></strong> <br />
To implement empty-space leaping we need to subdivide the volume into smaller volumes and determine whether each of these sub-volumes has an opacity greater than zero. To subdivide the volume we follow an approach very similar to quadtree or octree construction. We start with the full volume from [0, 0, 0] to [1, 1, 1] and recursively halve it until a sub-volume's width falls below 0.1 (with repeated halving this leaves blocks 1/16 the size of the original volume along each axis). Here&#8217;s how we do that: <br />
<pre class="mycode">private void RecursiveVolumeBuild(Cube C)
{
    //stop subdividing once the cube is small enough (width <= 0.1)
    if (C.Width <= 0.1f)
    {
        //compute the min/max corners of this sub-volume
        Vector3 min = new Vector3(C.X, C.Y, C.Z);
        Vector3 max = new Vector3(C.X + C.Width, C.Y + C.Height, C.Z + C.Depth);
        Vector3 scale = new Vector3(mWidth, mHeight, mDepth);

        //additively sample the transfer function and check if there are any
        //samples that are greater than zero
        float opacity = SampleVolume3DWithTransfer(min * scale, max * scale);
        if (opacity > 0.0f)
        {
            BoundingBox box = new BoundingBox(min, max);

            //add the corners of the bounding box
            Vector3[] corners = box.GetCorners();
            for (int i = 0; i < 8; i++)
            {
                VertexPositionColor v;
                v.Position = corners[i];
                v.Color = Color.Blue;
                mVertices.Add(v);
            }
        }
        return;
    }

    float newWidth = C.Width / 2f;
    float newHeight = C.Height / 2f;
    float newDepth = C.Depth / 2f;

    /// SubGrid       r c d
    /// Front:
    /// Top-Left    : 0 0 0
    /// Top-Right   : 0 1 0
    /// Bottom-Left : 1 0 0
    /// Bottom-Right: 1 1 0
    /// Back:
    /// Top-Left    : 0 0 1
    /// Top-Right   : 0 1 1
    /// Bottom-Left : 1 0 1
    /// Bottom-Right: 1 1 1
    for (float r = 0; r < 2; r++)
    {
        for (float c = 0; c < 2; c++)
        {
            for (float d = 0; d < 2; d++)
            {
                Cube cube = new Cube(C.Left + c * newWidth,
                                     C.Top + r * newHeight,
                                     C.Front + d * newDepth,
                                     newWidth,
                                     newHeight,
                                     newDepth);
                RecursiveVolumeBuild(cube);
            }
        }
    }
}</pre>
To determine whether a sub-volume contains any samples with opacity, we simply loop over its voxels and accumulate the opacity from the transfer function: <br />
<pre class="mycode">private float SampleVolume3DWithTransfer(Vector3 min, Vector3 max)
{
    float result = 0.0f;

    for (int x = (int)min.X; x <= (int)max.X; x++)
    {
        for (int y = (int)min.Y; y <= (int)max.Y; y++)
        {
            for (int z = (int)min.Z; z <= (int)max.Z; z++)
            {
                //sample the volume to get the isovalue;
                //it was stored [0, 1] so we need to scale to [0, 255]
                int isovalue = (int)(sampleVolume(x, y, z) * 255.0f);

                //accumulate the opacity from the transfer function
                result += mTransferFunc[isovalue].W * 255.0f;
            }
        }
    }

    return result;
}</pre>
Depending on the transfer function (one with many zero-opacity samples), this method can increase our performance by 50%.<br />
<strong><span style="color: #ff6633;">Problems</span></strong><br />
Now, a problem that this method introduces is overdraw. You can see its effects when rotating the camera to view the back of the bear; there the frame rate drops considerably. To remedy this, the sub-volumes need to be sorted front to back by their distance to the camera each time the view changes. I&#8217;ve left this as an exercise for the reader. The new demo implements empty-space leaping and downscaling, and when rendering the teddy bear volume, frame rates on my 8800GT are about <strong>190 FPS</strong>. Compare this to the last demo from 102 at 30 FPS, both at a resolution of 800x600. Pretty good results! Next time I&#8217;ll be introducing soft shadows and translucent materials. <br />
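One way that exercise could go, sketched in Python rather than the demo's C# (the (min, max) block tuples are hypothetical stand-ins for the demo's bounding boxes): sort the blocks by squared distance from the camera to each block's center whenever the view changes.

```python
# Sketch of the front-to-back sorting step: order sub-volume blocks by
# squared distance from the camera to each block's center. Squared
# distance avoids a square root and sorts identically.
def sort_front_to_back(blocks, camera_pos):
    def dist_sq_to_center(block):
        mn, mx = block
        return sum(((mn[i] + mx[i]) * 0.5 - camera_pos[i]) ** 2
                   for i in range(3))
    return sorted(blocks, key=dist_sq_to_center)

blocks = [((0.0, 0.0, 0.0), (0.5, 0.5, 0.5)),
          ((0.5, 0.0, 0.0), (1.0, 0.5, 0.5))]
camera = (2.0, 0.25, 0.25)
ordered = sort_front_to_back(blocks, camera)
print(ordered[0])   # the block nearer the camera is rendered first
```

Drawing the sorted blocks front to back also sets up the occlusion-culling idea from earlier, since nearer blocks can then occlude farther ones.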
<iframe frameborder="0" height="120" scrolling="no" src="https://onedrive.live.com/embed?cid=B80A3031B5BFA52B&resid=B80A3031B5BFA52B%21196&authkey=AF6XD9cnp12Laq0" width="98"></iframe>
<br />
References: <a href="http://www.amazon.com/Real-time-Graphics-Markus-Hadwiger/dp/1568812663">Real-time Volume Graphics</a><br />
Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com15tag:blogger.com,1999:blog-1023441640234597436.post-82438993558354013502009-01-20T18:59:00.003-05:002019-11-13T04:37:39.096-05:00Volume Rendering 102: Transfer FunctionsLast time I introduced the concept of volume ray-casting. And to follow that up, today I'll talk about transfer functions and shading.<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgw5U07jj1ibGiWHZ2eCrAx5JIp2D9bwk-biYC57BsFiaQbxdr3hXvXpZZ5IOZQv2gAmJ-LeP5CEuMhfVhyGPQTafajMns536FpkMciNQFbVfyRtg-UOxBEW6Sp3XTxXnqS-eV3ALtQ7-2c/s1600-h/transfer%5B3%5D.png"><img alt="transfer" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiixcSeJMLCvo9gm5bDlHGRht4SO9vEzpbupSGr9SG5p1R8Y_rOD92WzbebyjMkIXB1QhQfFJWb_J5smDmiDyzu2Hx6YXaOiEPeD3NtOuCztz0v9dgG75hClgC52ZcIVmrzVcOmITPZPvP2/?imgmax=800" height="240" width="183"></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8TPyW_INBtBvgkQ6kf3Hc_UknoJlNJs4u4V1lsGr-l8WgcFx0Xfv6ChIzTvi2odtWH2nu9vHI2z7RtAXRMLjE03TcNtWLHEpc3VyA4i2nDcYrxZSnrqYKnt3NK-vTOOnWbHsYwlxBZZ2a/s1600-h/shade0%5B3%5D.png"><img alt="shade0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi50qchWdGd4fS9-8ROX0Le4YVTuvjmDDLO_-MYb_5D0UqgLmepGKp3mgfc5SrZNos7Bfgg9BOWTnATmeBQtyYkfMvbasBuDk_zzCXEdsv4qHMHVhJMANRFKMY5ck70bCvlfdD0PladM35q/?imgmax=800" height="240" width="206"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjNubqbnu0chjaiqdqlEVYrk25cglHQRW03WMry6x8YXNHxt71lVRmmSA0ai0rEr4U2T2uSpK-MzlcKacFIDhABMKIZrceuftwA4N9iZT4uiSCHF9JfAFVARGa6pb12P4e6d1z4pdmzPqEQ/s1600-h/male_noshadew0%5B13%5D.png"><img alt="male_noshadew0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_bYKg4CpHHhiRfVVVXbS0vfMmx5KdEOs-Qu4uEI9ZDIqLtsNg_5bf2BqhUDaiAbR2bxZCFnVjYFAXVijnAUD6ly5vWGnqsAkZmueKn5tBANH6lD-x0P0vVGISoJANaXCg02ufXyalTi4t/?imgmax=800" height="195" width="197"></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEin3oMWoYZ0JDusrziSpef0-bLku9KnDGk-GLLProaX0CYCDhyphenhypheny0lmFxi9pQV4hiBzWhDz67bd1ucvqeH3Y9GWOBTEfOnEIBBDmvQ4eSpsoKvwx3Zhf6Hqd5_YamUO7vdQ59kSF210WNxhO/s1600-h/male_shadeb0%5B7%5D.png"><img alt="male_shadeb0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMXQ5Pfq6iaT-2w8Svel7_O3h-RDci3sFcq1QmC7Cb4-6Dw2fXBXia2qIPwwCghIi4j-IjPSzx1S0X0vIh5A1XOYA-oKSWURoRvgnHiPipcND0ng89dpDp8JsAKcHIocWWFmmlCUhDH54X/?imgmax=800" height="196" width="205"></a><br />
Mummy (top) and Male (bottom) volumes colored with a transfer function (left) and shaded (right).<br />
A transfer function is used to assign RGB and alpha values to every voxel in the volume. A 1D transfer function maps one RGBA value to each isovalue in [0, 255]. Multi-dimensional transfer functions allow multiple RGBA values to be mapped to a single isovalue; these are, however, outside the scope of this tutorial, so I will focus on 1D transfer functions.<br />
The transfer function is used to "view" a certain part of the volume. As with the second set of pictures above, there is a skin layer/material and a skull layer/material. A transfer function could be designed to show just the skin, just the skull, or both (as is pictured). There are a few different ways of creating transfer functions. One is to manually define the transfer function by specifying the RGBA values for the isovalues (what we will be doing); another is through visual controls and widgets. Manually defining the transfer function involves a lot of guesswork and takes a bit of time, but it is the easiest way to get your feet wet. Visually designing transfer functions is the easiest way to get good results quickly, since it happens at run-time, but this method is quite complex to implement (at least for a tutorial).<br />
<strong><span style="color: #ff8000;">Creating the transfer function:</span></strong><br />
To create a transfer function, we define RGBA values at certain isovalues (control points, or knots) and then interpolate between these values to produce a smooth transition between layers/materials. Our transfer function will result in a 1D texture with a width of 256.<br />
First we have the <span style="color: #ff8000;">TransferControlPoint </span><span style="color: black;">class. This class takes an RGB color or alpha value for a specific isovalue.</span><br />
<pre class="mycode"><span style="color: blue;">public class </span><span style="color: #2b91af;">TransferControlPoint</span>{
<span style="color: blue;">public </span><span style="color: #2b91af;">Vector4 </span>Color;
<span style="color: blue;">public int </span>IsoValue;
<span style="color: grey;">/// <summary>
/// </span><span style="color: green;">Constructor for color control points.
</span><span style="color: grey;">/// </span><span style="color: green;">Takes rgb color components that specify the color at the supplied isovalue.
</span><span style="color: grey;">/// </summary>
/// <param name="x"></param>
/// <param name="y"></param>
/// <param name="z"></param>
/// <param name="isovalue"></param>
</span><span style="color: blue;">public </span>TransferControlPoint(<span style="color: blue;">float </span>r, <span style="color: blue;">float </span>g, <span style="color: blue;">float </span>b, <span style="color: blue;">int </span>isovalue)
{
Color.X = r;
Color.Y = g;
Color.Z = b;
Color.W = 1.0f;
IsoValue = isovalue;
}
<span style="color: grey;">/// <summary>
/// </span><span style="color: green;">Constructor for alpha control points.
</span><span style="color: grey;">/// </span><span style="color: green;">Takes an alpha that specifies the aplpha at the supplied isovalue.
</span><span style="color: grey;">/// </summary>
/// <param name="alpha"></param>
/// <param name="isovalue"></param>
</span><span style="color: blue;">public </span>TransferControlPoint(<span style="color: blue;">float </span>alpha, <span style="color: blue;">int </span>isovalue)
{
Color.X = 0.0f;
Color.Y = 0.0f;
Color.Z = 0.0f;
Color.W = alpha;
IsoValue = isovalue;
}
}</pre>
This class will represent the control points that we will interpolate. I've added two lists to the <span style="color: #ff8000;">Volume</span> class, mAlphaKnots and mColorKnots. These are the lists of transfer control points that we will set up and interpolate to produce the transfer function. To produce the result for the Male dataset above, here are the transfer control points that we will define:<br />
<pre class="mycode">mesh.ColorKnots = <span style="color: blue;">new </span><span style="color: #2b91af;">List</span><<span style="color: #2b91af;">TransferControlPoint</span>> {
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(.91f, .7f, .61f, 0),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(.91f, .7f, .61f, 80),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(1.0f, 1.0f, .85f, 82),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(1.0f, 1.0f, .85f, 256)
};
mesh.AlphaKnots = <span style="color: blue;">new </span><span style="color: #2b91af;">List</span><<span style="color: #2b91af;">TransferControlPoint</span>> {
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(0.0f, 0),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(0.0f, 40),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(0.2f, 60),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(0.05f, 63),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(0.0f, 80),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(0.9f, 82),
<span style="color: blue;">new </span><span style="color: #2b91af;">TransferControlPoint</span>(1f, 256)
}; </pre>
You need to specify <strong>at least two</strong> control points, at isovalues <strong>0</strong> and <strong>256</strong>, for both alpha and color. The control points also need to be ordered (low to high) by isovalue, so the first entry in each list should always be the RGB/alpha value for isovalue 0, and the last entry should always be the RGB/alpha value for isovalue 256. The above list of control points produces the following transfer function after interpolation:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpbzYo5DyZ-Hlc7rHH5NblszTBrhcm2gk_vsp0oh1aR_plGQLURxjNxtJeXuViHhP1B3mZps3Dky-H6E65HT6Iz4TZCWF3RjEQVpBTMtBAXw-WoeMMlbB-whOQsQf0u8HSRyufUaA845I3/s1600-h/transfer_func%5B5%5D.png"><img alt="transfer_func" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhJha23FvIS4YutFn4X0T6cvdq68qR056x5BghbCnbCRglkdSJGcAWd4UGr5jXuFnczzRngJJilMZSL86uuOihnDvxLu2aKk6DRdYUHdi1_k1TnYxTP0Tm1OBAPySzHzuDyy4LoGZY2d6tL/?imgmax=800" height="26" style="display: inline;" title="transfer_func" width="412"></a><br />
So we have defined a range of color for the skin and a longer range of color for the skull/bone. <br />
But how do we interpolate between the control points? We will fit a cubic spline to the control points to produce a nice smooth interpolation between the knots. For a more in-depth discussion on cubic splines refer to my <a href="http://graphicsrunner.blogspot.com/2008/05/camera-animation-part-ii.html" target="_blank">camera animation tutorial</a>.<br />
Here is a simple graph representation of the spline that is fit to the control points.<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi59BSip8Lgz5o2gkg18TjIjaSzC2xIOk2blvGgsZLwGeS2IavnzB2xJapViS2cet0YLqMkHWnifPo4M35vnwG6B3pcaV3pvWu0-bKWZ93Zz267PNtkVJPMVOArqfr0j7y0yUFxGGu2P3oJ/s1600-h/transfer_graph6.png"><img alt="transfer_graph" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg1qb-SCow4ibdMg4JeE-XhosvcNu-YFevfZcOtNRYym4-eLcCTarYDjNNglj_gLM2iVwn6xzoDw1hEE8pKM_h6Ug-Hjnsh6meV-uhzDLl6B1LMjGvt_qHJXShcG4xNO46jjPm0ix6-lI_o/?imgmax=800" height="385" width="425"></a> <br />
<strong><span style="color: #ff8000;">Using the transfer function:</span></strong><br />
<span style="color: black;">So how do we put this transfer function to use? First we set it to a 1D texture and upload it to the graphics card. Then in the shader, we simply take the isovalue sampled from the 3d volume texture and use that to index the transfer function texture.</span><br />
<pre class="code">value = <span style="color: blue;">tex3Dlod</span>(VolumeS, pos);
src = <span style="color: blue;">tex1Dlod</span>(TransferS, value.a);</pre>
Now we have the color and opacity of the current sample. Next, we have to shade it. The shading itself is just simple diffuse shading, but first I'll go over how to calculate the gradients (i.e. normals) for the 3D volume.<br />
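For reference, the diffuse term applied to each sample can be sketched as follows; this is an illustrative Python helper (the demo does this in the shader), and the ambient floor value is an assumption, not taken from the demo code:

```python
def diffuse_shade(rgb, normal, light_dir, ambient=0.1):
    # Lambertian term: scale the transfer-function color by N.L plus a
    # small ambient floor. `normal` is the (normalized) gradient and
    # `light_dir` a normalized direction toward the light; the ambient
    # value is illustrative.
    n_dot_l = max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)
    lit = min(ambient + n_dot_l, 1.0)
    return tuple(c * lit for c in rgb)

shaded = diffuse_shade((1.0, 0.5, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
```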
<strong><span style="color: #ff8000;">Calculating Gradients:</span></strong><br />
The method we will use to calculate the gradients is the central differences scheme. This uses the samples on either side of the current sample to calculate the gradient/normal. This can be performed at run-time in the shader, but since it requires six extra fetches from the 3D texture, it is quite slow. So we will precompute the gradients, place them in the RGB components of our volume texture, and move the isovalue to the alpha channel. This way we only need one volume texture for the data set instead of two: one for the gradients and one for the isovalues.<br />
Calculating the gradients is pretty simple. We just loop through all the samples and find the difference between the next and previous sample to calculate the gradients:<br />
<pre class="mycode"><span style="color: grey;">/// <summary>
/// </span><span style="color: green;">Generates gradients using a central differences scheme.</span><span style="color: grey;">/// </summary>
/// <param name="sampleSize"></span><span style="color: green;">The size/radius of the sample to take.</span><span style="color: grey;"></param></span><span style="color: blue;">private void </span>generateGradients(<span style="color: blue;">int </span>sampleSize)
{
<span style="color: blue;">int </span>n = sampleSize;
<span style="color: #2b91af;">Vector3 </span>normal = <span style="color: #2b91af;">Vector3</span>.Zero;
<span style="color: #2b91af;">Vector3 </span>s1, s2;
<span style="color: blue;">int </span>index = 0;
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>z = 0; z < mDepth; z++)
{
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>y = 0; y < mHeight; y++)
{
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>x = 0; x < mWidth; x++)
{
s1.X = sampleVolume(x - n, y, z);
s2.X = sampleVolume(x + n, y, z);
s1.Y = sampleVolume(x, y - n, z);
s2.Y = sampleVolume(x, y + n, z);
s1.Z = sampleVolume(x, y, z - n);
s2.Z = sampleVolume(x, y, z + n);
mGradients[index++] = <span style="color: #2b91af;">Vector3</span>.Normalize(s2 - s1);
<span style="color: blue;">if </span>(<span style="color: blue;">float</span>.IsNaN(mGradients[index - 1].X))
mGradients[index - 1] = <span style="color: #2b91af;">Vector3</span>.Zero;
}
}
}
}</pre>
Next we will filter the gradients to smooth them out and suppress any sharp irregularities. We achieve this with a simple NxNxN cube filter, which averages the surrounding N^3 - 1 samples. Here's the code for the gradient filtering:<br />
<pre class="mycode"><span style="color: grey;">/// <summary>
/// </span><span style="color: green;">Applies an NxNxN filter to the gradients.</span><span style="color: grey;">/// </span><span style="color: green;">Should be an odd number of samples. 3 used by default.</span><span style="color: grey;">/// </summary>
/// <param name="n"></param></span><span style="color: blue;">private void </span>filterNxNxN(<span style="color: blue;">int </span>n)
{
<span style="color: blue;">int </span>index = 0;
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>z = 0; z < mDepth; z++)
{
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>y = 0; y < mHeight; y++)
{
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>x = 0; x < mWidth; x++)
{
mGradients[index++] = sampleNxNxN(x, y, z, n);
}
}
}
}
<span style="color: grey;">/// <summary>
/// </span><span style="color: green;">Samples the sub-volume graident volume and returns the average.</span><span style="color: grey;">/// </span><span style="color: green;">Should be an odd number of samples.</span><span style="color: grey;">/// </summary>
/// <param name="x"></param>
/// <param name="y"></param>
/// <param name="z"></param>
/// <param name="n"></param>
/// <returns></returns></span><span style="color: blue;">private </span><span style="color: #2b91af;">Vector3 </span>sampleNxNxN(<span style="color: blue;">int </span>x, <span style="color: blue;">int </span>y, <span style="color: blue;">int </span>z, <span style="color: blue;">int </span>n)
{
n = (n - 1) / 2;
<span style="color: #2b91af;">Vector3 </span>average = <span style="color: #2b91af;">Vector3</span>.Zero;
<span style="color: blue;">int </span>num = 0;
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>k = z - n; k <= z + n; k++)
{
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>j = y - n; j <= y + n; j++)
{
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>i = x - n; i <= x + n; i++)
{
<span style="color: blue;">if </span>(isInBounds(i, j, k))
{
average += sampleGradients(i, j, k);
num++;
}
}
}
}
average /= (<span style="color: blue;">float</span>)num;
<span style="color: blue;">if </span>(average.X != 0.0f && average.Y != 0.0f && average.Z != 0.0f)
average.Normalize();
<span style="color: blue;">return </span>average;
}</pre>
This is a really simple and slow way of filtering the gradients. A better way is to use a separable 3D Gaussian kernel to filter the gradients.<br />
Now that we have the gradients we just fill the xyz components of the 3D volume texture and put the isovalue in the alpha channel:<br />
<pre class="mycode"><span style="color: green;">//transform the data to HalfVector4</span><span style="color: #2b91af;">HalfVector4</span>[] gradients = <span style="color: blue;">new </span><span style="color: #2b91af;">HalfVector4</span>[mGradients.Length];
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>i = 0; i < mGradients.Length; i++)
{
gradients[i] = <span style="color: blue;">new </span><span style="color: #2b91af;">HalfVector4</span>(mGradients[i].X, mGradients[i].Y, mGradients[i].Z, mScalars[i].ToVector4().W);
}
mVolume.SetData<<span style="color: #2b91af;">HalfVector4</span>>(gradients);
mEffect.Parameters[<span style="color: #a31515;">"Volume"</span>].SetValue(mVolume);</pre>
And that's pretty much it. Creating a good transfer function can take a little time, but this simple 1D transfer function can still produce pretty good results. Here's a few captures of what this demo can produce:<br />
A teapot with just transfer function:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGjLaDWQB87vVx8gbw-t52ljg4JDtmTHdqmm4QG_KW_TRdLjpU-EVlc3qDor3iOjVDY3_zviZ1mHzM9vRLcknYuJeISjbGg7b1FLyHxaIOxwPna7Ar45wES60vDp-swSiuwksygeq73PsR/s1600-h/teapot_noshade15.png"><img alt="teapot_noshade1" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggJoVCUUu8MjdHPrppgBeMZ57m69C5w3ZUB5c1JZ-GmKWl_lwvJL6fy3ur7xO145Zb4d2i4bjDCDzfZ25xg21Z-_9jpn7boPVHG8t-Yxxl_0vDxX2BuNlndtwDwnh6v4QGspt5hxqlDnVo/?imgmax=800" height="317" width="412"></a><br />
The teapot with transfer function and shaded:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNfN3LOwbGSkK1hjZFAfkCCqVsjGdW7qJGQOk-Te6cGk5BUSqaYcTV8oWWN8VXd7WgqwIuQJidm_klvNAc2ZKltLbDNLNzcudPFCA2L3LSMRlFdE2MXUNpp_Mat-jXyDjfQcPfSIirTxyN/s1600-h/teapot_shade26.png"><img alt="teapot_shade2" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh6gPgaxl8HjyrsmkgHsmdytcDa1Benq6xDxLBWyvCn6ms1X3_1TJaINodBJfGPkviZwwCbKBoANYbuycdzj0OQ5Ccdo8xYGl6hy1-uiAgGr4G3rRdrehlP1xXU20DGKI84N3cbU1Fja1Ar/?imgmax=800" height="309" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQnXtgSky_e8gHe0SQlUuW2d47yjtE4m-yQuDEudYNjgK75qU0D6_QGxLvTB4qJzcXHmiKpbz1P9bYVLIlBk82wyKETCh0JNSz0l4lzQWlRHUiRQsiaWecJxZJNB9-SMW65DFYjnFKr7rh/s1600-h/teapot_shade84.png"><img alt="teapot_shade8" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg90lbG0dv78k9PEi5O5FOOcwbI3Bozv3B4UL6MF2vith5Ga9-nFI2nQ7-2c1Xdq05FaYhBaV1dUAh_ZQCQ1amLpG8nVNVWYxdsjz_mishMPmNXwJMpVHijjNmHcddY_hPXfZseuvjONIja/?imgmax=800" height="309" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgLFhpk0JMIPJ0kF0pmcdSAPoLtDiXynNy_MRXRIcbxxR6UOO719mM4G0JNQOsv2cyJZHlB18t_SZCCFywqMjqA6HW1Jwgnm3YGUu0BdUKmBGbdB_tJvmiPV-XfgWjNhxJuQdLgnYaizl3/s1600-h/teapot_shade94.png"><img alt="teapot_shade9" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgtVC_8lBWoiMAg7gLR5OCOt2x6mkjd5UkxDZctVTEZv3E2-mOdNnZCDPR8xYHdbwGKnBcW40jejsFku6BxmxszPrfKEA6A0SEF3KQJRQ1rJEda42vZz6-p63pN6TC4Q25Bx9FR4XyscYQY/?imgmax=800" height="309" width="412"></a><br />
Now a CT scan of a male head. Just transfer function:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh95TzGnTqHk46TjxmIScrWWDLXewhCE3RqOnqJinfjQB0YEHuNpGHptRVNvnyUxgoOKcfkPdbxWzKCQasqnyqo1WGPkJExpnR4MPe-s8IBmgmGoKiLt4v_yuwZeHEoDiELJv35tSRA0tW_/s1600-h/male_noshadeb017.png"><img alt="male_noshadeb0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhda8Y5RiWfBrq3f88WD1-zZsTp9BHfuPlrVc3IueON2bnUSLHOJMCfAYLx10JfxBcb42SaS1SrAyWao4bx9FiMKNvUI-9474ueGIh6OYLMvkf4GFdTDgxYjYsuq2XNjOkYSfp0_sc3GlFs/?imgmax=800" height="390" style="display: inline;" title="male_noshadeb0" width="412"></a><br />
Now with shading:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgeOzLVbd33L4UqYEhTj8hgWo6ieNzZsTi6AZ9LN0XOEGgg5ZIyY2VAO1pr1ggR_s9yuI724gW0cJUpOSop4enr439wXRQtcnrATTOjXJVOvEludS_MehabijAARSm2tBVr9Xy_E-POJm5r/s1600-h/male_shadeb26.png"><img alt="male_shadeb2" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhgmdAL4_YUBZzw2-JFSchKmzrv1DOVmQKrE_Afsj7Q7wvZZ7CAsMdDREON48bNdlfctY7_IdBpPAeSO2f5BAN4Ry_OHT1QutENkBMbPCyc5jv9GVRSxzcg9aHzQbnWXcPcWQ24o_aHdtSD/?imgmax=800" height="405" style="display: inline;" title="male_shadeb2" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiwjSDiggRrvlmad7ib6RzAHGxdJNopYWVvh5itT5TcMSMKie8IhQvw-HB3Js8sRlB9E32RJJNW2aWjiMYjLTFBCqrttC-lWWWxVMAluoFLZXrHY1xtr5re8g-w7mvvT973PzjbZrawfY4g/s1600-h/male_shadeb014.png"><img alt="male_shadeb0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjA9_mQnWAwFCyb7WypsbK55qe3U7U4N8-YlsLEC83DeTMw6ZFunOjZcK5Ac5pZJAORaoYcAVWCI44kFYl_QxcFQd-iwy51zy8J_hDlu1MxW6-N1NaAin7PotcLHoC8FoKkF45fo-d7TR3h/?imgmax=800" height="386" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhHNR4LyHbudNE8aD-_HfDKeCeCJEm0eMF1SS1SaRYdzRczHJ2nN_pncPaeXJM440Ut8vUG6BO_Qd_EPBxw4tmGO8Oh9_aO17YVRAqbn_VwDC3-1Lus4Dw3H8pU-emYSLDDAu5DJD0q4gDz/s1600-h/male_shadeb15.png"><img alt="male_shadeb1" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6asnpGpYBoyJwnm6Norc0P0RW8uBa34tj5Y_L0v3u0aVXczi19O8inpDnndlClf8toP8k1YoaktToOS2TjwA8YuVZ0JKnY2NtmLdFWch5Rf-w_DDO-QyP4Lx3si5iVfOj_WoBGb0rlP3y/?imgmax=800" height="490" style="display: inline;" title="male_shadeb1" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKajfdUJSTzsyH6XSbc3ESrRJo92MOEv0zdTYoe7VOAMHJzpsihwZX-Zgy-fQ_iTO1wo9g0Oax9hDxHumNTtDhiRKfSFy6ukL-ns0mmFJAx9LXfPILFIl-3yNgFaynfxK1zRgWSIB3TB-k/s1600-h/male_shadew05.png"><img alt="male_shadew0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgacMQnfOKElmp0iOHMy2WaJ0C3ChVJtE5ysoB3LDGzeBrugCW69XgWg1SfuccI2Ed7WLWn-k3VICbsMY4kPUkQ0nOQF07RM9Oo9Zn0iR2R4u-9eevjPUHhWPwPlqIdUlNbD1M6hsTfSpXR/?imgmax=800" height="407" style="display: inline;" title="male_shadew0" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgNR2UIruW8xLul9nXgdt3bAwFMuLAeSpmpZkyGBKiPjCNJKAkQiCSkqYBgxbAhcCMwX4gTSdBWi8h-wlPQVWj8K9LpfjNIsFyRNToGnUyz27ObFiblXs_ssF8P46f-YZpcsBM8qlT4m6qN/s1600-h/male_shadew25.png"><img alt="male_shadew2" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3lvPJnqzHEyW9NqkeNDMQC3MaxJJpkk8JbhpriV57gGFjhr8YAuOP45h59W0y9XrNQw06qT9ub3Gr43FHqbBuswyRw2r7zHvqMrqcOq646rHU7Nhj_-IHX6vIgb1YR9GRvQzoFf2rdhRQ/?imgmax=800" height="403" style="display: inline;" title="male_shadew2" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhdNi2Ej6J272uDNBaRfN7RczaOp981vRXXTVJp2WvuNfww0nVxcV2UZA89KOzK_oMRL3V1x5WB2UHTZKvGgHnoOSykTfQPyPb3D9lbBQDFHqMwBRn_HUJIHMZIIa4lJuTJh8VvfWEBOdNj/s1600-h/male_shadew15.png"><img alt="male_shadew1" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxFibxxMriSz7jazwMjBZrszorcLtu5TxKMWyaFi1DywrzE4f-b2izy8Tl90t52inQNz7fd8i9W53c1sTgZnnb9R03x7iIW9ehWjRP7bPK6fSEHfaqNeonA57nWYQ9raFEY7uXIYLWZXkY/?imgmax=800" height="460" style="display: inline;" title="male_shadew1" width="412"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjADC4ucua-8jsD9s3QTz4JjU5VfTXSw5W9Po37-azv-e_nMxS5lKxnOffklFhBC9Vy-ygEgn0TLgluD0_gceb_-Oa2Pwe5dR0rap7BBmp764-tUVapuf2cm4c6bakfYwu_oDaqe_Cs7JcM/s1600-h/male_shadew35.png"><img alt="male_shadew3" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh28BFOw_a2FmhXYJOLH6GzZisHK2YgORKF1u8hkd_OkF0kxG-zVM5ab9nyFeiwhn312-0sPlpu0MZ1VzhXQuV0zMweAIcg68fVDUJx8lDBxE-oc9JMSR0m41RcIr9whoM8iLkiEu6daElv/?imgmax=800" height="469" style="display: inline;" title="male_shadew3" width="412"></a><br />
And a mummy:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjtIKPa_lH5N607F3QEhpib09p72ehZ_kUCRZtRboonzvyCp6txaVxkwEKum4ONk1JWoXJZzz_l_CLJGWguV4t5_Bo2QsJI-bgGLJDOAkcTlA4In2l7fmhD2lyvJQKZyaCFNyF1GV0zFGKX/s1600-h/shade09.png"><img alt="shade0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgeMHYFmzMcIptvF4THnRCvxx3b_nITbLT93cI91TzA1vnPvTXRfERrVPiBzFD9PnrN3OYMyo9EibQ1hoe-w9i1yxvg95729xTMRRiW-HJ1jDX_dv8pshSIK6eNLjxfUwFIY9oMLJv23Xj5/?imgmax=800" height="480" width="413"></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_DdyFIizUfe6e_OMPnoOq8-UyV9kNotceGCTw6Hx4PAqKjpaXp7oOlE4ezKq27wySgvCC3kOP7jA3Mi7a24VIqdV8_Sc4H3wwlek3WC70OsCU-nPVHNS8H7y2QOoj_fp3LoX-Oef9hXrh/s1600-h/shade14.png"><img alt="shade1" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdnNDHlNq6U0Nb6eqGbm3vrIrD1Dy6kWeI_ChwJXtAwGa_0p1XqxY8_8VJMiFKabDsl-TZlc_o2YRVgTdAhvuC-mvNFWeC_TTy0R17Dcxl6ix86VYWWjgHUifJVnhTdUZZ0PYlZv-cmQvE/?imgmax=800" height="423" width="412"></a><br />
*The CT Male data set was converted from PVM format to RAW format with the V3 library. The original data set can be found here: <a href="http://www9.informatik.uni-erlangen.de/External/vollib/">http://www9.informatik.uni-erlangen.de/External/vollib/</a><br />
<iframe frameborder="0" height="120" scrolling="no" src="https://onedrive.live.com/embed?cid=B80A3031B5BFA52B&resid=B80A3031B5BFA52B%21193&authkey=ALffDPM3e-WIsGg" width="98"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com30tag:blogger.com,1999:blog-1023441640234597436.post-29056497495533193322009-01-15T22:58:00.004-05:002019-11-13T04:36:23.938-05:00Volume Rendering 101<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMjSUAjMVRMEV1T2hb_W0CYDBfrEMM_3Wt-l5OfpRBr4P8oIaUgc14zaR-j6lvs0daUCfFWV9Z3baW5VABRLDKR0q2yLtwjlvJT_4uGz_KAbLsjTfEhohCYd25iVvwYfQkj2fO4wDGbM8Z/s1600-h/fish%5B3%5D.png"><img alt="fish" height="127" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg6VHKQEYuo0Q0VHpHffmJFMdkjyCTShv70r-M9QXyINMC3ZSDUmWdF1a0tTVmnhtbnot5RnDSzIGVhe7LNC0I-FgDAy6GryvQwR-bOQYeAQQWC__o85xjIrsTZQuSKbC9KhkA5Kk4HW8QW/?imgmax=800" width="240" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgXJ0nzfwiTpppgK_sOns5FSsNvvxON7qFbuNOiFKBPX1nszTRgq9HdEeKo9t-D0QR6CrEeVyeGdz1J2-4ppqBG7WU5yJ9s6VfOKrJTyYAUbBawx_rI6yNc7D9Xi3qHY_ZE-gRR5B0_NZ13/s1600-h/bunny%5B7%5D.png"><img alt="bunny" height="160" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifx7GhDt2Q4uOq6d7r9pMsgA3YMTN1AF2DX5GMpQGs2HYXH2bMwe_hgdSW36HpqeqQttam65-ofc_OtcY9f4aMgDZlR7QOyEUioXB1uEzRKfag1adDmLTIZ_AC5bJDzd1b1OqWC3wfMdtQ/?imgmax=800" width="154" /></a> <br />
Pictures above from: <a href="http://www.cs.utah.edu/~jmk/simian/" target="_blank">http://www.cs.utah.edu/~jmk/simian/</a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjsYg00-ZYvTi97uBOIULjpxXUQPWCLjILxn0WrzOIhms0EA0WVgS4kIy0vY0z-yIfkyU6qYTsex0xv0NEo6g1IYwgh_ji4aiCGwFtyOGHkCq0RhSa3YhMfF13UFMbkJy6bUbd7EO8mmI3p/s1600-h/translucent_l%5B5%5D.png"><img alt="translucent_l" height="226" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhEk1DNLhGmeoR0uVoClQLejVBJ3cYWJUUkAqPfSiX0kF0uTy-SB_oTbU_K1dSd2WrSKaG4dEtRDG7bZcwO_kbVm0EBTlPJM85Qd1roMlfhq5WvRSpQ5dq8c3fITOXjvf3ewRw9uBBEBqBe/?imgmax=800" width="199" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHVdRimJKdRA7mF9TY7Hw5R9I1mZhbpPwVPnVWI3NFRHkPqXsf08gxnAs8xlzOYsGoCz2XoRDaTPoYAt-AjHN7OIwXKKXW0we240MRq-IqXAGsoLF3KvIliXclssYpf6_IIkzqWCzO2okG/s1600-h/translucent_d%5B7%5D.png"><img alt="translucent_d" height="227" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXj-xk11kyDz5yPUgLpDR3g4RQPf0gpvEjImxHiAEhoHjWnAYif3wR7BTP6e9Rok5Pw5LLso572n_xvOIN5yCNuNnxn_l6wM-UP_i10ESVhqJXAlc-QhAv3nUrJIKLlSixr4sNInxd3519/?imgmax=800" width="199" /></a> <br />
There is quite a bit of documentation and papers on volume rendering. But there aren't many good tutorials on the subject (that I have seen). So this tutorial will try to teach the basics of volume rendering, more specifically volume ray-casting (or volume ray marching).<br />
What is volume ray-casting you ask? You didn't? Oh, well I'll tell you anyway. Volume rendering is a method for directly displaying a 3D scalar field without first fitting an intermediate representation to the data, such as triangles. How do we render a volume without geometry? There are two traditional approaches: slice-based rendering and volume ray-casting. This tutorial focuses on volume ray-casting, which provides many advantages over slice-based rendering: empty-space skipping, projection independence, a simple implementation, and a single rendering pass.<br />
Volume ray-casting (also called ray marching) is exactly how it sounds. <span style="color: #ff6600;">[edit: volume ray-casting is not the same as ray-casting ala Doom or Nick's tutorials]</span> Rays are cast through the volume and sampled at equally spaced intervals. As each ray is marched through the volume, scalar values are mapped to optical properties through the use of a transfer function, which results in an RGBA color value that includes the corresponding emission and absorption coefficients for the current sample point. This color is then composited using front-to-back or back-to-front alpha blending.<br />
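The front-to-back compositing described above can be sketched as follows; this is a minimal, language-neutral Python version of the accumulation loop (sample values here are assumed to be already classified by the transfer function):

```python
def composite_front_to_back(samples):
    # `samples` are (rgb, alpha) pairs already mapped through the transfer
    # function, ordered from the eye into the volume
    color = [0.0, 0.0, 0.0]
    alpha = 0.0
    for (r, g, b), a in samples:
        # each new sample is attenuated by the opacity accumulated so far
        color[0] += (1.0 - alpha) * a * r
        color[1] += (1.0 - alpha) * a * g
        color[2] += (1.0 - alpha) * a * b
        alpha += (1.0 - alpha) * a
        if alpha >= 0.95:  # early ray termination once nearly opaque
            break
    return color, alpha

rgb, a = composite_front_to_back([((1.0, 0.0, 0.0), 0.5), ((0.0, 1.0, 0.0), 0.5)])
```

A nice property of the front-to-back order is the early-out: once accumulated opacity is close to 1, further samples cannot contribute visibly, so the march can stop.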
This tutorial will focus specifically on how to intersect a ray with the volume and march it through the volume. In another tutorial I will focus on transfer functions and shading.<br />
First we need to know how to read in the data. The data is simply scalar values (usually integers or floats) stored as slices [x, y, z], where x = width, y = height, and z = depth. Each slice is x units wide and y units high, and the total number of slices is equal to z. A common format for the data is to be stored in 8-bit or 16-bit RAW format. Once we have the data, we need to load it into a volume texture. Here's how we do the whole process:<br />
<pre class="mycode"><span style="color: green;">//create the scalar volume texture</span>mVolume = <span style="color: blue;">new </span>Texture3D(<span style="color: #2b91af;">Game</span>.GraphicsDevice, mWidth, mHeight, mDepth, 0,
<span style="color: #2b91af;">TextureUsage</span>.Linear, <span style="color: #2b91af;">SurfaceFormat</span>.Single);
<span style="color: blue;">private void </span>loadRAWFile8(<span style="color: #2b91af;">FileStream </span>file)
{
<span style="color: #2b91af;">BinaryReader </span>reader = <span style="color: blue;">new </span><span style="color: #2b91af;">BinaryReader</span>(file);
<span style="color: blue;">byte</span>[] buffer = <span style="color: blue;">new byte</span>[mWidth * mHeight * mDepth];
<span style="color: blue;">int </span>size = <span style="color: blue;">sizeof</span>(<span style="color: blue;">byte</span>);
reader.Read(buffer, 0, size * buffer.Length);
reader.Close();
<span style="color: green;">//scale the scalar values to [0, 1]
</span>mScalars = <span style="color: blue;">new float</span>[buffer.Length];
<span style="color: blue;">for </span>(<span style="color: blue;">int </span>i = 0; i < buffer.Length; i++)
{
mScalars[i] = (<span style="color: blue;">float</span>)buffer[i] / <span style="color: blue;">byte</span>.MaxValue;
}
mVolume.SetData(mScalars);
mEffect.Parameters[<span style="color: #a31515;">"Volume"</span>].SetValue(mVolume);
}</pre>
In order to render this texture we fit a bounding box, or cube, from [0,0,0] to [1,1,1] to the volume. We then render this cube and sample the volume texture along the way to render the volume. But we also need a way to find the ray that starts at the eye/camera and intersects the cube.<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjPwuvMCrKaoWyvKvO0a8VOA_QRJQabkUrSQSgVJgF-DTpKEeecG8paP6aij2GV_-x1U4VYFeUJNuppyJiia6Avr2uJTevkMHqOYVj-s-S3GT6oh4kSGDTLuJFm5apQG58TrvJjvUMkyoT1/s1600-h/ray%5B7%5D.png"><img alt="ray" height="240" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0z1GAHdJZUUN2a5hLPLApBKFmQJ4hsh-W-fs2AFBJAkcYKojW6j2Gz_e29sdat1xTKVlpXrZk3_HomdFz4ld2Mp0EQxWUKuMCZ8885zQzeCZrfYhJ_3vQh2-5IfZnXpadfc8GVEn5BkTR/?imgmax=800" width="216" /></a> <br />
We could always calculate the intersection of the ray from the eye to the current pixel with the cube by performing a ray-cube intersection in the shader. But a better and faster way is to render the positions of the front- and back-facing triangles of the cube to textures. This easily gives us the start and end positions of the ray, and in the shader we simply sample the textures to find the sampling ray.<br />
Here's what the textures look like (Front, Back, Ray Direction):<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjehSdKF9PYsPS7wCk-1wXFPN2Mu8zGr0Jpu85hgHyggNy3_bR-f7iWGl8D87QqKZJjfEuALFaMq4YpeeGQt0ZWkMcOwExPXTEi_ihyphenhyphenrwFS6GZ1VClBi_JoGutc46PgsfTzyDUquRLMCNn0/s1600-h/front%5B4%5D.png"><img alt="Front" height="134" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkxk8vjoV0xPlQJRJF1oTBZEw1Yh5OjRX659RAjwPgN6aY7TH55My-jMYx7I3rmLyBMiPEyOYtmDtfyeu_Bstj5n0o_3cECsCjR0C04Bl6JRfOSm1IYJUlZqxqZQblg_reRoySjlD_mgcL/?imgmax=800" width="120" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg65-x4AThER8XBRcAbqcq6WcuSIP0vMOyhVrrFWrdqDUgezAOOMvV6PClSvxkpepn99Z5rIMSU0g7BoWA2U6WWP_VS0rRo_4UHqP8LemMK6YHXHgONazQpYFkCFDMvHX0tSFttZzoZ8Ahw/s1600-h/back%5B5%5D.png"><img alt="Back" height="134" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjdvI515NNKgi0JggOXxdJUHYxeQRqFqAPmot1NpBxmb13eJW8v6u8-BmcAKTogNErkVwS0dOfTCIy-SQCOmqUyiqbILSnJ2Nqw3itcHrDQPVnsLvhUgSb1zHL8AU2Qv184qCKUOZK03APE/?imgmax=800" width="120" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjLkeEpd3BSiUKktj_qMvOZnEGyOwbcOlC34F2IM1KfHjq6QbB0YdxRFf1MZMIzOJdRCuqp9Fp6qtEVQJadUzsC8K6kPiBvaM0_gbJ-zfSb7mWujyGSKcfbFXeamw0ZDX6UqstHXJGA9N96/s1600-h/direction%5B4%5D.png"><img alt="Ray Direction" height="135" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_uCJg9sCf00_I7JHoZLqVqHhPtOVPfx1Fxpw0u7cwYDofVo5j2sH1UsXvVdXLqBX4YlkPh-qRr69-IZ5Gvo4bdBOwQBD5OkLnJfyKIWfFXlxFzFAmbjy1QQ8fAywwKPMwqdj-dAMHMjIa/?imgmax=800" width="120" /></a> <br />
And here's the code to render the front and back positions:<br />
<pre class="mycode"><span style="color: green;">//draw front faces
//draw the pixel positions to the texture</span>
Game.GraphicsDevice.SetRenderTarget(0, mFront);
Game.GraphicsDevice.Clear(<span style="color: #2b91af;">Color</span>.Black);
<span style="color: blue;">base</span>.DrawCustomEffect();
Game.GraphicsDevice.SetRenderTarget(0, <span style="color: blue;">null</span>);
<span style="color: green;">//draw back faces
//draw the pixel positions to the texture</span>
Game.GraphicsDevice.SetRenderTarget(0, mBack);
Game.GraphicsDevice.Clear(<span style="color: #2b91af;">Color</span>.Black);
Game.GraphicsDevice.RenderState.CullMode = <span style="color: #2b91af;">CullMode</span>.CullClockwiseFace;
<span style="color: blue;">base</span>.DrawCustomEffect();
Game.GraphicsDevice.SetRenderTarget(0, <span style="color: blue;">null</span>);
Game.GraphicsDevice.RenderState.CullMode = <span style="color: #2b91af;">CullMode</span>.CullCounterClockwiseFace;</pre>
Now, to perform the actual ray-casting of the volume, we render the front faces of the cube. In the shader we sample the front and back position textures to find the direction (back - front) and starting position (front) of the ray that will sample the volume. The volume is then iteratively sampled by advancing the current sampling position along the ray at equidistant steps. And we use front-to-back compositing to accumulate the pixel color.<br />
<pre class="mycode"><span style="color: blue;">float4 </span>RayCastSimplePS(VertexShaderOutput input) : <span style="color: navy;">COLOR0</span>{
<span style="color: green;">//calculate projective texture coordinates
//used to project the front and back position textures onto the cube
</span><span style="color: blue;">float2 </span>texC = input.pos.xy / input.pos.w;
texC.x = 0.5f*texC.x + 0.5f;
texC.y = -0.5f*texC.y + 0.5f;
<span style="color: blue;">float3 </span>front = <span style="color: blue;">tex2D</span>(FrontS, texC).xyz;
<span style="color: blue;">float3 </span>back = <span style="color: blue;">tex2D</span>(BackS, texC).xyz;
<span style="color: blue;">float3 </span>dir = <span style="color: blue;">normalize</span>(back - front);
<span style="color: blue;">float4 </span>pos = <span style="color: blue;">float4</span>(front, 0);
<span style="color: blue;">float4 </span>dst = <span style="color: blue;">float4</span>(0, 0, 0, 0);
<span style="color: blue;">float4 </span>src = 0;
<span style="color: blue;">float </span>value = 0;
<span style="color: blue;">float3 </span>Step = dir * StepSize;
<span style="color: blue;">for</span>(<span style="color: blue;">int </span>i = 0; i < Iterations; i++)
{
pos.w = 0;
value = <span style="color: blue;">tex3Dlod</span>(VolumeS, pos).r;
src = (<span style="color: blue;">float4</span>)value;
src.a *= .5f; <span style="color: green;">//reduce the alpha to have a more transparent result
//Front to back blending
// dst.rgb = dst.rgb + (1 - dst.a) * src.a * src.rgb
// dst.a = dst.a + (1 - dst.a) * src.a
</span>src.rgb *= src.a;
dst = (1.0f - dst.a)*src + dst;
<span style="color: green;">//break from the loop when alpha gets high enough
</span><span style="color: blue;">if</span>(dst.a >= .95f)
break;
<span style="color: green;">//advance the current position
</span>pos.xyz += Step;
<span style="color: green;">//break if the position is greater than (1, 1, 1)
</span><span style="color: blue;">if</span>(pos.x > 1.0f || pos.y > 1.0f || pos.z > 1.0f)
break;
}
<span style="color: blue;">return </span>dst;
}</pre>
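The front-to-back blend inside the loop is easy to verify on the CPU. A small sketch of the same accumulation (Python for brevity; the function and argument names are mine, not from the sample):

```python
def composite_front_to_back(samples, step_alpha=0.5):
    # Mirrors the loop in RayCastSimplePS: 'samples' are scalar densities
    # in [0, 1] read along the ray. Color is premultiplied by alpha, then
    # dst = (1 - dst.a) * src + dst, with early ray termination.
    dst_rgb, dst_a = 0.0, 0.0
    for value in samples:
        src_a = value * step_alpha          # src.a *= .5f
        src_rgb = value * src_a             # src.rgb *= src.a (premultiply)
        dst_rgb += (1.0 - dst_a) * src_rgb
        dst_a += (1.0 - dst_a) * src_a
        if dst_a >= 0.95:                   # break once nearly opaque
            break
    return dst_rgb, dst_a
```

Note that the accumulated alpha can only approach 1, never exceed it, which is why the 0.95 cutoff is a safe early-out.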
<br />
And here's the result when sampling a foot, a teapot with a lobster inside, an engine, a bonsai tree, a CT scan of an aneurysm, a skull, and a teddy bear:<br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgw7PmlYnRxAdKpkTawr_GX8BYHCn8DXop8lZ3B_Dyx6R4lb-16ZdF-A-yeckxsBeEFZhJGjdYp5HTKo1rDZ1a7sJNlGbNCcanBqj1Fp0Nts1BSuTBQLeTyYeAb8qaS5T8yoUQKB3xx46kd/s1600-h/foot%5B15%5D.png"><img alt="foot" height="397" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiXKM_mZm7oXLSRf9hGkSWLtxgRIrH4u56_kN4vXNBpV_5X418j-_GE7DaSOl0-qidvhkNl2jyrYTbNuEKTlWTe_zE8_GyKfbo3u5Gfq2RYCuiNPLCAtGeg46Gqr0RhJxg7u0AeN4trRxxp/?imgmax=800" width="404" /></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjD48uX04uh-QIXa2T-h8Nh9u9tuN4nKuXSTchQvHuVq4SVPVGol_QHgTx4YnEApGquvjVUGhoUPspxA7IhYUieoHK758q7JPO-8XXxs5GeMKy6uQZdkQ2pRXxo0W9GAOIYvi93caApyQ75/s1600-h/teapot%5B10%5D.png"><img alt="teapot" border="0" height="311" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhiJvjLpWZnaky4CxA9jZ8UTy6esmGiB1LMJ9q1K9QVhr-hrKaRuHRWdL_yGjiVRWkMJZpcLvL7t3LMVQGV4z5lEcHwqe3RovIgtekMcvVSzftSCHUGHb0eX8i6exSZAQFrUGihnqkjUngg/?imgmax=800" width="408" /></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhbwNn7rUlKCXUtT_ApXfOnjJa1gC3zo1LwwbaFMB836iS9YsjMfyfKUVpUbpeoKVbzwAXVvuGVQt4tgJnE_XrWuYtSml1X_q4tGjXaPParjL-XiZjPRskAri5hxT9yopy1fsDbLD5UzPD1/s1600-h/engine%5B6%5D.png"><img alt="engine" height="415" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimvlZIGXNlbXot_NbSMUEocjO1Paxr8NVnKlstkyfSCfQBqBK6vDUgCtZ3FXbJbkKznz5YiBtmh7MGKYQT1NAZs3mbuUoYI49RWkoMRGG-IArGAzcWjhrMIoCfJjp9fqm_unJhNdStz5c3/?imgmax=800" width="410" /></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg97VKy4Tcd4ecG5Eq6rkz7PG5zK0FMlBTmFBObioq7N7JDRnuNl9Oj38jt0y9PPRQ2v7eJNGArz2pvIk3K8UpRlA7-PiORH9lh5ENghI4k8v2rwOB4RpoFpHoo98bEOFIzf4AHQrBLhz7S/s1600-h/bonsai%5B8%5D.png"><img alt="bonsai" border="0" height="415" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg8CTfFHvC64CZyKnxmp20VzgXSsCt0CJXPup1bWChnSMHcTUh87xZ_jYNrOv2NfR708CmI1NEAGtk-KbPHkZ0at7u_ez6gmJ0XoRPQm95Jp0-PWZlv_CG_PsoCY6X5LWuYtDwWs8bSQINc/?imgmax=800" width="412" /></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjzBzhDaS6l69gCLbxt44jFNytHP6aZuSHyilKgAQDaxue2BM-f0FXKr82Bfcm5mztGav9_-ivXrvn5uiRxAYm5RDyQSWp600euKmdnkXe3dYnBfZxmzVQ07MmuJA-EjQVy4_auCp92Ljmp/s1600-h/aneurism%5B4%5D.png"><img alt="aneurism" height="352" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh899Ztk3ALOpLoktp_xBgu3HzwmBprLh-fEAeRgmC8Bqvlxm-p5TqtejEEDCwoDqzI8tLksQQYJRH3qD2a0HCV-sUKZeJV_GsxQ34sCklN4UrvPPOdsYcT2VXQ8_d6RP5NdVkcD9QMK9sG/?imgmax=800" width="409" /></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjVLOKbe9dFHEftUGr_zfbExK_PCNYoXTJ9l_zVuwsMLqB0wmMJnmFtxrt8jhYga1PHe5avFCuQFRJU2vdzfNBoxy4VBnXkwExULUKQsBfpFBmgK41vDPWdl2qqUQOjA2wFxoRS9aGc1i_U/s1600-h/skull%5B5%5D.png"><img alt="skull" border="0" height="370" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifPbAAZ7q3u_IyPt802VUlFEi4tRzB36EzkCdWO15yy6Avgk2tWk-ZmzCj_aDwrwUb94aE68_SqmUOaVDufb0DTnT2DaPL8YSsPr8sKPfjWg0Dia7iQiEd4GshWzJ54ozVp35qoSmo5EEl/?imgmax=800" width="409" /></a><br />
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMRHAa7z7htIvYvnpDsN9Mz3g23Ea1NoBnSPD-2mihKgx6OQ9lSoRa5CkzPEHfXVm32c_pXQdurkLU5NfOodXdFK_2FgCoPnLZr8qNBtPOp7HFj11R79O6ahgoZS4pZjlFOAGj0ToNcwu1/s1600-h/teddy%5B5%5D.png"><img alt="teddy" height="395" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_ZzM2FdDAtfnVK-ObnIlcEFHHIZ5SQzjctFJHFOAe6BXvJAhPhDb_Lfg9LfihXPLy4R5xPv7bVyRUsZxjiv_JayXS4n1J-O3e_c4WkngDV5tbTEPhvemIFsfw6iR0Uv2GfS5qDGiGK955/?imgmax=800" width="412" /></a><br />
So, not very colorful but pretty cool. When we get into transfer functions we will start shading the volumes. The volumes used here can be found at <a href="http://www.gris.uni-tuebingen.de/edu/areas/scivis/volren/datasets/datasets.html" target="_blank">volvis.org</a><br />
<strong><span style="color: #ff6600;">Notes:</span></strong><br />
Refer to the scene setup region in VolumeRayCasting.cs, Volume.cs, and RayCasting.fx for relevant implementation details.<br />
Also, a Shader Model 3.0 card (Nvidia 6600GT or higher) is needed to run the sample.<br />
<br />
<iframe frameborder="0" height="120" scrolling="no" src="https://onedrive.live.com/embed?cid=B80A3031B5BFA52B&resid=B80A3031B5BFA52B%21192&authkey=AJM7E5k_xeKVfss" width="98"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com57tag:blogger.com,1999:blog-1023441640234597436.post-55802310811997043682008-11-13T17:32:00.019-05:002008-12-01T22:53:26.047-05:00Water Game Component<a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjbMhQu6JJKSJKCgSddaMmQFLCbg9ycZJbh24AMGds4o7Vw4KJGcCjcMjdRt0emZotGUveuXL-lk_UvMFHsRY1-ZlO6UE8ASaBE2OsQLqFmSYPMkGg-fMfD-dO3zs8DMMaU82A7FlPLdQBl/s1600-h/water.png"><img id="BLOGGER_PHOTO_ID_5268533297856805890" style="WIDTH: 400px; CURSOR: pointer; HEIGHT: 300px" alt="" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjbMhQu6JJKSJKCgSddaMmQFLCbg9ycZJbh24AMGds4o7Vw4KJGcCjcMjdRt0emZotGUveuXL-lk_UvMFHsRY1-ZlO6UE8ASaBE2OsQLqFmSYPMkGg-fMfD-dO3zs8DMMaU82A7FlPLdQBl/s400/water.png" border="0" /></a><br /><br /><p>I had some requests awhile back for a more independent water effect that I had provided in my camera animation tutorials. And just the other day I remembered that I had totally forgot about this (doh!). So here is a DrawableGameComponent for the water effect. 
<a href="http://graphicsrunner.blogspot.com/2008/05/camera-animation-part-ii.html">Have a look here to see it in action.</a><br /></p><p>To set up the component we need to fill out a WaterOptions object that will be passed to the water component.</p><pre class="code"><span style="COLOR: rgb(43,145,175)">WaterOptions </span>options = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">WaterOptions</span>();<br />options.Width = 257;<br />options.Height = 257;<br />options.CellSpacing = 0.5f;<br />options.WaveMapAsset0 = <span style="COLOR: rgb(163,21,21)">"Textures/wave0"</span>;<br />options.WaveMapAsset1 = <span style="COLOR: rgb(163,21,21)">"Textures/wave1"</span>;<br />options.WaveMapVelocity0 = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector2</span>(0.01f, 0.03f);<br />options.WaveMapVelocity1 = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector2</span>(-0.01f, 0.03f);<br />options.WaveMapScale = 2.5f;<br />options.WaterColor = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector4</span>(0.5f, 0.79f, 0.75f, 1.0f);<br />options.SunColor = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector4</span>(1.0f, 0.8f, 0.4f, 1.0f);<br />options.SunDirection = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector3</span>(2.6f, -1.0f, -1.5f);<br />options.SunFactor = 1.5f;<br />options.SunPower = 250.0f;<br /><br />mWaterMesh = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Water</span>(<span style="color:blue;">this</span>);<br />mWaterMesh.Options = options;<br />mWaterMesh.EffectAsset = <span style="COLOR: rgb(163,21,21)">"Shaders/Water"</span>;<br />mWaterMesh.World = <span style="COLOR: rgb(43,145,175)">Matrix</span>.CreateTranslation(<span style="COLOR: rgb(43,145,175)">Vector3</span>.UnitY * 2.0f);<br />mWaterMesh.RenderObjects = DrawObjects;</pre><br />So here we fill out 
various options such as width and height, cell spacing, the normal map asset names, etc. We then create the water component and assign it the options object. We then provide the filename of the Water.fx shader and the water's position, and assign its RenderObjects delegate a function that will be used to draw the objects in your scene.<br /><br />The component tries to be relatively independent of how you represent your game objects. All that it asks is that you provide a function that takes a reflection matrix. This function should go through the objects that you want to be reflected/refracted and combine the reflection matrix with each object's world matrix.<br /><br />Here's an example of what your DrawObjects() function might look like.<br /><pre class="mycode"><span style="color:blue;">private void </span>DrawObjects(<span style="COLOR: rgb(43,145,175)">Matrix </span>reflMatrix)<br />{<br /><span style="color:blue;"> foreach </span>(<span style="COLOR: rgb(43,145,175)">DrawableGameComponent </span>mesh <span style="color:blue;">in </span>Components)<br />{<br /> <span style="COLOR: rgb(43,145,175)">Matrix </span>oldWorld = mesh.World;<br /> mesh.World = oldWorld * reflMatrix;<br /><br /> mesh.Draw(mGameTime);<br /><br /> mesh.World = oldWorld;<br />}<br />}</pre>mWaterMesh.RenderObjects is the delegate that has the signature of:<span style="color:blue;"> public delegate void </span><span style="COLOR: rgb(43,145,175)">RenderObjects</span>(<span style="COLOR: rgb(43,145,175)">Matrix </span>reflectionMatrix);<br /><br />Basically this function should just go through your game objects and render them.<br /><br />Lastly, before you draw your objects in the scene, you need to send the water component the ViewProjection matrix and the camera's position using WaterMesh.SetCamera(). And you need to call WaterMesh.UpdateWaterMaps() to update the reflection and refraction maps. 
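Combining reflMatrix with an object's world matrix simply mirrors the object across the water plane. A quick numeric sketch of that transform (Python for brevity; reflect_point is a hypothetical helper showing the point transform that Matrix.CreateReflection encodes, not code from the sample):

```python
def reflect_point(p, n, d):
    # Reflect point p across the plane n.x + d = 0, with n unit length.
    # For the water plane above (World = translation by UnitY * 2, i.e.
    # the plane y = 2), the plane is n = (0, 1, 0), d = -2.
    k = 2.0 * (sum(pi * ni for pi, ni in zip(p, n)) + d)
    return tuple(pi - k * ni for pi, ni in zip(p, n))

# A point 3 units above the water plane lands 3 units below it:
print(reflect_point((1.0, 5.0, -2.0), (0.0, 1.0, 0.0), -2.0))
```

The x and z coordinates are untouched because the plane is horizontal; only the height is mirrored about y = 2.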
After this, you can clear your framebuffer and draw your objects. For how this effect looks you can take a look at my camera animation tutorials.<br /><span style="FONT-WEIGHT: bold"><br />Water GameComponent:</span><br /><pre class="mycode"><span style="color:blue;">using </span>System;<br /><span style="color:blue;">using </span>System.Collections.Generic;<br /><span style="color:blue;">using </span>System.Text;<br /><br /><span style="color:blue;">using </span>Microsoft.Xna.Framework;<br /><span style="color:blue;">using </span>Microsoft.Xna.Framework.Graphics;<br /><span style="color:blue;">using </span>Microsoft.Xna.Framework.Content;<br /><br /><span style="color:blue;">namespace WaterSample</span><br />{<br /><span style="color:green;">//delegate that the water component to call to render the objects in the scene<br /></span><span style="color:blue;">public delegate void </span><span style="COLOR: rgb(43,145,175)">RenderObjects</span>(<span style="COLOR: rgb(43,145,175)">Matrix </span>reflectionMatrix);<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Options that must be passed to the water component before Initialization<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">public class </span><span style="COLOR: rgb(43,145,175)">WaterOptions<br /></span>{<br /><span style="font-size:0;"><span style="color:green;">//width and height must be of the form 2^n + 1</span></span><br /><span style="color:blue;">public int </span>Width = 257;<br /><span style="color:blue;">public int </span>Height = 257;<br /><span style="color:blue;">public float </span>CellSpacing = .5f;<br /><br /><span style="color:blue;">public float </span>WaveMapScale = 1.0f;<br /><br /><span style="color:blue;">public int </span>RenderTargetSize = 512;<br /><br /><span style="color:green;">//offsets for the texcoords of the wave maps updated every frame<br /></span><span style="color:blue;">public 
</span><span style="COLOR: rgb(43,145,175)">Vector2 </span>WaveMapOffset0;<br /><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">Vector2 </span>WaveMapOffset1;<br /><br /><span style="color:green;">//the direction to offset the texcoords of the wave maps<br /></span><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">Vector2 </span>WaveMapVelocity0;<br /><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">Vector2 </span>WaveMapVelocity1;<br /><br /><span style="color:green;">//asset names for the normal/wave maps<br /></span><span style="color:blue;">public string </span>WaveMapAsset0;<br /><span style="color:blue;">public string </span>WaveMapAsset1;<br /><br /><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">Vector4 </span>WaterColor;<br /><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">Vector4 </span>SunColor;<br /><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">Vector3 </span>SunDirection;<br /><span style="color:blue;">public float </span>SunFactor;<br /><span style="color:blue;">public float </span>SunPower;<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Drawable game component for water rendering. 
Renders the scene to reflection and refraction<br /></span><span style="color:gray;">/// </span><span style="color:green;">maps that are projected onto the water plane and are distorted based on two scrolling normal<br /></span><span style="color:gray;">/// </span><span style="color:green;">maps.<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">public class </span><span style="COLOR: rgb(43,145,175)">Water </span>: <span style="COLOR: rgb(43,145,175)">DrawableGameComponent<br /></span>{<br /><span style="color:blue;">#region </span>Fields<br /><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">RenderObjects </span>mDrawFunc;<br /><br /><span style="color:green;">//vertex and index buffers for the water plane<br /></span><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">VertexBuffer </span>mVertexBuffer;<br /><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">IndexBuffer </span>mIndexBuffer;<br /><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">VertexDeclaration </span>mDecl;<br /><br /><span style="color:green;">//water shader<br /></span><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">Effect </span>mEffect;<br /><span style="color:blue;">private string </span>mEffectAsset;<br /><br /><span style="color:green;">//camera properties<br /></span><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">Vector3 </span>mViewPos;<br /><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">Matrix </span>mViewProj;<br /><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">Matrix </span>mWorld;<br /><br /><span style="color:green;">//maps to render the refraction/reflection to<br /></span><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">RenderTarget2D </span>mRefractionMap;<br /><span 
style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">RenderTarget2D </span>mReflectionMap;<br /><br /><span style="color:green;">//scrolling normal maps that we will use as a<br />//a normal for the water plane in the shader<br /></span><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">Texture </span>mWaveMap0;<br /><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">Texture </span>mWaveMap1;<br /><br /><span style="color:green;">//user specified options to configure the water object<br /></span><span style="color:blue;">private </span><span style="COLOR: rgb(43,145,175)">WaterOptions </span>mOptions;<br /><br /><span style="color:green;">//tells the water object if it needs to update the refraction<br />//map itself or not. Since refraction just needs the scene drawn<br />//regularly, we can:<br />// --Draw the objects we want refracted<br />// --Resolve the back buffer and send it to the water<br />// --Skip computing the refraction map in the water object<br /></span><span style="color:blue;">private bool </span>mGrabRefractionFromFB = <span style="color:blue;">false</span>;<br /><br /><span style="color:blue;">private int </span>mNumVertices;<br /><span style="color:blue;">private int </span>mNumTris;<br /><span style="color:blue;">#endregion<br /><br />#region </span>Properties<br /><br /><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">RenderObjects </span>RenderObjects<br />{<br /><span style="color:blue;">set </span>{ mDrawFunc = <span style="color:blue;">value</span>; }<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Name of the asset for the Effect.<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">public string </span>EffectAsset<br />{<br /><span style="color:blue;">get </span>{ <span style="color:blue;">return </span>mEffectAsset; }<br /><span 
style="color:blue;">set </span>{ mEffectAsset = <span style="color:blue;">value</span>; }<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">The render target that the refraction is rendered to.<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">RenderTarget2D </span>RefractionMap<br />{<br /><span style="color:blue;">get </span>{ <span style="color:blue;">return </span>mRefractionMap; }<br /><span style="color:blue;">set </span>{ mRefractionMap = <span style="color:blue;">value</span>; }<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">The render target that the reflection is rendered to.<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">RenderTarget2D </span>ReflectionMap<br />{<br /><span style="color:blue;">get </span>{ <span style="color:blue;">return </span>mReflectionMap; }<br /><span style="color:blue;">set </span>{ mReflectionMap = <span style="color:blue;">value</span>; }<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Options to configure the water. Must be set before<br /></span><span style="color:gray;">/// </span><span style="color:green;">the water is initialized. 
Should be set immediately<br /></span><span style="color:gray;">/// </span><span style="color:green;">following the instantiation of the object.<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">WaterOptions </span>Options<br />{<br /><span style="color:blue;">get </span>{ <span style="color:blue;">return </span>mOptions; }<br /><span style="color:blue;">set </span>{ mOptions = <span style="color:blue;">value</span>; }<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">The world matrix of the water.<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">public </span><span style="COLOR: rgb(43,145,175)">Matrix </span>World<br />{<br /><span style="color:blue;">get </span>{ <span style="color:blue;">return </span>mWorld; }<br /><span style="color:blue;">set </span>{ mWorld = <span style="color:blue;">value</span>; }<br />}<br /><br /><span style="color:blue;">#endregion<br /><br />public </span>Water(<span style="COLOR: rgb(43,145,175)">Game </span>game) : <span style="color:blue;">base</span>(game)<br />{<br /><br />}<br /><br /><span style="color:blue;">public override void </span>Initialize()<br />{<br /><span style="color:blue;">base</span>.Initialize();<br /><br /><span style="color:green;">//build the water mesh<br /></span>mNumVertices = mOptions.Width * mOptions.Height;<br />mNumTris = (mOptions.Width - 1) * (mOptions.Height - 1) * 2;<br /><span style="COLOR: rgb(43,145,175)">VertexPositionTexture</span>[] vertices = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">VertexPositionTexture</span>[mNumVertices];<br /><br /><span style="COLOR: rgb(43,145,175)">Vector3</span>[] verts;<br /><span style="color:blue;">int</span>[] indices;<br /><br />GenTriGrid(mOptions.Height, mOptions.Width, mOptions.CellSpacing, mOptions.CellSpacing,<br /> <span style="COLOR: 
rgb(43,145,175)">Vector3</span>.Zero, <span style="color:blue;">out </span>verts, <span style="color:blue;">out </span>indices);<br /><br /><span style="color:green;">//copy the verts into our PositionTextured array<br /></span><span style="color:blue;">for </span>(<span style="color:blue;">int </span>i = 0; i < mOptions.Width; ++i)<br />{<br /> <span style="color:blue;">for </span>(<span style="color:blue;">int </span>j = 0; j < mOptions.Height; ++j)<br /> {<br /> <span style="color:blue;">int </span>index = i * mOptions.Width + j;<br /> vertices[index].Position = verts[index];<br /> vertices[index].TextureCoordinate = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector2</span>((<span style="color:blue;">float</span>)j / mOptions.Width, (<span style="color:blue;">float</span>)i / mOptions.Height);<br /> }<br />}<br /><br />mVertexBuffer = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">VertexBuffer</span>(Game.GraphicsDevice,<br /> <span style="COLOR: rgb(43,145,175)">VertexPositionTexture</span>.SizeInBytes * mOptions.Width * mOptions.Height,<br /> <span style="COLOR: rgb(43,145,175)">BufferUsage</span>.WriteOnly);<br />mVertexBuffer.SetData(vertices);<br /><br />mIndexBuffer = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">IndexBuffer</span>(Game.GraphicsDevice, <span style="color:blue;">typeof</span>(<span style="color:blue;">int</span>), indices.Length, <span style="COLOR: rgb(43,145,175)">BufferUsage</span>.WriteOnly);<br />mIndexBuffer.SetData(indices);<br /><br />mDecl = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">VertexDeclaration</span>(Game.GraphicsDevice, <span style="COLOR: rgb(43,145,175)">VertexPositionTexture</span>.VertexElements);<br />}<br /><br /><span style="color:blue;">protected override void </span>LoadContent()<br />{<br /><span style="color:blue;">base</span>.LoadContent();<br /><br />mWaveMap0 = Game.Content.Load<<span 
style="COLOR: rgb(43,145,175)">Texture2D</span>>(mOptions.WaveMapAsset0);<br />mWaveMap1 = Game.Content.Load<<span style="COLOR: rgb(43,145,175)">Texture2D</span>>(mOptions.WaveMapAsset1);<br /><br /><span style="COLOR: rgb(43,145,175)">PresentationParameters </span>pp = Game.GraphicsDevice.PresentationParameters;<br /><span style="COLOR: rgb(43,145,175)">SurfaceFormat </span>format = pp.BackBufferFormat;<br /><span style="COLOR: rgb(43,145,175)">MultiSampleType </span>msType = pp.MultiSampleType;<br /><span style="color:blue;">int </span>msQuality = pp.MultiSampleQuality;<br /><br />mRefractionMap = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">RenderTarget2D</span>(Game.GraphicsDevice, mOptions.RenderTargetSize, mOptions.RenderTargetSize,<br /> 1, format, msType, msQuality);<br />mReflectionMap = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">RenderTarget2D</span>(Game.GraphicsDevice, mOptions.RenderTargetSize, mOptions.RenderTargetSize,<br /> 1, format, msType, msQuality);<br /><br />mEffect = Game.Content.Load<<span style="COLOR: rgb(43,145,175)">Effect</span>>(mEffectAsset);<br /><br /><span style="color:green;">//set the parameters that shouldn't change.<br />//Some of these might need to change every once in awhile,<br />//move them to updateEffectParams if you need that functionality.<br /></span><span style="color:blue;">if </span>(mEffect != <span style="color:blue;">null</span>)<br />{<br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"WaveMap0"</span>].SetValue(mWaveMap0);<br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"WaveMap1"</span>].SetValue(mWaveMap1);<br /><br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"TexScale"</span>].SetValue(mOptions.WaveMapScale);<br /><br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"WaterColor"</span>].SetValue(mOptions.WaterColor);<br /> mEffect.Parameters[<span style="COLOR: 
rgb(163,21,21)">"SunColor"</span>].SetValue(mOptions.SunColor);<br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"SunDirection"</span>].SetValue(mOptions.SunDirection);<br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"SunFactor"</span>].SetValue(mOptions.SunFactor);<br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"SunPower"</span>].SetValue(mOptions.SunPower);<br /><br /> mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"World"</span>].SetValue(mWorld);<br />}<br />}<br /><br /><span style="color:blue;">public override void </span>Update(<span style="COLOR: rgb(43,145,175)">GameTime </span>gameTime)<br />{<br /><span style="color:blue;">float </span>timeDelta = (<span style="color:blue;">float</span>)gameTime.ElapsedGameTime.TotalSeconds;<br /><br />mOptions.WaveMapOffset0 += mOptions.WaveMapVelocity0 * timeDelta;<br />mOptions.WaveMapOffset1 += mOptions.WaveMapVelocity1 * timeDelta;<br /><br /><span style="color:blue;">if </span>(mOptions.WaveMapOffset0.X >= 1.0f || mOptions.WaveMapOffset0.X <= -1.0f)<br /> mOptions.WaveMapOffset0.X = 0.0f;<br /><span style="color:blue;">if </span>(mOptions.WaveMapOffset1.X >= 1.0f || mOptions.WaveMapOffset1.X <= -1.0f)<br /> mOptions.WaveMapOffset1.X = 0.0f;<br /><span style="color:blue;">if </span>(mOptions.WaveMapOffset0.Y >= 1.0f || mOptions.WaveMapOffset0.Y <= -1.0f)<br /> mOptions.WaveMapOffset0.Y = 0.0f;<br /><span style="color:blue;">if </span>(mOptions.WaveMapOffset1.Y >= 1.0f || mOptions.WaveMapOffset1.Y <= -1.0f)<br /> mOptions.WaveMapOffset1.Y = 0.0f;<br />}<br /><br /><span style="color:blue;">public override void </span>Draw(<span style="COLOR: rgb(43,145,175)">GameTime </span>gameTime)<br />{<br />UpdateEffectParams();<br /><br />Game.GraphicsDevice.Indices = mIndexBuffer;<br />Game.GraphicsDevice.Vertices[0].SetSource(mVertexBuffer, 0, <span style="COLOR: rgb(43,145,175)">VertexPositionTexture</span>.SizeInBytes);<br />Game.GraphicsDevice.VertexDeclaration = mDecl;<br 
/>mEffect.Begin(<span style="COLOR: rgb(43,145,175)">SaveStateMode</span>.None);<br /><br /><span style="color:blue;">foreach </span>(<span style="COLOR: rgb(43,145,175)">EffectPass </span>pass <span style="color:blue;">in </span>mEffect.CurrentTechnique.Passes)<br />{<br /> pass.Begin();<br /> Game.GraphicsDevice.DrawIndexedPrimitives(<span style="COLOR: rgb(43,145,175)">PrimitiveType</span>.TriangleList, 0, 0, mNumVertices, 0, mNumTris);<br /> pass.End();<br />}<br /><br />mEffect.End();<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Set the ViewProjection matrix and position of the Camera.<br /></span><span style="color:gray;">/// </summary><br />/// <param name="viewProj"></param><br />/// <param name="pos"></param><br /></span><span style="color:blue;">public void </span>SetCamera(<span style="COLOR: rgb(43,145,175)">Matrix </span>viewProj, <span style="COLOR: rgb(43,145,175)">Vector3 </span>pos)<br />{<br />mViewProj = viewProj;<br />mViewPos = pos;<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Updates the reflection and refraction maps. 
Called<br /></span><span style="color:gray;">/// </span><span style="color:green;">on update.<br /></span><span style="color:gray;">/// </summary><br />/// <param name="gameTime"></param><br /></span><span style="color:blue;">public void </span>UpdateWaterMaps(<span style="COLOR: rgb(43,145,175)">GameTime </span>gameTime)<br />{<br /><span style="color:green;">/*------------------------------------------------------------------------------------------<br />* Render to the Reflection Map<br />*/<br />//clip objects below the water line, and render the scene upside down<br /></span>GraphicsDevice.RenderState.CullMode = <span style="COLOR: rgb(43,145,175)">CullMode</span>.CullClockwiseFace;<br /><br />GraphicsDevice.SetRenderTarget(0, mReflectionMap);<br />GraphicsDevice.Clear(<span style="COLOR: rgb(43,145,175)">ClearOptions</span>.Target <span style="COLOR: rgb(43,145,175)">ClearOptions</span>.DepthBuffer, mOptions.WaterColor, 1.0f, 0);<br /><br /><span style="color:green;">//reflection plane in local space<br /></span><span style="COLOR: rgb(43,145,175)">Vector4 </span>waterPlaneL = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector4</span>(0.0f, -1.0f, 0.0f, 0.0f);<br /><br /><span style="COLOR: rgb(43,145,175)">Matrix </span>wInvTrans = <span style="COLOR: rgb(43,145,175)">Matrix</span>.Invert(mWorld);<br />wInvTrans = <span style="COLOR: rgb(43,145,175)">Matrix</span>.Transpose(wInvTrans);<br /><br /><span style="color:green;">//reflection plane in world space<br /></span><span style="COLOR: rgb(43,145,175)">Vector4 </span>waterPlaneW = <span style="COLOR: rgb(43,145,175)">Vector4</span>.Transform(waterPlaneL, wInvTrans);<br /><br /><span style="COLOR: rgb(43,145,175)">Matrix </span>wvpInvTrans = <span style="COLOR: rgb(43,145,175)">Matrix</span>.Invert(mWorld * mViewProj);<br />wvpInvTrans = <span style="COLOR: rgb(43,145,175)">Matrix</span>.Transpose(wvpInvTrans);<br /><br /><span style="color:green;">//reflection plane in 
homogeneous space<br /></span><span style="COLOR: rgb(43,145,175)">Vector4 </span>waterPlaneH = <span style="COLOR: rgb(43,145,175)">Vector4</span>.Transform(waterPlaneL, wvpInvTrans);<br /><br />GraphicsDevice.ClipPlanes[0].IsEnabled = <span style="color:blue;">true</span>;<br />GraphicsDevice.ClipPlanes[0].Plane = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Plane</span>(waterPlaneH);<br /><br /><span style="COLOR: rgb(43,145,175)">Matrix </span>reflectionMatrix = <span style="COLOR: rgb(43,145,175)">Matrix</span>.CreateReflection(<span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Plane</span>(waterPlaneW));<br /><br /><span style="color:blue;">if </span>(mDrawFunc != <span style="color:blue;">null</span>)<br /> mDrawFunc(reflectionMatrix);<br /><br />GraphicsDevice.RenderState.CullMode = <span style="COLOR: rgb(43,145,175)">CullMode</span>.CullCounterClockwiseFace;<br /><br />GraphicsDevice.SetRenderTarget(0, <span style="color:blue;">null</span>);<br /><br /><br /><span style="color:green;">/*------------------------------------------------------------------------------------------<br />* Render to the Refraction Map<br />*/<br /><br />//if the application is going to send us the refraction map<br />//exit early. 
The refraction map must be given to the water component<br />//before it renders<br /></span><span style="color:blue;">if </span>(mGrabRefractionFromFB)<br />{<br /> GraphicsDevice.ClipPlanes[0].IsEnabled = <span style="color:blue;">false</span>;<br /> <span style="color:blue;">return</span>;<br />}<br /><br /><span style="color:green;">//update the refraction map, clip objects above the water line<br />//so we don't get artifacts<br /></span>GraphicsDevice.SetRenderTarget(0, mRefractionMap);<br />GraphicsDevice.Clear(<span style="COLOR: rgb(43,145,175)">ClearOptions</span>.Target | <span style="COLOR: rgb(43,145,175)">ClearOptions</span>.DepthBuffer, mOptions.WaterColor, 1.0f, 1);<br /><br /><span style="color:green;">//refraction plane in local space<br /></span>waterPlaneL.W = 2.5f;<br /><br /><span style="color:green;">//if we're below the water line, don't perform clipping.<br />//this allows us to see the distorted objects from under the water<br /></span><span style="color:blue;">if </span>(mViewPos.Y < mWorld.Translation.Y)<br />{<br /> GraphicsDevice.ClipPlanes[0].IsEnabled = <span style="color:blue;">false</span>;<br />}<br /><br /><span style="color:blue;">if </span>(mDrawFunc != <span style="color:blue;">null</span>)<br /> mDrawFunc(<span style="COLOR: rgb(43,145,175)">Matrix</span>.Identity);<br /><br />GraphicsDevice.ClipPlanes[0].IsEnabled = <span style="color:blue;">false</span>;<br /><br />GraphicsDevice.SetRenderTarget(0, <span style="color:blue;">null</span>);<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Updates effect parameters related to the water shader.<br /></span><span style="color:gray;">/// </summary><br /></span><span style="color:blue;">private void </span>UpdateEffectParams()<br />{<br /><span style="color:green;">//update the reflection and refraction textures<br /></span>mEffect.Parameters[<span style="COLOR: 
rgb(163,21,21)">"ReflectMap"</span>].SetValue(mReflectionMap.GetTexture());<br />mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"RefractMap"</span>].SetValue(mRefractionMap.GetTexture());<br /><br /><span style="color:green;">//normal map offsets<br /></span>mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"WaveMapOffset0"</span>].SetValue(mOptions.WaveMapOffset0);<br />mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"WaveMapOffset1"</span>].SetValue(mOptions.WaveMapOffset1);<br /><br />mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"WorldViewProj"</span>].SetValue(mWorld * mViewProj);<br /><br />mEffect.Parameters[<span style="COLOR: rgb(163,21,21)">"EyePos"</span>].SetValue(mViewPos);<br />}<br /><br /><span style="color:gray;">/// <summary><br />/// </span><span style="color:green;">Generates a grid of vertices to use for the water plane.<br /></span><span style="color:gray;">/// </summary><br />/// <param name="numVertRows"></span><span style="color:green;">Number of rows. Must be 2^n + 1. Ex. 129, 257, 513.</span><span style="color:gray;"></param><br />/// <param name="numVertCols"></span><span style="color:green;">Number of columns. Must be 2^n + 1. Ex. 
129, 257, 513.</span><span style="color:gray;"></param><br />/// <param name="dx"></span><span style="color:green;">Cell spacing in the x dimension.</span><span style="color:gray;"></param><br />/// <param name="dz"></span><span style="color:green;">Cell spacing in the z dimension.</span><span style="color:gray;"></param><br />/// <param name="center"></span><span style="color:green;">Center of the plane.</span><span style="color:gray;"></param><br />/// <param name="verts"></span><span style="color:green;">Outputs the constructed vertices for the plane.</span><span style="color:gray;"></param><br />/// <param name="indices"></span><span style="color:green;">Outputs the constructed triangle indices for the plane.</span><span style="color:gray;"></param><br /></span><span style="color:blue;">private void </span>GenTriGrid(<span style="color:blue;">int </span>numVertRows, <span style="color:blue;">int </span>numVertCols, <span style="color:blue;">float </span>dx, <span style="color:blue;">float </span>dz,<br /> <span style="COLOR: rgb(43,145,175)">Vector3 </span>center, <span style="color:blue;">out </span><span style="COLOR: rgb(43,145,175)">Vector3</span>[] verts, <span style="color:blue;">out int</span>[] indices)<br />{<br /><span style="color:blue;">int </span>numVertices = numVertRows * numVertCols;<br /><span style="color:blue;">int </span>numCellRows = numVertRows - 1;<br /><span style="color:blue;">int </span>numCellCols = numVertCols - 1;<br /><br /><span style="color:blue;">int </span>mNumTris = numCellRows * numCellCols * 2;<br /><br /><span style="color:blue;">float </span>width = (<span style="color:blue;">float</span>)numCellCols * dx;<br /><span style="color:blue;">float </span>depth = (<span style="color:blue;">float</span>)numCellRows * dz;<br /><br /><span style="color:green;">//===========================================<br />// Build vertices.<br /><br />// We first build the grid geometry centered about the origin and on<br />// the xz-plane, 
row-by-row and in a top-down fashion. We then translate<br />// the grid vertices so that they are centered about the specified<br />// parameter 'center'.<br /><br />//verts.resize(numVertices);<br /></span>verts = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector3</span>[numVertices];<br /><br /><span style="color:green;">// Offsets to translate grid from quadrant 4 to center of<br />// coordinate system.<br /></span><span style="color:blue;">float </span>xOffset = -width * 0.5f;<br /><span style="color:blue;">float </span>zOffset = depth * 0.5f;<br /><br /><span style="color:blue;">int </span>k = 0;<br /><span style="color:blue;">for </span>(<span style="color:blue;">float </span>i = 0; i < numVertRows; ++i)<br />{<br /> <span style="color:blue;">for </span>(<span style="color:blue;">float </span>j = 0; j < numVertCols; ++j)<br /> {<br /> <span style="color:green;">// Negate the depth coordinate to put in quadrant four.<br /> // Then offset to center about coordinate system.<br /> </span>verts[k] = <span style="color:blue;">new </span><span style="COLOR: rgb(43,145,175)">Vector3</span>(0, 0, 0);<br /> verts[k].X = j * dx + xOffset;<br /> verts[k].Z = -i * dz + zOffset;<br /> verts[k].Y = 0.0f;<br /><br /> <span style="COLOR: rgb(43,145,175)">Matrix </span>translation = <span style="COLOR: rgb(43,145,175)">Matrix</span>.CreateTranslation(center);<br /> verts[k] = <span style="COLOR: rgb(43,145,175)">Vector3</span>.Transform(verts[k], translation);<br /><br /> ++k; <span style="color:green;">// Next vertex<br /> </span>}<br />}<br /><br /><span style="color:green;">//===========================================<br />// Build indices.<br /><br />//indices.resize(mNumTris * 3);<br /></span>indices = <span style="color:blue;">new int</span>[mNumTris * 3];<br /><br /><span style="color:green;">// Generate indices for each quad.<br /></span>k = 0;<br /><span style="color:blue;">for </span>(<span style="color:blue;">int </span>i = 0; i < 
numCellRows; ++i)<br />{<br /> <span style="color:blue;">for </span>(<span style="color:blue;">int </span>j = 0; j < numCellCols; ++j)<br /> {<br /> indices[k] = i * numVertCols + j;<br /> indices[k + 1] = i * numVertCols + j + 1;<br /> indices[k + 2] = (i + 1) * numVertCols + j;<br /><br /> indices[k + 3] = (i + 1) * numVertCols + j;<br /> indices[k + 4] = i * numVertCols + j + 1;<br /> indices[k + 5] = (i + 1) * numVertCols + j + 1;<br /><br /> <span style="color:green;">// next quad<br /> </span>k += 6;<br /> }<br />}<br />}<br />}<br />}</pre><span style="FONT-WEIGHT: bold"><br />Water.fx shader:</span><br /><pre class="mycode"><span style="color:green;">//Water effect shader that uses reflection and refraction maps projected onto the water.<br />//These maps are distorted based on the two scrolling normal maps.<br /><br /></span><span style="color:blue;">float4x4 </span>World;<br /><span style="color:blue;">float4x4 </span>WorldViewProj;<br /><br /><span style="color:blue;">float4 </span>WaterColor;<br /><span style="color:blue;">float3 </span>SunDirection;<br /><span style="color:blue;">float4 </span>SunColor;<br /><span style="color:blue;">float </span>SunFactor; <span style="color:green;">//the intensity of the sun specular term.<br /></span><span style="color:blue;">float </span>SunPower; <span style="color:green;">//how shiny we want the sun specular term on the water to be.<br /></span><span style="color:blue;">float3 </span>EyePos;<br /><br /><span style="color:green;">// Texture coordinate offset vectors for scrolling<br />// normal maps.<br /></span><span style="color:blue;">float2 </span>WaveMapOffset0;<br /><span style="color:blue;">float2 </span>WaveMapOffset1;<br /><br /><span style="color:green;">// Two normal maps and the reflection/refraction maps<br /></span><span style="color:blue;">texture </span>WaveMap0;<br /><span style="color:blue;">texture </span>WaveMap1;<br /><span style="color:blue;">texture </span>ReflectMap;<br /><span 
style="color:blue;">texture </span>RefractMap;<br /><br /><span style="color:green;">//scale used on the wave maps<br /></span><span style="color:blue;">float </span>TexScale;<br /><br /><span style="color:blue;">static const float </span>R0 = 0.02037f;<br /><br /><span style="color:blue;">sampler </span>WaveMapS0 = <span style="color:blue;">sampler_state<br /></span>{<br />Texture = <WaveMap0>;<br /><span style="color:blue;">MinFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MagFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MipFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">AddressU </span>= <span style="color:navy;">WRAP</span>;<br /><span style="color:blue;">AddressV </span>= <span style="color:navy;">WRAP</span>;<br />};<br /><br /><span style="color:blue;">sampler </span>WaveMapS1 = <span style="color:blue;">sampler_state<br /></span>{<br />Texture = <WaveMap1>;<br /><span style="color:blue;">MinFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MagFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MipFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">AddressU </span>= <span style="color:navy;">WRAP</span>;<br /><span style="color:blue;">AddressV </span>= <span style="color:navy;">WRAP</span>;<br />};<br /><br /><span style="color:blue;">sampler </span>ReflectMapS = <span style="color:blue;">sampler_state<br /></span>{<br />Texture = <ReflectMap>;<br /><span style="color:blue;">MinFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MagFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MipFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">AddressU </span>= <span style="color:navy;">CLAMP</span>;<br /><span 
style="color:blue;">AddressV </span>= <span style="color:navy;">CLAMP</span>;<br />};<br /><br /><span style="color:blue;">sampler </span>RefractMapS = <span style="color:blue;">sampler_state<br /></span>{<br />Texture = <RefractMap>;<br /><span style="color:blue;">MinFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MagFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">MipFilter </span>= <span style="color:navy;">LINEAR</span>;<br /><span style="color:blue;">AddressU </span>= <span style="color:navy;">CLAMP</span>;<br /><span style="color:blue;">AddressV </span>= <span style="color:navy;">CLAMP</span>;<br />};<br /><br /><span style="color:blue;">struct </span>OutputVS<br />{<br /><span style="color:blue;">float4 </span>posH : <span style="color:navy;">POSITION0</span>;<br /><span style="color:blue;">float3 </span>toEyeW : <span style="color:navy;">TEXCOORD0</span>;<br /><span style="color:blue;">float2 </span>tex0 : <span style="color:navy;">TEXCOORD1</span>;<br /><span style="color:blue;">float2 </span>tex1 : <span style="color:navy;">TEXCOORD2</span>;<br /><span style="color:blue;">float4 </span>projTexC : <span style="color:navy;">TEXCOORD3</span>;<br /><span style="color:blue;">float4 </span>pos : <span style="color:navy;">TEXCOORD4</span>;<br />};<br /><br />OutputVS WaterVS( <span style="color:blue;">float3 </span>posL : <span style="color:navy;">POSITION0</span>,<br /> <span style="color:blue;">float2 </span>texC : <span style="color:navy;">TEXCOORD0</span>)<br />{<br /><span style="color:green;">// Zero out our output.<br /></span>OutputVS outVS = (OutputVS)0;<br /><br /><span style="color:green;">// Transform vertex position to world space.<br /></span><span style="color:blue;">float3 </span>posW = <span style="color:blue;">mul</span>(<span style="color:blue;">float4</span>(posL, 1.0f), World).xyz;<br />outVS.pos.xyz = posW;<br />outVS.pos.w = 1.0f;<br /><br /><span 
style="color:green;">// Compute the vector from the eye to the vertex (normalized in the pixel shader).<br /></span>outVS.toEyeW = posW - EyePos;<br /><br /><span style="color:green;">// Transform to homogeneous clip space.<br /></span>outVS.posH = <span style="color:blue;">mul</span>(<span style="color:blue;">float4</span>(posL, 1.0f), WorldViewProj);<br /><br /><span style="color:green;">// Scroll texture coordinates.<br /></span>outVS.tex0 = (texC * TexScale) + WaveMapOffset0;<br />outVS.tex1 = (texC * TexScale) + WaveMapOffset1;<br /><br /><span style="color:green;">// Generate projective texture coordinates from camera's perspective.<br /></span>outVS.projTexC = outVS.posH;<br /><br /><span style="color:green;">// Done--return the output.<br /></span><span style="color:blue;">return </span>outVS;<br />}<br /><br /><span style="color:blue;">float4 </span>WaterPS( <span style="color:blue;">float3 </span>toEyeW : <span style="color:navy;">TEXCOORD0</span>,<br /> <span style="color:blue;">float2 </span>tex0 : <span style="color:navy;">TEXCOORD1</span>,<br /> <span style="color:blue;">float2 </span>tex1 : <span style="color:navy;">TEXCOORD2</span>,<br /> <span style="color:blue;">float4 </span>projTexC : <span style="color:navy;">TEXCOORD3</span>,<br /> <span style="color:blue;">float4 </span>pos : <span style="color:navy;">TEXCOORD4</span>) : <span style="color:navy;">COLOR<br /></span>{<br />projTexC.xyz /= projTexC.w;<br />projTexC.x = 0.5f*projTexC.x + 0.5f;<br />projTexC.y = -0.5f*projTexC.y + 0.5f;<br />projTexC.z = .1f / projTexC.z;<br /><br />toEyeW = <span style="color:blue;">normalize</span>(toEyeW);<br />SunDirection = <span style="color:blue;">normalize</span>(SunDirection);<br /><br /><span style="color:green;">// Light vector is opposite the direction of the light.<br /></span><span style="color:blue;">float3 </span>lightVecW = -SunDirection;<br /><br /><span style="color:green;">// Sample normal maps.<br /></span><span style="color:blue;">float3 </span>normalT0 = <span 
style="color:blue;">tex2D</span>(WaveMapS0, tex0);<br /><span style="color:blue;">float3 </span>normalT1 = <span style="color:blue;">tex2D</span>(WaveMapS1, tex1);<br /><br /><span style="color:green;">//unroll the normals retrieved from the normalmaps<br /></span>normalT0.yz = normalT0.zy;<br />normalT1.yz = normalT1.zy;<br /><br />normalT0 = 2.0f*normalT0 - 1.0f;<br />normalT1 = 2.0f*normalT1 - 1.0f;<br /><br /><span style="color:blue;">float3 </span>normalT = <span style="color:blue;">normalize</span>(0.5f*(normalT0 + normalT1));<br /><span style="color:blue;">float3 </span>n1 = <span style="color:blue;">float3</span>(0,1,0); <span style="color:green;">//we'll just use the y unit vector for spec reflection.<br /><br />//get the reflection vector from the eye<br /></span><span style="color:blue;">float3 </span>R = <span style="color:blue;">normalize</span>(<span style="color:blue;">reflect</span>(toEyeW,normalT));<br /><br /><span style="color:blue;">float4 </span>finalColor;<br />finalColor.a = 1;<br /><br /><span style="color:green;">//compute the fresnel term to blend reflection and refraction maps<br /></span><span style="color:blue;">float </span>ang = <span style="color:blue;">saturate</span>(<span style="color:blue;">dot</span>(-toEyeW,n1));<br /><span style="color:blue;">float </span>f = R0 + (1.0f-R0) * <span style="color:blue;">pow</span>(1.0f-ang,5.0);<br /><br /><span style="color:green;">//also blend based on distance<br /></span>f = <span style="color:blue;">min</span>(1.0f, f + 0.007f * EyePos.y);<br /><br /><span style="color:green;">//compute the reflection from sunlight, hacked in color, should be a variable<br /></span><span style="color:blue;">float </span>sunFactor = SunFactor;<br /><span style="color:blue;">float </span>sunPower = SunPower;<br /><br /><span style="color:blue;">if</span>(EyePos.y < pos.y)<br />{<br />sunFactor = 7.0f; <span style="color:green;">//these could also be sent to the shader<br /></span>sunPower = 55.0f;<br />}<br 
/><span style="color:blue;">float3 </span>sunlight = sunFactor * <span style="color:blue;">pow</span>(<span style="color:blue;">saturate</span>(<span style="color:blue;">dot</span>(R, lightVecW)), sunPower) * SunColor;<br /><br /><span style="color:blue;">float4 </span>refl = <span style="color:blue;">tex2D</span>(ReflectMapS, projTexC.xy + projTexC.z * normalT.xz);<br /><span style="color:blue;">float4 </span>refr = <span style="color:blue;">tex2D</span>(RefractMapS, projTexC.xy - projTexC.z * normalT.xz);<br /><br /><span style="color:green;">//only use the refraction map if we're under water<br /></span><span style="color:blue;">if</span>(EyePos.y < pos.y)<br />f = 0.0f;<br /><br /><span style="color:green;">//interpolate the reflection and refraction maps based on the fresnel term and add the sunlight<br /></span>finalColor.rgb = WaterColor * <span style="color:blue;">lerp</span>( refr, refl, f) + sunlight;<br /><br /><span style="color:blue;">return </span>finalColor;<br />}<br /><br /><span style="color:blue;">technique </span>WaterTech<br />{<br /><span style="color:blue;">pass </span>Pass1<br />{<br /><span style="color:green;">// Specify the vertex and pixel shader associated with this pass.<br /></span>vertexShader = <span style="color:blue;">compile </span><span style="color:purple;">vs_2_0 </span>WaterVS();<br />pixelShader = <span style="color:blue;">compile </span><span style="color:purple;">ps_2_0 </span>WaterPS();<br /><br /><span style="color:gray;">CullMode </span>= None;<br />}<br />}</pre><br /><span style="color:#ff6600;"><strong>Edit:</strong> A demo of the water component is now available.</span><br /><iframe style="BORDER-RIGHT: #dde5e9 1px solid; PADDING-RIGHT: 0px; BORDER-TOP: #dde5e9 1px solid; PADDING-LEFT: 0px; PADDING-BOTTOM: 0px; MARGIN: 3px; BORDER-LEFT: #dde5e9 1px solid; WIDTH: 240px; PADDING-TOP: 0px; BORDER-BOTTOM: #dde5e9 1px solid; HEIGHT: 66px; BACKGROUND-COLOR: #ffffff" marginwidth="0" marginheight="0" 
src="http://cid-b80a3031b5bfa52b.skydrive.live.com/embedrowdetail.aspx/Public/WaterComponentDemo.zip" frameborder="0" scrolling="no"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com31tag:blogger.com,1999:blog-1023441640234597436.post-23406446521166517872008-11-07T01:20:00.003-05:002008-11-07T01:22:48.676-05:00Scientific Visualization<p>This semester I've been taking a class in scientific visualization. It's pretty interesting and covers a lot of techniques: color visualization, human vision and color perception, contours, isosurfacing, volume rendering, flow visualization (stream lines and stream surfaces), and texture-based methods.</p> <p>There are good and bad parts to this course. The bad part is that it is a fairly new graduate course with no prerequisites; the professor is just trying to build up interest in it. This means we don't actually code the different algorithms ourselves, but instead use a visualization framework, VTK, driven from C++/Java/Python/Tcl, to implement the techniques. 
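</p>

<p>To give a flavor of what these techniques boil down to, here is a toy sketch in Python — not VTK's actual implementation, and the function name iso_crossing is made up for illustration — of the edge-interpolation step that both contouring (marching squares) and isosurfacing (marching cubes) rely on:</p>

```python
# Toy sketch of the edge-interpolation step shared by contouring
# (marching squares) and isosurfacing (marching cubes): given the
# scalar values at the two endpoints of a cell edge, find the point
# where the scalar field crosses the isovalue.
def iso_crossing(p0, p1, v0, v1, iso):
    """Linearly interpolate the point on segment p0-p1 where the
    scalar field (v0 at p0, v1 at p1) equals iso."""
    t = (iso - v0) / (v1 - v0)
    return tuple(a + t * (b - a) for a, b in zip(p0, p1))

# A contour at iso = 0.5 crosses this edge a quarter of the way along.
print(iso_crossing((0.0, 0.0), (1.0, 0.0), 0.25, 1.25, 0.5))  # (0.25, 0.0)
```

<p>VTK wraps this step (plus the per-cell case tables around it) inside filters such as vtkContourFilter, which is why in class we only wire up pipelines rather than write the algorithms.</p>

<p>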
The good part is that this semester is one of the busiest I've had, so minimizing my work is a good thing :)</p> <p>Now on to some pictures.</p> <p><strong>Contours, Heightmaps:</strong></p> <p> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSpr1mWGAlgfzUnOQlBgSY2PF9WoFZIRJgEBWv6uL43jlg6EXAdji-yS3lwjzp-FK5r567DJ3g8MU1A8r70u6GEO1l5DtOsYZ5fOU9I6hp241RadXoMAejMlH_YLGiQ3o9Jg5ZSeRzGZkB/s1600-h/brain1%5B5%5D.png"><img style="border: 0px none ;" alt="brain1" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj07OSRfvbMX5LSXEuAkdEMb2ES8YcBZovXlySC0jMw6wstW94G9q7xS0BcBWdVMNiPsyQZHRMwiGcIj9RlZZUNVDBu9tqNmesNyhyAkUByulYslCkvmLTCZm5BBEYzbFDqMcea8LZv4PjE/?imgmax=800" border="0" height="224" width="200" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg5p0XzKNXNRdaBqSc3VU0iKM3z6ZC_V8NZbKzhJdzo5FchfputZEVKIWddBgeDi_J5DencA6qKvS7vtqMuuxtsYLmIw4Etd9UzL3gvPWKNPJC7rQLqswZSLETL9HaPw5DLJoEj9l3M-IID/s1600-h/brainHM2%5B6%5D.jpg"><img style="border: 0px none ;" alt="brainHM2" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgqsRLvp3cqOd3EUK6t0t-fqlHi1tOH3ubW8krYsn9b7yiGxwcT-JlbVUFBAO50JFBAiSqOHU3K0YL-1H69mvbjeJ-KNP3A3yb-pldqiITT3xbhbtBzVDq73TfyziSMAw37Ij5_wahdt7fj/?imgmax=800" border="0" height="193" width="200" /></a></p> <p><strong>Isosurfacing:</strong></p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi2ArYBzdxLuYPdnJEa496i_ls3yxy61DXd_TBEF4j_TCiZFJmxNU3zOkiquzOktWt-zj9ASx1p4m6cIIGmRpFXL02O2fC0EsWcTGuTzk4_mu8yjWVZ5skzQhyphenhyphenf_N4vxL1zeDBudq6PW3qC/s1600-h/2%5B9%5D.jpg"><img style="border: 0px none ;" alt="2" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXLbNzbg6TIzTBfr7dUtEKRr96Zf7GCfFDXj8yI8l2oxGfWGbDxWCjxQIKF0KhcQdTD9KxRC56O70E9YKTjk2w4G7GIlpaT2dvIPR_gB13kdl_vJ7w1e7niDVVwssxnp9wEB7y2meyFokd/?imgmax=800" border="0" height="195" width="200" /></a><a 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjo0rIODIf35cf9kFjjoqHCXLRXZwdt2lSz_3RsQoENicavBz-3UrnBL74zojzDah7mOiA_iQW-RTHVZPYhB09QLewWRQm9keWdvKeohiq_v_PITKczekunW3NU1OzR6Te_Ul_0LdEct7yB/s1600-h/5%5B7%5D.jpg"><img style="border: 0px none ;" alt="5" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj7lL_86DwqR_IdAidtSeTKjvYfqS4EJMScHNFHW-E6R4shJW0LocT7AuvpvdCr3TeLMXc1huMOEJHLy-ZX6ocMRj5xC_lGhFTfnrNXCe_aEQvaEJt2Du4F6pbrHbDsq26E7gw3sUnqjtiw/?imgmax=800" border="0" height="207" width="170" /></a> </p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-2g4gaN0ohPZOMa2M8J9fu-Va0FPqiI1iEhPiBBvh99GP3e8ehYpxCy-Iyag56M-wPunwRAClzvikK-0M6mnFWSSglUTp3KlUBmsqiD4RJX59XZ_4OlYX3JuCtjFKtg0xVRf9uyKnzvqV/s1600-h/2_0%5B5%5D.jpg"><img style="border: 0px none ;" alt="2_0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEUpBCaXUOphQZS6wC5OOGQ5jTzS6o7CEzEW3lbGPILthGXA7TnUywLzRdJvMyahf4Bry45rf6rYmBgUXKe_EuJYpYf-NlumcKCL-CQ5Ve7zO_1Erab7Fa1F2ssWtw5GZkUB0oAU9lq-OY/?imgmax=800" border="0" height="193" width="200" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgd3IdAxdPZoRJPFV8YtL5-u7qJWyN7ayt50YlRBD0Gix_uY8yZTuYmigwzSDOTNYxE01ete6i9jb_MKjvrv-jXpdS_PZhIrtkek3TmYx7aeXQRKQXT2Ge04TEhFJ1sRl-i93219souYB_o/s1600-h/6%5B5%5D.jpg"><img style="border: 0px none ;" alt="6" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQSC2GAZK9CwAMm_8upw3U7YJc2lKuzJxfGZM_fdgstizuu4Vg0xbmA7Qh16DIjV8D-5Cicbbo4d4XS_fcKo7u9NqawuAC1gc7wSKxFhhp_G-NrHoZdDWXduonL95FGPC7qaY2O7lPseTc/?imgmax=800" border="0" height="175" width="200" /></a> </p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjxZHIBT-tSXJpItWx6Weqwe-wAam1q9U7eg2prJBulYB9c76gd80VghpKPPk5PIX040P6U_vpYqe_NqpEiSCiJpwl9muStzflXladrYUCyIA7f3ATdGuvLN_Kh7lf2Ncl5MzXJucJOIymY/s1600-h/7%5B5%5D.jpg"><img style="border: 0px none ;" alt="7" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiBNIoobzk-9NkTSF2Alt1eSoj5cpT6Lw0_OwkhAPi4Q4DO7bTzd7cO_C1P2uYPvSwgBtZWqQpE0eyBdfJNdjom_798xEE0b3eztUgfWf5B2UOVF04Ljlh8MAUP5ArwVMPQNhFdCHpUaEdc/?imgmax=800" border="0" height="171" width="200" /></a></p> <p><strong>Volume Rendering:</strong></p> <p> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgYXhHxoI3-ekYxdtyqcgi8IWk6qZ0yvRJZa3_eD9AG-vY1KbHR5u-0u_UYB09QxgIJQdprkLepw6lxdrRnUd5Rae7AFdEHAyrojYciGAlci3sWUcuJnrHWZA9ZMUaKLI81Io5aQwaxZwom/s1600-h/dist_0_25%5B5%5D.jpg"><img style="border: 0px none ;" alt="dist_0_25" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSxqsYEYNNpGsYl4vzx3Pb_SeYKTPgz7GXsuXA-7B01w6hr2ENwfs7-ZeQZFN4mv-zc6Y0oBa_aQA0mY6THKMHBG4Nrii4S1ooTCppFqE-zbt7BzgusEydwVtFr0OyHDquXjwtIN6UmOE1/?imgmax=800" border="0" height="226" width="200" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgs6SAe304MsOBXB4ig5Eu9QHTVHkZ1nrZFG2-a6gipJBZGME0ZynMFfOT4wq6lqNv-SyIZ8iN2yWnbcFFABj1FyDjpIwbp9eg5NapigzQWXMcfB0p7zbeJBzsF1AprfAb0fgv_IMqlo80t/s1600-h/vol1_inv%5B5%5D.jpg"><img style="border: 0px none ;" alt="vol1_inv" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQJSJnIdb15xOlZU1cKc_YewLhUUr3PE9nFzA4aXzESQNnVGm0sYK_3LcYiQoGnDDLaGfxMOnV7ARVA_5CDfa3GoQ98wesDsLCqTL9WoyUqMATlLJ4xr85CKjMI5sFCeKtD8sjZWWW-zgs/?imgmax=800" border="0" height="227" width="200" /></a></p> <p> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhf-uzsjwgM_bZYECy8NwMZr5uUP-SeJwFeSJwO5mFV1nms0CMz1h1xW1fWPJaG50mRVBnvk0As3o1TzEgvjLWCB3yfryK0k2HV79RFaTGK_fLPvJ-cliDk_vhVWRA4L5iTu947pL54wUQx/s1600-h/mip1%5B4%5D.png"><img style="border: 0px none ;" alt="mip1" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiDqLYAuW9Oo7A-3JwmGeKKMIj5V0f9Oh4SpEAQ3aN2pmQVmQkyvEVQFuNE-XRPydpw3ul5fj5-VT3Hy-Fn-gQJKCjMRopQTvpG8LHRNTxbuY1WJHdpwQFihE7Y2IKmirzB_Lxvvd9SeilC/?imgmax=800" border="0" height="240" width="188" /></a> </p> <p><strong>Glyphs, Stream 
lines, stream surfaces:</strong></p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEji64nmuCVelNVzclEpV7tOBQsxtwPKYGmqAca3svlAzf4EHj7FQiZH1VfL25AU2oUqJgxRHkmUrp_0HwP9Mwtm50E6l1qxwDuRd5j79PJrIfVZIjc8QbtBc6Q65jkdhx2vO_n7AUqqyGPG/s1600-h/glyph2%5B5%5D.png"><img style="border: 0px none ;" alt="glyph2" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgOl-c70-kyU2XtpwMzu9gggb7Z7BtZ-drPpvFAVHSj3ujHGEMbZaDD8FW02GJZ603ksqBPgSVkNx_5IRHXkqT-bPjZYX9lp1XE2kBagB1-NQCHh0-A0eI9avCb3vT_cmllvm4HOrmMbQg5/?imgmax=800" border="0" height="154" width="200" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhg71a_V6ooAwYu5oVb-gZqrny2aX7ZJAFrP1QPOc5rdhyD9tEEoo5WEgrCHNhITrJsWRkT70EsYRk7huUmK0RCkTy2VvCo5vGnO_KolNnApkuKClKqkD_dIv9RcmIONRG4EiMEbtyMgTL4/s1600-h/glyph0%5B5%5D.png"><img style="border: 0px none ;" alt="glyph0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiyI79M7M9u_Te5d7eeyYwuiIprhf4bSUDLzuUbqT5gKjSS3x9cfqKHjHVwnh_ANwbaofyj5YXBSQeLoHYTykr54E_5oNH4itR9ciN7gJqI4_jkmijLgZS5eAg1CXawOYx3lUKzpycpt6oV/?imgmax=800" border="0" height="154" width="200" /></a> </p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgR-dOqf2gW97AHmroP3dwix1QLfbHUE9erZCQGWGKEZvQaMjqmQBvl2yhx-y5OTmJYvUxk_FaPR11-rFzJrv32rP6ZuHyg5QfU-ksmGXXY7fknmvCzKj4BorzNrHaNkdNaJOeWBl45Z4TB/s1600-h/streamlines0%5B5%5D.png"><img style="border: 0px none ;" alt="streamlines0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhb113MMjRSNKvdiI_rjARVrDfXzIxI_ToBPnOw4lnpfq6Gia-saOTErD_cur2GVdpP4lBhfF6AI-WIjyq_HAUE_1cNvhuCS081BevhKauh9Oho8GY9FpdmlIP-3Xa5Y6sAMnPK7EBqkmVw/?imgmax=800" border="0" height="154" width="200" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh_Vh4A1X3qBoO71CKHYT_fl8Gtyt5oCT0cCwmS_fKRcIvhsrXHIkGyt1uzRFR4JT5zZWYDG0Bshjwv2uHwnz0NjVD8XGy9djmRQOq7gQIVSswrI7rt82PIV1KNi4oYsIJb8nUr6IxeBFYB/s1600-h/streamlines3%5B6%5D.png"><img style="border: 0px none ;" alt="streamlines3" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjRX0EBuo5wagUU0C8mETITMpb_x2BwjrIRlyeMH8NqKsCRmDt9cO-jONS0klBZpYzwhTcS7LoAt-3mH0lFYtmLMIe6QHUaMjvWX6qUM_7gfilr36XBxZpEykHR-9VDp3JXiIdz0boJRmfi/?imgmax=800" border="0" height="154" width="200" /></a> </p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjCzuNO7fewnsJUKtEai_0nHKkrB01uczYLxQgQmwCu3QYnG3uBwZHlHMtjRmKX5JJ2kS6kFy71Yd3OJv7Ix4ui1GifMyJwWdTxBHa7-SMwWerUFO5lbMgKzYAs1EnWiH60-2d2VrEy4Io1/s1600-h/streamtubes0%5B5%5D.png"><img style="border: 0px none ;" alt="streamtubes0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiB6mh2vLjfJHElf7L86dAPPYS-iHHN8iyDF0xXzgSWiq2_CxznkdoktH7QAuZkEaQT587PqI13V_A2C69fes0sNR7MwamEGv9ZC_u7OMHKHMrEqqz0B2qsMjZSVuhSJV8eK_mjOZEdLGPy/?imgmax=800" border="0" height="154" width="200" /></a> <a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9afM-NThkPwfnbM2wBPWp2ITNXWIE-zhoQDSk0bX8mDW6vnGtpnBPHenr3O4Zt62WLzw4GczFEccdk6gD7P4CvHOw_nhAV5ZrFA7BhW2V8kRJ8eZLQlJ5qs5WlQ22RKy126SLPIPlc-YX/s1600-h/streamtubes5%5B5%5D.png"><img style="border: 0px none ;" alt="streamtubes5" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjbjOK27L_nP_frJ0WTWAImhyQ0rgJc2aS5j4dhgp4cvKA1iSctcVQplFOH02U3cxVFqdLfWLYa6330OBTlkMDHlkTdO7HfHhQDBwv34GeafpW1AwqpmhleN3dgWFLhQ-AAYtvTBtnZGEIf/?imgmax=800" border="0" height="154" width="200" /></a> </p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_FE3n0MxstaN0GnZrLvrDgLkI5e9-zfkMyByWju8qTeEFpSqZgkzSsCi1PiPq4ifIoaj2Fzl8U6y3MP1n_PG2qDw1ZmsaEwQnu4Ufj7rJ8Tj5OL6ISkeqL2gBIEFueGqykx0QJZ4GZVYA/s1600-h/streamsurface0%5B5%5D.png"><img style="border: 0px none ;" alt="streamsurface0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhPXzlOQnO7ByAIYMT9cZBdHeYCcTZYC86TUvOQ-qEXb14ZOBb1qbymKEA-a0mHMT2nzsYled-M3vXzpUD0tqia_IgY4QhEcI9PJQn712WK9A_NlNXrP9QOBJCeSoHAbz1qxQFt4xJ-CGA9/?imgmax=800" border="0" height="154" width="200" /></a> <a 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjp2p_OPg0UiO-mdSt191BbTdxKJY6CD2Q9KQwecGKIKKu228Gf8K2aC9kcThP73jK1DQxmnGcY3HlnLwxB1l3-cUqK-jU6bYjdmBzJNzbIZmnKzv4xtUEs5T3dLhZOHQDobxtVC9Gt0sDf/s1600-h/streamsurface5%5B5%5D.png"><img style="border: 0px none ;" alt="streamsurface5" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXPTyuWyOUsTVZWOFJeRg0K9urARMXKVWXXHzQdSccGVbW3ykAa6-HtsK1UAUhGdUjprAoeWvghamDh3CacfbA5BJhCinrfNtPhZWGUc4UZvGz_tk30q48XCivhbaKKf-m_-ozf-B-MznN/?imgmax=800" border="0" height="154" width="200" /></a> </p> <p><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCmskbAof8OQXoRtiMJ1cV95QcNr1yscUk16FQskSK5xyhykCdVM3cuZPy3cz2ZjNwBnwRu8G_rvVo5vNp21MwL8taYMdqXLNjs2RsT6pQqkbjmcYYBiXx2jM6F9FneRo4_GjuN4jHMvps/s1600-h/streamsurface3%5B5%5D.png"><img style="border: 0px none ;" alt="streamsurface3" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhQlhhzXn5a1EctXjS60cBR6WLnHGCF2c-_E1FnwuXrMWla47OAlegFYKFJ93gxyVkMPBLYbvXSbswta37Ql2NEznfZcfEDYudpSmpOxTdmQOPz0KU6xgHUcfs1mjtbUJTJEhf9_7S3dS11/?imgmax=800" border="0" height="154" width="200" /></a></p>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com3tag:blogger.com,1999:blog-1023441640234597436.post-24506257633490971432008-10-07T15:38:00.007-04:002008-10-10T18:45:46.271-04:00Interviews, research, and future postsI don't have a new sample/tutorial today. I have been very busy with interviews, research, and school as of late.<br /><br />As this is my last year of school, I've been interviewing with various companies and trying to get an on-line portfolio together. I recently had a great time in Portland meeting with Intel.<br /><br />Last spring I was involved in the research of using non-pinhole impostors for reflections and refractions. We submitted to Eurographics (specifically EGSR), but unfortunately didn't get accepted. 
So this semester we are looking to work on the shortcomings that some of the reviewers noted and resubmit to I3D. So until I get that out of the way there probably won't be any new samples/tutorials for a couple of weeks.<br /><br />As for future posts, I've been working on rain as a particle system and as a post process (see Tatarchuk's AMD/ATi paper on Rain). Besides this I've been wanting to have a series of samples/tutorials on different lighting methods. We all know Gouraud and [Blinn-]Phong shading, but I wanted to cover other methods such as Cook-Torrance, Oren-Nayar, Ward lighting and others. I also might do a tutorial on Depth Impostors that would build off of the Billboard Impostor tutorial I wrote earlier in the year.<br /><br />For now, here's a couple of teaser images that we submitted to EGSR. You're looking through a glass bunny's ear. You can see that with regular depth impostors, you are missing a significant amount of data for the teapot lid.<br /><br />Planar Pinhole Camera Depth Impostor<br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh21uf7eSCf1dI-gTJGvuahFx0PsMkZfrmo3N32J2s2and5kFCm4CyeZnw6pP87SwjRCl-JhFcr-smXtE2M7GmdjaOj_jYEFGZN4jiQEutcKqQ5wd2PumodNzNy6ALaxfypeRZMqrPrrTje/s1600-h/Picture1.png"><img style="cursor: pointer;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh21uf7eSCf1dI-gTJGvuahFx0PsMkZfrmo3N32J2s2and5kFCm4CyeZnw6pP87SwjRCl-JhFcr-smXtE2M7GmdjaOj_jYEFGZN4jiQEutcKqQ5wd2PumodNzNy6ALaxfypeRZMqrPrrTje/s400/Picture1.png" alt="" id="BLOGGER_PHOTO_ID_5254503493379822898" border="0" /></a><br /><br />Non-Pinhole Camera Depth Impostor<br /><a onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}" 
href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-mbHEMv22-YNHiIQv3_An-0rDoEg6NqF69YJlN05XJ8rq_W6PyGivDPHD4Nijvw8diKn8hr3GlEEA1WZvKBAhjALPUKFLCVacn9mcDbPEXr21zWgGBgnGWdZBNXdBcCugUVr6pjje4qQr/s1600-h/Picture2.png"><img style="cursor: pointer;" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-mbHEMv22-YNHiIQv3_An-0rDoEg6NqF69YJlN05XJ8rq_W6PyGivDPHD4Nijvw8diKn8hr3GlEEA1WZvKBAhjALPUKFLCVacn9mcDbPEXr21zWgGBgnGWdZBNXdBcCugUVr6pjje4qQr/s400/Picture2.png" alt="" id="BLOGGER_PHOTO_ID_5254503777242726402" border="0" /></a>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com5tag:blogger.com,1999:blog-1023441640234597436.post-73574928895302292992008-07-30T12:09:00.003-04:002008-07-30T12:39:17.119-04:00Other recent blog postsThere are a few posts that I have seen other blogs/sites that I think are interesting, and would like to share with anyone who reads my blog.<br /><br />Andy Patrick has a series of useful "efficient development" posts regarding speeding up and making game development easier. 
Check it out:<br /><br /><a href="http://bittermanandy.wordpress.com/2008/07/26/efficient-development-part-one/">Efficient Development, Part I</a><br /><a href="http://bittermanandy.wordpress.com/2008/07/28/efficient-development-part-two/">Efficient Development, Part II</a><br /><a href="http://bittermanandy.wordpress.com/2008/07/28/efficient-development-part-three/">Efficient Development, Part III</a><br /><a href="http://bittermanandy.wordpress.com/2008/07/30/efficient-development-part-four/">Efficient Development, Part IV</a><br /><br /><br />Next up there were a couple of articles on Gamasutra related to 2D fluid dynamics that are also pretty interesting.<br /><br /><a href="http://www.gamasutra.com/view/feature/1549/practical_fluid_dynamics_part_1.php">Fluid Dynamics, Part I</a><br /><a href="http://www.gamasutra.com/view/feature/1615/practical_fluid_dynamics_part_2.php">Fluid Dynamics, Part II</a><br /><br /><br />Christer Ericson has an interesting post on using cellular automata for path finding. And one of the guys over at XNAInfo already has a working XNA demo implementing the idea.<br /><br /><a href="http://realtimecollisiondetection.net/blog/?p=57">Path finding with cellular automata</a><br /><a href="http://www.xnainfo.com/content.php?content=21">Game of Life on the GPU</a><br /><br />Got any interesting links? 
Post 'em in the comments!<br /><span style="text-decoration: underline;"><span style="font-weight: bold;"></span></span><a style="font-weight: bold;" href="http://en.wikipedia.org/wiki/Cellular_automata"></a>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com0tag:blogger.com,1999:blog-1023441640234597436.post-65308937769252348702008-07-18T10:31:00.004-04:002008-11-18T13:34:02.541-05:00Dual-Paraboloid Variance Shadow Mapping<p><a href="http://lh4.ggpht.com/GraphicsRunner/SICpRA9vBNI/AAAAAAAAATw/66qZcihDngQ/dp_vsm%5B4%5D.jpg"><img height="327" alt="dp_vsm" src="http://lh5.ggpht.com/GraphicsRunner/SICpR7bdUxI/AAAAAAAAAT0/gKxbjPbGlxk/dp_vsm_thumb%5B2%5D.jpg" width="416" /></a> </p><p><span style="color:#ff6600;">Edit: Added the video that I recently made</span></p><object height="344" width="425"><param name="movie" value="http://www.youtube.com/v/RuPntqfZf44&hl=en&fs=1"><param name="allowFullScreen" value="true"><param name="allowscriptaccess" value="always"><embed src="http://www.youtube.com/v/RuPntqfZf44&hl=en&fs=1" type="application/x-shockwave-flash" allowscriptaccess="always" allowfullscreen="true" width="425" height="344"></embed></object><p></p><p>I have to say, I really like variance shadow mapping. It's such a simple(ingenious) technique to implement, but it provides such nice looking results. I haven't had the need to implement the technique before, but I'm glad I did. Last post we implemented dual-paraboloid shadow mapping. And those of you with a PS 3.0 graphics card were able to have semi-soft shadows with percentage closer filtering. But now when we get rid of the PCF filter, and replace it with variance shadow mapping, we can fit all the code inside the PS 2.0 standard. Anyway, on to the code.</p><p><a href="http://www.punkuser.net/vsm/">Variance Shadow Mapping Paper + Demo</a></p><p><strong><span style="color:#ff6600;">Building the shadow maps:</span></strong></p><p>Variance shadow mapping is really simple to implement. 
First thing we need to change is to create either a RG32F or RG16F surface format for our front and rear shadow maps (instead of R32F/R16F). This allows us to store the depth of the pixel in the red channel and the squared depth of the pixel in the green channel. So our new pixel shader for building the depth/shadow maps is this:</p><p><span style="font-family:tre;color:#969696;">return float4(z, z * z, 0, 0, 1);</span></p><p><strong><span style="color:#ff6600;">Blurring the shadow maps:</span></strong></p><p>Variance shadow mapping improves upon standard shadow mapping by storing a distribution of depths at each pixel (z * z, and z) instead of the single depth (as with standard shadow mapping). And because it stores a distribution of depth, we can blur the shadow maps. This would produce some funky/incorrect results if we were just doing standard shadow mapping with a PCF filter.</p><p>So, after we have created our depth maps, we will blur them with a separable Gaussian blur. This will perform two passes on each shadow map; the first will perform a horizontal blur and the second will perform a vertical blur. There is a wealth of information on the internet on how to do this so I won't explicitly cover this. Here's what our front shadow map looks like after being blurred:</p><p><a href="http://lh6.ggpht.com/GraphicsRunner/SICpSYa0vRI/AAAAAAAAAT4/OjVwM6wxjkw/depth_front%5B4%5D.png"><img height="297" alt="depth_front" src="http://lh5.ggpht.com/GraphicsRunner/SICpSiZaUYI/AAAAAAAAAT8/d_sR-JtJJyc/depth_front_thumb%5B2%5D.png" width="297" /></a> </p><p><strong><span style="color:#ff6600;">Variance shadow mapping:</span></strong></p><p>We build our texture coordinates exactly the same as the previous method of shadow mapping. But the depth comparison is a little different. You can refer to the VSM paper for an in-depth discussion, but here is the gist of it. Since we filtered our shadow maps with a Gaussian blur, we need to recover the moments over that filter region. 
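As an aside, here's roughly what that separable blur boils down to, sketched as hypothetical CPU-side Python (the actual demo does this as two post-process shader passes; the function names here are made up for illustration):

```python
import math

# Hypothetical CPU-side sketch of the two-pass separable Gaussian blur that
# gets applied to each channel (z and z*z) of the VSM moment buffer.

def gaussian_kernel(radius, sigma):
    weights = [math.exp(-(i * i) / (2.0 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]  # normalize so the weights sum to 1

def blur_1d(row, kernel):
    radius = len(kernel) // 2
    out = []
    for x in range(len(row)):
        acc = 0.0
        for i, w in enumerate(kernel):
            # clamp addressing at the borders, like CLAMP texture sampling
            acc += w * row[min(max(x + i - radius, 0), len(row) - 1)]
        out.append(acc)
    return out

def separable_blur(image, kernel):
    rows = [blur_1d(row, kernel) for row in image]          # horizontal pass
    cols = [blur_1d(list(c), kernel) for c in zip(*rows)]   # vertical pass
    return [list(r) for r in zip(*cols)]                    # back to row-major
```

Because the kernel is normalized, a region of constant depth blurs to itself; only pixels near depth discontinuities end up with mixed moments, which is exactly what the variance test exploits.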
The moments are simply the depth and squared depth we stored in the texture. From these we can build the mean depth and the variance at the pixel. The variance can be interpreted as a quantitative measure of the width of a distribution (Donnelly/Lauritzen). This measure places a bound on the distribution and can be represented by Chebyshev's inequality.</p><pre class="mycode"><span style="color:blue;">float </span>depth;<br /><span style="color:blue;">float </span>mydepth;<br /><span style="color:blue;">float2 </span>moments;<br /><span style="color:blue;">if</span>(alpha >= 0.5f)<br />{<br /> moments = <span style="color:blue;">tex2D</span>(ShadowFrontS, P0.xy).xy;<br /> depth = moments.x;<br /> mydepth = P0.z;<br />}<br /><span style="color:blue;">else<br /></span>{<br /> moments = <span style="color:blue;">tex2D</span>(ShadowBackS, P1.xy).xy;<br /> depth = moments.x;<br /> mydepth = P1.z;<br />}<br /><br /><span style="color:blue;">float </span>lit_factor = (mydepth <= moments[0]);<br /> <br /><span style="color:blue;">float </span>E_x2 = moments.y;<br /><span style="color:blue;">float </span>Ex_2 = moments.x * moments.x;<br /><span style="color:blue;">float </span>variance = <span style="color:blue;">min</span>(<span style="color:blue;">max</span>(E_x2 - Ex_2, 0.0) + SHADOW_EPSILON, 1.0);<br /><span style="color:blue;">float </span>m_d = (moments.x - mydepth);<br /><span style="color:blue;">float </span>p = variance / (variance + m_d * m_d); <span style="color:green;">//Chebyshev's inequality<br /><br /></span>texColor.xyz *= <span style="color:blue;">max</span>(lit_factor, p + .2f); <span style="color:green;">//lighten the shadow just a bit (with the + .2f)<br /><br /></span><span style="color:blue;">return </span>texColor;</pre><p><span style="color:#ff6600;">5x5 Gaussian Blur</span></p><p><a href="http://lh6.ggpht.com/GraphicsRunner/SICpS0jeMfI/AAAAAAAAAUA/ZivDUeZeghY/dp_vsm2%5B4%5D.jpg"><img height="342" alt="dp_vsm2" 
src="http://lh3.ggpht.com/GraphicsRunner/SICpTYZln1I/AAAAAAAAAUE/VcsOkoBfA5w/dp_vsm2_thumb%5B2%5D.jpg" width="435" /></a> </p><br /><p><span style="color:#ff6600;">9x9 Gaussian Blur</span></p><p><a href="http://lh6.ggpht.com/GraphicsRunner/SICpTiJvL2I/AAAAAAAAAUI/UJZvqg3i3PI/dp_vsm3%5B5%5D.jpg"><img height="346" alt="dp_vsm3" src="http://lh5.ggpht.com/GraphicsRunner/SICpUFEsPXI/AAAAAAAAAUM/1UWpPlNavwk/dp_vsm3_thumb%5B3%5D.jpg" width="437" /></a> </p><p>And there you go. Nice looking dual-paraboloid soft shadows thanks to variance shadow mapping.</p><p>As before, your card needs to support either RG16F or RG32F formats (sorry again Charles :) ). You can refer to the VSM paper and demo on how to map 2 floats to a single ARGB32 pixel if your card doesn't support the floating point surface formats.</p><p></p><br /><iframe style="BORDER-RIGHT: #dde5e9 1px solid; PADDING-RIGHT: 0px; BORDER-TOP: #dde5e9 1px solid; PADDING-LEFT: 0px; PADDING-BOTTOM: 0px; MARGIN: 3px; BORDER-LEFT: #dde5e9 1px solid; WIDTH: 240px; PADDING-TOP: 0px; BORDER-BOTTOM: #dde5e9 1px solid; HEIGHT: 66px; BACKGROUND-COLOR: #ffffff" marginwidth="0" marginheight="0" src="http://cid-b80a3031b5bfa52b.skydrive.live.com/embedrowdetail.aspx/Public/Dual-Paraboloid%20VSM.zip" frameborder="0" scrolling="no"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com17tag:blogger.com,1999:blog-1023441640234597436.post-70708719785695748862008-07-17T12:34:00.005-04:002008-07-17T13:51:43.463-04:00Dual-Paraboloid Shadow Maps<p><a href="http://lh4.ggpht.com/GraphicsRunner/SH90g3clirI/AAAAAAAAATY/KwezlEeTnS0/DPShadow2%5B8%5D.jpg"><img height="326" alt="DPShadow2" src="http://lh5.ggpht.com/GraphicsRunner/SH90hb1QWnI/AAAAAAAAATc/ROvaa6Boy_8/DPShadow2_thumb%5B6%5D.jpg" width="415" /></a> </p><p>Last time I introduced using dual-paraboloid environment mapping for reflections. Well now we're going to apply the same process to shadows. 
So if you haven't looked at my previous post, read it over before going on.</p><p>Creating the depth/shadow maps is exactly the same as when we created the reflection maps with one exception. Instead of outputting color in the pixel shader, we output the depth of the 3d pixel, like so:</p><p><span style="font-family:Tahoma;color:#7c7c7c;">return depth.x / depth.y;</span></p><p><span style="color:#404040;">Where depth.x is the depth of the pixel and depth.y is the w component. And here is the resulting depth/shadow map for the front hemisphere.</span></p><p><a href="http://lh6.ggpht.com/GraphicsRunner/SH90hpgk20I/AAAAAAAAATg/RiTmqfhE71Y/depth_f%5B5%5D.png"><img height="269" alt="depth_f" src="http://lh6.ggpht.com/GraphicsRunner/SH90h2cvZcI/AAAAAAAAATk/h5-bwOO2jBg/depth_f_thumb%5B3%5D.png" width="269" /></a> </p><p><span style="color:#000000;"><span style="color:#404040;">Now, to map the shadows the process is also very similar to how we generated the reflections. We follow a similar process in the pixel shader:</span></span></p><ul><li><span style="color:#404040;">Generate the texture coordinates for the front and rear paraboloids</span> </li><li>Generate the depth of the pixel </li><li>Test to see if the pixel is in shadow </li></ul><p>We generate the texture coordinates exactly as when we generated the reflection texture coordinates. To generate the depth of the pixel we take the length of the vector from the vertex to the origin of the paraboloid (0, 0, 0) and divide by the light attenuation. 
Also to check which hemisphere we are in, we calculate an alpha that is the Z value of the transformed vertex and offset by .5f;</p><pre class="mycode"><span style="color:blue;">float </span>L = <span style="color:blue;">length</span>(pos);<br /><span style="color:blue;">float3 </span>P0 = pos / L;<br /><br /><span style="color:blue;">float </span>alpha = .5f + pos.z / LightAttenuation;<br /><span style="color:green;">//generate texture coords for the front hemisphere</span><br />P0.z = P0.z + 1;<br />P0.x = P0.x / P0.z;<br />P0.y = P0.y / P0.z;<br />P0.z = L / LightAttenuation;<br /><br />P0.x = .5f * P0.x + .5f;<br />P0.y = -.5f * P0.y + .5f;<br /><br /><span style="color:blue;">float3 </span>P1 = pos / L;<br /><span style="color:green;">//generate texture coords for the rear hemisphere</span><br />P1.z = 1 - P1.z;<br />P1.x = P1.x / P1.z;<br />P1.y = P1.y / P1.z;<br />P1.z = L / LightAttenuation;<br /><br />P1.x = .5f * P1.x + .5f;<br />P1.y = -.5f * P1.y + .5f;</pre><p>Now that we have generated our texture coordinates we need to test the depth of the pixel against the depth in the shadow map. To do this we index either the front or rear shadow map with the texture coordinates we generated to get the depth and compare this to our depth. 
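For clarity, the texture-coordinate generation above can be restated as a small CPU-side sketch (hypothetical Python; `paraboloid_shadow_coord` is a made-up name, not code from the demo):

```python
import math

def paraboloid_shadow_coord(pos, light_attenuation):
    # pos is the position relative to the light; the two paraboloids look
    # down the +z and -z axes. (The shader computes both coordinate sets and
    # branches on alpha; here we just pick a hemisphere up front.)
    L = math.sqrt(pos[0] ** 2 + pos[1] ** 2 + pos[2] ** 2)
    dx, dy, dz = pos[0] / L, pos[1] / L, pos[2] / L
    alpha = 0.5 + pos[2] / light_attenuation  # >= 0.5 means front hemisphere
    use_front = alpha >= 0.5

    denom = (dz + 1.0) if use_front else (1.0 - dz)  # front: z+1, rear: 1-z
    u = 0.5 * (dx / denom) + 0.5    # bias/scale into D3D texture space
    v = -0.5 * (dy / denom) + 0.5   # y is flipped for texture addressing
    depth = L / light_attenuation   # the depth we compare against the map
    return u, v, depth, use_front
```

A point straight down the +z axis lands dead center of the front map at (0.5, 0.5), and its stored depth is just its distance to the light divided by the attenuation.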
If the depth is less than our depth, then the pixel is in shadow.</p><pre class="mycode"><span style="color:blue;">float </span>depth;<br /><span style="color:blue;">float </span>mydepth;<br /><span style="color:blue;">if</span>(alpha >= 0.5f)<br />{<br /> depth = <span style="color:blue;">tex2D</span>(ShadowFrontS, P0.xy).x;<br /> mydepth = P0.z;<br />}<br /><span style="color:blue;">else<br /></span>{<br /> depth = <span style="color:blue;">tex2D</span>(ShadowBackS, P1.xy).x;<br /> mydepth = P1.z;<br />}<br /><br /><span style="color:green;">//lighten the shadow just a bit so it isn't completely black<br /></span><span style="color:blue;">if</span>((depth + SHADOW_EPSILON) < mydepth)<br /> texColor.xyz *= 0.3f;<br /><br /><span style="color:blue;">return </span>texColor;<br /><br /></pre><p><a href="http://lh6.ggpht.com/GraphicsRunner/SH90jYqByQI/AAAAAAAAATo/ub-U1PGGhKs/DPShadow%5B4%5D.jpg"><img height="324" alt="DPShadow" src="http://lh5.ggpht.com/GraphicsRunner/SH90kCAdZ5I/AAAAAAAAATs/RwEl6sSzhFw/DPShadow_thumb%5B2%5D.jpg" width="412" /></a> </p><p>And that's it. Now we have dual-paraboloid shadow mapping. If you have a pixel shader 3.0 graphics card, then the shadow also has a percentage closer filter applied to it. You may also notice seams in the shadows. This is because the splitting plane of the paraboloids is the x-axis (since the paraboloids look down the +/- z-axis). This is one of the problems of using paraboloid mapping for shadows. One has to be careful where to place the split plane to avoid this situation. Pixels that are in the center of either hemisphere suffer little distortion. But this is just a tutorial so I didn't worry too much about it.</p><p>Also, your graphics card must be able to support R32F or R16F surface formats to run the demo out of the box (sorry Charles ;) ). Otherwise, you must use the ARGB32 format and pack the depth values in all 4 channels. Here is some code to pack/unpack to/from an ARGB32 surface format. 
You pass the depth value to the pack method when you render to the shadow maps, and you pass the float4 color to the unpack method when you fetch from the shadow maps. I decided not to implement this so the code wouldn't become complicated by something that doesn't add to the tutorial.</p><pre class="mycode"><span style="color:green;">//pack the depth in a 32-bit rgba color<br /></span><span style="color:blue;">float4 </span>mapDepthToARGB32(<span style="color:blue;">const float </span>value)<br />{<br /> <span style="color:blue;">const float4 </span>bitSh = <span style="color:blue;">float4</span>(256.0 * 256.0 * 256.0, 256.0 * 256.0, 256.0, 1.0);<br /> <span style="color:blue;">const float4 </span>mask = <span style="color:blue;">float4</span>(0.0, 1.0 / 256.0, 1.0 / 256.0, 1.0 / 256.0);<br /> <span style="color:blue;">float4 </span>res = <span style="color:blue;">frac</span>(value * bitSh);<br /> res -= res.xxyz * mask;<br /> <span style="color:blue;">return </span>res;<br />}<br /><br /><span style="color:green;">//unpack the depth from a 32-bit rgba color<br /></span><span style="color:blue;">float </span>getDepthFromARGB32(<span style="color:blue;">const float4 </span>value)<br />{<br /> <span style="color:blue;">const float4 </span>bitSh = <span style="color:blue;">float4</span>(1.0 / (256.0 * 256.0 * 256.0), 1.0 / (256.0 * 256.0), 1.0 / 256.0, 1.0);<br /> <span style="color:blue;">return</span>(<span style="color:blue;">dot</span>(value, bitSh));<br />}</pre><p>Next time I'll introduce using variance shadow mapping with our dual-paraboloid shadow mapping to give nice soft shadows that we can still use with pixel shader 2.0 cards.</p><br /><iframe style="BORDER-RIGHT: #dde5e9 1px solid; PADDING-RIGHT: 0px; BORDER-TOP: #dde5e9 1px solid; PADDING-LEFT: 0px; PADDING-BOTTOM: 0px; MARGIN: 3px; BORDER-LEFT: #dde5e9 1px solid; WIDTH: 240px; PADDING-TOP: 0px; BORDER-BOTTOM: #dde5e9 1px solid; HEIGHT: 66px; BACKGROUND-COLOR: #ffffff" marginwidth="0" marginheight="0" 
src="http://cid-b80a3031b5bfa52b.skydrive.live.com/embedrowdetail.aspx/Public/Dual-Paraboloid%20Shadow%20Maps.zip" frameborder="0" scrolling="no"></iframe>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com15tag:blogger.com,1999:blog-1023441640234597436.post-49130382283169311452008-07-16T11:29:00.010-04:002008-08-28T19:05:11.956-04:00Dual-Paraboloid Reflections<p><a href="http://lh3.ggpht.com/GraphicsRunner/SH4Tn0OBd-I/AAAAAAAAASo/85vkjSUidPw/dragon5.png"><img height="318" alt="dragon" src="http://lh4.ggpht.com/GraphicsRunner/SH4TooD_JiI/AAAAAAAAASs/OdFKgTKhoDI/dragon_thumb3.png" width="402" /></a> </p><p>I recently had to investigate dual-paraboloid reflections at work for an unnamed console. What are these you ask? Great question! :) Lets start with some background.</p><p>The standard way of calculating reflections is to use an environment map, more specifically a cube map. In my last tutorial on reflections, this basic type of reflection mapping was used to compare against billboard impostor reflections. Now, cubemaps are great for static scenes, and are relatively low cost to perform a texture fetch on in a shader. However if you have a dynamic scene you have to update all 6 sides of the cubemap (this is not technically true, aggressive culling and other optimizations can guarantee at most 5 sides). Holy crap, now we have to render our scene 6 times!</p><p>This is where dual-paraboloid reflections come in. They are a view-independent method of rendering reflections just like cubemaps. Except you only have to update 2 textures, not 6! 
The downside is that you are going to lose quality for speed, but unless you need high-quality reflections, paraboloid reflections will probably provide sufficient results.</p><p>Reference articles:</p><p><a href="http://www.cs.ubc.ca/~heidrich/Papers/GH.98.pdf">View-Independent Environment Maps</a></p><p><a href="http://www.mpi-inf.mpg.de/~tannen/papers/cgi_02.pdf">Shadow Mapping for Hemispherical and Omnidirectional Light Sources</a></p><p><a href="http://www.gamedev.net/columns/hardcore/dualparaboloid/">Dual Paraboloid Mapping in the Vertex Shader</a></p><p>In the interest of keeping this post from getting too long, I won't go into great detail on the mathematical process. I suggest you refer to the first and third papers for an in-depth discussion on the details.</p><p>Now let's move on to what exactly paraboloid mapping is. Let's look at what a paraboloid is.</p><p><a href="http://lh3.ggpht.com/GraphicsRunner/SH4TpImkomI/AAAAAAAAASw/Tsc2v951RAw/paraboloid5.jpg"><img height="198" alt="paraboloid" src="http://lh4.ggpht.com/GraphicsRunner/SH4TpsGL2DI/AAAAAAAAAS0/tPahxfnQF24/paraboloid_thumb3.jpg" width="252" /></a> </p><p>The basic idea is that for any origin O, we can divide the scene into two hemispheres, front and rear. For each of these hemispheres there exists a paraboloid surface that will focus any ray traveling in the direction of O into the direction of the hemisphere. Here is a 2d picture demonstrating the idea:</p><p><a href="http://lh6.ggpht.com/GraphicsRunner/SH4Tpy9gE_I/AAAAAAAAAS4/kFkghUJJDiI/paraboloid2d9.jpg"><img height="271" alt="paraboloid2d" src="http://lh4.ggpht.com/GraphicsRunner/SH4Tqj3j8fI/AAAAAAAAAS8/oENn9zX8D6Y/paraboloid2d_thumb7.jpg" width="263" /></a> </p><p><strong><span style="color:#ff6600;">A brief math overview:</span></strong></p><p>What we need to find is the intersection point where the incident ray intersects the paraboloid surface. To do this we need to know the incident ray and the reflected ray. 
Now because the paraboloid reflects rays in the same direction, it is easy to compute the reflection vector: it's the forward direction of the hemisphere! So the front hemisphere's reflection vector will always be <0, 0, 1> and the rear hemisphere's reflection vector will always be <0, 0, -1>. Easy! And the incident ray is calculated the same as with environment mapping by reflecting the ray from the pixel position to the eye across the normal of the 3D pixel.</p><p>Now all we have to do is find the normal of the intersection which we will use to map our vertices into paraboloid space. To find the normal, we add the incident and reflected vectors and divide the x and y components by the z value.</p><p><strong><span style="color:#ff6600;">Generating the Paraboloid maps:</span></strong></p><p>What we are basically going to do is, in the vertex shader, place each vertex ourselves that has been distorted by the paraboloid. First we need to transform the vertex by the view matrix of the paraboloid 'camera'. 
We don't apply the projection matrix since we're going to place the point ourselves.</p><span style="font-family:trebuchet ms;color:#666666;">output.Position = <span style="color:#3333ff;">mul</span>(input.Position, WorldViewProj);</span> <p>Next we need to find the vector from the vertex to the origin of the paraboloid, which is simply:</p><span style="font-family:trebuchet ms;color:#666666;">float L = <span style="color:#3333ff;">length</span>( output.Position.xyz );<br />output.Position = output.Position / L;</span> <p>Now we need to find the x and y coordinates of the point where the incident ray intersects the paraboloid surface.</p><span style="font-family:trebuchet ms;color:#666666;">output.Position.z = output.Position.z + 1;<br />output.Position.x = output.Position.x / output.Position.z;<br />output.Position.y = output.Position.y / output.Position.z;</span> <p>Finally we set the z value as the distance from the vertex to the origin of the paraboloid, scaled and biased by the near and far planes of the paraboloid 'camera'.</p><p><span style="font-family:trebuchet ms;color:#666666;">output.Position.z = (L - NearPlane) / (FarPlane - NearPlane);<br />output.Position.w = 1;</span> </p><p></p><p>And the only thing we need to add in the pixel shader is to make sure to clip pixels that are behind the viewpoint, using the intrinsic clip() function of HLSL.</p><p><br /><a href="http://lh5.ggpht.com/GraphicsRunner/SH4TrNbSLxI/AAAAAAAAATA/DY7HQSe1NR4/front4.jpg"><img height="258" alt="front" src="http://lh4.ggpht.com/GraphicsRunner/SH4TrbnQCfI/AAAAAAAAATE/NAmPzc8TJ_A/front_thumb2.jpg" width="338" /></a> </p><p><a href="http://lh3.ggpht.com/GraphicsRunner/SH4TskGKIQI/AAAAAAAAATI/ijNI1MaeCN4/frontWF5.jpg"><img height="258" alt="frontWF" src="http://lh6.ggpht.com/GraphicsRunner/SH4Ts_uQD2I/AAAAAAAAATM/Xr1NfeDgVtc/frontWF_thumb3.jpg" width="338" /></a> </p><p><strong><span style="color:#ff6600;">Reflections with paraboloid maps:</span></strong></p><p>In the reflection 
pixel shader we will: generate the reflection vector the same way as cube mapping, generate texture coordinates for both the front and rear paraboloids' textures, and blend the samples taken from the textures.</p><p>The texture coordinates are generated exactly as how we generated them before in the generation step. We also scale and bias them to correctly index a D3D texture. And then we take a sample from each map and pick the sample with the greater color value:</p><p><pre class="mycode"><span style="color:green;">// calculate the front paraboloid map texture coordinates</span><br /><span style="color:blue;">float2 </span>front;<br />front.x = R.x / (R.z + 1);<br />front.y = R.y / (R.z + 1);<br />front.x = .5f * front.x + .5f; <span style="color:green;">//bias and scale to correctly sample a d3d texture</span><br />front.y = -.5f * front.y + .5f;<br /><br /><span style="color:green;">// calculate the back paraboloid map texture coordinates</span><br /><span style="color:blue;">float2 </span>back;<br />back.x = R.x / (1 - R.z);<br />back.y = R.y / (1 - R.z);<br />back.x = .5f * back.x + .5f; <span style="color:green;">//bias and scale to correctly sample a d3d texture</span><br />back.y = -.5f * back.y + .5f;<br /><br /><span style="color:blue;">float4 </span>forward = <span style="color:blue;">tex2D</span>( FrontTex, front ); <span style="color:green;">// sample the front paraboloid map</span><br /><span style="color:blue;">float4 </span>backward = <span style="color:blue;">tex2D</span>( BackTex, back ); <span style="color:green;">// sample the back paraboloid map</span><br /><br /><span style="color:blue;">float4 </span>finalColor = <span style="color:blue;">max</span>(forward, backward);</pre><a href="http://www.blogger.com/%3Ca" 20href=""></a><p><p><a href="http://lh5.ggpht.com/GraphicsRunner/SH4TuTnFusI/AAAAAAAAATQ/zWHuWWsBhJY/ss14.png"><img height="312" alt="ss1" src="http://lh3.ggpht.com/GraphicsRunner/SH4TvaUrKBI/AAAAAAAAATU/kRuYcwleFKA/ss1_thumb2.png" 
width="399" /></a> </p><p><strong><span style="color:#ff6600;">Optimizations:</span></strong></p><p>If you align the paraboloid 'camera' such that it is always facing down the +/- z axis, you don't need to transform the vertices by the view matrix of the camera. You only need to do a simple translation of the vertex by the camera position.</p><p><strong><span style="color:#ff6600;">Conclusion:</span></strong></p><p>As you can see, paraboloid maps give pretty good results. They won't give you the quality of cubemaps, but they are faster to update and require less memory. And in the console world, requiring less is almost reason enough to pick this method over cubemaps.</p><p>One drawback of paraboloid maps is that the environment geometry has to be sufficiently tessellated or we will have noticeable artifacts on our reflector. Another drawback is that on spherical objects we will see seams. However, with objects that are reasonably complex (such as the Stanford bunny or dragon) and are not simple shapes, the seams will not be as noticeable. 
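To make the lookup concrete, here's the same front/back coordinate math and max() blend restated as hypothetical Python (names like `dual_paraboloid_uvs` are mine for illustration, not from the demo):

```python
def dual_paraboloid_uvs(R):
    # R is the (normalized) reflection vector; the front map divides by
    # (R.z + 1), the back map by (1 - R.z), and both results are biased and
    # scaled into D3D texture space. Note a direction exactly at a pole
    # (R.z == +/-1) divides by zero for the opposite map's coordinates.
    rx, ry, rz = R
    front = (0.5 * (rx / (rz + 1.0)) + 0.5, -0.5 * (ry / (rz + 1.0)) + 0.5)
    back = (0.5 * (rx / (1.0 - rz)) + 0.5, -0.5 * (ry / (1.0 - rz)) + 0.5)
    return front, back

def sample_dual_paraboloid(R, sample_front, sample_back):
    # sample_front/sample_back stand in for tex2D on the two maps
    (fu, fv), (bu, bv) = dual_paraboloid_uvs(R)
    f = sample_front(fu, fv)
    b = sample_back(bu, bv)
    return tuple(max(a, c) for a, c in zip(f, b))  # the shader's max() blend
```

The max() blend works because a reflection vector that belongs to one hemisphere produces an off-center, darker (or clipped) sample from the other map, so the correct map's sample wins.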
</p><p>Next time I will present dual-paraboloid mapping for use with real-time omnidirectional shadow mapping of point lights.<br /><br /><iframe style="BORDER-RIGHT: #dde5e9 1px solid; PADDING-RIGHT: 0px; BORDER-TOP: #dde5e9 1px solid; PADDING-LEFT: 0px; PADDING-BOTTOM: 0px; MARGIN: 3px; BORDER-LEFT: #dde5e9 1px solid; WIDTH: 240px; PADDING-TOP: 0px; BORDER-BOTTOM: #dde5e9 1px solid; HEIGHT: 66px; BACKGROUND-COLOR: #ffffff" marginwidth="0" marginheight="0" src="http://cid-b80a3031b5bfa52b.skydrive.live.com/embedrowdetail.aspx/Public/Dual-Paraboloid%20Reflections.zip" frameborder="0" scrolling="no"></iframe></p>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com38tag:blogger.com,1999:blog-1023441640234597436.post-35577528504511207782008-06-20T13:24:00.004-04:002013-02-26T22:31:46.031-05:00Terrain and Atmospheric Scattering Source<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZ-qzVJpy-30TYUwON18VMRRCC8ORD8Oyms3Eus-npqQOVwGLnlIAhNJY-apax_enXHndxKpPwHPX8aW3It-fR1oW0MKlK-qa8MeFTaylNgGPeg6c_kzgnsjJT2z6eBaEGwfCfM18s1wYt/s1600-h/SunsetWater04.png" onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}"><img alt="" border="0" id="BLOGGER_PHOTO_ID_5179878141186776322" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhZ-qzVJpy-30TYUwON18VMRRCC8ORD8Oyms3Eus-npqQOVwGLnlIAhNJY-apax_enXHndxKpPwHPX8aW3It-fR1oW0MKlK-qa8MeFTaylNgGPeg6c_kzgnsjJT2z6eBaEGwfCfM18s1wYt/s200/SunsetWater04.png" style="cursor: pointer;" /></a><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5tqCXgVhA1eQMB12ml9q8pbhXQfNKdAe91xiRypGx6nfRjz2DgPQH16xvotkR1K3gGo7A2A4sVlXiYoTKQXbKwIzKsOBG_LKI7A_lmJcBAKBKHaBfct_k-qXOwuQorMujtHyxwbKLBWVf/s1600-h/Terrain+2007-11-27+23-49-29-48.bmp" onblur="try {parent.deselectBloggerImageGracefully();} catch(e) {}"><img alt="" border="0" height="158" id="BLOGGER_PHOTO_ID_5179879476921605506" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5tqCXgVhA1eQMB12ml9q8pbhXQfNKdAe91xiRypGx6nfRjz2DgPQH16xvotkR1K3gGo7A2A4sVlXiYoTKQXbKwIzKsOBG_LKI7A_lmJcBAKBKHaBfct_k-qXOwuQorMujtHyxwbKLBWVf/s200/Terrain+2007-11-27+23-49-29-48.bmp" style="cursor: pointer;" width="216" /></a><br />
<br />
I figured I would release my source for the terrain rendering demo. There aren't that many terrain and atmospheric scattering examples on the net, so I might as well post mine for anyone who would like to see it. <br />
It's still a work in progress (that I paused about a year ago), so it's kind of immature. But it is the same code that produced the images I posted a while back. You'll probably need at least a 6 series Nvidia graphics card (or ATi equivalent). For the terrain, the demo uses a simple quad tree with frustum culling, so it's not the greatest performer for anything above 512x512 heightmaps. Also, it is in Managed DirectX, but it shouldn't be too hard to port to XNA.<br />
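For anyone curious, the quad-tree culling amounts to something like this hypothetical Python sketch (the demo itself is Managed DirectX/C#; `QuadNode`, `box_outside_plane`, and `collect_visible` are illustrative names only):

```python
# Each quad-tree node has an AABB; a node (and its whole subtree) is skipped
# when its box lies fully outside any frustum plane. A plane is
# (nx, ny, nz, d), with "inside" meaning nx*x + ny*y + nz*z + d >= 0.

class QuadNode:
    def __init__(self, box, children=None):
        self.box = box              # (min_x, min_y, min_z, max_x, max_y, max_z)
        self.children = children or []

def box_outside_plane(box, plane):
    nx, ny, nz, d = plane
    # test the corner furthest along the plane normal (the "positive vertex")
    px = box[3] if nx >= 0 else box[0]
    py = box[4] if ny >= 0 else box[1]
    pz = box[5] if nz >= 0 else box[2]
    return nx * px + ny * py + nz * pz + d < 0

def collect_visible(node, planes, out):
    if any(box_outside_plane(node.box, p) for p in planes):
        return                      # whole subtree culled
    if not node.children:
        out.append(node)            # visible leaf: draw this terrain patch
    else:
        for child in node.children:
            collect_visible(child, planes, out)
```

The win is that one box-vs-plane rejection near the root discards entire regions of the heightmap without ever visiting their leaves.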
<br />
<iframe frameborder="0" height="120" scrolling="no" src="https://skydrive.live.com/embed?cid=B80A3031B5BFA52B&resid=B80A3031B5BFA52B%21158&authkey=AJMWkuVt9V4Z_XA" width="98"></iframe><br />
<br />
Anyway, enough of that. If you didn't know, the <a href="http://www.spore.com/">Creature Creator for Spore</a> has been released, and it's an absolute blast to mess around with. The possibilities for what you can create are pretty much endless. Here's a couple that I made: <br />
<a href="http://lh3.ggpht.com/GraphicsRunner/SFvnqml9lFI/AAAAAAAAASY/M8Qk7ctreTw/CRE_spider-0685c592_sml%5B6%5D.jpg"><img alt="CRE_spider-0685c592_sml" height="290" src="http://lh4.ggpht.com/GraphicsRunner/SFvnsRIn2xI/AAAAAAAAASc/V1AGe7LXPzA/CRE_spider-0685c592_sml_thumb%5B4%5D.jpg" width="381" /></a><br />
<a href="http://lh5.ggpht.com/GraphicsRunner/SFvntDmRZjI/AAAAAAAAASg/fOVcacYVPi8/CRE_Graptilosaurus-0685c593_sml%5B9%5D.jpg"><img alt="CRE_Graptilosaurus-0685c593_sml" height="301" src="http://lh3.ggpht.com/GraphicsRunner/SFvntU5mBBI/AAAAAAAAASk/TpfuEGyglB8/CRE_Graptilosaurus-0685c593_sml_thumb%5B7%5D.jpg" width="381" /></a>Kyle Haywardhttp://www.blogger.com/profile/00654406875137720609noreply@blogger.com27