Program
The program depends on a set of very simple shaders. The shader_uv.v and shader_uv.f shaders perform trivial vertex transformation and texturing. These programs are completely standard fare and implement only the bare minimum necessary to draw textured polygons.
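A minimal sketch of what such a shader pair might look like is given below. The attribute, uniform, and varying names here are assumptions for illustration and are not taken from the actual source files.

```glsl
// Hypothetical sketch of shader_uv.v: transform the vertex position
// and pass the texture coordinates through to the fragment stage.
attribute vec3 position;
attribute vec2 uv;
uniform mat4 m_projection;
uniform mat4 m_modelview;
varying vec2 vertex_uv;

void main (void)
{
  vertex_uv   = uv;
  gl_Position = m_projection * m_modelview * vec4 (position, 1.0);
}
```

```glsl
// Hypothetical sketch of shader_uv.f: sample the bound texture at
// the interpolated coordinates.
uniform sampler2D t_texture;
varying vec2 vertex_uv;

void main (void)
{
  gl_FragColor = texture2D (t_texture, vertex_uv);
}
```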
The main displacement work takes place in shader_displace.f.
The program takes the current texture coordinates (interpolated by the current vertex shader), a texture, a displacement map, a scaling value (maximum), and the current time (measured in frames, though the time unit is not important).
First, the interpolated texture coordinates vertex_uv are translated by the scaled time value time_e. A pixel is then read from the displacement map at the resulting texture coordinates. The displacement map is assumed to be greyscale, with pixels represented as four-element RGBA vectors with floating point components. The shader reads the green channel of the pixel (reading the red or blue channel would of course give identical results with a greyscale image) and scales this value by maximum to obtain a final offset value displace_k.
Note that this value is in texture-space units, not pixels: in a 256x256 pixel image, a value of 0.25 represents 64 pixels. The program then adds displace_k to the original interpolated texture coordinates and retrieves a pixel from the current texture at the resulting coordinates.
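The steps above might be sketched as follows. The uniform names vertex_uv, maximum, time_e, and displace_k come from the description; the sampler names are assumptions.

```glsl
// Hypothetical sketch of shader_displace.f, following the steps
// described above. Sampler names are assumed, not from the source.
uniform sampler2D t_texture;   // the image texture (name assumed)
uniform sampler2D t_displace;  // the displacement map (name assumed)
uniform float     maximum;     // maximum offset, in texture-space units
uniform float     time_e;      // current scaled time value
varying vec2      vertex_uv;

void main (void)
{
  // Translate the interpolated coordinates by the scaled time value.
  vec2 displace_uv = vertex_uv + vec2 (time_e, time_e);

  // Read the green channel of the (greyscale) displacement map and
  // scale it by maximum to obtain the offset displace_k.
  float displace_k = texture2D (t_displace, displace_uv).g * maximum;

  // Offset the original coordinates and sample the image texture.
  gl_FragColor =
    texture2D (t_texture, vertex_uv + vec2 (displace_k, displace_k));
}
```

Note that because displace_k is in texture-space units, the same shader works unchanged regardless of the pixel dimensions of the textures involved.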
The OpenGL program that drives the shaders is similarly simple. First, the program allocates a framebuffer and attaches a blank texture as color buffer storage. It then loads the requested image and displacement map image, and compiles and loads the relevant shading programs. These uninteresting but essential functions are implemented in the Utilities class.
Rendering involves two steps. First, the program generates a texture from the loaded image and displacement map by binding the allocated framebuffer and rendering a fullscreen textured quad using the previously mentioned displacement shader.
After the above function has executed, framebuffer_texture contains the "displaced" texture. The program then draws a textured quad to the screen.
It is, of course, possible to render directly to the screen using the displacement shader. The program described here avoids doing so in order to demonstrate that the resulting procedural texture is an ordinary texture that can be used in the same manner as any other.