Try adjusting the field of vision, and observe the effect that has on the air hockey table. You can also try moving the table around in different ways.
Once you’ve completed these exercises, we’ll start making our table look nicer.
In the next chapter we’re going to start working with textures.
CHAPTER 7
Adding Detail with Textures
We’ve managed to get a lot done with just simple shapes and colors. There’s something missing though: what if we could paint onto our shapes and add refined detail? Like artists, we can start out with basic shapes and color and add extra detail onto our surfaces by using textures. A texture is simply an image or a picture that has been uploaded into OpenGL.
We can add an incredible amount of detail with textures. Think of a beautiful 3D game you might have played recently. At the heart of things, the game is just using points, lines, and triangles, like any other 3D program. However, with the detail of textures and the touch of a skilled artist, these triangles can be textured to build a beautiful 3D scene.
Once we start using textures, we’ll also start using more than one shader program. To make this easier to manage, we’ll learn how to adapt our code so that we can use multiple shader programs and sources of vertex data and switch between them.
Here’s our game plan for this chapter:
• We’ll start out with an introduction to textures, and then we’ll write code to load a texture into OpenGL.
• We’ll learn how to display that texture, adapting our code to support multiple shader programs.
• We’ll also cover the different texture filtering modes and what they do.
When we’re done, our air hockey table should look like the next figure. Let’s start off by copying the project from the last chapter over into a new project called ‘AirHockeyTextured’.
Figure 35—Air hockey table with a filtered texture
7.1 Understanding Textures
Textures in OpenGL can be used to represent images, pictures, and even fractal data that are generated by a mathematical algorithm. Each two-dimensional texture is composed of many small texels, which are small blocks of data analogous to the fragments and pixels that we’ve talked about previously. The most common way to use a texture is to load in the data directly from an image file.
We’ll use the image in Figure 36, The Surface Image, on page 117 as our new air hockey table surface and load it in as a texture:
All of the images used in the code can be downloaded from this book’s home page. I recommend storing the texture in your project’s /res/drawable-nodpi/ folder.1
1. http://pragprog.com/book/kbogla
Figure 36—The Surface Image
Each two-dimensional texture has its own coordinate space, ranging from (0, 0) at one corner to (1, 1) at the other corner. By convention, one dimension is called S and the other is called T. When we want to apply a texture to a triangle or set of triangles, we’ll specify a set of ST texture coordinates for each vertex so that OpenGL knows which parts of the texture it needs to draw across each triangle. These texture coordinates are also sometimes referred to as UV texture coordinates, as seen in Figure 37, OpenGL 2D texture coordinates, on page 118.
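As a quick illustration (the vertex order and values here are hypothetical; we’ll define the actual table data later in this chapter), a quad that displays the whole texture might interleave position and ST coordinates like this:

```java
// Hypothetical interleaved vertex data: X, Y, then S, T per vertex.
// The T coordinates are flipped relative to Y so the image appears
// right side up, for the reason discussed next.
float[] vertexData = {
    //   X,      Y,    S,  T
    -0.5f,  0.5f,  0f, 0f,   // top left
    -0.5f, -0.5f,  0f, 1f,   // bottom left
     0.5f, -0.5f,  1f, 1f,   // bottom right
     0.5f,  0.5f,  1f, 0f,   // top right
};
```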
There is no inherent orientation for an OpenGL texture, since we can use different coordinates to orient it any which way we like. However, there is a default orientation for most computer image files: they are usually specified with the y-axis pointing downward (as seen in Figure 38, Computer images: the y-axis points downward, on page 118): the y value increases as we move toward the bottom of the image. This doesn’t cause any trouble for us so long as we remember that if we want to view our image with the right orientation, then our texture coordinates need to take this into account.
In standard OpenGL ES 2.0, textures don’t have to be square, but each dimension should be a power of two (POT). This means that each dimension should be a number like 128, 256, 512, and so on. The reason for this is that non-POT textures are very restricted in where they can be used, while POT textures are fine for all uses.
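The power-of-two check itself is simple. Here’s a small sketch of a helper (this class is an assumption for illustration, not part of the book’s code) that tests whether a dimension qualifies; a positive integer is a power of two exactly when it has a single bit set:

```java
// Hypothetical helper, not from the book: verifies that a texture
// dimension satisfies the power-of-two requirement.
public class TextureValidator {
    public static boolean isPowerOfTwo(int dimension) {
        // A power of two has exactly one bit set, so clearing the
        // lowest set bit must leave zero.
        return dimension > 0 && (dimension & (dimension - 1)) == 0;
    }
}
```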
There is also a maximum texture size that varies from implementation to implementation but is usually something large, like 2048 x 2048.
Figure 37—OpenGL 2D texture coordinates
Figure 38—Computer images: the y-axis points downward
7.2 Loading Textures into OpenGL
Our first task will be to load data from an image file into an OpenGL texture.
To start out, let’s create a new class in the com.airhockey.android.util package called TextureHelper. We’ll begin with the following method signature:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
public static int loadTexture(Context context, int resourceId) {
This method will take in an Android context and a resource ID and will return the ID of the loaded OpenGL texture. To start off, we’ll generate a new texture ID using the same type of pattern as when we’ve created other OpenGL objects:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
final int[] textureObjectIds = new int[1];
glGenTextures(1, textureObjectIds, 0);

if (textureObjectIds[0] == 0) {
    if (LoggerConfig.ON) {
        Log.w(TAG, "Could not generate a new OpenGL texture object.");
    }
    return 0;
}
We generate one texture object by calling glGenTextures(1, textureObjectIds, 0), passing in 1 as the first parameter. OpenGL will store the generated ID in textureObjectIds. We also check that the call to glGenTextures() succeeded by continuing only if the generated ID is not equal to zero; otherwise we log the error and return 0. Since TAG is not yet defined, let’s add a definition for it to the top of the class:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
private static final String TAG = "TextureHelper";
Loading in Bitmap Data and Binding to the Texture
The next step is to use Android’s APIs to read in the data from our image files.
OpenGL can’t read data from a PNG or JPEG file directly because these files are encoded into specific compressed formats. OpenGL needs the raw data in an uncompressed form, so we’ll need to use Android’s built-in bitmap decoder to decompress our image files into a form that OpenGL understands.
Let’s continue implementing loadTexture() and decompress the image into an Android bitmap:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false;

final Bitmap bitmap = BitmapFactory.decodeResource(
    context.getResources(), resourceId, options);

if (bitmap == null) {
    if (LoggerConfig.ON) {
        Log.w(TAG, "Resource ID " + resourceId + " could not be decoded.");
    }
    glDeleteTextures(1, textureObjectIds, 0);
    return 0;
}
We first create a new instance of BitmapFactory.Options called options, and we set inScaled to false. This tells Android that we want the original image data instead of a scaled version of the data.
We then call BitmapFactory.decodeResource() to do the actual decode, passing in the Android context, resource ID, and the decoding options that we’ve just defined. This call will decode the image into bitmap or will return null if it failed.
We check against that failure and delete the OpenGL texture object if the bitmap is null. If the decode succeeded, we continue processing the texture.
Before we can do anything else with our newly generated texture object, we need to tell OpenGL that future texture calls should be applied to this texture object. We do that with a call to glBindTexture():
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
glBindTexture(GL_TEXTURE_2D, textureObjectIds[0]);
The first parameter, GL_TEXTURE_2D, tells OpenGL that this should be treated as a two-dimensional texture, and the second parameter tells OpenGL which texture object ID to bind to.
Understanding Texture Filtering
We’ll also need to specify what should happen when the texture is expanded or reduced in size, using texture filtering. When we draw a texture onto the rendering surface, the texture’s texels may not map exactly onto the fragments generated by OpenGL. There are two cases: minification and magnification. Minification happens when we try to cram several texels onto the same fragment, and magnification happens when we spread one texel across many fragments. We can configure OpenGL to use a texture filter for each case.
To start out, we’ll cover two basic filtering modes: nearest-neighbor filtering and bilinear interpolation. There are additional filtering modes that we’ll soon cover in more detail. We’ll use the following image to illustrate each filtering mode:
Nearest-Neighbor Filtering
This selects the nearest texel for each fragment. When we magnify the texture, it will look rather blocky, as follows:
Each texel is clearly visible as a small square.
When we minify the texture, many of the details will be lost, as we don’t have enough fragments for all of the texels:
Bilinear Filtering
Bilinear filtering uses bilinear interpolation to smooth the transitions between pixels. Instead of using the nearest texel for each fragment, OpenGL will use the four neighboring texels and interpolate them together using the same type of linear interpolation that we discussed back in How Does a Varying Get Blended at Each Fragment?, on page 66. We call it bilinear because it is done along two dimensions. The following is the same texture as before, magnified using bilinear interpolation:
The texture now looks much smoother than before. There’s still some blockiness present because we’ve expanded the texture so much, but it’s not as apparent as it was with nearest-neighbor filtering.
Mipmapping
While bilinear filtering works well for magnification, it doesn’t work as well for minification beyond a certain size. The more we reduce the size of a texture on the rendering surface, the more texels will get crammed onto each fragment.
Since OpenGL’s bilinear filtering will only use four texels for each fragment, we still lose a lot of detail. This can cause noise and shimmering artifacts with moving objects as different texels get selected with each frame.
To combat these artifacts, we can use mipmapping, a technique that generates an optimized set of textures at different sizes. When generating the set of textures, OpenGL can use all of the texels to generate each level, ensuring that all of the texels will also be used when filtering the texture. At render time, OpenGL will select the most appropriate level for each fragment based on the number of texels per fragment.
Figure 39, Mipmapped Textures, on page 123 is a mipmapped set of textures combined onto a single image for clarity:
With mipmaps, more memory will be used, but the rendering can also be faster because the smaller levels take less space in the GPU’s texture cache.
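The size of a complete mipmap chain is easy to reason about: each level halves the larger dimension until it reaches 1. Here’s a small sketch of that arithmetic (this helper class is an assumption for illustration, not part of the book’s code):

```java
// Hypothetical helper, not from the book: counts the levels in a
// complete mipmap chain. A 512 x 512 texture has levels of size
// 512, 256, 128, 64, 32, 16, 8, 4, 2, and 1 -- ten in total.
public class MipmapMath {
    public static int mipmapLevelCount(int width, int height) {
        int levels = 1;
        int size = Math.max(width, height);
        while (size > 1) {
            size /= 2;  // each level halves the larger dimension
            levels++;
        }
        return levels;
    }
}
```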
To better understand how mipmapping improves the quality of minification, let’s compare and contrast our cute Android, minified to 12.5 percent of the original texel size using bilinear filtering, as shown in Figure 40, Minified with Bilinear Filtering, on page 123.
Figure 39—Mipmapped Textures
Figure 40—Minified with Bilinear Filtering
With this kind of quality, we may as well have stayed with nearest-neighbor filtering. Let’s take a look at what we get when we add mipmaps in Figure 41, Minified with Mipmapping, on page 124.
With mipmaps enabled, OpenGL will select the closest appropriate texture level and then do bilinear interpolation using that optimized texture. Each level was built with information from all of the texels, so the resulting image looks much better, with much more of the detail preserved.
Figure 41—Minified with Mipmapping
Trilinear Filtering
When we use mipmaps with bilinear filtering, we can sometimes see a noticeable jump or line in the rendered scene where OpenGL switches between different mipmap levels. We can switch to trilinear filtering to tell OpenGL to also interpolate between the two closest mipmap levels, using a total of eight texels per fragment. This helps to eliminate the transition between each mipmap level and results in a smoother image.
Setting Default Texture Filtering Parameters
Now that we know about texture filtering, let’s continue loadTexture() and add the following code:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
We set each filter with a call to glTexParameteri(): GL_TEXTURE_MIN_FILTER refers to minification, while GL_TEXTURE_MAG_FILTER refers to magnification. For minification, we select GL_LINEAR_MIPMAP_LINEAR, which tells OpenGL to use trilinear filtering. We set the magnification filter to GL_LINEAR, which tells OpenGL to use bilinear filtering.
Table 4, OpenGL texture filtering modes, on page 125 and Table 5, Allowable texture filtering modes for each case, on page 125 explain the possible options as well as the valid options for minification and magnification.
GL_NEAREST                   Nearest-neighbor filtering
GL_NEAREST_MIPMAP_NEAREST    Nearest-neighbor filtering with mipmaps
GL_NEAREST_MIPMAP_LINEAR     Nearest-neighbor filtering with interpolation
                             between mipmap levels
GL_LINEAR                    Bilinear filtering
GL_LINEAR_MIPMAP_NEAREST     Bilinear filtering with mipmaps
GL_LINEAR_MIPMAP_LINEAR      Trilinear filtering (bilinear filtering with
                             interpolation between mipmap levels)

Table 4—OpenGL texture filtering modes

Minification:   GL_NEAREST, GL_NEAREST_MIPMAP_NEAREST,
                GL_NEAREST_MIPMAP_LINEAR, GL_LINEAR,
                GL_LINEAR_MIPMAP_NEAREST, GL_LINEAR_MIPMAP_LINEAR
Magnification:  GL_NEAREST, GL_LINEAR

Table 5—Allowable texture filtering modes for each case
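For instance, if we wanted a deliberately blocky, pixelated look instead of the smooth filtering we chose for the air hockey table, we could select nearest-neighbor modes for both cases (this alternative configuration is just an illustration, not what our project uses):

```java
// Hypothetical alternative configuration: nearest-neighbor filtering
// for both minification (with mipmaps) and magnification produces a
// sharp, pixelated look.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```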
Loading the Texture into OpenGL and Returning the ID
We can now load the bitmap data into OpenGL with an easy call to GLUtils.texImage2D():
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
texImage2D(GL_TEXTURE_2D, 0, bitmap, 0);
This call tells OpenGL to read in the bitmap data defined by bitmap and copy it over into the texture object that is currently bound.
Now that the data’s been loaded into OpenGL, we no longer need to keep the Android bitmap around. Under normal circumstances, it might take a few garbage collection cycles for Dalvik to release this bitmap data, so we should call recycle() on the bitmap object to release the data immediately:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
bitmap.recycle();
Generating mipmaps is also a cinch. We can tell OpenGL to generate all of the necessary levels with a quick call to glGenerateMipmap():
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
glGenerateMipmap(GL_TEXTURE_2D);
Now that we’ve finished loading the texture, a good practice is to then unbind from the texture so that we don’t accidentally make further changes to this texture with other texture calls:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
glBindTexture(GL_TEXTURE_2D, 0);
Passing 0 to glBindTexture() unbinds from the current texture. The last step is to return the texture object ID:
AirHockeyTextured/src/com/airhockey/android/util/TextureHelper.java
return textureObjectIds[0];
We now have a method that will be able to read in an image file from our resources folder and load the image data into OpenGL. We’ll get back a texture ID that we can use as a reference to this texture or get 0 if the load failed.
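As a sketch of how we might call this later from a renderer’s onSurfaceCreated() (the resource name air_hockey_surface and the field names here are assumptions for illustration):

```java
// Hypothetical usage from a GLSurfaceView.Renderer; the resource name
// is assumed. A return value of 0 indicates the load failed.
int texture = TextureHelper.loadTexture(context, R.drawable.air_hockey_surface);
if (texture == 0) {
    // Handle the failure: fall back to a plain color or log it.
}
```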
7.3 Creating a New Set of Shaders
Before we can draw the texture to the screen, we’ll have to create a new set of shaders that will accept a texture and apply it to the fragments being drawn.
These new shaders will be similar to the ones we’ve been working with until now, with just a couple of slight changes to add support for texturing.
Creating the New Vertex Shader
Create a new file under your project’s /res/raw/ directory, and call it texture_vertex_shader.glsl. Add the following contents:
AirHockeyTextured/res/raw/texture_vertex_shader.glsl
uniform mat4 u_Matrix;

attribute vec4 a_Position;
attribute vec2 a_TextureCoordinates;

varying vec2 v_TextureCoordinates;

void main()
{
    v_TextureCoordinates = a_TextureCoordinates;
    gl_Position = u_Matrix * a_Position;
}
Most of this shader code should look familiar: we’ve defined a uniform for our matrix, and we also have an attribute for our position. We use these to set the final gl_Position. Now for the new stuff: we’ve also added a new attribute for our texture coordinates, called a_TextureCoordinates. It’s defined as a vec2 because there are two components: the S coordinate and the T coordinate. We send these coordinates on to the fragment shader as an interpolated varying called v_TextureCoordinates.
Creating the New Fragment Shader
In the same directory, create a new file called texture_fragment_shader.glsl, and add the following contents:
AirHockeyTextured/res/raw/texture_fragment_shader.glsl
precision mediump float;

uniform sampler2D u_TextureUnit;
varying vec2 v_TextureCoordinates;

void main()
{
    gl_FragColor = texture2D(u_TextureUnit, v_TextureCoordinates);
}
To draw the texture on an object, OpenGL will call the fragment shader for each fragment, and each call will receive the texture coordinates in v_TextureCoordinates. The fragment shader will also receive the actual texture data via the uniform u_TextureUnit, which is defined as a sampler2D. This variable type refers to an array of two-dimensional texture data.
The interpolated texture coordinates and the texture data are passed in to the shader function texture2D(), which will read in the color value for the texture at that particular coordinate. We then set the fragment to that color by assigning the result to gl_FragColor.
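We’ll wire this uniform up from the Java side in the sections that follow; a typical pattern for feeding a texture to u_TextureUnit looks something like this (the textureId and location variables are assumptions here):

```java
// Hypothetical wiring: activate texture unit 0, bind our texture to
// that unit, and tell the sampler uniform to read from unit 0.
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, textureId);
glUniform1i(uTextureUnitLocation, 0);
```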
The next couple of sections will be somewhat more involved: we’re going to create a new set of classes and place our existing code for our table data and shader programs into these classes. We’ll then switch between them at runtime.
7.4 Creating a New Class Structure for Our Vertex Data
We’ll start off by separating our vertex data into separate classes, with one class to represent each type of physical object. We’ll create one class for our table and another for our mallet. We won’t need one for the line, since there’s already a line on our texture.
We’ll also create a separate class to encapsulate the actual vertex array and to reduce code duplication. Our class structure will look as follows:
We’ll create Mallet to manage the mallet data and Table to manage the table data; and each class will have an instance of VertexArray, which will encapsulate the FloatBuffer storing the vertex array.
We’ll start off with VertexArray. Create a new package in your project called com.airhockey.android.data, and in that package, create a new class called VertexArray. Add the following code inside the class:
AirHockeyTextured/src/com/airhockey/android/data/VertexArray.java
private final FloatBuffer floatBuffer;

public VertexArray(float[] vertexData) {
    floatBuffer = ByteBuffer
        .allocateDirect(vertexData.length * BYTES_PER_FLOAT)
        .order(ByteOrder.nativeOrder())
        .asFloatBuffer()
        .put(vertexData);
}

public void setVertexAttribPointer(int dataOffset, int attributeLocation,
    int componentCount, int stride) {
    floatBuffer.position(dataOffset);
    glVertexAttribPointer(attributeLocation, componentCount,
        GL_FLOAT, false, stride, floatBuffer);
    glEnableVertexAttribArray(attributeLocation);

    floatBuffer.position(0);
}
This code contains a FloatBuffer that will be used to store our vertex array data in native code, as explained in Section 2.4, Making the Data Accessible to OpenGL, on page 26. The constructor takes in an array of Java floating-point data and writes it to the buffer.
We’ve also created a generic method to associate an attribute in our shader with the data. This follows the same pattern as we explained in Associating an Array of Vertex Data with an Attribute, on page 49.
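As a sketch of how a class like Table could use this method (the constants, getter names, and vertexArray field here are assumptions; we’ll build the real classes in the following sections), binding interleaved position and texture coordinate data might look like this:

```java
// Hypothetical usage: two position components followed by two texture
// coordinate components per vertex, interleaved in one array.
private static final int POSITION_COMPONENT_COUNT = 2;
private static final int TEXTURE_COORDINATES_COMPONENT_COUNT = 2;
private static final int STRIDE = (POSITION_COMPONENT_COUNT
    + TEXTURE_COORDINATES_COMPONENT_COUNT) * BYTES_PER_FLOAT;

public void bindData(TextureShaderProgram textureProgram) {
    // Positions start at offset 0 within each vertex record.
    vertexArray.setVertexAttribPointer(
        0,
        textureProgram.getPositionAttributeLocation(),
        POSITION_COMPONENT_COUNT,
        STRIDE);
    // Texture coordinates follow the position components.
    vertexArray.setVertexAttribPointer(
        POSITION_COMPONENT_COUNT,
        textureProgram.getTextureCoordinatesAttributeLocation(),
        TEXTURE_COORDINATES_COMPONENT_COUNT,
        STRIDE);
}
```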