Cardboard VR Projects For Android - Sample Chapter
Jonathan Linowes
Matt Schoen
Cardboard VR Projects
for Android
Develop mobile virtual reality apps using the native Google
Cardboard SDK for Android
Matt Schoen is the cofounder of Defective Studios and has been making VR apps
since the early DK1 days. Still in the early stages of his career, he spent most of his
time working on Unity apps and games, some for hire and some of his own design.
He studied computer engineering at Boston University and graduated with a BS in
2010, at which point he founded Defective with Jono Forbes, a high-school friend.
He has been making games and apps ever since. Matt was the technical lead on
Defective's debut game, CosmoKnots, and remains involved in Jono's pet project,
Archean. This is his first foray into authorship, but he brings with him his experience
as an instructor and curriculum designer for Digital Media Academy. Jono and Matt
have recently joined Unity's VR Labs division, where they will be helping to create
experimental new features which will shape the VR landscape for years to come.
Preface
Google Cardboard is a low-cost, entry-level medium used for experiencing virtual
3D environments. Its applications are as broad and varied as mobile smartphone
applications themselves. This book gives you the opportunity to implement a variety
of interesting projects for Google Cardboard using the native Java SDK. The idea is to
educate you with best practices and methodologies to make Cardboard-compatible
mobile VR apps and guide you through making quality content appropriate for the
device and its intended users.
Chapter 6, Solar System, builds a Solar System simulation science project by adding a sunlight source, spherical planets with texture-mapped materials and shaders animated in their solar orbits, and a Milky Way star field.
Chapter 7, 360-Degree Gallery, helps you build a media viewer for regular and 360-degree photos, load the phone's camera folder pictures into a grid of thumbnail images, and use gaze-based selections to choose the ones to view. It also discusses how to add process threading for an improved user experience and how to support Android intents to view images from other apps.
Chapter 8, 3D Model Viewer, helps you build a viewer for 3D models in the OBJ file
format, rendered using our RenderBox library. It also shows you how to interactively
control the view of the model as you move your head.
Chapter 9, Music Visualizer, builds a VR music visualizer that animates based on
waveform and FFT data from the phone's current audio player. We implement a
general architecture used to add new visualizations, including geometric animations
and dynamic texture shaders. Then, we add a trippy trails mode and multiple
concurrent visualizations that transition in and out randomly.
Solar System
When I was 8 years old, for a science project at school, I made a Solar System from
wires, styrofoam balls, and paint. Today, 8-year-olds all around the world will be
able to make virtual Solar Systems in VR, especially if they read this chapter! This
project creates a Cardboard VR app that simulates our Solar System. Well, maybe
not with total scientific accuracy, but good enough for a kid's project and better than
styrofoam balls.
In this chapter, you will create a new Solar System project with the RenderBox
library by performing the following steps:
As we put these together, we will create planets and moons from a sphere. Much
of the code, however, will be in the various materials and shaders for rendering
these bodies.
The source code for this project can be found on the Packt Publishing
website, and on GitHub at https://github.com/cardbookvr/
solarsystem (with each topic as a separate commit).
}
@Override
public void postDraw() {
}
}
While we build this project, we will be creating new classes that could be good
extensions to RenderBox lib. We'll make them regular classes in this project at first.
Then, at the end of the chapter, we'll help you move them into the RenderBox lib
project and rebuild the library:
1. Right-click on the solarsystem folder (com.cardbookvr.solarsystem),
select New | Package, and name it RenderBoxExt.
2. Within RenderBoxExt, create package subfolders named components and
materials.
There's no real technical need to make it a separate package, but this helps organize
our files, as the ones in RenderBoxExt will be moved into our reusable library at the
end of this chapter.
You can add a cube to the scene, temporarily, to help ensure that everything is set up
properly. Add it to the setup method as follows:
public void setup() {
new Transform()
.setLocalPosition(0,0,-7)
.setLocalRotation(45,60,0)
.addComponent(new Cube(true));
}
The constructor calls a helper method, allocateBuffers, which allocates buffers for
vertices, normals, textures, and indexes. Let's declare variables for these at the top of
the class:
public static FloatBuffer vertexBuffer;
public static FloatBuffer normalBuffer;
public static FloatBuffer texCoordBuffer;
public static ShortBuffer indexBuffer;
public static int numIndices;
Note that we've decided to declare the buffers public to afford future flexibility in
creating arbitrary texture materials for objects.
We'll define a sphere with a radius of 1. Its vertices are arranged by 24 longitude
sections (as hours of the day) and 16 latitude sections, providing sufficient resolution for
our purposes. The top and bottom caps are handled separately. This is a long method,
so we'll break it down for you. Here's the first part of the code where we declare and
initialize variables, including the vertices array. Similar to our Material setup methods,
we only need to allocate the Sphere buffers once, and in this case, we use the vertex
buffer variable to keep track of this state. If it is not null, the buffers have already been
allocated. Otherwise, we should continue with the function, which will set this value:
public static void allocateBuffers(){
//Already allocated?
if (vertexBuffer != null) return;
//Generate a sphere model
float radius = 1f;
// Longitude |||
int nbLong = 24;
// Latitude ---
int nbLat = 16;
Vector3[] vertices = new Vector3[(nbLong+1) * nbLat +
nbLong * 2];
float _pi = MathUtils.PI;
float _2pi = MathUtils.PI2;
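As a quick sanity check on the size of that vertices array, here is a minimal plain-Java sketch (the class name is mine, not part of the app) of the arithmetic, using the chapter's nbLong = 24 and nbLat = 16:

```java
public class SphereVertexCount {
    // One ring of (nbLong + 1) vertices per latitude row (the extra
    // column duplicates the texture seam), plus nbLong duplicated
    // vertices at each pole for the UV "teeth"
    public static int vertexCount(int nbLong, int nbLat) {
        return (nbLong + 1) * nbLat + nbLong * 2;
    }

    public static void main(String[] args) {
        // 25 * 16 = 400 ring vertices + 2 * 24 = 48 pole vertices
        System.out.println(vertexCount(24, 16)); // 448
    }
}
```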
Calculate the vertex positions; first, the top and bottom ones and then along the
latitude/longitude spherical grid:
//Top and bottom vertices are duplicated
for(int i = 0; i < nbLong; i++){
vertices[i] = new Vector3(Vector3.up).multiply(radius);
vertices[vertices.length - i - 1] = new Vector3(Vector3.up).multiply(-radius);
}
for( int lat = 0; lat < nbLat; lat++ )
{
float a1 = _pi * (float)(lat+1) / (nbLat+1);
float sin1 = (float)Math.sin(a1);
float cos1 = (float)Math.cos(a1);
for( int lon = 0; lon <= nbLong; lon++ )
{
float a2 = _2pi * (float)(lon == nbLong ? 0 : lon) / nbLong;
float sin2 = (float)Math.sin(a2);
float cos2 = (float)Math.cos(a2);
vertices[lon + lat * (nbLong + 1) + nbLong] = new
Vector3(sin1 * cos2, cos1, sin1 * sin2).multiply(radius);
}
}
Next, we calculate the vertex normals and then the UVs for texture mapping:
Vector3[] normals = new Vector3[vertices.length];
for( int n = 0; n < vertices.length; n++ )
normals[n] = new Vector3(vertices[n]).normalize();
Vector2[] uvs = new Vector2[vertices.length];
float uvStart = 1.0f / (nbLong * 2);
float uvStride = 1.0f / nbLong;
for(int i = 0; i < nbLong; i++) {
uvs[i] = new Vector2(uvStart + i * uvStride, 1f);
uvs[uvs.length - i - 1] = new Vector2(1 - (uvStart + i
* uvStride), 0f);
}
for( int lat = 0; lat < nbLat; lat++ )
for( int lon = 0; lon <= nbLong; lon++ )
uvs[lon + lat * (nbLong + 1) + nbLong] = new
Vector2( (float)lon / nbLong, 1f - (float)(lat+1) / (nbLat+1) );
This next part of the same allocateBuffers method generates the triangular
indices, which connect the vertices:
int nbFaces = (nbLong+1) * nbLat + 2;
int nbTriangles = nbFaces * 2;
int nbIndexes = nbTriangles * 3;
numIndices = nbIndexes;
short[] triangles = new short[ nbIndexes ];
//Top Cap
int i = 0;
for( short lon = 0; lon < nbLong; lon++ )
{
triangles[i++] = lon;
triangles[i++] = (short)(nbLong + lon+1);
triangles[i++] = (short)(nbLong + lon);
}
//Middle
for( short lat = 0; lat < nbLat - 1; lat++ )
{
for( short lon = 0; lon < nbLong; lon++ )
{
short current = (short)(lon + lat * (nbLong + 1) +
nbLong);
short next = (short)(current + nbLong + 1);
triangles[i++] = current;
triangles[i++] = (short)(current + 1);
triangles[i++] = (short)(next + 1);
triangles[i++] = current;
triangles[i++] = (short)(next + 1);
triangles[i++] = next;
}
}
//Bottom Cap
for( short lon = 0; lon < nbLong; lon++ )
{
triangles[i++] = (short)(vertices.length - lon - 1);
triangles[i++] = (short)(vertices.length - nbLong - (lon+1) - 1);
triangles[i++] = (short)(vertices.length - nbLong - (lon) - 1);
}
This is a lot of code, and might be hard to read on the pages of a book; you can find a
copy in the project GitHub repository if you prefer.
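It can also help to tally what the three index loops actually emit against what was allocated; a plain-Java sketch of that bookkeeping (class and method names are mine), again with nbLong = 24 and nbLat = 16:

```java
public class SphereIndexCount {
    // Allocated in allocateBuffers: nbFaces faces, two triangles
    // each, three indices per triangle
    public static int allocatedIndices(int nbLong, int nbLat) {
        int nbFaces = (nbLong + 1) * nbLat + 2;
        return nbFaces * 2 * 3;
    }

    // Emitted by the loops: nbLong triangles per cap, plus two
    // triangles per quad in the (nbLat - 1) x nbLong middle grid
    public static int emittedIndices(int nbLong, int nbLat) {
        int triangles = nbLong + (nbLat - 1) * nbLong * 2 + nbLong;
        return triangles * 3;
    }

    public static void main(String[] args) {
        System.out.println(allocatedIndices(24, 16)); // 2412
        System.out.println(emittedIndices(24, 16));   // 2304
    }
}
```

The allocation is slightly generous; the unused tail of the triangles array stays zeroed, which draws only degenerate (invisible) triangles.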
Conveniently, since the sphere is centered at the origin (0,0,0), the normal vectors at
each vertex correspond to the vertex position itself (radiating from the origin to the
vertex). Strictly speaking, since we used a radius of 1, we can avoid the normalize()
step to generate the array of normals as an optimization. The following image shows
the 24 x 16 vertex sphere with its normal vectors:
Note that our algorithm includes an interesting fix that avoids a single vertex at
the poles (where all the UVs converge at a single point and cause some swirling
texture artifacts).
We create nbLong co-located vertices at each pole, spread across the UV X axis and offset by 1/(nbLong*2), drawing teeth at the top and bottom. The following image shows the flattened UV sheet for the sphere, illustrating the polar teeth:
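Assuming the same nbLong = 24, the offsets work out as below; a small plain-Java sketch (names are mine) of where the top-row teeth land on the UV X axis:

```java
public class PoleUvSketch {
    // U coordinate of the i-th duplicated top vertex: offset by half
    // a stride so each tooth sits midway between two seam columns
    public static float topU(int i, int nbLong) {
        float uvStart = 1.0f / (nbLong * 2);
        float uvStride = 1.0f / nbLong;
        return uvStart + i * uvStride;
    }

    public static void main(String[] args) {
        // First few teeth: 1/48, 3/48, 5/48, ...
        for (int i = 0; i < 3; i++) {
            System.out.printf("top uv[%d] = (%.4f, 1.0)%n", i, topU(i, 24));
        }
    }
}
```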
File: res/raw/solid_color_lighting_vertex.shader
uniform mat4 u_MVP;
uniform mat4 u_MV;
attribute vec4 a_Position;
attribute vec3 a_Normal;
varying vec3 v_Position;
varying vec3 v_Normal;
void main() {
// vertex in eye space
v_Position = vec3(u_MV * a_Position);
// normal's orientation in eye space
v_Normal = vec3(u_MV * vec4(a_Normal, 0.0));
// point in normalized screen coordinates
gl_Position = u_MVP * a_Position;
}
Note that we have separate uniform variables for u_MV and u_MVP. Also, recall that in the previous chapter we separated the lighting model from the actual model matrix because we did not want scale to affect the lighting calculations. Similarly, the projection matrix is only useful for applying the camera FOV to vertex positions and would interfere with lighting calculations.
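To make the composition order concrete, here is a hedged plain-Java sketch with a hand-rolled column-major multiply standing in for android.opengl.Matrix.multiplyMM (the class is mine, not from RenderBox): u_MV = view * model feeds the lighting math, while u_MVP = projection * view * model additionally applies the camera projection and should only affect gl_Position:

```java
public class MatrixComposition {
    // Column-major 4x4 multiply, result = lhs * rhs, mimicking
    // android.opengl.Matrix.multiplyMM
    public static float[] mul(float[] lhs, float[] rhs) {
        float[] r = new float[16];
        for (int c = 0; c < 4; c++)
            for (int row = 0; row < 4; row++)
                for (int k = 0; k < 4; k++)
                    r[c * 4 + row] += lhs[k * 4 + row] * rhs[c * 4 + k];
        return r;
    }

    public static float[] identity() {
        float[] m = new float[16];
        for (int i = 0; i < 4; i++) m[i * 4 + i] = 1f;
        return m;
    }

    public static void main(String[] args) {
        float[] model = identity(), view = identity(), proj = identity();
        float[] mv = mul(view, model);  // what u_MV receives
        float[] mvp = mul(proj, mv);    // what u_MVP receives
        // With identity matrices the two agree; a real projection
        // would change mvp but leave mv (and the lighting) alone
        System.out.println(java.util.Arrays.equals(mv, mvp)); // true
    }
}
```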
File: res/raw/solid_color_lighting_fragment.shader
precision mediump float;
// default medium precision in the fragment shader
uniform vec3 u_LightPos;
// light position in eye space
uniform vec4 u_LightCol;
uniform vec4 u_Color;
varying vec3 v_Position;
varying vec3 v_Normal;
varying vec2 v_TexCoordinate;
void main() {
// distance for attenuation.
float distance = length(u_LightPos - v_Position);
// lighting direction vector from the light to the vertex
vec3 lightVector = normalize(u_LightPos - v_Position);
// dot product of the light vector and vertex normal.
// If the normal and light vector are
// pointing in the same direction then it will get max
// illumination.
float diffuse = max(dot(v_Normal, lightVector), 0.01);
// Add a tiny bit of ambient lighting (this is outerspace)
diffuse = diffuse + 0.025;
// Multiply color by the diffuse illumination level and
// texture value to get final output color
gl_FragColor = u_Color * u_LightCol * diffuse;
}
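The shader's diffuse term is ordinary vector math; a hedged plain-Java port of the intensity calculation (names are mine; the real work happens per-fragment on the GPU):

```java
public class DiffuseSketch {
    static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // Mirrors the fragment shader: clamp N.L to a small floor,
    // then add a hint of ambient light
    public static float diffuse(float[] normal, float[] lightVector) {
        return Math.max(dot(normal, lightVector), 0.01f) + 0.025f;
    }

    public static void main(String[] args) {
        float[] n = {0f, 0f, 1f};
        System.out.println(diffuse(n, new float[]{0f, 0f, 1f}));  // ~1.025, facing the light
        System.out.println(diffuse(n, new float[]{0f, 0f, -1f})); // ~0.035, facing away
    }
}
```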
Add the variables for color, program references, and buffers, as shown in the
following code:
float[] color = new float[4];
static int program = -1;
static int positionParam;
static int colorParam;
static int normalParam;
static int modelParam;
static int MVParam;
static int MVPParam;
static int lightPosParam;
static int lightColParam;
FloatBuffer vertexBuffer;
FloatBuffer normalBuffer;
ShortBuffer indexBuffer;
int numIndices;
Now, we can add a constructor, which receives a color (RGBA) value and sets up the
shader program, as follows:
public SolidColorLightingMaterial(float[] c){
super();
setColor(c);
setupProgram();
}
public void setColor(float[] c){
color = c;
}
As we've seen earlier, the setupProgram method creates the shader program and
obtains references to its parameters:
public static void setupProgram(){
//Already setup?
if (program != -1) return;
//Create shader program
program = createProgram(R.raw.solid_color_lighting_vertex,
R.raw.solid_color_lighting_fragment);
//Get vertex attribute parameters
positionParam = GLES20.glGetAttribLocation(program,
"a_Position");
normalParam = GLES20.glGetAttribLocation(program,
"a_Normal");
//Enable them (turns out this is kind of a big deal ;)
GLES20.glEnableVertexAttribArray(positionParam);
GLES20.glEnableVertexAttribArray(normalParam);
//Shader-specific parameters
colorParam = GLES20.glGetUniformLocation(program,
"u_Color");
MVParam = GLES20.glGetUniformLocation(program, "u_MV");
MVPParam = GLES20.glGetUniformLocation(program, "u_MVP");
lightPosParam = GLES20.glGetUniformLocation(program,
"u_LightPos");
lightColParam = GLES20.glGetUniformLocation(program,
"u_LightCol");
RenderBox.checkGLError("Solid Color Lighting params");
}
Lastly, add the draw code, which will be called from the Camera component, to
render the geometry prepared in the buffers (via setBuffers). The draw method
looks like this:
@Override
public void draw(float[] view, float[] perspective) {
GLES20.glUseProgram(program);
GLES20.glUniform3fv(lightPosParam, 1,
RenderBox.instance.mainLight.lightPosInEyeSpace, 0);
GLES20.glUniform4fv(lightColParam, 1,
RenderBox.instance.mainLight.color, 0);
Matrix.multiplyMM(modelView, 0, view, 0,
RenderObject.lightingModel, 0);
// Set the ModelView in the shader,
// used to calculate lighting
GLES20.glUniformMatrix4fv(MVParam, 1, false,
modelView, 0);
Matrix.multiplyMM(modelView, 0, view, 0,
RenderObject.model, 0);
Matrix.multiplyMM(modelViewProjection, 0, perspective, 0,
modelView, 0);
// Set the ModelViewProjection matrix for eye position.
GLES20.glUniformMatrix4fv(MVPParam, 1, false,
modelViewProjection, 0);
GLES20.glUniform4fv(colorParam, 1, color, 0);
//Set vertex attributes
GLES20.glVertexAttribPointer(positionParam, 3,
GLES20.GL_FLOAT, false, 0, vertexBuffer);
GLES20.glVertexAttribPointer(normalParam, 3,
GLES20.GL_FLOAT, false, 0, normalBuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, numIndices,
GLES20.GL_UNSIGNED_SHORT, indexBuffer);
}
Now that we have a solid color lighting material and shaders, we can add them to
the Sphere class to be used in our project.
Loading a texture file
We now need a function to load the texture into our app. We can add it to
MainActivity. Or, you can add it directly to the RenderObject class of your
RenderBox lib. (It's fine in MainActivity for now, and we'll move it along with our
other extensions to the library at the end of this chapter.) Add the code, as follows:
public static int loadTexture(final int resourceId){
final int[] textureHandle = new int[1];
GLES20.glGenTextures(1, textureHandle, 0);
if (textureHandle[0] != 0)
{
final BitmapFactory.Options options = new
BitmapFactory.Options();
options.inScaled = false;
// No pre-scaling
// Read in the resource
final Bitmap bitmap = BitmapFactory.decodeResource
(RenderBox.instance.mainActivity.getResources(),
resourceId, options);
// Bind to the texture in OpenGL
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D,
textureHandle[0]);
// Set filtering
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
// Load the bitmap into the bound texture.
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0,
bitmap, 0);
// Recycle the bitmap, since its data has been loaded
// into OpenGL.
bitmap.recycle();
}
if (textureHandle[0] == 0)
{
throw new RuntimeException("Error loading texture.");
}
return textureHandle[0];
}
The loadTexture method returns an integer handle that can be used to reference the
loaded texture data.
File: res/raw/diffuse_lighting_vertex.shader
uniform mat4 u_MVP;
uniform mat4 u_MV;
attribute vec4 a_Position;
attribute vec3 a_Normal;
attribute vec2 a_TexCoordinate;
varying vec3 v_Position;
varying vec3 v_Normal;
varying vec2 v_TexCoordinate;
void main() {
// vertex in eye space
v_Position = vec3(u_MV * a_Position);
// pass through the texture coordinate.
v_TexCoordinate = a_TexCoordinate;
// normal's orientation in eye space
v_Normal = vec3(u_MV * vec4(a_Normal, 0.0));
// final point in normalized screen coordinates
gl_Position = u_MVP * a_Position;
}
File: res/raw/diffuse_lighting_fragment.shader
precision highp float;
// default high precision for floating point ranges of the planets
uniform vec3 u_LightPos;
uniform vec4 u_LightCol;
uniform sampler2D u_Texture;
varying vec3 v_Position;
varying vec3 v_Normal;
varying vec2 v_TexCoordinate;
void main() {
// lighting direction vector from the light to the vertex
vec3 lightVector = normalize(u_LightPos - v_Position);
// dot product of the light vector and vertex normal
float diffuse = max(dot(v_Normal, lightVector), 0.01);
// Add a tiny bit of ambient lighting (this is outerspace)
diffuse = diffuse + 0.025;
// Multiply the color by the diffuse illumination level and
// texture value to get final output color
gl_FragColor = texture2D(u_Texture, v_TexCoordinate) *
u_LightCol * diffuse;
}
These shaders add attributes for a light source and use the geometry's vertex normals to calculate shading. You might have noticed that the difference between this and the solid color shader is the use of texture2D, which is a sampler
function. Also, note that we declared u_Texture as sampler2D. This variable type
and function make use of the texture units, which are built into the GPU hardware,
and can be used with UV coordinates to return the color values from a texture image.
There are a fixed number of texture units, depending on graphics hardware. You can
query the number of texture units using OpenGL. A good rule of thumb for mobile
GPUs is to expect eight texture units. This means that any shader may use up to eight
textures simultaneously.
Add the variables for the texture ID, program references, and buffers, as shown in
the following code:
int textureId;
static int program = -1;
//Initialize to a totally invalid value for setup state
static int positionParam;
static int texCoordParam;
static int textureParam;
static int normalParam;
static int MVParam;
static int MVPParam;
static int lightPosParam;
static int lightColParam;
FloatBuffer vertexBuffer;
FloatBuffer texCoordBuffer;
FloatBuffer normalBuffer;
ShortBuffer indexBuffer;
int numIndices;
Now we can add a constructor, which sets up the shader program and loads the
texture for the given resource ID, as follows:
public DiffuseLightingMaterial(int resourceId){
super();
setupProgram();
this.textureId = MainActivity.loadTexture(resourceId);
}
As we've seen earlier, the setupProgram method creates the shader program and
obtains references to its parameters:
public static void setupProgram(){
//Already setup?
if (program != -1) return;
//Create shader program
program = createProgram(R.raw.diffuse_lighting_vertex,
R.raw.diffuse_lighting_fragment);
RenderBox.checkGLError("Diffuse Texture Color Lighting
shader compile");
//Get vertex attribute parameters
positionParam = GLES20.glGetAttribLocation(program,
"a_Position");
normalParam = GLES20.glGetAttribLocation(program,
"a_Normal");
texCoordParam = GLES20.glGetAttribLocation(program,
"a_TexCoordinate");
//Enable them (turns out this is kind of a big deal ;)
GLES20.glEnableVertexAttribArray(positionParam);
GLES20.glEnableVertexAttribArray(normalParam);
GLES20.glEnableVertexAttribArray(texCoordParam);
//Shader-specific parameters
textureParam = GLES20.glGetUniformLocation(program,
"u_Texture");
MVParam = GLES20.glGetUniformLocation(program, "u_MV");
MVPParam = GLES20.glGetUniformLocation(program, "u_MVP");
lightPosParam = GLES20.glGetUniformLocation(program,
"u_LightPos");
lightColParam = GLES20.glGetUniformLocation(program,
"u_LightCol");
RenderBox.checkGLError("Diffuse Texture Color Lighting
params");
}
Lastly, add the draw code, which will be called from the Camera component, to
render the geometry prepared in the buffers (via setBuffers). The draw method
looks like this:
@Override
public void draw(float[] view, float[] perspective) {
GLES20.glUseProgram(program);
// Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
// Tell the texture uniform sampler to use this texture in
// the shader by binding to texture unit 0.
GLES20.glUniform1i(textureParam, 0);
//Technically, we don't need to do this with every draw
//call, but the light could move.
//We could also add a step for shader-global parameters
//which don't vary per-object
GLES20.glUniform3fv(lightPosParam, 1,
RenderBox.instance.mainLight.lightPosInEyeSpace, 0);
GLES20.glUniform4fv(lightColParam, 1,
RenderBox.instance.mainLight.color, 0);
Matrix.multiplyMM(modelView, 0, view, 0,
RenderObject.lightingModel, 0);
// Set the ModelView in the shader, used to calculate
// lighting
GLES20.glUniformMatrix4fv(MVParam, 1, false,
modelView, 0);
Matrix.multiplyMM(modelView, 0, view, 0,
RenderObject.model, 0);
Matrix.multiplyMM(modelViewProjection, 0, perspective, 0,
modelView, 0);
// Set the ModelViewProjection matrix for eye position.
GLES20.glUniformMatrix4fv(MVPParam, 1, false,
modelViewProjection, 0);
//Set vertex attributes
GLES20.glVertexAttribPointer(positionParam, 3,
GLES20.GL_FLOAT, false, 0, vertexBuffer);
GLES20.glVertexAttribPointer(normalParam, 3,
GLES20.GL_FLOAT, false, 0, normalBuffer);
GLES20.glVertexAttribPointer(texCoordParam, 2,
GLES20.GL_FLOAT, false, 0, texCoordBuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, numIndices,
GLES20.GL_UNSIGNED_SHORT, indexBuffer);
RenderBox.checkGLError("Diffuse Texture Color Lighting
draw");
}
}
That looks really cool. Oops, it's upside down! Although there's not really a specific
up versus down in outer space, our Earth looks upside down from what we're used
to seeing. Let's flip it in the setup method so that it starts at the correct orientation,
and while we're at it, let's take advantage of the fact that the Transform methods
return themselves, so we can chain the calls, as follows:
public void setup() {
sphere = new Transform()
.setLocalPosition(2.0f, -2.f, -2.0f)
.rotate(0, 0, 180f)
.addComponent(new Sphere(R.drawable.earth_tex));
}
Naturally, the Earth is supposed to spin. Let's animate it to rotate it like we'd expect
the Earth to do. Add this to the preDraw method, which gets called before each new
frame. It uses the Time class's getDeltaTime method, which returns the current
fraction of a second change since the previous frame. If we want it to rotate,
say, -10 degrees per second, we use -10 * deltaTime:
public void preDraw() {
float dt = Time.getDeltaTime();
sphere.rotate( 0, -10f * dt, 0);
}
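To see why scaling by deltaTime matters, consider the per-frame step at different frame rates; a small plain-Java sketch (assumed frame times, names mine):

```java
public class RotationRateSketch {
    // Degrees to rotate this frame, given a rate in deg/sec and the
    // frame's delta time in seconds
    public static float stepDegrees(float degPerSec, float deltaTime) {
        return degPerSec * deltaTime;
    }

    public static void main(String[] args) {
        // At 60 FPS the step is small; at 30 FPS it doubles, so the
        // rotation speed on screen stays the same either way
        System.out.println(stepDegrees(-10f, 1f / 60f)); // ~ -0.167 deg/frame
        System.out.println(stepDegrees(-10f, 1f / 30f)); // ~ -0.333 deg/frame
        // Either way, a full revolution at -10 deg/sec takes 36 seconds
    }
}
```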
Day/night shader
To support this, we will create a new DayNightMaterial class that takes both
versions of the Earth texture. The material will also incorporate the corresponding
fragment shader that takes into consideration the normal vector of the surface
relative to the light source direction (using dot products, if you're familiar with
vector math) to decide whether to render using the day or night texture image.
In your res/raw/ folder, create files for day_night_vertex.shader and
day_night_fragment.shader, and then define them, as follows.
File: day_night_vertex.shader
uniform mat4 u_MVP;
uniform mat4 u_MV;
attribute vec4 a_Position;
attribute vec3 a_Normal;
attribute vec2 a_TexCoordinate;
varying vec3 v_Position;
varying vec3 v_Normal;
varying vec2 v_TexCoordinate;
void main() {
// vertex to eye space
v_Position = vec3(u_MV * a_Position);
// pass through the texture coordinate
v_TexCoordinate = a_TexCoordinate;
// normal's orientation in eye space
v_Normal = vec3(u_MV * vec4(a_Normal, 0.0));
// final point in normalized screen coordinates
gl_Position = u_MVP * a_Position;
}
Except for the addition of v_TexCoordinate, this is exactly the same as our
SolidColorLighting shader.
File: day_night_fragment.shader
precision highp float;
// default high precision for floating point ranges of the
// planets
uniform vec3 u_LightPos;
// light position in eye space
uniform vec4 u_LightCol;
uniform sampler2D u_Texture; // the day texture.
uniform sampler2D u_NightTexture;
// the night texture.
varying vec3 v_Position;
varying vec3 v_Normal;
varying vec2 v_TexCoordinate;
void main() {
// lighting direction vector from the light to the vertex
vec3 lightVector = normalize(u_LightPos - v_Position);
// dot product of the light vector and vertex normal. If the
// normal and light vector are
// pointing in the same direction then it will get max
// illumination.
float ambient = 0.3;
float dotProd = dot(v_Normal, lightVector);
float blend = min(1.0, dotProd * 2.0);
if(dotProd < 0.0){
//flat ambient level of 0.3
gl_FragColor = texture2D(u_NightTexture, v_TexCoordinate)
* ambient;
} else {
gl_FragColor = (
texture2D(u_Texture, v_TexCoordinate) * blend
+ texture2D(u_NightTexture, v_TexCoordinate) * (1.0 - blend)
) * u_LightCol * min(max(dotProd * 2.0, ambient), 1.0);
}
}
As always, for lighting, we calculate the dot product (dotProd) of the vertex normal
vector and the light direction vector. When that value is negative, the vertex is
facing away from the light source (the Sun), so we'll render using the night texture.
Otherwise, we'll render using the regular daytime earth texture.
The lighting calculations also include a blend value. This is basically a way of
squeezing the transitional zone closer around the terminator when calculating the
gl_FragColor variable. We are multiplying the dot product by 2.0 so that it follows
a steeper slope, but still clamping the blend value between 0 and 1. It's a little
complicated, but once you think about the math, it should make some sense.
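Here is a hedged plain-Java port of just the blend math (the class is mine) to see the steeper slope around the terminator:

```java
public class DayNightBlendSketch {
    // Mirrors day_night_fragment.shader: double the dot product and
    // clamp to 1.0, so the day texture fully wins by dotProd = 0.5
    public static float blend(float dotProd) {
        return Math.min(1.0f, dotProd * 2.0f);
    }

    public static void main(String[] args) {
        System.out.println(blend(0.25f)); // 0.5 -> half day, half night
        System.out.println(blend(0.5f));  // 1.0 -> pure day texture
        System.out.println(blend(0.9f));  // 1.0 -> still pure day
    }
}
```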
We are using two textures to draw the same surface. While this might seem
unique to this day/night situation, it is actually a very common method known as
multitexturing. You may not believe it, but 3D graphics actually got quite far before
introducing the ability to use more than one texture at a time. These days, you see
multitexturing almost everywhere, enabling techniques such as normal mapping,
decal textures, and displacement/parallax shaders, which create greater detail with
simpler meshes.
Now, create a new DayNightMaterial class, modeled on DiffuseLightingMaterial:
As with our other materials, declare the variables we'll need, including the texture ID
for both the day and night:
int textureId;
int nightTextureId;
static int program = -1;
//Initialize to a totally invalid value for setup state
static int positionParam;
static int texCoordParam;
static int textureParam;
static int nightTextureParam;
static int normalParam;
static int MVParam;
static int MVPParam;
static int lightPosParam;
static int lightColParam;
FloatBuffer vertexBuffer;
FloatBuffer texCoordBuffer;
FloatBuffer normalBuffer;
ShortBuffer indexBuffer;
int numIndices;
Define the constructor that takes both the resource IDs and the setupProgram
helper method:
public DayNightMaterial(int resourceId, int nightResourceId){
super();
setupProgram();
this.textureId = MainActivity.loadTexture(resourceId);
this.nightTextureId = MainActivity.
loadTexture(nightResourceId);
}
public static void setupProgram(){
if(program != -1) return;
//Create shader program
program = createProgram(R.raw.day_night_vertex,
R.raw.day_night_fragment);
//Get vertex attribute parameters
positionParam = GLES20.glGetAttribLocation(program,
"a_Position");
normalParam = GLES20.glGetAttribLocation(program,
"a_Normal");
texCoordParam = GLES20.glGetAttribLocation(program,
"a_TexCoordinate");
//Enable them (turns out this is kind of a big deal ;)
GLES20.glEnableVertexAttribArray(positionParam);
GLES20.glEnableVertexAttribArray(normalParam);
GLES20.glEnableVertexAttribArray(texCoordParam);
//Shader-specific parameters
textureParam = GLES20.glGetUniformLocation(program,
"u_Texture");
nightTextureParam = GLES20.glGetUniformLocation(program,
"u_NightTexture");
MVParam = GLES20.glGetUniformLocation(program, "u_MV");
MVPParam = GLES20.glGetUniformLocation(program, "u_MVP");
lightPosParam = GLES20.glGetUniformLocation(program,
"u_LightPos");
lightColParam = GLES20.glGetUniformLocation(program,
"u_LightCol");
RenderBox.checkGLError("Day/Night params");
}
public void setBuffers(FloatBuffer vertexBuffer, FloatBuffer
normalBuffer, FloatBuffer texCoordBuffer, ShortBuffer
indexBuffer, int numIndices){
//Associate VBO data with this instance of the material
this.vertexBuffer = vertexBuffer;
this.normalBuffer = normalBuffer;
this.texCoordBuffer = texCoordBuffer;
this.indexBuffer = indexBuffer;
this.numIndices = numIndices;
}
Lastly, the draw method that cranks it all out to the screen:
@Override
public void draw(float[] view, float[] perspective) {
GLES20.glUseProgram(program);
// Set the active texture unit to texture unit 0.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
// Bind the texture to this unit.
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D,
nightTextureId);
// Tell the texture uniform sampler to use this texture in
// the shader by binding to texture unit 0.
GLES20.glUniform1i(textureParam, 0);
GLES20.glUniform1i(nightTextureParam, 1);
//Technically, we don't need to do this with every draw
//call, but the light could move.
//We could also add a step for shader-global parameters
//which don't vary per-object
GLES20.glUniform3fv(lightPosParam, 1,
RenderBox.instance.mainLight.lightPosInEyeSpace, 0);
GLES20.glUniform4fv(lightColParam, 1,
RenderBox.instance.mainLight.color, 0);
Matrix.multiplyMM(modelView, 0, view, 0,
RenderObject.lightingModel, 0);
// Set the ModelView in the shader, used to calculate
// lighting
GLES20.glUniformMatrix4fv(MVParam, 1, false,
modelView, 0);
Matrix.multiplyMM(modelView, 0, view, 0,
RenderObject.model, 0);
Matrix.multiplyMM(modelViewProjection, 0, perspective, 0,
modelView, 0);
// Set the ModelViewProjection matrix for eye position.
GLES20.glUniformMatrix4fv(MVPParam, 1, false,
modelViewProjection, 0);
//Set vertex attributes
GLES20.glVertexAttribPointer(positionParam, 3,
GLES20.GL_FLOAT, false, 0, vertexBuffer);
GLES20.glVertexAttribPointer(normalParam, 3,
GLES20.GL_FLOAT, false, 0, normalBuffer);
GLES20.glVertexAttribPointer(texCoordParam, 2,
GLES20.GL_FLOAT, false, 0, texCoordBuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, numIndices,
GLES20.GL_UNSIGNED_SHORT, indexBuffer);
RenderBox.checkGLError("DayNight Texture Color Lighting
draw");
}
}
Let's call it from the setup method of MainActivity, and replace the call with the
new Sphere instance passing both the textures' resource IDs:
.addComponent(new Sphere(R.drawable.earth_tex,
R.drawable.earth_night_tex));
Run it now. That looks really cool! Classy! Unfortunately, it doesn't make a lot of
sense to paste a screenshot here because the city night lights won't show very well.
You'll just have to see it for yourself in your own Cardboard viewer. Believe me
when I tell you, it's worth it!
Next, here comes the Sun, and I say, it's alright...
File: unlit_tex_fragment.shader
precision mediump float;
uniform sampler2D u_Texture;
varying vec2 v_TexCoordinate;
void main() {
// Send the color from the texture straight out
gl_FragColor = texture2D(u_Texture, v_TexCoordinate);
}
texCoordParam = GLES20.glGetAttribLocation(program,
"a_TexCoordinate");
//Enable them (turns out this is kind of a big deal ;)
GLES20.glEnableVertexAttribArray(positionParam);
GLES20.glEnableVertexAttribArray(texCoordParam);
//Shader-specific parameters
textureParam = GLES20.glGetUniformLocation(program,
"u_Texture");
MVPParam = GLES20.glGetUniformLocation(program, "u_MVP");
RenderBox.checkGLError("Unlit Texture params");
}
public void setBuffers(FloatBuffer vertexBuffer, FloatBuffer
texCoordBuffer, ShortBuffer indexBuffer, int numIndices){
//Associate VBO data with this instance of the material
this.vertexBuffer = vertexBuffer;
this.texCoordBuffer = texCoordBuffer;
this.indexBuffer = indexBuffer;
this.numIndices = numIndices;
}
It will be handy to have getter and setter methods for the texture ID (used in later projects, though not here):
public void setTexture(int textureHandle){
textureId = textureHandle;
}
public int getTexture(){
return textureId;
}
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
// Tell the texture uniform sampler to use this texture in
// the shader by binding to texture unit 0.
GLES20.glUniform1i(textureParam, 0);
Matrix.multiplyMM(modelView, 0, view, 0,
RenderObject.model, 0);
Matrix.multiplyMM(modelViewProjection, 0, perspective, 0,
modelView, 0);
// Set the ModelViewProjection matrix in the shader.
GLES20.glUniformMatrix4fv(MVPParam, 1, false,
modelViewProjection, 0);
// Set the vertex attributes
GLES20.glVertexAttribPointer(positionParam, 3,
GLES20.GL_FLOAT, false, 0, vertexBuffer);
GLES20.glVertexAttribPointer(texCoordParam, 2,
GLES20.GL_FLOAT, false, 0, texCoordBuffer);
GLES20.glDrawElements(GLES20.GL_TRIANGLES, numIndices,
GLES20.GL_UNSIGNED_SHORT, indexBuffer);
RenderBox.checkGLError("Unlit Texture draw");
}
}
UnlitTexMaterial mat = new UnlitTexMaterial(textureId);
mat.setBuffers(vertexBuffer, texCoordBuffer, indexBuffer,
numIndices);
material = mat;
return this;
}
Notice that, given the way we've defined the constructors, you can call either new
Sphere(texId) or new Sphere(texId, true) to get lighted renders, but for unlit
ones you must use the two-argument form, Sphere(texId, false). Also note that
setting up the whole component in the constructor is not the only way to go; we
only do it this way because it keeps our MainActivity code concise. In fact, as we
start expanding our use of RenderBox and its shader library, it will become
necessary to put most of this code into our MainActivity class, since it would be
impossible to create a constructor for every type of material. Ultimately, a
materials system is necessary to allow you to create and set materials without
having to create a new class for each one.
We start by defining an origin transform that will be the center of the Solar System.
Then, we create the Sun, parented to the origin, with the given scale. Then, add a
new sphere component with the Sun texture. We've also given our light a slightly
yellowish color, which will blend with the Earth's texture colors.
Here's what the rendered Sun looks like, which seems to illuminate the Earth:
- distance will be its distance from the Sun, measured in millions of kilometers.
- rotation is the rate at which the planet rotates about its own axis (one of its days).
- orbit is the rate at which the planet revolves around the Sun (one of its years). We will assume a perfectly circular orbit.
- origin is the center of its orbit. For planets, this will be the Sun's transform. For a moon, this will be the moon's planet.
The Solar System is a really big thing. The distances and radii are measured in
millions of kilometers. The planets are really far apart and relatively small compared
to the size of their orbits. The rotation and orbit values are relative rates. You'll note
that we'll normalize them to 10 seconds per Earth day.
From these attributes, a planet maintains two transforms: one transform for the
planet itself and another transform that describes its location in orbit. In this way, we
can rotate each planet's separate parent transform which, when the planet is at a local
position whose magnitude is equal to the orbital radius, causes the planet to move in
a circular pattern. Then we can rotate the planet itself using its transform.
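The parent-rotation trick is easy to see in isolation. Here is a standalone sketch (plain Java, with none of the RenderBox types; the Y-axis rotation math mirrors the standard OpenGL convention that a Transform would apply internally, and the class and method names are ours, not the book's): rotating the parent while the child sits at a fixed local offset sweeps the child along a circle.

```java
public class OrbitDemo {
    // Rotate the point (x, z) about the Y axis by `degrees`
    // (right-handed: +90 degrees takes +X toward -Z).
    static float[] rotateY(float x, float z, float degrees) {
        double r = Math.toRadians(degrees);
        float wx = (float) (x * Math.cos(r) + z * Math.sin(r));
        float wz = (float) (-x * Math.sin(r) + z * Math.cos(r));
        return new float[] { wx, wz };
    }

    public static void main(String[] args) {
        float distance = 10f; // the child's fixed local offset from the orbit center
        // As the parent (orbitTransform) rotates, the child's world
        // position sweeps a circle of radius `distance`.
        for (float angle = 0; angle < 360; angle += 90) {
            float[] p = rotateY(distance, 0, angle);
            double radius = Math.sqrt(p[0] * p[0] + p[1] * p[1]);
            System.out.printf("angle=%3.0f pos=(%.2f, %.2f) radius=%.2f%n",
                    angle, p[0], p[1], radius);
        }
    }
}
```

The radius stays constant at every angle, which is exactly why a child at local position (distance, 0, 0) under a spinning parent traces a circular orbit.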
For the Moon, we'll also use the Planet class (yeah, I know, maybe we should have
named it HeavenlyBody?) but set its origin as the Earth. We won't give the moon
any rotation of its own (its rotation parameter will be zero).
In your app (for example, app/java/com/cardbookvr/solarsystem/), create a
Java class and name it Planet. Add variables for its attributes (distance, radius,
rotation, orbit, orbitTransform, and transform), as follows:
public class Planet {
protected float rotation, orbit;
protected Transform orbitTransform, transform;
public float distance, radius;
Define a constructor that takes the planet's attribute values, initializes the variables,
and calculates the initial transforms:
public Planet(float distance, float radius, float rotation,
float orbit, int texId, Transform origin){
setupPlanet(distance, radius, rotation, orbit, origin);
transform.addComponent(new Sphere(texId));
}
public void setupPlanet(float distance, float radius, float
rotation, float orbit, Transform origin){
this.distance = distance;
this.radius = radius;
this.rotation = rotation;
this.orbit = orbit;
this.orbitTransform = new Transform();
this.orbitTransform.setParent(origin, false);
transform = new Transform()
.setParent(orbitTransform, false)
.setLocalPosition(distance, 0, 0)
.setLocalRotation(180, 0, 0)
.setLocalScale(radius, radius, radius);
}
The constructor generates an initial transform for the planet and adds a Sphere
component with the given texture.
On each new frame, we will update the orbitTransform rotation around the Sun
(year) and the planet's rotation about its own axis (day):
public void preDraw(float dt){
orbitTransform.rotate(0, dt * orbit, 0);
transform.rotate(0, dt * -rotation, 0);
}
We can also provide a couple of accessor methods for the Planet class's transforms:
public Transform getTransform() { return transform; }
public Transform getOrbitransform() { return orbitTransform; }
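To check that the pacing works out, here is a toy sketch (plain Java; the 10-seconds-per-Earth-day target comes from the text, while the 60 fps frame time, the assumption that dt is the frame time in seconds, and the helper names are ours) that accumulates per-frame rotation the way preDraw does:

```java
public class DayRotationDemo {
    // Accumulate dt * degPerSec per frame, mirroring what
    // transform.rotate(0, dt * -rotation, 0) does each frame.
    static float simulate(float degPerSec, float dt, int frames) {
        float angle = 0f;
        for (int i = 0; i < frames; i++) {
            angle += dt * degPerSec;
        }
        return angle;
    }

    public static void main(String[] args) {
        float degPerSec = 360f / 10f; // one Earth day animated every 10 seconds
        float dt = 1f / 60f;          // assumed 60 fps frame time, in seconds
        // 600 frames = 10 seconds of animation = one full day of rotation
        System.out.println(simulate(degPerSec, dt, 600));
    }
}
```

After 10 seconds' worth of frames the accumulated angle is a full 360 degrees (give or take float rounding), one complete Earth day.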
Planet         Distance from Sun   Radius (km)   Day length      Year length
               (millions of km)                  (Earth hours)   (Earth years)
Mercury        57.9                2440          1408.8          0.24
Venus          108.2               6052          5832            0.615
Earth          147.1               6371          24              1.0
Earth's Moon                       1737
Mars           227.9               3390          24.6            1.88
Jupiter        778.3               69911         9.84            11.862
Saturn         1427.0              58232         10.2            29.456
Uranus         2871                25362         17.9            84.07
Neptune        4497                24622         19.1            164.81
Pluto          5913                1186          6.39            247.7
We also have texture images for each of the planets. These files are included with the
downloads for this book. They should be added to the res/drawable folder, named
mercury_tex.png, venus_tex.png, and so on. The following table identifies the
sources we have used and where you can find them as well:
Planet         Texture
Mercury        http://laps.noaa.gov/albers/sos/mercury/mercury/mercury_rgb_cyl_www.jpg
Venus          http://csdrive.srru.ac.th/55122420119/texture/venus.jpg
Earth          http://www.solarsystemscope.com/nexus/content/tc-earth_texture/tc-earth_daymap.jpg
               Night: http://www.solarsystemscope.com/nexus/content/tc-earth_texture/tc-earth_nightmap.jpg
Earth's Moon   https://farm1.staticflickr.com/120/263411684_ea405ffa8f_o_d.jpg
Mars           http://lh5.ggpht.com/-2aLH6cYiaKs/TdOsBtnpRqI/AAAAAAAAAP4/bnMOdD9OMjk/s9000/mars%2Btexture.jpg
Jupiter        http://laps.noaa.gov/albers/sos/jupiter/jupiter/jupiter_rgb_cyl_www.jpg
Saturn         http://www.solarsystemscope.com/nexus/content/planet_textures/texture_saturn.jpg
Uranus         http://www.astrosurf.com/nunes/render/maps/full/uranus.jpg
Neptune        http://www.solarsystemscope.com/nexus/content/planet_textures/texture_neptune.jpg
Pluto          http://www.shatters.net/celestia/files/pluto.jpg
Sun            http://www.solarsystemscope.com/nexus/textures/texture_pack/assets/preview_sun.jpg
Milky Way
The setupPlanets method uses our celestial data and builds new planets
accordingly. First, let's define the physical data, as follows:
public void setupPlanets(Transform origin) {
    float[] distances = new float[] { 57.9f, 108.2f, 149.6f,
        227.9f, 778.3f, 1427.0f, 2871.0f, 4497f, 5913f };
    float[] fudged_distances = new float[] { 57.9f, 108.2f,
        149.6f, 227.9f, 400f, 500f, 600f, 700f, 800f };
    float[] radii = new float[] { 2440f, 6052f, 6371f, 3390f,
        69911f, 58232f, 25362f, 24622f, 1186f };
    float[] rotations = new float[] { 1408.8f * 0.05f, 5832f * 0.01f,
        24f, 24.6f, 9.84f, 10.2f, 17.9f, 19.1f, 6.39f };
    float[] orbits = new float[] { 0.24f, 0.615f, 1.0f, 1.88f,
        11.862f, 29.456f, 84.07f, 164.81f, 247.7f };
The distances array has the distance of each planet from the Sun in millions of
km. This is really huge, especially for the outer planets that are really far away and
are not very visible relative to other planets. To make things more interesting, we'll
fudge the distance of those planets (Jupiter through Pluto), so the values that we'll
use are in the fudged_distances array.
The radii array has the actual size of each planet in kms.
The rotations array has the day length, in Earth hours. Since our animation rate
scales with this value, Mercury and Venus would spin really fast compared to the
Earth, so we'll artificially slow them down by arbitrary scale factors.
The orbits array has the length of each planet's year in Earth years, that is, the
time it takes for one complete revolution around the Sun.
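As a quick sanity check of what our pacing implies (pure arithmetic; it ignores the extra orbit speed-up factors applied later, and the helper name is ours), a planet's year length translates into animation time as follows:

```java
public class OrbitTimeDemo {
    // Seconds of animation per full orbit, before any extra speed-up,
    // given the book's pacing of 10 seconds per Earth day.
    static double secondsPerOrbit(double years) {
        double daysPerYear = 365.25;
        double secondsPerDay = 10.0;
        return years * daysPerYear * secondsPerDay;
    }

    public static void main(String[] args) {
        System.out.println(secondsPerOrbit(1.0));   // Earth: ~3652.5 s, about an hour
        System.out.println(secondsPerOrbit(247.7)); // Pluto: over 10 days of watching
    }
}
```

At the base rate, watching Earth complete one orbit would take about an hour, and Pluto well over a week, which is why the orbit animation gets its own speed-up factor.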
Now, let's set up the texture IDs for each planet's materials:
int[] texIds = new int[]{
R.drawable.mercury_tex,
R.drawable.venus_tex,
R.drawable.earth_tex,
R.drawable.mars_tex,
R.drawable.jupiter_tex,
R.drawable.saturn_tex,
R.drawable.uranus_tex,
R.drawable.neptune_tex,
R.drawable.pluto_tex
};
Now initialize the planets array, creating a new Planet object for each:
planets = new Planet[distances.length + 1];
for(int i = 0; i < distances.length; i++){
planets[i] = new Planet(
fudged_distances[i] * DISTANCE_FACTOR,
radii[i] * SCALE_FACTOR,
rotations[i] * DEG_PER_EHOUR,
orbits[i] * DEG_PER_EYEAR *
fudged_distances[i]/distances[i],
texIds[i],
origin);
}
While we fudged some of the planets' actual distances so that they'd sit closer to the
inner Solar System, we also multiply all the distances by a DISTANCE_FACTOR scalar,
mostly to avoid blowing up our float precision calculations. We scale all the planet
sizes by a different SCALE_FACTOR variable to make them larger than life (a factor of
0.0001 is actually a factor of 100, because radii are measured in km while distances
are measured in millions of km).
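That parenthetical is easy to verify numerically. The sketch below (plain Java; the 0.0001 scale factor and Earth's figures are taken from the text, and the class and method names are ours) compares the scene's radius-to-distance proportion with the true one:

```java
public class ScaleCheck {
    // Ratio of the scene's radius-to-distance proportion to the true one.
    static double magnification(double radiusKm, double distanceMkm,
                                double scaleFactor) {
        // True ratio, in consistent units (km / km):
        double trueRatio = radiusKm / (distanceMkm * 1e6);
        // Scene ratio: scaled radius (km units) vs distance (millions-of-km units):
        double sceneRatio = radiusKm * scaleFactor / distanceMkm;
        return sceneRatio / trueRatio;
    }

    public static void main(String[] args) {
        // Earth: radius 6371 km, distance 149.6 million km
        // Result is ~100: radii in km scaled by 1e-4 end up 100x larger,
        // relative to distances stored in millions of km
        System.out.println(magnification(6371, 149.6, 0.0001));
    }
}
```

The units mismatch (km vs millions of km, a factor of 1e6) times the 1e-4 scale factor is exactly the 100x inflation the text describes.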
The rotation animation rate is the actual day length of the planet scaled by how
fast we want to animate a day in VR. We default to 10 seconds per Earth day.
Lastly, the planetary orbit animation has its own scale factor; we've sped it up by
about 2x. We also scale the orbit rate by the distance fudge factor (for example,
Pluto orbits the Sun once every 247 Earth years, but since we've moved it a lot
closer, it needs to slow down).
Then, we add the Earth's moon. We've used some artistic license here as well,
adjusting the distance and radius and speeding up its orbit rate to make it
compelling to watch in VR:
// Create the moon
planets[distances.length] = new Planet(7.5f, 0.5f, 0, 0.516f,
    R.drawable.moon_tex, planets[2].getTransform());
}
Let's take a look at one more method: goToPlanet. It'll be convenient to position the
Camera near a specific planet. Since the planets are located at data-driven positions
and will be moving in orbit, it's best to make the camera a child of the planet's
transform. This is one of the reasons why we separated out the orbiting transform
from the planet's transform. We don't want the camera to spin around with the
planet; you might get sick! Here's the implementation:
void goToPlanet(int index){
RenderBox.mainCamera.getTransform().
setParent( planets[index].getOrbitransform(), false);
RenderBox.mainCamera.getTransform().
setLocalPosition( planets[index].distance,
planets[index].radius * 1.5f, planets[index].radius * 2f);
}
setupPlanets(origin);
// Start looking at Earth
goToPlanet(2);
}
If you build and run the project now, you will see the Earth, the Sun, and maybe
some of the planets. But not until they're moving in their orbits will they come to life.
Run! Oh, wow! I feel like a god. Well, not exactly, because it's dark outside.
We need stars!
.setLocalScale(Camera.Z_FAR * 0.99f, Camera.Z_FAR
* 0.99f, Camera.Z_FAR * 0.99f)
.addComponent(new Sphere(R.drawable.milky_way_tex,
false));
You might be wondering what that 0.99 factor is all about. Different GPUs deal with
floating point numbers differently. While some might render a vertex at the draw
distance one way, others might exhibit render glitches when the geometry is "on the
edge" due to floating point precision. In this case, we just pull the skybox toward
the camera by an arbitrarily small factor. It is especially important in VR that the
skybox be as far away as possible, so that it is not drawn with parallax. The fact that
the skybox is in the same exact place for the left and right eye is what tricks your
brain into thinking that it's infinitely far away. You may find that you need to tweak
this factor to avoid holes in the skybox.
This requires that we add a new constructor to the Planet class that omits texId,
since the Earth constructor itself creates the new Sphere component, this time
with two textures, texId and nightTexId.
In Planet.java, add the following code:
public Planet(float distance, float radius, float rotation,
float orbit, Transform origin){
setupPlanet(distance, radius, rotation, orbit, origin);
}
Now, in MainActivity, let's create an Earth separately from the other planets.
In setupPlanets, modify the loop to handle this case:
for(int i = 0; i < distances.length; i++){
if (i == 2) {
planets[i] = new Earth(
fudged_distances[i] * DISTANCE_FACTOR,
radii[i] * SCALE_FACTOR,
rotations[i] * DEG_PER_EHOUR,
orbits[i] * DEG_PER_EYEAR *
fudged_distances[i] / distances[i],
texIds[i],
R.drawable.earth_night_tex,
origin);
} else {
planets[i] = new Planet(
    fudged_distances[i] * DISTANCE_FACTOR,
    radii[i] * SCALE_FACTOR,
    rotations[i] * DEG_PER_EHOUR,
    orbits[i] * DEG_PER_EYEAR *
        fudged_distances[i] / distances[i],
    texIds[i],
    origin);
}
}
Now, the Earth's rotation on each frame is against this wobble transform, so give
Earth its own preDraw method, as follows:
public void preDraw(float dt){
orbitTransform.rotate(0, dt * orbit, 0);
wobble.rotate(0, dt * 5, 0);
transform.rotate(0, dt * -rotation, 0);
}
Fortunately, we already have a goToPlanet method that we used to set our initial
view from the Earth. Because MainActivity extends CardboardActivity, we can use
the Cardboard SDK's onCardboardTrigger method (refer to http://developers.google.com/cardboard/android/latest/reference/com/google/vrtoolkit/cardboard/CardboardActivity.html#onCardboardTrigger()).
The app will start with the camera near the Earth (index 2). When the user presses
the cardboard trigger (or touches the screen), it'll go to Mars (3). Then, Jupiter, and so
on, and then cycle back to Mercury (0).
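The handler's body isn't shown on this page, but the cycling it describes can be sketched as follows (the currPlanet field, the NUM_PLANETS constant, and skipping the moon stored at the end of the planets array are assumptions of this sketch, not the book's exact code):

```java
public class PlanetCycler {
    static final int NUM_PLANETS = 9; // planets[0..8]; planets[9] is the moon

    // Advance Mercury (0) through Pluto (8), then wrap back to Mercury.
    static int nextPlanet(int current) {
        return (current + 1) % NUM_PLANETS;
    }

    /* Wired up in MainActivity, it would look roughly like:
       @Override
       public void onCardboardTrigger() {
           currPlanet = nextPlanet(currPlanet);
           goToPlanet(currPlanet);
       }
    */
}
```

The modulo keeps the index inside the planet range, so repeated trigger presses tour the Solar System indefinitely.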
Possible enhancements
Can you think of other enhancements to this project? Here are a few you could
consider and try to implement:
Add a top-down view option, for a "traditional" picture of the Solar System.
(Be aware of float precision issues at scale.)
Add moons to each of the other planets. (This can be implemented just like
we did for the Earth's moon, with its mother planet as its origin.)
Add tilt and wobble to the other planets. Did you know that Uranus spins
on its side?
Add text labels to each planet that use the planet's transform but always face
the camera. In lieu of 3D text objects, the labels could be prepared images.
Lastly, the Solar System project can now use the new .aar library. Copy the
renderbox[-debug].aar file from the RenderBoxLib project's renderbox/build/
output folder into the SolarSystem renderbox/ folder, replacing the older version
of the same file with the newly built one. Build and run the Solar System project with
this version of the library.
Summary
Congratulations! You received an "A" on your Solar System science project!
In this chapter, we built a Solar System simulation that can be viewed in virtual
reality using a Cardboard VR viewer and an Android phone. This project uses and
expands the RenderBox library, as discussed in Chapter 5, RenderBox Engine.
To begin, we added a Sphere component to our repertoire. Initially, it was rendered
using a solid color lighting material. Then, we defined a diffuse lighting material
and rendered the sphere with an Earth image texture, resulting in a rendered globe.
Next, we enhanced the material to accept two textures, adding an additional one to
the back/"night" side of the sphere. And lastly, we created an unlit texture material,
which is used for the Sun. Armed with actual sizes of the planets and distances from
the Sun, we configured a Solar System scene with nine planets, the Earth's moon, and
the Sun. We added a star field as a sky dome, and we animated the heavenly bodies
for their appropriate rotation (day) and orbit (year). We also implemented some
interaction, responding to Cardboard trigger events by moving the camera view
from planet to planet.
In the next chapter, we'll get to use our sphere again, this time, to view your library
of 360-degree photos.