OpenGL Programming
Adapted from SIGGRAPH 2012 slides by
Ed Angel
University of New Mexico
and
Dave Shreiner
ARM, Inc
What Is OpenGL?
OpenGL is a computer graphics rendering API
With it, you can generate high-quality color images
by rendering with geometric and image primitives
It forms the basis of many interactive applications
that include 3D graphics
By using OpenGL, the graphics part of your
application can be
operating system independent
window system independent
In the Beginning
OpenGL 1.0 was released on July 1st, 1994
Its pipeline was entirely fixed-function
the only operations available were fixed by the
implementation
[Figure: the fixed-function pipeline. Vertex Data passes through Vertex Transform and Lighting, Primitive Setup and Rasterization, Fragment Coloring and Texturing, and Blending; Pixel Data feeds a Texture Store used during fragment texturing]
An Evolutionary Change
OpenGL 3.0 introduced the deprecation model
the method used to remove features from OpenGL
[Table: deprecation-model descriptions, including full and forward-compatible contexts]
[Figure: the OpenGL 3.0 programmable pipeline. Vertex Shader, Primitive Setup and Rasterization, Fragment Shader, Blending; Pixel Data and a Texture Store feed the shader stages]
More Programmability
OpenGL 3.2 (released August 3rd, 2009) added an additional shading stage: geometry shaders
[Figure: the OpenGL 3.2 pipeline. Vertex Data, Vertex Shader, Geometry Shader, Primitive Setup and Rasterization, Fragment Shader, Blending; Pixel Data and a Texture Store feed the shader stages]
[Table: OpenGL profile support. Each version offers a core profile and a compatible profile; features may be fully supported, forward compatible, or not supported]
[Figure: the OpenGL 4.x pipeline. Vertex Shader, Tessellation Control Shader, Tessellation Evaluation Shader, Geometry Shader, Primitive Setup and Rasterization, Fragment Shader, Blending; Pixel Data and a Texture Store feed the shader stages]
WebGL
JavaScript implementation of OpenGL ES 2.0
Runs on most recent browsers
OpenGL Application Development

[Figure: the Application sends Vertices to Vertex Processing (the vertex shader); the Rasterizer emits Fragments to Fragment Processing (the fragment shader), which produces Pixels for the Framebuffer]
A vertex position is a four-component homogeneous coordinate: (x, y, z, w)
OpenGL primitive types:
GL_POINTS, GL_LINES, GL_LINE_STRIP, GL_LINE_LOOP,
GL_TRIANGLES, GL_TRIANGLE_STRIP, GL_TRIANGLE_FAN
A First Program
Rendering a Cube
We'll render a cube with colors at each vertex
Our example demonstrates:
initializing vertex data
organizing data for rendering
simple object modeling
building up 3D objects from geometric primitives
building geometric primitives from vertices
Cube Data
// Vertices of a unit cube centered at origin, sides aligned with axes
point4 vertex_positions[8] = {
    point4( -0.5, -0.5,  0.5, 1.0 ),
    point4( -0.5,  0.5,  0.5, 1.0 ),
    point4(  0.5,  0.5,  0.5, 1.0 ),
    point4(  0.5, -0.5,  0.5, 1.0 ),
    point4( -0.5, -0.5, -0.5, 1.0 ),
    point4( -0.5,  0.5, -0.5, 1.0 ),
    point4(  0.5,  0.5, -0.5, 1.0 ),
    point4(  0.5, -0.5, -0.5, 1.0 )
};
Cube Data
// RGBA colors
color4 vertex_colors[8] = {
    color4( 0.0, 0.0, 0.0, 1.0 ),  // black
    color4( 1.0, 0.0, 0.0, 1.0 ),  // red
    color4( 1.0, 1.0, 0.0, 1.0 ),  // yellow
    color4( 0.0, 1.0, 0.0, 1.0 ),  // green
    color4( 0.0, 0.0, 1.0, 1.0 ),  // blue
    color4( 1.0, 0.0, 1.0, 1.0 ),  // magenta
    color4( 1.0, 1.0, 1.0, 1.0 ),  // white
    color4( 0.0, 1.0, 1.0, 1.0 )   // cyan
};
Generating a Cube Face

// quad() generates two triangles per face and assigns colors to the vertices
int Index = 0;
void quad(int a, int b, int c, int d)
{
    colors[Index] = vertex_colors[a]; points[Index] = vertex_positions[a]; Index++;
    colors[Index] = vertex_colors[b]; points[Index] = vertex_positions[b]; Index++;
    colors[Index] = vertex_colors[c]; points[Index] = vertex_positions[c]; Index++;
    colors[Index] = vertex_colors[a]; points[Index] = vertex_positions[a]; Index++;
    colors[Index] = vertex_colors[c]; points[Index] = vertex_positions[c]; Index++;
    colors[Index] = vertex_colors[d]; points[Index] = vertex_positions[d]; Index++;
}
VAOs in Code
// Create a vertex array object
GLuint vao;
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);
VBOs in Code
// Create and initialize a buffer object
GLuint buffer;
glGenBuffers(1, &buffer);
glBindBuffer(GL_ARRAY_BUFFER, buffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(points) + sizeof(colors),
             NULL, GL_STATIC_DRAW);
glBufferSubData(GL_ARRAY_BUFFER, 0, sizeof(points), points);
glBufferSubData(GL_ARRAY_BUFFER, sizeof(points), sizeof(colors), colors);
Operators
Standard C/C++ arithmetic and logic operators
Operators overloaded for matrix and vector operations
mat4 m;
vec4 a, b, c;

b = a*m;
c = m*a;
Qualifiers
in, out
Copy vertex attributes and other variables to/from
shaders
in  vec2 tex_coord;
out vec4 color;
Flow Control
if
if else
expression ? true-expression : false-expression
while, do while
for
Functions
Built in
Arithmetic: sqrt, pow, abs
Trigonometric: sin, asin
Graphical: length, reflect
User defined
Built-in Variables
gl_Position: output position from vertex
shader
gl_FragColor: output color from fragment
shader
Only for ES, WebGL, and older versions of GLSL; current versions use an out variable instead
Shader Compilation

1. Create Program: glCreateProgram()
2. Create Shader: glCreateShader()
3. Load Shader Source: glShaderSource()
4. Compile Shader: glCompileShader()
5. Attach Shader to Program: glAttachShader()
6. Link Program: glLinkProgram()
7. Use Program: glUseProgram()

The create/load/compile/attach steps need to be repeated for each type of shader in the shader program
A Simpler Way
We've created a routine for this course to make it easier to load your shaders
available at course website

GLuint InitShaders(const char* vFile,
                   const char* fFile);
Transformations
Camera Analogy
3D is just like taking a photograph (lots of
photographs!)
viewing
volume
camera
tripod
model
Transformations
" Transformations take us from one space to
another
" All of our transforms are 44 matrices
[Figure: the transformation pipeline. Vertex Data (Object Coords.) pass through the Modeling Transform to World Coords., the Model-View Transform to Eye Coords., the Projection Transform to Clip Coords., Perspective Division (divide by w) to Normalized Device Coords., and the Viewport Transform to 2D Window Coordinates]
Viewing transformations
define position and orientation of the viewing
volume in the world
Projection transformations
adjust the lens of the camera
Viewport transformations
enlarge or reduce the physical photograph
3D Homogeneous Transformations
A vertex is transformed by 4x4 matrices
all affine operations are matrix multiplications
all matrices are stored column-major in OpenGL
this is opposite of what C programmers expect
v' = M v

      | m0  m4  m8   m12 |
M  =  | m1  m5  m9   m13 |
      | m2  m6  m10  m14 |
      | m3  m7  m11  m15 |
View Specification
Set up a viewing frustum to specify how much
of the world we can see
Done in two steps
specify the size of the frustum (projection transform)
specify its location in space (model-view transform)
Viewing Transformations
Position the camera/eye in the scene
To fly through a scene
change viewing transformation and
redraw scene
Translation
Move object or change
frame origin
                  | 1  0  0  tx |
T(tx, ty, tz)  =  | 0  1  0  ty |
                  | 0  0  1  tz |
                  | 0  0  0  1  |
Scale
Stretch, mirror or decimate a
coordinate direction
                  | sx  0   0   0 |
S(sx, sy, sz)  =  | 0   sy  0   0 |
                  | 0   0   sz  0 |
                  | 0   0   0   1 |
Rotation
Rotate coordinate system about an axis in space
Vertex Lighting
Lighting Principles
Lighting simulates how objects reflect light
material composition of object
light's color and position
global lighting parameters
Diffuse reflections
Specular reflections
Ambient light
Emission
Vertex shades are interpolated across polygons by the
rasterizer
OpenGL Lighting
Modified Phong lighting model
Computed at vertices
Lighting contributors
Surface material properties
Light properties
Lighting model properties
Surface Normals
Normals define how a surface reflects light
Application usually provides normals as a vertex attribute
Current normal is used to compute vertex's color
Use unit normals for proper lighting
Material Properties
Define the surface properties of a primitive
Property      Description
Diffuse       Base color
Specular      Highlight color
Ambient       Low-light color
Emission      Glow color
Shininess     Surface smoothness
Shader Examples
Fragment Shaders
A shader that's executed for each potential pixel
fragments still need to pass several tests before making it to
the framebuffer
Shader Examples
Vertex Shaders
Moving vertices: height fields
Per vertex lighting: height fields
Per vertex lighting: cartoon shading
Fragment Shaders
Per vertex vs. per fragment lighting: cartoon shader
Samplers: reflection map
Bump mapping
Height Fields
A height field is a function y = f(x, z) where the
y value represents a quantity such as the height
above a point in the x-z plane.
Height fields are usually rendered by sampling
the function to form a rectangular mesh of
triangles or rectangles from the samples yij =
f(xi, zj)
Mesh Display
Adding Lighting
Solid Mesh: create two triangles for each
quad
Display with
glDrawArrays(GL_TRIANGLES, 0, NumVertices);
Mesh Shader
uniform float time, shininess;
uniform vec4  vPosition, light_position,
              diffuse_light, specular_light;
uniform mat4  ModelViewMatrix, ModelViewProjectionMatrix,
              NormalMatrix;

void main()
{
    vec4 v = vPosition;
    vec4 t = sin(0.001*time + 5.0*v);
    v.y = 0.1*t.x*t.z;

    gl_Position = ModelViewProjectionMatrix * v;

    vec4 diffuse, specular;
    vec4 eyePosition = ModelViewMatrix * vPosition;
    vec4 eyeLightPos = light_position;
    // (the shader continues with the lighting computation)
}
Shaded Mesh
Texture Mapping
[Figure: an image in (s, t) texture space is mapped onto geometry and appears on the screen]
[Figure: Vertices flow through the Geometry Pipeline to the Rasterizer; Pixels flow through the Pixel Pipeline; both feed the Fragment Shader]
Applying Textures
Three basic steps to applying a texture
1. specify the texture
   read or generate image
   assign to texture
   enable texturing
2. assign texture coordinates to vertices
3. specify texture parameters (wrapping and filtering)
Texture Objects
Have OpenGL store your images
one image per texture object
may be shared by several graphics contexts
Mapping a Texture
Based on parametric texture coordinates
Coordinates need to be specified at each vertex
[Figure: texture space (s, t) with corners (0, 0), (1, 0), (0, 1), (1, 1) mapped onto a triangle in object space; vertices carry texture coordinates such as (0.4, 0.2) and (0.8, 0.4)]
Texture Object
GLuint textures[1];
glGenTextures(1, textures);
glBindTexture(GL_TEXTURE_2D, textures[0]);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TextureSize, TextureSize, 0,
             GL_RGB, GL_UNSIGNED_BYTE, image);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glActiveTexture(GL_TEXTURE0);
Vertex Shader
in vec4 vPosition;
in vec4 vColor;
in vec2 vTexCoord;
out vec4 color;
out vec2 texCoord;
void main() {
    color       = vColor;
    texCoord    = vTexCoord;
    gl_Position = vPosition;
}
Fragment Shader
in vec4 color;
in vec2 texCoord;
out vec4 FragColor;
uniform sampler2D tex;

void main() {
    FragColor = color * texture(tex, texCoord);
}