Bart Adams1 Martin Wicke2 Philip Dutré1 Markus Gross2 Mark Pauly3 Matthias Teschner2
Abstract
We present a novel painting system for 3D objects. In order to overcome parameterization problems of existing
applications, we propose a unified sample-based approach to represent geometry and appearance of the 3D object
as well as the brush surface. The generalization of 2D pixel-based paint models to point samples allows us to
elegantly simulate paint transfer for 3D objects. In contrast to mesh-based painting systems, an efficient dynamic
resampling scheme permits arbitrary levels of painted detail.
Our system provides intuitive user interaction with a six degree-of-freedom (DOF) input device. As opposed to
other 3D painting systems, real brushes are simulated including their dynamics and collision handling.
Categories and Subject Descriptors (according to ACM CCS): I.3.4 [Computer Graphics]: Graphics Utilities, I.3.5
[Computer Graphics]: Computational Geometry and Object Modeling
2. Related Work
Point-Sampled Geometry. Points have proven to be very effective as a display primitive for high-quality rendering of complex objects. Rusinkiewicz and Levoy [RL00] use a bounding sphere hierarchy for progressive rendering of large objects. Alexa et al. [ABCO∗01] use a dynamic resampling strategy to obtain high display quality. Zwicker et al. [ZPvG01] further increase the rendering quality by introducing the Elliptical Weighted Average (EWA) filter in the point-based rendering pipeline.

Point-sampled surfaces are also used for modeling and editing. Pauly et al. [PKKG03] are able to perform large free-form deformations on point-sampled geometry. Both Pauly et al. [PKKG03] and Adams and Dutré [AD03] present algorithms to perform boolean operations on point-sampled objects. Central to both approaches is a dynamic resampling strategy.

In the context of appearance modeling, Pointshop 3D [ZPKG02] extends 2D photo editing functions to 3D point clouds. Zwicker et al. propose a set of tools to paint, filter and sculpt point-sampled objects. Painting is performed by projecting a brush footprint bitmap. Recently, Reuter et al. [RSPS04] developed a Pointshop 3D plugin to interactively texture an object using a point-sampled multiresolution surface representation. Photometric attributes are assigned to the point samples, which result from sampling the 3D object in a preprocess. There is no resampling during interactive texturing, so texture detail is limited by the sampling resolution at the finest level. There is no brush or paint metaphor.

Virtual Painting. There are several 2D painting systems, of which the work of Baxter et al. [BSLM01] is most closely related to this paper. They present a haptic painting system using deformable 3D brushes to paint on a 2D canvas. Thanks to the virtual brushes and a bidirectional paint transfer model, the artist can achieve an expressive power similar to painting on real canvases. They also introduce various more advanced paint models [BLL03, BWL04]. An alternative brush and paint model is presented by Xu et al. [XLTP03]. They simulate clusters of hairs and use a diffusion process to transfer paint from brush to canvas. Several researchers [WI00, XTLP02, YLO02, CT02] propose other virtual brush models in the context of Chinese calligraphy. However, none of these brush models is employed in a 3D painting system.

Figure 1: The user interface. The brush is controlled via a PHANToM Desktop haptic device. The object can be rotated using a SpaceMouse.

Hanrahan and Haeberli [HH90] first suggested a 3D painting system, using a mouse to position the brush. Agrawala et al. [ABL95] color the vertices of a scanned object using a spherical brush volume. There is no remeshing, so the painted detail is limited by the original resolution. More recent painting systems [JTK∗99, GEL00, KSD03] provide a haptic interface, but still use a sphere-shaped brush to apply paint to the object. In all these systems, color and material information is stored in fixed-size textures, limiting the level of detail that the artist can apply.

To overcome these limitations, Berman et al. [BBS94] propose the use of images with different resolutions in different places, called multiresolution images, to represent 2D paintings with regions of varying levels of detail. In 3D, the Chameleon painting system [IC01] overcomes the limitations of fixed-size textures and predefined uv-mappings by automatically building a texture map and a corresponding parameterization during interactive painting. By using different patches for different regions, they allow for adaptively varying the painted level of detail. However, even these elaborate techniques cannot fully solve the parameterization problems inherent in texture mapping. DeBry et al. [DGPR02] as well as Benson and Davis [BD02] avoid the parameterization problems by storing paint properties in adaptive octrees, thus only creating texture detail when necessary. Painting is limited to a 2D plane which is then projected onto the surface. The resulting color attributes are stored in the octree.

3. System Overview

User Interface. We developed a user interface which enables the artist to manipulate the brush, mix paint, move the object and apply paint in an intuitive manner (see Figure 1). In our painting system, the virtual brush is positioned using a six DOF input device, such as the PHANToM
Desktop (Sensable Technologies), which also provides haptic feedback to the user. The object can be translated and rotated using a mouse or a six DOF input device such as the SpaceMouse (3DConnexion). The artist can choose different paint types, such as aquarelle and oil paint. A virtual palette is used to mix paint. We vary the reflectivity of the paint and use environment mapping to enhance realism. The brush casts shadows on the palette and the object, giving visual depth cues. An advanced point renderer [ZPvG01] is used to obtain high quality images of the painted object.

Figure 2: Top row: brush dynamics loop (1 kHz). When a collision occurs, the brush tip is projected onto the object surface and the brush skeleton is deformed accordingly. The resulting force acting on the handle is sent to the haptic device. Bottom row: paint transfer loop (30 Hz). The brush point samples are deformed according to the skeleton deformation. After collecting the relevant surface samples, paint is transferred between the brush and the surface samples.

Object Representation and Dynamic Resampling. We avoid problems such as texture distortion and patch discontinuities, which are often apparent in polygon-based painting systems, by using a point-sampled surface representation and a dynamic sampling strategy. The fundamental idea is to upsample the object locally if necessary or downsample it locally if possible. For example, if the artist paints a thin line, our system locally upsamples the surface. If the artist later overpaints this small stroke with a large brush, the system locally downsamples the affected areas without losing any geometric information.

Our system handles regularly or irregularly sampled objects, given that the samples adequately capture the object geometry. Each surface sample carries geometric attributes such as position x, normal n and radius r, as well as a set of appearance attributes which represent the paint pigments: dry paint attributes A_d, wet paint attributes A_w and wet paint volume per unit area V_w. The point samples are stored in a kd-tree which is used for efficient collision detection and neighbor collection during painting.

Virtual Brushes. We model virtual brushes using a point-sampled surface wrapped around a mass-spring skeleton. The skeleton is used to model the dynamics of the brush; the surface samples store paint information. This flexible brush model enables us to define different brush types of various sizes and resolutions. Collision detection between the brush and complex 3D objects is possible at high rates. Although more accurate, simulating individual brush hairs or clusters is too expensive to compute for haptic feedback. Our brush model gives us all the flexibility we need for a plausible painting simulation.

When zooming in on the object surface, the brush is scaled down. As a result, the brush sampling density increases relative to the object sampling density. This enables the artist to apply fine detail to the object. Since both the object and the brush are represented with point samples, an elegant implementation of bidirectional paint transfer can be realized.

Paint Model. Based on [BSLM01], our system handles different paint types such as aquarelle, oil paint, metallic or otherwise reflective paint, as well as other surface types such as mosaic or beaten gold. In order to model this broad variety of paint types, the paint model stores color as well as other attributes, such as diffusion coefficients, reflectivity, shininess, and procedural small-scale geometry manipulations. The latter are used to simulate the geometric detail often found in oil or acrylic paint, or the planar patches typically found in mosaics. We chose the paint model from [BSLM01] as a basis for ours as it allows for bidirectional paint transfer while being computationally cheap. Any other paint transfer model that is defined on pixels can be used as well.

Decoupled Haptics. To guarantee the required 1 kHz update frequency of the haptic device, we decouple the force computation from the rest of the application. Only operations that are necessary to simulate the dynamic behavior of the brush, such as collision detection and skeleton deformation, are performed in the brush dynamics loop (Figure 2, A). All other (more costly) operations, such as brush surface deformation, paint transfer and dynamic resampling, are performed in the paint transfer loop (Figure 2, B), which runs at the 30 Hz display frequency.

4. Brush Model and Haptic Display

To model a virtual brush, we have to devise a geometric representation for the brush surface as well as a physics-based model to simulate the dynamic behavior. We use point samples storing paint information to represent the surface of the brush. These samples are defined relative to a mass-spring skeleton which is used to simulate the dynamic behavior. The force resulting from the dynamics is directly used for haptic feedback.
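To make the skeleton dynamics concrete, the following sketch integrates a damped mass-spring chain with explicit Euler steps and reads off the reaction force at the handle, which is what would be sent to the haptic device. This is an illustrative toy (a 1D chain with unit masses and made-up stiffness and damping constants), not our actual implementation.

```python
# Minimal damped mass-spring chain as a brush-skeleton sketch.
# Mass 0 is the handle, kinematically driven by the input device;
# the remaining masses follow via spring and damping forces.

def step_skeleton(positions, velocities, rest_len, k=50.0, c=5.0, dt=1e-3):
    """Advance the free masses one explicit Euler step; return handle force."""
    n = len(positions)
    forces = [0.0] * n
    for i in range(n - 1):                 # springs between neighbors (1D chain)
        stretch = (positions[i + 1] - positions[i]) - rest_len
        f = k * stretch                    # Hooke's law along the chain
        forces[i] += f
        forces[i + 1] -= f
    for i in range(1, n):                  # mass 0 is driven, not integrated
        forces[i] -= c * velocities[i]     # viscous damping
        velocities[i] += forces[i] * dt    # unit masses
        positions[i] += velocities[i] * dt
    return -forces[0]                      # reaction force felt at the handle

# Push the handle into the chain and read the resulting haptic force.
pos = [0.0, 1.0, 2.0]
vel = [0.0, 0.0, 0.0]
for _ in range(1000):                      # ~1 second of 1 kHz updates
    pos[0] += 0.0005                       # handle advances toward the tip
    force = step_skeleton(pos, vel, rest_len=1.0)
```

As the handle compresses the first spring, the free masses are pushed away and a positive reaction force builds up at the handle, mimicking the resistance the user should feel.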
Figure 3: The brush is represented as a point-sampled surface wrapped around a mass-spring skeleton. Left: round brush consisting of one basic skeleton. Right: flat brush modeled using two tips.

Figure 4: Left: undeformed brush. Middle and right: when the brush is deformed, the new positions x′ of the brush samples are computed from the original positions and the rotation of the springs.
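The decoupled update scheme of Section 3 can be sketched deterministically by multiplexing a fast dynamics tick with a slower transfer tick. In practice the two loops run concurrently; the fixed-ratio scheduler below is an illustrative stand-in, not our implementation, showing only the 1 kHz : 30 Hz budget.

```python
# Sketch of the decoupled update scheme: cheap brush dynamics run at
# every tick (~1 kHz), while the costly paint transfer step runs only
# at roughly every 33rd tick (~30 Hz display frequency).

DYNAMICS_HZ = 1000
TRANSFER_HZ = 30

def run(ticks):
    dynamics_steps = 0
    transfer_steps = 0
    accumulator = 0.0
    for _ in range(ticks):
        dynamics_steps += 1                   # collision + skeleton update
        accumulator += TRANSFER_HZ / DYNAMICS_HZ
        if accumulator >= 1.0:                # time for a display-rate step
            accumulator -= 1.0
            transfer_steps += 1               # deformation, paint, resampling
    return dynamics_steps, transfer_steps

d, t = run(1000)   # one simulated second: 1000 dynamics steps, ~30 transfers
```

The same budget could be realized with two threads sharing the skeleton state; the accumulator variant just makes the rate ratio explicit and testable.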
When a collision between the brush and the object surface is detected, paint is transferred from the brush to the surface and vice versa. Inspired by the orthogonal projection mapping presented in [ZPKG02], we construct a local planar approximation of the object surface, the paint buffer. We splat both the object samples and the brush samples into the paint buffer, which serves as a common projection plane. Reprojecting the paint buffer results in new object samples storing the painted detail. The different steps performed during a paint event are explained below (see Figure 6).

A. Collecting Surface Samples. In a first step, we collect all object samples that might be affected by the brush. As the points are stored in a kd-tree, this can be efficiently implemented by performing a range query corresponding to the

D. Brush Sample Projection. Similar to the object samples, the back-facing brush samples are projected into the paint buffer (see Figure 6, D). Only fragments with a depth greater than the depth already stored in the paint buffer are written. These fragments represent parts of the brush that penetrate the surface of the object. The following brush sample attributes are written to the paint buffer:

• penetration depth d_p,
• paint attributes A_b,
• paint volume per unit area V_b.

Here the penetration depth d_p denotes the signed distance between the surface of the brush and the surface of the object at the relevant pixel.

Figure 6: Different steps performed during a paint event. A. Collecting the surface samples. B. Constructing the paint buffer orthogonal to the average surface normal. C. EWA splatting of the collected surface samples. D. EWA splatting of the back-facing brush samples. E. Reprojection of pixels in the paint buffer to surface samples.
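Step A amounts to a ball (range) query against the kd-tree of surface samples. The following self-contained sketch of such a query is illustrative code, not our implementation:

```python
import math

# Minimal 3D kd-tree with a ball query, sketching how the surface
# samples affected by the brush could be collected (step A).

def build(points, depth=0):
    """Recursively build a kd-tree node: (point, left, right, axis)."""
    if not points:
        return None
    axis = depth % 3
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build(points[:mid], depth + 1),
            build(points[mid + 1:], depth + 1),
            axis)

def ball_query(node, center, radius, out):
    """Collect all points within `radius` of `center`."""
    if node is None:
        return
    point, left, right, axis = node
    if math.dist(point, center) <= radius:
        out.append(point)
    delta = center[axis] - point[axis]
    # Descend into the side containing the query center first; visit the
    # far side only if the query ball crosses the splitting plane.
    near, far = (right, left) if delta > 0 else (left, right)
    ball_query(near, center, radius, out)
    if abs(delta) <= radius:
        ball_query(far, center, radius, out)

# A flat 10x10 patch of surface samples and a small brush footprint.
samples = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
tree = build(samples)
hits = []
ball_query(tree, (0.45, 0.45, 0.0), 0.15, hits)
```

In our system the query radius would correspond to a bounding sphere of the brush; here it simply picks out the four nearest grid samples.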
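The depth test of step D can be sketched with a z-buffer-like paint buffer. The attribute names d_p, A_b and V_b follow the text; the buffer layout and fragment format are illustrative, not our implementation:

```python
# Sketch of step D: write back-facing brush fragments into the paint
# buffer only where they lie deeper than the already-splatted object
# surface; the depth difference is the penetration depth d_p.

def project_brush_samples(paint_buffer, brush_fragments):
    """paint_buffer: dict pixel -> {'depth': float, ...} from step C.
    brush_fragments: iterable of (pixel, depth, paint_attrs, volume)."""
    for pixel, depth, attrs, volume in brush_fragments:
        cell = paint_buffer.get(pixel)
        if cell is None or depth <= cell['depth']:
            continue                        # brush does not penetrate here
        cell['d_p'] = depth - cell['depth'] # penetration depth d_p
        cell['A_b'] = attrs                 # brush paint attributes
        cell['V_b'] = volume                # paint volume per unit area
    return paint_buffer

# Object surface splatted at depth 1.0 on two pixels (result of step C):
buf = {(0, 0): {'depth': 1.0}, (1, 0): {'depth': 1.0}}
frags = [((0, 0), 1.2, 'red', 0.5),   # penetrates the surface by 0.2
         ((1, 0), 0.8, 'red', 0.5)]   # in front of the surface: skipped
project_brush_samples(buf, frags)
```

Only the first fragment passes the test, so only that pixel receives brush paint and a positive penetration depth.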
Figure 8: The sampling density is locally adapted to accurately represent the texture detail. A (left): sampling density of the Stanford Bunny before painting. B (middle): upsampling to represent the painted detail. C (right): downsampling of child samples.

Figure 9: Analyzing the paint buffer. The footprint of the brush is shaded blue. Left: samples projected onto a pixel that is touched by the brush are marked as dead. Right: pixels affected by the brush (blue pixels) will result in new samples. Additionally, in order to avoid holes, we reproject pixels touched by a dead sample and not touched by any non-dead sample (red pixels).

threshold, we remove the child samples, resurrect the parent if necessary and set the parent's attributes to the average of all its children's attributes. Reasonable values of the deviation threshold are between 0.95 and 0.99, depending on the amount of smoothing the downsampling operator is allowed to perform. To ensure that no geometric detail is lost, we remove only child samples. Thus, we maintain an adequately sampled surface at all times. An example of downsampling is illustrated in Figure 8, C.

7. Implementation

Rendering. High quality renderings of the painted objects are generated with a software implementation of the EWA splatting algorithm [ZPvG01]. Each paint event only affects a local part of the surface. Thus, we can achieve high frame rates by only locally updating the rendered image, unsplatting samples that have been killed and splatting newly added or resurrected samples.

When rotating or translating the object, the system switches to a hardware implementation of the EWA splatting algorithm similar to [BK03] for performance reasons. We use the software renderer during painting because the hardware implementation suffers from quantization artifacts occurring when locally updating the rendered image.

The brush casts a shadow on the object and the palette. Shadow mapping can be integrated into the hardware renderer without requiring an additional rendering pass. To render shadows using the software renderer, we save the rendered image to a texture and add the shadow using an additional hardware rendering pass performing the shadow test in a fragment program. Environment mapping enhances realism for reflective paint types.

Paint Effects. In order to give the artists a variety of paint types, we modeled various paint effects. We do not limit the paint attributes to color information. Reflectivity makes chrome or gold paint possible, and shininess can be used to model matte paint or glossy polished surfaces.

The paint transfer model is allowed to modify the small-scale geometry of painted surfaces. A mosaic-like effect is achieved by setting the normal of newly created child samples to their parent's normal instead of blending it. When using gold paint combined with the mosaic effect, we obtain the appearance of beaten gold.

When painting with highly viscous paint, such as oil or acrylic, the brush hairs leave an imprint in the paint. Although we are not able to model the complete geometric effect of adding layers of paint, we can model the surface structure. If the brush skeleton is aligned with the brush velocity, we slightly manipulate the surface normals of the new samples so as to create the illusion of a hair imprint.

Diffusion is the most important feature of aquarelle. Our paint model stores diffusion coefficients and supports isotropic surface diffusion. Each surface sample x_i exchanges wet paint volume ∆V_w with other surface samples x_j:

∆V_w = (V_w^i − V_w^j) · e^(−d² / (v̄² · T)),    (8)

where d = ||x_i − x_j|| is an approximation of the geodesic distance between the two sample points, v̄² denotes the average squared particle speed and T is the elapsed time period. The wet paint attributes A_w are adjusted using the paint transfer rules. Because of the exponential decay of ∆V_w, we can restrict the number of samples x_j by only considering neighbor samples within a threshold distance to x_i.

8. Results

Inspired by the Art on Cows project, we set up our own virtual Art on Bunnies project and asked a number of artists to paint the Stanford Bunny using our painting system. The artists all used the same irregularly sampled bunny model consisting of 97k point samples. We provided them with a set of 12 round and flat brushes. The painting system runs on a 3 GHz PC with a GeForce FX 5900 graphics board.
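The exchange rule of Equation (8), including the neighbor cutoff justified by its exponential decay, can be sketched as follows. The names mirror the text (V_w, d, v̄², T); the brute-force neighbor loop and the sample layout are illustrative, not our implementation:

```python
import math

# Sketch of isotropic surface diffusion (Equation 8): each sample x_i
# exchanges wet paint volume with nearby samples x_j, with a Gaussian
# falloff in the distance d (approximating the geodesic distance).

def diffuse(samples, v_bar_sq, T, cutoff):
    """samples: list of dicts with 'x' (position tuple) and 'V_w'."""
    deltas = [0.0] * len(samples)
    for i, si in enumerate(samples):
        for j, sj in enumerate(samples):
            if i == j:
                continue
            d = math.dist(si['x'], sj['x'])
            if d > cutoff:                 # exponential decay justifies cutoff
                continue
            dV = (si['V_w'] - sj['V_w']) * math.exp(-d * d / (v_bar_sq * T))
            deltas[i] -= dV                # paint flows from high to low V_w
    for s, dV in zip(samples, deltas):
        s['V_w'] += dV
    return samples

# Two nearby samples: all wet paint initially sits on the first one.
samples = [{'x': (0.0, 0.0, 0.0), 'V_w': 1.0},
           {'x': (0.1, 0.0, 0.0), 'V_w': 0.0}]
diffuse(samples, v_bar_sq=0.04, T=1.0, cutoff=0.5)
```

Because every pairwise exchange is applied antisymmetrically, the total wet paint volume is conserved; in a full system the inner loop would use the kd-tree neighbor query instead of iterating over all samples.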
A selection of the resulting bunnies is displayed in Figures 10, 11, 12 and 13 and in the accompanying video. Figure 10 shows the sampling density of the painted Day-and-Night Bunny. The sampling density is increased locally to preserve sharp painted edges. The dynamic resampling strategy also allows for fine painted detail such as the flowers and the bee in Figure 12. The entire bee covers an area about the size of a single point sample of the original model. Reflective paint was used for the Caesar Bunny (Figure 11). Diffusion effects can be seen on the Savannah Bunny (Figure 11). Note the imprints left by the virtual brush hairs in the painted water on the Beach Bunny (see Figure 13 and the video). Depending on the amount of detail, the resulting bunnies consist of 300k to 800k point samples.

One of the artists painted the Stanford Dragon (Figure 14). Environment mapping is used for the reflecting dragon ball. The eyes of the dragon are laid in mosaic.

Figure 10: Left: Day-and-Night Bunny. Middle: sampling density on the back of the painted bunny. The sampling density is higher where necessary to represent fine detail. Note the higher sampling rates around the sun's boundary. Right: original sampling density in the same region.

9. Conclusion and Future Work

We presented a novel painting system for 3D objects. Our system provides virtual brushes, various paint types, and an intuitive user interface. In order to overcome parameterization problems of existing painting applications, we employ a unified sample-based approach to represent geometry and appearance of the 3D object surface as well as the brush surface. Our paint transfer model locally approximates the object surface with one or more planes, also handling highly curved surfaces without distortions. Dynamic resampling of the point-sampled object surface allows the artist to apply arbitrarily fine painted detail.

In our current implementation, collision handling of the brush is performed with respect to the original object geometry. However, user feedback suggests that the actual thickness of applied paint should be considered in order to be felt by the user. Therefore, we intend to integrate a height field to represent paint thickness. This height field would also support the incorporation of more advanced paint transfer models [BLL03]. User feedback has also shown that more intuitive depth cues should be provided. Although our system gives depth information such as the brush shadow, it might be useful to add stereo vision.

Acknowledgments. We thank Michael Waschbüsch for his work on the renderer and the artists Silke Lang and Christian Ratti for their enthusiasm and feedback. The first author is funded as a Research Assistant by the Fund for Scientific Research - Flanders, Belgium (Aspirant F.W.O.-Vlaanderen).

References

[ABCO∗01] ALEXA M., BEHR J., COHEN-OR D., FLEISHMAN S., LEVIN D., SILVA C. T.: Point set surfaces. In Proceedings of IEEE Visualization 2001 (2001), pp. 21–28.

[ABL95] AGRAWALA M., BEERS A. C., LEVOY M.: 3D painting on scanned surfaces. In 1995 Symposium on Interactive 3D Graphics (Apr. 1995), pp. 145–150.

[AD03] ADAMS B., DUTRÉ P.: Interactive boolean operations on surfel-bounded solids. ACM Transactions on Graphics 22, 3 (July 2003), 651–656.

[BBS94] BERMAN D. F., BARTELL J. T., SALESIN D. H.: Multiresolution painting and compositing. In Proceedings of SIGGRAPH 94 (July 1994), Computer Graphics Proceedings, Annual Conference Series, pp. 85–90.

[BD02] BENSON D., DAVIS J.: Octree textures. In Proceedings of ACM SIGGRAPH 2002 (2002), pp. 785–790.

[BK03] BOTSCH M., KOBBELT L.: High-quality point-based rendering on modern GPUs. In Proceedings of Pacific Graphics 2003 (2003), pp. 335–343.

[BLL03] BAXTER W., LIU Y., LIN M. C.: A Viscous Paint Model for Interactive Applications. Tech. rep., University of North Carolina at Chapel Hill, 2003.

[BSLM01] BAXTER W., SCHEIB V., LIN M. C., MANOCHA D.: DAB: Interactive haptic painting with 3D virtual brushes. In Proceedings of ACM SIGGRAPH 2001 (2001), pp. 461–468.

[BWL04] BAXTER W., WENDT J., LIN M. C.: IMPaSTo, a realistic, interactive model for paint. In Proceedings of the Third International Symposium on Non-Photorealistic Animation and Rendering (NPAR) for Art and Entertainment (2004). To appear.

[CT02] CHU N. S.-H., TAI C.-L.: An efficient brush model for physically-based 3D painting. In Proceedings of Pacific Graphics 2002 (2002), pp. 413–422.

[DGPR02] DEBRY D., GIBBS J., PETTY D. D., ROBINS N.: Painting and rendering textures on unparameterized models. In Proceedings of ACM SIGGRAPH 2002 (2002), pp. 763–768.

[Gar99] GÄRTNER B.: Fast and robust smallest enclosing balls. In Proceedings of the European Symposium on Algorithms 1999 (1999), pp. 325–338.

[GEL00] GREGORY A. D., EHMANN S. A., LIN M. C.: inTouch: Interactive multiresolution modeling and 3D painting with a haptic interface. In Proceedings of the IEEE Conference on Virtual Reality 2000 (2000), pp. 45–52.

[HH90] HANRAHAN P., HAEBERLI P.: Direct WYSIWYG painting and texturing on 3D shapes. In Proceedings of ACM SIGGRAPH 1990 (1990), pp. 215–223.

[Hoc70] HOCKNEY R.: The potential calculation and some applications. In Methods in Computational Physics (1970), Alder B., Fernbach S., Rotenberg M. (Eds.), vol. 9, Academic Press, New York, pp. 136–211.

[IC01] IGARASHI T., COSGROVE D.: Adaptive unwrapping for interactive texture painting. In 2001 ACM Symposium on Interactive 3D Graphics (Mar. 2001), pp. 209–216.

[JTK∗99] JOHNSON D., THOMPSON II T. V., KAPLAN M., NELSON D. D., COHEN E.: Painting textures with a haptic interface. In Proceedings of the IEEE Conference on Virtual Reality 1999 (1999), pp. 282–285.

[KSD03] KIM L., SUKHATME G. S., DESBRUN M.: Haptic editing of decoration and material properties. In Proceedings of the Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems 2003 (2003), pp. 213–220.

[LCF00] LEWIS J. P., CORDNER M., FONG N.: Pose space deformation: A unified approach to shape interpolation and skeleton-driven deformation. In Proceedings of ACM SIGGRAPH 2000 (2000), pp. 165–172.

[PKKG03] PAULY M., KEISER R., KOBBELT L. P., GROSS M.: Shape modeling with point-sampled geometry. In Proceedings of ACM SIGGRAPH 2003 (2003), pp. 641–650.

[RKK97] RUSPINI D. C., KOLAROV K., KHATIB O.: The haptic display of complex graphical environments. In Proceedings of ACM SIGGRAPH 1997 (1997), pp. 345–352.

[RL00] RUSINKIEWICZ S., LEVOY M.: QSplat: A multiresolution point rendering system for large meshes. In Proceedings of ACM SIGGRAPH 2000 (2000), pp. 343–352.

[RSPS04] REUTER P., SCHMITT B., PASKO A., SCHLICK C.: Interactive solid texturing using point-based multiresolution representations. Journal of WSCG 12 (2004), 363–370.

[WI00] WONG H. T., IP H. H.: Virtual brush: A model-based synthesis of Chinese calligraphy. Computers & Graphics 24, 1 (2000), 99–113.

[XLTP03] XU S., LAU F. C., TANG F., PAN Y.: Advanced design for a realistic virtual brush. In Proceedings of Eurographics 2003 (2003), pp. 533–542.

[XTLP02] XU S., TANG M., LAU F., PAN Y.: A solid model based virtual hairy brush. Computer Graphics Forum 21, 3 (2002), 299–308.

[YLO02] YEH J., LIEN T., OUHYOUNG M.: On the effects of haptic display in brush and ink simulation for Chinese painting and calligraphy. In Proceedings of IEEE Pacific Graphics 2002 (2002), pp. 439–441.

[ZPKG02] ZWICKER M., PAULY M., KNOLL O., GROSS M.: Pointshop 3D: An interactive system for point-based surface editing. In Proceedings of ACM SIGGRAPH 2002 (2002), pp. 322–329.

[ZPvG01] ZWICKER M., PFISTER H., VAN BAAR J., GROSS M.: Surface splatting. In Proceedings of ACM SIGGRAPH 2001 (2001), pp. 371–378.
Figure 11: Several bunnies painted using our painting system. From left to right and top to bottom: Cloud Bunny, Nemo Bunny, Caesar Bunny, Mondriaan Bunny, Savannah Bunny and Flower-Power Bunny.

Figure 12: Close-ups of the Day-and-Night Bunny. Note the very fine detail. Right: sampling density around the bee.

Figure 13: The Beach Bunny. Right: geometric detail on the water.

Figure 14: Left: the Fire Dragon. Top right: close-up of the reflecting dragon ball. Bottom right: close-up of one of the eyes painted with the mosaic effect.