


Environment-Aware 3D Concrete Printing through Robot-Vision

Roberto Naboni (1), Luca Breseghello (2), Sandro Sanin (3)
(1,2,3) CREATE - University of Southern Denmark, Section for Civil and Architectural Engineering, Odense, Denmark
(1,2,3) {ron|lucab|sasan}@iti.sdu.dk

In the 2020s, large scale 3D concrete printing (3DCP) is one of the most important areas
of development for research and industry in construction automation. However, the
available technology fails to adapt to the complexity of a real construction site and
building process, oversimplifying design, production, and products to fit the current state
of technology. We hypothesise that by equipping printing machinery with sensing devices
and adaptive design algorithms we can radically expand the range of applications and
effectiveness of 3DCP. In this paper we prove this concept through a full-scale design-to-
fabrication experiment, SENS-ENV, consisting of three main phases: (i) we equip and
calibrate an existing robotic setup for 3DCP with a camera which collects geometric
data; (ii) building upon the collected information, we use environment-aware generative
design algorithms to conceive a toolpath design tailored for the specific environment with
a quasi-real-time workflow; (iii) we successfully prove this approach with a number of
fabrication test-elements printed on unknown environment configurations and by
monitoring the fabrication process to apply printing corrections. The paper describes the
implementation and the successful experiments in terms of technology setup, process
development, and documentation of the outcomes. SENS-ENV opens a new agenda for
context-aware autonomous additive construction robots.

Keywords: 3D Concrete Printing, Robot Vision, Environment Mapping, Adaptive Design.

INTRODUCTION
Conventional construction technologies are limited by a low level of automation and a lack of interaction with the surrounding environment. Despite extensive research in the field, existing processes are still ineffective for the complexity of construction tasks and the unstructured and dynamic nature of construction sites. The advent of digital design and manufacturing techniques, the increasing adoption of sensing technologies, and the development of data-driven predictive algorithms now provide the means for devising autonomous construction operations that are receptive and reactive to the environment.

3D Concrete Printing (3DCP), in particular, is rigidly conceived as a linear course of design, toolpath generation, and robotic fabrication. However, the inherent dynamic behaviour of concrete and its extrusion, as well as the large number of variables that influence the outcome, create a level of unpredictability that cannot be fully anticipated in the planning phase, unless the process is oversimplified to fit within the limits of currently available technology (Buswell et al. 2020).


The development of feedback-loop logics that integrate means of detecting changes in the physical environment and of assessing the precision of the printed element itself during the printing process would provide large benefits to the quality, precision and flexibility of 3DCP (Naboni 2022).

The presented research aims at developing a design and fabrication method for 3DCP able to sense and react to unknown environments (uneven and unstable grounds) and to unintended fabrication artefacts (defects or imprecisions due to structural instabilities) by employing camera vision and feedback loops for the generation, correction and optimization of the design and printing.

We hypothesise that by equipping printing machinery with sensing devices and adaptive design algorithms we can radically expand their range of applications and effectiveness. This paper uses camera vision to dynamically survey and assess unknown and unstructured grounds, to autonomously generate tailored designs, and to automatically output a robotic toolpath that is then dynamically updated during the fabrication process in response to the detection of printing defects or changes to the environment (Figure 1).

Figure 1: Close-up view of the environment-aware robotic 3DCP process.

State of the art in vision sensing for additive manufacturing
The existing literature presents attempts at overcoming the need for an even printing surface and at extending the range of working domains for 3DCP, utilising various supportive solutions: reinforcement meshes (Ayres et al. 2018), expanded polystyrene (Borg Costanzi et al. 2018), sand-based high-precision 3D printed forms (Anton et al. 2020), and robotically shaping a terrain of gravel before printing (Battaglia et al. 2020).

Few applications of vision and sensing in extrusion-based 3D Concrete Printing can be found in the literature to date. Wolfs et al. (2017) proposed and tested the integration of a 1D laser measurement system for the nozzle height, demonstrating the possibility of integrating live feedback loops into 3DCP in a simple technical experimental setup. Yuan et al. (2022) propose a workflow for real-time toolpath planning and extrusion control, where real-time flow-rate data are collected and the motion of the robot is adjusted accordingly to maintain a constant extrusion; however, a vision system, i.e. 3D scanning, is only employed at the end of each printing session for accuracy assessment. In the broader context of additive manufacturing (AM), Sutjipto et al. (2019) developed a method for building live-updating occupancy maps using a vision-based sensing system, i.e. an RGB camera, providing a representation of the spatial configuration of the print and its environment. This was tested at lab scale on robotic Fused Deposition Modelling (FDM) and is limited to 2D pattern recognition.

METHODOLOGY
To prove the concept of environment-aware 3DCP we set up a full-scale design-to-fabrication experiment, namely SENS-ENV, consisting of three main phases: (i) we equip an existing robotic setup for 3DCP with a camera that collects geometric data; (ii) building upon the collected information, we use environment-aware generative design algorithms to conceive a toolpath design tailored for the specific environment within a quasi-real-time workflow; (iii) we successfully prove this approach with a number of fabrication test-elements based on unknown environment configurations and by monitoring the fabrication process to apply printing corrections.


Setup for Camera-Enhanced Printing
In this research, Intel RealSense D400 Series depth cameras are used to obtain coloured 3D point clouds and RGB images. The 3D data is aligned with the image pixels through the Intel RealSense SDK. This data is used to screen unknown printing environments, as well as to monitor the printed object during the fabrication process and to verify deviations afterwards.

For this experiment series, an Intel RealSense D455 camera was integrated into our printing setup. Its working range is between 0.6 m and 6.0 m. The camera is mounted on the flange of the sixth axis of an industrial robot, about 0.3 m from the extrusion Tool Center Point (TCP). The translation from the TCP in the XY-plane was calculated by checking the relative position of the camera Field of View (FOV) in relation to the nozzle (Figure 2).

Figure 2: Custom-developed end-effector with a 3DCP extruder and Intel RealSense D455 camera.

To define the TCP of the camera itself, for the alignment of the camera frames and point clouds, a hand-eye calibration algorithm was used, i.e. the OpenCV implementation of the Daniilidis dual-quaternion solver (Daniilidis 1998). The transformation matrix from the extruder TCP to the camera TCP is calculated by correlating robot poses with pictures of the camera facing an AprilTag marker board. The camera data is streamed to the Rhino/Grasshopper environment through a set of custom components, which implement the Intel RealSense SDK in C++. The library is then called from C# in the Grasshopper environment using the Platform Invoke feature of the .NET Framework.
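As an illustration of this calibration step, the sketch below uses OpenCV's hand-eye calibration with the Daniilidis dual-quaternion method in Python; the pipeline described above is implemented in C++/C# via Platform Invoke, so function and variable names here are illustrative, and the robot and marker-board poses are assumed to have been collected beforehand.

```python
# Illustrative hand-eye calibration sketch (Python; the paper's pipeline is C++/C#).
# R_flange2base, t_flange2base: extruder TCP poses reported by the robot controller.
# R_board2cam, t_board2cam: AprilTag board poses detected in the camera images.
import cv2
import numpy as np

def camera_to_tcp_transform(R_flange2base, t_flange2base, R_board2cam, t_board2cam):
    """Return the 4x4 transform mapping camera-frame points into the extruder TCP frame."""
    R, t = cv2.calibrateHandEye(
        R_flange2base, t_flange2base, R_board2cam, t_board2cam,
        method=cv2.CALIB_HAND_EYE_DANIILIDIS)  # dual-quaternion solver (Daniilidis 1998)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t.ravel()
    return T
```

Once this transform is known, every depth frame can be chained with the robot pose at capture time to express the scan in the robot base frame.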
Experiment Design and Setup
An experiment was designed to demonstrate the efficient use of robot vision in robotic 3DCP to inform the generation of design solutions adaptive to unknown printing environments, as well as to induce design reactions that adjust emerging deviations from the planned path during the printing process.

Conducted in a lab environment, the physical setup (Figure 3) for the experiment consists of: (i) an ABB IRB6650S industrial robot, with a maximum reach of 3.3 metres and able to move below its base plane; (ii) a Vergumat mixing pump, which provides a continuous flow of material and is able to handle cementitious pastes with aggregates up to 4 mm; (iii) a mortar material based on a commercial shotcrete mortar adapted to 3DCP use, with 25-50% Portland cement content and aggregates of 2 mm particle size, to which a small dosage of 36-48 mm polypropylene 700 μm monofilament microfibers tested in previous research is added (Gislason et al. 2022); (iv) an uneven printing bed built using 5-20 mm gravel to create reconfigurable uneven printing sites to which a generative design logic reacts. The printing areas are built within a space of 2.4 x 1.2 metres, and the height varies randomly from 0.1 to 0.5 metres.

Figure 3: 3DCP fabrication setup for the design experiment.

Autonomous Design Logic and Workflow
To respond to environmental and fabrication uncertainties, a digital design workflow was developed which entails: (1) a field-based printing environment survey; (2) environment-aware design; (3) adaptive toolpath design and fabrication; (4) autonomous printing correction (Figure 4).



Figure 4: Design and fabrication workflow diagram.

Field-based printing environment survey
Using the calibrated depth camera, a series of scans
of the printing environment are taken. These are used to precisely reference the printing area location, to account for obstacles, and to map the uneven printing bed.
To ensure a sufficiently accurate digital representation of the experiment setup with enough overlap between frames, 8 scans of the 2.4 x 1.2 m area were taken at a resolution of 1200 x 720 pixels/points and aligned using the transformation of the camera relative to the robot TCP. All the scanning operations were performed at an average height of 60 cm from the ground, which leads to an average noise of ± 1.2 mm in the axis perpendicular to the camera.
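As an indication of how a single scan can be acquired and expressed in the robot base frame, the following Python sketch uses the pyrealsense2 and numpy libraries; the actual implementation is the C++/C# component set described above, and the stream settings and names used here are illustrative.

```python
# Illustrative single-scan acquisition (Python sketch; the paper's pipeline is C++/C#).
import numpy as np
import pyrealsense2 as rs

def capture_scan(T_base_tcp, T_tcp_cam):
    """Grab one depth frame and return its points in the robot base frame.

    T_base_tcp : 4x4 pose of the extruder TCP reported by the robot controller.
    T_tcp_cam  : 4x4 camera-to-TCP transform from the hand-eye calibration.
    """
    pipeline, config = rs.pipeline(), rs.config()
    config.enable_stream(rs.stream.depth, 1280, 720, rs.format.z16, 30)  # D455 depth mode
    pipeline.start(config)
    try:
        depth = pipeline.wait_for_frames().get_depth_frame()
        verts = rs.pointcloud().calculate(depth).get_vertices()
        pts_cam = np.asanyarray(verts).view(np.float32).reshape(-1, 3).copy()
    finally:
        pipeline.stop()
    pts_cam = pts_cam[np.any(pts_cam != 0.0, axis=1)]  # drop pixels with no depth
    T_base_cam = T_base_tcp @ T_tcp_cam                # chain the calibrated transforms
    pts_h = np.c_[pts_cam, np.ones(len(pts_cam))]      # homogeneous coordinates
    return (T_base_cam @ pts_h.T).T[:, :3]             # points in the robot base frame
```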
The resulting point cloud, constituted of over 7 million points, was subsampled to 20% of its original size while ensuring a constant density of points over the whole area. To reduce complexity within the workflow, a height map was generated. This translation was done with the help of an underlying scalar field with a cell size of 10 mm, which was fed with the data of the point cloud. The value stored in each cell is the average of the Z coordinates of all the points that fall inside the cell when projected onto the same plane (Figure 5).

Figure 5: Height-based scalar field used to translate the scanned point cloud into a mesh.
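The cell-averaging step can be sketched in a few lines; the following is a minimal numpy version assuming points expressed in metres over the 2.4 x 1.2 m printing area (in the paper the field is computed inside Grasshopper in C#).

```python
# Height-field sketch: average point Z values over a 10 mm grid (illustrative).
import numpy as np

def height_field(points, cell=0.01, bounds=((0.0, 0.0), (2.4, 1.2))):
    """Average the Z of all points falling in each XY cell; NaN where a cell is empty."""
    (x0, y0), (x1, y1) = bounds
    nx, ny = int(round((x1 - x0) / cell)), int(round((y1 - y0) / cell))
    ix = np.clip(((points[:, 0] - x0) / cell).astype(int), 0, nx - 1)
    iy = np.clip(((points[:, 1] - y0) / cell).astype(int), 0, ny - 1)
    z_sum = np.zeros((nx, ny))
    count = np.zeros((nx, ny))
    np.add.at(z_sum, (ix, iy), points[:, 2])   # accumulate Z per cell
    np.add.at(count, (ix, iy), 1.0)            # count points per cell
    with np.errstate(invalid="ignore"):
        return np.where(count > 0, z_sum / count, np.nan)
```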
From the height map, a 3D mesh representative of the values was produced (Figure 6). The scalar field was used as an input for the design process and as a strategy to monitor the printing process, using relatively simple mathematical operations for data comparison and blending between different fields (Bhooshan et al. 2019).

Figure 6: Environment analysis workflow: from (A) point cloud, to (B) height-based field, to (C) mesh.
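A minimal sketch of the field-to-mesh conversion is given below, assuming empty cells have already been filled by interpolation; in the paper the mesh is built directly in Rhino/Grasshopper.

```python
# Illustrative conversion of the height field into a triangle mesh: one vertex per
# grid cell and two triangles per quad (assumes no NaN cells remain in the field).
import numpy as np

def field_to_mesh(field, cell=0.01):
    nx, ny = field.shape
    gx, gy = np.meshgrid(np.arange(nx) * cell, np.arange(ny) * cell, indexing="ij")
    vertices = np.column_stack([gx.ravel(), gy.ravel(), field.ravel()])
    faces = []
    for i in range(nx - 1):
        for j in range(ny - 1):
            a, b = i * ny + j, i * ny + j + 1
            c, d = (i + 1) * ny + j, (i + 1) * ny + j + 1
            faces += [(a, b, c), (b, d, c)]   # two triangles per grid cell
    return vertices, np.array(faces)
```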

Environment-aware Generative Design


The design workflow was developed in Rhinoceros
and Grasshopper, building a custom iterative
physics-based particle system in C#.



Figure 7: The generative design process of the rule-based space-filling self-avoiding curve.

The base design consists of a continuous self-avoiding curve that adaptively fills a given design space following a vectorial field. The field, obtained from the 3D scan of the printing environment, holds data regarding the local depth as well as the slope intensity and direction at each given point of the terrain (Figure 7).
The particle-based generative curve grows from a given location following a set of geometric rules: (1) a distance-based collision detection is implemented, avoiding self-intersections and defining a minimum distance D which corresponds to the printing layer width; (2) Limited Growth: the curve expands to a defined length L, expressed as a multiplier of the starting curve length L0; (3) Bending Resistance: it holds a bending resistance B, avoiding sharp angles and imposing curvature continuity; (4) Slope Alignment: the curve segments try to align against the local maximum slope direction vectors; (5) Slope-based Density: the growth rate is proportional to the local slope intensity, creating denser areas where a higher slope is detected (Figure 8).

Figure 8: The generative design process of the rule-based space-filling self-avoiding curve.
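To make the interplay of these rules concrete, the sketch below outlines one growth step of such a curve. It is a simplified Python illustration of the C# particle system described above; parameter names and the blending weights are illustrative assumptions, not the authors' values.

```python
# Conceptual sketch of one growth step of the self-avoiding space-filling curve.
import numpy as np

def grow_step(pts, slope_dir, D=0.03, B=0.7, step=0.01, L_max=50.0, L0=1.0):
    """Propose the next point of the curve, or None when a rule stops the growth.

    pts       : (n, 2) array of curve points grown so far (n >= 2), metres.
    slope_dir : callable, xy -> unit vector of the local maximum slope direction.
    D         : minimum self-distance, equal to the printing layer width (rule 1).
    L_max, L0 : limited growth, stop once the length exceeds L_max * L0 (rule 2).
    B         : bending resistance, blend weight towards the previous direction (rule 3).
    """
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    if length > L_max * L0:                          # rule 2: limited growth
        return None
    prev_dir = pts[-1] - pts[-2]
    prev_dir /= np.linalg.norm(prev_dir)
    target = -slope_dir(pts[-1])                     # rule 4: align against the slope
    new_dir = B * prev_dir + (1.0 - B) * target      # rule 3: bending resistance
    new_dir /= np.linalg.norm(new_dir)
    candidate = pts[-1] + step * new_dir
    if len(pts) > 5:                                 # ignore the most recent neighbours
        if np.linalg.norm(pts[:-5] - candidate, axis=1).min() < D:
            return None                              # rule 1: self-avoidance at distance D
    # Rule 5 (slope-based density) would additionally scale `step` or insert extra
    # particles where the local slope intensity is high.
    return candidate
```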
The generative design solution developed is responsive to the input environment field: the space-filling curve adapts following the defined rules and according to the parameters influencing the final appearance of the curve (Figure 9). The curves are then automatically translated into a toolpath, given the desired fabrication parameters, i.e. layer height, layer width and the normal direction of extrusion (Breseghello and Naboni 2021a).
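A minimal sketch of this curve-to-toolpath translation is given below, assuming layers that follow the scanned terrain with a constant layer height; the actual translation, including the extrusion normals, follows Breseghello and Naboni (2021a), and the alternating travel direction is an illustrative choice.

```python
# Illustrative stacking of the generated base curve into a layered toolpath.
import numpy as np

def layered_toolpath(curve_xy, ground_z, layers=12, layer_h=0.012):
    """Stack the base curve into `layers` passes of `layer_h`, following the terrain.

    curve_xy : (n, 2) points of the generated space-filling curve, metres.
    ground_z : callable, xy -> terrain height sampled from the scanned height field.
    """
    base_z = np.array([ground_z(p) for p in curve_xy])
    passes = []
    for k in range(1, layers + 1):
        layer = np.c_[curve_xy, base_z + k * layer_h]   # offset each layer from the terrain
        passes.append(layer if k % 2 else layer[::-1])  # alternate travel direction
    return np.vstack(passes)
```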

Fabrication Process
The design workflow was implemented for the fabrication of two specimens bound in a rectangle of 240 cm x 120 cm, offset 20 cm from each edge of the built gravel box to reduce the likelihood of collisions with the robot during the fabrication process. The printed probes are 14.4 cm high from the surface, with 12 layers of 12 mm. Building on previous work by the authors on the calibration of the layer dimensions through speed manipulation (Breseghello and Naboni 2021b), the robot motion speed is set to 300 mm/s to produce 30 mm wide layers. In the experiment, the material is extruded at a constant flow rate.
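As a rough check of these settings, a simple volumetric continuity relation links the printing speed and the target layer cross-section to the required material flow. The numbers below are a back-of-the-envelope sketch assuming a rectangular layer cross-section, not a calibration reported in the paper.

```python
# Back-of-the-envelope continuity check (not from the paper): the volumetric flow
# needed to deposit a w x h layer cross-section at printing speed v is Q = v * w * h.
v = 300.0   # robot motion speed, mm/s
w = 30.0    # target layer width, mm
h = 12.0    # layer height, mm

Q = v * w * h                 # 108000 mm^3/s
print(Q / 1e3, "cm^3/s")      # 108.0 cm^3/s
print(Q * 60 / 1e6, "L/min")  # ~6.5 L/min of mortar at steady state
```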



Figure 9: Adaptivity of the space-filling curve to varying environmental field inputs.

The toolpath is partially oriented perpendicular to the printing bed, hence not only extruding in the vertical direction but also creating overhangs of up to 15 degrees. To reduce the overhangs, the average vector between the surface normal and the vertical was employed. Accordingly, the robot tool TCP follows the same orientation as the extrusion. The printing operation took around 24 minutes, including three scanning passes to assess the accuracy of the print (Figure 10). The two produced proof-of-concept probes weigh about 550 kg and 320 kg, respectively.
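A possible reading of this orientation strategy is sketched below: the tool axis is taken as the normalised average of the local surface normal and the world vertical, with the tilt capped at 15 degrees. This is an illustrative Python helper, not the authors' Grasshopper implementation.

```python
# Illustrative tool-axis computation: blend the surface normal with the vertical
# and cap the resulting overhang angle (assumes an upward-facing surface normal).
import numpy as np

def tool_axis(surface_normal, max_tilt_deg=15.0):
    z = np.array([0.0, 0.0, 1.0])
    n = surface_normal / np.linalg.norm(surface_normal)
    axis = n + z
    axis /= np.linalg.norm(axis)                       # average of normal and vertical
    tilt = np.degrees(np.arccos(np.clip(axis @ z, -1.0, 1.0)))
    if tilt > max_tilt_deg:                            # cap the overhang angle
        horiz = axis - (axis @ z) * z                  # horizontal component of the axis
        horiz /= np.linalg.norm(horiz)
        a = np.radians(max_tilt_deg)
        axis = np.cos(a) * z + np.sin(a) * horiz       # exactly max_tilt_deg from vertical
    return axis
```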
RESULTS
The experiment proved the successful execution of a quasi-real-time design-to-fabrication workflow for 3DCP.

The developed hardware setup allows unknown environments to be surveyed autonomously, providing sufficient accuracy for the task, while performing this operation in a computationally efficient manner.

The open-ended process developed in SENS-ENV allowed the generation of environment-specific toolpath design artefacts (Figure 11), precisely deploying material along an unknown ground and orienting the print head to control the layer dimensional features while avoiding collisions with the environment and the already deposited material.

The proposed setup provided in-process quality monitoring which allowed the tracking of uneven material settlements caused by the complex interaction between the ground slope, the concrete self-weight, and the sliding of gravel. Deviations in



the direction normal to the printing ground, assessed from the generated digital twin, were in the range of ± 1.5 mm, an acceptable discrepancy for the purpose of the experiment (Figure 12).
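Because both the scan and the planned geometry are reduced to the same height-field representation, this check amounts to a per-cell comparison. A minimal sketch, assuming the two fields share the same grid and units, is given below.

```python
# Illustrative in-process check: compare the scanned height field of the print with
# the simulated/planned one and report deviations along the ground normal.
import numpy as np

def deviation_stats(scanned_field, planned_field, tol=0.0015):
    """Per-cell deviation in metres; `tol` = 1.5 mm tolerance band."""
    diff = scanned_field - planned_field
    d = diff[~np.isnan(diff)]                  # ignore cells without scan data
    return {"mean": float(d.mean()),
            "max_abs": float(np.abs(d).max()),
            "cells_over_tol": int((np.abs(d) > tol).sum())}
```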

Figure 10: Environment-aware robotic 3DCP process over an unstructured gravel terrain.



CONCLUSIONS
The complexity of construction requires that robot systems gain new capabilities to be able to operate in unstructured and complex environments. This paper shows that robot vision can successfully enhance the ability of 3DCP to adapt to complex environments and provide tailored design solutions with an almost real-time design process, which can be adapted across a fabrication session to correct unpredictable materialisation effects (Figure 13). SENS-ENV is an initial proof-of-concept which opens a new research territory for autonomous 3DCP, where robots will be able to gain increasing agency over multiple decision layers during the fabrication process and acquire a new design role. These capabilities could, in the short term, be employed for printing for and in harsh environments, and for optimizing structural (Breseghello and Naboni 2022) and environmental performance without a pre-planned optimised design. Our future undertakings will work towards the implementation of more articulated design and monitoring logics, integrating material features, e.g. temperature and material flow. From a technical standpoint, we will work towards the implementation of a framework that, in communication with the modelling environment of Rhino/Grasshopper, allows for online monitoring and correction of the printing process.

Figure 11: Print results from the application of the environment-aware design and fabrication workflow, tested on different configurations of a gravel bed.

Figure 12: In-process scanning and analysis of the print: (A) generation of the point cloud; (B) use of a height-based field for data filtering; (C) analysis of the point cloud deviation from the simulated mesh.

Figure 13: Depth camera point cloud for live geometric analysis of the printing toolpath.



ACKNOWLEDGMENTS
The experimental work on 3DCP was carried out at the CREATE Lab at the University of Southern Denmark - Section for Civil and Architectural Engineering. The authors wish to thank Anders Prier Lindvig from SDU Robotics for his help in the camera calibration. The authors also wish to thank the industrial partner Hyperion Robotics (robotic setup), and the project partners Saint Gobain Weber Denmark (mortar material) and Danish Fibres (polypropylene fibres).
REFERENCES
Anton, A., Jipa, A., Reiter, L., Dillenburger, B. (2020). Fast Complexity: Additive Manufacturing for Prefabricated Concrete Slabs. In: Bos, F., Lucas, S., Wolfs, R., Salet, T. (eds) Second RILEM International Conference on Concrete and Digital Fabrication. DC 2020. RILEM Bookseries, vol 28. Springer, Cham. https://doi.org/10.1007/978-3-030-49916-7_102
Ayres, P., da Silva, W., Nicholas, P., Andersen, T., Greisen, J. R. (2018). SCRIM – Sparse Concrete Reinforcement in Meshworks. In: Willmann, J., Block, P., Hutter, M., Byrne, K., Schork, T. (eds) Robotic Fabrication in Architecture, Art and Design 2018, Association for Robots in Architecture, pp. 207-220. Springer. https://doi.org/10.1007/978-3-319-92294-2
Battaglia, C. A., Verian, K., Miller, M. F. (2020). DE:Stress Pavilion: Print-Cast Concrete for the Fabrication of Thin Shell Architecture. In: ACADIA 2020: Distributed Proximities, Volume II: Projects. Proceedings of the 40th Annual Conference of the Association for Computer Aided Design in Architecture (ACADIA), Online and Global, 24-30 October 2020, edited by M. Yablonina, A. Marcus, S. Doyle, M. del Campo, V. Ago, B. Slocum, pp. 202-207.
Bhooshan, S., Ladinig, J., Van Mele, T., Block, P. (2019). Function Representation for Robotic 3D Printed Concrete. In: Willmann, J., Block, P., Hutter, M., Byrne, K., Schork, T. (eds) Robotic Fabrication in Architecture, Art and Design 2018. ROBARCH 2018. Springer, Cham. https://doi.org/10.1007/978-3-319-92294-2_8
Borg Costanzi, C., Ahmed, Z., Schipper, R., Bos, F., Knaack, U., Wolfs, R. (2018). 3D Printing Concrete on Temporary Surfaces: The Design and Fabrication of a Concrete Shell Structure. Automation in Construction, 94, 395-404. https://doi.org/10.1016/j.autcon.2018.06.013
Breseghello, L., Sanin, S., Naboni, R. (2021a). Toolpath Simulation, Design and Manipulation in Robotic 3D Concrete Printing. In: Globa, A., van Ameijde, J., Fingrut, A., Kim, N., Lo, T. T. S. (eds) Projections - Proceedings of the 26th International Conference of the Association for Computer-Aided Architectural Design Research in Asia, CAADRIA 2021. The Association for Computer-Aided Architectural Design Research in Asia (CAADRIA), 623-632.
Breseghello, L., Naboni, R. (2021b). Adaptive Toolpath: Enhanced Design and Process Control for Robotic 3DCP. In: Gerber, D., Pantazis, E., Bogosian, B., Nahmad, A., Miltiadis, C. (eds) Computer-Aided Architectural Design. Design Imperatives: The Future is Now. CAAD Futures 2021. Communications in Computer and Information Science, vol 1465. Springer, Singapore. https://doi.org/10.1007/978-981-19-1280-1_19
Breseghello, L., Naboni, R. (2022). Toolpath-based design for 3D concrete printing of carbon-efficient architectural structures. Additive Manufacturing, 56, 102872. https://doi.org/10.1016/j.addma.2022.102872
Buswell, R., Kinnell, P., Xu, J., Hack, N., Kloft, H., Maboudi, M., Gerke, M., Massin, P., Grasser, G., Wolfs, R., Bos, F. (2020). Inspection Methods for 3D Concrete Printing. RILEM Bookseries, 28, 790-803. https://doi.org/10.1007/978-3-030-49916-7_78
Daniilidis, K. (1998). Hand-eye calibration using dual quaternions. International Journal of Robotics Research, 18, 286-298.



Gislason, S., Bruhn, S., Breseghello, L., Sen, B., Liu, G., Naboni, R. (2022). Lightweight 3D Printed Concrete Beams Show an Environmental Promise: A Cradle-to-Grave Comparative Life Cycle Assessment. Clean Technologies and Environmental Policy. In press.
Naboni, R. (2022). Cyber-Physical Construction and Computational Manufacturing. In: Bolpagni, M., Gavina, R., Ribeiro, D. (eds) Industry 4.0 for the Built Environment. Structural Integrity, vol 20. Springer, Cham.
Sutjipto, S., Tish, D., Paul, G., Vidal-Calleja, T., Schork, T. (2019). Towards Visual Feedback Loops for Robot-Controlled Additive Manufacturing. In: Willmann, J., Block, P., Hutter, M., Byrne, K., Schork, T. (eds) Robotic Fabrication in Architecture, Art and Design 2018. ROBARCH 2018. Springer, Cham. https://doi.org/10.1007/978-3-319-92294-2_7
Wolfs, R., Bos, F. P., van Strien, E., Salet, T. A. M. (2017). A Real-Time Height Measurement and Feedback System for 3D Concrete Printing. In: High Tech Concrete: Where Technology and Engineering Meet - Proceedings of the 2017 fib Symposium, 1. https://doi.org/10.1007/978-3-319-59471-2
Yuan, P. F., Zhan, Q., Wu, H., Beh, H. S., Zhang, L. (2022). Real-time toolpath planning and extrusion control (RTPEC) method for variable-width 3D concrete printing. Journal of Building Engineering, 46. https://doi.org/10.1016/j.jobe.2021.103716
