
ARTICLE

Measuring BIM performance: Five metrics


Bilal Succar*, Willy Sher and Anthony Williams
School of Architecture and Built Environment, University of Newcastle, Callaghan Campus, NSW 2308, Australia

Abstract
The term Building Information Modelling (BIM) refers to an expansive knowledge domain within the design, construction and operation (DCO) industry. The voluminous possibilities attributed to BIM represent an array of challenges that can be met through a systematic research and delivery framework spawning a set of performance assessment and improvement metrics. This article identifies five complementary components specifically developed to enable such assessment: (i) BIM capability stages, representing transformational milestones along the implementation continuum; (ii) BIM maturity levels, representing the quality, predictability and variability within BIM stages; (iii) BIM competencies, representing incremental progressions towards and improvements within BIM stages; (iv) Organizational Scales, representing the diversity of markets, disciplines and company sizes; and (v) Granularity Levels, enabling highly targeted yet flexible performance analyses ranging from informal self-assessment to high-detail, formal organizational audits. This article explores these complementary components and positions them as a systematic method to understand BIM performance and to enable its assessment and improvement. A flowchart of the contents of this article is provided (Figure 1).

Keywords: Building Information Modelling; capability and maturity models; performance assessment and improvement

A BRIEF INTRODUCTION TO BUILDING INFORMATION MODELLING (BIM)

BIM is a term that is used by different authors in many different ways (Figure 1). The nuances between their definitions highlight the rapid growth the area has experienced, as well as the potential for confusion to arise when ill-defined terminology is used to communicate specific meanings. In the context of this article, BIM refers to a set of interacting policies, processes and technologies (illustrated in Figure 2) that generate a 'methodology to manage the essential building design and project data in digital format throughout the building's life-cycle' (Penttilä, 2006). It is important to identify the knowledge structures, internal dynamics and implementation requirements of BIM if confusion and duplication of effort are to be avoided.

SOME INDICATORS OF THE PROLIFERATION OF BIM

There are many signs that the use of BIM tools and processes is reaching a tipping point in some markets (Keller, Gerjets, Scheiter, & Garsoffky, 2006; McGraw-Hill, 2009). For example, in the USA an increasing number of large institutional clients now require object-based three-dimensional (3D) models to be provided as a part of tender submissions (Ollerenshaw, Aidman, & Kidd, 1997). Furthermore, the UK Cabinet Office has recently published a construction strategy article that requires the submission of a 'fully collaborative 3D BIM (with all project and asset information, documentation and data being electronic) as a minimum by 2016' (BIS, 2011; UKCO, 2011, p. 14). Other signs include the abundance of BIM-specific software tools, books, new media tools and reports (Eppler & Platts, 2009).

*Corresponding author. E-mail: [email protected]


Architectural Engineering and Design Management, 2012, Volume 8, 120–142. http://dx.doi.org/10.1080/17452007.2012.659506
© 2012 Taylor & Francis. ISSN 1745-2007 (print), 1752-7589 (online). www.tandfonline.com/taem

FIGURE 1 Flowchart of the contents of this article

ISSUES ARISING FROM THE PROLIFERATION OF BIM

Notwithstanding the much-touted benefits of BIM as a means of increasing productivity, there are currently few metrics that measure such improvements. Furthermore, little guidance is available for organizations wishing to generate new or enhance their existing BIM deliverables. Those wishing to adopt BIM or identify and/or prioritize their requirements are thus left to their own devices. The implementation of any new technology is fraught with challenges and BIM is no exception. In addition, those implementing BIM frequently expect to be able to realize significant benefits and productivity gains while they are still inexperienced users. Successful implementation of these systems requires an appreciation of how BIM resources (including hardware and software as well as the technical and management skills of staff) need to evolve in harmony with each other. The multiple and varied understandings that practitioners have of BIM further compound the difficulties they experience. When the unforeseen happens, the risks, costs and difficulties associated with implementing BIM increase. In such circumstances compromises are likely to be made, leading, in turn, to users' expectations not being met.


FIGURE 2 The interlocking fields of BIM activity

THE NEED FOR BIM PERFORMANCE METRICS

BIM use needs to be assessable if the productivity improvements that result from its implementation are to be made apparent. Without such metrics, teams and organizations are unable to consistently measure their own successes and/or failures. Performance metrics enable teams and organizations to assess their own competencies in using BIM and, potentially, to benchmark their progress against that of other practitioners. Furthermore, robust sets of BIM metrics lay the foundations for formal certification systems, which could be used by those procuring construction projects to pre-select BIM service providers.

DEVELOPING BIM METRICS AND BENCHMARKS

Although it is important to develop metrics and benchmarks for BIM performance assessment, it is equally important that these metrics are accurate and able to be adapted to different industry sectors and organizations. Considerable insight can be gained from the performance measurement tools developed for other industries, but it would be foolhardy to rely on any tool which is not designed for the specific requirements of the task in question. Those required to measure key BIM deliverables/requirements across the construction supply chain are no exception.


This article describes a set of metrics purposefully developed to measure the specifics of BIM performance. To increase their reliability, adoptability and usability for different stakeholders, the first-named author identified the following performance criteria. The metrics should be:

- Accurate: well-defined and able to measure performance at high levels of precision.
- Applicable: able to be utilized by all stakeholders across all phases of a project's lifecycle.
- Attainable: achievable if defined actions are undertaken.
- Consistent: yield the same results when conducted by different assessors.
- Cumulative: set as logical progressions; deliverables from one act as prerequisites for another.
- Flexible: able to be performed across markets, Organizational Scales and their subdivisions.
- Informative: provide 'feedback for improvement' and 'guidance for next steps' (Nightingale & Mize, 2002, p. 19).
- Neutral: not prejudice proprietary, non-proprietary, closed, open, free or commercial solutions or schemata.
- Specific: serve the specific requirements of the construction industry.
- Universal: apply equally across markets and geographies.
- Usable: intuitive and able to be easily employed to assess BIM performance.

This article describes the development of a set of BIM performance metrics based on these guiding principles. It introduces a set of complementary knowledge components that enable BIM performance assessment and facilitate its improvement.

RESEARCH DESIGN

The investigations described in this article are part of a larger PhD study which addresses the question of how to represent BIM knowledge structures and provide models that facilitate the implementation of BIM in academic and industrial settings. It is grounded in a set of paradigms, theories, concepts and experiences which combine to form the view of the BIM domain reported here.

CONCEPTUAL BACKGROUND

According to Maxwell (2005), the conceptual background underpinning a study such as this is typically based on several sources, including previous research and existing theories, the researcher's own experiential knowledge and thought experiments. Various theories (including systems theory (Ackoff, 1971; Chun, Sohn, Arling, & Granados, 2008), systems thinking (Chun et al., 2008), diffusion of innovation theory (Fox & Hietanen, 2007; Mutai, 2009; Rogers, 1995), technology acceptance models (Davis, 1989; Venkatesh & Davis, 2000) and complexity theory (Froese, 2010; Homer-Dixon, 2001)) assisted in analysing the BIM domain and enriched the study's conceptual background. Constraints identified in these theories led to the development of a new theoretical framework based on an inductive approach '[more suitable for researchers who are more concerned about] the correspondence of their findings to the real world than their coherence with existing theories or laws' (Meredith, Raturi, Amoako-Gyampah, & Kaplan, 1989, p. 307).

METHODOLOGY AND VALIDATION

The five components of BIM performance measurement are some of the deliverables of the BIM framework developed after assessing numerous publicly available international guidelines (Succar, 2009). The framework itself is composed of a number of high-level concepts that interact to generate a set of guides and tools necessary to (i) facilitate BIM implementations; (ii) conduct BIM performance assessments; and (iii) generate multi-tiered educational curricula.

The theoretical underpinnings of the BIM framework have been generated through a process of inductive inference (Michalski, 1987), conceptual clustering (Michalski & Stepp, 1987) and reflective learning (Van der Heijden & Eden, 1998; Walker, Bourne, & Shelley, 2008). Framework components were then represented visually through a series of 'knowledge models' to reduce topic complexity (Tergan, 2003) and facilitate knowledge transfer to others (Eppler & Burkhard, 2005).


Many of the BIM framework's components – fields, stages, lenses, steps, competencies and several visual knowledge models – have been subjected to a process of validation through a series of international focus groups employing a mixed-model approach (Tashakkori & Teddlie, 1998). The results from these focus groups and their impact on the development of the five components of BIM performance measurement will be published separately.

THE FIVE COMPONENTS OF BIM PERFORMANCE MEASUREMENT

The first-named author identified five BIM framework components as those required to enable accurate and consistent BIM performance measurement (Succar, 2010b). These include BIM capability stages, BIM maturity levels, BIM competency sets, Organizational Scales and Granularity Levels. The following sections provide brief introductions to each component. They are followed by a step-by-step workflow which allows BIM capability and maturity assessments to be conducted.

BIM CAPABILITY STAGES

BIM capability is defined here as the basic ability to perform a task or deliver a BIM service/product. BIM capability stages (or BIM stages) define the minimum BIM requirements – the major milestones that need to be reached by teams or organizations as they implement BIM technologies and concepts. Three BIM stages separate 'pre-BIM', a fixed starting point representing industry status before BIM implementation, from 'post-BIM', a variable end-point representing the continually evolving goal of employing virtually integrated design, construction and operation (viDCO) tools and concepts. (The term viDCO is used in preference to integrated project delivery (IPD) as representing the ultimate goal of implementing BIM (AIA, 2007) to prevent any confusion with the term's evolving contractual connotations within the United States.) The stages are:

- BIM stage 1: object-based modelling;
- BIM stage 2: model-based collaboration;
- BIM stage 3: network-based integration.

BIM stages are defined by their minimum requirements. For example, to be considered as having achieved BIM capability stage 1, an organization needs to have deployed an object-based modelling software tool similar to ArchiCAD, Revit, Tekla or Vico. Similarly, for BIM capability stage 2, an organization needs to be engaged in a multidisciplinary 'model-based' collaborative project. To be considered at BIM capability stage 3, an organization needs to be using a network-based solution which links to external databases and shares object-based models with at least two other disciplines – a solution similar to a model server or BIMSaaS solution (BIMserver, 2011; Onuma, 2011; Wilkinson, 2008).

Each of these three capability stages may be further subdivided into competency steps. What differentiates stages from steps is that stages are transformational or radical changes, while steps are incremental ones (Henderson & Clark, 1990; Taylor & Levitt, 2005). The collection of steps involved in working towards or within a BIM stage (i.e. across the continuum from pre-BIM to post-BIM) is driven by different prerequisites for, challenges within and deliverables of each BIM stage. In addition to their type (the competency set they belong to – refer to Section BIM competency sets), BIM steps can also be identified according to their location on the continuum shown in Figure 3:

- A steps: from pre-BIM status leading to BIM stage 1;
- B steps: from BIM stage 1 leading towards BIM stage 2;
- C steps: from BIM stage 2 leading towards BIM stage 3;
- D steps: from BIM stage 3 leading towards post-BIM.

FIGURE 3 Step sets leading to or separating BIM stages – v1.1
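Read as data, the continuum above is an ordered list of waypoints with the four step sets linking adjacent waypoints. The following Python sketch simply encodes that classification as a reading aid; it is an illustration of the text, not part of the published framework.

```python
# Illustrative sketch of the capability continuum described above:
# waypoints are transformational stages; the step sets between
# adjacent waypoints (A-D) group the incremental competency steps.
CONTINUUM = ["pre-BIM", "BIM stage 1", "BIM stage 2", "BIM stage 3", "post-BIM"]

STEP_SETS = {
    "A": ("pre-BIM", "BIM stage 1"),
    "B": ("BIM stage 1", "BIM stage 2"),
    "C": ("BIM stage 2", "BIM stage 3"),
    "D": ("BIM stage 3", "post-BIM"),
}

def step_set_between(start: str, end: str) -> str:
    """Return the step-set label linking two adjacent waypoints."""
    for label, (s, e) in STEP_SETS.items():
        if (s, e) == (start, end):
            return label
    raise ValueError("waypoints are not adjacent on the continuum")

print(step_set_between("BIM stage 1", "BIM stage 2"))  # 'B'
```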

BIM MATURITY LEVELS

The term 'BIM maturity' refers to the quality, repeatability and degree of excellence within a BIM capability. Although 'capability' denotes a minimum ability (refer to Section BIM capability stages), 'maturity' denotes the extent of that ability in performing a task or delivering a BIM service/product. BIM maturity's benchmarks are performance improvement milestones (or levels) that teams and organizations aspire to or work towards. In general, the progression from lower to higher levels of maturity indicates (i) improved control resulting from fewer variations between performance targets and actual results; (ii) enhanced predictability and forecasting of reaching cost, time and performance objectives; and (iii) greater effectiveness in reaching defined goals and setting new, more ambitious ones (Lockamy III & McCormack, 2004; McCormack, Ladeira, & de Oliveira, 2008).
The concept of BIM maturity has been adopted from the Software Engineering Institute's (SEI) capability maturity model (CMM) (SEI, 2008a), a process improvement framework initially intended as a tool to evaluate the ability of government contractors to deliver software projects. CMM originated in the field of quality management (Crosby, 1979) and was later developed for the benefit of the US Department of Defence (Hutchinson & Finnemore, 1999). Its successor, the more comprehensive capability maturity model integration (CMMI) (SEI, 2006a, 2006b, 2008c), continues to be developed and extended by the SEI, Carnegie Mellon University. Several CMM variants exist for other industries (Succar, 2010a) but they are all, in essence, specialized frameworks that assist stakeholders to improve their capabilities (Jaco, 2004) and benefit from process improvements. Example benefits include increased productivity and return on investment as well as reduced costs and post-delivery defects (Hutchinson & Finnemore, 1999).

Maturity models are typically composed of multiple maturity levels, or process improvement 'building blocks' or 'components' (Paulk, Weber, Garcia, Chrissis, & Bush, 1993). When the requirements of each level are satisfied, implementers can then build on established components to attempt 'higher' maturity. Although CMMs are not without their detractors (e.g. Bach, 1994; Jones, 1994; Weinberg, 1993), research conducted in other industries has already identified a correlation between improved process maturity and business performance (Lockamy III & McCormack, 2004).
The 'original' software industry CMM, however, is not applicable to the construction industry. It does not address supply chain issues, and its maturity levels do not account for the different phases of the lifecycle of a construction project (Sarshar et al., 2000). Although other efforts, derived from CMM, focus on the construction industry (refer to Table 1), there is no comprehensive maturity model/index that can be applied to BIM, its implementation stages, players, deliverables or its effect on project lifecycle phases.

The CMMs listed in Table 1 are similar in structure and objectives but differ in conceptual depth, industrial focus, terminology and target audience. A common theme is how CMMs employ simple experience-based classifications and benchmarks to facilitate continuous improvement within organizations. In analysing their suitability for developing a BIM-specific maturity index, most are broad in approach and can collectively form a basis for a range of BIM processes, technologies and policies. However, none easily accommodates the size of organizations being monitored. Also, from a terminology standpoint, there is insufficient differentiation between the notion of capability (an ability to perform a task) and that of maturity (the degrees of excellence in performing a task). This differentiation is critical when catering for staged BIM implementation as it responds to the disruptive and expansive nature of BIM.

To address the aforementioned shortcomings, the BIM maturity index (BIMMI) has been developed by analysing and then integrating these and other maturity models used across different industries.

TABLE 1 Maturity models influencing the BIM maturity index

BIM proficiency matrix – The Indiana University Architect's Office
The BIM proficiency matrix is 'used to assess the proficiency of a respondent's skill at working in a BIM environment'. The matrix is 'adaptable to project needs' and intends to communicate 'owner intent regarding BIM objectives' (IU, 2009a, pp. 15 and 16). It is a static, multi-worksheet MS Excel workbook (IU, 2009b) which includes eight categories to be assessed. Upon assessment, a score ranging from one to four points is assigned against each category. Points for each category are then tallied and the total BIM maturity score is calculated. The matrix identifies five 'BIM standards' which a project can achieve, should achieve or has already achieved depending on when the matrix is deployed. The five proficiency levels (or BIM standards) are: 'working towards BIM' – the lowest standard, 'certified BIM', 'silver', 'gold' and 'ideal' – the highest BIM maturity standard. Sample representation: 'simplified matrix', an Excel worksheet from the BIM proficiency matrix (IU, 2009b).

BIM QuickScan – TNO Built Environment and Geosciences
The BIM QuickScan tool aims to 'serve as a standard BIM benchmarking instrument in the Netherlands'. The scan is intended to be performed 'in a limited time of maximum one day' (Sebastian & Van Berlo, 2010, pp. 255 and 258). The tool is organized around four chapters: organization and management; mentality and culture; information structure and information flow; and tools and applications. 'Each chapter contains a number of KPIs in the form of a multiple-choice questionnaire... With each KPI, there are a number of possible answers. For each answer, a score is assigned. Each KPI also carries a certain weighting factor. The sum of all the partial scores after considering the weighting factors represents the total score of BIM performance of an organization' (Sebastian & Van Berlo, 2010, pp. 258 and 259). KPIs are assessed against a percentile score, while 'chapters', representing a collation of KPIs, are assessed against a five-level system (0 to 4). Sample representation: score representation (by category) from the sample BIM QuickScan report (TNO, 2010).

COBIT, Control objects for information and related technology – Information Systems Audit and Control Association (ISACA) and the IT Governance Institute (ITGI)
The main objective of COBIT is to 'enable the development of clear policy and good practice for IT control throughout organizations' (Lainhart, 2000, p. 22). The COBIT maturity model is 'an IT governance tool used to measure how well developed the management processes are with respect to internal controls. The maturity model allows an organization to grade itself from non-existent (0) to optimized (5)' (Pederiva, 2003, p. 1). COBIT includes six maturity levels (non-existent, initial/ad hoc, repeatable but intuitive, defined process, managed and measurable, and optimized), four domains and 34 control objectives (Lainhart, 2000). Note: there is some alignment between ITIL (OGC, 2009) and COBIT with respect to IT governance within organizations (Sahibudin, Sharifi, & Ayat, 2008) of value to BIM implementation efforts.

CMMI, Capability maturity model integration – Software Engineering Institute/Carnegie Mellon
CMMI is a process improvement approach that helps integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes (SEI, 2006b, 2006c, 2008a, 2008b, 2008c). CMMI has five maturity levels for its staged representation (six capability levels for its continuous representation), 16 core process areas (22 for CMMI-DEV and 24 for CMMI-SVC) and one to four goals for each process area. The five maturity levels are: initial, managed, defined, quantitatively managed and optimizing. Sample representation: NASA, Software Engineering Process Group (http://bit.ly/CMMI-NASA).

CSCMM, Construction supply chain maturity model
'Construction supply chain management (CSCM) refers to the management of information, flow, and money in the development of a construction project' (Vaidyanathan & Howell, 2007, p. 170). CSCMM has four maturity stages: ad hoc, defined, managed and controlled (Vaidyanathan & Howell, 2007).

iBIM – integrated Building Information Modelling
The iBIM maturity model – introduced in Bew, Underwood, Wix, and Storer (2008) – has been devised 'to ensure clear articulation of the standards and guidance notes, their relationship to each other and how they can be applied to projects and contracts in industry' (BIS, 2011, p. 40). The iBIM model identifies specific capability targets (not performance milestones) for the UK construction industry covering technology, standards, guides, classifications and delivery (total number of topics not defined). Targets for each topic are organized under one or more loosely defined maturity levels (0–3) (BIS, 2011).

I-CMM, Interactive capability maturity model – National Institute for Building Sciences (NIBS) Facility Information Council (FIC)
The I-CMM is closely coupled with the NBIMS effort (version 1, part 1) and establishes 'a tool to determine the level of maturity of an individual BIM as measured against a set of weighted criteria agreed to be desirable in a Building Information Model' (Suermann, Issa, & McCuen, 2008, p. 2; NIST, 2007; NIBS, 2007). The I-CMM has 11 'areas of interest' measured against 10 maturity levels.

Knowledge retention maturity levels
Arif, Egbu, Alom, and Khalfan (2009) introduced four levels of knowledge retention maturity. Knowledge management is an integral part of BIM capability and subsequent maturity. The matrix thus incorporates these levels: (i) knowledge is shared between employees; (ii) shared knowledge is documented (transferred from tacit to explicit); (iii) documented knowledge is stored; and (iv) stored knowledge is accessible and easily retrievable (Arif et al., 2009).

LESAT, Lean Enterprise Self-Assessment Tool – Lean Aerospace Initiative (LAI) at the Massachusetts Institute of Technology (MIT)
LESAT is focused on 'assessing the degree of maturity of an enterprise in its use of "lean" principles and practices to achieve the best value for the enterprise and its stakeholders' (Nightingale & Mize, 2002, p. 17). LESAT has 54 lean practices organized within three assessment sections (lean transformation/leadership, life cycle processes, and enabling infrastructure) and five maturity levels: some awareness/sporadic, general awareness/informal, systemic approach, ongoing refinement and exceptional/innovative (Nightingale & Mize, 2002).

P3M3, Portfolio, programme and project management maturity model – Office of Government Commerce
The P3M3 provides 'a framework with which organizations can assess their current performance and put in place improvement plans with measurable outcomes based on industry best practice' (OGC, 2008, p. 8). The P3M3 has five maturity levels: awareness, repeatable, defined, managed and optimized (OGC, 2008).

P-CMM, People capability maturity model v2 – Software Engineering Institute/Carnegie Mellon
P-CMM is an 'organizational change model' and a 'roadmap for implementing workforce practices that continuously improve the capability of an organization's workforce' (SEI, 2008d, pp. 3 and 15). P-CMM has five maturity levels: initial, managed, defined, predictable and optimizing (SEI, 2008d).

(PM)2, Project management process maturity model
The project management process maturity (PM)2 model 'determines and positions an organization's relative project management level with other organizations'. It also aims to integrate PM 'practices, processes, and maturity models to improve PM effectiveness in the organization' (Kwak & Ibbs, 2002, p. 150). (PM)2 has five maturity levels: initial, planned, managed at project level, managed at corporate level and continuous learning (Kwak & Ibbs, 2002).

SPICE, Standardized process improvement for construction enterprises – Research Centre for the Built and Human Environment, The University of Salford
SPICE is a project which developed a framework for continuous process improvement for the construction industry. SPICE is an 'evolutionary step-wise model utilizing experience from other sectors, such as manufacturing and IT' (Hutchinson & Finnemore, 1999, p. 576; Sarshar et al., 2000). SPICE has five stages: initial/chaotic, planned & tracked, well defined, quantitatively controlled, and continuously improving (Hutchinson & Finnemore, 1999).

Supply chain management process maturity model and business process orientation (BPO) maturity model
The model conceptualizes the relation between process maturity and supply chain operations as based on the supply-chain operations reference model (Stephens, 2001). The model's maturity levels describe the 'progression of activities toward effective SCM and process maturity. Each level contains characteristics associated with process maturity such as predictability, capability, control, effectiveness and efficiency' (Lockamy III & McCormack, 2004, p. 275; McCormack, 2001). The five maturity levels are: ad hoc, defined, linked, integrated and extended (Lockamy III & McCormack, 2004).

Note: Other maturity models – or variations on listed maturity models – include those on software process improvement (Hardgrave & Armstrong, 2005), IS/ICT management capability (Jaco, 2004), interoperability (Widergren, Levinson, Mater, & Drummond, 2010), project management (Crawford, 2006), competency (Gillies & Howard, 2003) and financial management (Doss, Chen, & Holland, 2008).
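The scoring mechanics described in Table 1 for the IU proficiency matrix (tallied category points) and the BIM QuickScan (weighted KPI scores) reduce to simple arithmetic. The following Python sketch illustrates both; it is illustrative only, and the category names, weights and scores are invented examples rather than data from either tool.

```python
# Illustrative sketch of the two scoring schemes described in Table 1.
# All names and numbers below are hypothetical examples, not values
# taken from the IU proficiency matrix or the TNO BIM QuickScan.

def tallied_score(category_points: dict[str, int]) -> int:
    """IU-style: each assessed category scores 1-4 points; the total
    BIM maturity score is the tally of all category points."""
    return sum(category_points.values())

def weighted_score(kpi_scores: dict[str, float],
                   kpi_weights: dict[str, float]) -> float:
    """QuickScan-style: each KPI answer yields a score and each KPI
    carries a weighting factor; the weighted sum is the total score."""
    return sum(score * kpi_weights[kpi] for kpi, score in kpi_scores.items())

# Hypothetical usage:
print(tallied_score({"category A": 3, "category B": 2, "category C": 4}))  # 9
print(weighted_score({"KPI 1": 3.0, "KPI 2": 2.0},
                     {"KPI 1": 0.4, "KPI 2": 0.6}))                        # 2.4
```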

FIGURE 4 Building Information Modelling maturity levels at BIM stage 1

The BIMMI has been customized to reflect the specifics of BIM capability, implementation requirements, performance targets and quality management. It has five distinct levels: (a) initial/ad hoc, (b) defined, (c) managed, (d) integrated and (e) optimized (Figure 4). Level names were chosen to reflect the terminology used in many maturity models, to be easily understandable by DCO stakeholders and to reflect increasing BIM maturity from ad hoc to continuous improvement (Table 2).
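As a reading aid, the sketch below pairs the two notions the index separates: a capability stage (the minimum ability reached) and a BIMMI maturity level (the degree of excellence within that stage). It is a minimal illustration of the definitions above, not tooling from the framework itself.

```python
# Minimal sketch pairing a capability stage (minimum ability; 0 = pre-BIM,
# 1-3 = BIM stages 1-3 as defined earlier) with a BIMMI maturity level
# (quality, repeatability and degree of excellence within that stage).
from dataclasses import dataclass
from enum import Enum

class BIMMILevel(Enum):
    INITIAL_AD_HOC = "a"
    DEFINED = "b"
    MANAGED = "c"
    INTEGRATED = "d"
    OPTIMIZED = "e"

@dataclass
class BIMAssessment:
    capability_stage: int   # 0 = pre-BIM, 1-3 = BIM stages 1-3
    maturity: BIMMILevel    # maturity within the current stage

    def __post_init__(self) -> None:
        if not 0 <= self.capability_stage <= 3:
            raise ValueError("capability stage must be 0-3")

# Example: BIM stage 1 capability assessed at maturity level (b) 'defined'.
assessment = BIMAssessment(capability_stage=1, maturity=BIMMILevel.DEFINED)
```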

BIM COMPETENCY SETS


A BIM competency set is a hierarchical collection of individual competencies identified for the purposes of implementing and assessing BIM. In this context, the term competency reflects a generic set of abilities suitable for implementing as well as assessing BIM capability and/or maturity. Figure 5 illustrates how the BIM framework generates BIM competency sets out of multiple fields, stages and lenses (Succar, 2009).
BIM competencies are a direct reflection of BIM requirements and deliverables and can be grouped into three sets, namely technology, process and policy:

- Technology sets in software, hardware and data/networks. For example, the availability of a BIM tool allows the migration from drafting-based to object-based workflow (a requirement of BIM stage 1).
- Process sets in resources, activities/workflows, products/services and leadership/management. For example, collaboration processes and database-sharing skills are necessary to allow model-based collaboration (BIM stage 2).
- Policy sets in benchmarks/controls, contracts/agreements and guidance/supervision. For example, alliance-based or risk-sharing contractual agreements are pre-requisites for network-based integration (BIM stage 3).

Figure 6 provides a partial mind-map of BIM competency sets shown at Granularity Level 2 (for an explanation of Granularity Levels, please refer to Section BIM Granularity Levels).
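The three competency sets and the sub-headings named above form a small hierarchy, recorded below as a plain mapping. This is a reading aid reflecting only the headings in this section; the framework's full competency tables contain many more entries.

```python
# The three BIM competency sets and their sub-headings, as named above.
# Reading aid only: the framework holds many individual competencies
# beneath each of these headings.
COMPETENCY_SETS = {
    "technology": ["software", "hardware", "data/networks"],
    "process": ["resources", "activities/workflows",
                "products/services", "leadership/management"],
    "policy": ["benchmarks/controls", "contracts/agreements",
               "guidance/supervision"],
}

for set_name, headings in COMPETENCY_SETS.items():
    print(f"{set_name}: {', '.join(headings)}")
```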

BIM ORGANIZATIONAL SCALES


To allow BIM performance assessments to respect the diversity of markets, disciplines and company sizes, an Organizational Scale (OScale) has been developed. The scale can be used to customize assessment efforts and is depicted in Table 3.

BIM GRANULARITY LEVELS


Competency sets include a large number of individual competencies grouped under numerous headings (shown in Figure 6). To enhance BIM capability and maturity assessments and to increase their flexibility, a granularity 'filter' with four Granularity Levels (GLevels) has been developed. Progression from lower to higher levels of granularity indicates an increase in (i) assessment breadth, (ii) scoring detail, (iii) formality and (iv) assessor specialization. Using higher Granularity Levels (GLevel 3 or 4) exposes more detailed competency areas than lower Granularity Levels (GLevel 1 or 2). This variability enables the preparation of several BIM performance measurement tools, ranging from low-detail, informal and self-administered assessments to high-detail, formal and specialist-led appraisals. Table 4 provides more information about the four Granularity Levels.
TABLE 2 A non-exhaustive list of terminology used by CMMs to denote maturity levels, including those used by the BIM maturity index. Levels are listed in ascending order (0, then 1/a through 5/e).

- BIM maturity index: (a) initial/ad hoc; (b) defined; (c) managed; (d) integrated; (e) optimized
- COBIT, Control objects for information and related technology: (0) non-existent; (1) initial/ad hoc; (2) repeatable but intuitive; (3) defined process; (4) managed and measurable; (5) optimized
- CMMI, Capability maturity model integration (staged representation): (1) initial; (2) managed; (3) defined; (4) quantitatively managed; (5) optimizing
- CMMI (continuous representation): (0) incomplete; (1) performed; (2) managed; (3) defined; (4) quantitatively managed; (5) optimizing
- CSCMM, Construction supply chain maturity model: (1) ad hoc; (2) defined; (3) managed; (4) controlled
- LESAT, Lean enterprise self-assessment tool: (1) awareness/sporadic; (2) general awareness/informal; (3) systemic approach; (4) ongoing refinement; (5) exceptional/innovative
- P-CMM, People capability maturity model: (1) initial; (2) managed; (3) defined; (4) predictable; (5) optimizing
- P3M3, Portfolio, programme and project management maturity model: (1) awareness; (2) repeatable; (3) defined; (4) managed; (5) optimized
- (PM)2, Project management process maturity model: (1) ad hoc; (2) planned; (3) managed at project level; (4) managed at corporate level; (5) continuous learning
- SPICE, Standardized process improvement for construction enterprises: (1) initial/chaotic; (2) planned & tracked; (3) well defined; (4) quantitatively controlled; (5) continuously improving
- Supply chain management process maturity model: (1) ad hoc; (2) defined; (3) linked; (4) integrated; (5) extended

FIGURE 5 Structure of BIM competency sets v1.0

Granularity Levels increase or decrease the number of competency areas used for performance assessment. For example, the mind map provided in Figure 6 reveals 10 competency areas at GLevel 1 and 41 competency areas at GLevel 2. Also, at GLevels 3 and 4, the number of competency areas available for performance assessment increases dramatically, as shown in Figure 7. The partial mind-map shown in Figure 7 reveals many additional competency areas under GLevel 3, such as data types and data structures. At GLevel 4, the map reveals even more detailed competency areas, including structured and unstructured data, which in turn branch into computable and non-computable components (Fallon & Palmer, 2007; Kong et al., 2005; Mathes, 2004).

FIGURE 6 BIM Competency sets v3.0 – shown at Granularity Level 2
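One way to picture the granularity 'filter' is as a cut-off depth on the competency mind-map: lower GLevels expose only the top-level areas, while higher GLevels walk deeper into the tree. The Python sketch below illustrates the idea on a tiny, invented tree; only the node names drawn from the examples above come from the text, and the real mind-map is far larger (10 areas at GLevel 1, 41 at GLevel 2).

```python
# Illustrative only: a toy competency tree where nesting depth stands in
# for Granularity Levels. Deeper nodes become visible at higher GLevels;
# node names echo the examples in the text, but the tree is not complete.
TREE = {
    "data/networks": {
        "data structures": {
            "structured data": {
                "computable components": {},
                "non-computable components": {},
            },
            "unstructured data": {},
        },
        "data types": {},
    },
}

def areas_at(tree: dict, depth: int) -> list[str]:
    """List the competency areas exposed when the tree is cut at `depth`."""
    if depth == 0:
        return []
    names = []
    for name, children in tree.items():
        names.append(name)
        names.extend(areas_at(children, depth - 1))
    return names

print(areas_at(TREE, 1))  # only the top-level area
print(areas_at(TREE, 4))  # the full (toy) tree
```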

APPLYING THE FIVE ASSESSMENT COMPONENTS

The aforementioned five complementary BIM framework components (capability stages, maturity levels, competency sets, Organizational Scales and Granularity Levels) allow performance assessments to be conducted involving combinations of these components. The guiding principles discussed in Section Developing BIM metrics and benchmarks all apply. To manage all possible configurations, a simple assessment and reporting workflow has been developed (Figure 8). The workflow shown in Figure 8 identifies the five steps needed to conduct a BIM performance assessment. Starting with an extensive pool of generic BIM competencies – applicable across DCO disciplines and organizational sizes – assessors can first filter out non-applicable competency sets, conduct a series of assessments based on the competencies remaining, and then generate appropriate assessment reports.

TABLE 3 Organizational Scales. Each scale is listed from low detail (macro granularity) to high detail (micro granularity), with its symbol in parentheses.

MACRO – markets:
- Market (M): markets are the 'world of commercial activity and industries where goods and services are bought and sold' (http://bit.ly/pjB3c).
- Defined market (Md): defined markets can be geographical, geopolitical or resultant from multi-party agreements similar to NAFTA or ASEAN.
- Sub-market (Ms): sub-markets can be local or regional.

MACRO – industries:
- Industry (I): industries are the organized action of making goods and services for sale. Industries can traverse markets and may be service, product or project-based; the AEC industry is mostly project-based (http://bit.ly/ielY3).
- Sector (Is): a sector is a 'distinct subset of a market, society, industry, or economy whose components share similar characteristics' (http://bit.ly/15UkZD).
- Discipline (Id): disciplines are industry sectors, 'branches of knowledge, systems of rules of conduct or methods of practice' (http://bit.ly/7jT82).
- Specialty (Isp): a specialty is a focus area of knowledge, expertise, production or service within a sub-discipline.

MESO – projects and their teams:
- Project team (P): project teams are temporary groupings of organizations with the aim of fulfilling predefined objectives of a project – a planned endeavour, usually with a specific goal and accomplished in several steps or stages (http://bit.ly/dqMYg).

MICRO – organizations, their units, groups and members:
- Organization (O): an organization is a 'social arrangement which pursues collective goals, which controls its own performance, and which has a boundary separating it from its environment' (http://bit.ly/v7p9N).
- Organizational unit (Ou): departments and units are specialized divisions of an organization; these can be co-located or distributed geographically.
- Organizational group, or team (Og): organizational groups consist of individual human resources assigned to perform an activity or deliver a set of assigned objectives. Groups (also referred to as organizational teams) can be physically co-located or formed across geographical or departmental lines.
- Organizational member (Om): organizational members can be part of multiple organizational groups.

TABLE 4 BIM competency Granularity Levels v2.1

GLevel 1 – Discovery: a low-detail assessment used for basic and semi-formal discovery of BIM capability and maturity. Discovery assessments yield a basic numerical score. OScale applicability: all scales. Assessed by: self. Report type and guide: discovery notes; BIMC&M discovery guide.

GLevel 2 – Evaluation: a more detailed assessment of BIM capability and maturity. Evaluation assessments yield a detailed numerical score. OScale applicability: all scales. Assessed by: self and peer. Report type and guide: evaluation sheets; BIMC&M evaluation guide.

GLevel 3 – Certification: a highly detailed appraisal of those competency areas applicable across disciplines, markets and sectors. Certification appraisal is used for structured (staged) capability and maturity and yields a formal, named maturity level. OScale applicability: 8 and 9. Assessed by: external consultant. Report type and guide: certificate; BIMC&M certification guide.

GLevel 4 – Auditing: the most comprehensive appraisal type. In addition to competencies covered under certification, auditing appraises detailed competency areas, including those specific to a market, discipline or sector. Audits are highly customizable, suitable for non-structured (continuous) capability and maturity, and yield a named maturity level plus a numerical maturity score for each competency area audited. OScale applicability: 8, 9, 10 and 11. Assessed by: self, peer and external consultant. Report type and guide: audit report; BIMC&M auditing guide.


FIGURE 7 Technology competency areas at Granularity Level 4 – partial mind map v3.0


FIGURE 8 BIM capability and maturity assessment and reporting workflow diagram – v2.0
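The workflow in Figure 8 can be paraphrased in code: start from the generic competency pool, drop competency sets that do not apply to the Organizational Scale being assessed, score what remains at the chosen Granularity Level, and report. The sketch below is a hypothetical paraphrase with invented names and a placeholder scoring function; it is not the framework's published tooling.

```python
# Hypothetical paraphrase of the assessment workflow in Figure 8:
# (1) start from a generic competency pool; (2) filter out competencies
# not applicable to the organizational scale being assessed; (3) score
# the remainder at the chosen Granularity Level; (4) return a report.
from dataclasses import dataclass

@dataclass
class Competency:
    name: str
    competency_set: str          # 'technology', 'process' or 'policy'
    applicable_scales: set[str]  # e.g. {'organization', 'project team'}
    glevel: int                  # GLevel at which the area is exposed

def assess(pool: list[Competency], scale: str, glevel: int,
           score_fn) -> dict[str, float]:
    """Filter the generic pool, score what remains, return a report."""
    applicable = [c for c in pool
                  if scale in c.applicable_scales and c.glevel <= glevel]
    return {c.name: score_fn(c) for c in applicable}

# Hypothetical usage: a GLevel 1 ('discovery') self-assessment.
pool = [
    Competency("BIM software deployed", "technology", {"organization"}, 1),
    Competency("model-sharing workflow", "process", {"project team"}, 2),
]
report = assess(pool, scale="organization", glevel=1,
                score_fn=lambda c: 3.0)  # placeholder scoring function
print(report)  # {'BIM software deployed': 3.0}
```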

A FINAL NOTE

The five BIM framework components, briefly discussed in this article, provide a range of opportunities for DCO stakeholders to measure and improve their BIM performance. The components complement each other and enable highly targeted yet flexible performance analyses to be conducted. These range from informal self-assessments to highly detailed and formal organizational audits. Such a system of assessment can be used to standardize BIM implementation and assessment efforts, enable a structured approach to BIM education and training, as well as establish a solid base for a formal BIM certification process.

After scrutiny of a significant part of the BIM framework through peer-reviewed publications and a series of international focus groups, the five components and other related assessment metrics are currently being extended and field tested. Sample online tools (focusing on selected disciplines, at different granularities) are currently being formulated. All these form part of an ongoing effort to promote the establishment of an independent BIM certification body responsible for assessing and accrediting individuals, organizations and collaborative project teams. Subject to additional field testing and tool calibration, the five components may be well placed to consistently assess, and by extension improve, BIM performance.

ACKNOWLEDGEMENTS

This article draws on Bilal Succar's PhD research at the University of Newcastle, School of Architecture and Built Environment (Australia). Bilal Succar wishes to acknowledge his supervisors Willy Sher, Guillermo Aranda-Mena and Anthony Williams for their continuous support.

REFERENCES

Ackoff, R.L. (1971). Towards a system of systems concepts. Management Science, 17(11), 661–671.
AIA. (2007). Integrated project delivery: A guide. AIA California Council.
Arif, M., Egbu, C., Alom, O., & Khalfan, M.M.A. (2009). Measuring knowledge retention: A case study of a construction consultancy in the UAE. Engineering, Construction and Architectural Management, 16(1), 92–108.
Bach, J. (1994). The immaturity of the CMM. American Programmer, 7(9), 13–18.
Bew, M., Underwood, J., Wix, J., & Storer, G. (2008). Going BIM in a commercial world. Paper presented at EWork and EBusiness in Architecture, Engineering and Construction: European Conferences on Product and Process Modeling (ECCPM 2008), Sophia Antipolis, France.
BIMserver. (2011, 20 October). Open source Building Information Modelserver. Retrieved from http://bimserver.org/
BIS. (2011). A report for the Government Construction Client Group, Building Information Modelling (BIM) working party strategy. Department for Business Innovation & Skills (BIS). Retrieved from http://www.cita.ie/images/assets/uk%20bim%20strategy%20(summary).pdf
Chun, M., Sohn, K., Arling, P., & Granados, N.F. (2008). Systems theory and knowledge management systems: The case of Pratt-Whitney Rocketdyne. Paper presented at the Proceedings of the 41st Hawaii International Conference on System Sciences, Hawaii.
Crawford, J.K. (2006). The project management maturity model. Information Systems Management, 23(4), 50–58.
Crosby, P.B. (1979). Quality is free: The art of making quality certain. New York: New American Library.
Davis, F.D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
Doss, D.A., Chen, I.C.L., & Holland, L.D. (2008). A proposed variation of the capability maturity model framework among financial management settings. Paper presented at the Allied Academies International Conference, Tunica.
Eppler, M., & Burkhard, R.A. (2005). Knowledge visualization. In D.G. Schwartz (Ed.), Encyclopedia of knowledge management (pp. 551–560). Covent Garden, London: Idea Group Reference.
Eppler, M.J., & Platts, K.W. (2009). Visual strategizing: The systematic use of visualization in the strategic-planning process. Long Range Planning, 42, 42–74.
Fallon, K.K., & Palmer, M.E. (2007). General buildings information handover guide: Principles, methodology and case studies. Washington, DC: US Department of Commerce.
Fox, S., & Hietanen, J. (2007). Interorganizational use of building information models: Potential for automational, informational and transformational effects. Construction Management and Economics, 25(3), 289–296.
Froese, T.M. (2010). The impact of emerging information technology on project management for construction. Automation in Construction, 19(5), 531–538.
Gillies, A., & Howard, J. (2003). Managing change in process and people: Combining a maturity model with a competency-based approach. Total Quality Management & Business Excellence, 14(7), 779–787.
Hardgrave, B.C., & Armstrong, D.J. (2005). Software process improvement: It's a journey, not a destination. Communications of the ACM, 48(11), 93–96.
Henderson, R.M., & Clark, K.B. (1990). Architectural innovation: The reconfiguration of existing product technologies and the failure of established firms. Administrative Science Quarterly, 35(1), 9–30.
Homer-Dixon, T. (2001). The ingenuity gap. Canada: Vintage.
Hutchinson, A., & Finnemore, M. (1999). Standardized process improvement for construction enterprises. Total Quality Management, 10, 576–583.
IU. (2009a). BIM design & construction requirements, follow-up seminar (PowerPoint presentation). The Indiana University Architect's Office. Retrieved from http://www.indiana.edu/~uao/IU%20BIM%20Rollout%20Presentation%209-10-2009.pdf
IU. (2009b). IU BIM proficiency matrix (multi-tab Excel workbook). The Indiana University Architect's Office. Retrieved from http://www.indiana.edu/~uao/IU%20BIM%20Proficiency%20Matrix.xls
Jaco, R. (2004). Developing an IS/ICT management capability maturity framework. Paper presented at the Proceedings of the 2004 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries, Stellenbosch, Western Cape, South Africa.
Jones, C. (1994). Assessment and control of software risks. New Jersey: Prentice-Hall.
Keller, T., Gerjets, P., Scheiter, K., & Garsoffky, B. (2006). Information visualizations for knowledge acquisition: The impact of dimensionality and color coding. Computers in Human Behavior, 22(1), 43–65.
Kong, S.C.W., Li, H., Liang, Y., Hung, T., Anumba, C., & Chen, Z. (2005). Web services enhanced interoperable construction products catalogue. Automation in Construction, 14(3), 343–352.
Kwak, Y.H., & Ibbs, W.C. (2002). Project management process maturity (PM)2 model. ASCE Journal of Management in Engineering, 18(3), 150–155.
Lainhart IV, J.W. (2000). COBIT: A methodology for managing and controlling information and information technology risks and vulnerabilities. Journal of Information Systems, 14(s-1), 21–25.
Lockamy III, A., & McCormack, K. (2004). The development of a supply chain management process maturity model using the concepts of business process orientation. Supply Chain Management: An International Journal, 9(4), 272–278.
Mathes, A. (2004). Folksonomies – Cooperative classification and communication through shared metadata. Paper presented at Computer Mediated Communication, LIS590CMC (doctoral seminar), Graduate School of Library and Information Science. Retrieved from http://www.adammathes.com/academic/computer-mediated-communication/folksonomies.html
Maxwell, J.A. (2005). Qualitative research design: An interactive approach. Thousand Oaks, CA: Sage Publications, Inc.
McCormack, K. (2001). Supply chain maturity assessment: A roadmap for building the extended supply chain. Supply Chain Practice, 3, 4–21.
McCormack, K., Ladeira, M.B., & de Oliveira, M.P.V. (2008). Supply chain maturity and performance in Brazil. Supply Chain Management: An International Journal, 13(4), 272–282.
McGraw-Hill. (2009). The business value of BIM: Getting Building Information Modeling to the bottom line. McGraw-Hill Construction Analytics. Retrieved from http://construction.com/
Meredith, J.R., Raturi, A., Amoako-Gyampah, K., & Kaplan, B. (1989). Alternative research paradigms in operations. Journal of Operations Management, 8(4), 297–326.
Michalski, R.S. (1987). Concept learning. In S.S. Shapiro (Ed.), Encyclopedia of artificial intelligence (Vol. 1, pp. 185–194). New York: Wiley.
Michalski, R.S., & Stepp, R.E. (1987). Clustering. In S.S. Shapiro (Ed.), Encyclopedia of artificial intelligence (Vol. 1, pp. 103–111). New York: Wiley.
Mutai, A. (2009). Factors influencing the use of Building Information Modeling (BIM) within leading construction firms in the United States of America (Unpublished doctoral dissertation). Indiana State University, Terre Haute.
NIBS. (2007). BIM capability maturity model. National Institute for Building Sciences (NIBS) Facility Information Council (FIC). Retrieved October 11, 2008, from www.buildingsmartalliance.org/client/assets/files/bsa/BIM_CMM_v1.9.xls
Nightingale, D.J., & Mize, J.H. (2002). Development of a lean enterprise transformation maturity model. Information Knowledge Systems Management, 3(1), 15–30.
NIST. (2007). National Building Information Modeling Standard – Version 1.0 – Part 1: Overview, principles and methodologies. Washington, DC: US Department of Commerce.
OGC. (2008). Portfolio, programme, and project management maturity model (P3M3). England: Office of Government Commerce.
OGC. (2009, 13 February). Information Technology Infrastructure Library (ITIL) – Office of Government Commerce. Retrieved from http://www.itil-officialsite.com/home/home.asp
Ollerenshaw, A., Aidman, E., & Kidd, G. (1997). Is an illustration always worth ten thousand words? Effects of prior knowledge, learning style and multimedia illustrations on text comprehension. International Journal of Instructional Media, 24(3), 227–238.
Onuma. (2011, 20 October). Onuma model server. Retrieved from http://onuma.com/products/BimDataApi.php
Paulk, M.C., Weber, C.V., Garcia, S.M., Chrissis, M.B., & Bush, M. (1993). Key practices of the capability maturity model – version 1.1 (Technical report). Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA.
Pederiva, A. (2003). The COBIT maturity model in a vendor evaluation case. Information Systems Control Journal, 3, 26–29.
Penttilä, H. (2006). Describing the changes in architectural information technology to understand design complexity and free-form architectural expression. ITcon, 11 (Special issue: The effects of CAD on building form and design quality), 395–408.
Rogers, E.M. (1995). Diffusion of innovations. New York: Free Press.
Sahibudin, S., Sharifi, M., & Ayat, M. (2008). Combining ITIL, COBIT and ISO/IEC 27002 in order to design a comprehensive IT framework in organizations. Paper presented at Modeling & Simulation, AICMS 08, Second Asia International Conference, Kuala Lumpur, Malaysia.
Sarshar, M., Haigh, R., Finnemore, M., Aouad, G., Barrett, P., Baldry, D., & Sexton, M. (2000). SPICE: A business process diagnostics tool for construction projects. Engineering Construction & Architectural Management, 7(3), 241–250.
Sebastian, R., & Van Berlo, L. (2010). Tool for benchmarking BIM performance of design, engineering and construction firms in the Netherlands. Architectural Engineering and Design Management, 6 (Special issue: Integrated design and delivery solutions), 254–263.
SEI. (2006a). Capability Maturity Model Integration for Development (CMMI-DEV), improving processes for better products. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2006b). Capability Maturity Model Integration Standard (CMMI) appraisal method for process improvement (SCAMPI) A, Version 1.2 – method definition document. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2006c). CMMI for development, improving processes for better products. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2008a, 11 October). Capability maturity model integration. Software Engineering Institute/Carnegie Mellon. Retrieved from http://www.sei.cmu.edu/cmmi/index.html
SEI. (2008b). Capability Maturity Model Integration for Services (CMMI-SVC), partner and piloting draft, V0.9c. Pittsburgh, PA: Software Engineering Institute/Carnegie Mellon.
SEI. (2008c, 24 December). CMMI for services. Retrieved from http://www.sei.cmu.edu/cmmi/models/CMMI-Services-status.html
SEI. (2008d, 11 October). People Capability Maturity Model – Version 2. Software Engineering Institute/Carnegie Mellon. Retrieved from http://www.sei.cmu.edu/cmm-p/version2/index.html
Stephens, S. (2001). Supply chain operations reference model version 5.0: A new tool to improve supply chain efficiency and achieve best practice. Information Systems Frontiers, 3(4), 471–476.
Succar, B. (2009). Building information modelling framework: A research and delivery foundation for industry stakeholders. Automation in Construction, 18(3), 357–375.
Succar, B. (2010a). Building information modelling maturity matrix. In J. Underwood & U. Isikdag (Eds.), Handbook of research on building information modelling and construction informatics: Concepts and technologies (pp. 65–103). Information Science Reference, IGI Publishing. doi:10.4018/978-1-60566-928-1.ch004
Succar, B. (2010b). The five components of BIM performance measurement. Paper presented at the CIB World Congress.
Suermann, P.C., Issa, R.R.A., & McCuen, T.L. (2008, 16–18 October). Validation of the U.S. National Building Information Modeling Standard Interactive Capability Maturity Model. Paper presented at the 12th International Conference on Computing in Civil and Building Engineering, Beijing, China.
Tashakkori, A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage.
Taylor, J., & Levitt, R.E. (2005). Inter-organizational knowledge flow and innovation diffusion in project-based industries. Paper presented at the 38th International Conference on System Sciences, Hawaii, USA.
Tergan, S.O. (2003). Knowledge with computer-based mapping tools. Paper presented at the ED-Media 2003 World Conference on Educational Multimedia, Hypermedia & Telecommunication, Honolulu, HI: University of Honolulu.
TNO. (2010). BIM QuickScan – A TNO initiative (sample QuickScan report – PDF). Retrieved from http://www.bimladder.nl/wp-content/uploads/2010/01/voorbeeld-quickscan-pdf.pdf
UKCO. (2011). Government construction strategy. London: United Kingdom Cabinet Office.
Vaidyanathan, K., & Howell, G. (2007). Construction supply chain maturity model – Conceptual framework. Paper presented at the International Group for Lean Construction (IGLC-15), Michigan, USA.
Van der Heijden, K., & Eden, C. (1998). The theory and praxis of reflective learning in strategy making. In C. Eden & J.-C. Spender (Eds.), Managerial and organizational cognition: Theory, methods and research (pp. 58–75). London: Sage.
Venkatesh, V., & Davis, F.D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186–204.
Walker, D.H.T., Bourne, L.M., & Shelley, A. (2008). Influence, stakeholder mapping and visualization. Construction Management and Economics, 26(6), 645–658.
Weinberg, G.M. (1993). Quality software management (Vol. 2): First-order measurement. New York: Dorset House Publishing Co., Inc.
Widergren, S., Levinson, A., Mater, J., & Drummond, R. (2010, 25–29 July). Smart grid interoperability maturity model. Paper presented at the Power and Energy Society General Meeting, 2010 IEEE, Minnesota, USA.
Wilkinson, P. (2008, 12 July). SaaS-based BIM. Extranet evolution – Construction collaboration technologies. Retrieved from http://www.extranetevolution.com/extranet_evolution/2008/04/saas-based-bim.html
