
Data Migration through an Information

Development Approach
A Management Overview
Introducing MIKE2.0
An Open Source Methodology for Information Development
http://www.openmethodology.org
Data Migration through an Information
Development Approach
Agenda
Data Migration through Information Development
Executive Summary
Business Drivers for Better Data Migration
Guiding Principles for Better Data Migration
MIKE2.0 Methodology: 5-phase approach to Better Business Intelligence
Example Task Outputs from Strategy Activities
Lessons Learned
Data Migration through Information Development
Scope within BearingPoint's IM Solution Suite
Information Management Solution Suite
Delivered through a collaborative approach with the IM Profession and our Alliance Vendors.
Enterprise Information Management spans: Access, Search and Content Delivery; BI and EPM; Information Asset Management; Enterprise Data Management; Enterprise Content Management; and Information Strategy, Architecture and Governance.
The suite is supported by Solution Capabilities that provide a foundation for Suite Delivery, covering both Business Solutions and Commercial & Open Source Product Solutions.
Sets the new standard for Information Development through an Open Source Offering.
Data Migration through Information Development
Executive Summary
Migrating from a legacy environment to a new system can be a straightforward activity or a very complex initiative. Migration can come in many forms:
A migration from a relatively simple system into another system
Upgrading a system to a new version through an approach that requires changing the underlying data
The convergence of multiple systems into a single composite system
A complex migration from one system to a new system, which requires the migration to be rolled out over a period of time
Multiple, concurrent systems migrations and consolidation efforts, referred to as "IT Transformation"
In most large organisations, migration of Enterprise Systems is very complex. To manage this complexity, we first aim to understand the scope of the problem and then formulate some initial solution techniques.
The MIKE2.0 Solution for Data Migration provides techniques for measuring the complexity of the Data Migration initiative and determining the activities that are required. It also defines the strategic architectural capabilities, as well as high-level solution architecture options, for solving different data migration challenges. It then moves into the set of required Foundation Activities, Incremental Design and Delivery steps. This Executive Summary presents some of the strategy activities.
Similar MIKE2.0 Solutions include:
The MIKE2.0 Solution for IT Transformation, which provides a complementary Solution Approach for dealing with these issues on a very large scale
The MIKE2.0 Solution for Master Data Management, which provides an approach for running multiple systems in an ongoing fashion that synchronise data sets such as customer, product, employee and locality
Data Migration through Information Development
Business Drivers for Better Data Migration
Achieve:
Better data quality in the new target systems
A systematic approach to prioritising the functionality to be moved to the target
Alignment of related migration initiatives
A standards-based approach to large-scale systems implementation
The ability to run co-existent applications to reduce deployment risk
The ability to trace the flow of information across all systems in the architecture
Building new analytical systems as part of the operational data migration
Avoid:
High-risk implementations from a business perspective
Very complex code that is difficult to manage and is only used "once off"
Issues with reconciling common data across all systems
Inefficient software development processes that increase cost and slow delivery
Inflexible systems and lock-in to specific technologies
Unnecessary duplication of technology spend
Change Drivers for Better Data Migration:
Continuously Changing Business Environment
Today's Systems are More Data-Dependent
Reduced Technical Complexity & Cost
Architectures Moving to Vendor-Based Systems
Data Migration through Information Development
10 Guiding Principles for Better Data Migration
Measure the complexity of your initiative: Understand your technology requirements based on the sophistication of the migration effort, and determine the full set of capabilities that are required.
Don't bite off too much at once: Establish an overall architectural blueprint for a complex programme and migrate system functionality a piece at a time. Complex systems can be progressively decommissioned through co-existent applications.
Investigate & fix DQ problems early: Data quality issues discovered at a late stage often result in programme failures or significant delays. Start with data profiling to identify high-risk areas in the early stages of the project, and get your hands on real data as soon as possible.
Use standards to reduce complexity: Data Migration is simplified through the use of open and common standards related to data, integration and infrastructure.
Build a metadata-driven solution: A comprehensive approach to metadata management is the key to reducing complexity and promoting reusability across infrastructure. A metadata-driven approach makes it easier for users to understand the meaning of data and its lineage across the environment.
Take a diligent approach to testing: Data Migrations are complex and user expectations will be high, since the transition is typically from a working system. A systematic testing process should be followed, from initial functional testing to final user acceptance.
Don't provide an "infrastructure only" release: Unless the delivery times are short or the infrastructure issues very significant, always aim to provide some form of new business capability in a release; business users often get frustrated with technology-only initiatives. New reporting functionality is often a good complement to an infrastructure-focused release.
Make sure the business case is sound: If a system is going to be replaced, make sure there is a good business reason for it. Also make sure the business appreciates that systems costs will likely be higher in the early stages, and that a properly constructed business case includes a replacement plan even for the new system.
Align projects into an overall programme: If conducting multiple initiatives, there will be many commonalities across the different projects. Align these projects into an overall programme.
Use a detailed, method-based approach: The MIKE2.0 Methodology provides an open source approach for Information Development.
Data Migration Guiding Principles
Investigate & Fix DQ Problems Early
Eliminate the 11th-Hour Fire Fight
The latter stages of testing are the most expensive and the worst time to discover problems with the data: it is late in the process and there is little time to analyse and fix the problem. More often than not, this has caused project delays. By starting with data profiling, we identify high-risk areas in the early stages of the project.
All problems need to be worked through in the staging areas prior to further data movement. We therefore make as much of an effort as possible to fix problems while the data is standing still, because it costs time and resources to move data. Different types of problems are addressed in each staging area.
[Diagram: data flows from the Source Systems into a Migration Staging area (1: Table Scan, Attribute Scan, Assessments, Reporting; 4: Transformations), then into a Move Staging area (Light / Medium / Heavy) and on to the Test Environment(s) (6-T) and Production Target System(s) (6-P). Data Integration (3), Data Re-Engineering and supporting Processes (5) operate across the staging areas, underpinned by Metadata Management (2).]
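To make the "profile early" principle concrete, the sketch below shows what an initial attribute scan of a staging table might look like. This is a minimal, hypothetical example using Python and pandas; the file name, the 5% null-rate threshold and the "high risk" flag are illustrative assumptions, not part of MIKE2.0.

```python
# Minimal data-profiling sketch for a migration staging table
# (hypothetical file/column names; the threshold is illustrative only).
import pandas as pd

def profile_staging_table(df: pd.DataFrame, null_threshold: float = 0.05) -> pd.DataFrame:
    """Return a per-attribute profile: null rate, distinct count, sample values."""
    rows = []
    for col in df.columns:
        series = df[col]
        null_rate = series.isna().mean()
        rows.append({
            "attribute": col,
            "null_rate": round(null_rate, 3),
            "distinct_values": series.nunique(dropna=True),
            "sample_values": series.dropna().astype(str).head(3).tolist(),
            "high_risk": null_rate > null_threshold,   # flag for early remediation
        })
    return pd.DataFrame(rows)

if __name__ == "__main__":
    staging = pd.read_csv("customer_staging_extract.csv")   # extract landed in staging
    report = profile_staging_table(staging)
    print(report.sort_values("null_rate", ascending=False).to_string(index=False))
```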
Data Migration Guiding Principles
Use Standards to Reduce Complexity
[Diagram: creating common data standards across current-state and future-state environments]
Current-State Environments: inventory source tables, source attributes, upstream sources and downstream targets; create the as-is domain model and as-is entity model.
Future-State Environments: enterprise application data models and iODS data models.
Common Data Standards (Enterprise Representation): create the domain model, entity model, entity-relationship model and entity-attribute model.
Attribute Mappings: rationalisation and mapping for each application environment (Finance, iODS, DW, Customer).
Rationalise domains and entities across the current-state and future-state environments; rationalise attributes across both; map all application environments to the enterprise standard.
Creating data standards also produces the initial Common Data Standards and the creation of the initial DQ programme, the initial data ownership model and the initial data management governance processes.
The MIKE2.0 Methodology
An Open Source Methodology for Information Development
What is MIKE2.0?
MIKE stands for Method for an Integrated Knowledge Environment
MIKE2.0 is our comprehensive methodology for Enterprise Information Management
MIKE2.0 brings together important concepts around Open Source and Web 2.0
The open source version of MIKE2.0 is available at: http://www.openmethodology.org
Key Constructs within MIKE2.0
SAFE (Strategic Architecture for the Federated Enterprise) is the architecture framework for the MIKE2.0 Methodology
Information Development is the key conceptual construct for MIKE2.0: develop your information just like applications
MIKE2.0 provides a Comprehensive, Modern Approach
Scope covers Enterprise Information Management, but goes into detail in areas that can be used for more tactical projects
An architecturally-driven approach that starts at the strategic conceptual level and goes down to solution architecture
A comprehensive approach to Data Governance, Architecture and strategic Information Management
MIKE2.0 provides a Collaborative, Open Source Methodology for Information Development
Balances adding new content with release stability through a method that is easier to navigate and understand
Allows non-BearingPoint users to contribute
Links into BearingPoint's existing project assets on our internal knowledge management systems
A unique approach: we would like to make this "the standard" in the new area of Information Development
The MIKE2.0 Methodology
MIKE2.0 for Data Migration
The MIKE2.0 Methodology can be applied to solve different types of Data Migration problems:
For simple migration activities, not all activities from the Overall Implementation Guide are required; the complete migration may take only a single release
For complex migration scenarios, most activities will be required and will be implemented over multiple increments. Complex migration scenarios often require very sophisticated architectural capabilities
Most migrations of Enterprise Applications are very complex processes
The following pages go through some of the initial strategy activities in MIKE2.0 that:
Help introduce the overall approach to Data Migration and how it is applied depending on the complexity of the problem
Provide an example of one of the tasks in the initial current-state assessment
Propose some high-level solution architecture options that can be applied to different migration scenarios
Provide an approach for prioritising complex migrations, based on business priorities and the complexity of the implementation
MIKE2.0 Methodology: Phase Overview
The 5 Phases of MIKE2.0
[Diagram: Information Development through the 5 Phases of MIKE2.0. Phase 1 (Business Assessment) and Phase 2 (Technology Assessment) produce a Strategic Programme Blueprint that is done once, together with Roadmap & Foundation Activities. Phases 3, 4 and 5 are continuous implementation phases delivered as increments (Increment 1, 2, 3, then the next increment begins), each cycling through Design, Development, Deploy and Operate, with an improved Governance and Operating Model across all phases.]
MIKE2.0 Methodology: Phase Overview
Typical Activities Conducted as part of the Strategy Phases
Phase 1 Business Assessment and Strategy Definition Blueprint
1.1 Strategic Mobilisation
1.2 Enterprise Information Management Awareness
1.3 Overall Business Strategy for Information Development
1.4 Organisational QuickScan for Information Development
1.5 Future State Vision for Information Management
1.6 Data Governance Sponsorship and Scope
1.7 Initial Data Governance Organisation
1.8 Business Blueprint Completion
1.9 Programme Review
Phase 2 Technology Assessment and Selection Blueprint
2.1 Strategic Requirements for BI Application Development
2.2 Strategic Requirements for Technology Backplane Development
2.3 Strategic Non-Functional Requirements
2.4 Current-State Logical Architecture
2.5 Future-State Logical Architecture and Gap Analysis
2.6 Future-State Physical Architecture and Vendor Selection
2.7 Data Governance Policies
2.8 Data Standards
2.9 Software Development Lifecycle Preparation
2.10 Metadata Driven Architecture
2.11 Technology Blueprint Completion
Activity 1.2 Enterprise Information Management Awareness (tasks, with responsible party and status):
Task 1.2.1 Assess Team's Understanding of Information Management Concepts
Task 1.2.2 Develop and Initiate Information Management Orientation
MIKE2.0 Methodology: Task Overview
Task 1.2.1 Develop and Initiate Information Management Orientation
End-to-End Tool-Based Transformation Capabilities
MIKE2.0 Methodology: Task Overview
Task 1.2.1 Develop and Initiate Information Management Orientation
Introductory Concept: Migration in MIKE2.0 takes place across multiple stages; in the continuous implementation phases (Phases 3, 4 and 5) these stages are repeated. The Transformation process is thought of in four stages: Acquisition, Consolidation, Move and Post-Move. Guidelines for each stage are listed below. Some final activities are often deferred until the Post-Move Stage.
Acquisition Stage: The acquisition stage is focused on the sourcing of data from the producer. The data is placed in a staging area where it is scanned and assessed. Judgements are made on the complexity of data quality issues, and initially identified data quality problems are addressed.
Consolidation Stage: The consolidation stage focuses on attribute rationalisation into an integrated data store that may be required to bring data together from multiple systems. Key transformations occur and further steps are required for re-engineering data. The data and processes are prepared for migration to the Move environment. Considerable collaboration is needed in those areas where decommissioning occurs.
Move Stage: The move stage focuses on moving the data and application capabilities that have been developed to the production environment. The move stage has a staging area that is as close to production as possible. Final steps around data quality improvement are done in this environment.
Post-Move Stage: The post-move stage is focused on the data transformations and quality aspects that are best done after the move to production (but before the system goes live), such as environment-specific data or reference data. Additional process changes or software upgrades may also be required. The skills and toolsets used are the same as in the prior stages. Attention is paid to the ongoing use of the interfaces created during the transition process.
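As a rough illustration of how the four transformation stages repeat for each increment in the continuous implementation phases, the sketch below orchestrates the stage sequence in code. The stage functions are placeholders for illustration only; they are not MIKE2.0 deliverables.

```python
# Hypothetical orchestration of the four transformation stages, repeated once
# per implementation increment (phases 3, 4 and 5 of MIKE2.0).
from typing import Callable, List

def acquire(increment: int) -> None:
    print(f"[{increment}] Acquisition: source data landed in staging, scanned and assessed")

def consolidate(increment: int) -> None:
    print(f"[{increment}] Consolidation: attributes rationalised into the integrated data store")

def move(increment: int) -> None:
    print(f"[{increment}] Move: data and application capabilities promoted towards production")

def post_move(increment: int) -> None:
    print(f"[{increment}] Post-Move: environment-specific and reference data finalised before go-live")

STAGES: List[Callable[[int], None]] = [acquire, consolidate, move, post_move]

def run_increments(increments: int) -> None:
    for n in range(1, increments + 1):      # the same stages repeat for every increment
        for stage in STAGES:
            stage(n)

if __name__ == "__main__":
    run_increments(3)   # e.g. three increments of a complex migration
```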
MIKE2.0 Methodology: Task Overview
Task 1.2.1 Develop and Initiate Information Management Orientation
Introductory Concept: Depending on the level of complexity, different migration orientations are required. At an introductory level, MIKE2.0 classifies orientations as "lite", "medium" and "heavy". Each orientation runs through the same end-to-end, tool-based transformation stages (Acquisition, Consolidation, Move and Post-Move).
Orientation: Migration Lite. A lite migration scenario is straightforward: it typically involves loading data from a single source into a single target. Few changes are required in terms of data quality improvement; mapping is relatively simple, as is the application functionality to be enabled. Data integration may be on the back-end of systems and will likely be a once-off, "big bang" load.
Orientation: Migration Medium. A medium migration scenario may involve loading data from a single source into a single target or into multiple systems. Data quality improvement will be performed through multiple iterations, transformation issues may be significant, and integration into a common data model is typically complex.
Orientation: Migration Heavy. A heavy migration scenario typically involves providing a solution for application co-existence that allows multiple systems to be run in parallel. The integration framework is formulated so the current state and future state can work together. The model for a heavy migration scenario is representative of an organisation in IT Transformation. As heavy migrations are long-running and involve a significant data integration effort, it is useful to build a parallel analytical environment to attain a "vertical" view of information.
Strategy: The migration effort can start at any one of the orientations; an enterprise transformation may have parts of the effort start concurrently at each of the orientations. A migration effort may start at the Lite orientation and decide to move to the next orientation (Medium) on the fly as the results of the data scans are examined. Further, some of the data rationalisation and Data Quality work may be done in the target environment after the Move stage.
MIKE2.0 Methodology: Task Overview
Task 1.2.1 Develop and Initiate Information Management Orientation
Introductory Concept: Different capabilities are required from the architecture depending on
the level of sophistication required. Capabilities are first defined at the strategic component
level in Activity 1.5.
Capability/Skills | Orientation Lite | Orientation Medium | Orientation Heavy
Table and Attribute Assessment | Performed | Performed | Performed
Relationship Assessment | Direct copy of source systems | Key integrity validated | Referential integrity required
Data Replication | None | None | Multiple targets
Data Transfer | To target | To target | Target/downstream
Data Synchronization | None | None | For interfaces
Data Transformation | Modest | Significant, to similar structures | Major activity
Data Mapping | Minimal | SME-supported | Major activity
Data Standardization | None | Key attributes | All attributes
Pattern Analysis and Parsing | None | None | Yes
Record Matching | None | Based on similar IDs | IDs and pattern matching
Record De-Duping | None | None | Yes
Out-of-Box Business Rules | As appropriate | As appropriate | As appropriate
Configure Complex Rules | None | Application | Application/infrastructure
Out-of-the-Box Interfaces | As appropriate | As appropriate | As appropriate
Configure Custom Interfaces | None | Application | Application/infrastructure
Data Governance Process Model | Documented in high-level form | Key or lynchpin processes only | End-to-end models
Database-Independent Functions | As existed in source | Few custom APIs | Infrastructure services
Reporting | Data move metrics only | DQ and DM metrics | Reporting as a service
'Active' Metadata Repository | Specific and 'physical' | Multiple passive dictionaries | Initial implementation
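One way to make the matrix above actionable is to capture it as configuration and check a planned toolset against the orientation chosen for the initiative. The sketch below is hypothetical: the capability identifiers paraphrase rows of the matrix, and the subsets assigned to each orientation are illustrative rather than a complete transcription.

```python
# Hypothetical capability checklist derived from the orientation matrix above.
from typing import Set

REQUIRED_CAPABILITIES = {
    "lite": {"table_and_attribute_assessment", "data_transfer_to_target"},
    "medium": {"table_and_attribute_assessment", "data_transfer_to_target",
               "key_integrity_validation", "data_standardization_key_attributes",
               "record_matching_on_ids"},
    "heavy": {"table_and_attribute_assessment", "data_transfer_to_target",
              "referential_integrity", "data_replication_multiple_targets",
              "data_synchronization_for_interfaces", "pattern_analysis_and_parsing",
              "record_de_duping", "active_metadata_repository"},
}

def missing_capabilities(orientation: str, toolset: Set[str]) -> Set[str]:
    """Return the capabilities the chosen toolset does not yet cover."""
    return REQUIRED_CAPABILITIES[orientation] - toolset

if __name__ == "__main__":
    planned_toolset = {"table_and_attribute_assessment", "data_transfer_to_target",
                       "key_integrity_validation"}
    print(missing_capabilities("medium", planned_toolset))
```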
MIKE2.0 Methodology: Task Overview
Task 1.4.1 Assess Current-State Application Portfolio
Activity 1.4 Organisational QuickScan for Information Development (tasks, with responsible party and status):
Task 1.4.1 Assess Current-State Application Portfolio
Task 1.4.2 Assess Information Maturity
Task 1.4.3 Assess Economic Value of Information
Task 1.4.4 Assess Infrastructure Maturity
Task 1.4.5 Assess Key Current-State Information Processes
Task 1.4.6 Define Current-State Conceptual Architecture
Task 1.4.7 Assess Current-State People Skills
Task 1.4.8 Assess Current-State Organisational Structure
Task 1.4.9 Assemble Findings on People, Organisation and its Capabilities
MIKE2.0 Methodology: Task Overview
Task 1.4.1 Assess Current-State Application Portfolio
The Application Portfolio documents major systems and their functionality:
From an Information Development perspective, the deliverable should only give a quick overview of each system, sufficient to understand major application functionality
The Application Portfolio also documents the system owner and any changes the system is expected to undergo during the period of executing on the Blueprint vision
It is focused at a systems level, as opposed to infrastructure and information
Time for this task may vary greatly depending on the existing artefacts
Ideally, this content is stored in a structured repository as opposed to an unstructured document form (see the sketch after the table below).
System Name | The name of the system
Description | Brief statement that gives an overview of the system
Platform | Technologies the system consists of: application, database, operating system, programming language
Level 1 & 2 Functions | Maps the application to the end-to-end process model for both Level 1 (Functions) and Level 2 (Process). For systems that are repeated across the business, this attribute is repeated for each instance.
Application Complexity | Rating of the complexity of the application, using the following model:
  Low: These systems are relatively simple, use a simple database or a small number of files, and are reasonably well documented. Low-complexity systems may also include "black box" systems where support and documentation are provided by a vendor, the product is stable (with infrequent new releases), and there is little to no customisation. These black-box systems would generally apply more to areas such as production than to customer-facing or financial systems.
  Medium: These systems are generally more substantial in functionality than low-complexity systems, or are reasonably simple in functionality but are poorly documented and/or use a number of related databases or files. Off-the-shelf vendor products may be classified as Medium complexity if they are generally well documented with some customisation. This classification may also represent a grouping of multiple low-complexity applications.
  High: Highly complex systems that use complex data structures and require trained staff to configure and support. This also includes systems that are poorly documented and difficult to maintain whilst containing a significant amount of business functionality. Major off-the-shelf vendor systems that have been heavily customised, have been tailored by the vendor to divisional requirements, or for which upgrades would be difficult would likely be classified as High complexity.
Divisions | Divisions/business clusters supported by the system
Inter-Divisional Complexity | Rating of the inter-divisional complexity due to variations in the instances of systems supporting different divisions, using the following model:
  Low: The system is used in one division, or in exactly the same way in two or more
  Medium: There are multiple instances of the system, but the differences between divisions are largely configuration changes
  High: There are multiple instances of the system and there are significant differences in the code and configuration of these instances
Issues & Limitations | Key problems associated with the system
Application Life Expectancy | May contain the envisaged life expectancy for the system, e.g. whether it will be decommissioned or be part of the strategic architecture. Ratings may also be assigned in terms of application life expectancy, using the following model:
  Null: Unknown
  Low: The system will be replaced in the near term (< 2 years)
  Medium: The system will be replaced in the long term (2-5 years)
  High: Strategic, long-term application (> 5 years)
Comment | Additional comments related to the system, e.g. point of contact, open questions
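The slide recommends keeping the Application Portfolio in a structured repository rather than an unstructured document. A minimal sketch of one structured entry is shown below; the field names follow the table above, but the class itself and the example values are illustrative assumptions.

```python
# Hypothetical structured representation of one Application Portfolio entry,
# following the attributes described in the table above.
from dataclasses import dataclass
from enum import Enum
from typing import List, Optional

class Rating(Enum):
    LOW = "Low"
    MEDIUM = "Medium"
    HIGH = "High"

@dataclass
class PortfolioEntry:
    system_name: str
    description: str
    platform: str                                  # application, database, OS, language
    level_1_2_functions: List[str]                 # mapping to the end-to-end process model
    application_complexity: Rating
    divisions: List[str]
    inter_divisional_complexity: Rating
    issues_and_limitations: List[str]
    application_life_expectancy: Optional[Rating]  # None = unknown
    comment: str = ""

# Illustrative entry only; names and values are invented for the example.
entry = PortfolioEntry(
    system_name="Legacy Billing",
    description="Batch billing engine for retail customers",
    platform="COBOL / DB2 on z/OS",
    level_1_2_functions=["Billing > Invoice Generation"],
    application_complexity=Rating.HIGH,
    divisions=["Retail"],
    inter_divisional_complexity=Rating.LOW,
    issues_and_limitations=["Poorly documented", "Single SME"],
    application_life_expectancy=Rating.LOW,        # to be replaced in < 2 years
)
```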
MIKE2.0 Methodology: Task Overview
Task 1.5.10 High Level Solution Architecture Options
Activity 1.5 Future-State Vision for Information Management (tasks, with responsible party and status):
1.5.1 Introduce Leading Business Practices for Information Management
1.5.2 Define Future-State Business Alternatives
1.5.3 Define Information Management Guiding Principles
1.5.4 Define Technology Architecture Guiding Principles
1.5.5 Define IT Guiding Principles (Technology Backplane Delivery Principles)
1.5.6 Define Future-State Information Process Model
1.5.7 Define Future-State Conceptual Data Model
1.5.8 Define Future-State Conceptual Architecture
1.5.9 Define Source-to-Target Matrix
1.5.10 Define High-Level Recommendations for Solution Architecture
MIKE2.0 Methodology: Task Overview
Task 1.5.10 High Level Solution Architecture Options
A lite migration scenario is straightforward: it typically involves loading data from a single
source into a single target. Few changes are required in terms of data quality improvement;
mapping is relatively simple as is the application functionality to be enabled. Data
integration may be on the back-end of systems and will likely be a once-off, "big bang".
Below is a high level solution option for a lite migration scenario.
[Diagram: lite migration solution option. The Current-State System feeds a transformation with some data quality cleanup, and a once-off migration load populates the Future-State Test System and the Future-State Production System.]
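For a lite scenario of this kind, the migration logic itself can be very small. The sketch below is a hypothetical once-off load using Python and SQLite standing in for the current-state and future-state systems; the table and column names, and the target schema already being in place, are assumptions made for illustration.

```python
# Hypothetical once-off "lite" migration: single source table to single target
# table with a modest transformation and some data quality cleanup.
import sqlite3

def migrate_customers(source_db: str, target_db: str) -> int:
    src = sqlite3.connect(source_db)
    tgt = sqlite3.connect(target_db)        # target table 'customer' assumed to exist
    rows = src.execute("SELECT cust_no, cust_name, phone FROM legacy_customer").fetchall()

    migrated = 0
    for cust_no, cust_name, phone in rows:
        if not cust_name:                       # basic DQ rule: skip records with no name
            continue
        clean_name = cust_name.strip().title()  # modest transformation
        clean_phone = (phone or "").replace(" ", "")
        tgt.execute(
            "INSERT INTO customer (customer_number, customer_name, customer_phone) VALUES (?, ?, ?)",
            (cust_no, clean_name, clean_phone),
        )
        migrated += 1

    tgt.commit()
    src.close()
    tgt.close()
    return migrated

if __name__ == "__main__":
    print(f"Loaded {migrate_customers('current_state.db', 'future_state.db')} customer records")
```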
MIKE2.0 Methodology: Task Overview
Task 1.5.10 High Level Solution Architecture Options
A medium migration scenario may involve loading data from a single source into a single
target or to multiple systems. Data quality improvement will be performed through multiple
iterations, transformation issues may be significant and integration into a common data
model is typically complex.
Data migration may involve multiple iterations through a gradual roll-out of capabilities.
Below is a high level solution option for a medium migration scenario.
[Diagram: medium migration solution option. Current-state data producers feed a Migration Staging area (table scan, attribute scan, assessments, reporting, transformations), supported by profiling, data integration, data re-engineering and metadata management. Data is consolidated into an Integrated Data Store with a common data model and detailed data, applying the '80/20 rule' for data re-engineering, and then loaded to the Test Target (6-T) and Production Target (6-P). An example customer record carries attributes such as customer number, name, address, city, post code, phone and fax.]
MIKE2.0 Methodology: Task Overview
Task 1.5.10 High Level Solution Architecture Options
A heavy migration scenario will require a comprehensive strategy that develops a vision for
people, process, organisation and technology. A heavy Application Co-Existence scenario
shows how multiple systems can be run in parallel so the current-state and future-state can
work together. Below is a high-level solution option for a heavy migration scenario, representative of an organisation in IT Transformation.
[Diagram: heavy migration solution option (application co-existence). Current-state implementations (Systems 1-4, with current system data used natively) run in parallel with new implementations (the new system and its data), fronted by horizontal portals and a vertical portal providing whole-of-product and whole-of-customer views, plus reporting and analysis. Each system is characterised by type and by its function/data/migration approach. A Technology Backplane provides process integration, workflow implementation, business services, data services, interface services, common messaging, a data mastering model, active metadata, portal enablement, an enterprise platform and data mediation.]
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Activity 1.8 Business Blueprint Completion (tasks, with responsible party and status):
Task 1.8.1 Prioritise Requirements and Identify Immediate Work Opportunities
Task 1.8.2 Define High-Level Programme Plan
Task 1.8.3 Develop High-Level Business Case
Task 1.8.4 Assemble Key Messages to Complete Business Blueprint
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
For Data Migration initiatives that involve replacement of a number of systems, a key part of
prioritisation involves balancing the desire for new business capabilities with the complexity
of their implementation.
High Level Project Estimating Factors Include:
The complexity of the current-state environment
The number of critical business functions to be enabled
The level of technology sophistication that is required
The number of systems to be migrated
The amount of data within these systems to be migrated
The level of documentation on each system
The availability of Subject Matter Experts
The complexity of system interfaces
The quality of the data within each system
A key aspect of the MIKE2.0 approach is determining these Estimating Factors.
The Estimating Model available as part of MIKE2.0 is described on the following pages.
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
There are many factors that will be used to estimate the time and resources required for delivering a Data Migration project. The
model below can be used to make a quantitative estimate on the complexity of the project and to weigh business priorities. If
multiple migrations are to take place across a large transformation programme, this model can be used to help prioritize the
sequencing of the overall implementation.
These questions should be asked relative to an application or application cluster by senior staff. A large-scale transformation
programme may have multiple applications or application clusters. A starter set of sample questions is listed below.
Criteria for Assessing Difficulty:
Number and size of databases in the application
Number of tables per database
Total number of attributes
Number of attributes that have had multiple definitions over time
Number of attributes in terms of synonyms and antonyms
Number of DB-dependent processes
Number of one-time interfaces
Number of ongoing interfaces
Number of data quality problems and issues to fix
Knowledge/documentation of data quality issues
Ease of de-duping similar entities in the same DB
Ease of matching same-entity records across multiple DBs
Completeness of the functional documentation
Availability of Subject Matter Experts (SMEs)
Maturity of the enterprise in managing their data
Alignment with Business Enablers:
Degree to which system functions align with base capabilities
Degree to which system functions align with enhanced capabilities focused on the new business model
Degree to which the system addresses high-priority customer segment growth
Degree to which the system addresses high-priority customer segment retention
Degree to which the system addresses high-priority areas of product growth
Degree to which the system addresses high-priority areas of product stabilisation
Degree to which the system is cost-effective (i.e., cost takeout)
Degree to which the system is flexible to adding capabilities
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Difficulty Criteria (each rated on a five-point scale: 1 Small/Low, 2, 3 Medium/Average, 4, 5 Large/High):
Number and size of databases in the application
Number of tables per database
Total number of attributes
% of attributes that have had multiple definitions over time
% of attributes in terms of synonyms and antonyms
Number of DB-dependent processes
Number of one-time interfaces
Number of ongoing interfaces
Number of data quality problems and issues to fix
Knowledge/documentation of data quality issues
Ease of de-duping similar entities in the same DB
Ease of matching same-entity records across multiple DBs
Completeness of the functional documentation
Availability of Subject Matter Experts (SMEs)
Maturity of the enterprise in managing their data
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Alignment with Key Business Enablers
Each enabler is rated on a five-point scale: 1 Low, 2, 3 Fair, 4, 5 High.
Degree to which system functions align with base capabilities
Degree to which system functions align with enhanced capabilities focused on the new business model
Degree to which the system addresses high-priority customer segment growth
Degree to which the system addresses high-priority customer segment retention
Degree to which the system addresses high-priority areas of product growth
Degree to which the system addresses high-priority areas of product stabilisation
Degree to which the system is cost-effective (i.e., cost takeout)
Degree to which the system is flexible to adding capabilities
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Scoring Formulas
Business Alignment equals:
(Degree to which system functions align with base capabilities x 1)
+ (Degree to which system functions align with enhanced capabilities focused on the new business model x 1)
+ (Degree to which the system addresses high-priority customer segment growth x 1)
+ (Degree to which the system addresses high-priority customer segment retention x 1)
+ (Degree to which the system addresses high-priority areas of product growth x 1)
+ (Degree to which the system addresses high-priority areas of product stabilisation x 1)
+ (Degree to which the system is cost-effective, i.e. cost takeout x 1)
+ (Degree to which the system is flexible to adding capabilities x 1)
Difficulty Index equals:
(Number and size of databases in the application x 1)
+ (Number of tables per database x 2)
+ (Total number of attributes x 1)
+ (# of attributes that have had multiple definitions over time x 2)
+ (# of attributes in terms of synonyms and antonyms x 1.5)
+ (Number of DB-dependent processes x 3)
+ (Number of one-time interfaces x 2)
+ (Number of ongoing interfaces x 2)
+ (Number of data quality problems and issues to fix x 1)
minus the mitigating factors:
(Knowledge/documentation of data quality issues x 1)
+ (Ease of de-duping similar entities in the same DB x 1)
+ (Ease of matching same-entity records across multiple DBs x 3)
+ (Completeness of the functional documentation x 2)
+ (Availability of Subject Matter Experts x 1)
+ (Maturity of the enterprise in managing their data x 3)
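A small sketch of how these scoring formulas might be computed is shown below. The weights follow the slide; treating the factors listed after the "Minus" marker as reducing the Difficulty Index is an interpretation of the slide layout, and the example ratings are illustrative only.

```python
# Hypothetical implementation of the Business Alignment and Difficulty Index
# scoring formulas. Weights follow the slide; treating the factors after the
# "Minus" marker as difficulty-reducing is an interpretation of the layout.
ADDITIVE_WEIGHTS = {
    "databases_in_application": 1.0,
    "tables_per_database": 2.0,
    "total_attributes": 1.0,
    "attrs_with_multiple_definitions": 2.0,
    "attrs_synonyms_antonyms": 1.5,
    "db_dependent_processes": 3.0,
    "one_time_interfaces": 2.0,
    "ongoing_interfaces": 2.0,
    "data_quality_problems": 1.0,
}
MITIGATING_WEIGHTS = {
    "dq_issue_documentation": 1.0,
    "ease_of_de_duping": 1.0,
    "ease_of_cross_db_matching": 3.0,
    "functional_documentation": 2.0,
    "sme_availability": 1.0,
    "data_management_maturity": 3.0,
}

def difficulty_index(ratings: dict) -> float:
    """Weighted additive factors minus weighted mitigating factors (1-5 ratings)."""
    added = sum(ratings[k] * w for k, w in ADDITIVE_WEIGHTS.items())
    mitigated = sum(ratings[k] * w for k, w in MITIGATING_WEIGHTS.items())
    return added - mitigated

def business_alignment(ratings: list) -> float:
    """Sum of the eight business-enabler ratings (each weighted equally)."""
    return float(sum(ratings))

if __name__ == "__main__":
    example_ratings = {k: 3 for k in {**ADDITIVE_WEIGHTS, **MITIGATING_WEIGHTS}}
    print(difficulty_index(example_ratings), business_alignment([3, 4, 2, 5, 3, 3, 4, 2]))
```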
MIKE2.0 Methodology: Task Overview
Task 1.8.1 Prioritise Requirements and Identify Immediate Opps
Metrics on Business Alignment and Difficulty help to formulate priorities for the overall implementation of a large-scale migration programme, starting with the areas that are most important to the business and of the lowest complexity. Whilst a simple model, it clearly illustrates to the business and technical communities how project priorities were derived in an objective fashion.
[Quadrant chart: applications are plotted by Alignment with Key Business Enablers (Low to High, vertical axis) against Transition Difficulty Score (Hard to Easy, horizontal axis), giving four prioritisation quadrants.]
Do Now ('early wins'), for high alignment and easy transition: Transition to the target and use the Transformation Framework for forward and backward continuity. All process functions are migrated at once. Should focus on the most sensitive and critical aspects of the new business.
Analyse and Schedule, for high alignment and hard transition: Migration to the Transformation Framework followed by migration to the target system(s). Migration of process functions is iterative; many existing functions may remain in the 'as is' environment for an extended period.
Targets of Opportunity, for low alignment and easy transition: Transition to the target systems as part of migration packages associated with higher-priority systems. Traditional data conversion techniques are used; a good sense of integration is needed.
Phase Out, for low alignment and hard transition: Functions are picked up by the target system(s) as needed; other functions are discontinued. Traditional data conversion techniques are used. Change management represents a key set of activities.
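The quadrants above can be expressed as a simple classification rule over the two scores. The sketch below is illustrative only; the cut-off values are assumptions that would normally be calibrated against the scored application portfolio.

```python
# Hypothetical mapping of (business alignment, transition difficulty) scores
# onto the four prioritisation quadrants. Cut-offs are illustrative only.
def prioritise(alignment: float, difficulty: float,
               alignment_cutoff: float = 25.0, difficulty_cutoff: float = 40.0) -> str:
    high_alignment = alignment >= alignment_cutoff
    easy = difficulty <= difficulty_cutoff
    if high_alignment and easy:
        return "Do Now ('early wins')"
    if high_alignment and not easy:
        return "Analyse and Schedule"
    if not high_alignment and easy:
        return "Targets of Opportunity"
    return "Phase Out"

if __name__ == "__main__":
    print(prioritise(alignment=30, difficulty=20))   # -> Do Now ('early wins')
```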
Data Migration through Information Development
Lessons Learned
Prioritise Planning
Define business priorities and start with quick wins
Don't do everything at once: deliver complex projects through an incremental programme
"Big bang" if you can, but know that often you can't
Focus on the Areas of High Complexity
Get the Technology Backplane capabilities out in front
Don't wait until the 11th hour to deal with Data Quality issues: fix them early
Follow the 80/20 rule for fixing data: do this iteratively through multiple cycles
Understand the sophistication required for Application Co-Existence, and that in the short term your systems will get more complex
Keep the Business Engaged
Communicate continuously on the planned approach defined in the strategy
The overall Blueprint is the communications document for the life of the programme
Try not to be completely infrastructure-focused for long-running releases: always deliver some form of new business functionality
Align the migration programme with analytical initiatives to give business users more access to data
