Microsoft Application Lifecycle Management
a 30-minute overview




Steve Lange
sr. developer technology specialist | microsoft – denver, co
stevenl@microsoft.com | slange.me
Agenda
• What is Application Lifecycle
  Management?
• Microsoft’s ALM Platform
• Focus Areas
  – Implement process with minimal overhead
  – Plan & manage projects
  – Align roles across the lifecycle
  – Report across project boundaries
• Summary
WHAT IS ALM?
Application Lifecycle Management (ALM) is the marriage of governance, development, and operations.
Working together from idea to retirement
Microsoft ALM Platform Overview
But first, what about you?
(Diagram: Version Control, Task Management, Test Case Management, Requirements Management, Bug Tracking, Build Automation, Automated Testing, Development Tool)
Introducing
MICROSOFT’S ALM PLATFORM
Microsoft ALM tools work to integrate vertically, not just horizontally.
(Diagram: Governance, Development, Operations)
Working Vertically

(Diagram: User Stories link down to Tasks and Tests; Tasks lead to Check-ins and a Build; Tests lead to Bugs)
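The vertical links above (walked through step by step in speaker note #11 at the end of this deck) amount to a small traceability graph. A minimal, illustrative Python sketch of how a user story, task, test, and bug end up linked; this is not the actual TFS object model or API, and the work item titles are made up:

```python
# Illustrative sketch (not the TFS API): modeling the "working vertically"
# links between a user story, its tasks, tests, and bugs.
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkItem:
    id: int
    kind: str            # "User Story", "Task", "Test Case", "Bug"
    title: str
    links: List["WorkItem"] = field(default_factory=list)

    def link(self, other: "WorkItem") -> None:
        # Symmetric link, so traceability can be walked in either direction.
        self.links.append(other)
        other.links.append(self)

# The analyst adds a user story; the developer adds a task; the tester adds a test.
story = WorkItem(1, "User Story", "Customer can reset password")
task = WorkItem(2, "Task", "Implement reset-password page")
test = WorkItem(3, "Test Case", "Verify reset e-mail is sent")
bug = WorkItem(4, "Bug", "Reset link expires immediately")

story.link(task)   # task implements the story
story.link(test)   # test verifies the story
test.link(bug)     # bug found while running the test
bug.link(story)    # ...and is tied back to the story it affects

def trace(item: WorkItem) -> None:
    """Print each work item directly linked to this one."""
    for other in item.links:
        print(f"{item.kind} {item.id} <-> {other.kind} {other.id}: {other.title}")

trace(story)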
Microsoft ALM Platform Overview
ALM Platform: Focus Areas

• Process with minimal overhead
• Plan & manage projects
• Align roles across the lifecycle
• Report across project boundaries
ALM Platform: Focus Areas
PROCESS WITH MINIMAL OVERHEAD
Process Guidance & Automation
• Baked into Team Foundation Server
• Provides contextual guidance (documentation)
• Delivered via Process Templates
• Use templates out of the box, or create your own
• Completely customizable
Enable predictability and repeatability across projects
Process Templates

MSF for Agile Software Development
• Product planning based on user stories and story points
• Team progresses most work by moving from active to resolved to closed
• Team is not usually required to support rigorous audits

MSF for CMMI
• Product planning based on requirements and CRs (change requests)
• Most work moves from proposed to active to resolved to closed
• Team is required to maintain rigorous audit trails
• Team is working toward CMMI appraisal

Visual Studio Scrum
• Development lifecycle follows the Scrum framework (based on Agile principles)

MSF = Microsoft Solutions Framework
CMMI = Capability Maturity Model Integration
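As a concrete reading of the state progressions named above, here is a minimal Python sketch. The template names and states come from the slide; the real state machines live in each template's work item type definitions inside TFS, so treat this as an illustration only:

```python
# Illustrative sketch only -- encodes the default forward progressions named
# on the slide, not the full transition rules of the actual process templates.
from typing import Optional

DEFAULT_FLOW = {
    "MSF for Agile": ["Active", "Resolved", "Closed"],
    "MSF for CMMI":  ["Proposed", "Active", "Resolved", "Closed"],
}

def next_state(template: str, current: str) -> Optional[str]:
    """Return the next state in the template's default flow, or None at the end."""
    flow = DEFAULT_FLOW[template]
    i = flow.index(current)          # raises ValueError for an unknown state
    return flow[i + 1] if i + 1 < len(flow) else None

print(next_state("MSF for CMMI", "Proposed"))   # Active
print(next_state("MSF for Agile", "Resolved"))  # Closed
print(next_state("MSF for Agile", "Closed"))    # None
```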
ALM Platform: Focus Areas
PLANNING & MANAGING PROJECTS
Planning & Managing Projects
    Organization
    • Classification
    • Assignment
    • Customization

    Visibility
    • Relate artifacts, actions, intent
    • Parent/child relationships
    • Rollup & drilldown

    User Comfort
    • Use tools familiar to each role
    • Excel, Project, IDE, web, and others
    • Eases adoption
Organization

Classification of work artifacts
   – Where
   – When
Integration organizes efforts
   – Build
   – Collaboration
Customize to meet team needs
   – Not the other way around
Visibility

• Top-to-bottom traceability
• Agile planning
• Proactive notifications
• Related information = project insight
User Comfort

• Team members work in the tools that suit them
Role-specific tools are simply front-ends for a much larger collaborative environment.
ALM Platform: Focus Areas
REPORT ACROSS PROJECT BOUNDARIES
ALM data is as complex as you think
Reporting: Not just for managers
(Reports delivered via Reporting Services, Excel, and dashboards: Bug Status, Bug Trends, Reactivations, Build Quality Indicators, Build Success Over Time, Build Summary, Burndown & Burn Rate, Release Burndown, Sprint Burndown, Remaining Work, Requirements Progress, Status on All Iterations, Stories Overview, Test Case Readiness, Unplanned Work)
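As a rough illustration of the kind of roll-up these reports perform, here is a small Python sketch that aggregates an assumed in-memory list of bugs the way a Bug Status report would. The real reports query the TFS data warehouse; the bug records below are made up:

```python
# Sketch only: cumulative counts by state, priority, and owner,
# mimicking the roll-ups a Bug Status report presents.
from collections import Counter

bugs = [
    {"state": "Active",   "priority": 1, "severity": "High",   "assigned_to": "dev1"},
    {"state": "Active",   "priority": 2, "severity": "Medium", "assigned_to": "dev2"},
    {"state": "Resolved", "priority": 1, "severity": "High",   "assigned_to": "dev1"},
    {"state": "Closed",   "priority": 3, "severity": "Low",    "assigned_to": "dev3"},
]

by_state    = Counter(b["state"] for b in bugs)
by_priority = Counter(b["priority"] for b in bugs)
by_owner    = Counter(b["assigned_to"] for b in bugs if b["state"] != "Closed")

print("Bugs by state:   ", dict(by_state))
print("Bugs by priority:", dict(by_priority))
print("Open bugs/owner: ", dict(by_owner))
```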
Application Lifecycle Management

          SUMMARY
Microsoft’s ALM Solution

Create happy teams, enabling success from idea to retirement

• Process with minimal overhead
• Plan & manage projects
• Align roles across the lifecycle
• Report across project boundaries
Questions?
Thank You

Steve Lange
sr. developer technology specialist | microsoft
stevenl@microsoft.com | 303-918-0500
slange.me | @stevelange

Upcoming Office Hours:
     – 11/18 @ 9:30 AM (Pacific)
     – 12/2 @ 9:30 AM (Pacific)
     – 12/16 @ 9:30 AM (Pacific)
     – 12/30 @ 9:30 AM (Pacific)
     – 1/13 (2012) @ 9:30 AM (Pacific)
(see blog for details)
APPENDIX
Links & Resources
• Videos
   – Video: Proactive Project Management with Visual Studio 2010
   – Improving Developer-Tester Collaboration with Microsoft Visual Studio 2010
• Whitepapers
   – What is ALM?
   – IDC MarketScape Excerpt: IT Project and Portfolio Management 2010 Vendor
     Analysis
   – The Forrester Wave: Agile Development Management Tools, Q2 2010
   – Attaining Optimal Business Value from Agile Software Development
   – White Paper: Reconciling the Agile Team with Enterprise Project Management
   – Magic Quadrant for Integrated Software Quality Suites
• Microsoft Solutions Framework
   –   MSF for Agile homepage
   –   MSF for CMMI homepage
   –   Visual Studio Scrum 1.0 homepage
   –   MSF for Agile+SDL v5.0
Links & Resources
• Articles & Product Pages
  – Microsoft Application Lifecycle Management
  – Effective Team Development
  – Heterogeneous Development
  – Product homepages
    • Visual Studio
    • Team Foundation Server
    • Test Professional
Case Studies
•   Flextronics - Visual Studio 2010 helps Flextronics’ developers and QA teams work
    together
•   Wintellect - Wintellect uses the testing tools in Visual Studio to speed up debugging
•   AccessIT - Visual Studio 2010 helps Access IT give its customers better team
    collaboration
•   Sogeti - Sogeti better understands legacy systems with Visual Studio 2010
•   EPiServer - EPiServer tests software more effectively and efficiently with Visual
    Studio 2010
•   Equiniti - Share Registrar Cuts Testing Time and Improves Application Lifecycle
    Management
•   ICONICS - ICONICS is cutting cost and increasing productivity with Visual Studio
    2010
•   Penn National Insurance - Penn National Insurance boosted productivity and reduced
    testing time using Visual Studio 2010
•   Readify - Using Visual Studio 2010, Readify saves time and money with virtual testing
•   Länsförsäkringar AB - Swedish insurance company expects to cut software
    development time and costs by 20 percent
•   3M - Eliminating “no repro” bugs helps 3M accelerate delivery of products into the
    marketplace
Microsoft’s ALM Solution

(Diagram: the end-to-end tooling, including planning artifacts such as the project plan, requirements, tasks, test cases, and rich bugs; architecture and design tools such as dependency graphs and architecture validation; developer tools such as IntelliTrace, unit testing, code analysis, code metrics, code profiling, test impact analysis, and database compare/deploy; version control and build via TFS source control with gated check-in, branching, merging, shelving, and TFS Build; testing via Test Manager and Lab Management with automated, web & load, and database tests plus generated test data; environments flowing from Dev to Test, Stage, Pre-Prod, and Prod; and feedback through TFS Web Access in the form of feature requests, help tickets, and production issues.)
Bug Status Report
•   Is the team fixing bugs
    quickly enough to finish on
    time?
•   Is the team fixing high priority
    bugs first?
•   What is the distribution of
    bugs by priority and severity?
•   How many bugs are
    assigned to each team
    member?
Bug Trends Report
•   How many bugs is the team
    reporting, resolving, and
    closing per day?
•   What is the overall trend at
    which the team is processing
    bugs?
•   Are bug activation and
    resolution rates declining
    toward the end of the
    iteration as expected?
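Speaker note #43 (in the Editor's Notes at the end of this deck) describes the trend line as a rolling average over the seven days before each date. A small Python sketch of that calculation, using made-up daily counts:

```python
# Sketch of the 7-day rolling average the Bug Trends report plots
# (per the speaker notes: average of the seven days before each date).
daily_resolved = [3, 5, 2, 6, 4, 7, 5, 8, 6, 4, 9, 3]  # bugs resolved per day (made up)

def rolling_average(counts, window=7):
    """Average of the `window` days preceding each day (None until enough history)."""
    return [
        None if i < window else sum(counts[i - window:i]) / window
        for i in range(len(counts))
    ]

print(rolling_average(daily_resolved))
```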
Reactivations Report
•   How many bugs are being
    reactivated?
•   How many user stories are
    being reactivated?
•   Is the team resolving and
    closing reactivated bugs at
    an acceptable rate?
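Per speaker note #44, the reactivation rate (also called the fault feedback ratio) is reactivated items as a share of resolved or closed items, with less than 5% given as an example of an acceptable level. A tiny illustrative calculation with made-up counts:

```python
# Sketch: reactivation rate (fault feedback ratio) as described in the speaker notes.
resolved_or_closed = 120   # made-up count of bugs resolved or closed
reactivated = 9            # made-up count of those that were reopened

rate = reactivated / resolved_or_closed
print(f"Reactivation rate: {rate:.1%}")
if rate > 0.05:
    print("Above the ~5% example threshold -- look for systemic issues.")
```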
Build Quality Indicators Report
•   What is the quality of the
    software?
•   How often are tests passing,
    and how much of the code is
    being tested?
•   Based on the code and test
    metrics, is the team likely to
    meet target goals?
Build Success Over Time Report
•   What parts of the project have produced software that is ready to be tested?
•   What parts of the project are having trouble with regressions or bad checkins?
•   How well is the team testing the code?
Build Summary Report
•   What is the status of all builds over time?
•   Which builds succeeded?
•   Which builds have a significant number of changes to the code?
•   How much of the code was executed by the tests?
•   Which builds are ready to install?
Burndown and Burn Rate Report
•   Is the team likely to finish the iteration on time?
•   Will the team complete the
    required work, based on the
    current burn rate?
•   How much work does each
    team member have?
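Speaker note #48 describes burn rate as the completed and required rate of work over the reporting period. A minimal sketch of one way to compute those two rates, with illustrative numbers:

```python
# Sketch of the burn-rate arithmetic behind this report (numbers are made up):
# actual burn rate = work completed / days elapsed,
# required burn rate = work remaining / days left in the iteration.
completed_hours = 180
remaining_hours = 120
days_elapsed = 6
days_left = 4

actual_rate = completed_hours / days_elapsed    # hours per day so far
required_rate = remaining_hours / days_left     # hours per day needed to finish on time
print(f"Actual: {actual_rate:.0f} h/day, required: {required_rate:.0f} h/day")
print("On track" if actual_rate >= required_rate else "At risk")
```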
Remaining Work Report
•   What is the cumulative flow of
    work?
•   Is the team likely to finish the
    iteration on time?
•   Is the amount of work or
    number of work items in the
    iteration growing?
•   Does the team have too much
    work in progress?
•   How is the team doing in
    estimating work for the
    iteration?




(Report views: Hours of Work | # of Work Items)
Status on All Iterations Report
•   Is steady progress being made across all iterations?
•   How many stories did the team complete for each iteration?
•   How many hours did the team work for each iteration?
•   For each iteration, how many bugs did the team find, resolve, or close?
Stories Overview Report (Agile)
•   How much work does each story require?
•   How much work has the team completed for each story?
•   Are the tests for each story passing?
•   How many active bugs does each story have?
Stories Progress Report (Agile)
•   How much progress has the team made toward completing the work for each story?
•   How much work must the team still perform to implement each user story?
•   How much work did the team perform in the last calendar period?
Requirements Progress Report (CMMI)
•   How much progress has the team made toward completing the work for each
    requirement?
•   How much work must the team still perform to implement each requirement?
•   How much work did the team perform in the last calendar period?
Requirements Overview Report (CMMI)
•   How much work does each Requirement require?
•   How much work has the team completed for each Requirement?
•   Are the tests for each Requirement passing?
•   How many active bugs does each Requirement have?
Release Burndown (Scrum)
•   How much work remains in the release?
•   How quickly is your team working through the product backlog?
Sprint Burndown (Scrum)
•   How much work remains in the sprint?
•   Is your team on track to finish all work for the sprint?
•   When will your team finish all work for the sprint?
•   How much work for the sprint is in progress?
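Speaker note #56 defines the Ideal Trend line as the remaining effort burned down at a constant rate to zero by the end of the sprint. A short Python sketch of that line, with illustrative numbers:

```python
# Sketch of the Ideal Trend line described in the speaker notes:
# remaining effort falls at a constant rate from the starting total to zero.
sprint_days = 10          # made-up sprint length in days
starting_hours = 200      # made-up remaining effort at sprint start

ideal_trend = [starting_hours * (1 - day / sprint_days) for day in range(sprint_days + 1)]
print([round(h) for h in ideal_trend])   # 200, 180, 160, ..., 0
```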
Unplanned Work Report
•   How much work was
    added after the
    iteration started?
•   Is too much work
    being added during
    the iteration?
Test Case Readiness Report
•   When will all the test cases be ready to run?
•   Will all the test cases be ready to run by the end of the iteration?
•   How many test cases must the team still write and review?
•   How many test cases are ready to be run?
Test Plan Progress Report
•   How much testing has the
    team completed?
•   Is the team likely to finish
    the testing on time?
•   How many tests are left to
    be run?
•   How many tests are
    passing?
•   How many tests are failing?
•   How many tests are
    blocked?


Editor's Notes

  • #7: From David Chappell’s “What is ALM?” http://go.microsoft.com/?linkid=9743693
  • #11: The business analyst starts by adding user stories. <CLICK> Once the user stories have been entered, the developer creates tasks for implementing each user story. <CLICK> Meanwhile, the tester authors tests against those user stories. <CLICK> Now the developer writes code that implements a task and checks it into TFS. <CLICK> The check-ins are materialized into a build. <CLICK> The tester examines the build, notes the delivered changes, and deploys the build to the test environment (not shown). <CLICK> The tester begins testing the build by choosing a test and running it using Microsoft Test Manager. <CLICK> The tester identifies a bug and files it with one click – the bug is automatically associated with the test and the user story. <CLICK> The cycle can continue as the developer fixes the bug, associates a check-in, and then creates a build which the tester then pulls into test (and so on). <CLICK>
  • #12: The Visual Studio 2010 family is made up of a central team server and a small selection of client-side tools. The team server—Team Foundation Server 2010—is the backbone of your application lifecycle management… <CLICK> …providing capabilities for source control management (SCM), build automation, work item tracking, and reporting. In this release we’ve expanded the capabilities of Team Foundation Server by adding a true test case management system… <CLICK> …and extended it with Lab Management 2010—a set of capabilities designed to better integrate both physical and virtual labs into the development process. We’ve heard your feedback as well, and we have made it easier to set up and maintain Team Foundation Server—in fact it can be installed, configured, and ready to use in as little as 20 minutes. <CLICK> On the client side we have reduced the complexity of our IDE offerings. For developers, you can choose between Visual Studio 2010 Professional, Premium, or Ultimate, with each subsequent product containing all of the features of its predecessor. For testers and business analysts we are introducing Test Professional—a new integrated test environment designed with manual testers in mind. <CLICK> For those people who participate in the development effort but for whom Visual Studio—the IDE—is not appropriate, including Java developers, project managers, and stakeholders, the Team Foundation Server extensibility model enables us to provide alternative interfaces. These include both Team Explorer—a standalone tool built with the Visual Studio shell—and Team Web Access. These tools enable anyone to work directly with Team Foundation Server. In October we announced the acquisition of Teamprise, a technology similar to Team Explorer for the Eclipse IDE on Windows, Linux, Mac OS X, and other Unix-based operating systems. That technology has been incorporated into the Visual Studio 2010 product line, and we will be announcing how we are productizing it very soon. The most important thing to know is that we will be releasing a Teamprise-based product, and it will also be included as an MSDN benefit for Visual Studio 2010 Ultimate customers. <CLICK> Of course we are continuing our cross-product integration capabilities with Microsoft Office® and Microsoft Expression. We have improved integration between Team Foundation Server and SharePoint Server with new SharePoint dashboards, and we have a new set of capabilities that make SharePoint development much easier than in the past. Across the board, the features and capabilities we built into Visual Studio 2010 are a result of the great feedback we have gotten from our customers. This release continues our commitment to enabling you, our customers, to build the right software, in the right way, to ensure success for your business. Throughout the rest of the day you will learn about a variety of capabilities in Visual Studio 2010 that make the process of developing software, by teams of any size, easier—whether it is by helping you streamline your development process, find and fix bugs quicker, more easily understand existing systems, or automate repetitive processes.
  • #42: After the team has started to find and fix bugs, you can track the team's progress toward resolving and closing bugs by viewing the Bug Status report. This report shows the cumulative bug count based on the bug state, priority, and severity.
  • #43: You can use the Bug Trends report to help track the rate at which your team is discovering and resolving bugs. This report shows a rolling or moving average of bugs being reported, resolved, and closed over time. When you manage a large team or a large number of bugs, you can monitor the Bug Trends report weekly to gain insight into how well the team is finding, resolving, and closing bugs. The Bug Trends report calculates a rolling average of the number of bugs that the team has opened, resolved, and closed based on the filters that you specify. The rolling average is based on the seven days before the date for which it is calculated: the report sums the number of bugs in each state for each of the seven days before the date and divides the result by seven.
  • #44: As the team resolves and closes bugs, you can use the Reactivations report to determine how effectively the team is fixing bugs. Reactivations generally refer to bugs that have been resolved or closed prematurely and then reopened. The reactivation rate is also referred to as the fault feedback ratio. You can use the Reactivations report to show either bugs or user stories that have been reactivated. As a product owner, you might want to discuss acceptable rates of reactivation with the team. A low rate of reactivations (for example, less than 5%) might be acceptable depending on your team's goals. However, a high or increasing rate of reactivations indicates that the team might need to diagnose and fix systemic issues. The Reactivations report shows an area graph of the number of bugs or stories that are in a resolved state or that have been reactivated from the closed state.
  • #45: The Build Quality Indicators report shows test coverage, code churn, and bug counts for a specified build definition. You can use this report to help determine how close portions of the code are to release quality. Ideally, test rates, bugs, and code churn would all produce the same picture, but they often do not. When you find a discrepancy, you can use the Build Quality Indicators report to examine the details of a specific build and data series. Because this report combines test results, code coverage from testing, code churn, and bugs, you can view many perspectives at the same time.
  • #46: The Build Success Over Time report provides a pictorial version of the Build Summary report. The Build Success Over Time report displays the status of the last build for each build category run for each day. You can use this report to help track the quality of the code that the team is checking in. In addition, for any day on which a build ran, you can view the Build Summary for that day.
  • #47: The Build Summary lists builds and provides information about test results, test coverage, code churn, and quality notes for each build. The data that appears in the Build Summary report is derived from the data warehouse. The report presents a visual display of the percentage of tests that are passing, code that is being tested, and changes in code across several builds. You can review the results for both manual and automatic builds, in addition to the most recent builds and continuous or frequent builds. The report lists the most recent builds first and contains build results that were captured during the specified time interval for all builds that were run, subject to the filters that you specified for the report. At a glance, you can determine the success or failure of several build definitions for the time period under review.
  • #48: After a team has worked on one or more iterations, also known as sprints, you can determine the rate of team progress by reviewing the Burndown and Burn Rate report. Burndown shows the trend of completed and remaining work over a specified time period. Burn rate provides calculations of the completed and required rate of work based on the specified time period. In addition, a chart shows the amount of completed and remaining work that is assigned to team members. You can view the Burndown and Burn Rate report based on hours worked or number of work items that have been resolved and closed.
  • #49: After the team has estimated its tasks and begun work, you can use the Remaining Work report to track the team's progress and identify any problems in the flow of work. The Remaining Work report summarizes the data that was captured during the specified time interval for each task, user story, or bug based on the filter criteria that were specified for the report. The data is derived from the data warehouse. You can view this report in either the Hours of Work view or the Number of Work Items view. The first view displays the total number of hours of work for the specified time period and the team's progress toward completing that work. The second view displays the number of work items for the specified time period and the number of work items in each state. Each view provides an area graph that charts the progress of completed work against the total estimated work for the specified time duration.
  • #50: After work has progressed on several iterations, also known as sprints, you can view the team progress by viewing the Status on All Iterations report. This report helps you track the team's performance over successive iterations. For each iteration that is defined for the product areas that you specify, this report displays the following information: Stories Closed: the number of user stories that have been closed; these values are derived from the current values specified for the iteration and the state of each user story. Progress (Hours): a two-bar numeric and visual representation that represents the values for Original Estimate (grey), Completed (green), and Remaining (light blue) based on the rollup of hours that are defined for all tasks; these values are derived from the current values that are specified for the iteration and the hours for each task. Bugs: a numeric value and visual representation for all bugs, grouped by their current states of Active (blue), Resolved (gold), and Closed (green); these values are derived from the current values that are specified for the iteration and the state of each bug.
  • #51: The Stories Overview report lists all user stories, filtered by area and iteration and in order of importance. Work Progress – % Hours Completed: a numeric value and visual representation that shows the percentage of completed work based on the rollup of baseline and completed hours for all tasks that are linked to the user story or its child stories. Hours Remaining: a numeric value for the rollup of all remaining hours for all tasks that are linked to the user story or its child stories. Test Status – Test Points: a numeric value that represents the number of pairings of test cases with test configurations in a specific test suite. For more information about test points, see Reporting on Testing Progress for Test Plans. Test Results: a numeric value and visual representation that shows the percentage of test cases, grouped according to the status of their most recent test run, where the options are Passed (green), Failed (red), or Not Run (black). Bugs: a numeric value and visual representation that shows the number of bugs that are linked to the test case or user story, where the options are Active (blue) and Resolved (gold). If a user story is linked to one or more child stories, the values represent a rollup of all bugs for the user story and its child stories. User stories that appear in the report: the Stories Overview report lists and highlights user stories according to the following criteria: stories appear in order of their importance, based on their assigned ranking; stories appear in bold type when they are in the active or resolved state; stories appear in normal type when they are in the closed state; stories appear in gray type when their assigned iteration or area is outside the filtered set, but they have tasks or child stories that are within the filtered set of iterations or product areas.
  • #52: The Stories Progress report lists all user stories, filtered by product area and iteration, in order of importance. This report displays the following information for each user story that appears in the report: Progress (% Completed): a numeric value that represents the percentage of completed work based on the rollup of baseline and completed hours for all tasks that are linked to the user story or its child stories. Hours Completed: a visual representation of the completed hours, displayed as a dark green bar. Recently Completed: a visual representation of those hours completed within the time interval specified for Recent (Calendar) Days, displayed as a light green bar. Hours Remaining: the rollup of all remaining hours for all tasks that are linked to the user story or its child stories. The Stories Progress report lists and highlights user stories according to the following criteria: stories appear in order of their importance, based on their assigned ranking; stories appear in bold type when they are in the active or resolved state; stories appear in normal type when they are in the closed state; stories appear in gray type when their assigned iteration or area is outside the filtered set but they have tasks or child stories that are within the filtered set of iterations or product areas.
  • #53: The Requirements Progress report shows the status of completion as determined by the tasks that have been defined to implement the requirement.
  • #54: The Requirements Overview report presents a snapshot of the work that has been performed for the filtered set of requirements to the current date.
  • #55: By reviewing a release burndown report, you can understand how quickly your team has delivered backlog items and track how much work the team must still perform to complete a product release. A release burndown graph shows how much work remained at the start of each sprint in a release. The source of the raw data is your product backlog. Each sprint appears along the horizontal axis, and the vertical axis measures the effort that remained when each sprint started. The amount of estimated effort on the vertical axis is in whatever unit that your scrum team has decided to use (for example, story points or hours).
  • #56: By reviewing a sprint burndown report, you can track how much work remains in a sprint backlog, understand how quickly your team has completed tasks, and predict when your team will achieve the goal or goals of the sprint. A sprint burndown report shows how much work remained at the end of specified intervals during a sprint. The source of the raw data is the sprint backlog. The horizontal axis shows days in a sprint, and the vertical axis measures the amount of work that remains to complete the tasks in the sprint. The work that remains is shown in hours. A sprint burndown graph displays the following pieces of data: the Ideal Trend line indicates an ideal situation in which the team burns down all of the effort that remains at a constant rate by the end of the sprint; the In Progress series shows how many hours remain for tasks that are marked as In Progress in a sprint; the To Do series shows how many hours remain for tasks that are marked as To Do in a sprint. Both the In Progress and the To Do series are drawn based on the actual progress of your team as it completes tasks.
  • #57: Toward the end of an iteration, you can use the Unplanned Work report to determine how much work was added to the iteration that was not planned at the start of the iteration. You can view the unplanned work as measured by work items added, such as tasks, test cases, user stories, and bugs. Having unplanned work may be acceptable, especially if the team has scheduled a sufficient buffer for handling the load of unplanned work (for example, bugs). On the other hand, the unplanned work may represent a real problem if the team does not have the capacity to meet it and is forced to cut back on the planned work. The Unplanned Work report is useful when the team plans an iteration by identifying all work items that they intend to resolve or close during the course of the iteration. The work items that are assigned to the iteration by the plan completion date of the report are considered planned work. All work items that are added to the iteration after that date are identified as unplanned work.
  • #58: The Test Case Readiness report provides an area graph that shows how many test cases are in the Design or Ready state over the time period that you specify. By reviewing this data, you can easily determine how quickly the team is designing test cases and making them ready for testing. When you create a test case, it is automatically set to the design state. After the team has reviewed and approved the test case, then a team member should change its state to Ready, which indicates that the test case is ready to be run.
  • #59: The data that appears in the Test Plan Progress report is derived from the data warehouse and the test results that are generated when tests are run by using Microsoft Test Manager. The report presents an area graph that shows the most recent result of running any test in the specified test plans over time. For more information, see Running Tests. The horizontal axis shows days in a sprint or iteration, and the vertical axis shows test points. A test point is a pairing of a test case with a test configuration in a specific test suite. For more information about test points, see Reporting on Testing Progress for Test Plans.