
Module of Software Engineering

Course Code: CoSc3061

Comprehensive Notes

May 29, 2025

Contents
Module Introduction
Module Objectives
Organization of the Module
1 Chapter 1: Introduction
1.1 Two Orthogonal Views of Software
1.2 Software Development Process Models
1.2.1 Software Process
1.2.2 Software Life Cycle and Process Models
1.2.3 Process Activities
1.2.4 Process Assessment Models
1.2.5 Software Process Metrics
1.3 Object-Oriented System Development Methodology
1.3.1 Why an Object-Oriented Approach
1.3.2 Overview of the Unified Approach
1.3.3 An Object-Oriented Philosophy (Reading Assignment)
1.3.4 Basic Concepts of an Object
1.3.5 Attributes of an Object, Its State, and Properties
2 Chapter 2: Unified Modeling Language (UML)
2.1 An Overview of UML
2.2 Building Blocks of UML
2.3 UML Diagrams
2.3.1 Use Case Diagrams
2.3.2 Class Diagrams
2.3.3 State Chart Diagram
2.3.4 Activity Diagrams
2.3.5 Diagram Organization
2.3.6 Diagram Extensions
3 Chapter 3: Requirements Elicitation
3.1 An Overview of Requirements Elicitation
3.2 Requirements Elicitation Concepts
3.2.1 Functional Requirements
3.2.2 Non-Functional and Pseudo-Requirements
3.2.3 Levels of Description
3.2.4 Correctness, Completeness, Consistency, Clarity, and Realism
3.2.5 Verifiability and Traceability
3.3 Requirements Elicitation Activities
3.3.1 Identifying Actors
3.3.2 Identifying Scenarios
3.3.3 Identifying Use Cases
3.3.4 Refining Use Cases
3.3.5 Identifying Relationships among Actors and Use Cases
3.3.6 Identifying Initial Analysis Objects
3.3.7 Identifying Non-Functional Requirements
3.4 Managing Requirements Elicitation
3.4.1 Eliciting Information from Users: Knowledge Analysis of Tasks
3.4.2 Negotiating Specifications with Clients: Joint Application Design
3.4.3 Validating Requirements: Usability Testing
3.4.4 Documenting Requirements Elicitation
4 Chapter 4: Software Project Management
4.1 Responsibility of Software Project Managers
4.2 Project Planning
4.3 The Organization of SPMP Document
4.4 Project Size Estimation Metrics
4.5 Project Estimation Technique
4.6 Scheduling, Organization, and Team Structures
4.7 Risk Management
4.8 Quality Assurance Monitoring Plans
5 Chapter 5: Analysis
5.1 Analysis Concepts
5.1.1 Entity, Boundary, and Control Objects
5.1.2 Association Multiplicity Revisited
5.1.3 Qualified Associations
5.1.4 Generalization
5.2 Analysis Activities: From Use Cases to Objects
5.2.1 Identifying Entity Objects
5.2.2 Identifying Boundary Objects
5.2.3 Identifying Control Objects
5.2.4 Modeling Interactions between Objects: Sequence Diagrams
5.2.5 Identifying Associations
5.2.6 Identifying Attributes
5.2.7 Modeling Generalization Relationships between Objects
5.2.8 Reviewing the Analysis Model
6 Chapter 6: Object-Oriented System Design
6.1 System Design Concepts
6.1.1 Subsystems and Classes
6.1.2 Services and Subsystem Interfaces
6.1.3 Coupling and Coherence
6.1.4 Software Architecture
6.2 System Design Activities: From Objects to Subsystems
6.2.1 Identifying Design Goals
6.3 Documenting System Design
6.4 An Overview of Object Design
6.5 Object Design Concepts
6.5.1 Application Objects vs. Solution Objects Revisited
6.5.2 Types, Signatures, and Visibility Revisited
6.5.3 Contracts: Invariants, Preconditions, and Post-Conditions
6.5.4 UML’s Object Constraint Language
7 Chapter 7: Software Quality Assurance
7.1 Overview of Software Quality Assurance
7.2 Quality Control Techniques
7.2.1 Fault Avoidance Techniques
7.2.2 Fault Detection Techniques
7.2.3 Fault Tolerance Techniques
7.3 Testing Concepts
7.4 Testing Activities
7.4.1 Inspecting Components
7.4.2 Unit Testing

Module Introduction
• Purpose: Introduces students to managing complexity in software development through
modeling, covering the software development lifecycle: requirements gathering, analy-
sis, design, implementation, testing, and maintenance.
• Focus: Emphasizes Object-Oriented (OO) concepts, tools, and methodologies, par-
ticularly using the Unified Modeling Language (UML) as the standard notation for
modeling OO systems.
• Scope: Covers both structured and OO approaches, including:
– System requirement analysis and software specification.
– Design methods, software testing, and project management techniques.
– Software reuse, Computer-Aided Software Engineering (CASE), and UML-based
modeling.
• Relevance: UML is widely adopted by major software developers (e.g., Microsoft,
Oracle), making it a critical skill for modeling OO systems.

Module Objectives
• Goals:
– Provide a thorough understanding of Object-Oriented Software Engineering princi-
ples.
– Teach students to elicit, analyze, specify, validate, and manage software require-
ments, producing complete and consistent requirement documents.
– Explain the OO software development process, including methodologies and work-
flows.
– Equip students with skills to manage software projects effectively.
– Enable students to build multiple system models (e.g., use case, object, dynamic
models) and apply them across development phases.
• Learning Outcomes:
– Describe UML theory, concepts, and methods in detail.
– Create requirements using use case modeling.
– Demonstrate conceptual and technical skills in analyzing, designing, and implement-
ing OO software systems.
– Use tools and techniques for OO software engineering.
– Solve software development problems (from specification to testing) individually and
in teams.

Organization of the Module


• Structure: The module is divided into seven chapters, each addressing a key aspect
of software engineering with a focus on OO methodologies:

– Chapter 1: Introduction: Covers software processes, process models, and OO
system development.
– Chapter 2: Unified Modeling Language (UML): Explores UML notations and
diagrams for modeling.
– Chapter 3: Requirements Elicitation: Discusses techniques for gathering and
specifying requirements.
– Chapter 4: Software Project Management: Addresses project planning, esti-
mation, and risk management.
– Chapter 5: Analysis: Focuses on analyzing use cases to identify objects and
relationships.
– Chapter 6: Object-Oriented System Design: Covers system and object design
concepts and activities.
– Chapter 7: Software Quality Assurance: Examines testing and quality control
techniques.
• Approach: Combines theoretical concepts with practical applications, using UML for
OO modeling and emphasizing iterative development and testing.

1 Chapter 1: Introduction
General Objective
• Introduce the concept of a software process as a coherent set of activities for software
production.

Specific Objectives
• Understand software processes and process models.
• Learn three generic process models (Waterfall, Incremental, Reuse-Oriented) and their
applicability.
• Explore fundamental process activities: requirements engineering, development, test-
ing, and evolution.
• Understand the need for processes to handle changing requirements and designs.
• Learn how the Rational Unified Process (RUP) integrates good software engineering
practices for adaptable processes.

1.1 Two Orthogonal Views of Software


• Traditional Approach:
– Views software as a collection of functions/procedures and isolated data.
– Defined as: Algorithms + Data Structures = Programs.
– Focuses on system functions, with different methodologies for each development
phase.

– Challenges: Complex transitions between phases increase project duration and
complexity.
• Object-Oriented Approach:
– Centers on objects that combine data and functionality.
– Objects are modular, easily replaced, modified, and reused.
– Reduces complexity and redundancy, enabling easier phase transitions.
• Comparison (Table 1.1):
– Traditional: Procedure-focused, complex phase transitions, longer project dura-
tion, higher complexity.
– OO: Object-focused, modular, easier transitions, shorter duration, reduced com-
plexity.
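The contrast above can be sketched in a few lines of code. The bank-account data and function names below are invented for illustration, not taken from the module:

```python
# Traditional view: isolated data plus free-standing procedures.
balances = {"acc1": 100.0}

def deposit(store, acc_id, amount):
    store[acc_id] = store.get(acc_id, 0.0) + amount

# Object-oriented view: data and functionality packaged in one object,
# which can be replaced or reused as a unit.
class Account:
    def __init__(self, balance=0.0):
        self.balance = balance      # attribute (data)

    def deposit(self, amount):      # method (functionality)
        self.balance += amount

deposit(balances, "acc1", 50.0)     # caller must know the data layout
acct = Account(100.0)
acct.deposit(50.0)                  # data layout is hidden inside the object
assert balances["acc1"] == acct.balance == 150.0
```

Both versions compute the same result; the OO version keeps the data and the operations that change it in one modular place.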

1.2 Software Development Process Models


1.2.1 Software Process
• Definition: A set of related activities leading to a software product, including devel-
opment from scratch or extending existing systems.
• Importance: Provides stability, control, and organization to prevent chaos in devel-
opment.
• Participants: Software engineers, managers, and clients (who request the software).
• Fundamental Activities:

1. Software Specification: Define functionality and operational constraints.


2. Software Design and Implementation: Produce software meeting the speci-
fication.
3. Software Validation: Ensure the software meets customer expectations.
4. Software Evolution: Adapt software to changing needs.

• Sub-Activities: Include requirements validation, architectural design, unit testing, etc.
• Supporting Activities: Documentation, configuration management.
• Process Descriptions:
– Include products (e.g., architecture model), roles (e.g., project manager), and pre-
/post-conditions (e.g., approved requirements before design).
• Categories:
– Plan-Driven: All activities planned in advance, progress measured against the
plan.
– Agile: Incremental planning, easier to adapt to changing requirements.
– Balance between plan-driven and agile is often needed.

1.2.2 Software Life Cycle and Process Models
• Software Process Model: A simplified, abstract representation of a process, provid-
ing partial information (e.g., activities but not roles).
• Generic Models (Process Paradigms):
1. Waterfall Model:
– Structure: Sequential phases (requirements, design, implementation, testing,
maintenance).
– Characteristics: No phase is complete until documentation is approved by
SQA. Feedback loops exist for maintenance.
– Why Waterfall?: Once a phase is done, progress moves forward without
returning, like water over a cliff.
– Advantages:
∗ Simple, easy to understand, and manage.
∗ Phases completed sequentially without overlap.
∗ Suitable for small projects with well-understood requirements.
– Disadvantages:
∗ Inflexible; difficult to revisit earlier phases.
∗ No working software until late in the cycle.
∗ High risk and uncertainty.
∗ Poor for long or changing-requirement projects.
2. Incremental Development:
– Structure: Interleaves specification, development, and validation, delivering
system increments.
– Characteristics: Each increment adds functionality, starting with a basic
version.
– Example: Word processor increments (file management, advanced editing,
spell-check, layout).
– Advantages:
∗ Early working software delivery.
∗ Flexible, low-cost requirement changes.
∗ Easier testing/debugging in smaller iterations.
∗ Customer feedback per increment.
∗ Lower initial cost, better risk management.
– Disadvantages:
∗ Needs good planning and design.
∗ Requires clear system definition upfront.
∗ Higher total cost than waterfall.
3. Reuse-Oriented Software Engineering:
– Structure: Integrates reusable components (e.g., web services, object collec-
tions, COTS systems).
– Stages:
∗ Component analysis: Search for matching components.

∗ Requirements modification: Adjust requirements to fit components.
∗ System design with reuse: Design framework around components.
∗ Development and integration: Develop custom code and integrate compo-
nents.
– Advantages:
∗ Reduced cost and risk.
∗ Faster delivery.
– Disadvantages:
∗ Compromised requirements.
∗ Loss of control over component evolution.
∗ Increased maintenance costs.
∗ Not-invented-here syndrome.
∗ Limited tool support.

1.2.3 Process Activities


• Overview: Real processes interleave technical, collaborative, and managerial activities
for specifying, designing, implementing, and testing software.
• Core Activities:

1. Software Specification (Requirements Engineering):


– Understand and define required services and constraints.
– Produces an agreed requirements document.
– Sub-Activities:
∗ Feasibility Study: Assess if user needs can be met with current technology,
cost-effectively.
∗ Requirements Elicitation and Analysis: Derive requirements via obser-
vation, discussions, and prototypes.
∗ Requirements Specification: Translate information into user and system
requirements.
∗ Requirements Validation: Check for realism, consistency, and complete-
ness.
2. Software Design and Implementation:
– Convert specifications into executable systems via design and programming.
– Design Activities:
∗ Architectural design: Define system structure and components.
∗ Interface design: Specify unambiguous component interfaces.
∗ Component design: Detail component functionality.
∗ Database design: Design data structures and database representation.
3. Software Validation (Verification and Validation):
– Ensure the system meets specifications and customer expectations.
– Primarily uses program testing, supplemented by inspections and reviews.
4. Software Evolution (Maintenance):

– Adapt software to changing needs; changing software is cheaper than changing hardware.
– Evolution is often more costly than the original development; it is considered less creative but is critical.

1.2.4 Process Assessment Models


• Purpose: Improve processes by measuring attributes, deriving metrics, and using
indicators for enhancement strategies.
• Measurement:
– Indirectly assess process efficacy via outcomes (e.g., errors, defects, productivity,
effort, schedule).
– Metrics include size-oriented (lines of code) and function-oriented (function points).
• Types:
– Process Metrics: Strategic, provide insight into process effectiveness.
– Project Metrics: Tactical, help adapt workflow and technical approaches.
• Benefits:
– Improve planning, tracking, and control.
– Assess product quality and guide improvements.
• Implementation:
– Use control charts for statistical validity.
– Follow a goal-driven approach with a metrics baseline database.

1.2.5 Software Process Metrics


• Purpose: Used for strategic (process) and tactical (project) purposes.
• Applications:
– Estimation: Use past project metrics for effort and time estimates.
– Monitoring: Compare actual effort/time to estimates for progress control.
– Technical Metrics: Assess design quality, guide coding and testing.
• Goals:
– Minimize development schedules by avoiding delays and risks.
– Assess and improve product quality, reducing defects and rework.
• Metric Types:
– Inputs: Resources (e.g., people, tools).
– Outputs: Deliverables (e.g., documentation, code).
– Results: Effectiveness of deliverables.
• Variability: Metrics vary across projects due to factors like team skill, technology,
and customer knowledge.
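As a rough sketch of how size-oriented process metrics are derived from past projects, the figures below are invented for illustration (real values would come from a metrics baseline database):

```python
# Hypothetical baseline data for two completed projects.
projects = [
    {"name": "P1", "loc": 12_000, "defects": 36, "person_months": 8},
    {"name": "P2", "loc": 20_000, "defects": 44, "person_months": 11},
]

def defect_density(p):
    """Defects per thousand lines of code (KLOC) - a quality indicator."""
    return p["defects"] / (p["loc"] / 1000)

def productivity(p):
    """Lines of code delivered per person-month - an effort indicator."""
    return p["loc"] / p["person_months"]

for p in projects:
    print(p["name"], round(defect_density(p), 2), round(productivity(p)))
```

Such derived metrics feed estimation (past productivity predicts future effort) and monitoring (actual defect density compared against the baseline).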

1.3 Object-Oriented System Development Methodology
• Overview: Differs from traditional function-based approaches by building modular,
reusable objects that encapsulate data and functionality.
• Key Concepts:
– Software as a collection of discrete objects modeling real-world entities.
– Objects have attributes (data) and methods (functions).
– Objects are grouped into classes, forming a dynamic class tree.
• Example: Windows objects (e.g., a window or chart) are self-contained, handling their
own operations (e.g., opening, drawing).

1.3.1 Why an Object-Oriented Approach


• Benefits:
– Easier to adapt to changes.
– More robust and maintainable.
– Promotes design and code reuse.
– Creates modular functionality.
• Reasons for Effectiveness:
– Higher Abstraction: Objects encapsulate data and methods, simplifying design
and maintenance.
– Seamless Transitions: Uses consistent OO language across analysis, design, and
implementation, reducing complexity.
– Good Programming Techniques: Classes are tightly cohesive and loosely cou-
pled, minimizing change impacts.
– Reusability: Objects model real-world domains, making them reusable across
projects.

1.3.2 Overview of the Unified Approach


• Unified Modeling Language (UML):
– A set of notations for describing and modeling applications.
– Does not specify development steps (methodology).
• Unified Approach (UA):
– Specifies development tasks, centered on Jacobson’s use case model.
– Use cases capture user goals and interactions with the system.
– Supports dynamic class trees and reusable class libraries.
• UA Components:
– Use-case driven development.
– UML for modeling.

– OO analysis and design.
– Reusable class repositories.
– Layered architecture.
– Incremental development and prototyping.
– Continuous testing.

1.3.3 An Object-Oriented Philosophy (Reading Assignment)


• Note: This is a reading assignment, likely discussing the philosophical foundations of
OO development, emphasizing modularity, encapsulation, and abstraction.

1.3.4 Basic Concepts of an Object


• Definition: An object packages data (attributes) and procedures (methods) to model
real-world entities.
• Characteristics:
– Attributes reflect the object’s state via data values.
– Methods define what the object can do.
– Attributes and methods are inseparable.
• Example: A car object has attributes (e.g., color) and methods (e.g., drive).

1.3.5 Attributes of an Object, Its State, and Properties


• Attributes: Data elements defining an object’s characteristics (e.g., a car’s color,
speed).
• State: The current values of an object’s attributes at a given time.
• Properties: Behaviors or constraints associated with attributes (e.g., valid color val-
ues).
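The car example above can be sketched in code. The class, its valid-color constraint, and the method names are assumptions made for illustration:

```python
class Car:
    """Attributes hold data; the 'color' property enforces a constraint."""
    VALID_COLORS = {"red", "blue", "black"}   # assumed constraint (property)

    def __init__(self, color, speed=0):
        self.color = color        # goes through the property setter below
        self.speed = speed        # attribute; part of the object's state

    @property
    def color(self):
        return self._color

    @color.setter
    def color(self, value):
        if value not in self.VALID_COLORS:
            raise ValueError(f"invalid color: {value}")
        self._color = value

    def drive(self, delta):       # method: changes the object's state
        self.speed += delta

car = Car("red")
car.drive(30)
print(car.color, car.speed)       # the state is the current attribute values
```

Here the attributes are `color` and `speed`, the state is their current values (e.g., `"red"`, `30` after driving), and the property constraint rejects invalid colors.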

2 Chapter 2: Unified Modeling Language (UML)


2.1 An Overview of UML
• Definition: UML is a standardized modeling language for visualizing, specifying,
constructing, and documenting software systems.
• Purpose: Provides a common notation for OO modeling, adopted by major developers.
• Applications: Used across analysis, design, and implementation phases.

2.2 Building Blocks of UML


• Components:
– Things: Structural elements (e.g., classes, objects, use cases).
– Relationships: Connections between things (e.g., associations, dependencies).
– Diagrams: Visual representations of system models.

2.3 UML Diagrams
• Overview: UML includes various diagrams to model different system aspects.

2.3.1 Use Case Diagrams


• Model interactions between actors (users/systems) and the system.
• Show system functionality from the user’s perspective.

2.3.2 Class Diagrams


• Represent static structure, showing classes, attributes, methods, and relationships (e.g.,
inheritance, associations).

2.3.3 State Chart Diagram


• Model the dynamic behavior of objects, showing state transitions triggered by events.
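What a state chart expresses graphically can be sketched as a transition table in code. The door example below is hypothetical (states and events are my own), but it shows the core idea: an event received in a given state triggers a transition to a new state.

```python
# Hypothetical state machine for a door: (current state, event) -> next state.
TRANSITIONS = {
    ("Closed", "open"):   "Open",
    ("Open",   "close"):  "Closed",
    ("Closed", "lock"):   "Locked",
    ("Locked", "unlock"): "Closed",
}

def next_state(state, event):
    # Events with no defined transition leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

state = "Closed"
for event in ["open", "close", "lock"]:
    state = next_state(state, event)
print(state)  # → Locked
```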

2.3.4 Activity Diagrams


• Depict workflows and processes, showing activities, decisions, and transitions.

2.3.5 Diagram Organization


• Guidelines for structuring and combining diagrams for clarity and coherence.

2.3.6 Diagram Extensions


• Techniques for customizing UML diagrams to specific needs or domains.

3 Chapter 3: Requirements Elicitation


3.1 An Overview of Requirements Elicitation
• Purpose: Gather and define system requirements to ensure the system meets user
needs.
• Process: Involves understanding stakeholder needs and constraints.

3.2 Requirements Elicitation Concepts


3.2.1 Functional Requirements
• Specify what the system must do (e.g., features, operations).

3.2.2 Non-Functional and Pseudo-Requirements


• Non-Functional: Define system qualities (e.g., performance, security).
• Pseudo-Requirements (Constraints): Restrictions imposed by the client or environment on the implementation (e.g., a required programming language or platform).

3.2.3 Levels of Description
• Requirements described at varying detail levels (e.g., high-level user needs, detailed
system specs).

3.2.4 Correctness, Completeness, Consistency, Clarity, and Realism


• Correctness: Requirements accurately reflect user needs.
• Completeness: All necessary requirements are included.
• Consistency: No contradictions among requirements.
• Clarity: Requirements are unambiguous.
• Realism: Requirements are feasible within constraints.

3.2.5 Verifiability and Traceability


• Verifiability: Requirements can be tested to confirm they are met.
• Traceability: Requirements can be linked to design, code, and tests.

3.3 Requirements Elicitation Activities


3.3.1 Identifying Actors
• Determine external entities (users, systems) interacting with the system.

3.3.2 Identifying Scenarios


• Describe specific instances of system use to capture user interactions.

3.3.3 Identifying Use Cases


• Define functional requirements as use cases, detailing actor-system interactions.

3.3.4 Refining Use Cases


• Improve use case clarity and detail through iteration.

3.3.5 Identifying Relationships among Actors and Use Cases


• Model dependencies and interactions (e.g., includes, extends relationships).

3.3.6 Identifying Initial Analysis Objects


• Identify key objects involved in use cases for further analysis.

3.3.7 Identifying Non-Functional Requirements


• Specify quality attributes (e.g., performance, usability) alongside functional require-
ments.

3.4 Managing Requirements Elicitation
3.4.1 Eliciting Information from Users: Knowledge Analysis of Tasks
• Analyze user tasks to understand their needs and workflows.

3.4.2 Negotiating Specifications with Clients: Joint Application Design


• Collaborative sessions with stakeholders to agree on requirements.

3.4.3 Validating Requirements: Usability Testing


• Test requirements with users to ensure they meet expectations.

3.4.4 Documenting Requirements Elicitation


• Create a formal requirements document (e.g., SRS) to guide development.

4 Chapter 4: Software Project Management


4.1 Responsibility of Software Project Managers
• Oversee project planning, execution, and delivery, ensuring scope, time, and quality
goals are met.

4.2 Project Planning


• Define project objectives, resources, timelines, and deliverables.

4.3 The Organization of SPMP Document


• Structure of the Software Project Management Plan (SPMP), including scope, sched-
ule, and risk management.

4.4 Project Size Estimation Metrics


• Techniques to estimate project size (e.g., lines of code, function points).

4.5 Project Estimation Technique


• Methods for estimating effort, cost, and duration (e.g., COCOMO, expert judgment).
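The Basic COCOMO model mentioned above can be shown concretely. It estimates effort as a power function of size in KLOC, effort = a · KLOC^b person-months, and duration as c · effort^d months, using Boehm's published constants for the three project modes:

```python
def cocomo_basic(kloc, mode="organic"):
    """Basic COCOMO: effort = a * KLOC^b (person-months),
    duration = c * effort^d (months), with Boehm's constants per mode."""
    a, b, c, d = {
        "organic":       (2.4, 1.05, 2.5, 0.38),
        "semi-detached": (3.0, 1.12, 2.5, 0.35),
        "embedded":      (3.6, 1.20, 2.5, 0.32),
    }[mode]
    effort = a * kloc ** b
    duration = c * effort ** d
    return effort, duration

# Estimate a 32-KLOC organic-mode project.
effort, duration = cocomo_basic(32, "organic")
print(f"{effort:.1f} person-months over {duration:.1f} months")
```

The superlinear exponent b captures the observation that effort grows faster than code size as projects get larger.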

4.6 Scheduling, Organization, and Team Structures


• Create project schedules and define team roles and structures (e.g., hierarchical, ma-
trix).

4.7 Risk Management


• Identify, assess, and mitigate project risks (e.g., technical, schedule risks).

4.8 Quality Assurance Monitoring Plans
• Define processes to monitor and ensure software quality throughout the project.

5 Chapter 5: Analysis
5.1 Analysis Concepts
5.1.1 Entity, Boundary, and Control Objects
• Entity Objects: Represent persistent data (e.g., database entities).
• Boundary Objects: Interface between the system and external actors (e.g., UI com-
ponents).
• Control Objects: Manage interactions and workflows within the system.
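The three object types can be sketched together in a tiny, hypothetical order-placement flow (class names are illustrative, not from the notes):

```python
class Order:                      # entity object: persistent domain data
    def __init__(self, item, qty):
        self.item, self.qty = item, qty

class OrderForm:                  # boundary object: interface to the actor
    def read_input(self):
        return {"item": "book", "qty": 2}   # stands in for real user input

class OrderController:            # control object: coordinates the use case
    def place_order(self, form):
        data = form.read_input()
        return Order(data["item"], data["qty"])

order = OrderController().place_order(OrderForm())
print(order.item, order.qty)  # → book 2
```

The controller touches both the boundary and the entity, but the boundary and entity objects never reference each other, which keeps the user interface separable from the domain model.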

5.1.2 Association Multiplicity Revisited


• Define the number of instances in relationships between objects (e.g., one-to-many).

5.1.3 Qualified Associations


• Associations with qualifiers (e.g., keys) to reduce multiplicity or specify access.
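A common way to realize a qualified association in code is a dictionary keyed by the qualifier. In this hypothetical sketch, qualifying the Bank–Account association by account number turns a one-to-many association into a one-to-one lookup:

```python
class Account:
    def __init__(self, number, owner):
        self.number, self.owner = number, owner

class Bank:
    def __init__(self):
        self._accounts = {}                  # qualifier key -> one Account

    def add(self, account):
        self._accounts[account.number] = account

    def lookup(self, number):
        # With the qualifier supplied, exactly one Account is reachable.
        return self._accounts[number]

bank = Bank()
bank.add(Account("A-100", "Alice"))
print(bank.lookup("A-100").owner)  # → Alice
```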

5.1.4 Generalization
• Model inheritance relationships between classes (e.g., superclass-subclass).

5.2 Analysis Activities: From Use Cases to Objects


5.2.1 Identifying Entity Objects
• Extract persistent objects from use cases (e.g., data entities).

5.2.2 Identifying Boundary Objects


• Identify objects that interact with external actors (e.g., forms, screens).

5.2.3 Identifying Control Objects


• Define objects that coordinate system behavior (e.g., controllers).

5.2.4 Modeling Interactions between Objects: Sequence Diagrams


• Create sequence diagrams to show object interactions over time.

5.2.5 Identifying Associations


• Define relationships between objects (e.g., associations, dependencies).

5.2.6 Identifying Attributes
• Specify data attributes for objects.

5.2.7 Modeling Generalization Relationships between Objects


• Model inheritance hierarchies to promote reuse.

5.2.8 Reviewing the Analysis Model


• Validate the analysis model for completeness and consistency.

6 Chapter 6: Object-Oriented System Design


6.1 System Design Concepts
6.1.1 Subsystems and Classes
• Decompose the system into subsystems, each containing related classes to reduce com-
plexity.
• Subsystems are cohesive units with defined interfaces.

6.1.2 Services and Subsystem Interfaces


• Services: Sets of related operations provided by a subsystem.
• Subsystem Interfaces: Define how subsystems interact with others, ensuring mod-
ularity.

6.1.3 Coupling and Cohesion


• Coupling: Degree of dependency between subsystems (low coupling is desirable).
• Cohesion: Strength of internal relationships within a subsystem (high cohesion is
desirable).

6.1.4 Software Architecture


• Defines the system’s structure and organization.
• Common architectures:
– Repository Architecture: Subsystems access a central data structure (e.g., parse
tree in compilers).
– Model/View/Controller (MVC):
∗ Model: Maintains domain knowledge.
∗ View: Displays data to users.
∗ Controller: Manages user interactions.
∗ Uses subscribe/notify for state propagation.
– Client/Server Architecture: Servers provide services to clients via remote calls
(e.g., web systems).

– Peer-to-Peer Architecture: Subsystems act as both clients and servers, increasing
complexity.
– Pipe and Filter Architecture: Subsystems (filters) process data streams via pipes
(e.g., Unix shell).
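The subscribe/notify mechanism at the heart of MVC can be sketched as follows (a minimal illustration; class names are generic, not tied to any framework):

```python
class Model:
    """Maintains domain knowledge and notifies subscribed views of changes."""
    def __init__(self):
        self._value = 0
        self._subscribers = []

    def subscribe(self, view):
        self._subscribers.append(view)

    def set_value(self, value):
        self._value = value
        for view in self._subscribers:   # notify: propagate state changes
            view.update(self._value)

class View:
    """Displays data; here it just records the last value shown."""
    def __init__(self):
        self.shown = None

    def update(self, value):
        self.shown = value               # a real view would redraw here

class Controller:
    """Manages user interactions by translating input into model changes."""
    def __init__(self, model):
        self.model = model

    def handle_input(self, value):
        self.model.set_value(value)

model, view = Model(), View()
model.subscribe(view)
Controller(model).handle_input(42)
print(view.shown)  # → 42
```

Note the direction of dependencies: the model knows nothing about concrete views beyond the `update` interface, so views can be added or replaced without touching the model.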

6.2 System Design Activities: From Objects to Subsystems


6.2.1 Identifying Design Goals
• Define qualities the system should prioritize (e.g., reliability, security, modifiability).
• Derived from non-functional requirements and application domain.
• Criteria:
– Performance: Response time, throughput, memory usage.
– Dependability: Robustness, reliability, availability, fault tolerance, security, safety.
– Cost: Development, deployment, upgrade, maintenance, administration costs.
– Maintenance: Extensibility, modifiability, adaptability, portability, readability,
traceability.
– End-User: Utility, usability.
• Trade-Offs: Balance goals (e.g., space vs. speed, delivery time vs. functionality).

6.3 Documenting System Design


• System Design Document (SDD):
– Describes design goals, subsystem decomposition, hardware/software mapping, data
management, access control, control flow, and boundary conditions.
– Used to define interfaces and guide developers.
• Template:

1. Introduction (purpose, goals, definitions).


2. Current software architecture.
3. Proposed architecture (subsystem decomposition, mapping, data management,
security, control, boundary conditions).
4. Subsystem services.
5. Glossary.

6.4 An Overview of Object Design


• Purpose: Bridge the gap between analysis/system design and implementation by
refining models and identifying objects.
• Activities:
– Service Specification: Define subsystem interfaces (APIs) with operations, argu-
ments, and exceptions.

– Component Selection: Use/adapt off-the-shelf components (e.g., class libraries).
– Restructuring: Manipulate models to increase reuse (e.g., merge classes, simplify
associations).
– Optimization: Improve performance (e.g., change algorithms, add redundant as-
sociations).

6.5 Object Design Concepts


6.5.1 Application Objects vs. Solution Objects Revisited
• Application Objects: Represent domain concepts (e.g., customer, order).
• Solution Objects: Support components without domain counterparts (e.g., UI, database).

6.5.2 Types, Signatures, and Visibility Revisited


• Types: Specify attribute/parameter value ranges and operations (e.g., int for a
counter).
• Signatures: Define operation parameter and return types (e.g., put(Object, Object):void).
• Visibility:
– Private: Accessible only by the class.
– Protected: Accessible by class and subclasses.
– Public: Accessible by all classes.
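Python marks these three visibility levels by convention rather than compiler enforcement, but the mapping is still instructive (the `Counter` class below is a hypothetical example):

```python
class Counter:
    def __init__(self):
        self.count = 0          # public: accessible by all classes
        self._step = 1          # protected by convention: class and subclasses
        self.__limit = 100      # "private": name-mangled to _Counter__limit

    def increment(self):        # public operation; signature: increment() -> int
        if self.count + self._step <= self.__limit:
            self.count += self._step
        return self.count

c = Counter()
print(c.increment())  # → 1
```

In languages such as Java or C++ the compiler enforces these levels; Python only mangles double-underscore names, so the distinction is a design signal rather than a guarantee.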

6.5.3 Contracts: Invariants, Preconditions, and Post-Conditions


• Contracts: Constraints ensuring caller and callee assumptions align.
• Types:
– Invariants: Always true for class instances (e.g., data consistency).
– Preconditions: Must be true before an operation.
– Postconditions: Must be true after an operation.
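All three kinds of contract can be made executable with assertions. This is a sketch using a hypothetical bank-account withdrawal, with the invariant that the balance never goes negative:

```python
class Account:
    def __init__(self, balance):
        assert balance >= 0                    # invariant holds at construction
        self.balance = balance

    def withdraw(self, amount):
        assert 0 < amount <= self.balance      # precondition: caller's obligation
        old = self.balance
        self.balance -= amount
        assert self.balance == old - amount    # postcondition: callee's obligation
        assert self.balance >= 0               # invariant: always true afterwards
        return self.balance

acct = Account(100)
print(acct.withdraw(30))  # → 70
```

If the caller violates the precondition (for example, overdrawing), the assertion fails immediately at the call boundary, which is exactly where the contract assigns the blame.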

6.5.4 UML’s Object Constraint Language


• OCL: Formal language for specifying constraints on UML elements (e.g., attributes,
classes).
• Usage: Define invariants, preconditions, and postconditions as true/false expressions.
• Limitations: Not procedural; cannot express control flow.

7 Chapter 7: Software Quality Assurance


Objectives
• Describe quality assurance issues.

• Learn non-execution-based testing (inspections).
• Understand execution-based testing principles.
• Identify what needs to be tested.

7.1 Overview of Software Quality Assurance


• Reliability: Probability that a system operates without failure under specified conditions for a specified period of time.
• Definitions:
– Failure: Deviation from specified behavior.
– Error: State leading to potential failure.
– Fault (Bug): Mechanical/algorithmic defect causing errors.
• Testing: Systematic attempt to find errors, aiming to identify faults for correction.
• Contrast: Testing is not about proving correctness but demonstrating faults exist.

7.2 Quality Control Techniques


• Categories:

1. Fault Avoidance: Prevent errors statically (before execution).


2. Fault Detection: Identify errors during development (e.g., debugging, testing).
3. Fault Tolerance: Recover from failures at runtime.

7.2.1 Fault Avoidance Techniques


• Techniques:
– Development Methodologies: Use models to reduce complexity (e.g., encapsu-
lation, low coupling).
– Configuration Management: Control changes to prevent inconsistencies.
– Verification: Prove correctness formally (limited to specific cases).
– Reviews:
∗ Walkthroughs: Informal developer presentations of code/APIs.
∗ Inspections: Formal review by a team, checking against requirements.

7.2.2 Fault Detection Techniques


• Types:
– Debugging:
∗ Start from failures to find erroneous states and faults.
∗ Correctness Debugging: Fix functional deviations.
∗ Performance Debugging: Address non-functional issues (e.g., speed).
– Testing:
∗ Create planned failures to identify errors.

∗ A successful test is one that finds defects; testing cannot prove their absence.
∗ Types:
· Unit Testing: Test individual objects/subsystems.
· Integration Testing: Test combined components.
· System Testing: Test the entire system (functional, performance).
· Acceptance Testing: Client validates against requirements.

7.2.3 Fault Tolerance Techniques


• Purpose: Recover from failures during execution.
• Examples:
– Atomic Transactions: Ensure actions complete or revert (databases).
– Modular Redundancy: Use multiple components for the same task (e.g., space
shuttle computers).
• Scope: Critical for high-reliability systems.
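The modular-redundancy idea can be sketched as majority voting over replicated components: with three replicas, one faulty result is outvoted (a minimal illustration, not any particular system's implementation):

```python
from collections import Counter

def majority_vote(results):
    """Return the value produced by a majority of redundant components."""
    value, votes = Counter(results).most_common(1)[0]
    if votes <= len(results) // 2:
        raise RuntimeError("no majority: fault cannot be masked")
    return value

# Two healthy replicas agree; the one faulty replica is outvoted.
print(majority_vote([42, 42, 7]))  # → 42
```

Triple modular redundancy masks a single fault per vote; tolerating more simultaneous faults requires more replicas, which is why it is reserved for high-reliability systems.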

7.3 Testing Concepts


• Elements:
– Component: Testable system part (object, subsystem).
– Fault: Design/coding mistake.
– Error: Fault manifestation during execution.
– Failure: Deviation from specification.
– Test Case: Inputs and expected outputs to detect faults.
– Test Stub/Driver: Simulate dependencies for isolated testing.
– Correction: Change to fix faults, potentially introducing new ones.
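The stub/driver pair can be sketched in a few lines. Here the dependency is injected so a stub can replace a database that is not yet built (function names are illustrative):

```python
def database_stub(user_id):
    """Test stub: simulates the real database lookup with canned data."""
    return {"id": user_id, "name": "Test User"}

def greet(user_id, lookup):
    """Component under test; its dependency is injected as a parameter."""
    user = lookup(user_id)
    return f"Hello, {user['name']}!"

def test_driver():
    """Test driver: supplies inputs, calls the component, checks outputs."""
    assert greet(7, database_stub) == "Hello, Test User!"
    return "passed"

print(test_driver())  # → passed
```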

7.4 Testing Activities


7.4.1 Inspecting Components
• Process: Formal review of source code to find defects (Fagan’s method).
• Steps:
– Overview, preparation, inspection meeting, rework, follow-up.

7.4.2 Unit Testing


• Focus: Test individual components (objects, subsystems).
• Benefits: Simplifies testing, isolates faults, enables parallel testing.
• Techniques:
– Equivalence Testing: Test representative inputs from equivalence classes.
– Boundary Testing: Focus on edge cases (e.g., zero, max values).

– Path Testing: Exercise all code paths (white box).
– State-Based Testing: Test object states using UML state charts.
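Equivalence and boundary testing can be illustrated together on a hypothetical grading function: pick one representative input per equivalence class, then probe the edges of each class.

```python
def grade(score):
    """Hypothetical component under test: pass/fail on a 0-100 scale."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

# Equivalence testing: one representative per class
# (classes: invalid-low, fail, pass, invalid-high).
assert grade(25) == "fail"
assert grade(75) == "pass"

# Boundary testing: the edges where classes meet.
for boundary, expected in [(0, "fail"), (49, "fail"), (50, "pass"), (100, "pass")]:
    assert grade(boundary) == expected

print("all unit tests passed")
```

The boundary cases (0, 49, 50, 100) are where off-by-one faults cluster, which is why they get their own tests even though each already lies inside some equivalence class.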

