Stqa Insem Pyq

The document outlines the Software Development Life Cycle (SDLC) for a food delivery application, detailing stages from requirement analysis to continuous improvement. It also discusses the differences between testing and debugging, challenges in the SDLC, and quality control plans for various software applications. Additionally, it examines the relationship between quality and productivity, defines quality from different stakeholder perspectives, and differentiates between tools and techniques in software development.

CONSTRUCT SDLC FOR A REAL-LIFE FOOD DELIVERY APPLICATION
### 1. **Requirement Analysis**:
- Understand user needs: ordering food online, tracking delivery, payment options, restaurant ratings.
- Gather functional and non-functional requirements, like speed and scalability.
### 2. **Planning**:
- Define project scope, budget, and timeline.
- Identify resources (developers, testers, designers).
- Prepare risk management strategies.
### 3. **System Design**:
- Create architectural diagrams and UI/UX designs.
- Define database structure and APIs.
- Plan for app features like search filters, order history, notifications.
### 4. **Development**:
- Implement features using programming languages (e.g., Java, Python).
- Frontend development for the user interface (React, Angular).
- Backend development for server logic (Node.js, Django).
### 5. **Testing**:
- Perform unit, integration, and system testing.
- Check functionality, security, and performance.
- Fix bugs and ensure the app runs smoothly on different devices.
### 6. **Deployment**:
- Deploy the app on platforms like Google Play Store and Apple App Store.
- Set up servers for backend operations.
### 7. **Maintenance**:
- Monitor app performance and user feedback.
- Release updates and patches for new features or bug fixes.
- Ensure continuous system improvements.


The pillars of a Quality Management System (QMS) form the foundation for consistently delivering products or services that meet customer expectations while ensuring continuous improvement. Here are the core pillars:
### 1. **Customer Focus**:
- Meeting and exceeding customer needs is the primary aim.
- Understanding customer requirements and striving for satisfaction ensures loyalty and trust.
### 2. **Leadership**:
- Strong leadership establishes clear goals, vision, and direction.
- Leaders create an environment where employees feel motivated and aligned with the organization's objectives.
### 3. **Engagement of People**:
- Employees at all levels are critical to the organization's success.
- Encouraging involvement, skill development, and teamwork enhances overall performance.
### 4. **Process Approach**:
- Managing activities as interconnected processes boosts efficiency and consistency.
- This approach optimizes resource utilization and minimizes errors.
### 5. **Continuous Improvement**:
- Embracing innovation and regularly updating processes leads to higher efficiency and quality.
- Organizations use tools like PDCA (Plan-Do-Check-Act) to drive improvements.
### 6. **Evidence-Based Decision Making**:
- Decisions are made using accurate data and analysis.
- This ensures better control and alignment with organizational goals.
### 7. **Relationship Management**:
- Building and maintaining strong relationships with stakeholders, including suppliers, enhances mutual trust and long-term success.


DIFFERENCE BETWEEN TESTING AND DEBUGGING
### 1. **Definition**:
- **Testing**: The process of systematically identifying and reporting defects or bugs in software by executing it under controlled conditions.
- **Debugging**: The process of diagnosing, locating, and fixing the root causes of the defects found during testing or normal usage.
### 2. **Objective**:
- **Testing**: Focuses on evaluating software quality and ensuring it meets requirements.
- **Debugging**: Focuses on correcting the issues identified to ensure proper functionality.
### 3. **Who Performs It**:
- **Testing**: Typically conducted by dedicated **testers** or **Quality Assurance (QA)** teams.
- **Debugging**: Performed by **developers** or coders responsible for the code.
### 4. **Output**:
- **Testing**: Produces test reports that document defects and software quality.
- **Debugging**: Results in corrected code and a working software solution.


CHALLENGES AND PROBLEMATIC AREAS IN SDLC
1. **Requirement Gathering and Analysis**:
- Misunderstood or incomplete requirements can lead to software that doesn't meet client expectations.
- Changing requirements during development adds complexity.
2. **Planning**:
- Unrealistic project schedules and budgets often result in delays and overspending.
- Poor risk management leads to unpreparedness for unforeseen issues.
3. **Design Phase**:
- Inadequate or flawed design can cause issues during development and testing.
- Lack of proper documentation makes maintenance difficult.
4. **Development Phase**:
- Coding errors or lack of adherence to coding standards can lead to defective software.
- Inadequate collaboration among team members may slow progress.
5. **Testing Phase**:
- Limited test coverage may allow undetected defects to pass through.
- Delayed testing results in expensive bug fixes in later stages.
6. **Deployment**:
- Poorly planned deployment can lead to compatibility or performance issues on user platforms.
- Insufficient training for end-users may reduce adoption.
7. **Maintenance**:
- Lack of timely updates or patches can result in security vulnerabilities.
- Difficulty in scaling the application may lead to user dissatisfaction.


PLAN SOFTWARE QUALITY CONTROL WITH RESPECT TO COLLEGE ATTENDANCE SOFTWARE
### **1. Define Quality Objectives**:
- Ensure accurate attendance recording.
- Guarantee data security for student information.
- Deliver a user-friendly interface for administrators, faculty, and students.
### **2. Develop a Testing Strategy**:
- **Functional Testing**: Test core features like attendance marking, report generation, and editing attendance records.
- **Usability Testing**: Check if the software is easy to navigate for all users.
- **Performance Testing**: Ensure smooth operation under heavy user loads (e.g., during class registration).
- **Security Testing**: Validate that student data is encrypted and protected from unauthorized access.
### **3. Prepare Test Data**:
- Use real-life scenarios such as adding new students, handling absenteeism, and generating monthly attendance reports.
- Include edge cases like network failure or incorrect data entry.
### **4. Execute Testing in Phases**:
- **Unit Testing**: Verify individual features like attendance marking or user login.
- **Integration Testing**: Ensure modules like attendance and notifications work seamlessly together.
- **System Testing**: Test the complete software to identify issues in overall functionality.
### **5. Set Quality Metrics**:
- Monitor defect rates, resolution time for issues, and system uptime.
- Check responsiveness and accuracy of attendance marking under real-time conditions.
### **6. Conduct Reviews and Inspections**:
- Regularly inspect code, designs, and test results to identify potential flaws early.
- Review software updates to ensure compatibility with existing features.
### **7. Final Verification and Validation**:
- Perform **Acceptance Testing**: Engage faculty and administrators to confirm the software meets their expectations.
- Validate compliance with educational institution policies and standards.
### **8. Continuous Improvement**:
- Collect feedback from users (administrators, faculty, students) and incorporate suggestions.
- Release regular updates to enhance usability and fix bugs.


PLAN SOFTWARE QUALITY CONTROL WITH RESPECT TO SPACE RESEARCH
### **1. Define Quality Objectives**:
- Ensure the software accurately performs calculations, simulations, and data processing.
- Guarantee safety and reliability for mission-critical applications, such as navigation and communication systems.
- Adhere to industry standards like ISO 9001 and NASA's software assurance guidelines.
### **2. Develop a Testing Strategy**:
- **Functional Testing**: Verify the software's ability to handle complex computations, trajectory planning, and signal processing.
- **Performance Testing**: Assess how the software operates under extreme conditions, such as varying temperature, pressure, or data load.
- **Security Testing**: Ensure robust protection against unauthorized access, especially for sensitive mission data.
- **Stress Testing**: Evaluate the software's durability under peak loads, such as high data throughput during launch or landing.
### **3. Simulations and Real-World Scenarios**:
- Use simulators to mimic real-world conditions, such as orbit, atmospheric reentry, or planetary environments.
- Incorporate edge cases like system failure or unexpected anomalies to test software resilience.
### **4. Quality Metrics**:
- Monitor defect density, mean time to failure, and overall system stability.
- Measure precision of scientific calculations and accuracy of data analysis.
### **5. Documentation and Reporting**:
- Maintain detailed records of test results, fixes, and software versions.
- Share regular updates with stakeholders, ensuring transparency in QC processes.
### **6. Continuous Improvement**:
- Use feedback from previous missions or projects to refine software quality.
- Adapt to emerging technologies and evolving space research requirements.
### **7. Safety and Compliance Checks**:
- Conduct rigorous verification and validation to ensure compliance with mission-specific safety protocols.
- Include redundant checks to eliminate risks in critical systems.


EXAMINE RELATIONSHIP BETWEEN QUALITY AND PRODUCTIVITY
### **1. Positive Correlation**:
- High-quality processes and outputs often lead to increased productivity. For instance, fewer defects reduce the need for rework, saving time and resources, which boosts overall efficiency.
### **2. Cost vs. Efficiency Trade-offs**:
- Poor quality leads to frequent fixes and downtime, which hampers productivity and increases costs.
- On the other hand, investing in quality through robust testing and streamlined processes may require initial effort but yields better productivity in the long run.
### **3. Employee Morale**:
- High-quality systems empower employees to work effectively and feel satisfied, which improves output and productivity.
- Conversely, poor quality may frustrate workers and lead to inefficiencies.
### **4. Customer Satisfaction**:
- High-quality products ensure customer loyalty, which in turn drives demand and productive output.
- Low-quality products may result in complaints, refunds, or negative feedback, disrupting productivity.
### **5. Balance for Optimization**:
- Striving for perfect quality might sometimes slow productivity due to excessive focus on detail.
- A balanced approach ensures both quality and productivity are optimized without sacrificing either.


GIVE CLASSIFICATION OF DIFFERENT TYPES OF PRODUCTS
Products can be classified into various categories based on their characteristics, usage, and target market. Here's a broad classification:
### **1. Based on Tangibility**:
- **Tangible Products**: Physical goods that can be touched and stored (e.g., furniture, electronics, food).
- **Intangible Products**: Non-physical offerings, such as services (e.g., education, consultancy, software as a service).
### **2. Based on Consumer Use**:
- **Consumer Products**: Goods purchased for personal use.
  - *Convenience Goods*: Frequently bought with minimal effort (e.g., toothpaste, snacks).
  - *Shopping Goods*: Compared for quality and price before buying (e.g., clothes, electronics).
  - *Specialty Goods*: Unique items with brand loyalty (e.g., luxury cars, designer watches).
  - *Unsought Goods*: Not actively sought by consumers (e.g., insurance, emergency tools).
- **Industrial Products**: Goods used for production or business operations (e.g., machinery, raw materials).
### **3. Based on Durability**:
- **Durable Goods**: Long-lasting products (e.g., appliances, vehicles).
- **Non-Durable Goods**: Consumable items with short lifespans (e.g., groceries, cosmetics).
### **4. Based on Manufacturing Purpose**:
- **Raw Materials**: Basic materials used to create other products (e.g., cotton, metals).
- **Finished Goods**: Ready-to-use products (e.g., smartphones, packaged food).
- **Intermediate Goods**: Products used in the production of final goods (e.g., car parts).
### **5. Based on Branding**:
- **Branded Products**: Products from established brands (e.g., Nike, Apple).
- **Generic Products**: Non-branded or less-recognizable items.


DIFFERENCE BETWEEN TOOLS AND TECHNIQUES
### **1. Definition**:
- **Tools**: Physical or digital instruments used to perform tasks or achieve objectives (e.g., software, frameworks, equipment).
- **Techniques**: Methods or approaches applied to use tools effectively or accomplish tasks (e.g., strategies, processes).
### **2. Purpose**:
- **Tools**: Serve as enablers for work by providing functionality (e.g., Selenium for software testing).
- **Techniques**: Guide the execution by defining the steps or methodology (e.g., the black-box testing method).
### **3. Dependency**:
- **Tools**: Can exist independently; their use depends on the task (e.g., a debugger works without specific techniques).
- **Techniques**: Often require tools for practical implementation (e.g., Agile development needs collaboration tools like Jira).
### **4. Examples**:
- **Tools**: IDEs (Integrated Development Environments), testing frameworks, project management software.
- **Techniques**: Testing methods (unit testing, regression testing), software development methodologies (Waterfall, Agile).
### **5. Applicability**:
- **Tools**: Focused on execution and operational support.
- **Techniques**: Focused on optimizing processes and problem-solving.


DEFINE QUALITY AS VIEWED BY DIFFERENT STAKEHOLDERS OF SOFTWARE DEVELOPMENT AND USAGE
### **1. Customers/End-Users**:
- Quality means a user-friendly interface, reliable functionality, and error-free performance.
- They value software that meets their requirements, delivers expected features, and provides satisfaction.
### **2. Developers**:
- Quality means clean, efficient, and maintainable code.
- Developers seek adherence to coding standards and minimal bugs in the system.
### **3. Quality Assurance (QA) Team**:
- Quality is defined as compliance with predefined standards and specifications.
- QA teams focus on defect detection, test coverage, and ensuring the software passes all validation checks.
### **4. Project Managers**:
- Quality means meeting project deadlines and staying within budget while delivering software that fulfills requirements.
- They value balance between functionality, cost, and timeliness.
### **5. Organizations**:
- Quality ensures reputation, customer retention, and alignment with business goals.
- They emphasize scalability, security, and performance to maintain market competitiveness.
### **6. Regulatory Authorities**:
- Quality means adherence to industry standards and regulations, such as ISO and CMMI compliance.
- Authorities ensure the software complies with legal and ethical requirements.
### **7. Maintenance/Support Teams**:
- Quality involves ease of troubleshooting and effective updates.
- They prefer software that is well-documented and easy to maintain over its lifecycle.


CONSTRAINTS OF SOFTWARE PRODUCT QUALITY ASSESSMENT
### **1. Limited Time and Resources**:
- Deadlines may restrict the depth of quality assessments.
- Insufficient manpower or budget may limit the scope of testing and evaluation.
### **2. Incomplete Requirements**:
- Vague or changing requirements during development can hinder accurate quality assessment.
- Misunderstandings between stakeholders about expectations can lead to gaps in evaluation.
### **3. Complexity of Software**:
- Large and intricate systems with numerous modules are harder to assess comprehensively.
- Interdependencies between components add to the challenges of identifying quality issues.
### **4. Subjectivity in Metrics**:
- Defining and measuring quality often involve subjective metrics like user satisfaction, which can vary among stakeholders.
- Lack of standardized metrics can lead to inconsistent assessments.
### **5. Constraints in Testing Environment**:
- Simulating real-world conditions accurately can be challenging.
- Hardware and network limitations may restrict thorough performance testing.
### **6. Evolving Technology**:
- Rapid technological advancements may render some quality assessment tools or techniques obsolete.
- Compatibility with emerging technologies (e.g., AI, IoT) adds new challenges.
### **7. Human Factors**:
- Human errors in quality control or testing processes can affect results.
- Resistance to change from developers or organizations may impede improvement efforts.
### **8. Security and Privacy Issues**:
- Ensuring the software's security and privacy compliance requires additional resources and expertise, often creating constraints.
- Some vulnerabilities might remain undetected despite rigorous testing.


The DEFECT LIFE CYCLE (also known as the Bug Life Cycle) represents the process a defect goes through from identification to closure. It starts when a defect is identified and ends when it is closed after verification.
### Stages in the Defect Life Cycle:
1. **New**: A defect is identified and logged.
2. **Assigned**: The defect is assigned to the responsible developer or team.
3. **Open**: The developer starts analyzing and fixing the defect.
4. **Fixed**: The defect is resolved or fixed by the developer.
5. **Retest**: The tester retests the defect to ensure it's resolved.
6. **Verified**: The defect is verified as fixed if it passes the retest.
7. **Closed**: The defect is marked as closed when no further issues are found.
8. **Reopened**: If the defect still exists during retesting, it is reopened and goes through the cycle again.
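The stages above can be sketched as a small state machine. The transition table below is one interpretation of the text, assumed for illustration: Retest either passes (Verified) or fails (Reopened), and Reopened feeds back into Open.

```python
# Defect life cycle as a state machine: each stage maps to the set of
# stages it may legally move to next.
TRANSITIONS = {
    "New":      {"Assigned"},
    "Assigned": {"Open"},
    "Open":     {"Fixed"},
    "Fixed":    {"Retest"},
    "Retest":   {"Verified", "Reopened"},  # retest passes or fails
    "Verified": {"Closed"},
    "Reopened": {"Open"},                  # the cycle repeats
    "Closed":   set(),                     # terminal stage
}

def advance(state, next_state):
    """Move a defect to next_state if the transition is allowed."""
    if next_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {next_state}")
    return next_state

# Happy path: New -> Assigned -> Open -> Fixed -> Retest -> Verified -> Closed
state = "New"
for nxt in ["Assigned", "Open", "Fixed", "Retest", "Verified", "Closed"]:
    state = advance(state, nxt)
print(state)  # Closed
```

A defect that fails retest would instead take `advance("Retest", "Reopened")` and re-enter the cycle at Open.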
EXPLAIN TEST EFFICIENCY AND DEFECT REJECTION
### 1. Test Efficiency:
- It measures the effectiveness of the testing process in identifying defects.
- **Formula**:
$$ \text{Test Efficiency} = \frac{\text{Number of Valid Defects}}{\text{Total Number of Defects Introduced in the Application}} \times 100 $$
- A higher percentage indicates that the testing process successfully detected more defects.
### Example:
If 80 defects are introduced during development and testing identifies 60 valid defects, the test efficiency is:
$$ \text{Test Efficiency} = \frac{60}{80} \times 100 = 75\% $$
### 2. Defect Rejection:
- It represents the percentage of defects reported by testers that are considered invalid or not reproducible.
- **Formula**:
$$ \text{Defect Rejection Rate} = \frac{\text{Number of Rejected Defects}}{\text{Total Number of Defects Reported}} \times 100 $$
- A lower defect rejection rate indicates that testers are reporting valid issues.
### Example:
If 40 defects are reported and 5 are rejected, the defect rejection rate is:
$$ \text{Defect Rejection Rate} = \frac{5}{40} \times 100 = 12.5\% $$


WHAT ARE ENTRY AND EXIT CRITERIA OF TESTING
#### **Entry Criteria**:
1. **Requirement Documents**: Complete and approved functional specifications.
2. **Test Plan and Test Cases**: Test plans and test cases are reviewed and ready.
3. **Test Environment Setup**: The test environment is properly configured.
4. **Availability of Test Data**: Necessary test data is created and available.
5. **Prerequisites Completed**: Code is unit-tested and deployed to the test environment.
#### **Exit Criteria**:
1. **Defect Resolution**: All critical and high-severity defects are resolved.
2. **Test Coverage**: Required test cases are executed and passed.
3. **Metrics Review**: Testing metrics like defect density and test coverage meet expectations.
4. **Sign-Off**: Formal approval from stakeholders.
5. **Regression Testing**: Regression tests are complete and verified.


DEFINE CULTURAL CHANGES REQUIRED FOR QUALITY IMPROVEMENT
1. **Leadership Commitment**: Strong commitment from leadership to drive a quality-first approach across all levels.
2. **Employee Empowerment**: Encouraging employees to take ownership of quality and actively contribute to improvements.
3. **Collaborative Environment**: Promoting teamwork and open communication to share ideas and resolve quality issues collectively.
4. **Customer-Centric Focus**: Aligning organizational goals with customer needs and feedback.
5. **Continuous Learning**: Fostering a culture of ongoing learning and skill development to adapt to quality standards.
6. **Proactive Problem-Solving**: Shifting from reactive to proactive approaches in identifying and resolving issues.
7. **Performance Metrics and Accountability**: Establishing clear quality metrics and ensuring accountability for meeting them.
8. **Innovation and Adaptability**: Encouraging creative thinking and flexibility to implement effective quality improvement strategies.


EXPLAIN USE CASE TESTING WITH ONE EXAMPLE
Use Case Testing is a functional testing technique used to identify test scenarios based on the "use cases" of a system. A use case represents a specific way in which a user interacts with the system to achieve a goal. This technique ensures that the system works as expected in real-world user scenarios.
Key Steps in Use Case Testing:
1. Identify the use cases from system requirements or documentation.
2. Create test cases that correspond to each step in the use case.
3. Test various paths, including the normal (happy path) and alternative flows (e.g., exceptions, errors).
**Use Case**: User withdraws cash from an ATM.
**Actors**: User, ATM system.
#### Test Scenarios:
1. **Normal Flow (Happy Path)**:
- User inserts a valid debit card.
- User enters the correct PIN.
- User selects "Withdraw Cash" and specifies the amount.
- ATM dispenses cash, and the user's account is debited.
2. **Alternative Flows**:
- User enters an incorrect PIN. System denies access and prompts the user to re-enter the PIN.
- User exceeds the daily withdrawal limit. System rejects the transaction and displays an error message.
- ATM runs out of cash. System shows a notification and ends the transaction.


DIFFERENCE BETWEEN TEST PLAN AND TEST STRATEGY
### **Test Plan**:
1. It is a detailed document that defines the scope, approach, resources, and schedule for specific testing activities.
2. Focuses on testing activities for a particular project or release.
3. Includes elements like test objectives, scope, resources, schedule, test deliverables, etc.
4. Created specifically for individual projects.
5. Owned by the test manager or test lead for the project.
6. Contains detailed plans for testing execution.
### **Test Strategy**:
1. It is a high-level document that outlines the overall testing approach and methodology.
2. Focuses on testing processes at the organization-wide level across multiple projects.
3. Includes methods, tools, standards, and techniques for testing.
4. Created to provide general guidelines and standards for multiple projects.
5. Owned by senior management or the QA team at the organizational level.
6. Defines the framework for testing, not detailed execution plans.


DESCRIBE QUALITY ASSURANCE PROCESS
Quality Assurance (QA) is a systematic process to ensure that a product meets predefined standards and customer expectations. It focuses on **preventing defects** during the development lifecycle rather than detecting them later. Here are the key steps in the QA process:
1. **Requirement Analysis**:
- Understand and analyze project requirements to set quality standards.
- Define acceptance criteria and quality objectives.
2. **Planning**:
- Develop a QA plan, including strategies, tools, schedules, and resources.
- Define roles and responsibilities within the QA team.
3. **Test Design**:
- Design test cases, test scenarios, and test scripts based on requirements.
- Create clear documentation for the testing process.
4. **Test Environment Setup**:
- Prepare the testing environment, including hardware, software, and test data.
- Ensure the environment replicates real-world conditions.
5. **Execution**:
- Perform various types of testing (e.g., functional, regression, performance) to verify product quality.
- Log defects, track progress, and communicate results.
6. **Monitoring and Reporting**:
- Measure testing metrics like defect density and test coverage.
- Generate test reports to provide insights into product quality.
7. **Review and Audit**:
- Conduct reviews and audits to ensure adherence to quality standards.
- Verify that processes are being followed correctly.
8. **Continuous Improvement**:
- Gather feedback from testing outcomes and stakeholders.
- Implement changes to improve future QA processes.


DEFINE AND EXPLAIN CONFIGURATION MANAGEMENT
**Definition**: Configuration Management (CM) is the process of systematically managing changes in a system's configurations, ensuring consistency and reliability.
### Key Activities:
1. **Configuration Identification**: Labeling components like software versions for tracking.
2. **Configuration Control**: Authorizing and managing changes to configurations.
3. **Status Accounting**: Maintaining records of configuration changes.
4. **Configuration Auditing**: Verifying compliance with standards.
**Importance**: CM ensures consistent system performance, prevents unauthorized changes, and enhances traceability in development.


GIVE DIFFERENCE BETWEEN VERIFICATION AND VALIDATION
### **Verification**:
1. It ensures the product is being built correctly according to specifications.
2. Focuses on design and technical requirements compliance.
3. Uses **static testing** methods like reviews, walkthroughs, and inspections.
4. Performed during the development phase, before testing begins.
5. Examples include reviewing requirement documents, design, and code.
### **Validation**:
1. It ensures the right product is being built to meet user needs.
2. Focuses on the product's functionality and purpose.
3. Uses **dynamic testing** methods, including functional and non-functional testing.
4. Performed during or after the testing phase.
5. Examples include executing test cases to validate system functionalities.


TYPES OF TEST ARTIFACTS
Test artifacts are documents and deliverables generated during the testing process to ensure effective communication, tracking, and quality assurance. Here are the main types:
1. **Test Plan**: Defines the scope, approach, resources, and schedule for testing activities.
2. **Test Cases**: Documents outlining specific conditions under which a test is conducted to verify functionality.
3. **Test Scenarios**: High-level descriptions of what needs to be tested, focusing on end-to-end functionality.
4. **Test Data**: Input data prepared for executing test cases.
5. **Bug Reports**: Records of identified defects, including details like severity, priority, and steps to reproduce.
6. **Test Summary Reports**: Summaries of the results of testing activities, including pass/fail rates and defect metrics.
7. **Traceability Matrix**: Maps requirements to test cases to ensure coverage.
8. **Automation Scripts**: Code used for automated testing to streamline repetitive tasks.
9. **Test Logs**: Logs of activities and outcomes recorded during testing.
10. **Metrics and Reports**: Key performance indicators and test statistics used for analysis.
