Stqa Insem Pyq

SDLC PHASES OF A FOOD DELIVERY APPLICATION
### 1. **Requirement Analysis**:
- Understand user needs: ordering food online, tracking delivery, payment options, restaurant ratings.
- Gather functional and non-functional requirements, like speed and scalability.
### 2. **Planning**:
- Define project scope, budget, and timeline.
- Identify resources (developers, testers, designers).
- Prepare risk management strategies.
### 3. **System Design**:
- Create architectural diagrams and UI/UX designs.
- Define database structure and APIs.
- Plan for app features like search filters, order history, and notifications.
### 4. **Development**:
- Implement features using programming languages (e.g., Java, Python).
- Frontend development for the user interface (React, Angular).
- Backend development for server logic (Node.js, Django).
### 5. **Testing**:
- Perform unit, integration, and system testing.
- Check functionality, security, and performance.
- Fix bugs and ensure the app runs smoothly on different devices.
### 6. **Deployment**:
- Deploy the app on platforms like the Google Play Store and Apple App Store.
- Set up servers for backend operations.
### 7. **Maintenance**:
- Monitor app performance and user feedback.
- Release updates and patches for new features or bug fixes.
- Ensure continuous system improvements.

PILLARS OF A QUALITY MANAGEMENT SYSTEM (QMS)
The pillars of a Quality Management System (QMS) form the foundation for consistently delivering products or services that meet customer expectations while ensuring continuous improvement. Here are the core pillars:
### 1. **Customer Focus**:
- Meeting and exceeding customer needs is the primary aim.
- Understanding customer requirements and striving for satisfaction ensures loyalty and trust.
### 2. **Leadership**:
- Strong leadership establishes clear goals, vision, and direction.
- Leaders create an environment where employees feel motivated and aligned with the organization's objectives.
### 3. **Engagement of People**:
- Employees at all levels are critical to the organization's success.
- Encouraging involvement, skill development, and teamwork enhances overall performance.
### 4. **Process Approach**:
- Managing activities as interconnected processes boosts efficiency and consistency.
- This approach optimizes resource utilization and minimizes errors.
### 5. **Continuous Improvement**:
- Embracing innovation and regularly updating processes leads to higher efficiency and quality.
- Organizations use tools like PDCA (Plan-Do-Check-Act) to drive improvements.
### 6. **Evidence-Based Decision Making**:
- Decisions are made using accurate data and analysis.
- This ensures better control and alignment with organizational goals.
### 7. **Relationship Management**:
- Building and maintaining strong relationships with stakeholders, including suppliers, enhances mutual trust and long-term success.

DIFFERENCE BETWEEN TESTING AND DEBUGGING
### 1. **Definition**:
- **Testing**: The process of systematically identifying and reporting defects or bugs in software by executing it under controlled conditions.
- **Debugging**: The process of diagnosing, locating, and fixing the root causes of the defects found during testing or normal usage.
### 2. **Objective**:
- **Testing**: Focuses on evaluating software quality and ensuring it meets requirements.
- **Debugging**: Focuses on correcting the issues identified to ensure proper functionality.
### 3. **Who Performs It**:
- **Testing**: Typically conducted by dedicated **testers** or **Quality Assurance (QA)** teams.
- **Debugging**: Performed by **developers** or coders responsible for the code.
### 4. **Output**:
- **Testing**: Produces test reports that document defects and software quality.
- **Debugging**: Results in corrected code and a working software solution.

CHALLENGES AND PROBLEMATIC AREAS IN SDLC
1. **Requirement Gathering and Analysis**:
- Misunderstood or incomplete requirements can lead to software that doesn't meet client expectations.
- Changing requirements during development adds complexity.
2. **Planning**:
- Unrealistic project schedules and budgets often result in delays and overspending.
- Poor risk management leads to unpreparedness for unforeseen issues.
3. **Design Phase**:
- Inadequate or flawed design can cause issues during development and testing.
- Lack of proper documentation makes maintenance difficult.
4. **Development Phase**:
- Coding errors or lack of adherence to coding standards can lead to defective software.
- Inadequate collaboration among team members may slow progress.
5. **Testing Phase**:
- Limited test coverage may allow undetected defects to pass through.
- Delayed testing results in expensive bug fixes in later stages.
6. **Deployment**:
- Poorly planned deployment can lead to compatibility or performance issues on user platforms.
- Insufficient training for end-users may reduce adoption.
7. **Maintenance**:
- Lack of timely updates or patches can result in security vulnerabilities.
- Difficulty in scaling the application may lead to user dissatisfaction.

PLAN SOFTWARE QUALITY CONTROL WITH RESPECT TO COLLEGE ATTENDANCE SOFTWARE
### **1. Define Quality Objectives**:
- Ensure accurate attendance recording.
- Guarantee data security for student information.
- Deliver a user-friendly interface for administrators, faculty, and students.
### **2. Develop a Testing Strategy**:
- **Functional Testing**: Test core features like attendance marking, report generation, and editing attendance records.
- **Usability Testing**: Check if the software is easy to navigate for all users.
- **Performance Testing**: Ensure smooth operation under heavy user loads (e.g., during class registration).
- **Security Testing**: Validate that student data is encrypted and protected from unauthorized access.
### **3. Prepare Test Data**:
- Use real-life scenarios such as adding new students, handling absenteeism, and generating monthly attendance reports.
- Include edge cases like network failure or incorrect data entry.
### **4. Execute Testing in Phases**:
- **Unit Testing**: Verify individual features like attendance marking or user login.
- **Integration Testing**: Ensure modules like attendance and notifications work seamlessly together.
- **System Testing**: Test the complete software to identify issues in overall functionality.
### **5. Set Quality Metrics**:
- Monitor defect rates, resolution time for issues, and system uptime.
- Check responsiveness and accuracy of attendance marking under real-time conditions.
### **6. Conduct Reviews and Inspections**:
- Regularly inspect code, designs, and test results to identify potential flaws early.
- Review software updates to ensure compatibility with existing features.
### **7. Final Verification and Validation**:
- Perform **Acceptance Testing**: Engage faculty and administrators to confirm the software meets their expectations.
- Validate compliance with educational institution policies and standards.
### **8. Continuous Improvement**:
- Collect feedback from users (administrators, faculty, students) and incorporate suggestions.
- Release regular updates to enhance usability and fix bugs.

PLAN SOFTWARE QUALITY CONTROL WITH RESPECT TO SPACE RESEARCH
### **1. Define Quality Objectives**:
- Ensure the software accurately performs calculations, simulations, and data processing.
- Guarantee safety and reliability for mission-critical applications, such as navigation and communication systems.
- Adhere to industry standards like ISO 9001 and NASA's software assurance guidelines.
### **2. Develop a Testing Strategy**:
- **Functional Testing**: Verify the software's ability to handle complex computations, trajectory planning, and signal processing.
- **Performance Testing**: Assess how the software operates under extreme conditions, such as varying temperature, pressure, or data load.
- **Security Testing**: Ensure robust protection against unauthorized access, especially for sensitive mission data.
- **Stress Testing**: Evaluate the software's durability under peak loads, such as high data throughput during launch or landing.
### **3. Simulations and Real-World Scenarios**:
- Use simulators to mimic real-world conditions, such as orbit, atmospheric reentry, or planetary environments.
- Incorporate edge cases like system failure or unexpected anomalies to test software resilience.
### **4. Quality Metrics**:
- Monitor defect density, mean time to failure, and overall system stability.
- Measure the precision of scientific calculations and the accuracy of data analysis.
### **5. Documentation and Reporting**:
- Maintain detailed records of test results, fixes, and software versions.
- Share regular updates with stakeholders, ensuring transparency in QC processes.
### **6. Continuous Improvement**:
- Use feedback from previous missions or projects to refine software quality.
- Adapt to emerging technologies and evolving space research requirements.
### **7. Safety and Compliance Checks**:
- Conduct rigorous verification and validation to ensure compliance with mission-specific safety protocols.
- Include redundant checks to eliminate risks in critical systems.

EXAMINE THE RELATIONSHIP BETWEEN QUALITY AND PRODUCTIVITY
### **1. Positive Correlation**:
- High-quality processes and outputs often lead to increased productivity. For instance, fewer defects reduce the need for rework, saving time and resources, which boosts overall efficiency.
### **2. Cost vs. Efficiency Trade-offs**:
- Poor quality leads to frequent fixes and downtime, which hampers productivity and increases costs.
- On the other hand, investing in quality through robust testing and streamlined processes may require initial effort but yields better productivity in the long run.
### **3. Employee Morale**:
- High-quality systems empower employees to work effectively and feel satisfied, which improves output and productivity.
- Conversely, poor quality may frustrate workers and lead to inefficiencies.
### **4. Customer Satisfaction**:
- High-quality products ensure customer loyalty, which in turn drives demand and productive output.
- Low-quality products may result in complaints, refunds, or negative feedback, disrupting productivity.
### **5. Balance for Optimization**:
- Striving for perfect quality might sometimes slow productivity due to excessive focus on detail.
- A balanced approach ensures both quality and productivity are optimized without sacrificing either.

GIVE CLASSIFICATION OF DIFFERENT TYPES OF PRODUCTS
Products can be classified into various categories based on their characteristics, usage, and target market. Here's a broad classification:
### **1. Based on Tangibility**:
- **Tangible Products**: Physical goods that can be touched and stored (e.g., furniture, electronics, food).
- **Intangible Products**: Non-physical offerings, such as services (e.g., education, consultancy, software as a service).
### **2. Based on Consumer Use**:
- **Consumer Products**: Goods purchased for personal use.
  - *Convenience Goods*: Frequently bought with minimal effort (e.g., toothpaste, snacks).
  - *Shopping Goods*: Compared for quality and price before buying (e.g., clothes, electronics).
  - *Specialty Goods*: Unique items with brand loyalty (e.g., luxury cars, designer watches).
  - *Unsought Goods*: Not actively sought by consumers (e.g., insurance, emergency tools).
- **Industrial Products**: Goods used for production or business operations (e.g., machinery, raw materials).
### **3. Based on Durability**:
- **Durable Goods**: Long-lasting products (e.g., appliances, vehicles).
- **Non-Durable Goods**: Consumable items with short lifespans (e.g., groceries, cosmetics).
### **4. Based on Manufacturing Purpose**:
- **Raw Materials**: Basic materials used to create other products (e.g., cotton, metals).
- **Finished Goods**: Ready-to-use products (e.g., smartphones, packaged food).
- **Intermediate Goods**: Products used in the production of final goods (e.g., car parts).
### **5. Based on Branding**:
- **Branded Products**: Products from established brands (e.g., Nike, Apple).
- **Generic Products**: Non-branded or less-recognizable items.

DIFFERENCE BETWEEN TOOLS AND TECHNIQUES
### **1. Definition**:
- **Tools**: Physical or digital instruments used to perform tasks or achieve objectives (e.g., software, frameworks, equipment).
- **Techniques**: Methods or approaches applied to use tools effectively or accomplish tasks (e.g., strategies, processes).
### **2. Purpose**:
- **Tools**: Serve as enablers for work by providing functionality (e.g., Selenium for software testing).
- **Techniques**: Guide the execution by defining the steps or methodology (e.g., the black-box testing method).
### **3. Dependency**:
- **Tools**: Can exist independently; their use depends on the task (e.g., a debugger works without specific techniques).
- **Techniques**: Often require tools for practical implementation (e.g., Agile development needs collaboration tools like Jira).
### **4. Examples**:
- **Tools**: IDEs (Integrated Development Environments), testing frameworks, project management software.
- **Techniques**: Testing methods (unit testing, regression testing), software development methodologies (Waterfall, Agile).
### **5. Applicability**:
- **Tools**: Focused on execution and operational support.
- **Techniques**: Focused on optimizing processes and problem-solving.

DEFINE QUALITY AS VIEWED BY DIFFERENT STAKEHOLDERS OF SOFTWARE DEVELOPMENT AND USAGE
### **1. Customers/End-Users**:
- Quality means a user-friendly interface, reliable functionality, and error-free performance.
- They value software that meets their requirements, delivers expected features, and provides satisfaction.
### **2. Developers**:
- Quality means clean, efficient, and maintainable code.
- Developers seek adherence to coding standards and minimal bugs in the system.
### **3. Quality Assurance (QA) Team**:
- Quality is defined as compliance with predefined standards and specifications.
- QA teams focus on defect detection, test coverage, and ensuring the software passes all validation checks.
### **4. Project Managers**:
- Quality means meeting project deadlines and staying within budget while delivering software that fulfills requirements.
- They value the balance between functionality, cost, and timeliness.
### **5. Organizations**:
- Quality ensures reputation, customer retention, and alignment with business goals.
- They emphasize scalability, security, and performance to maintain market competitiveness.
### **6. Regulatory Authorities**:
- Quality means adherence to industry standards and regulations, such as ISO and CMMI compliance.
- Authorities ensure the software complies with legal and ethical requirements.
### **7. Maintenance/Support Teams**:
- Quality involves ease of troubleshooting and effective updates.
- They prefer software that is well-documented and easy to maintain over its lifecycle.

DEFECT LIFE CYCLE
1. **New**: The defect is reported for the first time.
2. **Assigned**: The defect is assigned to a developer.
3. **Open**: The developer analyzes the defect and works on a fix.
4. **Fixed**: The defect is resolved or fixed by the developer.
5. **Retest**: The tester retests the defect to ensure it's resolved.
6. **Verified**: The defect is verified as fixed if it passes the retest.
7. **Closed**: The defect is marked as closed when no further issues are found.
8. **Reopened**: If the defect still exists during testing, it is reopened and goes through the cycle again.

CONSTRAINTS OF SOFTWARE PRODUCT QUALITY ASSESSMENT
### **1. Limited Time and Resources**:
- Deadlines may restrict the depth of quality
assessments.
- Insufficient manpower or budget may limit the scope
of testing and evaluation.
### **2. Incomplete Requirements**:
- Vague or changing requirements during
development can hinder accurate quality assessment.
- Misunderstandings between stakeholders about
expectations can lead to gaps in evaluation.
### **3. Complexity of Software**:
- Large and intricate systems with numerous modules
are harder to assess comprehensively.
- Interdependencies between components add to the
challenges of identifying quality issues.
### **4. Subjectivity in Metrics**:
- Defining and measuring quality often involve
subjective metrics like user satisfaction, which can vary
among stakeholders.
- Lack of standardized metrics can lead to
inconsistent assessments.
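One way to counter this subjectivity is to agree on simple, objective formulas up front. A minimal Python sketch of two widely used measures, defect density (defects per KLOC) and defect removal efficiency; the function names and figures are illustrative, not from any particular standard:

```python
def defect_density(defects_found: int, kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / kloc


def defect_removal_efficiency(found_before_release: int,
                              found_after_release: int) -> float:
    """Fraction of all known defects caught before release."""
    total = found_before_release + found_after_release
    return found_before_release / total


# Illustrative figures: 45 defects found in a 30 KLOC system,
# and 5 more reported after release.
print(defect_density(45, 30.0))          # 1.5 defects per KLOC
print(defect_removal_efficiency(45, 5))  # 0.9
```

Numbers like these can be tracked release over release, which gives stakeholders a shared baseline even when their subjective impressions of quality differ.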
### **5. Constraints in Testing Environment**:
- Simulating real-world conditions accurately can be
challenging.
- Hardware and network limitations may restrict
thorough performance testing.
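When a dedicated performance-testing environment is out of reach, a rough local approximation can still expose gross problems. A minimal sketch using Python's standard library to issue concurrent requests against a stand-in handler; `handle_request` here is a hypothetical placeholder for real request logic, not a real API:

```python
import concurrent.futures
import time


def handle_request(i: int) -> float:
    """Stand-in for a real request handler; returns its own latency."""
    start = time.perf_counter()
    sum(range(10_000))  # placeholder for actual request work
    return time.perf_counter() - start


# Approximate 500 requests arriving from 50 concurrent "users".
with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    latencies = list(pool.map(handle_request, range(500)))

print(f"requests: {len(latencies)}, max latency: {max(latencies):.6f}s")
```

Such a local run cannot replace testing on production-like hardware and networks, but it is a cheap first check for latency spikes under concurrency.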
### **6. Evolving Technology**:
- Rapid technological advancements may render
some quality assessment tools or techniques obsolete.
- Compatibility with emerging technologies (e.g., AI,
IoT) adds new challenges.
### **7. Human Factors**:
- Human errors in quality control or testing processes
can affect results.
- Resistance to change from developers or
organizations may impede improvement efforts.
### **8. Security and Privacy Issues**:
- Ensuring the software’s security and privacy
compliance requires additional resources and expertise,
often creating constraints.
- Some vulnerabilities might remain undetected
despite rigorous testing.
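The defect life cycle described earlier maps naturally onto a state machine, which is how bug trackers typically enforce it. A minimal Python sketch; the stages Fixed, Retest, Verified, Closed, and Reopened come from the notes above, while the initial New → Assigned → Open stages are the conventional ones and may differ per tracker:

```python
# Allowed transitions of the defect life cycle, modeled as a state machine.
ALLOWED = {
    "New": {"Assigned"},
    "Assigned": {"Open"},
    "Open": {"Fixed"},
    "Fixed": {"Retest"},
    "Retest": {"Verified", "Reopened"},  # passes retest, or fails again
    "Verified": {"Closed"},
    "Reopened": {"Assigned"},            # goes through the cycle again
    "Closed": set(),
}


class Defect:
    def __init__(self, defect_id: str):
        self.defect_id = defect_id
        self.state = "New"

    def move_to(self, new_state: str) -> None:
        """Apply a transition, rejecting any move the cycle does not allow."""
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state


# A defect that is fixed, fails its first retest, and is fixed again:
d = Defect("BUG-101")
for s in ["Assigned", "Open", "Fixed", "Retest", "Reopened",
          "Assigned", "Open", "Fixed", "Retest", "Verified", "Closed"]:
    d.move_to(s)
print(d.state)  # Closed
```

Encoding the cycle as data rather than ad hoc checks makes illegal moves (e.g., closing a defect that was never retested) fail loudly instead of silently corrupting the defect report.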