
✅ Manual QA Interview Answer Key (Q1–Q10)

1. What is software testing and why is it important?

• Verifies software functionality and quality
• Detects bugs early to ensure a reliable, user-friendly product
• Helps maintain standards, customer trust, and compliance

2. Difference between verification and validation

• Verification: Are we building the product right? (Design, documents)
• Validation: Are we building the right product? (Testing the application)

3. Types of software testing

• Functional Testing: UI, APIs, database, security, etc.
• Non-Functional Testing: Performance, load, usability
• Other Types: Unit, Integration, System, Acceptance
• Manual vs Automation, Black Box vs White Box

4. Smoke vs Sanity vs Regression

• Smoke Testing: Basic check of critical functionality in a new build
• Sanity Testing: Quick test after minor changes/fixes
• Regression Testing: Ensures old functionality still works after new changes

5. What is exploratory testing?

• Simultaneous learning, testing, and exploring the app without scripted cases
• Helpful when documentation is incomplete or during tight deadlines
• Often uncovers edge-case bugs

6. Key components of a test case

• Test Case ID and Title
• Preconditions
• Test Steps and Test Data
• Expected Result
• Actual Result
• Status (Pass/Fail)
• Comments or Attachments if applicable (see the example structure below)
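One way to picture these components together: a minimal Python sketch using a hypothetical dataclass whose field names are illustrative only (no specific test-management tool is implied).

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestCase:
    """Illustrative container for the standard test case components."""
    case_id: str                    # Test Case ID, e.g. "TC-LOGIN-001"
    title: str                      # Short, descriptive title
    preconditions: List[str]        # State required before execution
    steps: List[str]                # Ordered actions, with test data inline
    expected_result: str            # What should happen if the feature works
    actual_result: Optional[str] = None  # Filled in during execution
    status: Optional[str] = None         # "Pass" / "Fail" after execution
    comments: List[str] = field(default_factory=list)  # Notes, attachment links

tc = TestCase(
    case_id="TC-LOGIN-001",
    title="Login with valid credentials",
    preconditions=["User account exists and is active"],
    steps=["Open the login page", "Enter a valid username and password", "Click Login"],
    expected_result="User lands on the dashboard",
)
print(tc.case_id, "-", tc.title)
```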

7. Boundary Value Analysis (BVA) vs Equivalence Partitioning (EP)

• BVA: Test minimum, maximum, just inside/outside boundaries (e.g., 17, 18, 60, 61)
• EP: Divide data into partitions/groups where test behavior is expected to be the same
o (e.g., valid age: 18–60 → test one value from each group; see the sketch below)
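As a worked example for the age rule above, here is a hedged pytest sketch; validate_age is a hypothetical function written only to illustrate the two techniques.

```python
import pytest

def validate_age(age: int) -> bool:
    """Hypothetical rule under test: valid ages are 18-60 inclusive."""
    return 18 <= age <= 60

# BVA: values at and just outside each boundary of the 18-60 range.
@pytest.mark.parametrize("age, expected", [
    (17, False),  # just below the lower boundary
    (18, True),   # lower boundary
    (60, True),   # upper boundary
    (61, False),  # just above the upper boundary
])
def test_boundary_values(age, expected):
    assert validate_age(age) == expected

# EP: one representative value per partition (below / inside / above).
@pytest.mark.parametrize("age, expected", [
    (5, False),   # partition: below the valid range
    (35, True),   # partition: inside the valid range
    (90, False),  # partition: above the valid range
])
def test_equivalence_partitions(age, expected):
    assert validate_age(age) == expected
```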

8. Test cases for a login page

• Valid credentials (positive scenario)


• Invalid credentials (wrong username/password)
• Empty username/password fields
• Password field masking
• SQL injection attempt
• Forgot password link
• Session timeout/login expiration (see the pytest sketch below)
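A few of these scenarios, sketched as pytest cases against a toy login function; the function, username, and password here are hypothetical stand-ins for whatever UI or API a real suite would drive.

```python
import pytest

# Hypothetical stand-in for the system under test; a real suite would drive
# the actual UI or API instead of this toy function.
VALID_USER, VALID_PASS = "alice", "s3cret!"

def login(username: str, password: str) -> str:
    if not username or not password:
        return "error: required field empty"
    if username == VALID_USER and password == VALID_PASS:
        return "ok"
    return "error: invalid credentials"

def test_valid_credentials():  # positive scenario
    assert login(VALID_USER, VALID_PASS) == "ok"

@pytest.mark.parametrize("user, pwd", [
    ("alice", "wrong"),      # wrong password
    ("mallory", "s3cret!"),  # wrong username
])
def test_invalid_credentials(user, pwd):  # negative scenario
    assert login(user, pwd) == "error: invalid credentials"

@pytest.mark.parametrize("user, pwd", [("", "s3cret!"), ("alice", ""), ("", "")])
def test_empty_fields(user, pwd):
    assert login(user, pwd) == "error: required field empty"

def test_sql_injection_attempt():
    # An injection string must be treated as ordinary, invalid input.
    assert login("' OR '1'='1", "x") == "error: invalid credentials"
```
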
9. Test Scenario vs Test Case

• Test Scenario: High-level description of what to test (e.g., "Test login feature")
• Test Case: Detailed steps to execute the test scenario

10. How to ensure full test coverage

• Map test cases to all requirements (Requirement Traceability Matrix – RTM)
• Use techniques like BVA, EP, Decision Tables
• Cover positive, negative, edge, and exception scenarios
• Review with developers/BAs and run peer reviews (a minimal RTM sketch follows)
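A minimal sketch of the RTM idea in Python, using made-up requirement and test case IDs: map each requirement to its covering test cases and flag the gaps.

```python
# Requirement -> covering test cases; an empty list is a coverage gap.
rtm = {
    "REQ-001 login":           ["TC-LOGIN-001", "TC-LOGIN-002"],
    "REQ-002 password reset":  ["TC-RESET-001"],
    "REQ-003 session timeout": [],  # not yet covered
}

uncovered = [req for req, cases in rtm.items() if not cases]
print("Uncovered requirements:", uncovered or "none")
```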


✅ Manual QA Interview Answer Key (Q11–Q30)

11. What makes a good bug report?

• Clear and concise title
• Steps to reproduce
• Expected vs actual result
• Attachments (screenshots/logs)
• Severity and priority
• Environment details

12. Severity vs Priority

• Severity: Technical impact (e.g., system crash = high severity)
• Priority: Urgency of fixing (e.g., a marketing-page bug right before a campaign launch = high priority, even if technically minor)

13. Bug tracking tools

• JIRA, Azure DevOps, Bugzilla, Trello (for lightweight projects)
• Used for assigning, commenting on, tracking, and reporting bugs

14. Bug life cycle

• New → Assigned → Open → Fixed → Retest → Verified → Closed
• Additional states: Rejected, Duplicate, Deferred, Reopened (see the sketch below)
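To make the flow concrete, here is an illustrative Python sketch that models the allowed transitions as data; no particular bug tracker implements the cycle exactly this way.

```python
# Allowed bug state transitions: the main flow plus the additional states.
TRANSITIONS = {
    "New":      {"Assigned", "Rejected", "Duplicate", "Deferred"},
    "Assigned": {"Open"},
    "Open":     {"Fixed", "Rejected", "Deferred"},
    "Fixed":    {"Retest"},
    "Retest":   {"Verified", "Reopened"},
    "Reopened": {"Assigned"},
    "Verified": {"Closed"},
}

def move(state: str, new_state: str) -> str:
    """Advance a bug to new_state, rejecting transitions the cycle forbids."""
    if new_state not in TRANSITIONS.get(state, set()):
        raise ValueError(f"Illegal transition: {state} -> {new_state}")
    return new_state

state = "New"
for step in ["Assigned", "Open", "Fixed", "Retest", "Verified", "Closed"]:
    state = move(state, step)
print("Bug reached:", state)  # Bug reached: Closed
```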

15. Developer disagrees with bug?

• Recheck reproduction steps
• Provide supporting screenshots/logs
• Reference requirements
• Escalate to BA/lead if needed

16. Steps to reproduce customer issue

• Gather steps and environment info
• Reproduce on a matching test setup
• Use logs or session replay tools
• Document the process

17. Using Wireshark/CleverTap

• Wireshark: Analyze network traffic/API calls
• CleverTap/UserExperior: Review user sessions and behavior

18. Example of resolving a production issue

• Identified the issue using logs/session replay
• Escalated with details
• Verified and monitored the fix

19. Bug not reproducible?

• Check for environment mismatch
• Attempt alternative flows
• Request more input or video proof
• Defer or close with documentation

20. How to escalate unresolved issues?

• Update ticket status/comments
• Assign or notify the relevant lead
• Describe business/user impact
• Follow up until resolved

21. Gathering requirements from stakeholders

• Participate in grooming meetings
• Ask clarifying questions
• Review Figma, BRDs
• Confirm via meeting notes

22. Explaining technical issues simply

• Use real-world analogies
• Focus on user/business impact
• Avoid technical jargon

23. Coordinating with teams

• Attend daily standups and sprint planning
• Use Jira or equivalent tools
• Share blockers, timelines, and goals

24. Communicating a bug across teams

• Create a central ticket
• Assign responsibilities per team
• Hold sync call if needed
• Keep documentation up-to-date

25. Handling conflicting priorities

• Consult Product Owner or Business Analyst
• Use a severity/priority matrix
• Document risks and decisions
• Propose compromise/test strategy

26. Critical bug before release

• Immediately inform leads
• Suggest rollback or patch fix
• Ensure full retesting
• Support decision to delay if necessary

27. Limited time to test

• Focus on smoke/sanity tests
• Prioritize critical paths
• Log missed tests as tech debt
• Communicate risk clearly

28. Business ignores serious bug

• Present user/business impact with data
• Request written confirmation for deferral
• Document decisions for audit
• Offer phased fix if possible

29. Triage multiple escalations

• Sort by severity and impact
• Handle blockers and critical bugs first
• Communicate progress to stakeholders

30. Blocked by incomplete requirements

• Mark the blocker in the ticketing tool
• Notify BA/PO
• Test known scope with assumptions
• Flag uncertainties with notes
