
MAHARISHI VIDYA MANDIR MOGAPPAIR

CLASS X - ARTIFICIAL INTELLIGENCE

TOPIC: NLP QBANK

1. What does NLP stand for?

a) Natural Logic Processing b) Natural Language Processing c) Natural Learning Processing d) Non-Language Processing

Answer: b) Natural Language Processing

2. Which domain does NLP fall under?

a) Data Science b) Computer Vision c) Artificial Intelligence d) Robotics

Answer: c) Artificial Intelligence

3. What type of data does NLP deal with?

a) Tabular data b) Visual data c) Numerical data d) Natural Language data

Answer: d) Natural Language data

4. What is the main goal of NLP?

a) To process images b) To understand human languages c) To analyze financial data d) To operate machinery

Answer: b) To understand human languages

5. Which of the following is an application of NLP?

a) Object detection b) Sentiment analysis c) Face recognition d) Stock market prediction

Answer: b) Sentiment analysis

6. What does a virtual assistant like Siri or Alexa primarily rely on?

a) Data Science b) Computer Vision c) NLP d) Robotics

Answer: c) NLP

7. What does automatic summarization in NLP do?


a) Converts images to text b) Summarizes numerical data c) Extracts the main information from text d) Creates a chatbot

Answer: c) Extracts the main information from text

8. Which technique in NLP is used to remove unnecessary words like “and” or “the”?

a) Tokenization b) Lemmatization c) Stopwords removal d) Stemming

Answer: c) Stopwords removal

9. What is the process of breaking text into individual words or terms called?

a) Sentence segmentation b) Tokenization c) Lemmatization d) Parsing

Answer: b) Tokenization

10. Which is a challenge for NLP when dealing with human language?

a) Identifying numbers b) Processing unstructured data c) Parsing simple sentences d) None of the
above

Answer: b) Processing unstructured data

11. Which of the following is a step in text normalization?

a) Generating summary b) Tokenization c) Machine learning d) Object detection

Answer: b) Tokenization

12. In which step of text normalization are words reduced to their root form?

a) Lemmatization b) Sentence segmentation c) Stopwords removal d) Summarization

Answer: a) Lemmatization

13. What is the main purpose of sentiment analysis?

a) To categorize documents b) To identify opinions or emotions in text c) To summarize large texts d) To convert text into speech

Answer: b) To identify opinions or emotions in text

14. Which application of NLP helps in filtering spam emails?

a) Sentiment analysis b) Automatic summarization c) Text classification d) Speech recognition

Answer: c) Text classification


15. What kind of chatbot relies on pre-defined scripts?

a) Smart-bot b) Rule-based bot c) Dynamic bot d) Machine learning bot

Answer: b) Rule-based bot

16. Which type of chatbot uses large databases and can learn from more interactions?

a) Script-bot b) Rule-based bot c) Smart-bot d) Static bot

Answer: c) Smart-bot

17. Which of the following is NOT an application of NLP?

a) Image recognition b) Sentiment analysis c) Automatic summarization

d) Speech recognition

Answer: a) Image recognition

18. Which AI model helps extract features from text using word frequency?

a) Neural network b) Bag of words c) Decision tree d) Convolutional neural network

Answer: b) Bag of words

19. What is the primary step in converting text to numerical format in NLP?

a) Tokenization b) Stopwords removal c) Text normalization d) Text summarization

Answer: c) Text normalization

20. What is meant by “underfitting” in an AI model?

a) Model performs perfectly b) Model does not fit well with the data c) Model fits too well to the data
d) Model generalizes well

Answer: b) Model does not fit well with the data

21. Which algorithm helps in finding the importance of words in documents?

a) Decision tree b) Neural network c) TFIDF d) Random forest

Answer: c) TFIDF

22. In text normalization, which step involves reducing words to their base forms?

a) Stemming b) Tokenization c) Stopwords removal d) Parsing

Answer: a) Stemming

23. What kind of words are typically removed during stopword removal in NLP?

a) Important keywords b) Common and unimportant words c) Rare words d) Numerical data

Answer: b) Common and unimportant words

24. Which NLP task involves assigning predefined categories to a document?

a) Speech recognition b) Text classification c) Sentiment analysis d) Image recognition

Answer: b) Text classification

25. Which of the following is NOT a challenge for NLP?

a) Multiple meanings of words b) Perfect syntax without meaning c) Variations in grammar

d) Identifying colors

Answer: d) Identifying colors

26. What is the first step in text normalization?

a) Sentence segmentation b) Stemming c) Lemmatization d) Stopword removal

Answer: a) Sentence segmentation

27. What is an example of a smart-bot?

a) Customer service chatbot b) Alexa c) Scripted chatbot d) None of the above

Answer: b) Alexa

28. In NLP, what does tokenization involve?

a) Breaking text into words or sentences b) Removing stopwords c) Summarizing a document

d) Analyzing sentiment

Answer: a) Breaking text into words or sentences

29. Which of the following is a real-life application of NLP?

a) Image segmentation b) Predictive maintenance c) Spam email filtering d) Fraud detection

Answer: c) Spam email filtering

30. Which type of NLP bot adapts based on interactions and improves over time?

a) Script-bot b) Smart-bot c) Static bot d) Predefined bot


Answer: b) Smart-bot

31. What is the main limitation of a script-based chatbot?

a) It can only handle predefined conversations b) It requires a large database c) It learns from user
interactions d) It uses machine learning

Answer: a) It can only handle predefined conversations

32. Which step comes after tokenization in text normalization?

a) Stopwords removal b) Lemmatization c) Text summarization d) Stemming

Answer: a) Stopwords removal

33. What is lemmatization?

a) Removing prefixes from words b) Converting words to their base or dictionary form

c) Splitting text into sentences d) Removing stopwords

Answer: b) Converting words to their base or dictionary form

34. What do virtual assistants like Google Assistant use for speech recognition?

a) Computer Vision b) Data Science c) NLP d) Robotics

Answer: c) NLP

35. What is the output of a successful sentiment analysis in NLP?

a) A numerical value b) A summary c) The overall sentiment (positive/negative/neutral)

d) A speech output

Answer: c) The overall sentiment (positive/negative/neutral)

36. Which process removes words like “is” or “the” from text in NLP?

a) Stemming b) Stopwords removal c) Tokenization d) Lemmatization

Answer: b) Stopwords removal

37. Which is a technique for summarizing a large set of text data?

a) Lemmatization b) Automatic summarization c) Speech synthesis d) Tokenization

Answer: b) Automatic summarization

38. What does the TFIDF algorithm measure?


a) Importance of words in a document b) Word length c) Sentence complexity d) Syntax accuracy

Answer: a) Importance of words in a document

39. What is a common issue with NLP when dealing with human languages?

a) Words with multiple meanings b) Predicting stock prices c) Recognizing colors d) Detecting shapes

Answer: a) Words with multiple meanings

40. What is the process of dividing text into individual sentences?

a) Tokenization b) Stemming c) Sentence segmentation d) Lemmatization

Answer: c) Sentence segmentation

41. Which chatbot type typically connects users to human agents for complex queries?

a) Script-bot b) Smart-bot c) AI-powered bot d) Static bot

Answer: a) Script-bot

42. What does “perfect syntax, no meaning” refer to in NLP?

a) The sentence is grammatically correct but lacks logical meaning

b) The sentence is incorrect

c) The sentence has perfect semantics

d) The sentence contains stopwords

Answer: a) The sentence is grammatically correct but lacks logical meaning

43. Which of the following tools is used to reduce a word to its base form?

a) Tokenizer b) Lemmatizer c) Summarizer d) Sentiment analyzer

Answer: b) Lemmatizer

44. How does a machine learn to recognize words in NLP?

a) By using images b) By analyzing text data c) By calculating numerical values d) By tracking user
behavior

Answer: b) By analyzing text data

45. What is the primary challenge of translating human language into computer language?

a) Differences in syntax b) Lack of punctuation c) Word capitalization d) Noise in audio data


Answer: a) Differences in syntax

46. What is a corpus in NLP?

a) A large collection of text data b) A summary of documents c) A chatbot model d) A set of stopwords

Answer: a) A large collection of text data

47. In bag-of-words, what is the primary data collected?

a) Word meanings b) Word frequencies c) Word sounds d) Sentence length

Answer: b) Word frequencies

48. Which chatbot typically handles customer service inquiries?

a) Rule-based chatbot b) Smart-bot c) AI-powered chatbot d) Learning bot

Answer: a) Rule-based chatbot

49. What is the purpose of TFIDF in NLP?

a) To classify text b) To assign value to words based on frequency c) To detect sentiment

d) To tokenize text

Answer: b) To assign value to words based on frequency

50. What is the first step in creating a chatbot using NLP?

a) Data acquisition b) Model evaluation c) Model deployment d) Text summarization

Answer: a) Data acquisition

Assertion-Reasoning Questions:

Assertion (A): NLP can be used to automatically summarize large amounts of text data.

Reason (R): NLP algorithms analyze text to extract important information while removing redundant or
irrelevant data.

a) Both A and R are true, and R is the correct explanation of A.

b) Both A and R are true, but R is not the correct explanation of A.

c) A is true, but R is false.

d) A is false, but R is true.

Correct Answer: a) Both A and R are true, and R is the correct explanation of A.

Assertion (A): Stopwords are removed during NLP processing.

Reason (R): Stopwords are frequently used words that add significant value to the meaning of a
sentence.

a) Both A and R are true, and R is the correct explanation of A.

b) Both A and R are true, but R is not the correct explanation of A.

c) A is true, but R is false.

d) A is false, but R is true.

Correct Answer: c) A is true, but R is false.

Assertion (A): Lemmatization is a technique used to reduce words to their root forms.

Reason (R): Lemmatization always results in words that are meaningful, unlike stemming.

a) Both A and R are true, and R is the correct explanation of A.

b) Both A and R are true, but R is not the correct explanation of A.

c) A is true, but R is false.

d) A is false, but R is true.

Correct Answer: a) Both A and R are true, and R is the correct explanation of A.

Assertion (A): Text classification can be used to categorize documents into predefined categories.

Reason (R): Text classification in NLP is essential for sentiment analysis.

a) Both A and R are true, and R is the correct explanation of A.

b) Both A and R are true, but R is not the correct explanation of A.

c) A is true, but R is false.

d) A is false, but R is true.

Correct Answer: b) Both A and R are true, but R is not the correct explanation of A.

Assertion (A): NLP chatbots can replace human customer service agents completely.

Reason (R): NLP-based chatbots use predefined scripts that limit their flexibility in conversations.

a) Both A and R are true, and R is the correct explanation of A.


b) Both A and R are true, but R is not the correct explanation of A.

c) A is true, but R is false.

d) A is false, but R is true.

Correct Answer: d) A is false, but R is true.

Case Study Based Questions:

Case Study 1:

A company is using sentiment analysis to understand customer opinions about its new smartphone
model. They collect social media posts and customer reviews to analyze the data. They want to know
if the overall sentiment is positive, negative, or neutral, and they are also interested in identifying
specific features customers mention.

Questions:

What kind of NLP technique is being used by the company?

Answer: Sentiment analysis.

How would the company identify specific features mentioned by customers in the reviews?

Answer: By using text classification or named entity recognition to extract key terms related to specific
features.

What is the benefit of analyzing customer sentiment in this scenario?

Answer: It helps the company understand customer opinions, improve the product, and respond to
issues or concerns in a timely manner.

Why might the company also want to remove stopwords in this analysis?

Answer: Stopwords are commonly used words that don’t add significant meaning to the analysis, so
removing them can help focus on more relevant terms.

How could the company improve the accuracy of their sentiment analysis over time?

Answer: By training the model on a larger dataset, using advanced algorithms like TFIDF or neural
networks, and continuously updating the dataset with new customer feedback.
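
For illustration, here is a minimal Python sketch of lexicon-based sentiment scoring. The word lists and reviews are made-up examples (not the company's data); a production system would instead train a model on labelled reviews, for example with TFIDF features or neural networks.

```python
# Lexicon-based sentiment sketch (illustrative only).
POSITIVE = {"good", "great", "excellent", "love", "fast", "amazing"}
NEGATIVE = {"bad", "poor", "slow", "hate", "broken", "terrible"}

def sentiment(review: str) -> str:
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

reviews = [
    "The camera is excellent and the battery life is great",
    "Nice phone but the charging is slow and the speaker is poor",
]
for r in reviews:
    print(sentiment(r), "->", r)
```
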
Case Study 2:

A mental health app aims to provide users with a chatbot that interacts with them when they are
feeling stressed. The chatbot uses Cognitive Behavioral Therapy (CBT) techniques to guide
conversations and provide helpful suggestions.

Questions:

What NLP application is being used in this chatbot?

Answer: The chatbot uses Natural Language Processing for conversational interaction and therapy-based
suggestions.

How can NLP help the chatbot understand user inputs during a conversation?

Answer: NLP can process and analyze the text input, detect emotions or stress-related keywords, and
respond with appropriate therapeutic advice.

Why is it important for the chatbot to undergo text normalization?

Answer: Text normalization helps the chatbot process unstructured input by breaking it down into
simpler forms, such as tokens or base forms of words, allowing for more accurate responses.

What challenge might the chatbot face when interpreting user emotions?

Answer: Multiple meanings of words and ambiguous expressions could lead to misunderstandings of
the user’s emotional state.

How can the chatbot be improved to better serve its users over time?

Answer: By incorporating machine learning techniques, the chatbot can learn from user interactions,
adapt responses, and improve its understanding of complex emotions and contexts.

Case Study 3:

A retail company is using an NLP-based virtual assistant to help customers on its website. The
assistant helps customers find products, provides recommendations, and answers questions about
the store’s return policy.

Questions:

What is the primary function of the virtual assistant in this case study?

Answer: The virtual assistant helps customers with product recommendations and answers frequently
asked questions, such as those related to return policies.

How does the virtual assistant understand customer queries about different products?

Answer: The assistant uses NLP techniques like text classification and named entity recognition to
identify keywords and understand customer intents.

What benefit does the company gain by using this virtual assistant?

Answer: The company provides a seamless shopping experience, reduces the workload on human
customer service agents, and improves customer satisfaction.

How could the virtual assistant improve product recommendations over time?

Answer: By analyzing past interactions and customer preferences, the assistant can use machine
learning algorithms to offer more personalized product suggestions.

What challenge might the virtual assistant face when interacting with customers?

Answer: It may struggle with ambiguous queries or complex, multi-part questions that require a deeper
understanding beyond predefined scripts.

Case Study 4:

A healthcare provider is developing a chatbot that assists patients in scheduling appointments, checking symptoms, and reminding them to take medication. The chatbot uses NLP to interpret the patient’s requests and respond accordingly.

Questions:

What NLP functionality is most important for the healthcare chatbot in this scenario?

Answer: The chatbot uses NLP for understanding patient queries, recognizing symptoms, and scheduling
tasks based on natural language input.

How could the chatbot ensure it gives accurate responses to symptom-related queries?

Answer: By using medical-specific databases and NLP models trained on healthcare-related data, the
chatbot can provide reliable advice and suggestions.

What is the potential benefit of using this chatbot for patients?

Answer: It provides convenient access to healthcare services, reduces wait times, and helps patients
manage their health effectively through timely reminders and assistance.

What might be a limitation of the chatbot when responding to patient symptoms?

Answer: The chatbot may misinterpret symptoms or fail to account for complex medical conditions,
leading to incorrect recommendations.

How can the chatbot be improved to better assist patients over time?

Answer: Continuous updates to its medical knowledge base, as well as integration with patient medical
histories, can help the chatbot provide more accurate and personalized responses.

Case Study 5:

An educational platform is developing an NLP-powered system that helps students write better
essays. The system provides feedback on grammar, style, and sentence structure, and suggests
improvements based on common writing errors.

Questions:

How does the NLP system assist students in improving their writing?

Answer: The system uses NLP techniques like grammar checking, sentence segmentation, and syntax
analysis to identify errors and suggest improvements in writing.

What advantage does the system offer over traditional grammar checking tools?

Answer: The NLP system can provide more advanced feedback, such as style improvement and context-
based suggestions, going beyond simple grammar checks.

Why is sentence segmentation important in this system?

Answer: Sentence segmentation helps the system break down essays into individual sentences for easier
error detection and analysis of sentence structure.

What challenge might the system face when offering feedback on complex essays?

Answer: It may struggle with understanding the deeper meaning or context of sentences, leading to
inaccurate or irrelevant suggestions.

How can the system be enhanced to provide more effective feedback?

Answer: By incorporating machine learning and deeper contextual understanding, the system can offer
more tailored and accurate feedback based on the writing style and complexity of the essay.

Question-Answer:
What is Natural Language Processing (NLP)?

NLP is a sub-field of AI focused on enabling computers to understand and process human languages. It
involves programming computers to analyze and process large amounts of natural language data.

What are the three primary domains of AI discussed in the document?

The three primary domains of AI discussed are Data Science, Computer Vision, and Natural Language
Processing (NLP).

What is the difference between a Script-bot and a Smart-bot?

A Script-bot works around a predefined script and is limited in functionality, whereas a Smart-bot uses
AI, learns from data, and is more flexible and capable of handling various tasks.
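
As an illustration of the Script-bot idea, here is a short Python sketch built around a made-up predefined script (the replies are hypothetical):

```python
# Script-bot sketch: every reply comes from a fixed, predefined script.
# A Smart-bot would instead use an AI model that learns from conversation data.
SCRIPT = {
    "hi": "Hello! How can I help you today?",
    "return policy": "You can return items within 30 days of purchase.",
    "bye": "Thank you for visiting. Goodbye!",
}

def script_bot(user_message: str) -> str:
    key = user_message.strip().lower()
    # Anything outside the script falls back to a canned reply,
    # which is exactly the limitation of a Script-bot.
    return SCRIPT.get(key, "Sorry, I can only answer a few fixed questions.")

print(script_bot("Hi"))
print(script_bot("What is your return policy?"))  # not an exact script key -> fallback
```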

What role does sentiment analysis play in NLP?

Sentiment analysis helps in identifying emotions and opinions expressed in text, enabling companies to
understand customer feedback and sentiment towards products or services.

How does the ‘Bag of Words’ model work in NLP?

The ‘Bag of Words’ model converts text into a collection of unique words from the corpus and counts
their frequency, disregarding the order in which they appear.
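
A small Python sketch of the idea, using a made-up two-sentence corpus:

```python
# Bag of Words: list the unique words of the corpus and count how often each
# appears in every document; the order of words is ignored.
from collections import Counter

corpus = ["aman and anil are stressed", "aman went to a therapist"]  # toy corpus

tokens_per_doc = [doc.lower().split() for doc in corpus]
vocabulary = sorted({word for tokens in tokens_per_doc for word in tokens})
print(vocabulary)

for tokens in tokens_per_doc:
    counts = Counter(tokens)
    print([counts[word] for word in vocabulary])  # document as a frequency vector
```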

What is the key purpose of Text Normalization in NLP?

Text Normalization simplifies complex text into a form that machines can process more easily, through steps such as sentence segmentation, tokenization, stopword removal, and stemming or lemmatization.
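
A hedged sketch of this pipeline in Python, assuming the NLTK library (the question bank does not prescribe a specific tool, and the sample sentence is made up):

```python
# Text normalization sketch: sentence segmentation -> tokenization ->
# stopword removal -> stemming.
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer
from nltk.tokenize import sent_tokenize, word_tokenize

nltk.download("punkt", quiet=True)      # tokenizer models
nltk.download("stopwords", quiet=True)  # stopword lists

text = "Humans communicate in many languages. NLP helps computers process them."

sentences = sent_tokenize(text)                                        # 1. sentence segmentation
tokens = [w.lower() for s in sentences for w in word_tokenize(s)]      # 2. tokenization
stop_words = set(stopwords.words("english"))
filtered = [w for w in tokens if w.isalpha() and w not in stop_words]  # 3. stopword removal
stems = [PorterStemmer().stem(w) for w in filtered]                    # 4. stemming

print(stems)
```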

Explain Term Frequency and Inverse Document Frequency (TFIDF).

TFIDF measures the importance of a word in a document relative to a collection of documents. Term
Frequency (TF) measures how often a word appears, while Inverse Document Frequency (IDF) reduces
the weight of common words across multiple documents.
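
In the usual formulation, TF(word, document) = (occurrences of the word in the document) / (total words in the document), IDF(word) = log(total number of documents / number of documents containing the word), and TFIDF = TF x IDF. Below is a small Python sketch on a made-up corpus; exact formula variants differ slightly between textbooks.

```python
# TFIDF sketch on a toy corpus of three one-line documents.
import math

corpus = [
    "the phone has a great camera",
    "the phone battery drains fast",
    "the screen and the camera are great",
]
docs = [doc.split() for doc in corpus]

def tfidf(word, doc_tokens):
    tf = doc_tokens.count(word) / len(doc_tokens)
    docs_with_word = sum(word in d for d in docs)
    idf = math.log(len(docs) / docs_with_word)
    return tf * idf

print(round(tfidf("great", docs[0]), 3))  # appears in 2 of 3 documents -> some weight
print(round(tfidf("the", docs[0]), 3))    # appears in every document -> weight 0.0
```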

What are stopwords and why are they removed in NLP?

Stopwords are common words that don’t add much meaning to the text (e.g., “and”, “the”). They are
removed to help the machine focus on more relevant terms.

Describe one application of Automatic Summarization.

Automatic Summarization can be used to extract key information from large texts, such as summarizing
news articles while avoiding redundancy and maximizing content diversity.
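
One very simple extractive approach, sketched below in Python, scores each sentence by the frequency of its non-stopword words and keeps the best one. The sample text and stopword list are made up, and real summarizers are considerably more sophisticated.

```python
# Extractive summarization sketch: pick the sentence whose (non-stopword)
# words are most frequent across the whole text.
from collections import Counter

STOPWORDS = {"the", "a", "is", "and", "of", "to", "in", "so"}  # illustrative list

text = ("NLP helps computers process human language. "
        "Language data is messy, so NLP uses text normalization. "
        "The office cafeteria serves lunch at noon.")

sentences = [s.strip() for s in text.split(".") if s.strip()]
words = [w.lower() for w in text.replace(".", " ").split() if w.lower() not in STOPWORDS]
freq = Counter(words)

def score(sentence):
    return sum(freq[w.lower()] for w in sentence.split() if w.lower() not in STOPWORDS)

print(max(sentences, key=score))  # the single highest-scoring sentence as the "summary"
```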

How does Cognitive Behavioral Therapy (CBT) relate to NLP-based projects in the document?

NLP can be used to create chatbots that apply basic CBT techniques, helping users manage stress
and emotional well-being by interacting with them and suggesting when professional help may be
needed.

What challenges might computers face when processing human language?

Computers struggle with complexities such as the arrangement of words (syntax), multiple meanings of
words, and understanding the context or emotional subtext behind statements.

What are two types of chatbots mentioned in the document?

The two types are Script-bots, which follow predefined scripts, and Smart-bots, which are AI-powered
and can learn from data to handle more complex interactions.

Why is stemming used in NLP, and what is a potential drawback?

Stemming is used to reduce words to their root forms to simplify the data. However, it may produce
non-meaningful words (e.g., “studies” becomes “studi”).

How does lemmatization differ from stemming in NLP?

Lemmatization also reduces words to their base form but ensures the resulting words are meaningful,
whereas stemming may produce non-meaningful root forms.
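
A short sketch of the difference, assuming the NLTK library (an assumed tool choice; the document itself does not name one). It reproduces the document's own example, where "studies" stems to the non-word "studi" but lemmatizes to "study":

```python
# Stemming vs lemmatization: the stemmer may output non-words ("studi"),
# while the lemmatizer returns dictionary forms ("study", "foot").
import nltk
from nltk.stem import PorterStemmer, WordNetLemmatizer

nltk.download("wordnet", quiet=True)  # lexical database used by the lemmatizer

stemmer = PorterStemmer()
lemmatizer = WordNetLemmatizer()

for word in ["studies", "feet"]:
    print(word, "| stem:", stemmer.stem(word), "| lemma:", lemmatizer.lemmatize(word))
```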

What is the purpose of Data Acquisition in an NLP project?

Data Acquisition involves collecting relevant conversational data, such as through surveys or interviews,
to enable machines to interpret language and emotions effectively in an NLP project.
