
Quiz 2 - Dimensionality reduction

Type: Graded Quiz    Attempts: 1/1    Questions: 10

Time: 15m    Due Date: Nov 17 at 11:59 PM IST

Your Marks: 10/10

Attempt History

Attempt #1
Marks: 10
Nov 10 at 5:21 PM

Q No: 1 Correct Answer Marks: 1/1

A characteristic equation is defined as:

|A - λI| = 0 You Selected

|A| = λI

|A - I| = λ

None of the given options
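
As an illustrative aside (not part of the quiz), the eigenvalues of a matrix A are exactly the roots of its characteristic equation |A - λI| = 0. A minimal NumPy sketch with an arbitrary 2x2 matrix chosen only for the example:

```python
import numpy as np

# Arbitrary example matrix (chosen for illustration only).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

# The eigenvalues solve the characteristic equation |A - lambda*I| = 0.
eigvals = np.linalg.eigvals(A)

# Verify: det(A - lambda*I) should be ~0 for each eigenvalue.
for lam in eigvals:
    det = np.linalg.det(A - lam * np.eye(2))
    print(f"lambda = {lam:.4f}, det(A - lambda*I) = {det:.2e}")
```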

Q No: 2 Correct Answer Marks: 1/1

Which of the following statements are correct?

(i) LDA specifically tries to model the difference between the classes of the data.

(ii) On the other hand, PCA does not take any class difference into account.

Both (i) and (ii) are correct You Selected

Only (i) is correct

Only (ii) is correct

Both (i) and (ii) are incorrect
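
A minimal sketch of the distinction, assuming scikit-learn is available and using the iris dataset purely for illustration: PCA is fit without class labels, while LDA requires them.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# PCA is unsupervised: the labels y are never seen.
X_pca = PCA(n_components=2).fit_transform(X)

# LDA is supervised: it uses y to model between-class differences.
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

print(X_pca.shape, X_lda.shape)  # (150, 2) (150, 2)
```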

Q No: 3 Correct Answer Marks: 1/1

Which of the following statements is TRUE about Linear Discriminant Analysis?

The LDA objective is to maximize the distance between the target classes and minimize the distance within classes You Selected

Minimize both the distance between classes and the distance within classes

Maximize both the distance between classes and the distance within classes

The LDA objective is to minimize the distance between the target classes and maximize the distance within classes
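
For intuition, a sketch of this objective (the Fisher criterion) on synthetic two-class data chosen only for illustration: LDA seeks the direction that maximizes between-class scatter relative to within-class scatter.

```python
import numpy as np

# Toy two-class data, well separated for clarity.
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
X1 = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))

mean0, mean1 = X0.mean(axis=0), X1.mean(axis=0)
overall = np.vstack([X0, X1]).mean(axis=0)

# Within-class scatter S_W and between-class scatter S_B.
S_W = (X0 - mean0).T @ (X0 - mean0) + (X1 - mean1).T @ (X1 - mean1)
S_B = (len(X0) * np.outer(mean0 - overall, mean0 - overall)
       + len(X1) * np.outer(mean1 - overall, mean1 - overall))

# The LDA direction is the leading eigenvector of S_W^{-1} S_B.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
w = eigvecs[:, np.argmax(eigvals.real)].real
print("LDA direction:", w)
```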

Q No: 4 Correct Answer Marks: 1/1

Principal components are always _____ to each other.


Orthogonal You Selected

Parallel

Horizontal

None of the above
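
A quick numerical check of this property, assuming scikit-learn: the rows of pca.components_ are the principal directions, so their pairwise dot products should be (numerically) zero.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)
pca = PCA(n_components=3).fit(X)

# Gram matrix of the component directions: ~identity, i.e. orthonormal.
gram = pca.components_ @ pca.components_.T
print(np.round(gram, 6))
```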

Q No: 5 Correct Answer Marks: 1/1

In PCA, the number of components is __________ to the number of independent variables.

Less than or equal You Selected

Greater than

Less than

Greater than or equal to
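
A small sketch of this bound, assuming scikit-learn: fitting PCA without specifying n_components keeps at most min(n_samples, n_features) components, which for the 4-feature iris data is 4.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)   # 150 samples, 4 features
pca = PCA().fit(X)                  # no n_components given: keep all

print(pca.n_components_)            # 4, equal to the number of features
```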

Q No: 6 Correct Answer Marks: 1/1

Why do you have to drop unimportant features?

Using only the most important features will give you better efficiency in predicting your target You Selected

To standardize the data

To train the model faster

To find the correct clusters
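
A hedged sketch of one way to keep only the most informative features, assuming scikit-learn; the model-based selector used here is just one possible choice, not the only method.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel

X, y = load_iris(return_X_y=True)

# Select features whose importance (from a forest) is above the default threshold.
selector = SelectFromModel(RandomForestClassifier(random_state=0)).fit(X, y)
print("kept features:", selector.get_support())
```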

Q No: 7 Correct Answer Marks: 1/1

________ is known to have practical use as a binary as well as a multiclass classifier.


LDA You Selected

PCA

Both PCA and LDA

None
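
A minimal sketch, assuming scikit-learn: the same LDA estimator is used directly as a classifier on a three-class problem (iris, used here only as example data); it handles binary data the same way.

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```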

Q No: 8 Correct Answer Marks: 1/1

PCA can be used for projecting and visualizing data in lower dimensions.

True You Selected

False

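A short sketch of such a projection, assuming scikit-learn and matplotlib, with iris used purely as example data: the 4-D data is projected onto its first two principal components for a 2-D scatter plot.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)

# Project onto the first two principal components for visualization.
X_2d = PCA(n_components=2).fit_transform(X)

plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y)
plt.xlabel("PC1")
plt.ylabel("PC2")
plt.show()
```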

Q No: 9 Correct Answer Marks: 1/1

Which of the following are the criteria used to select the best principal components?

Both Scree Plot and Kaiser Criterion You Selected

Scree Plot

Kaiser Criterion

None of the given options
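
A sketch of both criteria, assuming scikit-learn: the explained-variance values are what a scree plot displays, and the Kaiser criterion keeps the components whose eigenvalue exceeds 1 on standardized data.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, _ = load_iris(return_X_y=True)
X_std = StandardScaler().fit_transform(X)

pca = PCA().fit(X_std)
eigenvalues = pca.explained_variance_   # values plotted in a scree plot

print("eigenvalues:", np.round(eigenvalues, 3))
print("components kept by the Kaiser criterion:", (eigenvalues > 1).sum())
```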

Q No: 10 Correct Answer Marks: 1/1

Which of the following is a data pre-processing step that is applied to the independent variables, or features, of the data?

Standardization You Selected

Gradient descent

Error finding

None of these
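
A minimal sketch of standardization, assuming scikit-learn: each feature is rescaled to zero mean and unit variance before methods such as PCA.

```python
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler

X, _ = load_iris(return_X_y=True)
X_std = StandardScaler().fit_transform(X)

# Each column now has mean ~0 and standard deviation ~1.
print(X_std.mean(axis=0).round(6))
print(X_std.std(axis=0).round(6))
```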
