001. Point out the correct statement. A
A Data Node is the slave/worker node and holds the user data in the form of Data Blocks
B Each incoming file is broken into 32 MB by default
C Data blocks are replicated across different nodes in the cluster to ensure a low degree of fault tolerance
D Each incoming file is broken into 12 MB by default
002. HDFS works in a __________ fashion. D
A master-slave B worker/slave
C worker/master D master-worker
003. ________ Name Node is used when the Primary Name Node goes down. C
A Rack B Data
C Secondary D First
004. Point out the wrong statement. D
A Replication Factor can be configured at a cluster level (Default is set to 3) and also at a file level
B Block Report from each DataNode contains a list of all the blocks that are stored on that DataNode
C User data is stored on the local file system of DataNodes
D DataNode is aware of the files to which the blocks stored on it belong to
005. HDFS has the concept of a block, but it is a much larger unit of ________ by default. B
A 128 MB B 64 MB
C 32 MB D 12 MB
006. A ________ serves as the master and there is only one NameNode per cluster. B
A Data Node B NameNode
C Data block D Replication
007. ________ is a file system designed for storing very large files with streaming data access A
patterns, running on clusters of commodity hardware.
A HDFS B JVM
C PIPE D HIVE
008. Hadoop comes with a distributed file system called ____________ D
A PIG B JVM
C PIPE D HDFS
009. Filesystems that manage the storage across a network of machines are called C
_________filesystems.
A Python B Java
C Distributed D Hadoop
010. ___________ is the method used to copy bytes from an input stream to any other stream in Hadoop. D
A Utils B IUtils
C JUtils D IOUtils
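Note (illustrative sketch, not part of the question set): Q.010 and Q.015 in practice — IOUtils.copyBytes copies bytes between streams, and a FileSystem instance is needed to open a file in HDFS. This assumes the Hadoop Java client API; the path /user/demo/input.txt is only a placeholder.

    import java.io.InputStream;
    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;

    public class CatHdfsFile {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Q.015: an instance of FileSystem is required to read a file in HDFS
            FileSystem fs = FileSystem.get(URI.create("hdfs:///"), conf);
            InputStream in = null;
            try {
                in = fs.open(new Path("/user/demo/input.txt"));   // placeholder path
                // Q.010: IOUtils copies bytes from the input stream to another stream
                IOUtils.copyBytes(in, System.out, 4096, false);
            } finally {
                IOUtils.closeStream(in);
            }
        }
    }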
011. Interface ____________ reduces a set of intermediate values which share a key to a smaller B
set of values.
A Mapper B Reducer
C Writable D Readable
012. The input to the Reducer is the grouped output of a ____________ A
A Mapper B Reducer
C Writable D Readable
013. The output of the reduce task is typically written to the FileSystem via ____________ A
A OutputCollector B InputCollector
C OutputCollect D interface
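Note (illustrative sketch, not part of the question set): Q.011-Q.013 together — a Reducer (old org.apache.hadoop.mapred API assumed) receives the grouped output of a Mapper, reduces the values sharing a key to a smaller set, and writes its output via OutputCollector.collect, while the Reporter can signal that the task is alive.

    import java.io.IOException;
    import java.util.Iterator;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;

    public class SumReducer extends MapReduceBase
            implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
                           OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            int sum = 0;
            while (values.hasNext()) {
                sum += values.next().get();   // fold the grouped values for this key
                reporter.progress();          // indicate the task is alive (see Q.025/Q.048)
            }
            output.collect(key, new IntWritable(sum));   // emitted via OutputCollector.collect
        }
    }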
014. For YARN, the ___________ Manager UI provides host and port information. C
A Data Node B NameNode
C Resource D Replication
015. In order to read any file in HDFS, an instance of __________ is required. A
A Filesystem B Datastream
C Outstream D inputstream
016. Which of the following scenarios may not be a good fit for HDFS? A
A HDFS is not suitable for scenarios requiring multiple/simultaneous writes to the same file
B HDFS is suitable for storing data related to applications requiring low latency data access
C HDFS is suitable for storing data related to applications requiring low latency data access
D HDFS is suitable for storing data file is broken into 32 MB by default
017. ________ is the slave/worker node and holds the user data in the form of Data Blocks. A
A DataNode B NameNode
C Data block D Replication
018. HDFS provides a command line interface called __________ used to interact with HDFS. B
A HDFS Shell B FS Shell
C DFS Shell D DS Shell
019. During the execution of a streaming job, the names of the _______ parameters are D
transformed.
A Vmap B Mapvim
C Mapreduce D mapred
020. The standard output (stdout) and error (stderr) streams of the task are read by the B
TaskTracker and logged to _________
A ${HADOOP_LOG_DIR}/user B ${HADOOP_LOG_DIR}/userlogs
C ${HADOOP_LOG_DIR}/logs D ${HADOOP_LOG_DIR}/user/log
021. The _____________ can also be used to distribute both jars and native libraries for use in B
the map and/or reduce tasks.
A DistributedLog B DistributedCache
C DistributedJars D DistributedFile
022. __________ is used to filter log files from the output directory listing. B
A OutputLog B OutputLogFilter
C DistributedLog D DistributedJars
023. A map output larger than ___________ percent of the memory allocated to copying map outputs is written directly to disk. C
A 10 B 15
C 25 D 35
024. Jobs can enable task JVMs to be reused by specifying the job configuration _________ C
A mapred.job.recycle.jvm.num.tasks B mapissue.job.reuse.jvm.num.tasks
C mapred.job.reuse.jvm.num.tasks D mapred.job.jvm.num.tasks
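Note (illustrative sketch, not part of the question set): the documented Hadoop 1.x property for JVM reuse is mapred.job.reuse.jvm.num.tasks; a value of -1 lets a JVM run an unlimited number of tasks of the same job, while a positive value caps it.

    import org.apache.hadoop.mapred.JobConf;

    public class JvmReuseConfig {
        public static void main(String[] args) {
            JobConf conf = new JobConf();
            // -1 = reuse the task JVM for an unlimited number of tasks of the same job
            conf.setInt("mapred.job.reuse.jvm.num.tasks", -1);
            System.out.println(conf.get("mapred.job.reuse.jvm.num.tasks"));
        }
    }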
025. Applications can use the _________ provided to report progress or just indicate that they B
are alive.
A Collector B Reporter
C Dashboard D Input
026. Which of the following parameters is used to collect keys and combined values? D
A Key B Values
C Reporter D output
027. ____________ specifies the number of segments on disk to be merged at the same time. D
A mapred.job.shuffle.merge.percent B mapred.job.reduce.input.buffer.percent
C mapred.inmem.merge.threshold D io.sort.factor
028. ______________ is another implementation of the MapRunnable interface that runs C
mappers concurrently in a configurable number of threads.
A MultithreadedRunner B MultithreadedMap
C MultithreadedMapRunner D SinglethreadedMapRunner
029. The key, a ____________, is the byte offset within the file of the beginning of the line. B
A LongReadable B LongWritable
C ShortReadable D ShortWritable
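Note (illustrative sketch, not part of the question set): Q.029 in a Mapper — with the default TextInputFormat, each record key is a LongWritable holding the byte offset of the line within the file, and the value is the line as Text (old mapred API assumed).

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    public class LineLengthMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, IntWritable> {
        public void map(LongWritable byteOffset, Text line,
                        OutputCollector<Text, IntWritable> output, Reporter reporter)
                throws IOException {
            // byteOffset is the position of the start of this line within the input file
            output.collect(line, new IntWritable(line.getLength()));
        }
    }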
030. _________ is the output produced by TextOutputFormat, Hadoop's default OutputFormat. D
A KeyValueTextInputFormat B FileValueTextInputFormat
C FileValueTextOutputFormat D KeyValueTextOutputFormat
031. __________ is a variant of SequenceFileInputFormat that converts the sequence file's keys and values to Text objects. B
A SequenceFile B SequenceFileAsTextInputFormat
C SequenceAsTextInputFormat D TextInputFormat
032. An input _________ is a chunk of the input that is processed by a single map. C
A Textformat B datanode
C split D Join
033. An ___________ is responsible for creating the input splits, and dividing them into D
records.
A TextOutputFormat B TextInputFormat
C OutputInputFormat D InputFormat
034. The output of the reduce task is typically written to the FileSystem via _____________ A
A OutputCollector.collect B OutputCollector.get
C OutputCollector.receive D OutputCollector.put
035. ___________ generates keys of type LongWritable and values of type Text. B
A TextOutputFormat B TextInputFormat
C OutputInputFormat D OutputCollector.collect
036. In _____________ the default job is similar, but not identical, to the Java equivalent. B
A Mapreduce B Streaming
C Orchestration D Mapred
037. Point out the correct statement. A
A The minimum split size is usually 1 byte, although some formats have a lower bound on the split size
B Applications may impose a minimum split size
C The maximum split size defaults to the maximum value that can be represented by a Java long type
D SequenceFileAsTextInputFormat is a variant of SequenceFileInputFormat that retrieves the sequence file's keys and values as opaque binary objects
038. Point out the wrong statement. C
A Hadoop works better with a small number of large files than a large number of small files
B CombineFileInputFormat is designed to work well with small files
C CombineFileInputFormat does not compromise the speed at which it can process the input in a typical MapReduce job
D The minimum split size is usually 1 byte, although some formats have a lower bound on the split size
039. Which of the following properties gets configured in mapred-site.xml? D
A Java environment variables B Replication factor
C Directory names to store hdfs files D Host and port where MapReduce task runs.
040. Mapper class is B
A Static type B Generic type
C Abstract type D Final
041. Which of the following writes MapFiles as output? B
A DBInpFormat B MapFileOutputFormat
C SequenceFileAsBinaryOutputFormat D MapFileOutputFormatAsBinaryOutputFormat
042. The split size is normally the size of a ________ block, which is appropriate for most D
applications.
A Generic B Task
C Library D HDFS
043. ______ class allows you to specify the Input Format and Mapper to use on a per-path basis. A
A MultipleInputs B SingleInputs
C SingleOutputs D MultipleOutputs
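Note (illustrative sketch, not part of the question set): Q.043 — MultipleInputs (old mapred API) lets each input path use its own InputFormat and Mapper. The paths below are placeholders, and IdentityMapper is used only to keep the example self-contained.

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.KeyValueTextInputFormat;
    import org.apache.hadoop.mapred.TextInputFormat;
    import org.apache.hadoop.mapred.lib.IdentityMapper;
    import org.apache.hadoop.mapred.lib.MultipleInputs;

    public class MultipleInputsExample {
        public static void main(String[] args) {
            JobConf conf = new JobConf(MultipleInputsExample.class);
            // Each input path gets its own InputFormat/Mapper pair
            MultipleInputs.addInputPath(conf, new Path("/data/plain"),
                    TextInputFormat.class, IdentityMapper.class);
            MultipleInputs.addInputPath(conf, new Path("/data/tab_separated"),
                    KeyValueTextInputFormat.class, IdentityMapper.class);
        }
    }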
044. ___________ is an input format for reading data from a relational database, using JDBC. C
A DBInput B DBOuutput
C DBInputFormat D DBInpFormat
045. Which of the following is the default output format? D
A TextFormat B TextOutput
C DBOuutput D TextOutputFormat
046. What is the minimum amount of data that a disk can read or write in HDFS? B
A Byte size B Block size
C Heap D Record
047. Which of the following phases occur simultaneously (while map outputs are being fetched, they are merged)? A
A Shuffle and Sort B Reduce and Sort
C Shuffle and Map D Shuffle and Heap
048. Mapper and Reducer implementations can use the ________ to report progress or just C
indicate that they are alive.
A Partitioner B OutputCollector
C Reporter D Shuffle and Sort
049. __________ is a generalization of the facility provided by the Map Reduce framework to B
collect data output by the Mapper or the Reducer.
A Partitioner B Output Collector
C Reporter D InputCollector
050. Identify the slave node among the following. B
A Job node B Data node
C Task node D Name node
051. Identify the node which acts as a checkpoint node in HDFS. A
A Secondary Name node B Secondary data node
C Name node D Data node
052. Which of the following classes handles job control in Hadoop? C
A Task class B Mapper class
C Job class D Reducer class
053. Fixed-size pieces of a MapReduce job's input are known as ________ A
A Splits B Tasks
C Maps D Records
054. Where is the output of map tasks written? A
A Local disk B File system
C HDFS D Secondary storage
055. Which of the following functions is used to read data in Pig? D
A WRITE B READ
C EXE D LOAD
056. You can run Pig in interactive mode using the ______ shell. A
A Grunt B FS
C DFS D HDFS
057. Which of the following will run pig in local mode? B
A $ pig -x aez_local B $ pig -x local
C $ pig -x tez_local D $ pig
058. Which of the following commands is used to show values of keys used in Pig? A
A Set B Declare
C Display D Script
059. Point out the correct statement. A
A You can run Pig in either mode using the pig command
B You can run Pig in batch mode using the Grunt shell
C You can run Pig in interactive mode using the FS shell
D You can run Pig in batch mode using the shell
060. You can run Pig in batch mode using __________ C
A Pig shell command B Pig prog
C Pig scripts D Pig options
061. _________ is the primary interface for a user to describe a MapReduce job to the Hadoop B
framework for execution.
A Map Parameters B JobConf
C Memory Conf D Reporter
062. Which of the following commands is used to copy a directory from one node to another in HDFS? B
A rcp B distcp
C dcp D drcp
063. Pig operates in mainly how many modes? A
A Two B Three
C Four D Five
064. Which of the following operators executes a shell command from the Hive shell? B
A | B !
C ^ D +
065. Point out the wrong statement. C
A Hive Commands are non-SQL statements such as setting a property or adding a resource
B source FILE <filepath> executes a script file inside the CLI
C bfs <bfs command> executes a dfs command from the Hive shell
D Hive query language is similar to SQL
066. _________ is a shell utility which can be used to run Hive queries in either interactive or D
batch mode.
A $HIVE/bin/have B $HIVE/bin/hive
C $HIVE_HOME/hive D $HIVE_HOME/bin/hive
067. Which additional command line option is available in Hive 0.10.0? A
A -database <dbname> B -database <dname>>
C -db <dbname> D -dbase <<dbname>
068. Which of the following is a shortcut for the DUMP operator? B
A \de alias B \d alias
C \q D \d
069. Which of the following command sets the value of a particular configuration variable B
(key)?
A set -v B set <key>=<value>
C set D reset
070. Use the __________ command to run a Pig script that can interact with the Grunt shell C
A fetch B declare
C run D display
071. Which of the following command can be used for debugging? A
A exec B execute
C error D throw
072. Which of the following operators is used to view the MapReduce execution plans? D
A DUMP B DESCRIBE
C STORE D EXPLAIN
073. The property that is set to true to run Hive in local mode, so that it runs without creating a MapReduce job, is A
A hive.exec.mode.local.auto B hive.exec.mode.local.override
C hive.exec.mode.local.settings D hive.exec.mode.local.config
074. When one of the join tables is small enough to fit into memory, the type of join used by Hive is B
A Inner Join B Map join
C Reduce Join D Sort Join
075. Hive specific commands can be run from Beeline, when the Hive _______ driver is used. B
A ODBC B JDBC
C ODBC-JDBC D JVM
076. Which of the following data type is supported by Hive? D
A map B record
C string D enum
077. The two default TBLPROPERTIES added by Hive when a table is created are B
A hive_version and last_modified_by B last_modified_by and last_modified_time
C last_modified_time and hive_version D last_modified_by and table_location
078. The position of a specific column in a Hive table C
A Must be arranged alphabetically B can be anywhere in the table creation clause
C must match the position of the corresponding data in the data file
D Must match the position only for date time data type in the data file
079. The CLI when invoked without the -i option will attempt to load C
$HIVE_HOME/bin/.hiverc and $HOME/.hiverc as _______ files.
A processing B termination
C initialization D running
080. When $HIVE_HOME/bin/hive is run without either the -e or -f option, it enters C
______mode.
A Batch B initialization
C Interactive shell D Multiple
081. The default delimiter in Hive to separate the elements in a STRUCT is B
A '\001' B '\002'
C '\003' D '\004'
082. HiveServer2 introduced in Hive 0.11 has a new CLI called __________ A
A BeeLine B SqlLine
C HiveLine D CliLine
083. A user creates a UDF which accepts arguments of different data types each time it is run. This is an example of B
A Aggregate Function B Generic Function
C Standard UDF D Super Functions
084. What can Hive not offer? B
A Storing data in tables and columns B Online transaction processing
C Handling date time data D Partitioning stored data
085. The DISTRIBUTE BY clause in Hive A
A comes before the sort by clause B comes after the sort by clause
C does not depend on the position of the sort by clause D cannot be present along with the sort by clause
086. In Hive SerDe stands for B
A serialize and Deserialize B serializer and Deserializer
C Serialize and Destruct D serve and destruct
087. While querying a Hive table for an Array type column, if the array index is non-existent, then A
A NULL is returned B Error is reported.
C Partial results are returned D "NA" is returned
088. ________ was designed to overcome the limitations of the other Hive file formats. A
A ORC B OPC
C ODC D PVC
089. Each database created in hive is stored as D
A a file B a jar file
C a hdfs block D a directory
090. In ______ mode HiveServer2 only accepts valid Thrift calls. A
A Remote B HTTP
C Embedded D Interactive
091. A View in Hive can be dropped by using B
A Drop table B Drop view
C Delete view D Remove view
092. HBase is a distributed ________ database built on top of the Hadoop file system. C
A Row-oriented B Tuple-oriented
C Column-oriented D Record-oriented
093. HBase is ________ and defines only column families. B
A Row Oriented B Schema-less
C Fixed Schema D Column Oriented
094. Apache HBase is a non-relational database modeled after Google's _________ C
A BigTop B Scanner
C Bigtable D FoundationDB
095. HCatalog is installed with Hive, starting with Hive release D
A 0.12.0 B 0.10.0
C 0.9.0 D 0.11.0
096. A GenericUDF is a function that D
A Takes one or more columns from a row and returns a single value
B Takes one or more columns from many rows and returns a single value
C Takes zero or more inputs and produces multiple columns or rows of output
D Detects the type of input programmatically and provides an appropriate response
097. By default when a database is dropped in Hive C
A The tables are also deleted B The hdfs blocks are formatted
C The directory is deleted if there are no tables D The hdfs blocks are deleted
098. The drawback of managed tables in Hive is B
A They can never be dropped B They cannot be shared with other applications
C They are always stored under the default directory D They cannot grow bigger than a fixed size of 100GB
099. What can be altered about a view? C
A its name B its location
C its TBLPROPERTIES D The query it is based on
100. __________ class adds HBase configuration files to its object. D
A Collector B Component
C Conversion D Configuration
101. _________ is the main configuration file of HBase. B
A hbase.xml B hbase-site.xml
C hbase-site-conf.xml D hdbase-site-conf.xml
102. HBase uses the _______ File System to store its data. B
A Imphala B Hadoop
C Scala D Hive
103. ZooKeeper itself is intended to be replicated over a set of hosts called ____________ C
A master B chunks
C ensemble D subdomains
104. The minimum number of row versions to keep is configured per column family via C
________
A HBaseDecriptor B HTabDescriptor
C HColumnDescriptor D HRowDescriptor
105. HBase supports a ____________ interface via Put and Result. A
A bytes-in/bytes-out B bytes-in
C bytes-out D bytes-echo
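Note (illustrative sketch, not part of the question set): Q.104 and Q.105 — the minimum number of row versions is configured per column family through HColumnDescriptor, and Put/Result move values as raw byte arrays (bytes-in/bytes-out). Assumes the HBase 1.x client API; the family, row, and qualifier names are placeholders.

    import org.apache.hadoop.hbase.HColumnDescriptor;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseVersionsSketch {
        public static void main(String[] args) {
            // Q.104: minimum row versions are set per column family
            HColumnDescriptor family = new HColumnDescriptor(Bytes.toBytes("cf"));
            family.setMinVersions(2);

            // Q.105: values go into a Put as byte[]; a Result returns them as byte[] again
            Put put = new Put(Bytes.toBytes("row1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("qual"), Bytes.toBytes("value"));
            // With a connected Table (omitted here), Result.getValue(family, qualifier)
            // would hand the same value back as a raw byte array.
        }
    }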
106. The _________ Server assigns regions to the region servers and takes the help of Apache C
ZooKeeper for this task.
A Slave B Region
C Master D Zookeeper
107. Which of the following commands provides information about the user? D
A user B status
C version D whoami
108. HBaseAdmin and ____________ are the two important classes in this package that provide A
DDL functionalities.
A HTableDescriptor B HDescriptor
C HTable D HTabDescriptor
109. ZooKeeper allows distributed processes to coordinate with each other through registers, A
known as ___________
A znodes B hnodes
C vnodes D rnodes
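Note (illustrative sketch, not part of the question set): Q.109 — a ZooKeeper client creating and reading a znode. The connect string localhost:2181 and the path /demo are placeholders; a production client would wait for the session to be connected before issuing calls.

    import org.apache.zookeeper.CreateMode;
    import org.apache.zookeeper.ZooDefs;
    import org.apache.zookeeper.ZooKeeper;

    public class ZnodeSketch {
        public static void main(String[] args) throws Exception {
            ZooKeeper zk = new ZooKeeper("localhost:2181", 3000, event -> { });
            // Distributed processes coordinate through registers called znodes
            zk.create("/demo", "hello".getBytes(), ZooDefs.Ids.OPEN_ACL_UNSAFE,
                      CreateMode.PERSISTENT);
            byte[] stored = zk.getData("/demo", false, null);
            System.out.println(new String(stored));
            zk.close();
        }
    }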
110. Zookeeper essentially mirrors the _______ functionality exposed in the Linux kernel. B
A iread B inotify
C iwrite D icount
111. ZooKeeper's architecture supports high ____________ through redundant services. NO ANSWER GIVEN
A flexibility B scalability
C availability D interactivity
112. Predictive analytics uses statistics and ____ to determine future performance. B
A Algorithmic techniques B Modeling techniques
C System development and design techniques D Weather forecasts
113. The Java package structure has changed from com.yahoo.zookeeper* to ___________ B
A apache.zookeeper B org.apache.zookeeper
C org.apache.zookeeper.pk D org.apache.zookeeper.package
114. A number of constants used in the client ZooKeeper API were renamed in order to reduce C
________ collision.
A value B counter
C namespace D Scalability
115. Which of the following guarantees is provided by ZooKeeper? D
A Interactivity B Flexibility
C Scalability D Reliability
116. ZooKeeper is especially fast in ___________ workloads. B
A write B read-dominant
C read-write D write-dominant
117. The underlying client-server protocol has changed in version _______ of ZooKeeper. C
A 6.0.0 B 4.0.0
C 3.0.0 D 2.0.0
118. ________ is rapidly being adopted for computing descriptive and query types of analytics C
on Big data.
A EDR B Azure
C Hadoop D InfoSight
119. _________ involves predicting a response with meaningful magnitude, such as quantity A
sold, stock price, or return on investment.
A Regression B Summarization
C Clustering D Classification
120. Which of the following involves predicting a categorical response? D
A Regression B Summarization
C Clustering D Classification
121. Which of the following contains pre-built predictive tools? C
A ssas B fossilx
C alteryx D paleoTS
122. The IBM _________ analytics appliances combine high-capacity storage for Big Data with C
a massively-parallel processing platform for high-performance computing.
A InfoSight B LityxEQ
C Watson D Netezza
123. ______ is an integrated hosted analytics platform for marketing insights, predictive models, B
and marketing optimization
A InfoSight B LityxEQ
C Watson D Netezza
124. Predictive analytics is a process that harnesses ____, often massive, data sets into models. D
A Storage B Network
C Homogeneous D Heterogeneous
125. Which of the following is the correct workflow of predictive analytics? A
A Import data → Clean the data → Develop a predictive model → Integrate the model
B Clean the data → Develop a predictive model → Import data → Integrate the model
C Clean the data → Develop a predictive model → Import data → Integrate the model
D Import data → Integrate the model → Clean the data → Develop a predictive model
126. Predictive analytics relies on capturing relationships between explanatory variables and the A
__
A Predicted variables B Descriptive variables
C Prescriptive variables D Pre-Prescriptive variables
127. In a simple linear regression model (one independent variable), if we change the input variable by 1 unit, by how much will the output variable change? D
A by 1 B no change
C by intercept D by its slope
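Note (worked formula, not part of the question set): in standard notation the simple linear regression model is

    \hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x

so increasing x by one unit changes the prediction by the slope, \hat{y}(x+1) - \hat{y}(x) = \hat{\beta}_1, and only two coefficients (the intercept and the slope) need to be estimated (see Q.132).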
128. In the syntax of the linear model lm(formula, data, ...), data refers to ______ B
A Matrix B Vector
C Array D List
129. ________ is an incredibly powerful tool for analyzing data. A
A Linear regression B Logistic regression
C Gradient Descent D Greedy algorithms
130. Predicting y for a value of x that's outside the range of values we actually saw for x in the original data is called ___________ B
A Regression B Extrapolation
C Interpolation D Polation
131. If a linear regression model fits perfectly, i.e., the train error is zero, then ______________ C
A Test error is also always zero B Test error is non-zero
C Cannot comment on the test error D Test error is equal to train error
132. How many coefficients do you need to estimate in a simple linear regression model (One B
independent variable)?
A 1 B 2
C 3 D 4
133. __________ is proprietary tool for predictive analytics. B
A R B SAS
C EDR D SSAS
134. Which of the following is preferred for text analytics? D
A R B S
C EDR D Python
135. ______ is the simplest class of analytics. A
A Descriptive B Predictive
C Prescriptive D Summarization
136. ______ regression method is also known as the ordinary least squares estimation. B
A Simple B Direct
C Indirect D Mutual
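Note (worked formula, not part of the question set): the ordinary least squares estimates referred to in Q.136 are, in standard notation,

    \hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \qquad
    \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}

i.e. the line that minimizes the sum of squared residuals.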
137. What do we do with a curvilinear relationship in linear regression? A
A ignore B consider
C may be considered D sometimes consider
138. Which one of the following statements about the correlation coefficient is correct? C
A The correlation coefficient is unaffected by scale changes.
B Both the change of scale and the change of origin have no effect on the correlation coefficient.
C The correlation coefficient is unaffected by the change of origin.
D The correlation coefficient is affected by changes of origin and scale.
139. Choose the least likely assumption of a classical normal linear regression model. D
A When outliers are present in the data series, correlation is a more reliable or relevant measure
B The independent variable and the dependent variable have a linear relationship.
C There is no randomness in the independent variable.
D The independent variable is normally distributed.
140. Although it may seem overly simplistic, _______ is extremely useful both conceptually A
and practically.
A Linear regression B Logistic regression
C Gradient Descent D Greedy algorithms
141. When there is more than one independent variable in the model, the linear model is termed as _______ C
A Unimodal B Multiple model
C Multiple Linear model D Multiple Logistic model
142. What is the short form of analysis of variance? D
A ANOV B AVA
C ANOV D ANOVA
143. ________ is a simple approach to supervised learning. It assumes that the dependence of Y A
on X1, X2, . . . Xp is linear
A Linear regression B Logistic regression
C Gradient Descent D Greedy algorithms
144. Which of the following is not an assumption for simple linear regression? A
A Multicollinearity B Constant variance
C Normally distributed variables D Normally distributed residuals
145. Continuous predictors influence the ______ of the regression line, while categorical C
predictors influence the _____________.
A R2, p-value B p-value, R2
C slope, intercept D intercept, slope
146. Multinomial logistic regression is used when A
A the outcome variable is nominal with three or more categories.
B the outcome variable is ordinal with three or more categories.
C at least one predictor is nominal with three or more categories.
D at least one predictor is ordinal with three or more categories.
147. The model significance tests for multinomial and ordinal regression use which of the B
following test statistics?
A F-statistics B Chi-squared statistics
C Percent correctly predicted D Odds ratios with 95% confidence intervals
148. Multiple linear regression (MLR) is a __________ type of statistical analysis. C
A univariate B bivariate
C multivariate D statistical
149. A term used to describe the case when the predictors in a multiple regression model are B
correlated is called:
A Polynomial B Multicollinearity
C Heteroscedasticity D Homoscedasticity
150. In regression analysis, the variable that is being predicted is B
A the independent variable B the dependent variable
C usually denoted by x D usually denoted by r
151. A regression analysis is inappropriate when: D
A you want to make predictions for one variable based on information about another variable.
B you have two variables that are measured on an interval or ratio scale.
C the pattern of data points forms a reasonably straight line.
D there is heteroscedasticity in the scatter plot.
152. Data visualization is related to A
A pictorial representation B numerical representation
C structural representation D static representation
153. Which of the following does not visualize data? C
A Charts B Maps
C Shapes D Graphs
154. Data _____________ refers to the graphical representation of data. A
A Visualization B Analysis
C Plotting D Handling
155. Which of the following is an information visualization technique? C
A Pie Chart B Line Chart
C Flow Chart D Bar Chart
156. Which of the following adds marginal sums to an existing table? B
A par() B prop.table()
C addmargins() D quantile()
157. Which of the following lists names of variables in a data.frame? A
A quantile() B names()
C barchart() D par()
158. One way to examine model fit for multinomial and ordinal regression is to compute C
A F-statistics. B chi-squared statistics.
C percent correctly predicted. D odds ratios with 95% confidence intervals.
159. Data visualization is also an element of the broader _____________. B
A deliver presentation architecture B data presentation architecture
C dataset presentation architecture D data process architecture
160. Which of the following finds the correlation matrix? A
A factor.model B col.max(x)
C stem D which.max(x)
161. __________ refers to the use of persistent brushing, followed by subsequent operations D
such as touring to compare the groups.
A Identification B Scaling
C Brushing D Painting
162. Which of the following can be used to change a plot's aspect ratio, revealing different data features? B
A Identification B Scaling
C Brushing D Painting
163. Data Visualization tools come with connectors to different data sources, including most A
common relational databases like ____________ and most of the cloud storage platforms.
A Hadoop B Internet
C Stem D Database
164. Which of the following is a pro of data visualization? D
A it can be distracting B it can't represent information
C it can misrepresent information D it can be accessed quickly by a wider audience
165. Which of the following lists names of variables in a data.frame A
A quantile() B names()
C barchart() D par()
166. Which of the following techniques is not used for data visualization? C
A Heat Maps B Bullet Graphs
C Fever Maps D Bubble Clouds
167. Data Visualization is also an element of the broader B
A data process architecture B data presentation architecture
C deliver presentation architecture D data programming process architecture
168. Which of the following factors influences data visualization choices? A
A Audience B Static
C Service D Charts
169. _________ allow you to distribute two or more data sets over a 2D or even 3D space to show the relationship between these sets and the parameters on the plot. A
A Plots B Maps
C Charts D Shapes
170. __________ is one of the advanced data visualization techniques that helps determine the correlation between multiple constantly updating (streaming) data sets. D
A Diagrams B Charts
C Shapes D Matrix
171. The most popular data visualization library in Python is _____ B
A matinfolib B matplotlib
C pip D matpiplib
172. Which of the following is a data visualization tool for non-developers? C
A D3.js B High Charts
C RAW D Fusion Charts
173. Which of the following is a data visualization tool for developers? D
A Datawrapper B RAW
C plotly D Chart.js
174. 70% of the interactive visualization adopters improve collaboration and ____________. A
A knowledge sharing. B underlying data.
C frequently. D findings.
175. Today, __________% of human communication is visual, which indicates that human eyes process images 60,000 times faster than text-based data. D
A 73 B 83
C 63 D 93