Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization (Week 3) Quiz
Akshay Daga (APDaga) · January 17, 2020 · Artificial Intelligence, Deep Learning, Machine Learning, Q&A
▸ Hyperparameter tuning, Batch Normalization, Programming Frameworks:
Improving Deep Neural Networks Week-3 (MCQ)
1. If searching among a large number of hyperparameters, you should try values in a grid rather than random values, so that you can carry out the search more systematically and not rely on chance. True or False?
True
False
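A quick illustration of why random sampling is usually recommended over a grid here: with 25 random trials you test 25 distinct values of each hyperparameter, while a 5×5 grid tests only 5. A minimal numpy sketch (the variable names and ranges are my own, not from the quiz):

import numpy as np

np.random.seed(0)

# 25 random trials try 25 distinct values of EACH hyperparameter,
# while a 5x5 grid would try only 5 distinct values of each.
for _ in range(25):
    r = -4 * np.random.rand()                  # r in (-4, 0]
    learning_rate = 10 ** r                    # log-scale sample in (1e-4, 1]
    hidden_units = np.random.randint(50, 101)  # uniform integer in 50..100
    print(f"lr={learning_rate:.5f}, hidden_units={hidden_units}")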
2. Every hyperparameter, if set poorly, can have a huge negative impact on training, and so all
hyperparameters are about equally important to tune well. True or False?
True
False
Yes. We’ve seen in lecture that some hyperparameters, such as the learning rate, are more critical than others.
3. During hyperparameter search, whether you try to babysit one model (“Panda” strategy) or train a lot of models in parallel (“Caviar”) is largely determined by:
Whether you use batch or mini-batch optimization
The presence of local minima (and saddle points) in your neural network
The amount of computational power you can access
The number of hyperparameters you have to tune
4. If you think β (hyperparameter for momentum) is between 0.9 and 0.99, which of the following is the recommended way to sample a value for beta?

r = np.random.rand()
beta = r*0.09 + 0.9

r = np.random.rand()
beta = 1-10**(- r - 1)

r = np.random.rand()
beta = 1-10**(- r + 1)

r = np.random.rand()
beta = r*0.9 + 0.09
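The second option above samples β on a log scale for 1 − β: with r uniform in [0, 1), the exponent −r − 1 falls in (−2, −1], so 1 − β = 10^(−r−1) ∈ (0.01, 0.1] and β ∈ [0.9, 0.99). A quick numpy check (my own sketch, not part of the quiz):

import numpy as np

np.random.seed(1)
r = np.random.rand(100000)     # uniform samples in [0, 1)
beta = 1 - 10 ** (-r - 1)      # log-scale sampling of 1 - beta

print(beta.min(), beta.max())  # ~0.9000 and ~0.9900: beta stays in [0.9, 0.99)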
5. Finding good hyperparameter values is very time-consuming. So typically you should do it once at the start of the project, and try to find very good hyperparameters so that you don’t ever have to revisit tuning them again. True or false?
True
False
6. In batch normalization as presented in the videos, if you apply it on the lth layer of your neural network, what are you normalizing?
[The answer choices here are formulas rendered as images; the quantity being normalized is z^[l], the pre-activation values of layer l.]
7. In the normalization formula z_norm^(i) = (z^(i) − μ) / √(σ² + ε), why do we use epsilon?
To speed up convergence
In case μ is too small
To have a more accurate normalization
To avoid division by zero
8. Which of the following statements about γ and β in Batch Norm are true?
β and γ are hyperparameters of the algorithm, which we tune via random
sampling.
They set the mean and variance of the linear variable z^[l] of a given layer.
They can be learned using Adam, Gradient descent with momentum, or
RMSprop, not just with gradient descent.
The optimal values are γ = √(σ² + ε), and β = μ.
There is one global value of γ ∈ ℝ and one global value of β ∈ ℝ for each layer, and it applies to all the hidden units in that layer.
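For questions 6–8, the Batch Norm forward pass at layer l can be written in a few lines. This minimal numpy sketch (function and variable names are mine, not from the course) shows what is normalized (z^[l]), why ε appears, and how γ and β set the variance and mean per hidden unit:

import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    # Z: pre-activations z^[l], shape (units_in_layer_l, batch_size)
    mu = Z.mean(axis=1, keepdims=True)      # per-unit mean over the mini-batch
    var = Z.var(axis=1, keepdims=True)      # per-unit variance over the mini-batch
    Z_norm = (Z - mu) / np.sqrt(var + eps)  # eps guards against division by zero
    return gamma * Z_norm + beta            # gamma, beta set the variance and mean

# gamma and beta are learned per hidden unit (not one global value per layer):
Z = np.random.randn(4, 256)                 # layer with 4 units, mini-batch of 256
gamma, beta = np.ones((4, 1)), np.zeros((4, 1))
Z_tilde = batchnorm_forward(Z, gamma, beta)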
9. After training a neural network with Batch Norm, at test time, to evaluate the neural network
on a new example you should:
Skip the step where you normalize using μ and σ², since a single test example cannot be normalized.
If you implemented Batch Norm on mini-batches of (say) 256 examples, then to
evaluate on one test example, duplicate that example 256 times so that you’re
working with a mini-batch the same size as during training.
Use the most recent mini-batch’s value of μ and σ² to perform the needed normalizations.
Perform the needed normalizations, use μ and σ² estimated using an exponentially weighted average across mini-batches seen during training.
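The exponentially weighted averages mentioned in the last option are simply tracked during training and reused at test time. A rough sketch of that bookkeeping (the function names and the decay value 0.9 are my own, illustrative choices):

import numpy as np

decay = 0.9                    # illustrative decay rate for the running averages
running_mu, running_var = None, None

def track_stats(mu, var):
    # call once per training mini-batch with that batch's mu and sigma^2
    global running_mu, running_var
    if running_mu is None:
        running_mu, running_var = mu, var
    else:
        running_mu = decay * running_mu + (1 - decay) * mu
        running_var = decay * running_var + (1 - decay) * var

def batchnorm_test(z, gamma, beta, eps=1e-8):
    # normalize a single test example with the training-time averages
    z_norm = (z - running_mu) / np.sqrt(running_var + eps)
    return gamma * z_norm + beta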
10. Which of these statements about deep learning programming frameworks are true?
(Check all that apply)
A programming framework allows you to code up deep learning
algorithms with typically fewer lines of code than a lower-level language such
as Python.
Deep learning programming frameworks require cloud-based machines to run.
Even if a project is currently open source, good governance of the project helps ensure that it remains open even in the long term, rather than become closed or modified to benefit only one company.
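As an illustration of the “fewer lines of code” point: minimizing J(w) = w² − 10w + 25 by gradient descent takes only a few lines in a framework, because the gradients are derived automatically. A sketch assuming TensorFlow 2.x:

import tensorflow as tf

w = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

for _ in range(100):
    with tf.GradientTape() as tape:
        cost = w ** 2 - 10 * w + 25         # J(w) = (w - 5)^2
    grads = tape.gradient(cost, [w])        # the framework computes the gradient
    optimizer.apply_gradients(zip(grads, [w]))

print(w.numpy())  # approaches 5.0, the minimizer of J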
Click here to see solutions for all Machine Learning Coursera Assignments.
&
Click here to see more codes for Raspberry Pi 3 and similar Family.
&
Click here to see more codes for NodeMCU ESP8266 and similar Family.
&
Click here to see more codes for Arduino Mega (ATMega 2560) and similar Family.
Feel free to ask doubts in the comment section. I will try my best to answer them.
If you find this helpful by any means, like, comment, and share the post.
This is the simplest way to encourage me to keep doing such work.
Thanks & Regards,
- APDaga DumpBox