313. Advanced Neural Network Methods
Introduction
Get Started
A Roadmap
In Our Last Episode...
1 Regularization
1.0 How Regularization Works
1.1 Constructing a Network with Regularization (7:14)
1.2 Executing Regularization during Stochastic Gradient Descent (3:35)
1.3 L1 Regularization (LASSO) (4:45)
1.4 L2 Regularization (Ridge or Tikhonov) (2:15)
1.5 Regularization in Action (2:55)
1.6 Custom Regularizers (3:25)
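Lessons 1.3 and 1.4 both modify the same step: a penalty term is added to the loss gradient before each weight update. Here is a minimal NumPy sketch of that idea; the function name and hyperparameter values are illustrative, not the course's own API:

```python
import numpy as np

def sgd_step_with_regularization(weights, grad, lr=0.01, l1=0.0, l2=0.0):
    """One SGD update with optional L1 (LASSO) and L2 (ridge) penalties.
    L1 adds l1 * sign(w) to the gradient; L2 adds l2 * w."""
    grad = grad + l1 * np.sign(weights) + l2 * weights
    return weights - lr * grad

# With no data gradient, the L2 penalty alone shrinks weights toward zero.
w = np.array([[0.5, -0.3], [0.1, 0.8]])
for _ in range(100):
    w = sgd_step_with_regularization(w, np.zeros_like(w), lr=0.1, l2=0.5)
print(w)  # all entries decayed close to zero
```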
2 Dropout
2.0 How Dropout Works (6:26)
2.1 Set Dropout Rates During Layer Creation (2:09)
2.2 Drop Out Nodes During Training (4:48)
2.3 Dropout in Action (1:56)
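As a preview of how lessons 2.1 and 2.2 fit together, here is a sketch of dropout in NumPy. The function name and the rescale-by-1/(1 − rate) "inverted dropout" convention are assumptions on my part, not necessarily the course's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, rate=0.5, training=True):
    """Zero each activation with probability `rate` during training and
    rescale the survivors by 1 / (1 - rate), so evaluation needs no
    extra correction."""
    if not training or rate == 0.0:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

activations = np.ones((2, 8))
print(dropout_forward(activations, rate=0.5))        # roughly half the entries zeroed
print(dropout_forward(activations, training=False))  # unchanged at evaluation time
```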
3 Custom Layers and Skip-Layers
3.0 What is a Computation Graph and Who Cares? (9:10)
3.1 Computation Graph for Custom Layers (3:48)
3.2 Build a Model with Custom Layers (3:05)
3.3 Run a Model with a Branching Computation Graph (4:28)
3.4 Build a Generic Layer (3:04)
3.5 Update the Dense Layer (3:16)
3.6 Build a Normalization Layer (4:02)
3.7 Build a Difference-Calculating Layer (3:40)
3.8 Test Autoencoder Performance (4:04)
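Section 3 builds toward a small framework in which every node in the computation graph shares one interface. A sketch of that pattern follows; class and method names here are illustrative, not the course's actual base class:

```python
import numpy as np

class Layer:
    """One node in a computation graph: a forward pass that computes an
    output and a backward pass that routes gradients to its inputs."""
    def forward_pass(self, *inputs):
        raise NotImplementedError
    def back_pass(self, de_dy):
        raise NotImplementedError

class Dense(Layer):
    """Fully connected layer, the workhorse node."""
    def __init__(self, n_in, n_out):
        self.weights = np.random.default_rng(0).normal(size=(n_in, n_out))
    def forward_pass(self, x):
        self.x = x
        return x @ self.weights
    def back_pass(self, de_dy):
        self.de_dw = self.x.T @ de_dy   # gradient for this layer's weights
        return de_dy @ self.weights.T   # gradient passed back to the input

class Difference(Layer):
    """A branching (skip-layer) node: compares two earlier results, e.g.
    an autoencoder's reconstruction y_hat against its original input x."""
    def forward_pass(self, x, y_hat):
        return y_hat - x
    def back_pass(self, de_dy):
        return -de_dy, de_dy  # gradients flow back along both branches
```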
4 Optimizers
4.0 Separate the Optimizer from the Regularizers (5:43)
4.1 Separate the Regularizers into Pre- and Post-Optimizer (2:56)
4.2 Stochastic Gradient Descent (SGD) (6:30)
4.3 Momentum (4:53)
4.4 Adam (5:03)
4.5 Experimental Optimizers (2:24)
4.6 Optimizer Comparison (3:53)
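For reference while watching lessons 4.2 through 4.4, these are the three update rules in NumPy form. Function names are mine; the Adam defaults follow the original paper:

```python
import numpy as np

def sgd_update(w, grad, lr=0.01):
    """Plain stochastic gradient descent: step against the gradient."""
    return w - lr * grad

def momentum_update(w, grad, velocity, lr=0.01, beta=0.9):
    """Momentum: a running velocity lets steps build up along directions
    where successive gradients agree."""
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

def adam_update(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum on the gradient (m) plus a per-parameter step size
    from the running squared gradient (v), with bias correction.
    `t` is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```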
5 Initialization
5.0 Glorot Initialization (5:52)
5.1 He Initialization (1:52)
5.2 Custom Initializer (2:16)
5.3 Initializer Comparison (3:32)
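Lessons 5.0 and 5.1 come down to two scaling rules for the initial weights. A short sketch, with illustrative function names:

```python
import numpy as np

rng = np.random.default_rng(0)

def glorot_uniform(n_in, n_out):
    """Glorot (Xavier): uniform in +/- sqrt(6 / (n_in + n_out)), chosen to
    keep activation variance roughly constant through tanh/sigmoid layers."""
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

def he_normal(n_in, n_out):
    """He: normal with standard deviation sqrt(2 / n_in), the analogous
    choice for ReLU layers, which zero out half their inputs."""
    return rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
```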
6 Martian Data
6.0 Reporting (9:07)
6.1 Enable One-Line Download and Install (6:31)
6.2 Get Images of Mars (3:33)
6.3 Build Martian Images Autoencoder Example (8:35)
6.4 Run Autoencoder on Martian Images (5:53)
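Before the autoencoder in lessons 6.3 and 6.4 can train, each image has to be turned into fixed-length example vectors. One common way to do that is sketched below, under the assumption of grayscale images scaled to [0, 1]; the patch size and function name are made up for illustration:

```python
import numpy as np

def patchify(image, patch_size=11):
    """Cut a grayscale image (2-D array, values 0-1) into non-overlapping
    square patches, flattening each patch into one training vector."""
    rows, cols = image.shape
    patches = []
    for r in range(0, rows - patch_size + 1, patch_size):
        for c in range(0, cols - patch_size + 1, patch_size):
            patches.append(image[r:r + patch_size, c:c + patch_size].ravel())
    return np.array(patches)

# Example with a random stand-in for a Martian image.
fake_image = np.random.default_rng(0).random((110, 110))
print(patchify(fake_image).shape)  # (100, 121): 100 patches of 121 pixels each
```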
3.0 What is a Computation Graph and Who Cares?
These are the optimization and backpropagation links mentioned in the video.
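For a concrete sense of what the video means by a computation graph, here is a two-node example: record each step of the forward pass, then walk the same steps in reverse, applying the chain rule at each node. The specific functions are illustrative:

```python
import numpy as np

x = np.array([1.0, 2.0])
w = np.array([[0.1, 0.2], [0.3, 0.4]])

# Forward pass: traverse the graph from inputs toward the loss.
h = x @ w                      # node 1: linear transform
loss = 0.5 * np.sum(h ** 2)    # node 2: squared-magnitude loss

# Backward pass: traverse the same graph in reverse.
dloss_dh = h                   # d(0.5 * h^2)/dh = h
dloss_dw = np.outer(x, dloss_dh)  # chain rule through node 1
print(loss, dloss_dw)
```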