322. Convolutional neural networks in two dimensions
1. Classifying handwritten digits
1.1 Welcome (2:23)
1.2 Project overview (2:26)
1.3 The MNIST digits data set (2:17) (see the loading sketch after this section)
1.4 Overview of the convolutional neural network model (2:43)
1.5 Results from pre-trained model (4:26)
1.6 Examples of prediction successes and failures (6:02)
1.7 Why Cottonwood? (3:05)
1.8 Training code walkthrough: Setup (3:57)
1.9 Training code walkthrough: Adding layers (3:13)
1.10 Training code walkthrough: Connecting layers (3:33)
1.11 Training code walkthrough: Training loop (3:06)
1.12 Testing code walkthrough (5:10)
1.13 Reporting code walkthrough: Loss history and text summary (4:42)
1.14 Reporting code walkthrough: Collecting examples (5:44)
1.15 Reporting code walkthrough: Rendering examples (6:08)
1.16 Cottonwood tour: Core, experimental, data (5:27)
1.17 Cottonwood tour: Tests, cheatsheet (3:47)
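Lesson 1.3 introduces the MNIST digits. For readers who want to inspect the raw data before the code walkthroughs, here is a minimal NumPy sketch of reading the standard gzipped IDX files. It is not the course's Cottonwood data loader, and the local filenames are assumptions.

```python
import gzip
import struct

import numpy as np


def load_mnist_images(path):
    """Read an IDX image file into a (n_images, n_rows, n_cols) uint8 array."""
    with gzip.open(path, "rb") as f:
        magic, n_images, n_rows, n_cols = struct.unpack(">IIII", f.read(16))
        assert magic == 2051, "not an IDX image file"
        pixels = np.frombuffer(f.read(), dtype=np.uint8)
    return pixels.reshape(n_images, n_rows, n_cols)


def load_mnist_labels(path):
    """Read an IDX label file into a (n_labels,) uint8 array of digits 0-9."""
    with gzip.open(path, "rb") as f:
        magic, n_labels = struct.unpack(">II", f.read(8))
        assert magic == 2049, "not an IDX label file"
        return np.frombuffer(f.read(), dtype=np.uint8)


# Hypothetical local filenames; the course provides its own download steps.
images = load_mnist_images("train-images-idx3-ubyte.gz")
labels = load_mnist_labels("train-labels-idx1-ubyte.gz")
print(images.shape, labels.shape)  # (60000, 28, 28) (60000,)
```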
2. Convolution and a walking tour of the code
2.1 How two-dimensional convolution works (see the sketch after this section)
2.2 Code tour: 2D convolution (introduction) (3:18)
2.3 Code tour: 2D convolution initialization (8:36)
2.4 Code tour: 2D convolution forward pass (10:52)
2.5 Code tour: 2D convolution backward pass (7:21)
2.6 Code tour: Bias layers (2:57)
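Lessons 2.1 through 2.5 explain two-dimensional convolution and tour its Cottonwood implementation. As a rough reference for what the forward pass computes, here is a minimal single-channel NumPy sketch of "valid" convolution, implemented as cross-correlation in the usual deep learning convention. It is not the course's code: Cottonwood's layer also handles multiple channels, initialization, and the backward pass.

```python
import numpy as np


def conv2d(image, kernel):
    """'Valid' 2D convolution (cross-correlation) of one image with one kernel."""
    i_rows, i_cols = image.shape
    k_rows, k_cols = kernel.shape
    o_rows, o_cols = i_rows - k_rows + 1, i_cols - k_cols + 1
    output = np.zeros((o_rows, o_cols))
    for r in range(o_rows):
        for c in range(o_cols):
            # Each output value is the elementwise product of the kernel
            # and the image patch it covers, summed.
            patch = image[r:r + k_rows, c:c + k_cols]
            output[r, c] = np.sum(patch * kernel)
    return output


image = np.random.rand(28, 28)
kernel = np.random.rand(3, 3)
print(conv2d(image, kernel).shape)  # (26, 26)
```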
3. Classifying CIFAR-10 images
3.1 About the CIFAR-10 image classification data set (2:47)
3.2 Get the data and the code (3:09)
3.3 Train, test, and evaluate the model (5:11)
3.4 Visualizing convolution layers and kernels (3:13)
3.5 Model structure and training curve (2:19)
3.6 Model creation and training (15:22)
3.7 Model testing (2:30)
3.8 Training curve (4:00)
3.9 Reporting script (5:00)
3.10 Reports (5:05)
3.11 Convolution reports (6:48)
3.12 Example images, correctly and incorrectly classified (10:12)
3.13 Get the data into the model (8:36)
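Lessons 3.2 and 3.13 cover getting the CIFAR-10 data and feeding it to the model. As a rough companion, here is a minimal sketch of reading one batch of the standard Python-pickle distribution of CIFAR-10. The local path is an assumption, and this is not the course's Cottonwood data block.

```python
import pickle

import numpy as np


def load_cifar10_batch(path):
    """Read one pickled CIFAR-10 batch into images and labels."""
    with open(path, "rb") as f:
        batch = pickle.load(f, encoding="bytes")
    # Each row of b"data" is 3072 bytes: 1024 red, 1024 green, then
    # 1024 blue values, each a row-major 32 x 32 plane.
    images = batch[b"data"].reshape(-1, 3, 32, 32)
    labels = np.array(batch[b"labels"])
    return images, labels


# Hypothetical local path; the "Get the data" lesson covers the download.
images, labels = load_cifar10_batch("cifar-10-batches-py/data_batch_1")
print(images.shape, labels.shape)  # (10000, 3, 32, 32) (10000,)
```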
4. Neural network components
4.1 Code tour: Regularization (9:07)
4.2 Code tour: Max pooling (10:16)
4.3 SoftMax (see the sketch after this section)
4.4 Code tour: SoftMax (3:52)
4.5 Batch normalization
4.6 Code tour: Online normalization (8:01)
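Lessons 4.2 through 4.4 tour the max pooling and SoftMax code. The sketch below shows both operations in plain NumPy as a quick reference for what those tours implement; it is not the course's implementation, and it assumes a single 1D logit vector and a 2 x 2 pool with stride 2 on even-sized feature maps.

```python
import numpy as np


def softmax(x):
    """Numerically stable softmax: shift by the max so exp() cannot overflow."""
    exps = np.exp(x - np.max(x))
    return exps / np.sum(exps)


def max_pool_2x2(feature_map):
    """2 x 2 max pooling with stride 2, assuming both dimensions are even."""
    rows, cols = feature_map.shape
    blocks = feature_map.reshape(rows // 2, 2, cols // 2, 2)
    return blocks.max(axis=(1, 3))


print(softmax(np.array([2.0, 1.0, 0.1])))            # probabilities summing to 1
print(max_pool_2x2(np.arange(16.0).reshape(4, 4)))   # (2, 2) pooled result
```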