In Our Last Episode...

In Course 312 we built the foundation of a neural network framework from scratch. We developed an autoencoder from fully connected layers, trained with backpropagation and stochastic gradient descent.

In Course 313 we evolved it into a full-featured neural network by adding regularization, optimizers, and initializers. We also introduced the computation graph as a way to describe what the network does.

And in Course 311 we built a custom visualization, both to help us debug the network during construction and to help us better understand its results.

The result of all this has been the Cottonwood machine learning framework, a lightweight tool for experimenting with neural networks and machine learning algorithms.

Now, in Course 314, we will use Cottonwood and the autoencoder we built to do something useful. We will build a neural network-based image compression algorithm suitable for sending images under severe bandwidth constraints, like from Mars to Earth. Along the way, we will make use of hyperparameter optimization techniques, the tricks and tools for getting the very best performance out of a neural network.
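To preview the core idea, an autoencoder compresses an image patch by squeezing it through a narrow bottleneck layer: only the bottleneck activations need to be transmitted, and the decoder reconstructs the patch on the other end. Here is a minimal sketch of that idea in plain NumPy. This is an illustration only, not Cottonwood's actual API; the data, layer sizes, and variable names (`w_enc`, `w_dec`, and so on) are all made up for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 flattened 8x8 "image patches" with correlated pixels,
# standing in for real image data (illustrative only).
basis = rng.normal(size=(8, 64))
patches = rng.normal(size=(200, 8)) @ basis
patches /= np.abs(patches).max()

n_in, n_hid = 64, 8  # an 8-value bottleneck: 8x compression of a 64-pixel patch
w_enc = rng.normal(scale=0.1, size=(n_in, n_hid))
w_dec = rng.normal(scale=0.1, size=(n_hid, n_in))

def forward(x):
    code = np.tanh(x @ w_enc)  # encoder: compress the patch to the bottleneck
    recon = code @ w_dec       # decoder: reconstruct the patch from the code
    return code, recon

_, recon = forward(patches)
mse_before = np.mean((recon - patches) ** 2)

# Train with backpropagation and plain gradient descent on MSE loss.
lr = 0.01
for epoch in range(300):
    code, recon = forward(patches)
    err = recon - patches                      # reconstruction error
    grad_dec = code.T @ err / len(patches)     # gradient for the decoder weights
    grad_code = err @ w_dec.T * (1 - code**2)  # backprop through tanh
    grad_enc = patches.T @ grad_code / len(patches)
    w_dec -= lr * grad_dec
    w_enc -= lr * grad_enc

code, recon = forward(patches)
mse_after = np.mean((recon - patches) ** 2)
```

In a real pipeline, only `code` (8 values per patch) would cross the bandwidth-limited link instead of the 64 original pixels; the hyperparameters here (bottleneck size, learning rate, epochs) are exactly the kind of knobs the optimization techniques in this course are meant to tune.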
