Course 314 on neural network hyperparameter optimization is coming along smoothly. The first two sections are complete, covering single- and multi-parameter optimization. The third section, covering Evolutionary Powell's method, is nearly complete as well. It's an experimental method, published here for the first time, and it shows how our optimizer can be extended to try out new ideas.

The next few sections will cover how to make our code fast and efficient, and then we'll use it to choose a solid set of hyperparameters for our autoencoder. Before we're done, we'll refine our task so that the error measure reflects our goal more accurately: to compress images of the surface of Mars as much as we can. We'll round out the course by using our hyperparameter optimization code to help us modify our autoencoder compression system and build the strongest compressor we can. Will it outperform JPEG? I have no idea. I'm pretty curious to find out.
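For anyone curious how a comparison like that might be scored, here is a minimal sketch (not taken from the course materials) of one common way to baseline against JPEG: round-trip an image through an in-memory JPEG encode, then record the bytes used and the pixel-wise reconstruction error. An autoencoder's compressed code and reconstruction could be measured with the same two numbers. The filename is hypothetical.

```python
# Illustrative sketch only: a JPEG baseline measured by bytes used and
# pixel-wise MSE. A learned compressor would be judged the same way.
import io

import numpy as np
from PIL import Image


def jpeg_roundtrip(image, quality=50):
    """JPEG-compress a PIL image in memory; return (reconstruction, bytes used)."""
    buffer = io.BytesIO()
    image.save(buffer, format="JPEG", quality=quality)
    n_bytes = buffer.tell()
    buffer.seek(0)
    return Image.open(buffer).convert(image.mode), n_bytes


def mean_squared_error(a, b):
    """Pixel-wise mean squared error between two same-sized images."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return float(np.mean((a - b) ** 2))


if __name__ == "__main__":
    # Hypothetical input file, used here only to make the sketch runnable.
    original = Image.open("mars_surface.png").convert("L")

    reconstruction, n_bytes = jpeg_roundtrip(original, quality=50)
    mse = mean_squared_error(original, reconstruction)
    print(f"JPEG baseline: {n_bytes} bytes, MSE = {mse:.2f}")
```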

I'm not moving the expected release date (March 1) yet, but I'm feeling better and better about it coming in ahead of schedule.

Brandon