Section 2 of Course 313 covers dropout in neural networks. It's a powerful tool for improving how well your neural network generalizes, and it can even speed up computation.
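To give a flavor of the idea: dropout randomly zeroes out a fraction of a layer's activations during training, which forces the network not to lean too heavily on any single unit. Here is a minimal sketch of the common "inverted dropout" variant in NumPy. The function name, the 20% drop rate, and the rescaling-by-keep-probability choice are illustrative assumptions on my part, not necessarily how the course or Cottonwood implements it:

```python
import numpy as np

def dropout(activations, drop_rate=0.2, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, zero out roughly `drop_rate` of the activations
    and rescale the survivors by 1 / keep_prob so the expected value
    of each activation is unchanged. At inference time, pass the
    activations through untouched.
    """
    if not training or drop_rate == 0.0:
        return activations
    rng = np.random.default_rng() if rng is None else rng
    keep_prob = 1.0 - drop_rate
    # Boolean mask: True where the activation is kept.
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob
```

The rescaling step is what lets you use the same forward pass at inference time without any correction factor.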

I'm really pleased with how smoothly the material on advanced neural network methods is coming together. It's particularly fun to be assembling it into the Cottonwood framework, building out the toolset that we'll be using for the next several courses in this series.

Happy Building!