Exercise 9. Add linear forward propagation.

We're going to build up the neural network's functionality gradually. In this exercise, we keep everything linear, meaning we stick to multiplications and additions. It might be helpful to review this portion of the video.

As a hint, our layer weights are represented as a two-dimensional array - a matrix. It's very convenient to do matrix multiplication on the inputs to get the outputs. If you haven't had a linear algebra course before, not to worry. The first three lectures in Prof. Gilbert Strang's video series will be plenty to get you started here. If you ever decide you want to go deeper, here is a set of recommendations for top-notch online resources.
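To see why a matrix multiplication is convenient here, consider a small example. This is a hedged sketch, assuming NumPy and a made-up 3-input, 2-output layer; the weight values are arbitrary illustrations, not anything from the course code.

```python
import numpy as np

# Hypothetical layer: 3 inputs, 2 outputs.
# Each column of the weight matrix feeds one output node.
weights = np.array([
    [0.5, -1.0],
    [0.2,  0.3],
    [-0.4, 0.8],
])
inputs = np.array([1.0, 2.0, 3.0])

# A single matrix multiplication computes every output's
# weighted sum of the inputs at once.
outputs = inputs @ weights
print(outputs)  # close to [-0.3, 2.0]
```

Each output is a dot product of the input vector with one column of weights, which is exactly the "linear combination" the exercise asks for.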

Coding challenge

  • In layer.py, add a forward_prop() method to the Dense class. It should
    1. take in an array of inputs,
    2. concatenate a bias value of 1 to the array,
    3. linearly combine the inputs using a 2D array of weights, and
    4. return the result as an array of outputs.
    We're not worrying about adding any nonlinearity yet. That will come later.
  • In framework.py, add a forward_prop() method to the ANN class. It should take in a set of inputs and run them through its model. For now, it's OK to assume the model has just a single layer. We'll add in multiple layers later.
  • Add calls to forward_prop() to both the train() and evaluate() methods in ANN. Keep a print() line in the code as a cheap test.
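The steps above can be sketched roughly as follows. This is a minimal, hypothetical version, assuming NumPy and guessing at constructor signatures; your Dense and ANN classes in layer.py and framework.py will differ in their details.

```python
import numpy as np

class Dense:
    """Hypothetical minimal linear layer; details will differ from the course code."""
    def __init__(self, n_inputs, n_outputs):
        # One extra weight row handles the bias term.
        self.weights = np.random.uniform(-1, 1, size=(n_inputs + 1, n_outputs))

    def forward_prop(self, inputs):
        # 1. Take in an array of inputs.
        # 2. Concatenate a bias value of 1 to the array.
        inputs_with_bias = np.concatenate((inputs, np.ones(1)))
        # 3. Linearly combine the inputs using the 2D weight array.
        # 4. Return the result as an array of outputs.
        return inputs_with_bias @ self.weights

class ANN:
    """Assumes a single-layer model for now, as the exercise allows."""
    def __init__(self, model):
        self.model = model  # a single Dense layer

    def forward_prop(self, inputs):
        return self.model.forward_prop(inputs)

    def train(self, inputs):
        outputs = self.forward_prop(inputs)
        print("train outputs:", outputs)  # cheap test
        return outputs

    def evaluate(self, inputs):
        outputs = self.forward_prop(inputs)
        print("evaluate outputs:", outputs)  # cheap test
        return outputs

# Usage: three inputs in, two outputs back.
ann = ANN(Dense(n_inputs=3, n_outputs=2))
result = ann.forward_prop(np.array([0.2, -0.1, 0.7]))
```

The print() calls in train() and evaluate() are the cheap test mentioned above: they let you eyeball that arrays of the expected shape flow through before any real training logic exists.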

My solution

Here is all the code we've written up to this point.
