Our theano.function
call looks a bit different than in our first example. Yeah, we have this additional updates
parameter. updates
allows us to update our shared variables according to an expression. updates
expects a list of 2-tuples of the form (shared_variable, update_expression).
The second part of each tuple can be an expression or function that returns the new value we want to update the first part to. In our case, we have two shared variables we want to update, theta1
and theta2, and we want to use our grad_desc
function to give us the updated weight values. Of course our grad_desc
function expects two arguments, a cost expression and a weight matrix, so we pass those in; fc
is our cost expression. So every time we call the cost
function that we’ve compiled with Theano, it will also update our shared variables according to our grad_desc
rule. Pretty convenient!
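To make that concrete, here's a minimal sketch of what the compiled cost function might look like. The layer sizes, learning rate, and the layer/grad_desc helpers are illustrative assumptions rather than the exact code above; the important part is the updates list of (shared variable, new value) pairs.

```python
import numpy as np
import theano
import theano.tensor as T

# Symbolic inputs: one training example and its target value.
x = T.dvector('x')
y = T.dscalar('y')

def layer(inp, w):
    # Append a bias unit to the input, then apply a sigmoid layer.
    b = np.array([1], dtype='float64')
    new_inp = T.concatenate([inp, b])
    return T.nnet.sigmoid(T.dot(w.T, new_inp))

def grad_desc(cost, theta):
    # One gradient-descent step: theta <- theta - alpha * dCost/dTheta.
    alpha = 0.1  # learning rate (illustrative)
    return theta - (alpha * T.grad(cost, wrt=theta))

# Shared weight matrices; these persist across calls and are what `updates` modifies.
theta1 = theano.shared(np.array(np.random.rand(3, 3), dtype='float64'))
theta2 = theano.shared(np.array(np.random.rand(4, 1), dtype='float64'))

hid1 = layer(x, theta1)            # hidden layer activations
out1 = T.sum(layer(hid1, theta2))  # network output (a scalar)
fc = (out1 - y) ** 2               # squared-error cost expression

# Each tuple in `updates` is (shared_variable, expression_for_its_new_value).
cost = theano.function(
    inputs=[x, y],
    outputs=fc,
    updates=[(theta1, grad_desc(fc, theta1)),
             (theta2, grad_desc(fc, theta2))],
)
```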
Additionally, we’ve compiled a run_forward
function just so we can run the network forward and make sure it has trained properly. We don’t need to update anything there.
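Continuing the sketch above, run_forward would be compiled without an updates parameter, so calling it only evaluates the network and never touches theta1 or theta2:

```python
# Forward pass only: no `updates`, so the shared weights are left untouched.
run_forward = theano.function(inputs=[x], outputs=out1)
```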
Now let’s define our training data and set up a for
loop to iterate through our training epochs.
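As an illustration, the training data and epoch loop might look like the following, continuing the sketch above; the XOR-style toy dataset and the epoch count are assumptions, not necessarily the article's exact values.

```python
# Toy XOR-style dataset (illustrative).
inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype='float64')
exp_y = np.array([0, 1, 1, 0], dtype='float64')

for epoch in range(10000):                    # training epochs
    for k in range(len(inputs)):
        cur_cost = cost(inputs[k], exp_y[k])  # evaluates the cost AND applies the updates
    if epoch % 500 == 0:
        print('Cost at epoch %d: %s' % (epoch, cur_cost))

# Sanity check: run the trained network forward on each input.
for example in inputs:
    print(example, run_forward(example))
```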