With all that done, let's walk through the feedforward function, line by line.
First, recall from the section, Emulating a neural network, that we can define a neural network as follows:
func affine(weights [][]float64, inputs []float64) []float64 { return activation(matVecMul(weights, inputs)) }
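To make this concrete, here is a minimal plain-slice sketch of affine and its helpers. The bodies of matVecMul and activation are assumptions filled in for illustration (the original only names them); sigmoid stands in for whatever activation the network uses:

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid is the activation function assumed here.
func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }

// matVecMul multiplies a weight matrix by an input vector:
// out[i] = sum_j weights[i][j] * inputs[j].
func matVecMul(weights [][]float64, inputs []float64) []float64 {
	out := make([]float64, len(weights))
	for i, row := range weights {
		for j, w := range row {
			out[i] += w * inputs[j]
		}
	}
	return out
}

// activation applies sigmoid element-wise.
func activation(xs []float64) []float64 {
	out := make([]float64, len(xs))
	for i, x := range xs {
		out[i] = sigmoid(x)
	}
	return out
}

// affine computes one layer: activation(W·x).
func affine(weights [][]float64, inputs []float64) []float64 {
	return activation(matVecMul(weights, inputs))
}

func main() {
	w := [][]float64{{1, 0}, {0, 1}}
	x := []float64{0, 0}
	fmt.Println(affine(w, x)) // sigmoid(0) = 0.5 for each element
}
```

Each hidden layer is therefore just one matrix-vector multiplication followed by one element-wise activation, which is exactly the pattern the tensor code below reproduces.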
We do the first matrix multiplication as part of calculating the first hidden layer: hidden := m.do(func() (tensor.Tensor, error) { return nn.hidden.MatVecMul(a) }). MatVecMul is used because we're multiplying a matrix by a vector.
Then we perform the second part of calculating a layer: act0 := m.do(func() (tensor.Tensor, error) { return hidden.Apply(sigmoid, tensor.UseUnsafe()) }). Once again, the tensor.UseUnsafe() function option is passed so that the operation reuses the tensor's underlying memory instead of allocating a new tensor.
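The two steps above can be sketched together with plain slices, without the Gorgonia tensor API. This is an assumption-laden illustration, not the book's code: matVecMul and predict are hypothetical helpers, and applyInPlace mimics what Apply(sigmoid, tensor.UseUnsafe()) does by overwriting its input rather than allocating a new slice:

```go
package main

import (
	"fmt"
	"math"
)

func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }

// matVecMul multiplies a weight matrix by an input vector.
func matVecMul(weights [][]float64, inputs []float64) []float64 {
	out := make([]float64, len(weights))
	for i, row := range weights {
		for j, w := range row {
			out[i] += w * inputs[j]
		}
	}
	return out
}

// applyInPlace mirrors the UseUnsafe idea: it overwrites xs in place
// instead of allocating a fresh slice for the result.
func applyInPlace(f func(float64) float64, xs []float64) []float64 {
	for i, x := range xs {
		xs[i] = f(x)
	}
	return xs
}

// predict chains the two layer computations the same way the
// feedforward function does: hidden, then activation, per layer.
func predict(hiddenW, finalW [][]float64, a []float64) []float64 {
	act0 := applyInPlace(sigmoid, matVecMul(hiddenW, a)) // hidden layer
	return applyInPlace(sigmoid, matVecMul(finalW, act0)) // output layer
}

func main() {
	// All-zero weights make every pre-activation 0, so every
	// activation is sigmoid(0) = 0.5.
	hiddenW := [][]float64{{0, 0}, {0, 0}}
	finalW := [][]float64{{0, 0}}
	fmt.Println(predict(hiddenW, finalW, []float64{1, 2})) // [0.5]
}
```

The in-place application matters for performance: each call avoids one allocation per layer, which adds up when predicting over many inputs.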