Math for Neural Networks and Calculus Fundamentals via Brilliant.org

A little over a month ago, Simon picked up neural networks again (something he had tried a while ago but couldn’t grasp intuitively). He started the Artificial Neural Networks course on Brilliant.org, covered vectors, matrices, optimisation, perceptrons and multilayer perceptrons fairly quickly, and even built his first perceptron in Python from scratch (I will publish a video about this project shortly). As soon as he reached the chapter on Backpropagation, however, he realised his current knowledge of Calculus wasn’t enough. So Simon, completely on his own, decided to get back to studying Calculus (something he had lost interest in last year). After gulping down several chapters of the Calculus Fundamentals course, Simon told me he was ready for Backpropagation (he has nearly finished it now). On to convolutional neural networks, the next chapter in the course!
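For readers curious what "a perceptron in Python from scratch" can look like, here is a minimal sketch. This is an illustrative example of the kind of project described above, not Simon's actual code: it uses the classic perceptron update rule to learn the logical AND function, with all names and parameters chosen for this illustration.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Train a single perceptron. samples: list of ((x1, x2), target) pairs."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: fire (1) if the weighted sum is positive.
            output = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            error = target - output
            # Perceptron rule: nudge weights and bias toward the target.
            w1 += lr * error * x1
            w2 += lr * error * x2
            b += lr * error
    return w1, w2, b

# Truth table for logical AND, which is linearly separable,
# so a single perceptron can learn it.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_perceptron(and_data)

def predict(x1, x2):
    return 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in and_data])  # → [0, 0, 0, 1]
```

Notably, no calculus is needed at this stage: the perceptron rule is a simple error-driven nudge, which is why the course can cover perceptrons before derivatives ever come up.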

As of today, these are his progress stats:

Below are some impressions of doing Calculus Fundamentals.

On Saturday, March 7, Simon yelled: “Now I understand it! The Chain Rule!” “But I remember you tried to explain the Chain Rule to me a while ago,” I said. “But I didn’t understand it intuitively!”
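For context, the Chain Rule is the piece of calculus that Backpropagation hinges on: for a composed function h(x) = f(g(x)), the derivative is h′(x) = f′(g(x)) · g′(x). A tiny numeric check, with functions chosen purely for illustration:

```python
# Illustrative example: f(u) = u**2 composed with g(x) = 3x + 1,
# so h(x) = (3x + 1)**2 and, by the Chain Rule, h'(x) = 2*(3x + 1)*3.

def g(x):
    return 3 * x + 1

def h(x):
    return g(x) ** 2

def chain_rule_derivative(x):
    # f'(u) = 2u evaluated at u = g(x), times g'(x) = 3.
    return 2 * g(x) * 3

def numeric_derivative(func, x, eps=1e-6):
    # Central finite difference as an independent check.
    return (func(x + eps) - func(x - eps)) / (2 * eps)

x = 2.0
print(chain_rule_derivative(x))   # 2 * 7 * 3 → 42.0
print(numeric_derivative(h, x))   # ≈ 42.0
```

Backpropagation applies exactly this idea, layer by layer, to a neural network: each layer is one more function in the composition, and the Chain Rule strings their derivatives together.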
Since the lockdown began, Simon and his math tutor have switched to Zoom calls instead of their biweekly sessions at our place.
