Week 0 Jupyter notebooks: Part 0, Part 1 and Part 2. A link to MNIST for beginners.

Week 0 exercises: Three simple exercises in one file. And two more substantial ones: exercise_0 and exercise_1.

Solutions: simple 0, simple 1, simple 2 and exercise 0, exercise 1. Another version of the solutions in one file.

Week 1 Jupyter notebooks: Linear Regression and Regression with Graph Visualization. ReLUs and ReLUs with variable sharing. DNN for MNIST.
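
The course notebooks implement linear regression in TensorFlow; as a rough illustration of the same idea, here is a minimal plain-Python sketch (the function name and data are made up for this example, not taken from the notebooks):

```python
# Minimal sketch: fit y = w*x + b by gradient descent on mean squared error.
# Plain Python for illustration; the course notebooks use TensorFlow.

def fit_line(xs, ys, lr=0.01, steps=2000):
    """Return (w, b) minimizing MSE over the given points."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of (1/n) * sum((w*x + b - y)^2) with respect to w and b.
        dw = (2.0 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        db = (2.0 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        w -= lr * dw
        b -= lr * db
    return w, b

# Noise-free data from the line y = 2x + 1, so the fit should recover w≈2, b≈1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
w, b = fit_line(xs, ys)
```

The TensorFlow versions replace the hand-written gradients with automatic differentiation, but the update loop has the same shape.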

Week 1 exercises. The data loader section for exercise 2.
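
A data loader for an exercise like this typically shuffles the data once per epoch and yields fixed-size minibatches. A hypothetical sketch (the function name and signature are illustrative, not the exercise's actual API):

```python
import random

def minibatches(data, labels, batch_size, shuffle=True):
    """Yield (batch_data, batch_labels) pairs covering the whole dataset.

    Illustrative helper, not the exercise's required interface. The last
    batch may be smaller than batch_size.
    """
    idx = list(range(len(data)))
    if shuffle:
        random.shuffle(idx)  # shuffle indices, not the data, to keep pairs aligned
    for start in range(0, len(idx), batch_size):
        chunk = idx[start:start + batch_size]
        yield [data[i] for i in chunk], [labels[i] for i in chunk]

# 10 examples in batches of 4 -> batches of size 4, 4, 2.
xs = list(range(10))
ys = [x % 2 for x in xs]
batches = list(minibatches(xs, ys, batch_size=4, shuffle=False))
```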

Solutions: exercise 0, exercise 1, exercise 2.

Week 2 Jupyter notebooks: DNN for MNIST with technical refinements. CNN for MNIST. CNN for CIFAR-10.
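
The core operation in the CNN notebooks is the 2-D convolution layer. A plain-Python sketch of a single "valid" cross-correlation (which is what CNN layers actually compute), purely for intuition; real code would use `tf.nn.conv2d` or similar:

```python
def conv2d_valid(image, kernel):
    """'Valid' 2-D cross-correlation of one channel: slide the kernel over
    the image and take a weighted sum at each position. Illustrative only."""
    H, W = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(H - kh + 1):
        row = []
        for j in range(W - kw + 1):
            s = 0.0
            for a in range(kh):
                for b in range(kw):
                    s += image[i + a][j + b] * kernel[a][b]
            row.append(s)
        out.append(row)
    return out

# A vertical-edge detector on a tiny image: left half 0s, right half 1s.
image = [[0, 0, 1, 1]] * 4
kernel = [[-1, 1], [-1, 1]]  # responds where intensity rises left-to-right
result = conv2d_valid(image, kernel)  # strongest response at the 0->1 boundary
```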

Week 2 exercises.

Solutions: exercise 0 and exercise 1.

Week 3 Jupyter notebooks: RNN with vanilla cells and RNN with LSTM cells.
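
A vanilla RNN cell, as used in the first notebook, computes h' = tanh(Wxh·x + Whh·h + b) at each time step. A plain-Python sketch of one step (names and toy weights are made up for illustration; the notebooks use TensorFlow's cell implementations):

```python
import math

def rnn_step(x, h, Wxh, Whh, b):
    """One vanilla RNN step: h_new = tanh(Wxh @ x + Whh @ h + b).
    Illustrative plain-Python version of what an RNN cell computes."""
    n = len(h)
    h_new = []
    for i in range(n):
        s = b[i]
        s += sum(Wxh[i][j] * x[j] for j in range(len(x)))
        s += sum(Whh[i][j] * h[j] for j in range(n))
        h_new.append(math.tanh(s))  # tanh keeps each hidden unit in (-1, 1)
    return h_new

# Feed a short sequence through the cell; the hidden state carries
# information from earlier inputs forward.
Wxh = [[0.5], [-0.5]]            # input dim 1 -> hidden dim 2
Whh = [[0.1, 0.0], [0.0, 0.1]]   # weak recurrence
b = [0.0, 0.0]
h = [0.0, 0.0]
for x in ([1.0], [2.0], [3.0]):
    h = rnn_step(x, h, Wxh, Whh, b)
```

An LSTM cell replaces this single tanh update with gated updates to a separate cell state, which is what lets it hold information over longer spans.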

Here is a nice introduction to RNNs and LSTM cells. Also check out this blog. A blog on Differentiable Neural Computers with links to the original paper, commentary and videos. A nice introduction to neural attention.

Two discussions of CNNs: a Beginner's Guide to CNNs and notes from a Stanford course.

Two multipart articles on capsule networks: Understanding Hinton's Capsule Networks and Unsupervised learning of a useful hierarchy of visual concepts. A third article: Capsule Networks are Shaking Up AI. A video on capsule networks.

A recent paper on state-of-the-art handwritten text recognition. A video on CTC (connectionist temporal classification) and HWR (handwritten text recognition). This folder contains related articles in two categories. Simple code for 2MNIST.

An introductory book on deep learning and the Deep Learning book. Both are free.

An article on the "Canadian Mafia". GAN papers. A three-dimensional video of different types of NNs predicting MNIST.

The TensorFlow Get Started site; the scikit-learn site.
