Neural Networks Fundamentals: Neurons and Layers
This course introduces the core building blocks of neural networks. You'll learn what a neuron is, how it processes information, the role of activation functions, and how neurons are organized into layers. By the end, you'll implement a single dense layer from scratch using Python and NumPy.
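The kind of dense layer this course builds can be sketched in a few lines of NumPy. The class name, attribute names, and the small-random-weight initialization below are illustrative assumptions, not the course's actual code:

```python
import numpy as np

# Minimal sketch of a dense (fully connected) layer: a weight matrix,
# a bias vector, and a forward pass. Names are illustrative.
class DenseLayer:
    def __init__(self, n_inputs, n_neurons):
        # Small random weights and zero biases; principled
        # initialization is a topic of a later course.
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        # Each neuron computes a weighted sum of its inputs plus a bias.
        self.output = inputs @ self.weights + self.biases
        return self.output

layer = DenseLayer(3, 4)
batch = np.random.randn(5, 3)  # batch of 5 samples, 3 features each
out = layer.forward(batch)     # output shape: (5, 4)
```

Operating on a whole batch at once (a 2-D array) rather than one sample at a time is the standard NumPy idiom and carries through the rest of the curriculum.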
The MLP Architecture: Activations & Initialization
This course builds upon single layers to construct a complete Multi-Layer Perceptron (MLP). You'll learn to stack layers, explore different activation functions like ReLU and Softmax, and understand the importance of weight initialization for effective training.
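Stacking two such layers with ReLU and Softmax could look like the sketch below. The He-style initialization shown is one common choice for ReLU networks; the function names and layer sizes are assumptions for illustration:

```python
import numpy as np

def relu(x):
    # ReLU: zero out negative pre-activations
    return np.maximum(0, x)

def softmax(x):
    # Subtract the row max before exponentiating for numerical stability
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def he_init(n_in, n_out):
    # He initialization, commonly paired with ReLU hidden layers
    return np.random.randn(n_in, n_out) * np.sqrt(2.0 / n_in)

# A tiny two-layer MLP forward pass (sizes are illustrative)
W1, b1 = he_init(4, 8), np.zeros(8)
W2, b2 = he_init(8, 3), np.zeros(3)

x = np.random.randn(2, 4)        # batch of 2 samples, 4 features
h = relu(x @ W1 + b1)            # hidden layer activations
probs = softmax(h @ W2 + b2)     # each row sums to 1
```

The stability trick in `softmax` (subtracting the row maximum) leaves the result unchanged mathematically but prevents `exp` from overflowing on large pre-activations.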
Training Neural Networks: The Backpropagation Algorithm
This course dives into how neural networks learn from data. You'll implement loss functions to measure prediction errors, understand the intuition and mechanics of gradient descent, master the backpropagation algorithm to calculate gradients, and use an optimizer to update network weights.
Building and Applying Your Neural Network Library
This course focuses on transforming your code into a reusable Python library and applying it to a real-world problem. You'll refactor your existing components into a structured package, build a `Model` class for easier network definition and training, and finally, train your neural network on the California Housing dataset for a regression task.
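The `Model` class interface the course describes might resemble the sketch below. The class and method names, the single linear layer, and the toy regression data are all assumptions; the course itself wraps the full network and trains on California Housing:

```python
import numpy as np

# A minimal Model-style wrapper: define once, then fit and predict.
# Here it holds a single linear layer trained by gradient descent
# on MSE, standing in for a full network on a regression task.
class Model:
    def __init__(self, n_inputs, n_outputs):
        self.W = 0.01 * np.random.randn(n_inputs, n_outputs)
        self.b = np.zeros(n_outputs)

    def predict(self, X):
        return X @ self.W + self.b

    def fit(self, X, y, lr=0.1, epochs=300):
        for _ in range(epochs):
            err = self.predict(X) - y
            # Gradients of MSE w.r.t. weights and biases
            self.W -= lr * (X.T @ err) / len(X)
            self.b -= lr * err.mean(axis=0)

np.random.seed(1)
X = np.random.randn(100, 2)
y = X @ np.array([[3.0], [-1.0]]) + 0.5   # toy regression targets
model = Model(2, 1)
model.fit(X, y)
mse = np.mean((model.predict(X) - y) ** 2)  # small after training
```

The payoff of the `fit`/`predict` interface is that swapping in a deeper network or a different dataset leaves the calling code unchanged.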