Machine Learning
1421 learners
Gradient Descent: Building Optimization Algorithms from Scratch
Delve into optimization techniques in this immersive course focused on implementing algorithms from scratch. Bypassing high-level libraries, you will build Stochastic Gradient Descent, Mini-Batch Gradient Descent, and advanced optimization methods such as Momentum, RMSProp, and Adam.
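
To give a flavor of the from-scratch style this course works in, here is a minimal sketch (not the course's own solution code) that fits a toy 1-D linear model with plain NumPy; the data, function names, learning rate, and batch size are illustrative assumptions.

import numpy as np

# Toy data for a 1-D linear model y ~ w * x + b (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 3.0 * X + 0.5 + rng.normal(scale=0.1, size=100)

def sgd_step(w, b, x_i, y_i, lr=0.1):
    """Stochastic Gradient Descent: update from a single sample."""
    error = (w * x_i + b) - y_i                 # prediction error on one sample
    return w - lr * 2 * error * x_i, b - lr * 2 * error

def minibatch_step(w, b, x_batch, y_batch, lr=0.1):
    """Mini-Batch Gradient Descent: update from the mean gradient of a small batch."""
    errors = (w * x_batch + b) - y_batch
    return w - lr * 2 * np.mean(errors * x_batch), b - lr * 2 * np.mean(errors)

w, b = 0.0, 0.0
for _ in range(5):                              # SGD: a few passes, one sample at a time
    for i in rng.permutation(len(X)):
        w, b = sgd_step(w, b, X[i], y[i])
print(f"SGD estimate:        w={w:.2f}, b={b:.2f}")   # close to the true w=3.0, b=0.5

w, b = 0.0, 0.0
for _ in range(50):                             # MBGD: passes over batches of 16 samples
    for start in range(0, len(X), 16):
        w, b = minibatch_step(w, b, X[start:start + 16], y[start:start + 16])
print(f"Mini-batch estimate: w={w:.2f}, b={b:.2f}")
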
Python
5 lessons
21 practices
4 hours
Coding and Data Algorithms
Lessons and practices
Observing Stochastic Gradient Descent in Action
Tuning the Learning Rate in SGD
Stochastic Sidesteps: Updating Model Parameters
Updating the Linear Regression Model Params with SGD
Mini-Batch Gradient Descent in Action
Calculating Gradients and Errors in MBGD
Calculating Gradients for Mini-Batch Gradient Descent
Adjust the Batch Size in Mini-Batch Gradient Descent
Visualizing Momentum in Gradient Descent
Adjusting Momentum in Gradient Descent
Adding Momentum to Gradient Descent
Optimizing the Roll: Momentum in Gradient Descent
RMSProp Assisted Space Navigation
Scaling the Optimizer: Adjusting RMSProp with Gamma
Adjust the Decay Rate in RMSProp Algorithm
Implement RMSProp Update
Implement RMSProp's Squared Gradient Update
Optimizing Robot Movements with the Adam Algorithm
Adjusting the Learning Rate in Adam Optimization
Optimize the Orbit: Tuning the Adam Optimizer's Epsilon Parameter
Adam Optimizer: Implement the Coordinate Update
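
The momentum, RMSProp, and Adam practices above revolve around update rules along these lines. This is a minimal sketch with assumed function names and hyperparameters (beta, gamma, beta1, beta2, eps) on a toy quadratic objective, not the course's reference implementation.

import numpy as np

def momentum_update(param, grad, velocity, lr=0.01, beta=0.9):
    """Momentum: accumulate a decaying sum of past gradients and step along it."""
    velocity = beta * velocity + lr * grad
    return param - velocity, velocity

def rmsprop_update(param, grad, sq_avg, lr=0.01, gamma=0.9, eps=1e-8):
    """RMSProp: divide the step by a running average of squared gradients."""
    sq_avg = gamma * sq_avg + (1 - gamma) * grad ** 2
    return param - lr * grad / (np.sqrt(sq_avg) + eps), sq_avg

def adam_update(param, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: combine a momentum term (m) with RMSProp-style scaling (v), bias-corrected."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize f(x) = x**2 (gradient 2*x) with each optimizer from the same start.
x_mom, vel = 5.0, 0.0
x_rms, sq = 5.0, 0.0
x_adam, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    x_mom, vel = momentum_update(x_mom, 2 * x_mom, vel)
    x_rms, sq = rmsprop_update(x_rms, 2 * x_rms, sq)
    x_adam, m, v = adam_update(x_adam, 2 * x_adam, m, v, t)
print(x_mom, x_rms, x_adam)                     # all three end close to the minimum at x = 0
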