Keep exploring at ► brilliant.org/.... Get started for free for 30 days - and the first 200 people get 20% off an annual premium subscription!
How can we find maxima and minima of complicated functions with an enormous number of variables, like the ones you encounter when studying neural networks or machine learning? In this video we talk about a topic from nonlinear optimization called Gradient Descent (or Gradient Ascent if you want maxima), where you approach an extremum step by step by moving in the direction of the negative gradient vector (or the gradient itself, for ascent). We'll see the basic algorithm, look at some common pitfalls, and then upgrade it using a method called line search to improve efficiency.
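The idea described above can be sketched in a few lines of code. Below is a minimal NumPy implementation of gradient descent with a backtracking (Armijo) line search; the function names, the step-size parameters, and the example objective are my own illustrative choices, not taken from the video.

```python
import numpy as np

def gradient_descent(f, grad, x0, lr=1.0, beta=0.5, c=1e-4,
                     tol=1e-8, max_iter=10_000):
    """Minimize f by stepping opposite the gradient.

    Each iteration starts with step size `lr` and halves it
    (factor `beta`) until the Armijo sufficient-decrease condition
        f(x - t*g) <= f(x) - c * t * ||g||^2
    holds -- this is the backtracking line search.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # gradient ~ 0: near a stationary point
            break
        t = lr
        fx = f(x)
        # Backtracking line search: shrink t until sufficient decrease.
        while f(x - t * g) > fx - c * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Example: minimize f(x, y) = (x - 1)^2 + 4*(y + 2)^2, minimum at (1, -2).
f = lambda x: (x[0] - 1) ** 2 + 4 * (x[1] + 2) ** 2
grad_f = lambda x: np.array([2 * (x[0] - 1), 8 * (x[1] + 2)])
x_min = gradient_descent(f, grad_f, [0.0, 0.0])
```

For gradient *ascent* (finding maxima), you would instead step along `+g` and flip the decrease condition into a sufficient-increase one; with a fixed step size (no line search), the inner `while` loop disappears and `t` stays at `lr`, which is where the pitfalls around choosing a step size show up.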
Check out my MATH MERCH line in collaboration with Beautiful Equations
►beautifulequat...
COURSE PLAYLISTS:
►DISCRETE MATH: • Discrete Math (Full Co...
►LINEAR ALGEBRA: • Linear Algebra (Full C...
►CALCULUS I: • Calculus I (Limits, De...
► CALCULUS II: • Calculus II (Integrati...
►MULTIVARIABLE CALCULUS (Calc III): • Calculus III: Multivar...
►VECTOR CALCULUS (Calc IV): • Calculus IV: Vector Ca...
►DIFFERENTIAL EQUATIONS: • Ordinary Differential ...
►LAPLACE TRANSFORM: • Laplace Transforms and...
►GAME THEORY: • Game Theory
OTHER PLAYLISTS:
► Learning Math Series
• 5 Tips To Make Math Pr...
►Cool Math Series:
• Cool Math Series
BECOME A MEMBER:
►Join: / @drtrefor
MATH BOOKS I LOVE (affiliate link):
► www.amazon.com...
SOCIALS:
►Twitter (math based): / treforbazett
►Instagram (photography based): / treforphotography
Intro to Gradient Descent || Optimizing High-Dimensional Equations