I never understood what gradient descent and a cost function were until I watched this video 🙏🙏
@mohitpatel7876
4 years ago
Best explanation of the cost function. We learned it as master's students and the course couldn't explain it as well.. simply brilliant
@anuragmukherjee1878
2 years ago
For those who are confused: the derivative in the convergence step is dJ/dm.
@tusharikajoshi8410
A year ago
What's J in this? The Y values? I'm super confused about this d/dm of m, because it would just be 1. And I think m is just the total number of values. Shouldn't the slope be d/dx of y?
@mdmynuddin1888
A year ago
@@tusharikajoshi8410 It will be the cost or loss (J).
@mdmynuddin1888
A year ago
new(m) = m - d(loss or cost)/dm * alpha (learning rate).
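A minimal sketch of that update in Python (the toy data, learning rate, and iteration count are illustrative assumptions, not values from the video):

```python
import numpy as np

def gradient_step(m, x, y, lr=0.01):
    """One gradient descent step for the slope m in y_hat = m * x.

    Cost:     J(m) = (1 / (2 * n)) * sum((y - m * x) ** 2)
    Gradient: dJ/dm = -(1 / n) * sum(x * (y - m * x))
    Update:   new(m) = m - lr * dJ/dm
    """
    n = len(x)
    residual = y - m * x
    dJ_dm = -(1.0 / n) * np.sum(x * residual)
    return m - lr * dJ_dm

# Toy data whose true slope is 2; repeated updates move m toward 2.0.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x
m = 0.0
for _ in range(200):
    m = gradient_step(m, x, y)
print(round(m, 3))
```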
@suhasiyer7317
A year ago
Super helpful
@threads25
11 months ago
I don't think so, because that's actually Newton's method.
@manikaransingh3234
4 years ago
I don't see a link in the top right corner for the implementation, as you mentioned at the end.
@navjotsingh8372
2 years ago
I have seen many teachers explain the same concept, but your explanations are next level. Best teacher.
@soumikdutta77
2 years ago
Why am I not surprised by such a lucid and amazing explanation of the cost function, gradient descent, global minima, learning rate... maybe because watching you make complex things seem easy and normal has become a habit of mine. Thank you SIR
@ayurdubey4818
2 years ago
The video was really great. But I would like to point out that in the derivative you took for the convergence theorem, instead of (dm/dm) it should be the derivative of the cost function with respect to m. Also, a small suggestion: at the end it would have been helpful if you had mentioned what m was, the total number of points or the slope of the best fit line. Apart from this the video helped me a lot; I hope you add a note somewhere in this video to help others.
@pjanjanam
3 years ago
A small comment at 17:35. I guess it is the derivative of J(m) with respect to m, in other words the rate of change of J(m) for a minute change in m. That gives us the slope at each instantaneous point, especially for non-linear curves where the slope is not constant. At each point (m, J(m)), gradient descent travels in the opposite direction of the slope to find the global minimum, with a small learning rate. Please correct me if I am missing something. Thanks for a wonderful video on this concept @Krish, your videos are very helpful for understanding the math intuition behind the concepts. I am a super beneficiary of your videos, huge respect!!
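For example, with made-up numbers: if at the current m the slope of the cost curve is dJ/dm = +4 and the learning rate is 0.1, the update m_new = m - 0.1 * 4 moves m down by 0.4; if dJ/dm were -4, m would move up by 0.4. Either way the step goes against the sign of the slope, i.e. downhill on the J(m) curve toward the minimum.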
@varungupta2727
4 years ago
Similar to the Andrew Ng course on Coursera, kind of a revision for me 😊😊
@Gayathri-jo4ho
4 years ago
Can you please suggest how to begin in order to learn machine learning?
@Gayathri-jo4ho
4 years ago
@@ArpitDhamija Do you have knowledge of machine learning?? If so, please give me a suggestion. I saw so many resources but I couldn't decide.
@shhivram929
4 years ago
@@Gayathri-jo4ho This playlist itself is a fantastic place to start, or you can enroll in the course "Machine Learning A-Z" by Kirill Eremenko on Udemy. The course will give you an intuitive understanding of the ML algorithms. Then it's up to you to research and study the math behind each concept. Refs: KDnuggets, Medium, MachineLearningPlus and lots more.
@Gayathri-jo4ho
4 years ago
@@shhivram929 thank you
@sarithajaligama9548
3 years ago
Exactly. This is the equivalent of Andrew Ng's description
@dhainik.suthar
3 years ago
This math is the same as the Coursera machine learning course. Thank you sir for this great content..
@padduchennamsetti6516
A month ago
You just made the whole concept clear with this video, you are a great teacher.
@shubhamkohli2535
4 years ago
Really awesome video, so much better than many famous online portals charging huge amounts of money to teach these things.
@animeshkoley6478
3 years ago
Best explanation of linear regression 🙏🙏🙏. Simply wow 🔥🔥
@mayureshgawai5951
3 years ago
It's hard to find an easy explanation of gradient descent on YouTube. This video is the exception.
@chimadivine7715
10 days ago
Now I understand what GD means. Thanks always, Krish
@RJ-dz6ie
4 years ago
How can I not say that you are amazing!! I was struggling to understand the importance of gradient descent and you explained it to me in the simplest way possible.. Thank you so much sir :)
@python_by_abhishek
3 years ago
Before watching this video I was struggling with the concepts, exactly like you were struggling while plotting the gradient descent curve. ☺️ Thanks for explaining this beautifully.
@tarunsingh-yj9lz
A year ago
Best video on YouTube for understanding the intuition and math (surface level) behind linear regression. Thank you for such great content.
@nanditagautam6310
3 years ago
This is the best stuff I have ever come across on this topic!
@rezafarrokhi9871
3 years ago
Thanks for all the well-prepared videos. I think you meant dJ(m)/dm at 17:45, is that correct?
@azizahmad1344
3 years ago
Such a great explanation of gradient descent and convergence theorem.
@PritishMishra
4 years ago
I knew there would be an Indian who could make all this stuff easy!! Thanks Krish
@priyanshusharma2516
3 years ago
Watched this video 3 times back to back. Now it's embedded in my mind forever. Thanks Krish, great explanation!!
@w3r161
6 months ago
Thank you my friend, you are a great teacher!
@vishnuppriya5263
A year ago
Really great, sir. Thank you very much for this clear explanation.
@moulisiramdasu6753
3 years ago
Really, thank you Krish. You just cleared my doubts about the cost function and gradient descent. I first watched Andrew Ng's class but had a few doubts; after seeing your video it's now crystal clear.. Thank you...
@nurali2525
3 years ago
This guy was born to teach
@pranitaumarji5224
4 years ago
Thank you for this awesome explanation!
@dsc40sundar18
A year ago
Hi sir, great content and I'm a big fan of your work. Let me ask a doubt about the cost function: many books or blogs take the cost function as (1/N) * sum((Y - Y^)^2), but you used (1/2N) * sum((Y - Y^)^2), so I was a bit confused about that part. Thank you for the wonderful content, thank you so much sir.
@dhruv1324
A year ago
Never found a better explanation.
@anuragbhatt6178
4 years ago
The best I've come across on gradient descent and the convergence theorem.
@annapurnaparida7655
3 years ago
So beautifully explained... I did not find this kind of clarity anywhere else... keep up the good work....
@arunsundar489
4 years ago
Please add the in-depth math intuition of other algorithms like logistic regression, random forest, support vector machines and ANN.. Many thanks for the clear explanation of linear regression.
@SaroashRahil
7 months ago
The only video that made gradient descent so simple that even 2nd grade students would understand it.
@kevinsusan3345
4 years ago
I had so much difficulty understanding gradient descent, but after this video it's perfectly clear.
@muralimohan6974
3 years ago
Bro, how do we update the slope?
@V2traveller
4 years ago
Every line you speak.. is so important for understanding this concept...... thank you.
@FaizanKhan-fn6ew
4 years ago
Thank you so much for all your efforts.... Knowledge, pace of speech and the ability to make things easy are the nicest skills you have...
@ahmedbouchou6893
4 years ago
Hi. Can you please do a video about the architecture of machine learning systems in the real world? How does it really work in real life? For example, how Hadoop (Pig, Hive), Spark, Flask, Cassandra and Tableau are all integrated to create a machine learning architecture, like an end-to-end view.
@pradeepmallampalli6510
3 years ago
Thank you so much Krish. Nowhere else could I find such a detailed explanation. You made my day!
@aayushsuman4592
6 months ago
Thank you so much, Krish!
@PankajMishra-ey3yh
3 years ago
I think in the convergence theorem part the derivative should be d(J(m))/d(m); as in a y-x graph we take the derivative of y w.r.t. x, and here our y is J(m) and our x is m.
@ShubhamGupta-ej8xr
2 years ago
Yeah, I also think the same thing.
@jaisamdariya4307
3 years ago
I wish I could like this a thousand times.
@FaizanKhan-fn6ew
4 years ago
I am working at a company in the BPM domain... I have no idea about programming but somehow I managed to develop an interest in ML... The best part is I just want to learn it to enhance my knowledge and I'm ready to work for free... If you can suggest something, it would help...
@vidyasagarpatil2557
4 years ago
This is when you become a genius.
@Neuraldata
4 years ago
We would also recommend your videos to our students!
@supervickeyy1521
4 years ago
I knew the concept of linear regression but didn't know the logic behind it.. the way the regression line is chosen. Thanks for this!
@Karthik-wj5rs
A year ago
Finally I understood gradient descent perfectly..
@nivitus9037
4 years ago
Great...
@aritra8820
2 years ago
When you are writing the convergence theorem, it should be m - d(J(m))/dm * alpha.
@auroshisray9140
3 years ago
Thank you Krish bhaiya!
@avinashgote2770
A year ago
Good explanation, it cleared all my queries.
@shailesh1981able
2 years ago
Awesome!! Cleared all doubts after seeing this video! Thanks a lot Mr. Krish for creating in-depth content on this subject!
@SanjeevKumar-dr6qj
A year ago
Great sir. Love this video
@9902152322
2 years ago
God bless you too sir, explained very well. The basics help to build a high-level understanding.
@RanjithKumar-jo7xf
2 years ago
Nice Explanation, I like this.
@kannanparthipan7907
4 years ago
Why 2m in place of m in the cost function calculation... Please explain.
@subhamnagar7794
4 years ago
You can write m also; authors prefer 2m because when you take the derivative the 2 gets cancelled.
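To make that concrete (writing n for the number of data points so it is not confused with the slope m, and ŷ_i = m·x_i + c):

$$J(m, c) = \frac{1}{2n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2, \qquad \hat{y}_i = m x_i + c$$

$$\frac{\partial J}{\partial m} = \frac{1}{2n}\sum_{i=1}^{n} 2\left(y_i - \hat{y}_i\right)\left(-x_i\right) = -\frac{1}{n}\sum_{i=1}^{n} x_i\left(y_i - \hat{y}_i\right)$$

With the 1/n convention the gradient simply carries an extra factor of 2, which can be absorbed into the learning rate, so the minimizing m and c are the same either way.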
@pradnyavk9673
2 years ago
Very well explained. Thank you.
@sanjug7317
2 years ago
Very good and detailed explanation
@tezzbhandari3725
2 years ago
The graph of the cost function is not gradient descent. Gradient descent is differentiating the cost function with respect to m and using that derivative to update m.
@Dinesh-uh4gw
3 years ago
Excellent Explanation
@sagarparigi1884
3 years ago
This video is really helpful.
@mvcutube
3 years ago
Nice tutorial. Thank you
@rambaldotra2221
3 years ago
Thank you Sir, you have explained everything about gradient descent in the easiest possible way!!
@jaspreetsingh5334
3 years ago
Thanks Krish, you are helping a lot.
@123man123man1
9 months ago
Thank you for sharing this insightful video about linear regression. While I found it informative, I'm uncertain about how it addresses the challenge of avoiding local minima. I'd greatly appreciate it if you could provide some insights on this aspect as well.
@akshaychauhan5919
3 years ago
It should be the derivative of J(m) w.r.t. m, which will give the slope of the J vs m curve.
@ngarwailau2665
2 years ago
Your explanations are the clearest!!!
@karthiavenger4577
4 years ago
Yaar, you nailed it man. After watching sooo many videos I had some idea; after finishing your video I'm now completely clear 😍😍😍😍
@jagdishsahu1118
4 years ago
Right
@yashodhansatellite1
4 years ago
Hats off
@mohitpatel7876
4 years ago
At 14:56, how do we decide how many slope values to try? And what about selecting intercepts in a certain range?..
@ruchit9697
4 years ago
The trials of slope values go on until the cost function reaches the minimum point.... and for the intercept there are some random initialization techniques through which a starting value is set for the intercept....
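A minimal sketch of that idea in Python, updating both the slope m and the intercept c until the cost stops changing (the initialization, learning rate, and tolerance below are illustrative assumptions, not values from the video):

```python
import numpy as np

def fit_line(x, y, lr=0.01, tol=1e-9, max_iter=100_000):
    """Batch gradient descent for y_hat = m * x + c with cost J = sum((y - y_hat)^2) / (2n)."""
    m, c = 0.0, 0.0                      # starting values for slope and intercept
    n = len(x)
    prev_cost = np.inf
    for _ in range(max_iter):
        y_hat = m * x + c
        cost = np.sum((y - y_hat) ** 2) / (2 * n)
        if abs(prev_cost - cost) < tol:  # converged: the cost barely changes any more
            break
        prev_cost = cost
        dJ_dm = -(1.0 / n) * np.sum(x * (y - y_hat))  # gradient w.r.t. slope
        dJ_dc = -(1.0 / n) * np.sum(y - y_hat)        # gradient w.r.t. intercept
        m -= lr * dJ_dm
        c -= lr * dJ_dc
    return m, c

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 3.0 * x + 1.0
print(fit_line(x, y))  # approaches (3.0, 1.0)
```

So there is no fixed list of slopes to try; the loop itself decides how many updates happen, and the intercept is learned by the same update rule.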
@Tales.of.Irshad
4 years ago
I feel so sad for him... because only aspiring data scientists are going to watch this video, so he will have fewer subscribers, not even comparable with what he is giving... Really, hats off to you sir. I have taken 2 online paid classes but I don't think they are better than you, never.
@cutecreature_san
3 years ago
Your videos are clear and easy to understand.
@jayeshmudaliar9155
3 years ago
Best one sir, thank you so much.
@debrupdey7948
A year ago
Great video sir, so lucid.
@lubaidkhan2937
3 years ago
Thanks Krish sir.
@shhivram929
3 years ago
Hi Krish, that was an awesome explanation of gradient descent with respect to finding the optimal slope. But in linear regression both the slope and the intercept are tweakable parameters; how do we achieve the optimal intercept value in linear regression?
@TheBala7123
2 years ago
Excellent explanation sir. I have started following your videos for all the ML related topics; it's very interesting. One doubt: in gradient descent, when the slope (the derivative of the cost) is zero, that m value is taken as the slope of the best fit line. I do not understand this. Can you please explain? Thanks.
@DeekshithTN-e7u
6 months ago
Really great explanation sir 😍
@shan5612
4 years ago
Great, but I'm not able to find the link for how to implement this in Python; awaiting your valuable reply.
@YoutuberEnjoy
A year ago
Simply great.
@wellwhatdoyakno6251
2 years ago
Lovely! Love it.
@PavanKumar-xg8ye
3 years ago
Excellent!!!!!
@nidhimehta9278
3 years ago
Best video on the theory of linear regression! Thank you so much Krish!
@akshaygupta6321
4 years ago
In a single sentence: "You're the best".
@AGhosh-sb8lf
2 years ago
Thank you sir, thank you so much.
@sajidchoudhary1165
4 years ago
Yes, very nice explanation.
@arrooow9019
3 years ago
Oh my gosh, this is the most awesome tutorial I have ever seen. God bless you sir 🤩🤩
@shchiranth6626
3 years ago
Great tutorial sir, I got things pretty quickly with this video, thank you.
@ankitchauhan6629
3 years ago
What about the c (intercept) value? How does the algorithm select the c value?
@shaiksuleman3191
3 years ago
Sir, no words to explain it, simply superb.
@guptarohyt
2 years ago
Great explanation; how do we figure out which direction to move?
@divyatejadadi6898
2 years ago
Excellent explanation.
@AjayKumar-id7mb
3 years ago
After watching this 3 times everything is clear. Repetition is the key.
@arhaangarg1482
3 years ago
Couldn't understand it when Andrew Ng was teaching, but you did it, bro!!!
@Cricketpracticevideoarchive
4 years ago
Correct me if I am wrong Krish, I think if C is non-zero then the best fit line will not pass through the origin, and for a single variable it will always be a line, not a plane.
@ShiVa-jy5ly
4 years ago
Thank you sir... I get to learn so much from you.
@mellowftw
3 years ago
Thanks so much sir.. you're doing good for the community
@dhainik.suthar
3 years ago
There is a small mistake in the slope update: m' = m - alpha * d(J)/dm, where m' is the updated m and J is the cost function.
@kushshri05
4 years ago
Please try to upload videos in this series within a span of 2 days...
Comments: 321