Great clarity. You clearly understand your material at a deep level, so it's easy for you to teach.
@ramielkady938
9 months ago
PS: Video is targeted at people who already have a deep knowledge of what the video is trying to explain.
@riazali-vi8tu
4 years ago
Well explained, you should do more videos
@MathematicsMadeSimple1
4 years ago
Clear explanation. Thank you for shedding more light, especially on the application of eigenvalues and eigenvectors.
@patyyyou
4 years ago
Nicely done. It hit the right level for someone who understands the linear algebra behind Eigenvectors and Eigenvalues but still needed to make the leap of connecting a dot or two in the application of PCA to a problem. Again, thank you!
@anuraratnasiri5516
4 years ago
Beautifully explained! Thank you so much!
@Darkev77
3 years ago
I do understand that eigenvalues represent the factor by which the eigenvectors are scaled, but how do they signify “the importance of certain behaviors in a system”, what other information do eigenvalues tell us other than a scaling factor? Also, why do eigenvectors point towards the spread of data?
@malstroemphi1096
A year ago
If you consider a raw matrix or just geometric examples, an eigenvalue is indeed just a scaling factor, and you cannot say much more. But here we have additional context: we are doing statistics and putting data into a covariance matrix, which means we can add more interpretations. The eigenvector is not just some eigenvector of some matrix; it's the eigenvector of a *covariance matrix* in the context of statistics. We've put data into a matrix whose elements measure all the possible spreads of the data, which is why we can now say an eigenvector points towards the spread of the data and its eigenvalue relates to the importance of that spread.
@eturkoz
5 years ago
Your explanations are awesome! Thank you!
@szilike_10
3 years ago
Believe it or not, I've been wondering a lot about the concept of covariance, because every video seems to miss the reason behind the idea. But I think I kind of figured it out today before watching this video, and I drew the exact same thing that is in the thumbnail. So I guess I was thinking correctly : ))
@apesnajnin
4 years ago
Really amazing lecture! It makes my understanding of eigenvalues and eigenvectors clear. Thanks a lot!
@spyhunter0066
2 years ago
Around 1:36 you said "we divide by n for covariance", but we divide by n-1 instead. Please do check on that. Thanks for the video. Maybe I should say the estimated (sample) covariance uses the n-1 division.
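The n versus n-1 point is easy to check in NumPy. A minimal sketch (the two arrays are made-up illustration data, not from the video): `np.cov` uses the n-1 (sample) divisor by default.

```python
import numpy as np

# Made-up illustration data
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.0, 5.0, 8.0])
n = len(x)

cross = np.sum((x - x.mean()) * (y - y.mean()))
pop_cov = cross / n           # population covariance: divide by n
sample_cov = cross / (n - 1)  # sample estimate: divide by n - 1 (Bessel's correction)

# NumPy's np.cov divides by n - 1 by default (ddof=1)
print(np.isclose(np.cov(x, y)[0, 1], sample_cov))  # True
```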
@TheSyntaxerror1
5 years ago
Love this video, great work!
@roshinroy5129
2 years ago
Awesome explanation!! Nobody did it better!
@bottom297
A month ago
Extremely helpful. Thank you!
@softpeachhy8967
3 years ago
1:37 shouldn’t the covariance be divided by (n-1)?
@danielheckel2755
5 years ago
Nice visual explanation of covariance!
@davestaggers2981
2 years ago
Graphical interpretation of covariance is very intuitive and useful for me. Thank you.
@bootyhole
8 months ago
Excellent video, thank you!
@nickweimer6126
5 years ago
Great job explaining this
@vietdaoquoc7629
10 months ago
thank you for this amazing video
@simonala7090
8 months ago
Would love to request an in-person version
@Muuip
2 years ago
Great concise presentation, much appreciated! 👍
@skshahid5565
A year ago
Why did you stop making videos?
@123arskas
2 years ago
Thank you. It was beautiful
@EdeYOlorDSZs
3 years ago
Poggers explanation, thank you!
@ivandda00
3 months ago
ty
@Timbochop
2 years ago
Good job, no wasted time
@Pedritox0953
3 years ago
Good explanation
@Agastya007
3 years ago
Plz do more videos
@arjunbemarkar7414
4 years ago
How do you find eigenvalues and eigenvectors from the covariance matrix?
@Eta_Carinae__
4 years ago
Same as usual, right? Find lambda using det(Sigma - lambda * I) = 0: subtract lambda from the main diagonal of the covariance matrix, take the determinant, and you're left with a polynomial in lambda, which you then solve; each root is a unique eigenvalue.
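As a concrete sketch of that recipe (the 2x2 matrix below is a made-up example, not from the video): the characteristic polynomial of [[2, 1], [1, 2]] is lambda^2 - 4*lambda + 3, with roots 1 and 3, and NumPy's symmetric eigensolver agrees.

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # hypothetical covariance matrix

# det(S - lam*I) = (2 - lam)**2 - 1 = lam**2 - 4*lam + 3
poly_roots = np.roots([1.0, -4.0, 3.0])

# eigh is the usual routine for symmetric matrices such as covariance matrices
vals, vecs = np.linalg.eigh(S)

print(np.sort(poly_roots))  # [1. 3.]
print(vals)                 # [1. 3.]
```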
@sakkariyaibrahim2650
2 years ago
Good lecture
@Trubripes
6 months ago
Thanks for concisely explaining that PCA is just SVD on the covariance matrix.
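Since a covariance matrix is symmetric positive semidefinite, its SVD and its eigendecomposition coincide up to ordering and sign. A quick sketch with random illustration data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))        # illustration data: 100 samples, 3 features
S = np.cov(X, rowvar=False)          # 3x3 covariance matrix

evals, evecs = np.linalg.eigh(S)     # eigendecomposition (ascending eigenvalues)
U, svals, Vt = np.linalg.svd(S)      # SVD (descending singular values)

# For a symmetric PSD matrix the singular values are exactly the eigenvalues
print(np.allclose(svals[::-1], evals))  # True
```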
@latanezimbardo7129
3 years ago
1:28 I personally visualize covariance like this. I always thought I was wrong; I have never seen others do this. How come??
@TechLord79
5 years ago
Very well done!
@m.y.s4260
5 years ago
awesome explanation! thx!
@prof.laurenzwiskott
2 years ago
Very nice video. I plan to use it for my teaching. What puzzles me a bit is that the PCs you give as an example are not orthogonal to each other.
@saDikus1
2 years ago
Great video! Can anyone tell me how she decided that PC1 is spine length and PC2 is body mass? Should we guess (hypothesize) this in real-world scenarios?
@f0xn0v4
A year ago
I have always dreaded statistics, but this video made these concepts so simple while connecting it to Linear algebra. Thank you so much ❤
@liviyabags
4 years ago
I LOVE YOU !!!!! whattay explanation... thank you so much
@tusharkush7
4 years ago
This video needs a golden buzzer.
@simonala7090
8 months ago
Agreed!!!
@blackshadowofmysoul
3 years ago
Best PCA Visual Explanation! Thank You!!!
@sebgrootus
7 months ago
Incredible video. Genuinely exactly what I needed.
@matato2932
A year ago
thank you for this amazing and simple explanation
@Lapelu9
A year ago
I thought PCA was a hard concept. Your video is so great!
@zendanmoko5005
2 years ago
Thank you! Very nice video, well explained!
@1291jes
4 years ago
This is excellent, Emma... I will subscribe to your videos!
@jordigomeztorreguitart
3 years ago
Great explanation. Thank you.
@VivekTR
3 years ago
Hello Emma, Great job! Very nicely explained.
@skewbinge6157
3 years ago
thanks for this simple yet very clear explanation
@getmotivated3619
5 years ago
You are awesome... you make a mediocre out of a know-nothing.
@Agastya007
3 years ago
I love the way you pronounced "data" at [3:34]😁😁
@basavg1
2 years ago
Very nice... please keep posting.
@user-or7ji5hv8y
3 years ago
Wow, that was quite a good explanation.
@thryce82
4 years ago
Nice job, I was always kinda confused by this.
@abdulrahmanmohamed8800
4 years ago
A very good explanation.
@subinnair3835
5 years ago
Dear ma'am, how did you obtain the matrix at 5:30?
@emfreedman3905
5 years ago
Find the covariance matrix of these variables, as at 2:15, and compute its eigendecomposition (find its two dominant eigenvectors). The matrix at 5:30 consists of the two dominant eigenvectors; each column is an eigenvector.
@subinnair3835
5 years ago
Emma Freedman, thank you! The video's explanation was great and covered all the fundamentals required to fully understand PCA!! 😃
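The pipeline discussed in this thread can be sketched end to end in NumPy (with made-up random data standing in for the video's dataset): center the data, eigendecompose the covariance matrix, and project onto the two dominant eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 4))          # made-up data: 50 samples, 4 features

Xc = X - X.mean(axis=0)               # center each feature
S = np.cov(Xc, rowvar=False)          # 4x4 covariance matrix

evals, evecs = np.linalg.eigh(S)      # eigh returns eigenvalues in ascending order
W = evecs[:, ::-1][:, :2]             # two dominant eigenvectors as columns

scores = Xc @ W                       # data projected onto PC1 and PC2
print(scores.shape)                   # (50, 2)
```

The variance of the PC1 scores equals the largest eigenvalue, which is the sense in which the eigenvalue measures the importance of that direction.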
@tusharpandey6584
4 years ago
awesome explanation! make more vids pls
@vitokonte
4 years ago
Very nice explanation!
@stephenaloia6695
3 years ago
Thank you, Ma'am!
@tractatusviii7465
4 years ago
investigate hedge/hogs
@crispinfoli9448
4 years ago
Great video, thank you!
@DanielDa2
3 years ago
great explanation
@KimJennie-fl3sg
4 years ago
I just love the voice🙄😸
@nbr2737
3 years ago
beautiful, thanks a lot!
@Matt-bq9fi
3 years ago
Great explanation!
@siliencea9362
4 years ago
thank you so much!! :)
@haroldsu
4 years ago
Thank you for this great lecture.
@mahmoudreda1083
4 years ago
thanks A LOT
@DoFlamingo_1P
4 years ago
AWESOMEEE 🤘🤘🤘
@adriantorresnunez
5 years ago
Best explanation of PCA I have heard. Thank you
@astiksachan8135
4 years ago
5:12 was very good
@콘충이
4 years ago
Awesome!
@SpeakerSparkTalks
5 years ago
nicely explained
@checkout8352
5 years ago
Thanks
@raghav5520
5 years ago
Well explained
@astiksachan8135
4 years ago
4:35
@AEARArg
3 years ago
Congratulations Emma, your work is excellent!
@2894031
3 years ago
Babe, var(x,x) makes no sense. Either you say var(x) or cov(x,x).
@ABC-hi3fy
3 years ago
No one explains why they use the covariance matrix. Why not use the actual data and find its eigenvectors/eigenvalues? I have been watching hundreds of videos and books, and no one explains that. It just doesn't make sense to me to use the covariance matrix. Covariance is a very useless parameter; it doesn't tell you much at all.
@malstroemphi1096
A year ago
No, it does, especially with PCA. But you are right that you need actual data. Say the data are 3D points of some 3D object: if you use this technique (build a covariance matrix from the 3D points and do the PCA of it), then you will find a vector aligned with the overall direction of the shape. For instance, you will find the main axis of a 3D cylinder. That is quite useful information.
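A sketch of that cylinder example (synthetic points, invented here for illustration): the eigenvector with the largest eigenvalue of the points' covariance matrix recovers the cylinder's long axis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000

# Synthetic cylinder along the z-axis: unit-circle cross-section, length 10
theta = rng.uniform(0.0, 2.0 * np.pi, n)
pts = np.column_stack([np.cos(theta),
                       np.sin(theta),
                       rng.uniform(-5.0, 5.0, n)])

S = np.cov(pts, rowvar=False)
evals, evecs = np.linalg.eigh(S)

axis = evecs[:, -1]                   # direction of the largest spread
print(np.round(np.abs(axis), 2))      # approximately [0. 0. 1.]
```

The z-coordinate has variance 100/12 versus 1/2 for each circle coordinate, so the dominant eigenvector aligns with the cylinder's axis (up to sign).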
Comments: 87