You're probably the only one on the Internet who explained PCA mathematically! Thank you so much!
@aindrilasaha1592
3 years ago
Trust me, after having spent hours on Google and YouTube, this is the best thing that I found on PCA. Hats off to you and thanks a lot!! Wishing you all the best for your channel.
@tilestats
3 years ago
Thank you!
@pipsch12
2 years ago
I so agree. I don't understand why PCA is presented in such an overly complicated fashion by almost everybody. This video is so simple because it covers every step of the process and gives clear and easy explanations without unnecessary details and confusing language. THANK YOU.
@SanthoshKumar-dk8vs
A year ago
True, great explanation 👏
@scottzeta3067
2 years ago
This video is totally underrated. If my uni's lectures were even half as good as yours, I wouldn't have spent so much time.
@RayRay-yt5pe
2 months ago
I can't believe the concept can be explained this simply! Nice one! You have a new subscriber. I honestly think it's criminal that something this simple is made overly convoluted by other individuals.
@abebawt1169
6 months ago
After watching this video, I feel like everyone else makes PCA complicated, deliberately. Thank you for making it easy!
@gacemamine5970
2 months ago
Fantastic explanation👑👑👑, Thank you very much.
@firstkaransingh
2 years ago
Excellent explanation of a very complex topic. Please do try to explain the SVD procedure if you can. Thanks 👍
@michaeldouglas7641
2 years ago
I would like to sincerely thank you for this video. Almost all YT maths videos focus only on the high-level concepts. Finding a linear, step-by-step explanation of the process is rare. Please do make more of these videos. Others I would love to see are: a step-by-step of one of the GLMs (logistic?), a step-by-step of a Gaussian process, and maybe a step-by-step of factor analysis. Thanks again
@tilestats
2 years ago
Thank you! I think my videos about logistic regression will interest you. You can find all my videos at www.tilestats.com
@mohdzoubi3819
A year ago
It is a great video. The corresponding PDF file for this video is also great. Thank you very much.
@BrenerHotz
7 days ago
Thank you for making it easy!
@tonyhuang9001
8 months ago
Love from China😘
@randbak1527
A year ago
Totally underrated video. I've been searching for a simple yet informative explanation of PCA, and you are the best; you should be at the top of the search results. Thank you.
@hayki_ds
A year ago
Perfect, thanks!
@sefatergbashi
A year ago
Best lecture on PCA calculations so far! Thank you
@nikeforo2612
2 years ago
Your videos are a godsend, extremely helpful and clear. Thanks a lot. Is there any chance you will cover Correspondence Analysis any time soon? That would nicely complement the series of videos on data dimensionality reduction techniques. Just wondering....
@tilestats
2 years ago
Thank you! That method is not on my list but maybe in the future. However, there will soon be a video on principal component regression.
@ankhts
A year ago
Able to understand the mathematics of PCA with your videos... many thanks. If you're reading this comment, do watch the explanation on GLM; it's probably the best explanation available on YouTube.
@wlt6311
9 months ago
Thanks for this nice video, the best explanation of PCA. Others just explain without showing the calculations.
@ramkumargorre2958
2 years ago
This is one of the best videos explaining the PCA concept mathematically.
@NatnichaSujarae
4 months ago
you're a life saver! I've been trying to understand this for daysssss and this is the only video that nailed it! Thank you so muchhh
@crickethighlight555
A year ago
Tomorrow is my quiz. I had not even attended the lecture, but after watching your tutorial I am ready for the quiz, so thanks 🙏
@yoonchaena3137
2 years ago
Thanks! I want to buy this channel's stock! It will be a big one.
@tilestats
2 years ago
Thank you :)
@mrbilalkhan
2 months ago
The video lecture on eigenvectors and eigenvalues mentioned at 05:31 can be found at kzitem.info/news/bejne/mnmKlp6knp9lqYI
@AbhishekVerma-kj9hd
A year ago
God bless you, sir, what an amazing explanation! I'm really touched. Thank you for this video.
@danialb9894
A year ago
Best explanation for PCA. Thank you. Wish you the best ❤❤
@alaghaderi9079
2 years ago
One of the best videos about PCA that I have seen. But where is SVD? :))
@notknown42
A year ago
Best PCA video I have seen on this platform. Well done - greetings from Germany
@SS-pn7ss
11 months ago
thank you so much for this great video
@RobertWei-p1l
A year ago
man, it's so helpful, thank you so much!!!
@divyab592
A year ago
best explanation of PCA so far!!! thank you so much
@shankars4384
11 months ago
You are the best TileStats. I love you a lot man!
@endritgooglekonto230
8 months ago
Best tutorial on PCA I have ever found!
@BushiZack
9 months ago
Good job man!!! Thank you so much
@eaintthu3488
2 years ago
Please explain kernel PCA
@sakkariyaibrahim2650
2 months ago
Please tell me, what does it mean if the direction is changed?
@workcontact9726
A year ago
great, thanks for this video
@bobrarity
9 months ago
appreciate the video, helped a lot
@joaovictorf.r.s.1570
A year ago
Perfect presentation! Thanks!
@betting55555
A year ago
great video, thanks!
@anmolpardeshi3138
4 months ago
I see that you centered the data. Is only centering required for "standardization", or is scaling also normally done so that the mean = 0 and the standard deviation = 1? This will then change the covariance matrix, since the variance of the individual dimensions will equal 1.
@tilestats
4 months ago
It is not a requirement, mathematically, to standardize your data (mu = 0, SD = 1), but it is highly recommended, especially if you have variables with a large difference in the variance. I discuss that in the next video about PCA: kzitem.info/news/bejne/xZ5ux4iBkYJ8n4o
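A minimal sketch of the difference between centering and full standardization, using made-up numbers (the data below are not from the video):

```python
import numpy as np

# Hypothetical data: three observations of two variables
X = np.array([[2.0, 1.0],
              [4.0, 3.0],
              [9.0, 5.0]])

Xc = X - X.mean(axis=0)            # centering only: each column now has mean 0
Xs = Xc / X.std(axis=0, ddof=1)    # full standardization: mean 0, SD 1

# After full standardization, the covariance matrix equals the correlation
# matrix, so every variable contributes a variance of exactly 1
print(np.cov(Xs, rowvar=False))
```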
@Fa94La
A year ago
Thanks for the video. What is the name of the book that you relied on?
@tilestats
A year ago
I mainly used the internet to learn ML.
@RuiLima1981
A year ago
At minute 6:17, how did you get the value 3.84? Should it not be 35.2?
@tilestats
A year ago
4.4 x 8 - 5.6 x 5.6 = 3.84
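In other words, 3.84 is the determinant of the covariance matrix: 35.2 is only the first product, before subtracting 5.6 x 5.6 = 31.36. A quick check of the arithmetic, assuming the covariance matrix shown in the video is [[4.4, 5.6], [5.6, 8.0]]:

```python
import numpy as np

S = np.array([[4.4, 5.6],
              [5.6, 8.0]])   # covariance matrix from the video

# det(S) = 4.4*8 - 5.6*5.6 = 35.2 - 31.36 = 3.84
print(np.linalg.det(S))
```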
@casper8374
A year ago
the best 🙏🏼
@sainivasgandham7982
7 months ago
Why did you take n-1 while calculating the covariance matrix?
@tilestats
7 months ago
Because that is how you calculate the variance. Have a look at this video if you'd like to know more: kzitem.info/news/bejne/0YJ-l4V3bXhqqHo
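As a concrete check: the sample variance (and covariance) divides by n - 1 rather than n, to correct the bias that comes from estimating the mean from the same sample. NumPy's cov uses n - 1 by default, so a manual calculation with made-up numbers should match it:

```python
import numpy as np

x = np.array([1.0, 2.0, 6.0])   # made-up sample of n = 3 values

# Divide the squared deviations by n - 1 = 2, not by n = 3
manual = ((x - x.mean())**2).sum() / (len(x) - 1)

print(manual)                   # 7.0
print(np.var(x, ddof=1))        # 7.0, the sample (n - 1) variance
print(float(np.cov(x)))         # 7.0, np.cov also uses n - 1 by default
```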
@dpi3
A year ago
absolutely brilliant!
@andresromeroramos5410
2 years ago
I simply loved your way of teaching. AWESOME video!
@tilestats
2 years ago
Thank you!
@rahuldebdas5608
9 months ago
Sir, can you please upload a similar mathematical video on oblique rotation of principal components? It would be very helpful.
@AJ-fo3hp
3 years ago
Thank you very much
@nassersaed4993
9 months ago
Hi, thanks for the very informative tutorial. Can you please explain how, at 11:00, you obtained the PC scores by multiplying the eigenvector matrix with the centred data?
@tilestats
9 months ago
Have a look at this video, starting at about 9 min, to see how to do matrix multiplication: kzitem.info/news/bejne/sqp3wKeNrJd6fqA
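For reference, the score step is an ordinary matrix multiplication: each centred observation (a row) is multiplied by the eigenvector matrix (eigenvectors as columns). A minimal sketch with hypothetical centred data; the eigenvector values below only approximate the ones derived in the video:

```python
import numpy as np

# Hypothetical centred observations, one row per observation
Xc = np.array([[ 1.5, -0.4],
               [-2.0,  1.1]])

# Eigenvectors as columns (approximate values from the video)
V = np.array([[0.589, -0.808],
              [0.808,  0.589]])

scores = Xc @ V   # row i now holds the PC1 and PC2 scores of observation i
print(scores)
```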
@nassersaed4993
8 months ago
Okay, got it! Thank you so much 🙏
@gabrielfrattini4090
2 years ago
This was amazing, so clear
@tilestats
2 years ago
Thank you!
@sakkariyaibrahim2650
2 months ago
Great lecture. The best explanation of PCA that I could find on the internet.
@wanqin3396
A year ago
Why, for the standardization of the data, did you not need to divide by the standard deviation?
@tilestats
A year ago
Here I only center the data, but you can also standardize as I do in this video kzitem.info/news/bejne/xZ5ux4iBkYJ8n4o
@md.shafaatjamilrokon8587
2 years ago
Thanks
@hammasmajeed3715
2 years ago
Your videos are very helpful. Thanks
@tilestats
2 years ago
Thank you!
@fredbatti
2 years ago
Amazing video, very well explained. A question: does anybody know a way to make the eigenvector weights sum to 1, to see exactly how much the original values contribute to the component?
@tilestats
2 years ago
Thank you! To transform the weights so that they sum to one, simply divide each weight by the sum of the weights (given that the weights are positive). However, I usually like to think of the weights as correlation coefficients as I explain in the fourth video about PCA.
@fredbatti
2 years ago
@@tilestats I have found out that if we raise all the weights to the power of 2, they end up summing to 1, regardless of the sign. Thanks for the contribution! Appreciate it
@tilestats
2 years ago
Yes, but note that the weights are usually expressed as loadings (see the PCA 4 video) by most statistical software tools. The squares of these loadings do not then sum up to one.
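A small numeric check of both points. The squared weights of a unit-length eigenvector always sum to 1; loadings, under the common definition loading = weight times the square root of the eigenvalue, have squares that sum to the eigenvalue instead (the eigenvalue below is an assumed approximate value for PC1):

```python
import numpy as np

w = np.array([0.58906316, 0.80808699])   # unit-length eigenvector (weights)

print((w**2).sum())    # ~1.0: squared weights of a unit vector sum to 1
print(w / w.sum())     # rescaled so the raw weights themselves sum to 1

lam = 12.08                    # assumed approximate PC1 eigenvalue
loadings = w * np.sqrt(lam)    # one common definition of loadings
print((loadings**2).sum())     # ~12.08: squared loadings sum to the
                               # eigenvalue, not to 1
```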
@ConfusedRocketShip-fv7qy
9 months ago
Amazing! Best teacher for PCA
@sasakevin3263
2 years ago
Your video gave me a 100% understanding of PCA; before that, I knew nothing about PCA. Thank you!
@cindywang8852
2 years ago
Very informative! Thank you!
@tilestats
2 years ago
Thanks!
@oscarernestocl9319
A year ago
For the example that starts at 18:00, first you have the vector [-2, 3], then you multiply it by the covariance matrix to get the vector [8, 12] (to transform the vector), and then you multiply [8, 12] by the covariance matrix again to get the direction of the eigenvector. However, in the second example you don't transform the vector and just multiply the initial one, [4, 1], by the covariance matrix. So my question is: why is it necessary to transform the vector in the first case? Thank you!!!
@tilestats
A year ago
I just show one iteration in the second example, but the more iterations you do (multiplying the new vector by the covariance matrix), the closer you will get to the eigenvector.
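This iteration scheme is known as power iteration: repeatedly multiplying by the covariance matrix pulls almost any starting vector toward the eigenvector with the largest eigenvalue. A minimal sketch, assuming the covariance matrix [[4.4, 5.6], [5.6, 8.0]] from the video and the starting vector from the first example:

```python
import numpy as np

S = np.array([[4.4, 5.6],
              [5.6, 8.0]])      # covariance matrix from the video
v = np.array([-2.0, 3.0])       # starting vector from the first example

for _ in range(20):             # each pass is one transformation of the vector
    v = S @ v
    v = v / np.linalg.norm(v)   # rescale to unit length so values stay finite

print(v)   # converges to ~[0.59, 0.81], the dominant eigenvector
```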
@KS-df1cp
2 years ago
Great, but I'm not sure how you got the normalized values of the eigenvectors. Can you please direct me towards that video or the step you skipped? Thanks. Also, what are the eigenvectors that you get for the eigenvalue 0.32? My simplified value is y = -0.72x; I don't know why you got 0.81.
@tilestats
2 years ago
kzitem.info/news/bejne/mnmKlp6knp9lqYI Starts at around 8 min.
@KS-df1cp
2 years ago
@@tilestats Got it, and I forgot to take the sqrt of the denominator :/ Thank you again
@mwanganamubita9617
2 years ago
@@tilestats Thanks for this very informative video. I have one question: for lambda = 0.32, I am getting y = -0.73 when x = 1, so the normalized vector with unit length 1 is [0.81, -0.59] instead of [-0.81, 0.59]. Please verify and advise.
@tilestats
2 years ago
If you set x = 1, you get [0.81, -0.59], but if you set y = 1, you will get [-0.81, 0.59]. Whether you set x to 1 or y to 1 is arbitrary, because both vectors are eigenvectors of the covariance matrix (they just point in opposite directions). Both vectors will give the same variance of PC2.
@mwanganamubita9617
2 years ago
@@tilestats Many thanks for the explanation. Much appreciated!
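A quick numeric way to confirm that both sign choices are valid: an eigenvector v for eigenvalue 0.32 satisfies S v = 0.32 v, and multiplying that equation by -1 shows that -v satisfies it too (again assuming the covariance matrix from the video):

```python
import numpy as np

S = np.array([[4.4, 5.6],
              [5.6, 8.0]])
v = np.array([0.80808699, -0.58906316])   # the x = 1 solution, normalized

print(S @ v, 0.32 * v)     # both ~[0.26, -0.19]: S v = 0.32 v
print(S @ -v, 0.32 * -v)   # the flipped vector satisfies the same equation
```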
@wondwossengebretsadik3334
3 years ago
This is an excellent explanation. Thanks a lot.
@tilestats
3 years ago
Thank you!
@kmowl1994
2 years ago
Very helpful, thank you!
@tilestats
2 years ago
Thank you!
@coffee-pot
A year ago
Thank you so much. Your videos are the best and this particular video is beyond amazing.
@mahdi1594
2 years ago
Bro, great job, love the way you explain things. You might see this comment copied and pasted across a few of your other videos; I am just doing this for the algorithm.
@tilestats
2 years ago
Thank you!
@ramankaur5657
A year ago
Hi, instead of centering the data, is it also viable to standardise the data?
@tilestats
A year ago
Sure, have a look at the next video: kzitem.info/news/bejne/xZ5ux4iBkYJ8n4o
@ramankaur5657
A year ago
@@tilestats Thanks, I just watched it! Hoping you could help me with the following as well: if I am applying the eigenvectors to another set of new data (with the same variables as the original data, i.e., not the original data I ran PCA on), I assume I should also standardise the new data before applying the eigenvector weighting to the new data?
@polarbear986
2 years ago
This is so good, thank you!
@tilestats
2 years ago
Thank you!
@karodada8005
2 years ago
Great video, thanks!
@tilestats
2 years ago
Thank you!
@TranHoangNam_A-km3vj
10 months ago
You are the best teacher I have ever known.
@arunkumar0702
2 years ago
Very well explained indeed!! Keep up the good work!! Many thanks for conceiving and producing this excellent series on PCA. I look forward to viewing your videos on other topics!!
@zero8wow342
2 years ago
Please, why don't others center the data first before using it to form the covariance matrix?
@tilestats
2 years ago
You do not need to center the data to compute a covariance matrix. You will get the same matrix with uncentered data because the spread of the data does not depend on the mean. The reason why I center the data in this video is because that is the first step in PCA.
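A quick way to confirm that centering leaves the covariance matrix unchanged, with made-up numbers:

```python
import numpy as np

# Hypothetical data, observations as rows
X = np.array([[1.0, 2.0],
              [3.0, 7.0],
              [6.0, 9.0]])

Xc = X - X.mean(axis=0)   # centered copy: every column now has mean 0

# Both calls print the same matrix, because covariance measures spread
# around the mean and is unaffected by shifting the mean to zero
print(np.cov(X, rowvar=False))
print(np.cov(Xc, rowvar=False))
```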
@areejhameed3923
2 years ago
thank you so much
@tilestats
2 years ago
Thank you!
@rezafarrokhi9871
3 years ago
That is so helpful.
@tilestats
3 years ago
That's great!
@bhavyakalwar8131
2 years ago
Tile stats is the best!
@aishwaryapant-w7s
A year ago
9:17 How do you do the normalization?
@tilestats
A year ago
Have a look at around 8 min in this video: kzitem.info/news/bejne/mnmKlp6knp9lqYI
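In short, normalizing here means dividing the eigenvector by its Euclidean length so that it becomes a unit vector. A tiny sketch, using the y = -0.73 (at x = 1) solution from the thread above:

```python
import numpy as np

v = np.array([1.0, -0.73])       # un-normalized eigenvector (x set to 1)
v_unit = v / np.linalg.norm(v)   # divide by the vector's length

print(v_unit)   # ~[0.81, -0.59], now with unit length
```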
@aishwaryapant-w7s
A year ago
@@tilestats OKAY THANK YOU
@shashanksadafule
2 years ago
Amazing Explanation!
@tilestats
2 years ago
Thank you!
@preethiagarwal5355
2 years ago
You could have explained how to calculate eigenvalues as part of this video itself... making us watch other videos causes a loss of interest... sorry, it's not a one-stop shop. Why don't you make it comprehensive?
@tilestats
2 years ago
Because I try to keep the videos below 20 min and then I cannot include all details that I have covered in previous videos. This video is just one, out of many, in my course: kzitem.info/door/PLLTSM0eKjC2fZqeVFWBBBr8KSqnBIPMQD
@preethiagarwal5355
2 years ago
@@tilestats wow 👏 thanx
@arunkumar0702
2 years ago
I executed the steps in Python. I notice that the matrix of eigenvectors returned by sklearn, pc = PCA(n_components = 2); pc.components_, is as follows: [[-0.58906316, -0.80808699], [-0.80808699, 0.58906316]], whereas the one that you have calculated is: [[-0.80808699, 0.58906316], [0.58906316, 0.80808699]]. It would help if you could help me understand this difference. What am I missing?
@tilestats
2 years ago
It seems like your function rotates the data counter-clockwise, which explains the difference. It does not matter for the results. You may try to switch the order of the input variables to see if that changes the output.
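One detail worth noting (an observation about scikit-learn, not something stated in the video): PCA.components_ stores the eigenvectors as rows, ordered by decreasing eigenvalue, and the sign of each one is arbitrary, so the matrix can look transposed and sign-flipped compared with a hand calculation:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical data; the point is the layout of components_, not the values
X = np.random.default_rng(0).normal(size=(100, 2))

pca = PCA(n_components=2).fit(X)
V = pca.components_.T   # transpose so the eigenvectors sit in columns

# Each column may still differ from a hand calculation by a factor of -1;
# flipping an eigenvector's sign does not change the PCA solution
print(V)
```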
@muhammadusmanbutt3341
2 years ago
Can you please tell me where the previous videos related to eigenvalues and eigenvectors are?
@tilestats
2 years ago
If you go to www.tilestats.com, you will find all my videos in a logical order.
Comments: 126