I think of myself as an excellent teacher, and I've been given awards for it, but I don't think I could do any better than this. This guy is awesome. Well done, man. You're providing a ton of intuition and key concepts.
@satyrswang5643
7 years ago
agree
@hrvojecoskovic5341
4 years ago
Jure is a genius! Such a simple explanation of something quite abstract. Thank you.
@anoop8753
4 years ago
Thanks, I have been cracking my head on this to get a hang of it; after Gilbert Strang's lecture, these 10 minutes cleared the rest of my doubts.
@赵昊然-t7w
4 years ago
That's the best course about SVD I have seen! Thank you!
@tianmingwu4255
4 years ago
SVD explained! Amazing!
@RofaidaGoda
7 years ago
Amazing description and application
@M3gal0maniac
A year ago
Really good explanation. I've been watching videos for a while, and this is definitely in my top 3.
@ycdantywong
6 years ago
Since the first right singular vector is composed of Movie rating 1 and Movie rating 2, shouldn't the first row of t(V) have 2 columns only? How do you map 5 data points (in first row of t(V)) to a 2-D space??
@TheJustinmulli
3 years ago
It is just a 2-dimensional example. In the case of our 7x5 matrix A, we actually have 5 movies (and hence the line is in 5 dimensions instead of 2). We have 3 lines because A has rank 3. It is confusing because he then relates the first row of our V transpose to the line in the 2-D graph, but that's just to explain that the first row of V transpose is one of the lines of best fit. In reality, the rows of our V transpose relate to 5 movies (not 2) and exist in 5 dimensions. A line exists in 1 dimension, so actually we are mapping 5 coordinates onto a 1-D space. In the 2-D example he is mapping 2 coordinates to a 1-D space.
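The shapes involved can be checked with a short numpy sketch (the 7x5 rank-3 matrix here is random, a made-up stand-in for the lecture's movie-ratings example):

```python
import numpy as np

# Build a 7x5 matrix of rank 3, like the users-by-movies example.
rng = np.random.default_rng(0)
A = rng.standard_normal((7, 3)) @ rng.standard_normal((3, 5))

# Economy SVD: U is 7x5, s has 5 values, Vt is 5x5 -- but only 3
# singular values are (numerically) nonzero because rank(A) = 3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U.shape, s.shape, Vt.shape)   # (7, 5) (5,) (5, 5)
print(np.sum(s > 1e-10))            # 3 -- the rank

# Each row of Vt is a direction in 5-dimensional "movie space";
# projecting onto one row maps every user to a single coordinate (1-D).
first_concept = A @ Vt[0]           # shape (7,), one number per user
print(first_concept.shape)          # (7,)
```

So each row of V transpose has 5 entries (one per movie), and projecting onto it collapses each user's 5 ratings to one number.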
@somendutta182
A year ago
Beautifully explained. Can you please make a video on higher-order SVD? How can HOSVD be used to extract features from tensorial data? It would be really helpful.
@ayatkhrisat5964
4 years ago
Finally, I understood it. Thank you!
@zorantill981
2 years ago
When he said "minimum reconstruction error", I got Red Alert 2 flashbacks.
@billybob7177
4 years ago
Dang. This made so much sense, thanks!
@rudypieplenbosch6752
2 years ago
Very helpful explanation of how to interpret the data.
@thee_ss
2 years ago
The slides are an exact replica of the program from North Carolina University.
@Tapsthequant
4 years ago
This is beautiful...
@bharathwajan6079
A year ago
A and B have the same number of dimensions; how do you represent A in reduced dimensions?
@OleksandrFialko
6 years ago
Where is the reduction? The matrix dimension stays the same.
@philrobinson2924
6 years ago
The matrix shape stays the same, but the SVD has been reduced by a dimension. Because of that, we end up with the B matrix, the approximation of A from a reduction in a dimension.
@minerva646
6 years ago
@@philrobinson2924 but what do you do to go back to the exact original matrix after you remove those dimensions?
@Shkencetari
5 years ago
We have reduced the rank of the matrix, meaning that almost the same data can be represented with fewer coordinates. For example, in the example mentioned above, the rank has been decreased from 3 to 2, meaning that we can project our data from 3 to 2 dimensions.
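A minimal sketch of that rank reduction (assuming numpy; the matrix below is a random stand-in for the lecture's A):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((7, 3)) @ rng.standard_normal((3, 5))  # rank 3

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the top k=2 singular values/vectors: the rank-2 approximation B.
k = 2
B = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]

# B has the same 7x5 shape as A -- the "reduction" is in rank, not shape.
print(B.shape)                      # (7, 5)

# By the Eckart-Young theorem, B is the best rank-2 approximation of A,
# and the Frobenius reconstruction error equals the dropped singular value.
err = np.linalg.norm(A - B, ord='fro')
print(np.isclose(err, s[2]))        # True
```

To go back to the exact original matrix you would need all three components; once a singular value is discarded, only the approximation B remains.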
@hrvojecoskovic5341
4 years ago
Imagine it this way. You send a rover to Mars to photograph the planet's environment and send the gathered information back to you. Instead of sending the picture matrix A, the rover decomposes it into three smaller matrices, eliminates some vectors in the spectral basis which do not contribute to picture quality (this reduces noise), and sends them back to you. You multiply them and get almost the same picture as the one it took on Mars.
@frankielee1836
7 years ago
amazing video
@AshutoshRaj
3 years ago
You are awesome.
@0824kenchan
7 years ago
Why does PCA need to take the SVD of the covariance matrix XX', while this doesn't require it? Do they result in the same thing?
@Shkencetari
5 years ago
You can do the eigendecomposition PCA needs (real eigenvalues and orthogonal eigenvectors) only on symmetric matrices. So X may not be symmetric, but X^(T) X is symmetric. On the other hand, you can do the SVD on any type of matrix (not just symmetric ones).
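The relationship between the two routes can be checked numerically (a quick sketch, assuming numpy; X is random data for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((10, 4))

# SVD applied directly to X.
_, s, Vt = np.linalg.svd(X, full_matrices=False)

# Eigendecomposition of the symmetric matrix X^T X (the PCA route,
# up to centering and a 1/n factor).
eigvals, eigvecs = np.linalg.eigh(X.T @ X)

# Eigenvalues of X^T X are the squared singular values of X.
print(np.allclose(np.sort(eigvals)[::-1], s**2))   # True

# The top eigenvector matches the first right singular vector up to sign.
v0 = eigvecs[:, np.argmax(eigvals)]
print(np.isclose(abs(v0 @ Vt[0]), 1.0))            # True
```

So yes, they result in the same directions; the SVD just gets there without forming X^T X explicitly.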
@gowrithampi9751
4 years ago
@@Shkencetari thanks, or I'd have spent hours thinking about this.
@malharjajoo7393
7 years ago
Fantastic!
@papeneri
2 years ago
Can someone help? It seems I am the only one who didn't get it. In the first diagram I see 2 dimensions and 15 points, but then the matrices have a lot more dimensions. What are the vectors we see here, and what does each dimension represent?
@lovemormus
6 years ago
Should have taken this class long ago!
@riteshpatil7230
2 years ago
Best explanation I have found on this topic!
@snehashishpaul2740
6 years ago
Very nice explanation.
@aqibafzal3106
2 years ago
Shouldn't U and V be square matrices?
@TUMENG-TSUNGF
A year ago
I have that confusion as well; I thought the singular value matrix should be the same shape as A?
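In the full SVD, U and V are indeed square (and Sigma has the same shape as A); numerical libraries often return the smaller "economy" form instead. A quick sketch (assuming numpy; the matrix is arbitrary):

```python
import numpy as np

A = np.arange(35.0).reshape(7, 5)

# Full SVD: U is 7x7 and Vt is 5x5 (both square); Sigma would be 7x5,
# but numpy returns only its diagonal (5 singular values).
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)    # (7, 7) (5,) (5, 5)

# Economy SVD: the last 2 columns of U are dropped because they would
# be multiplied by zero rows of Sigma anyway.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U.shape, s.shape, Vt.shape)    # (7, 5) (5,) (5, 5)

# Either way, A is recovered exactly.
print(np.allclose(A, U @ np.diag(s) @ Vt))   # True
```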
@Llasslo
4 years ago
Sorry, this explanation sucks. Why does some vector give the direction of the most variation in the data? Just no answer: "it just does." Why not mention PCA? Why not say that, given the data matrix A, the direction of most variance x is found by maximizing ||Ax||^2 / ||x||^2 = (x*A*Ax) / (x*x) (the Rayleigh quotient), which is maximized by the eigenvector v of the matrix A*A that corresponds to its greatest eigenvalue? And given the SVD of A = USV*, we have A*A = VS*U*USV* = VS*SV*, so the column of V that corresponds to the greatest eigenvalue gives the direction of most variation in the data.
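That claim is easy to verify numerically (a sketch, assuming numpy; A is random data for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((20, 5))

# Rayleigh quotient of A^T A: x -> (x^T A^T A x) / (x^T x) = ||Ax||^2 / ||x||^2.
def rayleigh(x):
    return (x @ A.T @ A @ x) / (x @ x)

_, s, Vt = np.linalg.svd(A, full_matrices=False)
v1 = Vt[0]                      # first right singular vector

# At v1 the quotient attains its maximum: the largest eigenvalue of
# A^T A, i.e. the largest singular value squared.
print(np.isclose(rayleigh(v1), s[0]**2))      # True

# Random directions never exceed that maximum.
xs = rng.standard_normal((1000, 5))
print(all(rayleigh(x) <= s[0]**2 + 1e-9 for x in xs))   # True
```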
@AbhishekSen
6 years ago
Roman concept!!!
@dw61w
3 years ago
So do you perform SVD directly on the data matrix A? Why does Andrew Ng perform SVD on the covariance matrix (5:01 of this video kzitem.info/news/bejne/06SdlmiMg6qGp2k)? Is computing the covariance matrix necessary?
@sidd-AI-n-ART
3 years ago
As a James Bond fan, that accent made me feel he's planning to bomb the USA. Jokes apart, he's an awesome teacher!
@PengyuZhao
4 years ago
Anyone from KTH end up here?
@anouarjoual1422
7 years ago
You talk fast, man.
@divyasasidharan2960
7 years ago
You can choose your speed on the video if it's bothering you or is hard to follow.
@karthik-ex4dm
5 years ago
leskovec is the founder of word2vec..😜😜
@mayurkulkarni755
7 years ago
DEEMENSNALITI REEDACTIONN
@nonegog5
6 years ago
Provide constructive criticism about the subject dealt with. You're Indian; your accent is funny to others, too.
Comments: 47