Dear Dr. Hower, you are an amazing teacher. I enjoy watching your lectures.
@DrValerieHower
3 years ago
Hi thank you. Perhaps you answered your own question. Thank you for the feedback.
@masakmemasak319
3 years ago
The best lecture on SVD for me right now! You explain every step clearly, in detail and with good flow. It really helps me, as someone who didn't cover this topic during my bachelor's. Thank you, Dr! :D
@DrValerieHower
3 years ago
Thank you so much for your feedback!
@Spacexioms
A year ago
This is the greatest video I've seen on SVD.
@DrValerieHower
A year ago
Thank you so much!!
@spencerfradkin3762
3 years ago
Dr. Hower, my name is Spencer and I was a student in your Calc 2 class at FAU a couple of summers ago. Your Calc 2 class was my favorite class in undergrad. I was searching YouTube for SVD videos for a graduate class I'm taking, and I can't believe I came across your channel. This video is exactly what I needed, and it's explained as well as your Calc 2 lectures. Thanks!!!
@DrValerieHower
3 years ago
Spencer! It is so wonderful to hear from you. I really appreciate your feedback and hope everything is going well for you. :)
@MiguelSantos-vi3gi
A month ago
Love your class and attitude
@DrValerieHower
A month ago
Thank you so much!!!
@matthewchunk3689
4 years ago
Excellent topic. Thanks!
@menugrg3708
3 years ago
The best explanation of the Singular Value Decomposition. Many thanks, Dr. Hower.
@DrValerieHower
3 years ago
You are very welcome. I appreciate the feedback :)
@bashiruddin3891
3 years ago
The sort of lecture we wish we could have found at the start of the semester. Thanks a lot!
@DrValerieHower
3 years ago
Thanks so much for your feedback.
@mustafizurrahman5699
2 months ago
Splendid video on SVD
@DrValerieHower
2 months ago
Thank you!
@TheTacticalDood
4 years ago
Thanks, very nice lecture!
@redouaneabegar5490
3 years ago
Best numerical application of SVD I've ever found on YouTube. Thank you, ma'am.
@DrValerieHower
3 years ago
Thank you so much. I appreciate your feedback!
@BharathSaiS
2 years ago
The first time you see a math teacher with a smile..
@DrValerieHower
2 years ago
:)
@donaldduck4042
3 years ago
The best lecture on YouTube so far.
@DrValerieHower
3 years ago
Thank you!
@amshudharvadla6482
A year ago
It's amazing 👏 Love from India
@DrValerieHower
A year ago
Thank you!
@electrocrats1100
2 years ago
Nice explanation, ma'am. Respect from India!
@DrValerieHower
2 years ago
Thank you for your feedback!
@moin2163
A year ago
Keep up the energy! Thank you for this video.
@DrValerieHower
A year ago
You are welcome. Thank you for the comment :)
@rexmagat4051
7 months ago
Thanks, Doctor. Great!
@DrValerieHower
7 months ago
You are welcome! Thank you for the comment :)
@mihirparab6620
3 years ago
Cool and awesome teaching, ma'am 👍 Thank you very much.
@DrValerieHower
3 years ago
You are welcome! Thank you for your feedback.
@rafaeljabbour2502
6 months ago
At 39:00 you used ker(row(V)) to find the third unit vector. Is it always the case that you can use that, or do you sometimes need the Gram-Schmidt process?
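The kernel route this question asks about always works in principle: the columns of U beyond the rank of A span ker(A^T), although when that kernel has dimension greater than one, its basis must itself be orthonormalized, which is where Gram-Schmidt comes back in. A minimal NumPy sketch comparing the two routes, using a 3x2 matrix of my own choosing (not the one from the lecture):

```python
import numpy as np

# Hypothetical 3x2 example (not the matrix from the lecture).
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])

# V and the singular values come from A^T A, as in the video.
evals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(evals)[::-1]            # nonincreasing order
V, sigmas = V[:, order], np.sqrt(evals[order])

# u_1, u_2 from u_i = A v_i / sigma_i; a third column is still needed.
U_partial = A @ V / sigmas                 # broadcasting divides columns

# Kernel approach: the missing column spans ker(A^T), since columns of
# U beyond the rank are orthogonal to col(A). Here that kernel is
# 1-dimensional, so a cross product of the columns of A finds it.
u3 = np.cross(A[:, 0], A[:, 1])
u3 = u3 / np.linalg.norm(u3)
assert np.allclose(A.T @ u3, 0)

# Gram-Schmidt approach: QR completion gives the same complement
# (up to sign). If ker(A^T) had dimension > 1, you would still need
# to orthonormalize a basis of it, i.e. Gram-Schmidt.
Q, _ = np.linalg.qr(U_partial, mode='complete')
assert np.allclose(abs(u3 @ Q[:, 2]), 1.0)

# Assemble the full SVD and verify A = U Sigma V^T.
U = np.column_stack([U_partial, u3])
Sigma = np.vstack([np.diag(sigmas), np.zeros((1, 2))])
assert np.allclose(U @ Sigma @ V.T, A)
```

So both approaches succeed here; the kernel computation and QR completion produce the same third column up to sign.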
@rashmisharma5821
2 years ago
amazingly elegant :)
@DrValerieHower
2 years ago
Thank you!
@madhumitanath3275
2 years ago
Excellent explanation ma'am.
@DrValerieHower
2 years ago
Thanks so much!
@ungarlinski7965
A year ago
@21:00 So if I wanted my Sigma matrix to have the columns switched, so that it was no longer diagonal and the sigma1 and sigma2 values were on the off-diagonal, could I just use this same procedure and solve for the new U? I know this would no longer be the SVD, but I'm curious about this kind of decomposition too.
@DrValerieHower
A year ago
I'll speak to the 2x2 case here, in which case Sigma is square. Yes, correct, we would not have the SVD. But suppose you swap the two columns of Sigma so that 0s are along the diagonal: sigma1 is in the (1,2) entry and sigma2 is in the (2,1) entry. It is still the case that Transpose(Sigma) times Sigma is diagonal and is a matrix similar to Transpose(A) times A. You can take U and V orthogonal, but take care with the order of the columns.
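The reply's point can be checked numerically. In this sketch (the 2x2 matrix is my own example, not the one from the video), swapping the columns of Sigma while reordering the columns of V to match still reconstructs A, since A = U Σ V^T = U (ΣP)(VP)^T for any permutation matrix P:

```python
import numpy as np

# Generic 2x2 example of my own (not the matrix from the video).
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

U, s, Vt = np.linalg.svd(A)
Sigma = np.diag(s)                      # standard SVD: A = U Sigma V^T
assert np.allclose(U @ Sigma @ Vt, A)

# Swap the two columns of Sigma: sigma1 moves to the (1,2) entry and
# sigma2 to the (2,1) entry, so the swapped Sigma is anti-diagonal.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])              # column-swap permutation, P = P^T
Sigma_swapped = Sigma @ P

# Sigma_swapped^T Sigma_swapped is still diagonal (similar to A^T A).
D = Sigma_swapped.T @ Sigma_swapped
assert np.allclose(D, np.diag(np.diag(D)))

# Reordering the columns of V the same way restores A, because
# A = U Sigma V^T = U (Sigma P) (V P)^T  since P P^T = I.
V_swapped = Vt.T @ P
assert np.allclose(U @ Sigma_swapped @ V_swapped.T, A)
```

The "care in the order of the columns" in the reply is exactly the V → VP step: the permutation applied to Sigma must be undone on the V side.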
@climitod8524
A month ago
31:50 I am very confused how you got V here. I computed ker(A^T A - 9I) and did the same for 81. I got the same vectors as you, but you associated them differently. I used the identity-matrix order for V, because that's the order that makes sense and how it's defined in the text, with each v_i matching eigenvalue k_i.
@climitod8524
A month ago
Is it because we switch the order of the eigenvalues? I'm really confused.
@DrValerieHower
A month ago
Careful with your notation: a kernel is a subspace, which would not be a two-element set. But to answer your question, each vector in V is a unit (hence nonzero) eigenvector of A^T A. The order comes from looking at Sigma: we put the singular values along the diagonal of Sigma in nonincreasing order, and then the vectors in V must match that order. The vector you found is an eigenvector of A^T A with eigenvalue 81, hence it corresponds to singular value sqrt(81) = 9, so we put it in the first column of V.
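The ordering issue in this exchange is easy to hit in software too: NumPy's `eigh`, for instance, returns eigenvalues in ascending order, so the columns must be reordered before matching them to the nonincreasing singular values in Sigma. A sketch using a matrix of my own whose A^T A has eigenvalues 81 and 9, echoing the numbers above:

```python
import numpy as np

# My own example (not the lecture's), chosen so that A^T A has
# eigenvalues 81 and 9, i.e. singular values 9 and 3.
A = np.array([[9.0, 0.0],
              [0.0, 3.0]])

evals, vecs = np.linalg.eigh(A.T @ A)
# eigh returns eigenvalues in ASCENDING order: [9, 81].

# Sigma wants singular values in nonincreasing order, so reorder first.
order = np.argsort(evals)[::-1]        # indices for nonincreasing order
V = vecs[:, order]
sigmas = np.sqrt(evals[order])         # [9, 3]

# The eigenvector for eigenvalue 81 now sits in the first column of V,
# matching sigma_1 = 9 in the (1,1) entry of Sigma.
Sigma = np.diag(sigmas)
U = A @ V / sigmas                     # u_i = A v_i / sigma_i
assert np.allclose(U @ Sigma @ V.T, A)
```

Skipping the reordering step pairs each v_i with the wrong sigma_i, which is the mismatch described in the question.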
@russellsharpe288
3 years ago
Thank you for this clear exposition. However, I thought you were going to prove that every matrix has such a decomposition. But at 12:45 you simply assume it does, and then derive facts about Sigma and V based on this assumption. That is not sufficient to show that such a decomposition exists, is it?
@DrValerieHower
3 years ago
Hi. My discussion is constructive, meaning I discuss how to find the decomposition. The Spectral theorem states that a matrix is orthogonally diagonalizable if and only if it is symmetric. I show A^TA is symmetric. Sigma and V come from its orthogonal diagonalization.
@russellsharpe288
3 years ago
@@DrValerieHower Thanks. I think the penny has dropped now. As you say, the Spectral Theorem gives A*A = VDV*, and then you construct columns of U from the normalised nonzero A-images of the columns of V (this is the bit I was somehow missing). These U-columns are orthogonal because the V-columns are (and using A*A= VDV*), and we can further extend to a complete orthogonal basis if necessary (as you do in fact in the penultimate example). Then sigma(i).u(i) = Av(i) pretty much by construction and US=AV falls out immediately. Got it. I was confused because I am working my way through Axler's book, and he has a complicated proof of Polar Decomposition from which he deduces SVD as a corollary, but in fact I now see it is much easier the other way around. Thanks again for your help on this. Great channel.
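The construction summarized in this thread can be sketched end to end in NumPy. The rank-1 matrix below is my own example, chosen so that one singular value is zero and U genuinely has to be extended to a complete orthonormal basis, the case the reply above mentions:

```python
import numpy as np

# Rank-1 example where one singular value is 0, so U must be
# completed to a full orthonormal basis.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

# Spectral theorem on the symmetric matrix A^T A gives V and D:
# A^T A = V D V^T with V orthogonal.
evals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(evals)[::-1]
V = V[:, order]
sigmas = np.sqrt(np.clip(evals[order], 0, None))   # [2, 0]

# u_i = A v_i / sigma_i only for the NONZERO singular values.
r = int(np.sum(sigmas > 1e-12))                    # rank = 1 here
U_partial = A @ V[:, :r] / sigmas[:r]

# Extend to an orthonormal basis of R^2 (QR completion, i.e. the
# Gram-Schmidt step from the penultimate example in the video).
U, _ = np.linalg.qr(U_partial, mode='complete')
U[:, :r] = U_partial                               # keep the original columns

# Then sigma_i u_i = A v_i by construction, and A = U Sigma V^T.
Sigma = np.diag(sigmas)
assert np.allclose(U @ Sigma @ V.T, A)
```

This mirrors the constructive argument exactly: the Spectral Theorem supplies V and Sigma, the normalized images A v_i supply the first columns of U, and completion to an orthonormal basis supplies the rest.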
Comments: 48