Great video! This is the only source I could find that finally explained to me the connection between the interpretation of eigenvector centrality and its definition via the eigenvector equation. Thanks a lot!
@andrewbeveridge7476 · 3 years ago
Thanks! I am glad that you found it useful.
@rbg9854 · 2 years ago
Thank you so much for the detailed explanation! I had been looking for this kind of video for a long time, but most videos only touch lightly on the maths side. This is a much more comprehensive one!
@debasismondal3619 · 3 years ago
This video is great but underrated. If one wants to understand the concepts of eigenvector and Katz centrality, I have hardly come across any videos which explain them in so much detail.
@andrewbeveridge7476 · 3 years ago
I have a follow-up video on Katz centrality, which helps to make that connection and shows how they relate to PageRank. kzbin.info/www/bejne/b6m7ZY1riMuebck
@kyxas · 3 years ago
Hi Andrew, I tried solving your example and got a different eigenvector. I guess it's the adjacency matrix A: the fifth row should be (0, 0, 0, 1, 0, 1) and not (1, 0, 0, 1, 0, 1).
@andrewbeveridge7476 · 3 years ago
Yes, you are correct. My matrix has a typo. Thanks for pointing that out.
@PowerYAuthority · 1 year ago
Came here to say this. I was running these calculations with numpy and it wasn't matching.
@brownchocohuman6595 · 1 year ago
I have a question. Instead of recursively updating the values, why don't we straight up calculate the normalised eigenvector and get the centrality value of each node from there?
@andrewbeveridge7476 · 11 months ago
You are correct! In practice, we just find the eigenvector for the largest eigenvalue of matrix A. This is also called the "dominant" eigenvalue of A. My goal at 4:30 was to give an example of the convergence of this recursive update rule. After that, the video uses the spectral decomposition theorem to show that we are converging to the eigenvector for the largest eigenvalue.
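For anyone who wants to see that shortcut in code, here is a minimal numpy sketch. The small symmetric adjacency matrix is just an illustrative example, not the graph from the video:

import numpy as np

# Illustrative undirected graph on 4 nodes (not the graph from the video):
# edges 0-1, 1-2, 1-3, 2-3.
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

# For a symmetric matrix, eigh returns eigenvalues in ascending order,
# so the dominant eigenvector is the last column.
eigenvalues, eigenvectors = np.linalg.eigh(A)
dominant = eigenvectors[:, -1]

# By Perron-Frobenius the dominant eigenvector can be chosen nonnegative,
# so flip the sign if numpy happened to return its negative.
if dominant.sum() < 0:
    dominant = -dominant

# Rescale so the largest entry is 1, as in the video.
print(dominant / dominant.max())

For larger networks, networkx's eigenvector_centrality computes the same thing with a sparse power iteration.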
@souravdey1227 · 3 years ago
Seriously good explanation in such a short time. The only problem is that the volume suddenly dipped so low that I could barely hear.
@andrewbeveridge7476 · 3 years ago
You are right: the audio does dip starting around 9:00. Sorry about that. It looks like I need to upload a new version (with a new URL) to fix it. Thanks for the feedback and I'm glad the video was helpful.
@Kane9530 · 10 months ago
Hi Andrew, for directed graphs, since the matrix is no longer symmetric, the eigenvectors are no longer necessarily orthonormal, and so the spectral decomposition theorem wouldn't apply. In this case, how do we prove that the eigenvector centrality calculation works?
@andrewbeveridge7476 · 8 months ago
Directed networks do get tricky for a few different reasons. When the directed network is strongly connected and not k-partite (so the adjacency matrix is primitive and irreducible), the eigenvector for the dominant eigenvalue will still determine the limiting values of the vector produced by repeatedly multiplying by A^T. This is the same idea that we use in the "power method" to find the dominant eigenvalue and eigenvector. I recommend reading up on that method.
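Here is a bare-bones sketch of that power method in numpy, under the assumptions above (strongly connected, primitive adjacency matrix); the directed example graph is purely illustrative:

import numpy as np

def power_method(A, num_iters=1000, tol=1e-12):
    # Approximate the dominant eigenvector of A^T by repeated multiplication.
    x = np.ones(A.shape[0])
    for _ in range(num_iters):
        x_new = A.T @ x                        # each node collects scores from its in-neighbors
        x_new = x_new / np.linalg.norm(x_new)  # rescale so the values stay bounded
        if np.linalg.norm(x_new - x) < tol:    # stop once the direction has settled
            x = x_new
            break
        x = x_new
    return x / x.max()                         # largest entry = 1, as in the video

# Illustrative directed graph: edges 0->1, 1->2, 2->0, 2->1
# (strongly connected, with cycle lengths 2 and 3, so the matrix is primitive).
A = np.array([
    [0, 1, 0],
    [0, 0, 1],
    [1, 1, 0],
], dtype=float)
print(power_method(A))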
@Keyakina · 8 months ago
How did you normalize and get 0.32 instead of zero at 5:50?
@andrewbeveridge7476 · 8 months ago
I divided each entry by 52, which is the value of the largest entry. So the first entry becomes 17/52 ≈ 0.327. My goal here was to make it easier for a human to compare the entries. So the first entry is about 32.7% of the largest entry. In hindsight, I shouldn't have called this "normalizing," since the resulting vector doesn't have length 1. "Rescaling" would be a better term. Thanks for the comment.
@eliesjj2207 · 11 months ago
This is awesome!
@avnimishra6874 · 1 year ago
How did you get the normalized value? Please tell (time 5:54).
@andrewbeveridge7476 · 11 months ago
We divide by the largest entry so that the largest value is 1. This makes it easy for a human to compare the relative sizes. In this case, [17, 38, 37, 52, 39, 47] becomes [17/52, 38/52, 37/52, 1, 39/52, 47/52] = [0.32, 0.73, 0.71, 1, 0.75, 0.90].
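A quick numpy check of that division:

import numpy as np

x = np.array([17, 38, 37, 52, 39, 47], dtype=float)
print(x / x.max())  # [0.3269 0.7308 0.7115 1.     0.75   0.9038]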
@zhichen2288 · 2 years ago
Excellent, thank you!
@hernanepereira50 · 3 years ago
Hi Andrew. How are you? a_{51}=0, isn't it?
@andrewbeveridge7476 · 3 years ago
Yes, you are right! Thank you for the correction to the matrix A shown starting at 6:50. So the fifth entry of Ax should be x_4 + x_6.
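A quick check of that corrected row, using the iterate from 5:50 just as sample numbers:

import numpy as np

row5 = np.array([0, 0, 0, 1, 0, 1])      # corrected fifth row of A
x = np.array([17, 38, 37, 52, 39, 47])   # sample vector from the iteration at 5:50
print(row5 @ x)                          # x_4 + x_6 = 52 + 47 = 99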