- What do the eigenvalues of a covariance matrix represent?
- What do the eigenvectors of the covariance matrix give us?
- What do the eigenvalues of a correlation matrix mean?
What do the eigenvalues of a covariance matrix represent?
Long story short: the eigenvalues of the covariance matrix encode the variability of the data in an orthogonal basis that captures as much of the data's variability as possible in the first few basis vectors (aka the principal component basis). Each eigenvalue is the variance of the data along its corresponding eigenvector, so the largest eigenvalue belongs to the direction of greatest spread.
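A minimal sketch of this idea, assuming NumPy and a small synthetic 2-D dataset (the specific covariance values are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data: variance is concentrated along one direction
X = rng.multivariate_normal(mean=[0, 0], cov=[[3.0, 1.5], [1.5, 1.0]], size=500)

cov = np.cov(X, rowvar=False)           # 2x2 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: for symmetric matrices, ascending order

# Each eigenvalue is the variance of the data along its eigenvector,
# so the largest eigenvalue corresponds to the first principal component.
print(eigvals[::-1])               # variances along principal directions, largest first
print(eigvals.sum(), cov.trace())  # total variance is preserved by the change of basis
```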
What do the eigenvectors of the covariance matrix give us?
Because the eigenvectors of the covariance matrix are orthogonal to each other, they can be used to reorient the data from the original x and y axes to the axes defined by the principal components. In effect, you re-base the coordinate system of the dataset onto a new set of axes given by its directions of greatest variance.
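A sketch of that re-basing, again assuming NumPy and the same kind of synthetic 2-D data; projecting the centered data onto the eigenvectors rotates it into the principal-component basis:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.multivariate_normal(mean=[0, 0], cov=[[3.0, 1.5], [1.5, 1.0]], size=500)

Xc = X - X.mean(axis=0)                  # center the data first
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]        # sort so the largest-variance axis comes first
eigvecs = eigvecs[:, order]

# Rotate the data into the new coordinate system defined by the eigenvectors
scores = Xc @ eigvecs

# In this basis the coordinates are uncorrelated, and their variances
# are exactly the eigenvalues of the covariance matrix.
print(np.cov(scores, rowvar=False).round(3))
```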
What do the eigenvalues of a correlation matrix mean?
The eigenvalues play the same role for the variables on which the correlation matrix is based, except that those variables are standardized to unit variance; that is, each of the p eigenvalues is the variance of one principal component of the p standardized variables, and together they sum to p. True variances must be nonnegative, because they are computed from sums of squares, which themselves are each nonnegative, so the eigenvalues are nonnegative as well.
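A small check of these properties, assuming NumPy and an illustrative three-variable dataset (p = 3):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.multivariate_normal(mean=[0, 0, 0],
                            cov=[[2.0, 0.8, 0.3],
                                 [0.8, 1.0, 0.5],
                                 [0.3, 0.5, 1.5]], size=1000)

R = np.corrcoef(X, rowvar=False)    # p x p correlation matrix
eigvals = np.linalg.eigvalsh(R)

# Correlation matrices describe standardized variables (variance 1 each),
# so the eigenvalues are nonnegative and sum to p, the number of variables.
print(eigvals)        # all >= 0
print(eigvals.sum())  # ~ 3.0
```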