
18.6 Exercises

Step through the pca.R file on the class website. Then replicate the analysis of the cars data given above.
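The course file pca.R is in R, but the computation it walks through is language-neutral. As a hedged sketch of the same workflow (on a random stand-in for the cars data, since that dataset is not reproduced here), here is the centering, eigendecomposition, and scoring pipeline in Python with numpy:

```python
import numpy as np

# Toy stand-in for the cars data: n observations of p features.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 4))

# PCA operates on the centered data matrix.
xc = x - x.mean(axis=0)

# Eigendecomposition of the sample covariance matrix.
cov = xc.T @ xc / (len(xc) - 1)
evals, evecs = np.linalg.eigh(cov)

# eigh returns eigenvalues in ascending order; sort by decreasing variance.
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

# Scores: projection of each observation onto each principal direction.
scores = xc @ evecs
```

The variance of the scores along each direction equals the corresponding eigenvalue, which is the fact the exercises below build on.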

Suppose that we use $q$ directions, given by $q$ orthogonal length-one vectors $\vec{w}_1, \ldots, \vec{w}_q$. We want to show that minimizing the mean squared error is equivalent to maximizing the sum of the variances of the scores along these directions.
(a) Write $\mathbf{w}$ for the matrix formed by stacking the $\vec{w}_i$. Prove that $\mathbf{w}^T \mathbf{w}=\mathbf{I}_q$.
(b) Find the matrix of $q$-dimensional scores in terms of $\mathbf{x}$ and $\mathbf{w}$. Hint: your answer should reduce to $\vec{x}_i \cdot \vec{w}_1$ when $q=1$.
(c) Find the matrix of $p$-dimensional approximations based on these scores in terms of $\mathbf{x}$ and $\mathbf{w}$. Hint: your answer should reduce to $\left(\vec{x}_i \cdot \vec{w}_1\right) \vec{w}_1$ when $q=1$.
(d) Show that the MSE of using the vectors $\vec{w}_1, \ldots, \vec{w}_q$ is the sum of two terms, one of which depends only on $\mathbf{x}$ and not on $\mathbf{w}$, while the other depends only on the scores along those directions (and not otherwise on what those directions are). Hint: look at the derivation of Eq. 18.5, and use Exercise 2a.
(e) Explain in what sense minimizing projection residuals is equivalent to maximizing the sum of variances along the different directions.
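A numerical check of the identity that parts (b)-(e) establish may help: for centered data, the MSE of the rank-$q$ approximation plus the sum of score variances is constant in $\mathbf{w}$, so minimizing one maximizes the other. This sketch (with made-up data and randomly drawn orthonormal bases, not the cars data) verifies that:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(200, 5))
x = x - x.mean(axis=0)          # centered data, as in the derivation
n, p = x.shape
q = 2

def mse_and_var(w):
    """w: p-by-q matrix with orthonormal columns."""
    scores = x @ w              # part (b): n-by-q score matrix
    approx = scores @ w.T       # part (c): rank-q approximation of x
    mse = np.mean(np.sum((x - approx) ** 2, axis=1))
    # Columns of scores have mean zero (x is centered), so the
    # ddof=0 variance is just the mean squared score.
    var_sum = np.sum(scores.var(axis=0))
    return mse, var_sum

# Part (d): MSE + sum of variances depends only on x, not on w.
total = np.mean(np.sum(x ** 2, axis=1))
for _ in range(3):
    w, _ = np.linalg.qr(rng.normal(size=(p, q)))  # random orthonormal basis
    mse, var_sum = mse_and_var(w)
    assert np.isclose(mse + var_sum, total)
```

Since the total is fixed by the data, any $\mathbf{w}$ that minimizes the MSE must maximize the summed variance, which is part (e).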

Exercise 1. Show that if $A$ is any $m \times n$ matrix of real numbers, then the $m \times m$ matrix $A A^T$ and the $n \times n$ matrix $A^T A$ are both symmetric.

Thus, we can apply the theorem to the matrices $A A^T$ and $A^T A$. It is natural to ask how the eigenvalues and eigenvectors of these matrices are related.
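The answer, stated briefly: the nonzero eigenvalues of $A A^T$ and $A^T A$ coincide, and if $u$ is an eigenvector of $A A^T$ with eigenvalue $\lambda > 0$, then $A^T u$ is an eigenvector of $A^T A$ with the same eigenvalue (and symmetrically with $A v$). A small numerical illustration, using a random matrix rather than any particular example from the text:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 5))               # m-by-n with m != n

# Eigenvalues of the two symmetric matrices, sorted in decreasing order.
lam_big = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]    # m values
lam_small = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]  # n values

# The nonzero eigenvalues agree; A^T A just carries n - m extra zeros here.
assert np.allclose(lam_big, lam_small[:3])
assert np.allclose(lam_small[3:], 0)

# If u is an eigenvector of A A^T with eigenvalue lam > 0,
# then A^T u is an eigenvector of A^T A with the same eigenvalue:
# (A^T A)(A^T u) = A^T (A A^T) u = lam (A^T u).
lams, U = np.linalg.eigh(A @ A.T)
u, lam = U[:, -1], lams[-1]
v = A.T @ u
assert np.allclose(A.T @ A @ v, lam * v)
```

This correspondence is exactly what the singular value decomposition packages up: the shared nonzero eigenvalues are the squared singular values of $A$.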
