Starting from $$ y = \un x^\top\un A\un x, $$ we can write $$ \un x y \un x^\top = \un x\un x^\top\un A\un x\un x^\top, $$ and therefore, since $y$ is scalar, $$ {\rm tr}(\un x\un x^\top)y = {\rm tr}(\un x\un x^\top\un A\un x\un x^\top). $$ With the second of Equations (6), $$ {\rm tr}(\un x\un x^\top)y = {\rm tr}(\un A\un x\un x^\top\un x\un x^\top). $$ But since $(\un x^\top\un x)$ is also scalar, $$ {\rm tr}(\un x\un x^\top)y = {\rm tr}(\un A\un x\un x^\top)(\un x^\top\un x). $$ The identity then follows from ${\rm tr}(\un x\un x^\top)=(\un x^\top\un x)$.
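The trace identity $\un x^\top\un A\un x = {\rm tr}(\un A\un x\un x^\top)$ derived above is easy to check numerically. The following sketch uses arbitrary random test values for $\un A$ and $\un x$ (not taken from the text):

```python
import numpy as np

# Numerical check of the identity x^T A x = tr(A x x^T)
# for arbitrary random A and x (illustrative values only).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)

lhs = x @ A @ x                      # the scalar y = x^T A x
rhs = np.trace(A @ np.outer(x, x))   # tr(A x x^T)
assert np.isclose(lhs, rhs)
```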
Let $\un u_1$ and $\un u_2$ be eigenvectors of $\un A$ with distinct eigenvalues $\lambda_1$ and $\lambda_2$. Then $$\eqalign{ \un u_2^\top\un A\un u_1 &= \lambda_1\un u_2^\top\un u_1\cr \un u_1^\top\un A\un u_2 &= \lambda_2\un u_1^\top\un u_2.} $$ Subtracting gives, since $\un u_2^\top\un u_1=\un u_1^\top\un u_2$, $$ (\lambda_1-\lambda_2)\un u_1^\top\un u_2 =\un u_2^\top\un A\un u_1-\un u_1^\top\un A\un u_2. $$ But since $\un u_1^\top\un A\un u_2$ is scalar it is equal to its own transpose, $$ \un u_1^\top\un A\un u_2 = (\un u_1^\top\un A\un u_2)^\top = \un u_2^\top\un A\un u_1, $$ since $\un A$ is symmetric. Hence the RHS is zero and, because $\lambda_1\ne\lambda_2$, $\un u_1$ and $\un u_2$ are orthogonal.
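As a quick numerical illustration (with an arbitrary symmetrized random matrix, not one from the text), the eigenvectors returned for a symmetric matrix are indeed mutually orthogonal:

```python
import numpy as np

# Illustrative check: eigenvectors of a symmetric matrix are orthogonal.
# B is an arbitrary random matrix; A = B + B^T makes it symmetric.
rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
A = B + B.T

lam, U = np.linalg.eigh(A)   # columns of U are eigenvectors of A
dot = U[:, 0] @ U[:, 1]      # u_1^T u_2 should vanish
assert abs(dot) < 1e-10
```

Note that `np.linalg.eigh` returns an orthonormal eigenbasis even when eigenvalues repeat, so the check is most meaningful when the eigenvalues are distinct, as in the derivation above.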
Let $f(\un x)=ax_1^2+bx_2^2$ and $h(\un x)=x_1+x_2-1$. Then we get the three equations $$\eqalign{ {\partial\over\partial x_1}(f(\un x)+\lambda h(\un x))&=2ax_1+\lambda = 0\cr {\partial\over\partial x_2}(f(\un x)+\lambda h(\un x))&=2bx_2+\lambda = 0\cr {\partial\over\partial \lambda}(f(\un x)+\lambda h(\un x))&=x_1+x_2-1 = 0.} $$ The first two equations give $x_1=-\lambda/2a$ and $x_2=-\lambda/2b$; substituting into the third yields $\lambda=-2ab/(a+b)$, so the solution for $\un x$ is $$ x_1={b\over a+b},\quad x_2={a\over a+b}. $$
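The stationary point can be confirmed by brute force: parametrize the constraint by $x_1=t$, $x_2=1-t$ and minimize $f$ along that line. The values $a=2$, $b=3$ below are arbitrary test values:

```python
import numpy as np

# Brute-force check of the Lagrange-multiplier solution for
# illustrative values a = 2, b = 3 (any positive values would do).
a, b = 2.0, 3.0
x1, x2 = b / (a + b), a / (a + b)    # claimed constrained minimizer

# On the constraint x1 + x2 = 1, write x1 = t and scan a fine grid.
t = np.linspace(-2.0, 3.0, 100001)
f = a * t**2 + b * (1 - t)**2
t_star = t[np.argmin(f)]

assert abs(t_star - x1) < 1e-3
assert abs(a * x1**2 + b * x2**2 - f.min()) < 1e-6
```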
(a) $$\eqalign{ \langle X\rangle &= \int_0^1 x\cdot 1\ dx = 1/2\cr \var(X) &= \int_0^1 (x-1/2)^2\cdot 1\ dx = 1/12.} $$
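A Monte Carlo sketch confirms the uniform moments $\langle X\rangle = 1/2$ and $\var(X)=1/12$ (sample size and seed are arbitrary):

```python
import numpy as np

# Monte Carlo sanity check of <X> = 1/2 and var(X) = 1/12
# for X uniform on (0, 1).
rng = np.random.default_rng(2)
x = rng.random(1_000_000)
mean, var = x.mean(), x.var()

assert abs(mean - 0.5) < 0.005
assert abs(var - 1/12) < 0.005
```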
(b)
From the definition of variance,
$$\eqalign{ \var(Z) &= \langle(Z-\langle Z\rangle)^2\rangle \cr &= \langle Z^2 - 2Z\langle Z\rangle +\langle Z\rangle^2\rangle \cr &= \langle Z^2 \rangle -2\langle Z\rangle^2 + \langle Z\rangle^2 \cr &= \langle Z^2 \rangle -\langle Z\rangle^2.} $$Using this result,
$$\eqalign{ \var(a_0+a_1 Z) &= \var(a_1 Z) = \langle a_1^2 Z^2\rangle - \langle a_1 Z\rangle^2\cr &= a_1^2(\langle Z^2 \rangle -\langle Z\rangle^2) = a_1^2\var(Z).} $$(a) The gamma function is $$ \Gamma(\alpha) = \int_0^\infty x^{\alpha-1}e^{-x}dx. $$ Integrating by parts ($\int v\,du = uv-\int u\,dv$) with $$ v = x^{\alpha-1}, \quad du = e^{-x}dx \Rightarrow u = -e^{-x}, $$ we have $$\eqalign{ \Gamma(\alpha) &= -e^{-x}x^{\alpha-1}\Big|_0^\infty +\int_0^\infty e^{-x}(\alpha-1)x^{\alpha-2}dx \cr & = (\alpha-1)\int_0^\infty x^{\alpha-2}e^{-x}dx = (\alpha-1)\Gamma(\alpha-1). } $$
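The recursion $\Gamma(\alpha)=(\alpha-1)\Gamma(\alpha-1)$ can be verified directly with the standard-library gamma function; the values of $\alpha$ below are arbitrary:

```python
import math

# Check the recursion Gamma(alpha) = (alpha - 1) * Gamma(alpha - 1)
# at a few arbitrary test points.
for alpha in (2.5, 4.0, 7.3):
    assert math.isclose(math.gamma(alpha),
                        (alpha - 1) * math.gamma(alpha - 1))
```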
(b) The mean of the Gamma distribution is
$$ \langle X\rangle = \int_0^\infty x {1\over \beta^\alpha\Gamma(\alpha)}x^{\alpha-1}e^{-x/\beta}dx. $$Let $y=x/\beta$. Then
$$\eqalign{ \langle X\rangle &= \int_0^\infty \beta y {1\over \beta^\alpha\Gamma(\alpha)}\beta^{\alpha-1} y^{\alpha-1}e^{-y}\beta dy\cr &=\beta{1\over\Gamma(\alpha)}\int_0^\infty y^\alpha e^{-y}dy = \beta{1\over\Gamma(\alpha)}\Gamma(\alpha+1) = \alpha\beta. } $$Similarly
$$\eqalign{ \langle X^2 \rangle &= \int_0^\infty (\beta y)^2 {1\over \beta^\alpha\Gamma(\alpha)}\beta^{\alpha-1} y^{\alpha-1}e^{-y}\beta dy\cr &= \beta^2{1\over\Gamma(\alpha)}\int_0^\infty y^{\alpha+1} e^{-y}dy \cr &= \beta^2{1\over\Gamma(\alpha)}\Gamma(\alpha+2) = \beta^2{1\over\Gamma(\alpha)}(\alpha+1)\Gamma(\alpha+1) = \alpha(\alpha+1)\beta^2. } $$Therefore the variance is given by
$$ \var(X) = \langle X^2 \rangle - \langle X\rangle^2 = \alpha(\alpha+1)\beta^2 - (\alpha\beta)^2 = \alpha\beta^2. $$Let $X$ be standard normally distributed, i.e. with density function $$ p(x) = \phi(x) = {1\over\sqrt{2\pi}}e^{-x^2/2}. $$ We can't apply Theorem 2.1 directly to get the density function for $X^2$ because the function $u(x) = x^2$ is not monotonic wherever $p(x)\ne 0$. But consider the random variable $$ Y = Z^2,\quad\hbox{where}\quad Z=|X|. $$ We saw that $Z$ has the density function $$ f(z)=\cases{2\phi(z) & for $z>0$\cr 0 & elsewhere.} $$ The function $y=u(z)=z^2$ is monotonic wherever $f(z)>0$, so Theorem 2.1 holds. Inverting, $z=w(y)= y^{1/2}$. So for $y>0$ $$ g(y)=f(y^{1/2})\left|{1\over 2}y^{-1/2}\right|={2\over\sqrt{2\pi}}e^{-y/2}{1\over 2}y^{-1/2}={1\over\sqrt{2\pi}}y^{-1/2}e^{-y/2}, $$ which is the chi-square distribution for $m=1$ degree of freedom.
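A Monte Carlo sketch supports the conclusion: squaring standard normal samples should reproduce the chi-square moments for one degree of freedom, $\langle Y\rangle = 1$ and $\var(Y)=2$ (sample size and seed are arbitrary):

```python
import numpy as np

# Monte Carlo check that Y = X^2, with X standard normal, has the
# chi-square moments for m = 1 degree of freedom: <Y> = 1, var(Y) = 2.
rng = np.random.default_rng(3)
y = rng.standard_normal(1_000_000) ** 2

assert abs(y.mean() - 1.0) < 0.01
assert abs(y.var() - 2.0) < 0.05
```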
Let $A_i$ represent the situation "auto is behind door $i$". The a priori probabilities are $\pr(A_i)=1/3$, $i=1,2,3$. Let $O_i$ be the observation "quizmaster opens door $i$". Suppose the contestant chooses door 2 and the quizmaster opens door 1. Then we have the following conditional probabilities:
$$\eqalign{ \pr(O_1\mid A_1) &= 0\ \quad\hbox{quizmaster won't give the car away.}\cr \pr(O_1\mid A_2) &= 1/2\quad\hbox{quizmaster is indifferent.}\cr \pr(O_1\mid A_3) &= 1\ \quad\hbox{quizmaster has no choice.} } $$Now apply Bayes' Theorem to find the a posteriori probability that the auto is behind door 2, given the observation:
$$\eqalign{ \pr(A_2\mid O_1) &= {\pr(O_1\mid A_2)\pr(A_2)\over \pr(O_1\mid A_2)\pr(A_2)+\pr(O_1\mid A_3)\pr(A_3)}\cr &= {(1/2)(1/3)\over (1/2)(1/3)+(1)(1/3)} = 1/3,} $$whereas, with the same argument,
$$ \pr(A_3\mid O_1) = 2/3. $$The contestant would therefore be well-advised to switch to door 3.
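The a posteriori probabilities can also be confirmed by simulating the game. The sketch below (with arbitrary trial count and seed) plays many rounds and compares the winning frequencies of staying versus switching:

```python
import random

# Simulation of the quizmaster problem: switching should win about
# 2/3 of the time, staying about 1/3.
random.seed(4)
n = 100_000
stay_wins = switch_wins = 0
for _ in range(n):
    car = random.randrange(3)    # door hiding the auto
    pick = random.randrange(3)   # contestant's initial choice
    # Quizmaster opens a door that is neither the pick nor the car.
    opened = next(d for d in (0, 1, 2) if d != pick and d != car)
    # Switching means taking the remaining unopened door.
    switched = next(d for d in (0, 1, 2) if d != pick and d != opened)
    stay_wins += (pick == car)
    switch_wins += (switched == car)

assert abs(stay_wins / n - 1/3) < 0.01
assert abs(switch_wins / n - 2/3) < 0.01
```

(When the contestant's pick hides the car the quizmaster has two doors to choose from; the simulation opens the lower-numbered one, which does not affect the winning frequencies.)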