Advanced Linear Algebra: Write Out Your Solutions Clearly


1. Do one of the following.

(a) Let A be an n × n matrix over C with characteristic polynomial (x − λ)^n. Prove that for k ≥ 1, rank (A − λI)^{k−1} − rank (A − λI)^k equals the number of Jordan blocks of A of size l × l with l ≥ k.

(b) Let V be a finite-dimensional vector space over a field F, and let Ṽ be the vector space dual of V. Let A : V → V be a linear transformation, and define Ã : Ṽ → Ṽ to be the unique linear transformation such that (Ãf)(v) = f(Av) for all v ∈ V and f ∈ Ṽ. Show that A and Ã have the same Jordan canonical form.
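Comment: The identity in 1(a) can be sanity-checked numerically before attempting the proof. The sketch below builds a matrix already in Jordan form and compares the rank drops against the block counts. This is a minimal illustration assuming NumPy; the helper jordan_block, the eigenvalue, and the block sizes are arbitrary choices, not part of the problem.

```python
import numpy as np

def jordan_block(lam, size):
    """One Jordan block: lam on the diagonal, 1 on the superdiagonal."""
    return lam * np.eye(size) + np.diag(np.ones(size - 1), k=1)

lam, sizes = 2.0, [3, 2, 2, 1]   # hypothetical eigenvalue and block sizes
n = sum(sizes)
A = np.zeros((n, n))
start = 0
for s in sizes:
    A[start:start+s, start:start+s] = jordan_block(lam, s)
    start += s

N = A - lam * np.eye(n)          # nilpotent part of A
for k in range(1, max(sizes) + 1):
    rank_drop = (np.linalg.matrix_rank(np.linalg.matrix_power(N, k - 1))
                 - np.linalg.matrix_rank(np.linalg.matrix_power(N, k)))
    blocks_ge_k = sum(1 for s in sizes if s >= k)
    print(k, rank_drop, blocks_ge_k)   # the two counts agree for every k
```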

2. Do one of the following.

(a) Do the following.

i. Prove: If A is an invertible matrix with SVD A = UΣV^*, then Σ is invertible and A^{−1} has SVD A^{−1} = VΣ^{−1}U^*.

ii. The condition number of an invertible matrix A is κ(A) = ||A||_op ||A^{−1}||_op. Show that κ(A) ≥ 1. Hint: Use (i) to write ||A^{−1}||_op in terms of the singular values of A. Comment: The condition number bounds the relative error in the solution to Ax = b in terms of the relative error in b: Given an error h in b, we are solving Ax = b + h. It can be shown that ||A^{−1}h||/||x|| ≤ κ(A) ||h||/||b||.

(b) In class we saw an algorithm to find the SVD of a matrix A, which required us to (1) find the eigenvalues of A^*A to obtain the singular values, (2) find orthonormal bases for the eigenspaces of A^*A to obtain the right singular vectors, and (3) apply A to each of these basis vectors and rescale to obtain the left singular vectors. Each step involves other steps, such as finding a determinant and applying Gram–Schmidt.

i. Show each step (with substeps) as you compute the SVD of A = [1 −1 1 − ].

ii. Show each step as you use the SVD of A in (i) to compute the Moore–Penrose inverse of A.

You may perform/verify your work with a computer algebra system, but all key details should be included in your solution. Include a description/title for each step.
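Comment: For 2(a)ii, part (i) implies that the singular values of A^{−1} are the reciprocals of those of A, so ||A^{−1}||_op = 1/σ_min and κ(A) = σ_max/σ_min ≥ 1. A minimal numerical illustration, assuming NumPy (the test matrix is an arbitrary choice):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])        # arbitrary invertible test matrix

sigma = np.linalg.svd(A, compute_uv=False)   # singular values, descending
kappa_from_svd = sigma[0] / sigma[-1]        # sigma_max / sigma_min

# Compare with the definition kappa = ||A||_op * ||A^{-1}||_op,
# where the operator norm is the largest singular value (ord=2).
kappa_from_def = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)

print(kappa_from_svd, kappa_from_def)        # equal, and always >= 1
```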

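Comment: The three steps of the algorithm in 2(b) can be mirrored in code, as can the pseudoinverse computation in 2(b)ii. The sketch below assumes NumPy and a square matrix of full rank (the rank-deficient case needs extra bookkeeping to complete U to an orthonormal basis); the helpers svd_via_eig and pseudoinverse are illustrative names, and the example matrix is an arbitrary choice, not the matrix A from part (i).

```python
import numpy as np

def svd_via_eig(A):
    """SVD of a full-rank square A, following the three steps in 2(b)."""
    # Step 1: eigenvalues of A*A are the squared singular values.
    evals, evecs = np.linalg.eigh(A.conj().T @ A)   # eigh sorts ascending
    order = np.argsort(evals)[::-1]                 # re-sort descending
    sigma = np.sqrt(evals[order])
    # Step 2: orthonormal eigenvectors of A*A are the right singular vectors.
    V = evecs[:, order]
    # Step 3: scaled images u_i = A v_i / sigma_i are the left singular vectors.
    U = (A @ V) / sigma
    return U, sigma, V

def pseudoinverse(U, sigma, V):
    """Moore-Penrose inverse from the SVD: A^+ = V diag(1/sigma) U^*."""
    return V @ np.diag(1.0 / sigma) @ U.conj().T

A = np.array([[1.0, -1.0],
              [1.0,  2.0]])        # arbitrary full-rank example
U, sigma, V = svd_via_eig(A)
print(np.allclose(U @ np.diag(sigma) @ V.conj().T, A))         # reconstructs A
print(np.allclose(pseudoinverse(U, sigma, V) @ A, np.eye(2)))  # A^+ = A^{-1} here
```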
3. Do one of the following.

(a) Let V denote an inner product space. Fix y, z ∈ V and define T : V → V by T(x) = ⟨x, y⟩z. Show that T is linear and that T^* exists. Give an expression for T^*(x) involving x, y, and z.

(b) Let V be a finite-dimensional inner product space, and let {α_1, ..., α_n} be an orthonormal basis for V. Show that for any vectors α, β ∈ V,

(α|β) = ∑_{k=1}^n (α|α_k) \overline{(β|α_k)},

where the bar denotes complex conjugation.
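Comment: For 3(a), a candidate formula for T^* can be tested numerically before proving it. The check below assumes NumPy and the convention that the inner product is linear in its first argument; it previews the expected answer T^*(x) = ⟨x, z⟩y.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

def inner(u, v):
    """<u, v>, linear in the first slot, conjugate-linear in the second."""
    return np.vdot(v, u)       # np.vdot conjugates its first argument

T      = lambda x: inner(x, y) * z    # T(x) = <x, y> z
T_star = lambda x: inner(x, z) * y    # claimed adjoint: T*(x) = <x, z> y

# Verify <T(u), v> == <u, T*(v)> on random vectors.
u = rng.standard_normal(n) + 1j * rng.standard_normal(n)
v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
print(np.isclose(inner(T(u), v), inner(u, T_star(v))))   # True
```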

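Comment: The identity in 3(b) is a Parseval-type formula; over C the conjugate on the second factor is essential. A quick numerical check, assuming NumPy (the random orthonormal basis and test vectors are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
# Random orthonormal basis of C^n: Q from a QR factorization is unitary.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
basis = Q.T                          # rows are the basis vectors a_k

def inner(u, v):
    return np.vdot(v, u)             # (u|v), conjugate-linear in v

a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

lhs = inner(a, b)
rhs = sum(inner(a, ak) * np.conj(inner(b, ak)) for ak in basis)
print(np.isclose(lhs, rhs))          # True
```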
4. Write a brief synopsis of the following proof. That is, summarize the key steps/big ideas in a few sentences which provide an outline, omitting the more routine details.

Let T be a linear operator on the finite-dimensional space V. Prove that T has a cyclic vector if and only if the following is true: every linear operator U which commutes with T is a polynomial in T.

Proof. First suppose T has a cyclic vector α, and say U commutes with T. Then V has a basis {α, Tα, ..., T^{n−1}α}. Now Uα = ∑_{i=1}^n c_i T^{i−1}α. Let f = ∑_{i=1}^n c_i X^{i−1}, so Uα = f(T)α. Hence UT^iα = T^iUα = T^if(T)α = f(T)T^iα. Thus U and f(T) agree on a basis of V, so U = f(T).

Now suppose every linear operator U which commutes with T is a polynomial in T. By Theorem 3, we have a cyclic decomposition V = Z(α_1;T) ⊕ ··· ⊕ Z(α_r;T). Since each cyclic subspace is T-invariant, T commutes with the projection E_i of V onto Z(α_i;T).

In particular, E_i = f_i(T) for some polynomial f_i by assumption. Suppose for the sake of contradiction that r > 1. Observe that E_2α_1 = f_2(T)α_1 = 0. In particular, the T-annihilator p_1 of α_1 divides f_2. But by the cyclic decomposition theorem, the T-annihilator p_2 of α_2 divides p_1.

Hence p_2 | f_2, which implies f_2(T)α_2 = 0, contradicting the fact that E_2, the projection of V onto Z(α_2;T), acts as the identity on α_2. Thus r = 1, so T has a cyclic vector.

Sample: There is no single correct synopsis of a proof, as different people may judge different steps important enough to merit mentioning. The actual problem is more involved than this sample (e.g. each direction of the if and only if will have its own subsynopsis), and so the synopsis will be as well. But for reference we offer the following elementary example.

Claim: If vectors v_1, ..., v_n are linearly independent, then every vector in their span can be written as a unique linear combination of these vectors.

Proof. Suppose γ in the span of v_1, ..., v_n has two distinct expressions:

γ = b_1v_1 + ··· + b_nv_n
γ = c_1v_1 + ··· + c_nv_n

Subtracting these two equations gives

0 = (b_1v_1 + ··· + b_nv_n) − (c_1v_1 + ··· + c_nv_n) = (b_1 − c_1)v_1 + ··· + (b_n − c_n)v_n.

Since the expressions are distinct, not all of the coefficients (b_i − c_i) are zero, so v_1, ..., v_n are linearly dependent. The result follows by contrapositive.

The key step in this proof is showing that linear dependence is the negation of the uniqueness of the linear-combination representation.
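Comment: The forward direction of the theorem in problem 4 can also be probed numerically. A companion matrix T is cyclic (e_1 is a cyclic vector), so every matrix commuting with T should be a polynomial in T; equivalently, the commutant of T has dimension exactly n, matching span{I, T, ..., T^{n−1}}. A rough check assuming NumPy; the polynomial is an arbitrary choice.

```python
import numpy as np

# Companion matrix of x^4 - 2x + 3 (arbitrary choice); e_1 is a cyclic vector.
c = [3.0, -2.0, 0.0, 0.0]            # coefficients of 1, x, x^2, x^3
n = len(c)
T = np.zeros((n, n))
T[1:, :-1] = np.eye(n - 1)           # ones on the subdiagonal
T[:, -1] = [-ci for ci in c]         # last column from the polynomial

# Commutant of T: solutions X of TX - XT = 0, vectorized with Kronecker
# products; the null space dimension of M is the commutant's dimension.
M = np.kron(np.eye(n), T) - np.kron(T.T, np.eye(n))
dim_commutant = n * n - np.linalg.matrix_rank(M)

# {I, T, ..., T^{n-1}} is linearly independent (the minimal polynomial has
# degree n), so polynomials in T give an n-dimensional space of commuting maps.
print(dim_commutant == n)   # True: every commuting operator is a polynomial in T
```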