#### 5.1.2 Matrices

1. [E] Why do we say that matrices are linear transformations?
2. [E] What’s the inverse of a matrix? Do all matrices have an inverse? Is the inverse of a matrix always unique?
3. [E] What does the determinant of a matrix represent?
4. [E] What happens to the determinant of a matrix if we multiply one of its rows by a scalar?
5. [M] A $4 \times 4$ matrix has four eigenvalues $\lambda_1, \lambda_2, \lambda_3, \lambda_4$. What can we say about the trace and the determinant of this matrix?
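
These determinant and eigenvalue facts are easy to sanity-check numerically. A small NumPy sketch (the matrix and scalar here are arbitrary choices of mine, not from the questions):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Inverse: exists iff det(A) != 0, and is then unique.
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(4))

# Multiplying one row by a scalar t multiplies the determinant by t.
t = 2.5
B = A.copy()
B[1] *= t
assert np.isclose(np.linalg.det(B), t * np.linalg.det(A))

# Trace = sum of eigenvalues, determinant = product of eigenvalues.
eigvals = np.linalg.eigvals(A)
assert np.isclose(np.trace(A), eigvals.sum().real)
assert np.isclose(np.linalg.det(A), np.prod(eigvals).real)
```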
6. [M] Given the following matrix:

Hint: rely on a property of this matrix to determine its determinant.

7. [M] What’s the difference between the covariance matrix and the Gram matrix?
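
One way to keep the two straight: for a data matrix $X$ with $n$ samples as rows and $d$ features as columns, the covariance matrix is $d \times d$ (feature-by-feature, built from *centered* data), while the Gram matrix is $n \times n$ (sample-by-sample inner products). A sketch with toy data of my own:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 3
X = rng.standard_normal((n, d))  # n samples, d features

# Covariance matrix: d x d, built from centered features.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (n - 1)
assert np.allclose(cov, np.cov(X, rowvar=False))

# Gram matrix: n x n, pairwise inner products of the (uncentered) samples.
gram = X @ X.T
assert gram.shape == (n, n)
assert np.allclose(gram, gram.T)  # both matrices are symmetric
```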
8. Given a matrix $A$ and a vector $b$:

1. [M] Find $x$ such that: $Ax = b$.
2. [E] When does this have a unique solution?
3. [M] Why is it that when $A$ has more columns than rows, $Ax = b$ has multiple solutions?
4. [M] Given a matrix $A$ with no inverse, how would you solve the equation $Ax = b$? What is the pseudoinverse, and how do you calculate it?
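
A sketch of the cases above (toy shapes of my own choosing): `np.linalg.solve` for the square full-rank case, and the Moore-Penrose pseudoinverse `np.linalg.pinv` (computed via the SVD) for the underdetermined case, where it picks the minimum-norm solution out of infinitely many:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unique solution: square, full-rank A.
A = rng.standard_normal((3, 3))
b = rng.standard_normal(3)
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)

# More columns than rows: underdetermined, infinitely many solutions.
A_wide = rng.standard_normal((2, 4))
b2 = rng.standard_normal(2)
x_min_norm = np.linalg.pinv(A_wide) @ b2   # minimum-norm solution
null_dir = np.linalg.svd(A_wide)[2][-1]    # a direction in the null space...
x_other = x_min_norm + null_dir            # ...giving another valid solution
assert np.allclose(A_wide @ x_min_norm, b2)
assert np.allclose(A_wide @ x_other, b2)
assert not np.allclose(x_min_norm, x_other)
```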
9. The derivative is the backbone of gradient descent.

1. [E] What does a derivative represent?
2. [M] What’s the difference between derivative, gradient, and Jacobian?
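
The shape distinction can be seen with finite differences: a derivative is a single number, a gradient has one partial per input, and a Jacobian has one row per output and one column per input. A sketch with toy functions of my own:

```python
import numpy as np

eps = 1e-6

# Derivative of f(x) = x^2 at x = 3 is one number (6).
f = lambda x: x ** 2
deriv = (f(3.0 + eps) - f(3.0 - eps)) / (2 * eps)
assert np.isclose(deriv, 6.0, atol=1e-4)

# Gradient of g: R^3 -> R, g(x) = sum(x^2): one partial per input, a vector.
g = lambda x: np.sum(x ** 2)
x0 = np.array([1.0, 2.0, 3.0])
grad = np.array([(g(x0 + eps * e) - g(x0 - eps * e)) / (2 * eps)
                 for e in np.eye(3)])
assert grad.shape == (3,)
assert np.allclose(grad, 2 * x0, atol=1e-4)

# Jacobian of h: R^3 -> R^2: one row per output, one column per input.
M = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
h = lambda x: M @ x
jac = np.stack([(h(x0 + eps * e) - h(x0 - eps * e)) / (2 * eps)
                for e in np.eye(3)], axis=1)
assert jac.shape == (2, 3)
assert np.allclose(jac, M, atol=1e-4)
```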
10. [H] Say we have the weights $w \in \mathbb{R}^{d \times m}$ and a mini-batch $x$ of $n$ elements, each element of shape $1 \times d$, so that $x \in \mathbb{R}^{n \times d}$. We have the output $y = f(x; w) = xw$. What’s the dimension of the Jacobian $\frac{\partial y}{\partial x}$?
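
A sketch of one consistent reading of this question (assuming $w \in \mathbb{R}^{d \times m}$, $x \in \mathbb{R}^{n \times d}$, $y = xw$): the Jacobian $\partial y / \partial x$ is a 4-D tensor of shape $(n, m, n, d)$ with entries $\delta_{ik} w_{lj}$, which can be checked against finite differences:

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, m = 4, 3, 2
w = rng.standard_normal((d, m))
x = rng.standard_normal((n, d))

# Analytic Jacobian: dy[i, j] / dx[k, l] = delta(i, k) * w[l, j].
jac = np.einsum("ik,lj->ijkl", np.eye(n), w)
assert jac.shape == (n, m, n, d)

# Finite-difference check of one entry: perturb x[k, l], watch y[i, j].
eps = 1e-6
i, j, k, l = 1, 0, 1, 2
xp = x.copy(); xp[k, l] += eps
xm = x.copy(); xm[k, l] -= eps
fd = ((xp @ w)[i, j] - (xm @ w)[i, j]) / (2 * eps)
assert np.isclose(jac[i, j, k, l], fd, atol=1e-4)
```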
11. [H] Given a very large symmetric matrix $A$ that doesn’t fit in memory and a function that can quickly compute the matrix-vector product $f(x) = Ax$. Find the unit vector $x$ so that $x^TAx$ is minimal.

Hint: Can you frame it as an optimization problem and use gradient descent to find an approximate solution?
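
Following the hint, one sketch (at toy size, with a test matrix and step size of my own choosing): minimizing $x^TAx$ over unit vectors finds the eigenvector of $A$'s smallest eigenvalue, and projected gradient descent needs only the matvec oracle per step, never the full matrix:

```python
import numpy as np

rng = np.random.default_rng(4)
dim = 50
# Symmetric test matrix with known spectrum in [1, 10]; the real A would
# only ever be touched through the black-box matvec f below.
Q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
A = Q @ np.diag(np.linspace(1.0, 10.0, dim)) @ Q.T
f = lambda v: A @ v  # stand-in for the fast matvec oracle

x = rng.standard_normal(dim)
x /= np.linalg.norm(x)
lr = 0.09  # assumes a rough upper bound on the largest eigenvalue is known
for _ in range(2000):
    x = x - lr * 2 * f(x)   # gradient of x^T A x is 2 A x (A symmetric)
    x /= np.linalg.norm(x)  # project back onto the unit sphere

# Rayleigh quotient x^T A x should approach the smallest eigenvalue (1.0).
smallest = np.linalg.eigh(A)[0][0]
assert np.isclose(x @ f(x), smallest, atol=1e-3)
```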