- [E] Why do we say that matrices are linear transformations?
- [E] What’s the inverse of a matrix? Do all matrices have an inverse? Is the inverse of a matrix always unique?
- [E] What does the determinant of a matrix represent?
- [E] What happens to the determinant of a matrix if we multiply one of its rows by a scalar $t$?
- [M] A $4 \times 4$ matrix has four eigenvalues $\lambda_1, \lambda_2, \lambda_3, \lambda_4$. What can we say about the trace and the determinant of this matrix?
- [M] Given the following matrix:
  Without explicitly using the equation for calculating determinants, what can we say about this matrix’s determinant?
  Hint: rely on a property of this matrix to determine its determinant.
- [M] What’s the difference between the covariance matrix and the Gram matrix?
- [M] Find $x$ such that $Ax = b$.
- [E] When does this have a unique solution?
- [M] Why is it that when $A$ has more columns than rows, $Ax = b$ has multiple solutions?
- [M] Given a matrix $A$ with no inverse, how would you solve the equation $Ax = b$? What is the pseudoinverse and how do you calculate it?
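Several of the determinant, eigenvalue, and pseudoinverse questions above can be sanity-checked numerically. A minimal NumPy sketch (illustrative only, not an answer key — the matrices and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Multiplying one row by a scalar t scales the determinant by t.
t = 2.5
B = A.copy()
B[1] *= t
assert np.isclose(np.linalg.det(B), t * np.linalg.det(A))

# Trace = sum of eigenvalues; determinant = product of eigenvalues.
eig = np.linalg.eigvals(A)
assert np.isclose(np.trace(A), eig.sum().real)
assert np.isclose(np.linalg.det(A), np.prod(eig).real)

# When A has no inverse (here: more columns than rows), the
# Moore-Penrose pseudoinverse gives the minimum-norm least-squares
# solution of Ax = b.
A2 = rng.standard_normal((3, 5))
b = rng.standard_normal(3)
x = np.linalg.pinv(A2) @ b
assert np.allclose(A2 @ x, b)  # underdetermined: an exact solution exists
```

Experimenting like this is also a reasonable way to build intuition before answering the "why" behind each property.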
The derivative is the backbone of gradient descent.
- [E] What does a derivative represent?
- [M] What’s the difference between derivative, gradient, and Jacobian?
- [H] Say we have the weights $w \in \mathbb{R}^{d \times m}$ and a mini-batch $x$ of $n$ elements, each element of shape $1 \times d$, so that $x \in \mathbb{R}^{n \times d}$. We have the output $y = f(x; w) = xw$. What’s the dimension of the Jacobian $\frac{\partial y}{\partial x}$?
- [H] Given a very large symmetric matrix $A$ that doesn’t fit in memory (say $A \in \mathbb{R}^{n \times n}$ with $n$ on the order of millions) and a function $f$ that can quickly compute the matrix–vector product $f(x) = Ax$ for any $x \in \mathbb{R}^n$, find the unit vector $x$ so that $x^T A x$ is minimal.
  Hint: Can you frame it as an optimization problem and use gradient descent to find an approximate solution?
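One way to make the hint concrete: minimize the quadratic form over the unit sphere with projected gradient descent, touching $A$ only through matrix–vector products. A hedged sketch (the function name `min_rayleigh`, the step size, and the small test matrix are my own choices, not from the source):

```python
import numpy as np

def min_rayleigh(matvec, n, lr=0.01, steps=5000, seed=0):
    """Approximate argmin over unit vectors x of x^T A x, for symmetric A,
    using only matrix-vector products.

    `matvec` stands in for the black-box function f(x) = Ax; A itself is
    never materialized, so this also works when A doesn't fit in memory.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    x /= np.linalg.norm(x)
    for _ in range(steps):
        Ax = matvec(x)
        # Gradient of x^T A x restricted to the unit sphere:
        # project 2*Ax onto the tangent space at x (A symmetric).
        grad = 2 * (Ax - (x @ Ax) * x)
        x -= lr * grad
        x /= np.linalg.norm(x)  # project back onto the unit sphere
    return x

# Small in-memory sanity check against a direct eigendecomposition.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 30))
A = (M + M.T) / 2                    # symmetric test matrix
x = min_rayleigh(lambda v: A @ v, 30)
print(x @ A @ x, np.linalg.eigvalsh(A).min())  # should be close
```

At convergence $x$ approximates the eigenvector of the smallest eigenvalue, since $x^T A x$ over unit vectors is the Rayleigh quotient, whose minimum is $\lambda_{\min}$.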