A matrix is a rectangular array of numbers arranged in rows and columns. While this definition sounds simple, matrices are one of the most powerful tools in mathematics. They are used to solve systems of equations, represent transformations in computer graphics, process data in machine learning, and model networks in graph theory.
Matrix addition is straightforward: add corresponding elements. Multiplication is more involved. To multiply two matrices, the number of columns in the first must equal the number of rows in the second; you then take the dot product of each row of the first matrix with each column of the second. The result has the same number of rows as the first matrix and the same number of columns as the second. This means the order matters: AB is generally not the same as BA.
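As a quick sketch of the row-by-column rule, here is matrix multiplication in plain Python, with matrices written as lists of lists (the function name `mat_mul` is just for illustration):

```python
def mat_mul(a, b):
    """Multiply an m x n matrix a by an n x p matrix b."""
    n = len(b)  # rows of b must equal columns of a
    assert all(len(row) == n for row in a), "incompatible dimensions"
    p = len(b[0])
    # entry (i, j) is the dot product of row i of a with column j of b
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(p)]
            for i in range(len(a))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
print(mat_mul(A, B))  # [[2, 1], [4, 3]]
print(mat_mul(B, A))  # [[3, 4], [1, 2]]
```

Running it on these two matrices also demonstrates the point about order: AB and BA come out different.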
The determinant is a scalar value associated with a square matrix. For a 2x2 matrix [a b; c d], the determinant is ad - bc. The determinant tells you whether the matrix has an inverse (an inverse exists exactly when the determinant is non-zero) and how the matrix scales areas when used as a transformation: the absolute value of the determinant is the area-scaling factor.
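The 2x2 formula is short enough to verify directly (the helper name `det2` is illustrative):

```python
def det2(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: ad - bc."""
    (a, b), (c, d) = m
    return a * d - b * c

print(det2([[3, 1], [2, 4]]))  # 10 -> invertible; areas scale by 10
print(det2([[2, 4], [1, 2]]))  # 0  -> no inverse (the rows are parallel)
```

The second matrix shows the degenerate case: its rows point the same way, it squashes the plane onto a line, and its determinant is zero.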
Eigenvalues are special scalars λ such that multiplying the matrix by a certain non-zero vector (the eigenvector v) just scales that vector: Av = λv. They are used in principal component analysis, vibration analysis, and stability theory.
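For a 2x2 matrix, the eigenvalues can be found by hand from the characteristic equation λ² − (trace)λ + det = 0. A minimal sketch, assuming real eigenvalues (the function name `eigvals2` is illustrative):

```python
import math

def eigvals2(m):
    """Eigenvalues of a 2x2 matrix, assuming they are real:
    roots of lambda^2 - trace*lambda + det = 0 (quadratic formula)."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

M = [[2, 1], [1, 2]]
print(eigvals2(M))  # (3.0, 1.0)

# Check the defining property Av = lambda*v for the eigenvector v = [1, 1]:
v = [1, 1]
Mv = [M[0][0] * v[0] + M[0][1] * v[1],
      M[1][0] * v[0] + M[1][1] * v[1]]
print(Mv)  # [3, 3], i.e. 3 * [1, 1] -- the matrix just scales v by 3
```

The check at the end makes the definition concrete: multiplying M by its eigenvector [1, 1] does not rotate it at all, it only stretches it by the eigenvalue 3.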
Practice with our Matrix Calculator and Eigenvalue Finder to solidify your understanding.