Consider a collection of numbers arranged in $n$ rows and $n$ columns, i.e. a square matrix that we will call $\bf{A}$. Let us also consider a non-zero column vector $\bf x$ with $n$ elements. We can then carry out the matrix multiplication $\bf{Ax}$. Now we raise the following question: is there a number $\lambda$ such that the multiplication $\lambda \bf x$ gives the same result as $\bf{Ax}$? In other words, does $\bf{Ax} = \lambda \bf x$ hold? If so, we say that $\lambda$ is an eigenvalue of $\bf A$ and $\bf x$ is the corresponding eigenvector. Great! We can compute these quantities, but why are we interested in them? Well, it turns out that many applications in science and engineering rely on linear transformations, which in turn are characterized by their eigenvectors and eigenvalues. A linear transformation is a function between two vector spaces that preserves the operations of addition and scalar multiplication. In simpler terms, a linear transformation maps straight lines to straight lines (or to a single point), and such transformations can describe how to stretch or rotate an object, which is indeed useful.
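As a quick illustration of the defining relation $\bf{Ax} = \lambda \bf x$, the sketch below uses NumPy to compute the eigenvalues and eigenvectors of a small $2 \times 2$ matrix and verifies the relation numerically; the particular matrix is chosen here purely as an example.

```python
import numpy as np

# An example 2x2 matrix (chosen for illustration only).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and the eigenvectors
# (the eigenvectors are the columns of the second array).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining relation A x = lambda x for each pair.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)

print(sorted(eigenvalues))
```

For this matrix the eigenvalues work out to $1$ and $3$, and each returned eigenvector satisfies $\bf{Ax} = \lambda \bf x$ up to floating-point precision.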