3.2 Matrices, the workhorses of linear algebra
I am quite sure that you were already familiar with the notion of matrices before reading this book. Matrices are one of the most important data structures: they can represent systems of equations, graphs, mappings between vector spaces, and much more. Matrices are the fundamental building blocks of machine learning.
At first glance, we can define a matrix as a table of numbers. If the matrix $A$ has, for instance, $n$ rows and $m$ columns of real numbers, we write
$$
A = \begin{pmatrix}
a_{1,1} & a_{1,2} & \cdots & a_{1,m} \\
a_{2,1} & a_{2,2} & \cdots & a_{2,m} \\
\vdots  & \vdots  & \ddots & \vdots  \\
a_{n,1} & a_{n,2} & \cdots & a_{n,m}
\end{pmatrix}. \tag{3.1}
$$
When we don’t want to write out the entire matrix as (3.1), we use the abbreviation $A = (a_{i,j})_{i=1,j=1}^{n,m}$.
The set of all $n \times m$ real matrices is denoted by $\mathbb{R}^{n \times m}$. We will exclusively talk about real matrices, but when referring to other types, this notation is modified accordingly. For instance, $\mathbb{Z}^{n \times m}$ denotes the set of integer matrices.
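To make the notation concrete, here is a minimal NumPy sketch (my illustration, not part of the text) that builds an element of $\mathbb{R}^{2 \times 3}$ and shows how the $a_{i,j}$ indexing maps onto zero-based array indices.

```python
import numpy as np

# A concrete 2 x 3 real matrix, i.e. an element of R^{2x3}.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

n, m = A.shape           # number of rows and columns: (2, 3)

# NumPy indexes from 0, so the entry a_{i,j} in the notation above
# corresponds to A[i - 1, j - 1] here.
a_12 = A[0, 1]           # a_{1,2} = 2.0

# An integer matrix, an element of Z^{2x3}; the dtype records the entry type.
B = np.array([[1, 2, 3],
              [4, 5, 6]])
print(A.dtype, B.dtype)  # float64 and a platform-dependent integer type
```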
Matrices can be added and multiplied together, or multiplied by a scalar.
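As a quick numerical preview (again a sketch of mine in NumPy, not the book’s code), these three operations look as follows; the precise definitions are given next.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

C = A + B        # entrywise addition: c_{i,j} = a_{i,j} + b_{i,j}
D = 2.0 * A      # multiplication by the scalar 2
E = A @ B        # matrix product: e_{i,j} = sum_k a_{i,k} * b_{k,j}

print(C)         # [[1. 3.]
                 #  [4. 4.]]
```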
Definition 14. ...