Learn Before
Concept

Matrix-Vector Product as a Transformation

Multiplying a matrix $\mathbf{A} \in \mathbb{R}^{m \times n}$ by a vector $\mathbf{x} \in \mathbb{R}^n$ can be interpreted as applying a transformation that maps vectors from an $n$-dimensional space ($\mathbb{R}^n$) into an $m$-dimensional space ($\mathbb{R}^m$). These transformations are highly versatile; for example, specific square matrices can represent geometric rotations. Furthermore, matrix-vector products are the foundational calculations used to compute the outputs of a neural network layer from the data received from the preceding layer.
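A minimal NumPy sketch of these ideas (the matrix entries below are illustrative values, not from the source): a $2 \times 3$ matrix maps a vector from $\mathbb{R}^3$ into $\mathbb{R}^2$, and a $2 \times 2$ rotation matrix rotates a vector in the plane.

```python
import numpy as np

# A in R^{2x3} maps vectors from R^3 into R^2 (hypothetical values)
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, -1.0]])
x = np.array([3.0, 4.0, 1.0])   # x lives in R^3
y = A @ x                        # y = Ax lives in R^2
print(y)                         # [5. 3.]

# A square 2x2 matrix can encode a geometric rotation by theta radians
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(R @ np.array([1.0, 0.0]))  # the x-axis unit vector rotated 90 degrees
```

A fully connected neural network layer computes its output the same way, as $\mathbf{A}\mathbf{x}$ (plus a bias), where $\mathbf{x}$ is the preceding layer's output.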


Updated 2026-05-01


Tags

D2L

Dive into Deep Learning @ D2L