Concept

Matrix-Vector Product as Transformation

Multiplication by a matrix $\mathbf{A} \in \mathbb{R}^{m \times n}$ can be interpreted as a transformation that projects vectors from an $n$-dimensional space ($\mathbb{R}^{n}$) into an $m$-dimensional space ($\mathbb{R}^{m}$). These transformations are highly useful; for instance, specific square matrices can represent rotations. Furthermore, matrix-vector products are fundamental in deep learning, as they describe the core calculation for computing the outputs of a neural network layer based on the previous layer's outputs.
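As a minimal sketch of this idea (the `matvec` helper below is illustrative, not from the text), the product $\mathbf{y} = \mathbf{A}\mathbf{x}$ with $\mathbf{A} \in \mathbb{R}^{m \times n}$ takes an $n$-vector and returns an $m$-vector:

```python
def matvec(A, x):
    """Multiply an m-by-n matrix A (a list of rows) by an n-vector x.

    Each output entry is the dot product of one row of A with x,
    so the result lives in R^m even though x lives in R^n.
    """
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# A 2x3 matrix maps a vector from R^3 into R^2.
A = [[1.0, 0.0, 2.0],
     [0.0, 1.0, -1.0]]
x = [3.0, 4.0, 5.0]
y = matvec(A, x)  # [1*3 + 0*4 + 2*5, 0*3 + 1*4 - 1*5] = [13.0, -1.0]
```

This is the same computation a fully connected neural network layer performs on its inputs (typically followed by adding a bias and applying a nonlinearity).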

Updated 2026-05-02

Tags

D2L

Dive into Deep Learning @ D2L