Concept

Unnecessary Memory Allocation in Machine Learning

Operations that constantly allocate new memory are highly undesirable in machine learning. First, model parameters can occupy hundreds of megabytes and are updated many times per second; allocating fresh memory for every update creates needless allocation overhead and memory pressure, so these updates should be performed in place. Second, multiple variables may reference the same parameters. If an update is not performed in place, every one of those references must be carefully repointed, or some code will inadvertently read the stale parameters.
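The difference can be sketched with NumPy (the same idea carries over to deep-learning frameworks such as PyTorch or MXNet, which D2L uses); the array names below are illustrative:

```python
import numpy as np

x = np.ones(3)
y = np.ones(3)
z = y                      # a second reference to the same buffer as y

# Out-of-place: `x + y` allocates a new array and `y` is rebound to it,
# while `z` is left pointing at the stale values.
y = x + y
print(z)                   # prints [1. 1. 1.] -- z never saw the update

# In-place: write the result directly into the existing buffer with `out=`
# (`y[:] = x + y` or `y += x` also update in place); every reference to
# that buffer, including z, now sees the new values.
y = np.ones(3)
z = y
np.add(x, y, out=y)
print(z)                   # prints [2. 2. 2.] -- same buffer, updated
```

Checking `id(y)` before and after each update confirms the behavior: the out-of-place form changes the id (a new allocation), while the in-place form keeps it.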


Updated 2026-05-01


Tags

D2L

Dive into Deep Learning @ D2L