Short Answer

Architectural Rationale for Activation Function Choice

A key architectural decision in prominent large language models like PaLM and LLaMA was the use of a Swish-based Gated Linear Unit (SwiGLU) in their feed-forward network layers. Analyze one significant advantage this choice offers over using a more traditional, non-gated activation function like ReLU in the context of these large-scale models.
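For reference, here is a minimal PyTorch sketch contrasting the two feed-forward variants the question names. Class names, dimension names, and the `bias=False` choice are illustrative assumptions, not taken from the PaLM or LLaMA codebases; the SwiGLU form follows the standard formulation FFN(x) = (Swish(xW_gate) ⊙ xW_up)W_down.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReLUFFN(nn.Module):
    """Classic transformer feed-forward block: two projections with ReLU."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.w1 = nn.Linear(d_model, d_ff, bias=False)
        self.w2 = nn.Linear(d_ff, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w2(F.relu(self.w1(x)))

class SwiGLUFFN(nn.Module):
    """Gated feed-forward block: Swish(x W_gate) elementwise-scales x W_up."""
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.w_gate = nn.Linear(d_model, d_ff, bias=False)  # gate path
        self.w_up = nn.Linear(d_model, d_ff, bias=False)    # value path
        self.w_down = nn.Linear(d_ff, d_model, bias=False)  # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # F.silu is Swish with beta=1, i.e. x * sigmoid(x). The smooth gate
        # modulates each hidden unit continuously instead of hard-clipping
        # negatives to zero as ReLU does.
        return self.w_down(F.silu(self.w_gate(x)) * self.w_up(x))
```

Note that the gated variant has three weight matrices where the ReLU block has two; to keep parameter counts comparable, LLaMA shrinks the SwiGLU hidden dimension to roughly 2/3 of the usual 4·d_model.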

Updated 2025-10-09

Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Foundations of Large Language Models Course

Computing Sciences

Analysis in Bloom's Taxonomy

Cognitive Psychology

Psychology

Social Science

Empirical Science

Science