Short Answer

Hyperparameter Configuration for a Standard Language Model

For a standard Transformer-based language model with approximately 110 million total parameters, what are the specific values for the following three architectural hyperparameters: number of Transformer layers, number of attention heads, and hidden size?
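A configuration commonly cited as matching roughly 110 million parameters is the BERT-base setup: 12 Transformer layers, 12 attention heads, and a hidden size of 768. The sketch below is a rough parameter tally under that assumption, together with assumed BERT-base defaults (WordPiece vocabulary of 30,522, maximum position of 512, feed-forward dimension of 4 × hidden); it is only meant to show that these three values land near 110M, not an authoritative derivation.

```python
# Rough parameter count for a BERT-base-style Transformer encoder.
# All specific values below are assumptions, not taken from the question.

num_layers = 12        # Transformer layers
num_heads = 12         # attention heads (splits hidden into 64-dim heads)
hidden = 768           # hidden size (d_model)
ffn = 4 * hidden       # feed-forward inner dimension (3072)
vocab = 30_522         # assumed WordPiece vocabulary size
max_pos = 512          # assumed maximum position embeddings
type_vocab = 2         # assumed segment (token-type) embeddings

# Embedding block: token + position + segment embeddings, plus one LayerNorm.
embeddings = (vocab + max_pos + type_vocab) * hidden + 2 * hidden

# One encoder layer: Q, K, V, and output projections (weights + biases),
# two LayerNorms, and the two feed-forward projections (weights + biases).
attention = 4 * (hidden * hidden + hidden)
layer_norms = 2 * (2 * hidden)
feed_forward = (hidden * ffn + ffn) + (ffn * hidden + hidden)
per_layer = attention + layer_norms + feed_forward

total = embeddings + num_layers * per_layer
print(f"per layer: {per_layer:,}")   # ~7.1M
print(f"total:     {total:,}")       # ~108.9M, i.e. roughly 110M
```

Note that the number of attention heads does not change the count by itself: with 12 heads the per-head dimension is 768 / 12 = 64, but the combined projection matrices have the same total size regardless of how they are split across heads.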

Updated 2025-10-04

Tags
Ch.1 Pre-training - Foundations of Large Language Models; Foundations of Large Language Models; Foundations of Large Language Models Course; Computing Sciences; Recall in Bloom's Taxonomy; Cognitive Psychology; Psychology; Social Science; Empirical Science; Science