Concept

LLaMA2

LLaMA 2 is a family of large language models released by Meta in July 2023, building on the original LLaMA family introduced earlier that year. The 65-billion-parameter LLaMA-65B, pre-trained on between 1.0 trillion and 1.4 trillion tokens, belongs to that original LLaMA release; LLaMA 2 itself comprises models of 7, 13, and 70 billion parameters trained on roughly 2 trillion tokens. The original LLaMA's training data comes from a diverse mix of public sources, including webpages, software code, Wikipedia, books, academic papers, and question-and-answer content.
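As a quick sanity check on these scales, the figures above can be turned into a tokens-per-parameter ratio, a common back-of-the-envelope measure of how much data a model saw relative to its size. This is a minimal sketch using only the numbers quoted in the text; the function name and constants are illustrative, not from any library.

```python
# Arithmetic on the figures quoted above (65B parameters, 1.0-1.4T tokens).
PARAMS_65B = 65e9                          # parameters in the 65B model
TOKENS_LOW, TOKENS_HIGH = 1.0e12, 1.4e12   # reported pre-training token range

def tokens_per_param(tokens: float, params: float) -> float:
    """Tokens seen per model parameter during pre-training."""
    return tokens / params

low = tokens_per_param(TOKENS_LOW, PARAMS_65B)
high = tokens_per_param(TOKENS_HIGH, PARAMS_65B)
print(f"{low:.1f} to {high:.1f} tokens per parameter")
# → 15.4 to 21.5 tokens per parameter
```

For comparison, the ~2 trillion tokens cited for the 70B LLaMA 2 model works out to roughly 29 tokens per parameter by the same calculation.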


Updated 2026-04-21


Tags

Ch.2 Generative Models - Foundations of Large Language Models

Foundations of Large Language Models

Computing Sciences

Foundations of Large Language Models Course
