Learn Before
  • Positional-based Sparse Attention

Concept

Compound Sparse Attention

A compound sparse attention pattern is composed of more than one atomic sparse attention pattern, e.g., a local window pattern combined with a global pattern.
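Below is a minimal sketch of how such a pattern can be built, assuming each atomic pattern is expressed as a boolean attention mask over token positions and the compound pattern is their union. The band-plus-global combination mirrors designs such as Longformer (listed under Learn After); the function names and sizes here are illustrative, not from any particular library.

```python
# Minimal sketch: compound sparse attention as the union (logical OR) of
# atomic boolean masks. All names and sizes are illustrative assumptions.
import numpy as np

def band_mask(n: int, window: int) -> np.ndarray:
    # Atomic positional pattern: token i attends to positions i-window..i+window.
    idx = np.arange(n)
    return np.abs(idx[:, None] - idx[None, :]) <= window

def global_mask(n: int, global_positions) -> np.ndarray:
    # Atomic global pattern: designated positions attend to, and are
    # attended by, every token in the sequence.
    mask = np.zeros((n, n), dtype=bool)
    mask[global_positions, :] = True
    mask[:, global_positions] = True
    return mask

def compound_mask(*atomic_masks: np.ndarray) -> np.ndarray:
    # Compound pattern: a query-key pair is allowed if ANY atomic pattern allows it.
    out = atomic_masks[0].copy()
    for m in atomic_masks[1:]:
        out |= m
    return out

n = 8
mask = compound_mask(band_mask(n, window=1), global_mask(n, global_positions=[0]))
print(mask.astype(int))  # 1 = attention allowed, 0 = pruned pair
```

In practice the mask is applied before the softmax, with disallowed query-key pairs set to a very large negative logit so they receive zero attention weight.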


Updated 2022-05-20

Contributors:

Adam Nik (Carleton College)

References


  • A Survey of Transformers (Lin et al., 2021)

Tags

Data Science

Related
  • Atomic Sparse Attention Example Diagram

  • Compound Sparse Attention

  • Extended Sparse Attention

  • An engineer designs a sparse attention mechanism where, for any given token at position i, the model is only allowed to attend to tokens within a fixed-size window around it (e.g., from position i-k to i+k). This rule is applied uniformly across the entire sequence, irrespective of the specific words involved. Which statement best analyzes the core principle of this design? (A minimal sketch of this fixed-window rule follows this list.)

  • Analysis of a Sparse Attention Strategy

  • In a positional-based sparse attention mechanism, the set of tokens that a given token attends to is fixed by position alone; it is not adjusted during processing based on the semantic similarity of the surrounding tokens.
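As a companion to the question above, here is a minimal sketch of the fixed-window (positional-based) rule it describes, assuming the mask is applied by replacing disallowed logits with negative infinity before the softmax. The window size k, the sequence length, and the function names are illustrative assumptions.

```python
# Minimal sketch: fixed-window positional sparse attention, applied to raw
# attention scores before the softmax. Names and sizes are illustrative.
import numpy as np

def fixed_window_mask(seq_len: int, k: int) -> np.ndarray:
    # Token i may attend only to positions i-k..i+k, regardless of content.
    idx = np.arange(seq_len)
    return np.abs(idx[:, None] - idx[None, :]) <= k

def masked_attention_weights(scores: np.ndarray, mask: np.ndarray) -> np.ndarray:
    # Disallowed pairs get -inf logits, hence exactly zero weight after softmax.
    masked = np.where(mask, scores, -np.inf)
    e = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
scores = rng.standard_normal((6, 6))  # stand-in for q @ k.T / sqrt(d)
weights = masked_attention_weights(scores, fixed_window_mask(6, k=1))
print(np.round(weights, 2))  # entries outside the window are exactly 0
```

Because the mask depends only on position indices, it can be precomputed once for a given sequence length, which is the defining property that separates positional-based patterns from content-based sparsity.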

Learn After
  • Star-Transformer

  • Longformer

  • ETC

  • BigBird Transformer
