Artificial Intelligence › Models & Architecture

Sparse Attention

Overview

An attention mechanism that computes relationships between only a selected subset of token pairs rather than all pairs, reducing the quadratic O(n²) cost of standard self-attention in transformer models. Common sparsity patterns include local (sliding-window), strided, and global-token attention.
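To make the idea concrete, here is a minimal NumPy sketch of one common sparsity pattern, sliding-window (local) attention, where each token attends only to neighbors within a fixed window. The function name and window size are illustrative, not from any specific library.

```python
import numpy as np

def sliding_window_attention(Q, K, V, window=2):
    """Sparse attention sketch: each query token attends only to keys
    within `window` positions, instead of all n keys.
    Q, K, V: (n, d) arrays. Returns an (n, d) output array."""
    n, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                     # (n, n) scaled scores
    # Band mask: position i may attend to j only if |i - j| <= window
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) <= window
    scores = np.where(mask, scores, -np.inf)          # block out-of-window pairs
    # Row-wise softmax over the allowed positions only
    scores -= scores.max(axis=1, keepdims=True)
    weights = np.exp(scores)                          # exp(-inf) -> 0
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 8, 4
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = sliding_window_attention(Q, K, V, window=2)
print(out.shape)  # (8, 4)
```

In this dense-mask form the full n×n score matrix is still materialized, so it only illustrates the pattern; production implementations (e.g. block-sparse kernels) compute only the non-masked entries, bringing cost down to roughly O(n·window).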
