
BERT

Overview

Bidirectional Encoder Representations from Transformers: a Transformer-encoder language model pretrained with masked language modeling, so each token's representation draws on both its left and right context rather than only the words that precede it.
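The bidirectional property can be illustrated with a minimal NumPy sketch (not the real BERT implementation, which uses learned multi-head projections): without a causal mask, every token's attention row spans both earlier and later positions, whereas a GPT-style decoder zeroes out attention to the future.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, causal=False):
    # Single-head scaled dot-product attention; for brevity the query, key,
    # and value projections are the identity (a real model learns them).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    if causal:
        # Decoder-style mask: block attention to future positions.
        # BERT omits this, which is what makes it bidirectional.
        scores = np.where(np.tril(np.ones_like(scores)) == 1, scores, -1e9)
    return softmax(scores, axis=-1)

X = np.random.default_rng(0).normal(size=(4, 8))  # 4 tokens, 8-dim embeddings
bidir = self_attention(X)                 # BERT-style: attends left and right
causal = self_attention(X, causal=True)   # GPT-style: past positions only

print(np.allclose(np.triu(causal, k=1), 0))  # True: no weight on the future
print(bool((np.triu(bidir, k=1) > 0).any()))  # True: forward attention exists
```

In the bidirectional case each row of the attention matrix has nonzero weight on positions after the query token, which is why masked-token prediction (rather than next-token prediction) is needed as the pretraining objective.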

Cross-References

Natural Language Processing
