
BERT (Bidirectional Encoder Representations from Transformers)

Machine learning

A language representation model designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers.
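To illustrate how the bidirectional conditioning is used in practice, the sketch below runs a masked-token prediction with a pre-trained BERT checkpoint, where the hidden token is inferred from both its left and right context. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint, neither of which is specified in this entry.

import torch
from transformers import BertTokenizer, BertForMaskedLM

# Assumed checkpoint: bert-base-uncased (not named in this entry).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

# The [MASK] token is predicted from both its left and right context.
text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the most likely vocabulary token.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))

Because the encoder attends to the entire sequence in every layer, the prediction for the masked position draws on the words both before and after it, which is the bidirectional behavior the definition describes.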