BERT (Bidirectional Encoder Representations from Transformers)
Machine learning
A language representation model designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers.
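The "jointly conditioning on both left and right context" idea comes from BERT's masked-language-model pre-training objective: tokens are hidden and the model must predict them from the surrounding words on both sides. A minimal toy sketch of that idea (using simple co-occurrence counts on a hypothetical corpus; real BERT uses a deep Transformer, not counts) shows why a masked token is easier to recover when both neighbors are visible:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for unlabeled pre-training text (hypothetical).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat slept on the mat",
]

# Map each (left neighbor, right neighbor) pair to counts of the
# token that appeared between them -- bidirectional context.
context_counts = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i in range(1, len(tokens) - 1):
        context = (tokens[i - 1], tokens[i + 1])
        context_counts[context][tokens[i]] += 1

def predict_masked(left, right):
    """Guess a [MASK] token from its left AND right neighbors."""
    counts = context_counts.get((left, right))
    return counts.most_common(1)[0][0] if counts else None

# "sat [MASK] the" -> the word to the right of the mask narrows
# the prediction in a way a left-to-right model could not use.
print(predict_masked("sat", "the"))  # "on"
```

A unidirectional model seeing only "sat" would have to guess among everything that ever follows "sat"; conditioning on the right-hand word "the" as well resolves the mask immediately, which is the intuition behind pre-training with bidirectional context.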