Bidirectional Encoder Representations from Transformers

What Does Bidirectional Encoder Representations from Transformers Mean?

Bidirectional Encoder Representations from Transformers (BERT) is a deep learning strategy for natural language processing (NLP) that helps artificial intelligence (AI) programs understand the context of ambiguous words in text.


Applications that use BERT are able to predict the correct meaning of an ambiguous word by processing the text around it in both left-to-right and right-to-left directions simultaneously.
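The effect of this bidirectional context is easiest to see in the vectors BERT produces. The sketch below (an illustration only, assuming the Hugging Face transformers library and the public bert-base-uncased checkpoint, neither of which is named in this article) shows that the same word "bank" receives different representations in two different sentences because BERT reads the words on both sides of it.

# A minimal sketch, assuming the Hugging Face `transformers` library and the
# public `bert-base-uncased` checkpoint: the same surface word gets different
# vectors depending on the context around it.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_for(sentence, word):
    """Return the contextual vector BERT assigns to `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # one vector per token
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

river = embedding_for("He sat on the bank of the river.", "bank")
money = embedding_for("She deposited cash at the bank.", "bank")

# The two vectors differ because BERT read the words on both sides of "bank".
similarity = torch.cosine_similarity(river, money, dim=0).item()
print(f"cosine similarity between the two uses of 'bank': {similarity:.2f}")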

Techopedia Explains Bidirectional Encoder Representations from Transformers

Google engineers used tools such as TensorFlow to create the BERT neural network architecture. Before BERT, most NLP language models were unidirectional, which means they could only process text in a single direction, typically from left to right.

BERT's bidirectionality, combined with a masking strategy that teaches the model to predict a hidden word from the words on either side of it, allows deep learning neural networks to use unsupervised learning techniques to create new NLP models.
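The masking idea can be tried directly with a pretrained model. The snippet below is a minimal sketch, assuming the Hugging Face transformers library (not mentioned in this article): a word is replaced with a [MASK] token and BERT ranks its best guesses using the context on both sides of the gap.

# A minimal sketch, assuming the Hugging Face `transformers` library: hide a
# word with [MASK] and let a pretrained BERT predict it from both directions.
from transformers import pipeline

# "fill-mask" runs BERT's masked-language-model head.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT sees the words before AND after the [MASK] token when guessing it.
for guess in fill_mask("The doctor wrote a [MASK] for the patient's infection."):
    print(f"{guess['token_str']:>15}  score={guess['score']:.3f}")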

This approach to natural language understanding (NLU) is so powerful that Google suggests users can apply BERT to train a state-of-the-art question answering system in about 30 minutes, as long as they have enough training data.
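For a sense of what such a system looks like in use, the sketch below runs inference with a BERT model that has already been fine-tuned for question answering. It is an illustration only and assumes the Hugging Face transformers library and the publicly shared checkpoint bert-large-uncased-whole-word-masking-finetuned-squad; it does not reproduce the 30-minute fine-tuning step itself.

# A minimal sketch, assuming the Hugging Face `transformers` library and a
# BERT checkpoint already fine-tuned on the SQuAD question answering dataset.
from transformers import pipeline

qa = pipeline("question-answering",
              model="bert-large-uncased-whole-word-masking-finetuned-squad")

context = ("BERT is a deep learning strategy for natural language processing "
           "that helps AI programs understand the context of ambiguous words "
           "by reading text in both directions at once.")

# The model extracts the span of `context` that best answers the question.
answer = qa(question="What does BERT help AI programs understand?",
            context=context)
print(answer["answer"], f"(score={answer['score']:.2f})")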



Margaret Rouse
