BERT: Why it’s been revolutionizing NLP

BERT, which stands for Bidirectional Encoder Representations from Transformers, is a language model published in 2018 that achieved state-of-the-art performance on multiple tasks, including question answering and language understanding. It not only beat previous state-of-the-art computational models, but also surpassed human performance on question answering.

**What's BERT?** BERT is a computational model that converts words into numerical vectors (embeddings). This step is crucial because machine learning models take numbers, not words, as inputs, so an algorithm that converts words into vectors lets you train machine learning models on data that was originally text.

[**Website**](https://towardsdatascience.com/bert-why-its-been-revolutionizing-nlp-5d1bcae76a13)
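To make the word-to-vector step concrete, here is a minimal sketch of extracting contextual BERT embeddings with the Hugging Face `transformers` library; the library choice, the `bert-base-uncased` checkpoint, and the example sentence are illustrative assumptions, not part of the original article:

```python
# Minimal sketch, assuming the `transformers` and `torch` packages are
# installed; model name and sentence are illustrative, not from the article.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence into the integer IDs BERT expects as input.
inputs = tokenizer("BERT converts words into numbers.", return_tensors="pt")

# Run the encoder; no gradients are needed for feature extraction.
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional contextual vector per token (including [CLS]/[SEP]).
embeddings = outputs.last_hidden_state
print(embeddings.shape)  # (1, number_of_tokens, 768)
```

These per-token vectors are the "numbers" the article refers to: a downstream classifier can be trained directly on them instead of on raw text.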