Tag: BERT

The Ultimate Guide to Training BERT from Scratch: Introduction

A few weeks ago, I trained and deployed my very own question-answering system using Retrieval Augmented Generation (RAG). The goal was to introduce such a system over my study notes and create an agent to help me connect the dots. LangChain truly shines in these specific types of applications: ...

Natural Language Processing: Beyond BERT and GPT

The world of technology is ever-evolving, and one area that has seen significant advancements is Natural Language Processing (NLP). A few years back, two groundbreaking models, BERT and GPT, emerged as game-changers. They revolutionized how machines understood and interacted with human language, mak...

Boosting Password Security with Natural Language Understanding: Building a Simple Password Strength Checker with BERT Transformer

In an era where cyber threats are more pervasive than ever, ensuring the security of online accounts is of paramount importance. Passwords are often the first line of defense against unauthorized access, making their strength a critical factor in safeguarding our digital lives. In this article, ...

Large Language Models, Part 1: BERT

2017 was a landmark year in machine learning, when the Transformer model made its first appearance on the scene. It has performed amazingly well on many benchmarks and has become suitable for a wide range of problems in Data Science. Thanks to its efficient architecture, many other Transformer-ba...

Understanding Large Language Models: The Physics of (Chat)GPT and BERT

ChatGPT, or more broadly Large Language Models (LLMs), have become ubiquitous in our lives. Yet, most of the mathematics and internal structures of LLMs are obscure knowledge to the general public. So, how can we move beyond perceiving LLMs like ChatGPT as magical black boxes? Physics may prov...

Transformer Architectures and the Rise of BERT, GPT, and T5: A Beginner’s Guide

In the vast and ever-evolving realm of artificial intelligence (AI), there are innovations that don’t just make a mark; they redefine the trajectory of the entire domain. Among these groundbreaking innovations, the Transformer architecture emerges as a beacon of change. It’s akin to the ...

LawBERT: Towards a Legal Domain-Specific BERT?

Google’s Bidirectional Encoder Representations from Transformers (BERT) is a large-scale pre-trained autoencoding language model developed in 2018. Its development has been described as the NLP community’s “ImageNet moment”, largely because of how adept BERT is at performing ...