Build a Natural Language Classifier With BERT and TensorFlow

James Briggs · Published in Better Programming · 6 min read · Dec 23, 2020

High-performance transformer models like BERT and GPT-3 are transforming a huge array of previously menial, language-based tasks into the work of a few clicks, saving a lot of time.

In most industries, the newest wave of language optimization is just getting started, taking its first baby steps. But these seedlings are widespread, and they are sprouting quickly.

Much of this adoption is thanks to the incredibly low barrier to entry. If you know the basics of TensorFlow or PyTorch and take a little time to get to grips with the Transformers library, you're already halfway there.

With the Transformers library, it takes just three lines of code to initialize a cutting-edge ML model, a model built on the billions of research dollars spent by the likes of Google, Facebook, and OpenAI.
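
As a minimal sketch of what those three lines can look like, assuming the Hugging Face transformers library with TensorFlow weights ('bert-base-cased' is just an example checkpoint, not necessarily the article's choice):

```python
from transformers import BertTokenizer, TFBertModel

# Download a pretrained tokenizer and model from the Hugging Face model hub
tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
model = TFBertModel.from_pretrained('bert-base-cased')
```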

This article will take you through the steps to build a classification model that leverages the power of transformers, using Google’s BERT.

Transformers
- Finding Models
- Initializing
- BERT Inputs and Outputs
Classification
- The Data
- Tokenization
- Data Prep
- Train-Validation Split
- Model Definition
- Train
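
To give a flavor of where those steps lead, here is a minimal end-to-end sketch of such a classifier, assuming the Hugging Face transformers library and the Keras functional API; the checkpoint, sequence length, and three-class output below are illustrative assumptions rather than the article's exact setup:

```python
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel

# Illustrative assumptions: checkpoint, sequence length, and class count
CHECKPOINT = 'bert-base-cased'
SEQ_LEN = 128
NUM_CLASSES = 3

tokenizer = BertTokenizer.from_pretrained(CHECKPOINT)
bert = TFBertModel.from_pretrained(CHECKPOINT)

# Tokenization: map raw text to fixed-length token IDs plus an attention mask
tokens = tokenizer('hello world', max_length=SEQ_LEN, truncation=True,
                   padding='max_length', return_tensors='tf')

# Model definition: two integer inputs feeding the BERT body
input_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype='int32', name='input_ids')
attention_mask = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype='int32', name='attention_mask')

# Index [1] is BERT's pooled [CLS] representation of the whole sequence
embedding = bert(input_ids, attention_mask=attention_mask)[1]

# A small classification head on top of BERT
x = tf.keras.layers.Dense(128, activation='relu')(embedding)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation='softmax', name='outputs')(x)

model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```

Naming the Input layers 'input_ids' and 'attention_mask' to match the tokenizer's output keys keeps the data-prep step short: the tensors in the tokenizer's output slot straight into the corresponding model inputs.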


Written by James Briggs

Freelance ML engineer learning and writing about everything. I post a lot on YT: https://www.youtube.com/c/jamesbriggs
