James Briggs
Easy(ish) Language Classification With BERT in TensorFlow

Chapters for each section of the video (preprocessing, model build, prediction) are in the video timeline.

Transformers have been described as the fourth pillar of deep learning, alongside the three big neural net architectures of CNNs, RNNs, and MLPs.

However, from the perspective of natural language processing, transformers are much more than that. Since their introduction in 2017, they've come to dominate the majority of NLP benchmarks, and they continue to impress daily.

What I'm saying is, transformers are damn cool. And with libraries like HuggingFace's transformers, it has become almost effortless to build incredible solutions with them.
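
To give a sense of just how little code that takes, here's a minimal sketch (my own illustration, not lifted from the video) that loads a pretrained BERT through the transformers library and runs a sentence through it:

```python
# A minimal sketch, assuming transformers and TensorFlow are installed:
# load a pretrained BERT and push one sentence through it.
from transformers import AutoTokenizer, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
bert = TFAutoModel.from_pretrained("bert-base-cased")

# tokenize a sentence and pull BERT's contextual embeddings
tokens = tokenizer("transformers are damn cool", return_tensors="tf")
outputs = bert(**tokens)
print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)
```

Two lines to download and initialize a state-of-the-art language model; that's the ease-of-use the rest of this post is leaning on.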

So, what's not to love? Incredible performance paired with the ultimate ease of use.

In this video, we'll work through building a multi-class classification model using transformers, from start to finish.
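
As a rough preview of the three stages covered in the video (preprocessing, model build, prediction), here's a sketch of what such a pipeline can look like in TensorFlow. The five-class setup, sequence length, and head layers are assumptions for illustration, not the exact architecture from the video:

```python
# A sketch of the preprocessing -> model build -> prediction pipeline,
# assuming TensorFlow 2.x and a hypothetical five-class problem; the
# actual labels, sequence length, and head layers in the video may differ.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModel

SEQ_LEN = 128      # assumed max sequence length
NUM_CLASSES = 5    # assumed number of classes

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
bert = TFAutoModel.from_pretrained("bert-base-cased")

# --- preprocessing: raw text -> input IDs + attention masks ---
def encode(texts):
    return tokenizer(
        texts,
        max_length=SEQ_LEN,
        truncation=True,
        padding="max_length",
        return_token_type_ids=False,
        return_tensors="tf",
    )

# --- model build: BERT encoder + a small classification head ---
input_ids = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.layers.Input(shape=(SEQ_LEN,), dtype=tf.int32, name="attention_mask")

embeddings = bert(input_ids, attention_mask=attention_mask)[0]  # per-token embeddings
pooled = tf.keras.layers.GlobalMaxPooling1D()(embeddings)       # collapse the token axis
probs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(pooled)

model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=probs)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
# (training with model.fit on your labeled data would go here)

# --- prediction: encode new text, take the argmax over class probabilities ---
batch = encode(["an example sentence to classify"])
preds = model.predict(dict(batch))
print(preds.argmax(axis=-1))
```

Common variations on the same pattern include pooling with the [CLS] token instead of GlobalMaxPooling1D, or freezing BERT's weights and training only the head.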
