
TensorFlow Simple Hugging Faces NLP Guided Project in Python

Updated: Nov 9, 2023

A free, independent Python tutorial: how to use DistilBERT for NLP classification with Hugging Face's Transformers library and TensorFlow.

Hugging Face's Transformers is an open-source library that provides pre-trained models for natural language processing (NLP) tasks. In this project, we will use the TFDistilBertForSequenceClassification model for sequence classification, along with the DistilBertConfig and DistilBertTokenizer utilities from Hugging Face. The goal of this project is to demonstrate how these tools can be used together to perform sequence classification on text data. We will start by loading the necessary libraries, then move on to exploring the data and implementing the sequence classification task.

In this project, we'll see that traditional NLP preprocessing, such as removing punctuation and stopwords or stemming words to their root form, tends not to add value to our predictions. Although this classical preprocessing isn't as valuable with transformers, there is one special kind of preprocessing the BERT family of models does need: a [CLS] token at the beginning and a [SEP] token at the end of each sequence. The exact special tokens differ between transformers, but BERT was trained with these tags, so we will get the best predictions if we supply them here as well.


