
Simple Introduction to Hugging Face NLP Pipelines in Python: Classification and Text Generation


In this lesson, we will introduce NLP pipelines in Hugging Face.  We will start by showing how to use a transformers pipeline for sentiment classification, and then show how the pipeline can classify texts into any number of categories with incredible ease using zero-shot classification.  The value of pipelines is that they combine tokenization, the model, and decoding into one very easy-to-use function.  This makes them quick and easy to use for many common NLP problems like text classification, and even text summarization or question answering.  If you are a beginner with transformers, Hugging Face pipelines are an excellent place to start because of their simplicity.
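
To give a sense of how little code this takes, here is a minimal sketch of a sentiment pipeline; the task name comes straight from the transformers library, while the example sentence is our own.

# A minimal sentiment-analysis pipeline: tokenization, model, and decoding in one call.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")
print(sentiment("Hugging Face pipelines make NLP surprisingly easy."))
# Expect something like: [{'label': 'POSITIVE', 'score': 0.99}]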


With pipelines, we'll also have the flexibility to use many different models: GPT-2, RoBERTa, BERT, and one of our favorites, ELECTRA.  Pipelines handle the tokenization specific to each model and then return the sentiment, the classification, or even the generated text.


Learn how simple and easy it is to use a Hugging Face pipeline in the free Python machine-learning class.


In the class we introduce common NLP tasks and make a point of comparing many popular models like RoBERTa, GPT-2, XLNet, BART, and ELECTRA.  Some transformers, especially before fine-tuning is complete, will work better in some situations and others on different NLP tasks.  Although transformers are powerful, they tend to be very good at certain tasks and just okay at others.
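
As a rough sketch of how to compare models, the same task can be run with different checkpoints simply by passing a model argument; the pipeline picks up each model's own tokenizer.  The checkpoint names below are examples from the Hugging Face Hub, not the only options.

# Compare two checkpoints on the same sentence; the pipeline handles each tokenizer.
from transformers import pipeline

text = "The plot was thin, but the acting saved the film."

distilbert_clf = pipeline("sentiment-analysis",
                          model="distilbert-base-uncased-finetuned-sst-2-english")
roberta_clf = pipeline("sentiment-analysis",
                       model="siebert/sentiment-roberta-large-english")  # example RoBERTa checkpoint

print("DistilBERT:", distilbert_clf(text))
print("RoBERTa:   ", roberta_clf(text))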


The complete Python Notebook with Code is included for free today below.  

Summary

In this free beginner's Python NLP lesson, we discussed how to use Hugging Face's transformers library to get a sentiment rating for a text.  We also discussed how to use the pipeline to classify text.  The interesting thing about the transformers pipeline is that we can classify texts into whichever categories we choose using the zero-shot classification model.  They don't have to be categories or classes we already have in a target variable.  We will show you how to use the pipeline to categorize a given text into categories of our choosing using candidate labels.
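
Here is a small sketch of zero-shot classification with candidate labels; the BART MNLI checkpoint is the one commonly used for this task, and the text and labels are made up for illustration.

# Zero-shot classification: the categories come from candidate_labels, not from training data.
from transformers import pipeline

zero_shot = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = zero_shot(
    "The new graphics card renders 4K games at a smooth frame rate.",
    candidate_labels=["technology", "sports", "politics"],
)
print(result["labels"][0], round(result["scores"][0], 3))  # highest-scoring label first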


We also showed how to use a Hugging Face pipeline to generate text in one line of code.  Without fine-tuning, some models like BERT are not great at generating text and often end up repeating the same word over and over.  However, out of the box with no fine-tuning, GPT-2 does a realistic job of finishing off the example sentences.  Although not perfect, it can amazingly be done in one line of code.
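
A minimal sketch of that one line (plus the import) is below; the prompt is our own, and the generated continuation will vary from run to run.

# Text generation with GPT-2 straight out of the box, no fine-tuning.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
print(generator("The best part of learning NLP with pipelines is", max_length=30))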


Recapping our experiment, we saw that DistilBERT and RoBERTa performed the best in the sentiment pipeline.  Facebook's BART performed significantly better than any other model on zero-shot classification, and GPT-2 performed the best on the text generation task.


Thank you for joining us in this free Hugging Face tutorial introducing the transformers pipeline.

