Below you can find code tutorials for working with BERT and the HuggingFace transformers library.
“Measuring Word Similarity with BERT (Spanish Language Sonnets)”
This notebook demonstrates how to use a pre-trained BERT model with the popular HuggingFace transformers Python library. In this example, we look for words that have a similar vector to a query word in a collection of poems. The results are illustrative of what BERT vectors represent, but also of the limitations of the tokenization scheme that BERT uses.
[Full Colab Notebook Coming Soon!] [Demo with Results Only]
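The word-similarity workflow described above can be sketched as follows. This is a minimal illustration, not the notebook's exact code: the multilingual checkpoint name, the use of the final hidden layer, and the helper function names are all assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModel

def load_bert(model_name="bert-base-multilingual-cased"):
    # model_name is an assumption; since the poems are Spanish sonnets,
    # a multilingual checkpoint is a reasonable stand-in.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    model.eval()
    return tokenizer, model

def token_vectors(text, tokenizer, model):
    """Return (subword token, vector) pairs from BERT's final hidden layer."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return list(zip(tokens, out.last_hidden_state[0]))

def most_similar(query_vec, candidates, top_n=5):
    """Rank candidate (token, vector) pairs by cosine similarity to the query."""
    cos = torch.nn.functional.cosine_similarity
    scored = [(tok, cos(query_vec, vec, dim=0).item()) for tok, vec in candidates]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_n]
```

Note that BERT's WordPiece tokenizer splits rarer words into subword fragments (e.g. "##ción"), so a "similar word" returned this way may be only a piece of a word; this is the tokenization limitation mentioned above.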
“Training and Fine-Tuning BERT for Classification: Classifying Goodreads Reviews By Book Genre”
This notebook demonstrates how to train and fine-tune a BERT model for classification with the popular HuggingFace transformers Python library. We fine-tune a BERT model on Goodreads reviews from the UCSD Book Graph with the goal of predicting the genre of the book being reviewed.
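The fine-tuning setup might be sketched as below. This is a hedged outline, not the notebook's code: the genre labels, checkpoint name, sequence length, and hyperparameters are placeholders, and the actual notebook may prepare its datasets differently.

```python
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

# Illustrative genre labels; the UCSD Book Graph genres used in the
# notebook may differ.
GENRES = ["fantasy", "history", "mystery", "poetry", "romance"]

def tokenize_reviews(reviews, tokenizer, max_length=128):
    """Truncate and pad review text to fixed-length input IDs."""
    return tokenizer(reviews, truncation=True, padding="max_length",
                     max_length=max_length)

def build_trainer(train_dataset, eval_dataset, model_name="bert-base-uncased"):
    # Loads a pre-trained BERT with a fresh classification head sized to
    # the number of genres; hyperparameters here are assumptions.
    model = AutoModelForSequenceClassification.from_pretrained(
        model_name, num_labels=len(GENRES))
    args = TrainingArguments(output_dir="bert-goodreads-genre",
                             num_train_epochs=3,
                             per_device_train_batch_size=16)
    return Trainer(model=model, args=args,
                   train_dataset=train_dataset, eval_dataset=eval_dataset)

# Typical usage: trainer = build_trainer(train_ds, eval_ds); trainer.train()
```

`AutoModelForSequenceClassification` attaches a randomly initialized linear layer on top of BERT's pooled output, so only that head plus the (already mostly trained) encoder weights are updated during fine-tuning.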