
CORD-19: Improving Text Generation with Transformers

By Nick Acosta posted Tue April 14, 2020 12:47 PM

  

Review and LearnAI

 
Recently, I have been writing about CORD-19, how to get it into Python, and how to use TensorFlow to generate abstracts from the dataset. The quality of the generated abstracts can be improved significantly with models that leverage transformers. I will show how to do so both below and in a hands-on format this Thursday at LearnAI, an IBM and O'Reilly day of no-cost learning: hands-on labs, interactive leadership talks, and open office hours with experts.

Hugging Face


Hugging Face is an NLP company that built and maintains the transformers Python library, which provides state-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. Retraining a model on customized data is as easy as picking a model and pointing it at the proper data.
 
Hugging Face's run_language_modeling.py script can be called to use or fine-tune popular NLP models with just a few parameters.
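
As a rough illustration, a fine-tuning run might look like the sketch below. The training file cord19_abstracts.txt, the output directory, and the hyperparameters are placeholders to adapt to your own setup; the script itself lives in the examples directory of the transformers repository:

    python run_language_modeling.py \
        --model_type gpt2 \
        --model_name_or_path gpt2 \
        --do_train \
        --train_data_file cord19_abstracts.txt \
        --output_dir ./gpt2-cord19 \
        --per_gpu_train_batch_size 1 \
        --num_train_epochs 1 \
        --overwrite_output_dir

Because GPT-2 is a causal rather than masked language model, the script's --mlm flag is left off here.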


I chose to retrain (fine-tune) OpenAI's GPT-2 model for text generation. This notebook outlines how to do so and how to use the resulting model to generate new abstracts on the coronavirus. It is highly recommended to run the notebook in an environment with access to a GPU (such as Colab); retraining the model was going to take around 30 hours on my laptop, compared to just 15 minutes with Colab. The transformers library creates checkpoints of the model as it retrains and places them in a specified output directory. A notable feature of the transformers library is that models can be retrained in TensorFlow and then loaded from their checkpoints for further training or inference in PyTorch, or vice versa. This functionality is also demonstrated in the notebook mentioned earlier.
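
Once training finishes, a saved checkpoint can be loaded to generate new abstracts. Below is a minimal sketch, assuming the weights were written to the hypothetical ./gpt2-cord19 directory from the command above and that a PyTorch environment is available; the seed prompt is an arbitrary example:

    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    # Placeholder for wherever your fine-tuning run saved its weights.
    checkpoint = "./gpt2-cord19"

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained(checkpoint)
    # If the checkpoint was produced by TensorFlow instead, transformers
    # can convert it at load time:
    # model = GPT2LMHeadModel.from_pretrained(checkpoint, from_tf=True)

    # Encode a seed prompt and sample a continuation as a new abstract.
    input_ids = tokenizer.encode("The novel coronavirus", return_tensors="pt")
    output = model.generate(
        input_ids,
        max_length=200,
        do_sample=True,
        top_k=50,
        top_p=0.95,
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))

Sampling (do_sample with top-k/top-p filtering) tends to produce more natural open-ended text than greedy decoding, which is prone to repetition.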
 
Abstracts generated without (above) and with (below) transformers. Note the difference in the naturalness of the language between the two outputs.


