TensorFlow

[NLP] Add an interactive example on how to use Optimized BERT Versions with TF 2.0

BERT is one of the most popular language models, but optimized versions of it are available that can and should be used in production.

Some of these are (a loading sketch follows the list):

  • ALBERT (A Lite BERT)
  • RoBERTa (Robustly Optimized BERT Pretraining Approach)
  • DistilBERT
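
Each of these variants has a ready-made TensorFlow 2.0 class in Hugging Face's Transformers library. A minimal loading sketch, assuming the transformers package is installed and the hub checkpoints albert-base-v2, roberta-base, and distilbert-base-uncased are used:

  from transformers import TFAlbertModel, TFRobertaModel, TFDistilBertModel

  # Each call downloads pretrained weights and returns a tf.keras model.
  albert = TFAlbertModel.from_pretrained("albert-base-v2")
  roberta = TFRobertaModel.from_pretrained("roberta-base")
  distilbert = TFDistilBertModel.from_pretrained("distilbert-base-uncased")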

Your task is to make engaging and interactive Google Colab notebooks for these models using TensorFlow 2.0. The participant must make at least one notebook for one of the models mentioned above in order to complete this task.

This guide will help you understand how to use Hugging Face's PyTorch Transformers to interoperate with TensorFlow 2.0 and get it working in fewer than 15 lines of code.
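
As a rough sketch of that interop (assuming a recent Transformers release; class and checkpoint names may differ from the guide), loading DistilBERT and running a forward pass in TensorFlow 2.0 takes only a few lines:

  from transformers import DistilBertTokenizer, TFDistilBertModel

  # Tokenizer and model share the same checkpoint name; the TF* class
  # returns a tf.keras model with pretrained weights.
  tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
  model = TFDistilBertModel.from_pretrained("distilbert-base-uncased")

  # Encode a sentence as a batch of input ids and run a forward pass.
  input_ids = tokenizer.encode("BERT, but smaller and faster.", return_tensors="tf")
  outputs = model(input_ids)
  hidden_states = outputs[0]  # (batch_size, sequence_length, hidden_size)
  print(hidden_states.shape)

If only a PyTorch checkpoint is available, passing from_pt=True to from_pretrained should convert it on the fly, which is the interop the guide describes.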

A demo of these models can be found here.

Task tags

  • python
  • tensorflow 2.0
  • documentation
  • natural language processing
  • bert

Students who completed this task

anigasan, jedlimlx, boron, As1234, Rick Wierenga, Ja-sniff/Javismb, generationxcode, WZHANG, RichieX, Qwerty71, Rachin

Task type

  • Code
  • Documentation / Training
  • Outreach / Research

2019