How to pre-train BERT on different objective tasks using HuggingFace

This page summarizes the projects mentioned and recommended in the original post on /r/deeplearning

  • notebooks

    Jupyter notebooks for the Natural Language Processing with Transformers book (by nlp-with-transformers)

  • HuggingFace has an excellent book called "Natural Language Processing with Transformers". It explains the HF ecosystem nicely, and the accompanying GitHub repo has notebooks for every chapter.

  • d2l-en

    Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.

  • There is library support in HuggingFace for pre-training a BERT model, but I suggest you train BERT in native PyTorch to understand the details. Li Mu's course is recommended.
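
    The "understand the details" advice above mostly comes down to BERT's masked-language-model objective. As a minimal, dependency-free sketch (the token list and vocabulary are made up for illustration; a real pipeline would operate on tokenizer IDs, e.g. via HuggingFace's `DataCollatorForLanguageModeling`), here is the standard BERT corruption rule: select roughly 15% of tokens as prediction targets, and of those replace 80% with `[MASK]`, 10% with a random token, and leave 10% unchanged:

    ```python
    import random

    MASK = "[MASK]"
    # Toy vocabulary used only for the "replace with a random token" branch.
    VOCAB = ["cat", "dog", "runs", "the", "fast"]

    def mlm_mask(tokens, mask_prob=0.15, rng=None):
        """Apply BERT-style MLM corruption to a token sequence.

        Returns (inputs, labels): labels[i] holds the original token at
        every selected position (what the model must predict) and None
        elsewhere, so the loss is computed only on corrupted positions.
        """
        rng = rng or random.Random(0)
        inputs, labels = list(tokens), [None] * len(tokens)
        for i, tok in enumerate(tokens):
            if rng.random() < mask_prob:      # ~15% of positions are targets
                labels[i] = tok
                r = rng.random()
                if r < 0.8:                   # 80%: replace with [MASK]
                    inputs[i] = MASK
                elif r < 0.9:                 # 10%: replace with a random token
                    inputs[i] = rng.choice(VOCAB)
                # remaining 10%: keep the original token unchanged
        return inputs, labels
    ```

    The 10% "keep unchanged" branch is not a no-op: the model still has to predict those positions, which discourages it from assuming that every non-`[MASK]` token is correct.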

NOTE: The number of mentions on this list indicates mentions on common posts plus user suggested alternatives. Hence, a higher number means a more popular project.

Suggest a related project

Related posts

  • Which book to choose for deep learning: Ian Goodfellow or Francois Chollet

    1 project | /r/learnmachinelearning | 7 Apr 2023
  • d2l-en: Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 400 universities from 60 countries including Stanford, MIT, Harvard, and Cambridge.

    1 project | /r/u_TsukiZombina | 29 Jan 2023
  • The Transformer in Machine Translation

    1 project | /r/MindSporeOSS | 13 Jan 2022
  • D2l-En

    1 project | news.ycombinator.com | 4 Jan 2022
  • I created a way to learn machine learning through Jupyter

    2 projects | /r/learnmachinelearning | 30 Apr 2021