Jupyter Notebook Transformers

Open-source Jupyter Notebook projects categorized as Transformers

Top 23 Jupyter Notebook Transformer Projects

  • nn

    ๐Ÿง‘โ€๐Ÿซ 60 Implementations/tutorials of deep learning papers with side-by-side notes ๐Ÿ“; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), ๐ŸŽฎ reinforcement learning (ppo, dqn), capsnet, distillation, ... ๐Ÿง 

  • Transformers-Tutorials

    This repository contains demos I made with the Transformers library by HuggingFace.

  • Project mention: AI enthusiasm #6 - Finetune any LLM you want 💡 | dev.to | 2024-04-16

    Most of this tutorial is based on the Hugging Face course about Transformers and on Niels Rogge's Transformers tutorials: make sure to check out their work and give them a star on GitHub ❤️

  • pytorch-sentiment-analysis

    Tutorials on getting started with PyTorch and TorchText for sentiment analysis.

  • Promptify

    Prompt Engineering | Prompt Versioning | Use GPT or other prompt-based models to get structured output. Join our Discord for prompt engineering, LLMs, and other recent research.

  • Project mention: Promptify 2.0: More Structured, More Powerful LLMs with Prompt-Optimization, Prompt-Engineering, and Structured Json Parsing with GPT-n Models! 🚀 | /r/ArtificialInteligence | 2023-07-31

    First up, a huge thank you for making Promptify a hit with over 2.3k stars on GitHub! 🌟

  • adapters

    A Unified Library for Parameter-Efficient and Modular Transfer Learning

  • hands-on-llms

    ๐Ÿฆ– ๐—Ÿ๐—ฒ๐—ฎ๐—ฟ๐—ป about ๐—Ÿ๐—Ÿ๐— ๐˜€, ๐—Ÿ๐—Ÿ๐— ๐—ข๐—ฝ๐˜€, and ๐˜ƒ๐—ฒ๐—ฐ๐˜๐—ผ๐—ฟ ๐——๐—•๐˜€ for free by designing, training, and deploying a real-time financial advisor LLM system ~ ๐˜ด๐˜ฐ๐˜ถ๐˜ณ๐˜ค๐˜ฆ ๐˜ค๐˜ฐ๐˜ฅ๐˜ฆ + ๐˜ท๐˜ช๐˜ฅ๐˜ฆ๐˜ฐ & ๐˜ณ๐˜ฆ๐˜ข๐˜ฅ๐˜ช๐˜ฏ๐˜จ ๐˜ฎ๐˜ข๐˜ต๐˜ฆ๐˜ณ๐˜ช๐˜ข๐˜ญ๐˜ด

  • Project mention: Where to start | /r/mlops | 2023-09-13

    There are 3 courses that I usually recommend to folks looking to get into MLE/MLOps who already have a technical background. The first is a higher-level look at the MLOps processes, common challenges and solutions, and other important project considerations. It's one of Andrew Ng's courses from Deep Learning AI, and you can audit it for free if you don't need the certificate:

      - Machine Learning in Production

    For a more hands-on, in-depth tutorial, I'd recommend this course from NYU (free on GitHub), including slides, scripts, and full-code homework:

      - Machine Learning Systems

    And the title basically says it all, but this is also a really good one:

      - Hands-on Train and Deploy ML

    Pau Labarta, who made that last course, has a series of good (free) hands-on courses on GitHub. If you're interested in getting started with LLMs (since every company in the world seems to be clamoring for them right now), this course just came out from Pau and Paul Iusztin:

      - Hands-on LLMs

    For LLMs I also like this DLAI course (which includes prompt engineering too):

      - Generative AI with LLMs

    It can also be helpful to start learning how to use MLOps tools and platforms. I'll suggest Comet because I work there and am most familiar with it (and also because it's a great tool). Cloud and DevOps skills are also helpful. Make sure you're comfortable with git, and make sure you're learning how to actually deploy your projects. Good luck! :)

  • ZoeDepth

    Metric depth estimation from a single image

  • Project mention: Software 3D scanner. Free on Prusa Printables | /r/prusa3d | 2023-04-27
  • transformers-interpret

    Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.

  • mup

    Maximal update parametrization (µP)

  • Project mention: Announcing xAI July 12th 2023 | /r/xdotai | 2023-07-13

    Our team is led by Elon Musk, CEO of Tesla and SpaceX. We have previously worked at DeepMind, OpenAI, Google Research, Microsoft Research, Tesla, and the University of Toronto. Collectively we contributed some of the most widely used methods in the field, in particular the Adam optimizer, Batch Normalization, Layer Normalization, and the discovery of adversarial examples. We further introduced innovative techniques and analyses such as Transformer-XL, Autoformalization, the Memorizing Transformer, Batch Size Scaling, and µTransfer. We have worked on and led the development of some of the largest breakthroughs in the field including AlphaStar, AlphaCode, Inception, Minerva, GPT-3.5, and GPT-4.
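    The core idea behind µP/µTransfer is that hyperparameters tuned on a small proxy model transfer to a much wider target model once width-dependent quantities are rescaled. As a rough sketch of one such rule (illustrative only; the function name and API here are hypothetical, not the mup library's):

    ```python
    def mup_hidden_lr(base_lr, base_width, width):
        """Sketch of a muP-style scaling rule.

        Under muP, the per-layer learning rate for hidden (matrix-like)
        weights is scaled down as the model gets wider, so a learning
        rate tuned at base_width remains near-optimal at a larger width.
        This helper is illustrative, not part of the mup package API.
        """
        return base_lr * (base_width / width)
    ```

    For example, a learning rate tuned at width 128 would be divided by 4 when the model is widened to 512. The mup library applies rules like this automatically per parameter group rather than asking users to compute them by hand.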

  • course-content-dl

    NMA (Neuromatch Academy) deep learning course

  • Transformer-MM-Explainability

    [ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network, including examples for DETR and VQA.

  • gpt2bot

    Your new Telegram buddy powered by transformers

  • adaptnlp

    An easy to use Natural Language Processing library and framework for predicting, training, fine-tuning, and serving up state-of-the-art NLP models.

  • uni2ts

    Unified Training of Universal Time Series Forecasting Transformers

  • Project mention: Moirai: A Time Series Foundation Model for Universal Forecasting | news.ycombinator.com | 2024-03-25

    Code is available! https://github.com/SalesforceAIResearch/uni2ts

  • optimum-intel

    🤗 Optimum Intel: Accelerate inference with Intel optimization tools

  • browser-ml-inference

    Edge Inference in Browser with Transformer NLP model

  • diffusers-interpret

    Diffusers-Interpret 🤗🧨🕵️‍♀️: Model explainability for 🤗 Diffusers. Get explanations for your generated images.

  • ocrpy

    OCR, Archive, Index and Search: Implementation agnostic OCR framework.

  • language-planner

    Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"

  • mgpt

    Multilingual Generative Pretrained Model

  • HugsVision

    HugsVision is an easy-to-use HuggingFace wrapper for state-of-the-art computer vision

  • REaLTabFormer

    A suite of auto-regressive and Seq2Seq (sequence-to-sequence) transformer models for tabular and relational synthetic data generation.

  • MachineLearning-QandAI-book

    Machine Learning Q and AI book

  • Project mention: Machine Learning and AI Beyond the Basics Book | news.ycombinator.com | 2024-04-16
NOTE: The open-source projects on this list are ordered by number of GitHub stars. The number of mentions indicates repo mentions in the last 12 months or since we started tracking (Dec 2020).

Index

What are some of the best open-source Transformer projects in Jupyter Notebook? This list will help you:

Project Stars
1 nn 48,004
2 Transformers-Tutorials 7,510
3 pytorch-sentiment-analysis 4,225
4 Promptify 3,020
5 adapters 2,390
6 hands-on-llms 2,232
7 ZoeDepth 1,939
8 transformers-interpret 1,212
9 mup 1,169
10 course-content-dl 708
11 Transformer-MM-Explainability 704
12 gpt2bot 424
13 adaptnlp 414
14 uni2ts 390
15 optimum-intel 321
16 browser-ml-inference 294
17 diffusers-interpret 259
18 ocrpy 218
19 language-planner 213
20 mgpt 194
21 HugsVision 188
22 REaLTabFormer 183
23 MachineLearning-QandAI-book 182
