gpt-2 vs aitextgen

| | gpt-2 | aitextgen |
|---|---|---|
| Mentions | 64 | 19 |
| Stars | 21,370 | 1,829 |
| Growth | 2.1% | - |
| Activity | 2.2 | 1.8 |
| Latest commit | 6 days ago | 11 months ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gpt-2
- What are LLMs? An intro into AI, models, tokens, parameters, weights, quantization and more
Medium models: roughly between 1B and 10B parameters. This is where Mistral 7B, Phi-3, Gemma from Google DeepMind, and WizardLM 2 sit. Fun fact: GPT-2 was a medium-sized model, much smaller than the latest GPT models.
- Sam Altman is still trying to return as OpenAI CEO
- Build Personal ChatGPT Using Your Data
- Are the recent advancements in AI technology primarily driven by new discoveries, or by progress in hardware capabilities and the abundance of available data?
"Our model, called GPT-2 (a successor to GPT), was trained simply to predict the next word in 40GB of Internet text. Due to our concerns about malicious applications of the technology, we are not releasing the trained model. As an experiment in responsible disclosure, we are instead releasing a much smaller model for researchers to experiment with, as well as a technical paper. "
- BING IS NOW THE DEFAULT SEARCH FOR CHATGPT
They did release GPT-2 under the MIT License.
- Don Knuth Plays with ChatGPT
Did you arrive at this certainty through reading something other than what OpenAI has published? The document [0] that describes the training data for GPT-2 makes this assertion hilarious to me.
[0]: https://github.com/openai/gpt-2/blob/master/model_card.md#da...
- What frustrates you about the use of AI, or about the discussion around it?
- The AI
- Help with pet project to learn - Running ChatGPT-2 at home
I made a clone of https://github.com/openai/gpt-2 on my local laptop
- Regarding the dangers of AI and the proposals to halt development for 6 months.
aitextgen
- Where is the engineering part in "prompt engineer"?
It's literally a wrapper for the ChatGPT API (currently). I have another library for training models from scratch but haven't had time to work on it.
- self-hosted AI?
I'm experimenting with https://github.com/minimaxir/aitextgen for some simple tasks. It is pretty much a wrapper around GPT-2 and GPT-NeoX models.
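For context on what that wrapper looks like in practice, here is a rough sketch of typical aitextgen usage; the prompt and generation parameters are illustrative, not taken from the comment above:

```python
# Rough sketch of typical aitextgen usage (an assumption of common use,
# not code from the comment above).
from aitextgen import aitextgen

# With no arguments, aitextgen downloads the small 124M GPT-2 model.
ai = aitextgen()

# generate() prints the sampled continuations to the console by default.
ai.generate(n=3, prompt="The quick brown fox", max_length=100)

# It can also wrap other causal language models from the Hugging Face hub,
# e.g. a small GPT-Neo checkpoint:
# ai = aitextgen(model="EleutherAI/gpt-neo-125M")
```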
- How would I go about implementing warmup steps from the Transformers library?
I'm sorry if this is the wrong place to ask, but I wasn't sure where else to turn. Several of us have already opened an issue with aitextgen, but it seems that the maintainer isn't particularly active these days. I'm a fairly proficient (self-taught) developer and I know my way around ML, but I was not formally educated in deep learning. A lot of PyTorch Lightning looks like black magic to me. I suspect that I'm missing an important detail that would be fairly simple for many of you to identify.
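For anyone hitting the same question, here is a minimal sketch of wiring warmup steps from the Transformers library into a PyTorch Lightning module. This is not aitextgen's internal code; the module name, model argument, and hyperparameters are assumptions for illustration:

```python
# Minimal sketch (not aitextgen's internals): a warmup schedule from the
# Transformers library inside a PyTorch Lightning module.
import torch
import pytorch_lightning as pl
from transformers import get_linear_schedule_with_warmup

class FinetuneModule(pl.LightningModule):
    # training_step / forward are omitted; this only shows the scheduler wiring.
    def __init__(self, model, lr=1e-4, warmup_steps=500, total_steps=10_000):
        super().__init__()
        self.model = model
        self.lr = lr
        self.warmup_steps = warmup_steps
        self.total_steps = total_steps

    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(self.model.parameters(), lr=self.lr)
        # LR ramps linearly from 0 over warmup_steps, then decays linearly
        # to 0 over the remaining training steps.
        scheduler = get_linear_schedule_with_warmup(
            optimizer,
            num_warmup_steps=self.warmup_steps,
            num_training_steps=self.total_steps,
        )
        # "interval": "step" tells Lightning to step the scheduler every
        # batch instead of once per epoch.
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }
```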
- NanoGPT
To train small GPT-like models, there's also aitextgen: https://github.com/minimaxir/aitextgen
- Neuro-sama sings "Take On Me" with her Angelic Voice
It's actually relatively easy to train your own GPT model, and there are multiple tools out there that make it almost plug-and-play: https://github.com/minimaxir/aitextgen
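As a rough sketch of that plug-and-play flow, this is the kind of train-a-small-GPT-from-scratch script that aitextgen's own demos describe; the file name, block size, and step counts here are illustrative assumptions:

```python
# Sketch of training a small GPT from scratch with aitextgen, following the
# pattern in its demos; file name, block size, and step counts are
# illustrative, not a recommendation.
from aitextgen import aitextgen
from aitextgen.TokenDataset import TokenDataset
from aitextgen.tokenizers import train_tokenizer
from aitextgen.utils import GPT2ConfigCPU

file_name = "input.txt"  # any plain-text corpus

# Train a small custom tokenizer on the corpus; it is saved as
# aitextgen.tokenizer.json in the working directory.
train_tokenizer(file_name)
tokenizer_file = "aitextgen.tokenizer.json"

# GPT2ConfigCPU() is a deliberately tiny GPT-2 config meant for CPU training.
config = GPT2ConfigCPU()
ai = aitextgen(tokenizer_file=tokenizer_file, config=config)

# Tokenize the corpus into fixed-length training blocks.
data = TokenDataset(file_name, tokenizer_file=tokenizer_file, block_size=64)

ai.train(data, batch_size=8, num_steps=5000)
ai.generate(5, prompt="ROMEO:")
```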
- Is there a place with all the models indexed?
I've been learning Python, and for the past few days I've been playing around with the aitextgen library.
- I built an AI model to auto-generate Dominion cards. Here are the hilariously bad results.
Then I ran that through the AI and got it to spit out cards that looked like the training data. I used aitextgen. I let it run for about 4 hours, and it thinks it has made 10,000 rows of cards. But some of these cards are duplicates of each other or of cards that already exist, or use a card name that already exists in the original game, or have something like 20 '|' characters in one row, or have zero '|'. So I run a script to remove all the cards like that, and I end up with roughly 2,000-4,500 cards that are "functional".
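The cleanup script isn't shown in the post; a hypothetical version of that filtering pass might look like this (the file names and expected field count are assumptions):

```python
# Hypothetical version of the cleanup pass described above: drop generated
# card rows that are malformed, duplicated, or reuse an existing card name.
# File names and the expected field count are assumptions.
EXPECTED_FIELDS = 6  # assumed number of '|'-separated fields per card row

def load_lines(path):
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]

# Names of real Dominion cards, so rows that reuse them can be rejected.
existing_names = {
    line.split("|")[0].strip().lower() for line in load_lines("original_cards.txt")
}

seen = set()
kept = []
for row in load_lines("generated_cards.txt"):
    fields = row.split("|")
    if len(fields) != EXPECTED_FIELDS:          # zero '|' or far too many
        continue
    name = fields[0].strip().lower()
    if name in existing_names or row in seen:   # reuses a real name, or duplicate
        continue
    seen.add(row)
    kept.append(row)

with open("functional_cards.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept))
```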
- Thoughts on GPT3?
If you search this subreddit, you should find lots of discussions about it, as well as alternatives like GPT-J (open source). If you'd like to experiment with GPT-2 for text generation, try https://github.com/minimaxir/aitextgen. It's fun to play with.
- Show HN: Tensorpedia – Using GPT-2 to synthesize Wikipedia articles
Hey HN! I've been lurking for a while now and I've finally created something that I feel is worth sharing.
I've called this project "Tensorpedia." At its core, Tensorpedia takes in a title and uses it as a prompt for GPT-2 to synthesize the introductory part of a Wikipedia article. The machine learning stuff is written using a wonderful library called aitextgen [0], with Wikipedia's "Vital Articles" as the data set [1]. The server is written in Node, and it uses Redis as an article cache. If you want to read my article about it (for some reason), you can check it out here [2].
I created this project to get more experience with server technologies. While I wouldn't say it's a complicated application, I learned quite a lot from it.
Additionally, this project is mostly for fun; I was inspired by all of those this-x-doesn't-exist projects from a while back. I don't know how much practical use it has, but I've generated some pretty hilarious articles with it.
[0] https://github.com/minimaxir/aitextgen
[1] https://en.wikipedia.org/wiki/Wikipedia:Vital_articles/Level...
[2] https://jonahsussman.net/posts/2022-01-this-wiki-dne/
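Tensorpedia's actual server is Node with a Redis cache, so the following is only a loose Python sketch of the same title-as-prompt-plus-cache idea described in the post above; the key names, prompt format, and parameters are assumptions:

```python
# Loose Python sketch of Tensorpedia's idea, not the author's implementation
# (the real server is Node + Redis). Key names and max_length are assumptions.
import redis
from aitextgen import aitextgen

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)
ai = aitextgen()  # defaults to the small 124M GPT-2 model

def article_intro(title: str) -> str:
    key = f"article:{title}"
    cached = cache.get(key)
    if cached is not None:
        return cached  # cache hit: skip the model entirely
    # The requested title itself is the prompt; the model continues it
    # as if it were the opening of an article.
    text = ai.generate_one(prompt=title, max_length=200)
    cache.set(key, text)
    return text

print(article_intro("History of the semicolon"))
```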
- Downloaded GPT-2, Encode.py, and Train.py not found.
If by "downloaded" you mean cloned the gpt-2 GitHub repo, it doesn't come with those scripts. I personally played around with https://github.com/minimaxir/aitextgen, which is a simple wrapper around the GPT-2 code and comes with some very clear usage examples. (Shout out to minimaxir and everyone else involved in aitextgen for making GPT-2 easy to use!)
What are some alternatives?
dalle-mini - DALL·E Mini - Generate images from a text prompt
lm-evaluation-harness - A framework for few-shot evaluation of language models.
minGPT - A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
DiscordChatAI-GPT2 - A chat AI discord bot written in python3 using GPT-2, trained on data scraped from every message of my discord server (can be trained on yours too)
Real-Time-Voice-Cloning - Clone a voice in 5 seconds to generate arbitrary speech in real-time
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
sentencepiece - Unsupervised text tokenizer for Neural Network-based text generation.
nanoGPT - The simplest, fastest repository for training/finetuning medium-sized GPTs.
jukebox - Code for the paper "Jukebox: A Generative Model for Music"
trump_gpt2_bot - aitextgen (aka GPT-2) Twitter bot