The following repositories are listed under the gpt-neo topic.
Guide: Fine-tune GPT-2 XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
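The core of that recipe is passing a DeepSpeed ZeRO configuration to the Hugging Face Trainer so optimizer state and gradients are offloaded off the GPU. A minimal sketch, assuming a placeholder ds_config.json and a plain-text train.txt rather than the guide's exact script:

```python
# Sketch: fine-tune GPT-Neo 2.7B on one GPU with Transformers + DeepSpeed.
# "ds_config.json" and "train.txt" are placeholder file names.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, TextDataset,
                          Trainer, TrainingArguments)

model_name = "EleutherAI/gpt-neo-2.7B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

train_dataset = TextDataset(tokenizer=tokenizer, file_path="train.txt", block_size=512)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt-neo-finetuned",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    fp16=True,
    num_train_epochs=1,
    deepspeed="ds_config.json",  # ZeRO config with CPU offload
)

Trainer(model=model, args=args, train_dataset=train_dataset,
        data_collator=collator).train()
```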
SkyCode is a multilingual open-source code generation model built on the GPT-3 architecture. It supports Java, JavaScript, C, C++, Python, Go, shell, and other mainstream programming languages, and can understand Chinese comments. The model completes code and has strong problem-solving ability, freeing you to focus on more important problems.
Salesforce CodeGen with a web server
QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art Machine Learning models.
A repository for running GPT-J-6B on low-VRAM machines (4.2 GB VRAM minimum for a 2000-token context, 3.5 GB for a 1000-token context). Loading the model requires 12 GB of free RAM.
Fine-tuning the 6-billion-parameter GPT-J (and other models) with LoRA and 8-bit compression
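The general LoRA + 8-bit recipe loads the frozen base model in 8-bit and trains only small adapter matrices on the attention projections. A minimal sketch using peft and bitsandbytes (the repository's own training loop may differ):

```python
# Sketch: GPT-J-6B in 8-bit with LoRA adapters; only the adapters are trained.
# Target module names assume GPT-J's attention projection layers.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    load_in_8bit=True,   # 8-bit weights via bitsandbytes
    device_map="auto",
)

lora_config = LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```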
Colab notebooks to run a basic AI Dungeon clone using gpt-neo-2.7B
A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB minimum)
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab TPU instance
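Soft prompts are short sequences of learned "virtual token" embeddings prepended to the input; only those embeddings are trained while the base model stays frozen. A minimal sketch using peft's prompt tuning on a GPU (the repository itself runs on a Colab TPU, so this is illustrative only):

```python
# Sketch: prompt tuning for GPT-Neo 2.7B; only the virtual-token embeddings train.
from transformers import AutoModelForCausalLM
from peft import PromptTuningConfig, PromptTuningInit, get_peft_model

base = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-2.7B")

config = PromptTuningConfig(
    task_type="CAUSAL_LM",
    num_virtual_tokens=20,                     # length of the soft prompt
    prompt_tuning_init=PromptTuningInit.TEXT,  # initialize from real tokens
    prompt_tuning_init_text="Write in the style of:",
    tokenizer_name_or_path="EleutherAI/gpt-neo-2.7B",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # base weights stay frozen
```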
Hebrew text generation models based on EleutherAI's GPT-Neo. Each was trained on a TPUv3-8 made available via the TPU Research Cloud Program.
Auto-generate an entire paper from a prompt or abstract using NLP
Few-shot learning using EleutherAI's GPT-Neo, an open-source version of GPT-3
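Few-shot learning here means placing a handful of worked examples directly in the prompt so the model continues the pattern, with no gradient updates. A minimal sketch (the reviews and labels are made up for illustration):

```python
# Sketch: few-shot sentiment classification by prompting GPT-Neo.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

prompt = (
    "Review: The battery died after a week.\nSentiment: negative\n\n"
    "Review: Fantastic screen and very fast.\nSentiment: positive\n\n"
    "Review: The keyboard feels cheap.\nSentiment:"
)
out = generator(prompt, max_new_tokens=3, do_sample=False)
print(out[0]["generated_text"])
```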
Codebase for Linguistic Collapse: Neural Collapse in (Large) Language Models [NeurIPS 2024] [arXiv:2405.17767]
:pencil: Amazon product description generator using GPT-Neo for Texta.ai
A notebook that runs GPT-Neo on low VRAM (6 GB) with CUDA acceleration by loading the model into GPU memory in smaller parts.
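The underlying idea is to keep only part of the model on the GPU at a time. A minimal sketch of the same effect using accelerate's automatic device mapping rather than the notebook's custom loader (the offload folder name is a placeholder):

```python
# Sketch: load GPT-Neo 2.7B in fp16, splitting layers across GPU, CPU, and disk.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-neo-2.7B",
    torch_dtype=torch.float16,
    device_map="auto",         # place as many layers on the GPU as will fit
    offload_folder="offload",  # spill the remainder to disk if needed
)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-2.7B")

inputs = tokenizer("The meaning of life is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=30)[0]))
```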
I am using this to load GPT-J-6B while preventing excessive RAM usage
A natural-language model AI served via HTTP
AI Text Generator : Friedrich Nietzsche
GPT-2 is a natural language processing model developed by OpenAI, with several free applications
This repository contains various experiments and prototypes for getting used to working with GPT-like models and being creative with them.
Query multiple prompts with ease: a command-line tool for batch querying large language models.
Tool to generate lorem ipsum-style Insights for Insights Explorer
GPT-Neo fine-tuning on Twitch chat conversations
Text generator for Amazon ads. Uses Natural Language Generation (NLG) technology to auto-generate text, fine-tuning pre-trained GPT-Neo models to improve on the RNN/LSTM baseline.
Ingest television series transcripts and output a totally new script.
Evaluating the quality of Automatic Code Generation and Recommendation tools, e.g. GitHub Copilot
A student-built Flask app that uses GPT-Neo models to tell jokes based on user input
An application for creating blog posts using a custom-trained LSTM, a Transformer, and a fine-tuned GPT-Neo text generation model. The UI is available at: https://github.com/Transformers-G5/gen-front
Leverage GPT-Neo, a GPT-3 architecture clone with 2.7 billion parameters, to generate text and code.
This is an academic project completed during my Master's degree at UP8 in France, in the Generative AI course taught by Dr. Youness EL MARHAOUI