Guide: Finetune GPT2-XL (1.5 Billion Parameters) and finetune GPT-NEO (2.7 B) on a single GPU with Huggingface Transformers using DeepSpeed
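The guide pairs Hugging Face Transformers with DeepSpeed's ZeRO optimizer to fit billion-parameter finetuning on one GPU. A minimal sketch of a ZeRO stage-2 config with CPU optimizer offload, the kind of file passed to the Trainer's `deepspeed` argument (the bucket sizes are illustrative values, not taken from the guide; `"auto"` lets the Transformers integration fill in values from the training arguments):

```json
{
  "fp16": { "enabled": true },
  "zero_optimization": {
    "stage": 2,
    "offload_optimizer": { "device": "cpu", "pin_memory": true },
    "allgather_bucket_size": 2e8,
    "reduce_bucket_size": 2e8
  },
  "train_batch_size": "auto",
  "train_micro_batch_size_per_gpu": "auto",
  "gradient_accumulation_steps": "auto"
}
```

Offloading optimizer state to CPU RAM is what makes the single-GPU setup feasible: the optimizer state for a 1.5B-parameter model would otherwise not fit alongside the weights and gradients.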
Updated Jun 14, 2023 - Python
SkyCode is a multilingual open-source programming model based on the GPT-3 architecture. It supports Java, JavaScript, C, C++, Python, Go, shell, and other mainstream programming languages, and can understand Chinese comments. The model can complete code and has strong problem-solving ability, freeing you from routine coding to focus on more important problems.
QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art Machine Learning models.
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Loading the model requires 12 GB of free RAM.
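The VRAM figures quoted above follow from simple parameter arithmetic: the weights alone need roughly parameters × bytes-per-parameter, before activations, KV cache, or optimizer state (so real usage runs higher). A quick sketch, assuming GPT-J's roughly 6.05B parameters:

```python
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the weights, in GiB."""
    return n_params * bytes_per_param / 1024**3

N = 6.05e9  # approximate GPT-J-6B parameter count

fp32 = weight_memory_gb(N, 4)  # full precision
fp16 = weight_memory_gb(N, 2)  # half precision
int8 = weight_memory_gb(N, 1)  # 8-bit quantized

print(f"fp32: {fp32:.1f} GiB, fp16: {fp16:.1f} GiB, int8: {int8:.1f} GiB")
# → fp32: 22.5 GiB, fp16: 11.3 GiB, int8: 5.6 GiB
```

The arithmetic shows why half precision needs on the order of 12 GB of free memory just to hold GPT-J's weights, and why 8-bit compression brings the same model within consumer-GPU VRAM budgets.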
Fine-tuning 6-Billion GPT-J (& other models) with LoRA and 8-bit compression
Colab notebooks to run a basic AI Dungeon clone using gpt-neo-2.7B
A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB VRAM minimum)
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab TPU instance
Hebrew text generation models based on EleutherAI's GPT-Neo. Each was trained on a TPUv3-8 made available via the TPU Research Cloud program.
Auto-generate an entire paper from a prompt or abstract using NLP
Few-shot learning using EleutherAI's GPT-Neo, an open-source version of GPT-3
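Few-shot learning in this setting usually just means packing labeled examples into the prompt ahead of the query. A minimal sketch of that prompt construction (the `Input:`/`Output:` template and the sentiment task are illustrative choices, not taken from the repo):

```python
def build_few_shot_prompt(examples, query):
    """Format (input, output) pairs plus a trailing query as one prompt."""
    blocks = [f"Input: {x}\nOutput: {y}" for x, y in examples]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

prompt = build_few_shot_prompt(
    [("great movie", "positive"), ("boring plot", "negative")],
    "loved every minute",
)
print(prompt)
```

The resulting string is then fed to a causal LM such as GPT-Neo (e.g. via the Transformers `generate` API), and the model's continuation after the final `Output:` is read off as the answer.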
A notebook that runs GPT-Neo with low VRAM (6 GB) and CUDA acceleration by loading the model into GPU memory in smaller parts.
📝 Amazon product description generator using GPT-Neo for Texta.ai
Codebase for arXiv:2405.17767, based on GPT-Neo and TinyStories.
Natural language model AI via HTTP
Automate GPT-3 website login.
GPT-2 is a natural language processing model developed by OpenAI, with several free applications built on it