This repository has been archived by the owner on Feb 4, 2022. It is now read-only.


GPT-J-6B is a 6-billion-parameter GPT-like language model from EleutherAI for text generation.

This is a version of their inference Colab notebook with an additional `repetition_penalty` setting, which makes tokens that have already appeared less likely to reappear, in the hope of improving the quality of the generated text. The technique was originally described in section 4.1 of this paper.
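A minimal sketch of how a CTRL-style repetition penalty can be applied to a model's next-token logits. This is an illustration, not the notebook's actual code: the function name, the toy logits, and the penalty value of 1.2 are all assumptions made here for the example.

```python
import numpy as np

def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
    """Discount the logits of tokens that have already been generated.

    Follows the CTRL-style rule: a positive logit is divided by the
    penalty and a negative logit is multiplied by it, so a repeated
    token always becomes less likely, whatever the logit's sign.
    """
    logits = logits.copy()
    for tok in set(generated_ids):
        if logits[tok] > 0:
            logits[tok] /= penalty
        else:
            logits[tok] *= penalty
    return logits

# Hypothetical 5-token vocabulary; tokens 1 and 2 were already generated.
logits = np.array([1.0, -0.5, 2.0, 0.0, 0.3])
penalized = apply_repetition_penalty(logits, generated_ids=[1, 2], penalty=1.2)
```

With `penalty=1.0` the function is a no-op; values above 1.0 push the sampler away from tokens it has already produced, which is why the setting helps reduce degenerate repeated text.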

You can try out this new Colab notebook here.