Does anyone know where the original GPT-2 (transformer) model ended up?
Remember 2–3 years ago when there was a website (I think it was called "Talk to Transformer") built on OpenAI's GPT-2 that would complete a sentence into a bunch of text? Most of the output was incoherent, but I think it's worth preserving for historical and humor purposes.
"This organization is maintained by the transformers team at Hugging Face and contains the historical (pre-"Hub") checkpoints like openai-gpt ("GPT-1"), gpt2, gpt2-medium, gpt2-large, gpt2-xl."