GPT
Generative Pre-trained Transformer
Created: Feb 09, 2023 by Pradeep Gowda. Updated: Jun 20, 2023 Tagged: gpt, deep-learning, llm
- Transformers from Scratch
- The GPT-3 Architecture, on a Napkin; HN
- GPT-3 - Wikipedia
- A GPT in 60 Lines of NumPy
- Let’s build GPT: from scratch, in code, spelled out. - YouTube video by Andrej Karpathy.
- karpathy/minGPT: A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
- TheR1D/shell_gpt: A command-line interface (CLI) productivity tool powered by OpenAI’s text-davinci-003 model that helps you accomplish tasks faster and more efficiently. Get API keys here.
- Simply explained: how does GPT work? | Confused bit
- GPT4 should be part of your toolkit • Buttondown
- imartinez/privateGPT: Interact privately with your documents using the power of GPT, 100% privately, no data leaks – a really promising project.
- Understanding GPT tokenizers
- [2106.06981] Thinking Like Transformers — Weiss, Gail, Yoav Goldberg, and Eran Yahav. “Thinking Like Transformers,” 2021. https://arxiv.org/abs/2106.06981; and an explanatory blog post, Thinking Like Transformers, by Alexander Rush.
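Several of the links above (the NumPy GPT, minGPT, Karpathy’s video) build the same core component: causal self-attention, where each token attends only to itself and earlier tokens. As a minimal single-head sketch — variable names and shapes are my own, not taken from any of the linked implementations:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head causal self-attention over a [T, d] sequence.

    Each position attends only to itself and earlier positions,
    which is what makes GPT-style models autoregressive.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v           # queries, keys, values: [T, d]
    scores = q @ k.T / np.sqrt(k.shape[-1])       # scaled dot-product scores: [T, T]
    mask = np.triu(np.ones_like(scores), k=1)     # 1s strictly above the diagonal
    scores = np.where(mask == 1, -1e10, scores)   # block attention to future tokens
    return softmax(scores) @ v                    # weighted mix of values: [T, d]

# toy usage with random weights
rng = np.random.default_rng(0)
T, d = 4, 8                                       # sequence length, model width
x = rng.standard_normal((T, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))
out = causal_self_attention(x, w_q, w_k, w_v)     # out.shape == (4, 8)
```

A real GPT block wraps this in multiple heads, adds learned projections, residual connections, layer norm, and an MLP — the “60 Lines of NumPy” post and minGPT show those pieces in full.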