GPT

Feb 09, 2023 · 1 min read

  • gpt
  • deep-learning
  • llm

GPT = Generative Pre-trained Transformer

  • GPT in 60 Lines of NumPy | Jay Mody
  • How GPT-3 Works - Visualizations and Animations by Jay Alammar
  • Transformers from Scratch
  • The GPT-3 Architecture, on a Napkin; HN
  • GPT-3 - Wikipedia
  • Let’s build GPT: from scratch, in code, spelled out (YouTube) by Andrej Karpathy.
  • mlx-gpt2: GPT-2 from scratch in MLX; code from GPT from scratch with MLX
  • karpathy/minGPT: A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
  • TheR1D/shell_gpt: A command-line interface (CLI) productivity tool powered by OpenAI’s text-davinci-003 model that helps you accomplish your tasks faster and more efficiently; Get API keys here.
  • Simply explained: how does GPT work? | Confused bit
  • GPT4 should be part of your toolkit • Buttondown
  • imartinez/privateGPT: Interact privately with your documents using the power of GPT, 100% privately, no data leaks; a really promising project
  • Understanding GPT tokenizers
  • [2106.06981] Thinking Like Transformers (Weiss et al., 2021) and explanation blog post - Thinking Like Transformers by Alexander Rush
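The "GPT in 60 Lines of NumPy" links above all center on one mechanism: masked (causal) self-attention, where each token attends only to itself and earlier tokens. A minimal single-head sketch of that idea (names, dimensions, and random weights here are illustrative, not taken from any of the linked implementations):

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(x, w_q, w_k, w_v):
    """One masked self-attention head: position i attends only to
    positions <= i -- the causal mask that makes GPT autoregressive."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v        # each [T, d]
    scores = q @ k.T / np.sqrt(k.shape[-1])    # [T, T] similarity
    mask = np.triu(np.ones_like(scores), k=1)  # 1s above the diagonal
    scores = np.where(mask == 1, -1e9, scores) # hide future tokens
    return softmax(scores) @ v                 # weighted sum of values

# toy example: 4 tokens, embedding dimension 8, random weights
rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8): one output vector per input token
```

A full GPT block adds multiple heads, a feed-forward layer, residual connections, and layer norm, but the mask above is what distinguishes a decoder-only model like GPT from a bidirectional encoder.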

See: AutoGPT

Weiss, G., Goldberg, Y., & Yahav, E. (2021). Thinking Like Transformers. CoRR, abs/2106.06981. https://arxiv.org/abs/2106.06981
