First attempt at learning GPT
Let’s build GPT: from scratch, in code, spelled out. - YouTube by Andrej Karpathy
- Language model = models a sequence of tokens/words: learns how words tend to follow one another, and uses that to complete the sequence.
- Neural network
- Attention Is All You Need: Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. "Attention Is All You Need." Advances in Neural Information Processing Systems 30 (2017).
- GPT = Generatively Pretrained Transformer
- Architecture (The Transformer - Model Architecture)
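A minimal sketch of the "complete the sequence" idea above, using my own toy bigram counter (not code from the video, which trains a neural network instead): count which character follows each character, then greedily extend a prompt with the most common successor.

```python
# Toy illustration (my own, not from the lecture): a bigram "language model"
# that counts character successors and greedily completes a sequence.
from collections import Counter, defaultdict

def train_bigram(text):
    """For each character, count which characters follow it."""
    follows = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        follows[a][b] += 1
    return follows

def complete(follows, start, n=5):
    """Greedily extend `start` with the most common next character."""
    out = start
    for _ in range(n):
        nxt = follows.get(out[-1])
        if not nxt:
            break  # no known successor, stop
        out += nxt.most_common(1)[0][0]
    return out

model = train_bigram("abcabcabc")
print(complete(model, "a", n=4))  # → "abcab"
```

A real language model replaces the count table with a neural network that predicts a probability distribution over the next token, but the generation loop is the same shape: predict, append, repeat.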
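The core operation of that Transformer architecture is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, from the Vaswani et al. paper. A pure-Python sketch for intuition (the video uses PyTorch tensors; lists here are just for readability):

```python
# Sketch of scaled dot-product attention: each query scores every key,
# the scores become softmax weights, and the output is a weighted mix of values.
import math

def softmax(xs):
    m = max(xs)  # subtract max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    d_k = len(K[0])
    out = []
    for q in Q:  # one output row per query
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        w = softmax(scores)  # attention weights over the keys
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

# A query aligned with the first key attends mostly to the first value row.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
print(attention(Q, K, V))
```

Because the weights come from a softmax they sum to 1, so each output row is a convex combination of the value rows, which is what "the query attends to the keys" means concretely.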
Bubble.io - no-code programming platform. It has a Marketplace of "agencies", consultancies you can purchase services from. I wonder what other low-code/no-code/SaaS companies have this model.