OpenAccess-AI-Collective/axolotl: Go ahead and axolotl questions – via
If you're training on a non-code dataset and plan to train a model under 34B parameters, you should try Mistral before Llama-2 (IMO) – a config sketch follows below.
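As a rough sketch of what that swap looks like in practice, here is a minimal axolotl QLoRA config pointed at Mistral-7B rather than Llama-2. The dataset path, hyperparameters, and output directory are placeholders for illustration, not values from the original post.

```yaml
# Minimal axolotl config sketch: fine-tune Mistral-7B with QLoRA.
# Swapping base_model/model_type is essentially all it takes to try
# Mistral instead of Llama-2 on the same dataset.
base_model: mistralai/Mistral-7B-v0.1
model_type: MistralForCausalLM
tokenizer_type: LlamaTokenizer

load_in_4bit: true          # QLoRA: load the base model in 4-bit
adapter: qlora
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true    # apply LoRA adapters to all linear layers

datasets:
  - path: ./data/my_dataset.jsonl   # placeholder: your non-code dataset
    type: alpaca                    # placeholder: match your prompt format

sequence_len: 4096
micro_batch_size: 2
gradient_accumulation_steps: 4
num_epochs: 3
learning_rate: 0.0002
output_dir: ./outputs/mistral-7b-qlora
```

With axolotl installed, a config like this is typically launched with `accelerate launch -m axolotl.cli.train config.yml`; to compare against Llama-2 on the same data, the same file would point `base_model` at a Llama-2 checkpoint instead.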