Llama homepage. The latest release, as of July 2024, is Llama 3.1, which comes in 405B, 70B, and 8B parameter sizes, targeting "do it all" capability, a cost/performance tradeoff, and lightweight speed, respectively.
You can run Llama locally on your laptop or PC using the following tools:
- ollama + OpenWebUI:
  - ollama is an open-source project that provides an easy way to download and run LLMs locally.
  - OpenWebUI gives you a web-based, ChatGPT-like user interface for interacting with local LLMs.
- LM Studio is a desktop app for downloading and running local LLMs.
- GPT4All is an open-source ecosystem for running LLMs locally on your computer.
- MLX is Apple's machine-learning framework, optimized for running LLMs on Apple Silicon Macs.
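As a concrete example of the first option, ollama exposes a local REST API (by default at `http://localhost:11434`) once `ollama serve` is running. The sketch below, in Python with only the standard library, builds a request to the `/api/generate` endpoint; the model tag `llama3.1:8b` is an assumption and must already have been pulled with `ollama pull llama3.1:8b`.

```python
import json
import urllib.request

# Default endpoint of a locally running `ollama serve`.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # "stream": False asks ollama for a single JSON response
    # instead of a stream of partial chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    # Assumes ollama is running locally and the model has been pulled,
    # e.g. `ollama pull llama3.1:8b`.
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server up, `ask("llama3.1:8b", "Why is the sky blue?")` returns the model's text reply; OpenWebUI talks to the same local server, just through a browser UI instead of code.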
