<aside> 📢
Ollama can currently be served on WSL, Linux, and macOS.
</aside>
Run the install script (on Linux or WSL; macOS users can download Ollama from https://ollama.com/download):

```bash
curl -fsSL https://ollama.com/install.sh | sh
```
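To confirm the installation succeeded, check that the CLI is on your PATH (a quick sanity check; your version string will differ):

```bash
# Print the installed Ollama version
ollama --version
```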
First, pull a model; the catalog of available models is at https://ollama.com/library:
```bash
ollama pull llama3
```
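Once the download finishes, you can verify which models are stored locally:

```bash
# List all models pulled to the local model store
ollama list
```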
Serve the model:

```bash
ollama serve
```
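The server listens on port 11434 by default (on Linux, the install script may already have started Ollama as a background service, in which case this step is unnecessary). Once it is up, here is a minimal smoke test against the REST API, assuming the llama3 model pulled above:

```bash
# Ask the generate endpoint for a single non-streamed JSON response
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```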