Self-Hosting with Ollama
Run popular LLMs like Llama, Gemma and dozens of others locally for free
Install the Ollama application locally on your machine: press the Download button on the Ollama website to download the Ollama app. Make sure the app is running after installation.
To verify that the local Ollama server is running, use the Test Connection button.
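By default, Ollama serves its API on http://localhost:11434, and the server answers a plain GET request on the root path when it is up. If you prefer to check from code rather than the Test Connection button, a small Python helper might look like this (a sketch; the host and port shown are Ollama's defaults):

```python
import urllib.request
import urllib.error


def ollama_is_running(base_url: str = "http://localhost:11434",
                      timeout: float = 2.0) -> bool:
    """Return True if an Ollama server responds at base_url.

    A running Ollama server replies to GET / with HTTP 200 and the
    text "Ollama is running"; any connection error means it is down.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False, confirm the Ollama app is actually running before moving on to the next step.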
Install at least one Ollama model, either manually from the Ollama model library, or using the terminal command ollama run <model_name>. For example, to install the Llama 3.2 model, use the command ollama run llama3.2.
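Once a model is installed, you can also query it programmatically through the local server's /api/generate endpoint. The sketch below builds such a request with only the standard library; it assumes the llama3.2 model from the example above and Ollama's default address:

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str,
                           base_url: str = "http://localhost:11434") -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    Setting "stream" to False asks the server to return a single JSON
    object instead of a stream of partial chunks.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


# Usage (requires the server running and the model pulled):
# req = build_generate_request("llama3.2", "Why is the sky blue?")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```

Separating request construction from sending makes the happy path easy to test without a live server; the commented usage shows the actual call.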