Ollama
Get up and running with large language models. Run Llama 3, Mistral, Gemma, and other models. Customize and create your own.
- Website: https://ollama.com
- GitHub: https://github.com/ollama/ollama
- Docs: https://github.com/ollama/ollama/tree/main/docs
- Linux
- Mac
- Windows
Installing Ollama on Linux
Install Ollama by running this one-liner:
curl -fsSL https://ollama.com/install.sh | sh
AMD Radeon GPU support
While AMD has contributed the amdgpu driver upstream to the official Linux kernel source, that version is older and may not support all ROCm features. We recommend installing the latest driver from https://www.amd.com/en/support/linux-drivers for the best support of your Radeon GPU.
Manual install
Download the ollama binary
Ollama is distributed as a self-contained binary. Download it to a directory in your PATH:
sudo curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
Adding Ollama as a startup service (recommended)
Create a user for Ollama:
sudo useradd -r -s /bin/false -m -d /usr/share/ollama ollama
Create a service file in /etc/systemd/system/ollama.service:
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
[Install]
WantedBy=default.target
Then start the service:
sudo systemctl daemon-reload
sudo systemctl enable ollama
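The service's environment can be customized with a systemd override (`sudo systemctl edit ollama` opens one). As a sketch, the drop-in below sets OLLAMA_HOST, the documented variable for changing the bind address; 0.0.0.0 is just an example value that exposes the server on all interfaces, so only use it on a trusted network:

```ini
[Service]
# Example only: bind on all interfaces instead of the default 127.0.0.1
Environment="OLLAMA_HOST=0.0.0.0"
```

Reload and restart afterwards (sudo systemctl daemon-reload, then sudo systemctl restart ollama).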
Install CUDA drivers (optional – for Nvidia GPUs)
Download and install CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
nvidia-smi
Install ROCm (optional - for Radeon GPUs)
Make sure to install ROCm v6.
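Some Radeon cards are not on ROCm's official support list. Ollama's GPU documentation describes forcing a close-enough LLVM target via the HSA_OVERRIDE_GFX_VERSION environment variable; a sketch as a systemd drop-in, where 10.3.0 is only an example value for RDNA2-class cards and must be chosen for your GPU:

```ini
[Service]
# Example only: treat the GPU as gfx1030 (RDNA2); pick the version
# nearest to your card per the ROCm/Ollama GPU docs
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
```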
Start Ollama
Start Ollama using systemd:
sudo systemctl start ollama
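To confirm the server came up, you can hit the API root, which replies with the banner "Ollama is running". A minimal check in Python (standard library only; the helper name is ours, and it simply returns False while nothing is listening yet):

```python
from urllib import request, error

def ollama_is_up(base_url="http://localhost:11434", timeout=2):
    """Return True if an Ollama server answers at base_url."""
    try:
        with request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (error.URLError, OSError):
        return False

# False until the service is reachable, True afterwards
print(ollama_is_up())
```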
Update
Update ollama by running the install script again:
curl -fsSL https://ollama.com/install.sh | sh
Or by re-downloading the ollama binary:
sudo curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama
sudo chmod +x /usr/bin/ollama
Viewing logs
To view logs of Ollama running as a startup service, run:
journalctl -u ollama
Uninstall
Remove the ollama service:
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
Remove the ollama binary from your bin directory (either /usr/local/bin, /usr/bin, or /bin):
sudo rm $(which ollama)
Remove the downloaded models and Ollama service user and group:
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
Ollama Windows Preview
Welcome to the Ollama Windows preview.
No more WSL required!
Ollama now runs as a native Windows application, including NVIDIA and AMD Radeon GPU support.
After installing Ollama Windows Preview, Ollama will run in the background and the ollama command line is available in cmd, powershell, or your favorite terminal application. As usual, the Ollama API will be served on http://localhost:11434.
As this is a preview release, you should expect a few bugs here and there. If you run into a problem you can reach out on Discord, or file an issue. Logs will often be helpful in diagnosing the problem (see Troubleshooting below).
System Requirements
- Windows 10 or newer, Home or Pro
- NVIDIA driver 452.39 or newer, if you have an NVIDIA card
- AMD Radeon driver from https://www.amd.com/en/support, if you have a Radeon card
API Access
Here's a quick example showing API access from powershell:
(Invoke-WebRequest -method POST -Body '{"model":"llama2", "prompt":"Why is the sky blue?", "stream": false}' -uri http://localhost:11434/api/generate).Content | ConvertFrom-Json
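The same request can be made from any language. As a sketch, here is a Python equivalent using only the standard library; the helper name is ours, and "llama2" is simply the model name from the example above (any pulled model works):

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model, prompt, stream=False):
    """Build a POST request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": stream})
    return request.Request(
        OLLAMA_URL,
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama2", "Why is the sky blue?")
# Sending it requires a running Ollama server:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```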
Troubleshooting
While we're in preview, OLLAMA_DEBUG is always enabled, which adds a "view logs" menu item to the app and increases logging for the GUI app and server.
Ollama on Windows stores files in a few different locations. You can view them in an Explorer window by pressing <Win>+R and typing in:
explorer %LOCALAPPDATA%\Ollama contains logs and downloaded updates
- app.log contains logs from the GUI application
- server.log contains the server logs
- upgrade.log contains log output for upgrades
explorer %LOCALAPPDATA%\Programs\Ollama contains the binaries (the installer adds this to your user PATH)
explorer %HOMEPATH%\.ollama contains models and configuration
explorer %TEMP% contains temporary executable files in one or more ollama* directories