# Linux
## Install
To install Ollama, run the following command:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
## Manual install
> [!NOTE]
> If you are upgrading from a prior version, you should remove the old libraries with `sudo rm -rf /usr/lib/ollama` first.

Download and extract the package:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
Start Ollama:
```shell
ollama serve
```
In another terminal, verify that Ollama is running:
```shell
ollama -v
```
### AMD GPU install
If you have an AMD GPU, also download and extract the additional ROCm package:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
```
### ARM64 install
Download and extract the ARM64-specific package:
```shell
curl -L https://ollama.com/download/ollama-linux-arm64.tgz -o ollama-linux-arm64.tgz
sudo tar -C /usr -xzf ollama-linux-arm64.tgz
```
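When scripting the manual install, the package name can be chosen from the machine architecture reported by `uname -m`. A minimal sketch, covering only the two architectures Ollama publishes tarballs for:
```shell
# Map the kernel's architecture name to the matching Ollama package
case "$(uname -m)" in
  x86_64)  PKG=ollama-linux-amd64.tgz ;;
  aarch64) PKG=ollama-linux-arm64.tgz ;;
  *)       echo "unsupported architecture" >&2; exit 1 ;;
esac
echo "$PKG"
```
The chosen name then plugs into the `curl` and `tar` commands above.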
### Adding Ollama as a startup service (recommended)
Create a user and group for Ollama:
```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
Create a service file in `/etc/systemd/system/ollama.service`:
```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=multi-user.target
```
Then reload systemd and enable the service so it starts at boot:
```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
### Install CUDA drivers (optional)
[Download and install](https://developer.nvidia.com/cuda-downloads) CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
```shell
nvidia-smi
```
### Install AMD ROCm drivers (optional)
[Download and install](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/tutorial/quick-start.html) ROCm v6.

> [!NOTE]
> While AMD has contributed the `amdgpu` driver upstream to the official Linux
> kernel source, the in-tree version is older and may not support all ROCm
> features. We recommend you install the latest driver from
> https://www.amd.com/en/support/linux-drivers for best support of your Radeon
> GPU.

### Start Ollama
Start Ollama and verify it is running:
```shell
sudo systemctl start ollama
sudo systemctl status ollama
```
## Customizing
To customize the installation of Ollama, edit the systemd service configuration or its environment variables by running:
```shell
sudo systemctl edit ollama
```
Alternatively, create an override file manually in `/etc/systemd/system/ollama.service.d/override.conf`:
```ini
[Service]
Environment="OLLAMA_DEBUG=1"
```
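Other environment variables can be set the same way. As a sketch, an override that binds the server to all interfaces and relocates the model directory (the address and path below are examples; adjust them for your system):
```ini
[Service]
# Listen on all interfaces instead of only localhost
Environment="OLLAMA_HOST=0.0.0.0:11434"
# Store downloaded models outside the service user's home directory
Environment="OLLAMA_MODELS=/data/ollama/models"
```
After changing an override, apply it with `sudo systemctl daemon-reload && sudo systemctl restart ollama`.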
## Updating
Update Ollama by running the install script again:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
Or by re-downloading Ollama:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
## Installing specific versions
Use the `OLLAMA_VERSION` environment variable with the install script to install a specific version of Ollama, including pre-releases. You can find the version numbers on the [releases page](https://github.com/ollama/ollama/releases).
For example:
```shell
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
```
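The manual install can be pinned to a version as well: the standalone tarballs are published as release assets on GitHub. A sketch that builds the download URL for a pinned version (the version number here is an example):
```shell
# Build the release-asset URL for a specific version (example: 0.5.7)
VERSION=0.5.7
URL="https://github.com/ollama/ollama/releases/download/v${VERSION}/ollama-linux-amd64.tgz"
echo "$URL"
```
The resulting URL replaces `https://ollama.com/download/ollama-linux-amd64.tgz` in the manual install steps above.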
## Viewing logs
To view logs of Ollama running as a startup service, run:
```shell
journalctl -e -u ollama
```
Add `-f` to follow the log as new entries arrive.
## Uninstall
Remove the ollama service:
```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```
Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):
```shell
sudo rm $(which ollama)
```
Remove the downloaded models and Ollama service user and group:
```shell
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
```
Remove installed libraries (the location depends on how Ollama was installed: `/usr/lib/ollama` for the manual tarball install, `/usr/local/lib/ollama` for the install script):
```shell
sudo rm -rf /usr/lib/ollama /usr/local/lib/ollama
```