From 9395b4a1dceb1d4322865549c6aef5d82625d472 Mon Sep 17 00:00:00 2001
From: Ignasi Cervero
Date: Sun, 24 Dec 2023 12:23:16 +0100
Subject: [PATCH] Refactor Docker installation instructions in README for
 enhanced clarity

- Separate GPU support and API exposure instructions into distinct sections
- Improve readability and user guidance for Docker Compose setup

---
 README.md | 17 +++++++++++------
 1 file changed, 11 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 52fd33540..b3e407f05 100644
--- a/README.md
+++ b/README.md
@@ -79,13 +79,18 @@ If you don't have Ollama installed yet, you can use the provided Docker Compose
 docker compose up -d --build
 ```
 
-This command will install both Ollama and Ollama Web UI on your system.
-Enable GPU support or Exposing Ollama API outside the container stack with the following command:
+This command will install both Ollama and Ollama Web UI on your system.
+
+#### Enable GPU
+Use the additional Docker Compose file designed to enable GPU support by running the following command:
 ```bash
-docker compose -f docker-compose.yml \
-    -f docker-compose.gpu.yml \
-    -f docker-compose.api.yml \
-    up -d --build
+docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d --build
+```
+
+#### Expose Ollama API outside the container stack
+Deploy the service with an additional Docker Compose file designed for API exposure:
+```bash
+docker compose -f docker-compose.yml -f docker-compose.api.yml up -d --build
 ```
 
 ### Installing Ollama Web UI Only