# Open WebUI (Formerly Ollama WebUI) 👋
![GitHub stars](https://img.shields.io/github/stars/open-webui/open-webui?style=social)
![GitHub forks](https://img.shields.io/github/forks/open-webui/open-webui?style=social)
![GitHub watchers](https://img.shields.io/github/watchers/open-webui/open-webui?style=social)
![GitHub repo size](https://img.shields.io/github/repo-size/open-webui/open-webui)
![GitHub language count](https://img.shields.io/github/languages/count/open-webui/open-webui)
![GitHub top language](https://img.shields.io/github/languages/top/open-webui/open-webui)
![GitHub last commit](https://img.shields.io/github/last-commit/open-webui/open-webui?color=red)
![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Follama-webui%2Follama-wbui&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)
[![Discord](https://img.shields.io/badge/Discord-Open_WebUI-blue?logo=discord&logoColor=white)](https://discord.gg/5rJgQTnV4s)
[![](https://img.shields.io/static/v1?label=Sponsor&message=%E2%9D%A4&logo=GitHub&color=%23fe8e86)](https://github.com/sponsors/tjbck)
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs. For more information, be sure to check out our [Open WebUI Documentation](https://docs.openwebui.com/).
![Open WebUI Demo](./demo.gif)
## Key Features of Open WebUI ⭐
- 🚀 **Effortless Setup**: Install seamlessly using Docker or Kubernetes (kubectl, kustomize or helm) for a hassle-free experience.
- 🤝 **Ollama/OpenAI API Integration**: Effortlessly integrate OpenAI-compatible APIs for versatile conversations alongside Ollama models. Customize the Ollama API URL to link with **LMStudio, GroqCloud, Mistral, OpenRouter, and more**.
- 📱 **Responsive Design**: Enjoy a seamless experience across Desktop PC, Laptop, and Mobile devices.
- ✒️🔢 **Full Markdown and LaTeX Support**: Elevate your LLM experience with comprehensive Markdown and LaTeX capabilities for enriched interaction.
- 🧩 **Model Builder**: Easily create Ollama models via the Web UI. Create and add custom characters/agents, customize chat elements, and import models effortlessly through [Open WebUI Community](https://openwebui.com/) integration.
- 🔍 **RAG Embedding Support**: Change the RAG embedding model directly in document settings, enhancing document processing. This feature supports Ollama and OpenAI models.
- 🌐 **Web Browsing Capability**: Seamlessly integrate websites into your chat experience using the `#` command followed by the URL. This feature allows you to incorporate web content directly into your conversations, enhancing the richness and depth of your interactions.
- 🎨 **Image Generation Integration**: Seamlessly incorporate image generation capabilities using options such as AUTOMATIC1111 API or ComfyUI (local), and OpenAI's DALL-E (external), enriching your chat experience with dynamic visual content.
- 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.
- 🔐 **Role-Based Access Control (RBAC)**: Ensure secure access with restricted permissions; only authorized individuals can access your Ollama instance, and model creation and pulling rights are reserved for administrators.
- 🌐🌍 **Multilingual Support**: Experience Open WebUI in your preferred language with our internationalization (i18n) support. Join us in expanding our supported languages! We're actively seeking contributors!
- 🌟 **Continuous Updates**: We are committed to improving Open WebUI with regular updates, fixes, and new features.
Want to learn more about Open WebUI's features? Check out our [Open WebUI documentation](https://docs.openwebui.com/) for a comprehensive overview!
## 🔗 Also Check Out Open WebUI Community!
Don't forget to explore our sibling project, [Open WebUI Community](https://openwebui.com/), where you can discover, download, and explore customized Modelfiles. Open WebUI Community offers a wide range of exciting possibilities for enhancing your chat interactions with Open WebUI! 🚀
## How to Install 🚀
> [!NOTE]
> Please note that for certain Docker environments, additional configurations might be needed. If you encounter any connection issues, our detailed guide on [Open WebUI Documentation](https://docs.openwebui.com/) is ready to assist you.
### Quick Start with Docker 🐳
> [!WARNING]
> When using Docker to install Open WebUI, make sure to include `-v open-webui:/app/backend/data` in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.
> [!TIP]
> If you wish to utilize Open WebUI with Ollama included or CUDA acceleration, we recommend utilizing our official images tagged with either `:cuda` or `:ollama`. To enable CUDA, you must install the [Nvidia CUDA container toolkit](https://docs.nvidia.com/dgx/nvidia-container-runtime-upgrade/) on your Linux/WSL system.
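Before pulling the `:cuda` image, it can help to verify that Docker can actually see your GPU. The following is only a sketch, assuming the CUDA container toolkit is installed and that a CUDA base image tag such as `12.4.0-base-ubuntu22.04` is available on Docker Hub (substitute whichever tag exists for your CUDA version):

```bash
# Smoke test: if the Nvidia CUDA container toolkit is configured correctly,
# this should print the same GPU table as running nvidia-smi on the host.
docker run --rm --gpus all nvidia/cuda:12.4.0-base-ubuntu22.04 nvidia-smi
```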
### Installation with Default Configuration
- **If Ollama is on your computer**, use this command:
```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
- **If Ollama is on a Different Server**, use this command:
To connect to Ollama on another server, change the `OLLAMA_BASE_URL` to the server's URL:
```bash
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
- **To run Open WebUI with Nvidia GPU support**, use this command:
```bash
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
```
### Installation for OpenAI API Usage Only
- **If you're only using OpenAI API**, use this command:
```bash
docker run -d -p 3000:8080 -e OPENAI_API_KEY=your_secret_key -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
### Installing Open WebUI with Bundled Ollama Support
This installation method uses a single container image that bundles Open WebUI with Ollama, allowing for a streamlined setup via a single command. Choose the appropriate command based on your hardware setup:
- **With GPU Support**:
Utilize GPU resources by running the following command:
```bash
docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```
- **For CPU Only**:
If you're not using a GPU, use this command instead:
```bash
docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```
Both commands provide a hassle-free, built-in installation of both Open WebUI and Ollama, so you can get everything up and running swiftly.
After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
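If the page does not load, a quick check is the following (a sketch only; the container name and port match the example commands above):

```shell
# Confirm the container is running and the web server responds.
docker ps --filter name=open-webui
curl -I http://localhost:3000
# If either check fails, inspect the startup logs.
docker logs open-webui
```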
### Other Installation Methods
We offer various installation alternatives, including non-Docker native installation methods, Docker Compose, Kustomize, and Helm. Visit our [Open WebUI Documentation](https://docs.openwebui.com/getting-started/) or join our [Discord community](https://discord.gg/5rJgQTnV4s) for comprehensive guidance.
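For reference, a Docker Compose setup roughly equivalent to the default `docker run` command above might look like the following. This is only a minimal illustrative sketch; consult the documentation for the project's maintained Compose file:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui:/app/backend/data
    restart: always

volumes:
  open-webui:
```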
### Troubleshooting
Encountering connection issues? Our [Open WebUI Documentation](https://docs.openwebui.com/troubleshooting/) has got you covered. For further assistance and to join our vibrant community, visit the [Open WebUI Discord](https://discord.gg/5rJgQTnV4s).
#### Open WebUI: Server Connection Error
If you're experiencing connection issues, it's often because the WebUI Docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your Docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: `http://localhost:8080`.
**Example Docker Command**:
```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
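Before reaching for `--network=host`, it can help to confirm from the host that Ollama is actually listening; `/api/tags` simply lists the installed models:

```shell
# Should return a JSON list of models; "connection refused" means Ollama
# isn't running or is bound to a different address.
curl http://127.0.0.1:11434/api/tags
```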
### Keeping Your Docker Installation Up-to-Date
In case you want to update your local Docker installation to the latest version, you can do it with [Watchtower](https://containrrr.dev/watchtower/):
```bash
docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
```
In the last part of the command, replace `open-webui` with your container name if it is different.
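If you prefer to update manually rather than via Watchtower, the usual pattern is to pull the new image and recreate the container (a sketch; your chats and settings survive in the named `open-webui` volume):

```shell
docker pull ghcr.io/open-webui/open-webui:main
docker stop open-webui
docker rm open-webui
# Re-run the same `docker run` command you used originally;
# the named volume keeps your existing data.
```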
### Moving from Ollama WebUI to Open WebUI
Check our Migration Guide available in our [Open WebUI Documentation](https://docs.openwebui.com/migration/).
## What's Next? 🌟
Discover upcoming features on our roadmap in the [Open WebUI Documentation](https://docs.openwebui.com/roadmap/).
## Supporters ✨
A big shoutout to our amazing supporters who are helping to make this project possible! 🙏
### Platinum Sponsors 🤍
- We're looking for Sponsors!
### Acknowledgments
Special thanks to [Prof. Lawrence Kim](https://www.lhkim.com/) and [Prof. Nick Vincent](https://www.nickmvincent.com/) for their invaluable support and guidance in shaping this project into a research endeavor. Grateful for your mentorship throughout the journey! 🙌
## License 📜
This project is licensed under the [MIT License](LICENSE) - see the [LICENSE](LICENSE) file for details. 📄
## Support 💬
If you have any questions, suggestions, or need assistance, please open an issue or join our
[Open WebUI Discord community](https://discord.gg/5rJgQTnV4s) to connect with us! 🤝
## Star History
<a href="https://star-history.com/#open-webui/open-webui&Date">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=open-webui/open-webui&type=Date&theme=dark" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=open-webui/open-webui&type=Date" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=open-webui/open-webui&type=Date" />
</picture>
</a>
---
Created by [Timothy J. Baek](https://github.com/tjbck) - Let's make Open WebUI even more amazing together! 💪