# Ollama
- Run models easily
- Download, manage and import models
## Install

```
pip install ollama
```
## Example quickstart

```python
import ollama
ollama.generate("./llama-7b-ggml.bin", "hi")
```
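A slightly fuller sketch of the same call, assuming `generate` yields the completion incrementally as it is produced (the exact return type is not documented here, so treat this as an assumption):

```python
import ollama

# Path to a local GGML model file (adjust to wherever your model lives).
model_path = "./llama-7b-ggml.bin"

# Assumption: generate yields chunks of the completion as they are produced.
for chunk in ollama.generate(model_path, "Why is the sky blue?"):
    print(chunk, end="", flush=True)
```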
## Reference
### `ollama.load`

Load a model for generation

```python
ollama.load("model name")
```
### `ollama.generate`

Generate a completion

```python
ollama.generate(model, "hi")
```
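Taken together with `load`, a typical flow might look like the sketch below. Reusing the same identifier for both calls is an assumption; the reference above only shows the calls themselves:

```python
import ollama

# Assumption: load prepares the model, and generate can then be called
# with the same identifier.
model = "./llama-7b-ggml.bin"
ollama.load(model)

completion = ollama.generate(model, "hi")
print(completion)
```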
### `ollama.models`

List available local models

```python
models = ollama.models()
```
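A minimal sketch of listing what is installed locally, assuming `models()` returns an iterable of model identifiers (the element type is an assumption):

```python
import ollama

# Assumption: models() returns an iterable of locally available model names.
for name in ollama.models():
    print(name)
```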
### `ollama.serve`

Serve the Ollama HTTP server
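A sketch of starting the server, assuming `serve()` runs in the current process and blocks until stopped; host/port options are not documented here:

```python
import ollama

# Assumption: serve() starts the Ollama HTTP server and blocks until stopped.
ollama.serve()
```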
## Coming Soon
### `ollama.pull`

Download a model

```python
ollama.pull("huggingface.co/thebloke/llama-7b-ggml")
```
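A sketch of a pull-then-run flow. That `pull` downloads the model and leaves it addressable by the same identifier is an assumption, since this API is still marked as coming soon:

```python
import ollama

# Assumption: pull downloads the model and makes it addressable by the
# same identifier used to fetch it.
model = "huggingface.co/thebloke/llama-7b-ggml"
ollama.pull(model)
print(ollama.generate(model, "hi"))
```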
### `ollama.import`

Import a model from a file

```python
ollama.import("./path/to/model")
```
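Note that `import` is a reserved keyword in Python, so `ollama.import(...)` as written would be a syntax error; presumably the final method will be exposed under a different name or calling convention once this lands.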
### `ollama.search`

Search for compatible models that Ollama can run

```python
ollama.search("llama-7b")
```
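A sketch of searching and inspecting results, assuming `search()` returns an iterable of compatible model identifiers that could then be passed to `pull` (both assumptions, as this API is also still to come):

```python
import ollama

# Assumption: search() returns an iterable of compatible model identifiers.
for result in ollama.search("llama-7b"):
    print(result)
```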
## Future CLI

In the future, there will be an easy CLI for testing out models:

```
ollama run huggingface.co/thebloke/llama-7b-ggml
> Downloading [================> ] 66.67% (2/3) 30.2MB/s
```