Mirror of https://github.com/ollama/ollama.git (synced 2025-04-09 20:29:23 +02:00)
* include seed in params for llama.cpp server and remove empty filter for temp
* relay default predict options to llama.cpp
  - reorganize options to match the predict request for readability
* omit empty stop

---------

Co-authored-by: hallh <hallh@users.noreply.github.com>
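For context, a minimal Go sketch of how such options could be grouped and marshaled for a llama.cpp completion request: `seed` and `temperature` are always included rather than filtered out, while `stop` is dropped when empty via `omitempty`. The struct, field names, and default values below are illustrative assumptions, not the repository's actual definitions.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// PredictOptions groups the options relayed to the llama.cpp server,
// ordered roughly as the predict request expects them, for readability.
// Field names approximate llama.cpp's request schema and are assumptions.
type PredictOptions struct {
	Prompt      string   `json:"prompt"`
	Seed        int      `json:"seed"`           // always included in params
	Temperature float32  `json:"temperature"`    // no longer filtered when zero
	NumPredict  int      `json:"n_predict"`
	Stop        []string `json:"stop,omitempty"` // omitted entirely when empty
}

// DefaultPredictOptions relays explicit defaults instead of leaving fields unset.
// The values here are hypothetical placeholders.
func DefaultPredictOptions() PredictOptions {
	return PredictOptions{
		Seed:        -1,  // hypothetical default: let llama.cpp choose a seed
		Temperature: 0.8, // hypothetical default
		NumPredict:  128, // hypothetical default
	}
}

func main() {
	opts := DefaultPredictOptions()
	opts.Prompt = "Why is the sky blue?"

	body, _ := json.Marshal(opts)
	// "stop" is absent from the output because the slice is empty.
	fmt.Println(string(body))
}
```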