db2a9ad1fe
Explicitly disable AVX2 on GPU builds
...
Even though we weren't setting it to on, somewhere in the CMake config
it was getting toggled on. By explicitly setting it to off, we get `/arch:AVX`
as intended.
2024-02-15 14:50:11 -08:00
c9ab1aead3
Merge pull request #2526 from dhiltgen/harden_for_quotes
...
Harden the OLLAMA_HOST lookup for quotes
2024-02-15 14:13:40 -08:00
4a10e7a7fa
Harden the OLLAMA_HOST lookup for quotes
2024-02-15 13:46:56 -08:00
86808f80a8
remove unused import
2024-02-15 12:09:11 -08:00
4240b045e6
always enable view logs
2024-02-15 12:08:27 -08:00
e547378893
disable default debug
2024-02-15 12:05:13 -08:00
fd77dbec4d
do not print update request headers
2024-02-15 11:36:35 -08:00
fefb3e77d1
Update README.md
2024-02-15 10:32:40 -08:00
ed5489a96e
higher resolution tray icons
2024-02-14 22:55:03 -08:00
76113742cf
update installer title
2024-02-15 05:56:45 +00:00
57e60c836f
better windows app and tray icons
2024-02-15 05:56:45 +00:00
622b1f3e67
update installer and app.exe metadata
2024-02-15 05:56:45 +00:00
7ad9844ac0
set exe metadata using resource files
2024-02-15 05:56:45 +00:00
e43648afe5
rerefactor
2024-02-15 05:56:45 +00:00
823a520266
Fix lint error on ignored error for win console
2024-02-15 05:56:45 +00:00
66ef308abd
Import "containerd/console" lib to support colorful output in Windows terminal
2024-02-15 05:56:45 +00:00
29e90cc13b
Implement new Go based Desktop app
...
This focuses on Windows first, but could be used for Mac
and possibly Linux in the future.
2024-02-15 05:56:45 +00:00
f397e0e988
Move hub auth out to new package
2024-02-15 05:56:45 +00:00
9da9e8fb72
Move Mac App to a new dir
2024-02-15 05:56:45 +00:00
42e77e2a69
handle race condition while setting raw mode in windows (#2509)
v0.1.25
2024-02-14 21:28:35 -08:00
9241a29336
Revert "Revert "bump submodule to 6c00a06 (#2479)"" (#2485)
...
This reverts commit 6920964b87.
2024-02-13 18:18:41 -08:00
f7231ad9ad
set shutting_down to false once shutdown is complete (#2484)
2024-02-13 17:48:41 -08:00
6920964b87
Revert "bump submodule to 6c00a06 (#2479)"
...
This reverts commit 2f9ed52bbd.
2024-02-13 17:23:05 -08:00
2f9ed52bbd
bump submodule to 6c00a06 (#2479)
2024-02-13 17:12:42 -08:00
caf2b13c10
Fix infinite keep_alive (#2480)
2024-02-13 15:40:32 -08:00
1d263449ff
Update README.md to include link to Ollama-ex Elixir library (#2477)
2024-02-13 11:40:44 -08:00
48a273f80b
Fix issues with templating prompt in chat mode (#2460)
2024-02-12 15:06:57 -08:00
939c60473f
Merge pull request #2422 from dhiltgen/better_kill
...
More robust shutdown
2024-02-12 14:05:06 -08:00
f76ca04f9e
update submodule to 099afc6 (#2468)
2024-02-12 14:01:16 -08:00
76b8728f0c
Merge pull request #2465 from dhiltgen/block_rocm_pre_9
...
Detect AMD GPU info via sysfs and block old cards
2024-02-12 12:41:43 -08:00
1f9078d6ae
Check image filetype in api handlers (#2467)
2024-02-12 11:16:20 -08:00
6d84f07505
Detect AMD GPU info via sysfs and block old cards
...
This wires up some new logic to start using sysfs to discover AMD GPU
information and detects old cards we can't yet support, so we can fall back to CPU mode.
2024-02-12 08:19:41 -08:00
26b13fc33c
patch: always add token to cache_tokens (#2459)
2024-02-12 08:10:16 -08:00
1c8435ffa9
Update domain name references in docs and install script (#2435)
2024-02-09 15:19:30 -08:00
6680761596
Shutdown faster
...
Make sure that when a shutdown signal comes, we shut down quickly instead
of waiting for a potentially long exchange to wrap up.
2024-02-08 22:22:50 -08:00
42b797ed9c
Update openai.md
2024-02-08 15:03:23 -05:00
336aa43f3c
Update openai.md
2024-02-08 12:48:28 -05:00
69f392c9b7
Merge pull request #2403 from dhiltgen/handle_tmp_cleanup
...
Ensure the libraries are present
v0.1.24
2024-02-07 17:55:31 -08:00
a1dfab43b9
Ensure the libraries are present
...
When we store our libraries in a temp dir, a reaper might clean
them up while we are idle, so make sure to check for them before
we reload.
2024-02-07 17:27:49 -08:00
a0a199b108
Fix hanging issue when sending empty content (#2399)
2024-02-07 19:30:33 -05:00
ab0d37fde4
Update openai.md
2024-02-07 17:25:33 -05:00
14e71350c8
Update openai.md
2024-02-07 17:25:24 -05:00
453f572f83
Initial OpenAI /v1/chat/completions API compatibility (#2376)
2024-02-07 17:24:29 -05:00
c9dfa6e571
Merge pull request #2377 from dhiltgen/bump_llamacpp
...
Bump llama.cpp to b2081
2024-02-07 12:04:38 -08:00
3dcbcd367d
Merge pull request #2394 from ollama/mxyng/fix-error-response
2024-02-07 11:47:31 -08:00
e805ac1d59
fix response on token error
2024-02-07 11:05:49 -08:00
b9229ffca5
Merge pull request #2378 from ollama/mxyng/runners
...
runners
2024-02-06 13:49:58 -08:00
46c847c4ad
enable rocm builds
2024-02-06 13:36:13 -08:00
92b1a21f79
use linux runners
2024-02-06 13:36:04 -08:00
de76b95dd4
Bump llama.cpp to b2081
2024-02-06 12:06:43 -08:00