Files
ollama/server
Jesse Gross 06007c0a18 Allow models to force a new batch
This is useful for a few things:
 - Work around bugs triggered by having 2 images in one batch
 - Keep the image in a single batch for fully connected attention
 - Improve performance by not evaluating embeddings multiple times
2025-03-11 14:49:20 -07:00
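The commit message describes the mechanism only at a high level. As a rough illustration of the idea, here is a minimal sketch of a batching loop that honors a per-input "force new batch" flag. The Input type, ForceNewBatch field, and splitBatches helper are hypothetical names for this sketch and are not ollama's actual server/runner API.

```go
package main

import "fmt"

// Input is a single token or embedding chunk produced during a model's
// input processing (hypothetical type; not ollama's real struct).
type Input struct {
	Token     int
	Embedding []float32
	// ForceNewBatch asks the runner to close the current batch before
	// this input is added, so the input (e.g. an image's embeddings)
	// starts in a fresh batch.
	ForceNewBatch bool
}

// splitBatches groups inputs into batches of at most maxBatch entries,
// starting a new batch whenever an input sets ForceNewBatch.
func splitBatches(inputs []Input, maxBatch int) [][]Input {
	var batches [][]Input
	var cur []Input
	for _, in := range inputs {
		if len(cur) > 0 && (in.ForceNewBatch || len(cur) >= maxBatch) {
			batches = append(batches, cur)
			cur = nil
		}
		cur = append(cur, in)
	}
	if len(cur) > 0 {
		batches = append(batches, cur)
	}
	return batches
}

func main() {
	inputs := []Input{
		{Token: 1}, {Token: 2},
		// An image chunk forces a batch boundary so it is kept together
		// and evaluated once, in a single batch.
		{ForceNewBatch: true, Embedding: []float32{0.1, 0.2}},
		{Token: 3}, {Token: 4},
	}
	for i, b := range splitBatches(inputs, 8) {
		fmt.Printf("batch %d: %d inputs\n", i, len(b))
	}
}
```

Under this assumption, an input that sets the flag always starts a fresh batch, which lines up with the motivations in the commit message: the image stays in one batch for fully connected attention and its embeddings are not evaluated more than once.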