mirror of https://github.com/ollama/ollama.git
synced 2025-04-07 11:28:17 +02:00
Update docs
This commit is contained in:
parent 4982089c84
commit 75f88e7aac
@@ -94,12 +94,11 @@ except Exception as e:
    print(f"Error: {e}")
```
#### Experimental
#### Extra Arguments
- `num_ctx` parameter can be used to set the context window for the model
- The OpenAI Python SDK does not support setting the context window size; however, it can be set for Ollama through the `extra_body` parameter
- The recommended way to control this is through the [Ollama Python SDK](https://github.com/ollama/ollama-python) with the `options` parameter
```py
completion = client.beta.chat.completions.create(
    model="llama3.1:8b",
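The `extra_body` mechanism works because the OpenAI Python SDK merges those keys into the top level of the request JSON, which is where Ollama reads `num_ctx`. A minimal offline sketch of that merge (the `build_chat_body` helper is hypothetical, not part of either SDK; no request is sent):

```python
import json

def build_chat_body(model, messages, extra_body=None):
    # Standard OpenAI-style fields first...
    body = {"model": model, "messages": messages}
    # ...then extra_body keys are merged into the top level of the JSON,
    # which is how a non-OpenAI option like num_ctx reaches Ollama.
    if extra_body:
        body.update(extra_body)
    return json.dumps(body)

payload = build_chat_body(
    "llama3.1:8b",
    [{"role": "user", "content": "Say this is a test"}],
    extra_body={"num_ctx": 4096},
)
print(payload)
```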
@@ -156,12 +155,11 @@ const embedding = await openai.embeddings.create({
})
```
#### Experimental
#### Extra Arguments
- `num_ctx` parameter can be used to set the context window for the model
- The OpenAI JS SDK does not support setting the context window size; however, it can be set for Ollama by passing `num_ctx` directly as an undocumented parameter (suppressing the type error with `@ts-expect-error`), as described in the [OpenAI JS SDK](https://github.com/openai/openai-node?tab=readme-ov-file#making-customundocumented-requests)
- The recommended way to control this is through the [Ollama JS SDK](https://github.com/ollama/ollama-js) with the `options` parameter
```js
const chatCompletion = await openai.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
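In both SDKs, the `options` parameter maps onto Ollama's native `/api/chat` endpoint, where model options such as `num_ctx` live in a nested `options` object rather than as top-level keys. A sketch of that request body (values are illustrative; no network call is made):

```python
import json

# Request body for Ollama's native /api/chat endpoint: unlike the OpenAI
# compatibility layer, model options such as num_ctx go in a nested
# "options" object.
body = {
    "model": "llama3.1:8b",
    "messages": [{"role": "user", "content": "Say this is a test"}],
    "options": {"num_ctx": 4096},  # context window size in tokens
    "stream": False,
}
payload = json.dumps(body, indent=2)
print(payload)
```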