---
title: JetBrains
---

<Note>This example uses **IntelliJ**; the same steps apply to other JetBrains IDEs (e.g., PyCharm).</Note>

## Install

Install [IntelliJ](https://www.jetbrains.com/idea/).

## Usage with Ollama

<Note>
To use **Ollama**, you will need a [JetBrains AI Subscription](https://www.jetbrains.com/ai-ides/buy/?section=personal&billing=yearly).
</Note>

1. In IntelliJ, click the **chat icon** in the right sidebar

<div style={{ display: 'flex', justifyContent: 'center' }}>
  <img
    src="/images/intellij-chat-sidebar.png"
    alt="IntelliJ chat icon in the sidebar"
    width="50%"
  />
</div>

2. Select the **current model** in the sidebar, then click **Set up Local Models**

<div style={{ display: 'flex', justifyContent: 'center' }}>
  <img
    src="/images/intellij-current-model.png"
    alt="IntelliJ current model selector in the bottom right corner"
    width="50%"
  />
</div>

3. Under **Third Party AI Providers**, choose **Ollama**
4. Confirm the **Host URL** is `http://localhost:11434`, then click **OK** (a quick way to check that Ollama is reachable at this address is sketched below)
5. Once connected, select a model under **Local models by Ollama**

<div style={{ display: 'flex', justifyContent: 'center' }}>
  <img
    src="/images/intellij-local-models.png"
    alt="Local models by Ollama in IntelliJ"
    width="50%"
  />
</div>
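
To confirm that Ollama is actually reachable at the host URL from step 4, you can query its `/api/tags` endpoint, which lists the models available locally. The snippet below is a minimal sketch in Kotlin (chosen because it runs as-is in an IntelliJ scratch file); a plain HTTP client pointed at the same URL works just as well.

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Default Ollama host URL, the same value entered in step 4.
    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/tags"))
        .GET()
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())

    // A 200 status with a JSON list of models means Ollama is running and
    // IntelliJ should be able to populate "Local models by Ollama".
    println("Status: ${response.statusCode()}")
    println(response.body())
}
```

If the request fails, make sure Ollama is running; if the returned list is empty, pull a model first (for example, `ollama pull llama3.2`) so it appears under **Local models by Ollama**.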