docs: Document how to use custom api_url in Assistant (#11790)

This essentially documents the comment here:
https://github.com/zed-industries/zed/issues/4424#issuecomment-2053646583

Release Notes:


- N/A
Thorsten Ball 2024-05-14 11:39:57 +02:00 committed by GitHub
parent 95e0d5ed74
commit 4766b41e96

@@ -78,3 +78,58 @@ After you submit your first message, a name for your conversation is generated b
## Multiple cursor demo
The assistant is capable of sending multiple requests, and receiving multiple responses, in parallel. [Here's a demo](https://zed.dev/img/post/assistant/demo.webm).
## Using a custom API endpoint for OpenAI
You can use a custom API endpoint for OpenAI, as long as it's compatible with the OpenAI API structure.
To do so, add the following to your Zed `settings.json`:
```json
{
  "assistant": {
    "version": "1",
    "provider": {
      "name": "openai",
      "type": "openai",
      "default_model": "gpt-4-turbo-preview",
      "api_url": "http://localhost:11434/v1"
    }
  }
}
```
The custom URL here is `http://localhost:11434/v1`.
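If you want to check that the endpoint actually speaks the OpenAI API before pointing Zed at it, you can send a chat completion request to it directly. This is a minimal sketch, assuming the endpoint from the example above is running locally and serves a model named `gpt-4-turbo-preview`:
```
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4-turbo-preview",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```
An OpenAI-compatible endpoint responds with a JSON body containing a `choices` array.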
## Using Ollama on macOS
You can use Ollama with the Zed assistant by making Ollama appear as an OpenAI-compatible API endpoint.
1. Add the following to your Zed `settings.json`:
   ```json
   {
     "assistant": {
       "version": "1",
       "provider": {
         "name": "openai",
         "type": "openai",
         "default_model": "gpt-4-turbo-preview",
         "api_url": "http://localhost:11434/v1"
       }
     }
   }
   ```
2. Download, for example, the `mistral` model with Ollama:
   ```
   ollama run mistral
   ```
3. Copy the model, renaming it to match the `default_model` in your Zed `settings.json`:
   ```
   ollama cp mistral gpt-4-turbo-preview
   ```
4. Use `assistant: reset key` (see the [Setup](#setup) section above) and enter the following API key:
   ```
   ollama
   ```
5. Restart Zed
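
To confirm the setup, you can check that the renamed model shows up in Ollama's local model list — a quick sanity check, assuming Ollama is running with its default configuration:
```
ollama list
```
The output should include an entry named `gpt-4-turbo-preview`; requests the assistant sends for that model are then served by the copied `mistral` weights.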