---
title: Ollama
---
This guide was put together in collaboration with members of the Quivr Discord, in the **Using Quivr fully locally** thread, which is a good place to discuss it: https://discord.com/invite/HUpRgp2HG8
## Ollama
Ollama is a tool that lets you run LLMs locally. We use it to run Llama 2, Mistral, and other models on your own machine.
### Install Ollama
Install Ollama from their [website](https://ollama.ai/).
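On Linux, the website also offers a one-line install script; at the time of writing it was the command below (verify the current command on ollama.ai before piping a script into your shell):

```shell
# Downloads and runs Ollama's official Linux install script.
curl -fsSL https://ollama.ai/install.sh | sh
```

On macOS, download the app from the website instead.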
Then run the following command to download and start the Llama 2 model:
```bash
ollama run llama2
```
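Before wiring the model into Quivr, you can check that the Ollama server is reachable. By default it listens on `localhost:11434` and exposes an HTTP API; the prompt below is just a placeholder:

```shell
# Ask the local Ollama server for a non-streamed completion.
# Assumes the default port 11434 and that llama2 has already been pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Reply with one word.",
  "stream": false
}'
```

If this returns a JSON response rather than a connection error, Ollama is up and serving the model.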
## Add Ollama Model to Quivr
Now that you have your model running locally, you need to add it to Quivr.
To let the user choose an Ollama model, we need to add it to the Quivr backend.
Go to Supabase and, in the `user_settings` table, add the following value to the `models` column, either as the default or just for your user:
```json
[
  "ollama/llama2",
  "ollama/mistral"
]
```
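The value must be valid JSON — a trailing comma after the last entry, for example, will be rejected if the column is a JSON type. A quick sanity check before pasting it into the Supabase editor:

```shell
# python3 -m json.tool pretty-prints valid JSON and exits non-zero otherwise.
echo '["ollama/llama2", "ollama/mistral"]' | python3 -m json.tool
```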
This will add the Ollama model to the list of models that the user can choose from.
Adding the value as the default means that all new users will get these models. If you want them for your user only, set the `models` column on your own row in the `user_settings` table instead. Note that changing the default does not update existing rows; for the change to take effect you need to clear the table so it is repopulated with the new defaults:
```sql
DELETE FROM user_settings;
```