feat: Add Quivr chatbot example (#2827)

The commit adds a new Quivr chatbot example to the repository. The
example demonstrates how to create a simple chatbot using Quivr and
Chainlit. Users can upload a text file and ask questions about its
content. The commit includes the necessary files, installation
instructions, and usage guidelines.
Stan Girard 2024-07-10 21:42:49 +02:00 committed by GitHub
parent 7b8db6b9ec
commit 5ff8d4ee81
5 changed files with 162 additions and 0 deletions

.gitignore

@@ -89,3 +89,7 @@ backend/.env.test
**/*.egg-info
.coverage
backend/core/examples/chatbot/.files/*
backend/core/examples/chatbot/.python-version
backend/core/examples/chatbot/.chainlit/config.toml
backend/core/examples/chatbot/.chainlit/translations/en-US.json


@@ -0,0 +1,45 @@
# Quivr Chatbot Example

This example demonstrates how to create a simple chatbot using Quivr and Chainlit. The chatbot allows users to upload a text file and then ask questions about its content.

## Prerequisites

- Python 3.8 or higher

## Installation

1. Clone the repository or navigate to the `backend/core/examples/chatbot` directory.
2. Install the required dependencies:

   ```
   pip install -r requirements.txt
   ```

## Running the Chatbot

1. Start the Chainlit server:

   ```
   chainlit run main.py
   ```

2. Open your web browser and go to the URL displayed in the terminal (usually `http://localhost:8000`).

## Using the Chatbot

1. When the chatbot interface loads, you will be prompted to upload a text file.
2. Click on the upload area and select a `.txt` file from your computer. The file size should not exceed 20 MB.
3. After uploading, the chatbot will process the file and inform you when it's ready.
4. You can now start asking questions about the content of the uploaded file.
5. Type your questions in the chat input and press Enter. The chatbot will respond based on the information in the uploaded file.

## How It Works

The chatbot uses the Quivr library to create a "brain" from the uploaded text file. This brain is then used to answer questions about the file's content. The Chainlit library provides the user interface and handles the chat interactions.

Enjoy chatting with your documents!
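The ingest-then-ask flow described above can be sketched with a toy stand-in. `ToyBrain` below is a hypothetical keyword matcher invented for illustration, not Quivr's real API; it only shows the shape of "build a brain from text, then query it":

```python
class ToyBrain:
    """Toy stand-in for the ingest-then-ask flow (not Quivr's API)."""

    def __init__(self, text: str):
        # "Ingest": split the document into non-empty lines to search later.
        self.lines = [line for line in text.splitlines() if line.strip()]

    def ask(self, question: str) -> str:
        # "Answer": return the first line sharing a word with the question.
        words = {w.lower().strip("?.,!") for w in question.split()}
        for line in self.lines:
            if words & {w.lower() for w in line.split()}:
                return line
        return "I don't know."


brain = ToyBrain("Quivr builds brains from files.\nChainlit renders the chat UI.")
answer = brain.ask("What renders the UI?")
```

The real example replaces the keyword lookup with Quivr's retrieval over the uploaded file, but the lifecycle is the same: construct once at upload time, query on every chat message.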


@@ -0,0 +1,45 @@
# Quivr Chatbot Example

This example demonstrates how to create a simple chatbot using Quivr and Chainlit. The chatbot allows users to upload a text file and then ask questions about its content.

## Prerequisites

- Python 3.8 or higher

## Installation

1. Clone the repository or navigate to the `backend/core/examples/chatbot` directory.
2. Install the required dependencies:

   ```
   pip install -r requirements.txt
   ```

## Running the Chatbot

1. Start the Chainlit server:

   ```
   chainlit run main.py
   ```

2. Open your web browser and go to the URL displayed in the terminal (usually `http://localhost:8000`).

## Using the Chatbot

1. When the chatbot interface loads, you will be prompted to upload a text file.
2. Click on the upload area and select a `.txt` file from your computer. The file size should not exceed 20 MB.
3. After uploading, the chatbot will process the file and inform you when it's ready.
4. You can now start asking questions about the content of the uploaded file.
5. Type your questions in the chat input and press Enter. The chatbot will respond based on the information in the uploaded file.

## How It Works

The chatbot uses the Quivr library to create a "brain" from the uploaded text file. This brain is then used to answer questions about the file's content. The Chainlit library provides the user interface and handles the chat interactions.

Enjoy chatting with your documents!


@@ -0,0 +1,63 @@
import tempfile

import chainlit as cl
from quivr_core import Brain


@cl.on_chat_start
async def on_chat_start():
    files = None

    # Wait for the user to upload a file
    while files is None:
        files = await cl.AskFileMessage(
            content="Please upload a text .txt file to begin!",
            accept=["text/plain"],
            max_size_mb=20,
            timeout=180,
        ).send()

    file = files[0]

    msg = cl.Message(content=f"Processing `{file.name}`...", disable_feedback=True)
    await msg.send()

    with open(file.path, "r", encoding="utf-8") as f:
        text = f.read()

    with tempfile.NamedTemporaryFile(
        mode="w", suffix=".txt", delete=False
    ) as temp_file:
        temp_file.write(text)
        temp_file.flush()
        temp_file_path = temp_file.name

    brain = Brain.from_files(name="user_brain", file_paths=[temp_file_path])

    # Store the file path in the session
    cl.user_session.set("file_path", temp_file_path)

    # Let the user know that the system is ready
    msg.content = f"Processing `{file.name}` done. You can now ask questions!"
    await msg.update()

    cl.user_session.set("brain", brain)


@cl.on_message
async def main(message: cl.Message):
    brain = cl.user_session.get("brain")  # type: Brain
    if brain is None:
        await cl.Message(content="Please upload a file first.").send()
        return

    # Prepare the message for streaming
    msg = cl.Message(content="")
    await msg.send()

    # Use the ask_streaming method for streaming responses
    async for chunk in brain.ask_streaming(message.content):
        await msg.stream_token(chunk.answer)

    await msg.send()
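The temp-file handoff above (the upload's text is re-written to a named temporary file so `Brain.from_files` can ingest it from a path) can be exercised on its own with the standard library. `save_upload_to_tempfile` is an illustrative helper, not part of the example:

```python
import os
import tempfile


def save_upload_to_tempfile(text: str) -> str:
    """Write text to a .txt temp file and return its path, mirroring
    how main.py hands the uploaded content to Brain.from_files."""
    with tempfile.NamedTemporaryFile(
        mode="w", suffix=".txt", delete=False, encoding="utf-8"
    ) as temp_file:
        temp_file.write(text)
        temp_file.flush()
        return temp_file.name


path = save_upload_to_tempfile("hello quivr")
with open(path, "r", encoding="utf-8") as f:
    roundtrip = f.read()
os.remove(path)  # delete=False means cleanup is the caller's responsibility
```

Note that because `delete=False` is used, the temporary file outlives the `with` block; the example stores its path in the Chainlit session but never deletes it.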


@@ -0,0 +1,5 @@
quivr-core==0.0.7
langchain-community==0.2.6
faiss-cpu==1.8.0
langchain-openai==0.1.14
chainlit==1.0.0