quivr/backend/models/chats.py
Mamadou DICKO 59fe7b089b
feat(chat): use openai function for answer (#354)
* feat(chat): use openai function for answer (backend)

* feat(chat): use openai function for answer (frontend)

* chore: refacto BrainPicking

* feat: update chat creation logic

* feat: simplify chat system logic

* feat: set default method to gpt-3.5-turbo-0613

* feat: use user own openai key

* feat(chat): slightly improve prompts

* feat: add global error interceptor

* feat: remove unused endpoints

* docs: update chat system doc

* chore(linter): add unused import remove config

* feat: improve dx

* feat: improve OpenAiFunctionBasedAnswerGenerator prompt
2023-06-22 17:50:06 +02:00


from typing import List, Optional, Tuple
from uuid import UUID

from pydantic import BaseModel


class ChatMessage(BaseModel):
    model: str = "gpt-3.5-turbo-16k"
    question: str
    # A list of tuples where each tuple is (speaker, text)
    history: List[Tuple[str, str]]
    temperature: float = 0.0
    max_tokens: int = 256
    use_summarization: bool = False
    chat_id: Optional[UUID] = None
    chat_name: Optional[str] = None


class ChatQuestion(BaseModel):
    model: str = "gpt-3.5-turbo-0613"
    question: str
    temperature: float = 0.0
    max_tokens: int = 256
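
A minimal sketch of how a model like `ChatMessage` behaves when instantiated (assuming `pydantic` is installed; the example values are hypothetical, and the class is redeclared here only so the snippet is self-contained):

```python
from typing import List, Optional, Tuple
from uuid import UUID

from pydantic import BaseModel


# Mirrors the ChatMessage model above, redeclared for a standalone demo.
class ChatMessage(BaseModel):
    model: str = "gpt-3.5-turbo-16k"
    question: str
    # A list of tuples where each tuple is (speaker, text)
    history: List[Tuple[str, str]]
    temperature: float = 0.0
    max_tokens: int = 256
    use_summarization: bool = False
    chat_id: Optional[UUID] = None
    chat_name: Optional[str] = None


# Only `question` and `history` are required; every other field falls back
# to its declared default, so a caller can send a minimal payload.
msg = ChatMessage(
    question="What is Quivr?",
    history=[("user", "hello"), ("assistant", "hi there")],
)
print(msg.model)    # default model name
print(msg.chat_id)  # None until a chat exists
```

Because `chat_id` and `chat_name` are `Optional` with `None` defaults, the same model can carry both the first message of a new chat (no id yet) and follow-up messages in an existing one.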