# Description
This commit adds the langchain_openai and langchain_anthropic
dependencies used by `llm_endpoint.py`.
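As a rough illustration (a sketch only; the `get_chat_model` helper and the provider-routing rule below are hypothetical and not necessarily how `llm_endpoint.py` is actually wired), these dependencies make both providers' chat models constructible from one place:

```python
# Hypothetical sketch: route a model name to the matching LangChain chat class.
# API keys are read from the OPENAI_API_KEY / ANTHROPIC_API_KEY environment variables.
from langchain_anthropic import ChatAnthropic
from langchain_openai import ChatOpenAI


def get_chat_model(model: str, temperature: float = 0.0):
    """Return a chat model for the given model name (illustrative routing rule)."""
    if model.startswith("claude"):
        return ChatAnthropic(model=model, temperature=temperature)
    return ChatOpenAI(model=model, temperature=temperature)
```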
# Description
Use LangGraph instead of LangChain LCEL to build and run the RAG
pipeline, as LangGraph enables greater flexibility and easier
maintainability of complex (agentic) pipelines.
Completes CORE-175
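For context, here is a minimal, self-contained sketch of a RAG flow expressed as a LangGraph state graph (illustrative only: the node names and the placeholder `retrieve`/`generate` bodies are not the actual Quivr pipeline):

```python
# Minimal illustration of a RAG pipeline as a LangGraph state graph.
# The node bodies are stand-ins; the real pipeline would query the vector store
# and call the configured LLM instead.
from typing import List, TypedDict

from langgraph.graph import END, START, StateGraph


class RAGState(TypedDict):
    question: str
    docs: List[str]
    answer: str


def retrieve(state: RAGState) -> dict:
    # Stand-in for a vector-store lookup keyed on the question.
    return {"docs": [f"chunk relevant to: {state['question']}"]}


def generate(state: RAGState) -> dict:
    # Stand-in for an LLM call that answers from the retrieved context.
    context = "\n".join(state["docs"])
    return {"answer": f"Answer derived from context:\n{context}"}


builder = StateGraph(RAGState)
builder.add_node("retrieve", retrieve)
builder.add_node("generate", generate)
builder.add_edge(START, "retrieve")
builder.add_edge("retrieve", "generate")
builder.add_edge("generate", END)
graph = builder.compile()

print(graph.invoke({"question": "What is Quivr?"}))
```

The benefit over a linear LCEL chain is that additional nodes (reranking, tool calls, conditional branches) can later be attached as edges on this graph without rewriting the whole pipeline.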
## Checklist before requesting a review
- [x] My code follows the style guidelines of this project
- [x] I have performed a self-review of my code
- [x] I have commented hard-to-understand areas
- [ ] I have ideally added tests that prove my fix is effective or that
my feature works
- [x] New and existing unit tests pass locally with my changes
- [x] Any dependent changes have been merged
---------
Co-authored-by: Stan Girard <girard.stanislas@gmail.com>
# Testing backend
## Docker setup
1. Copy `.env.example` to `.env`. Some env variables were added:
`EMBEDDING_DIM` (see the example after this list).
2. Apply the Supabase migrations:
```sh
supabase stop
supabase db reset
supabase start
```
3. Start the backend containers:
```sh
make dev
```
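For step 1, one way to do it (assuming the commands are run from the directory that contains `.env.example`):

```sh
cp .env.example .env
# Confirm the newly added variables are present, e.g.:
grep EMBEDDING_DIM .env
```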
## Local setup
You can also run the backend without Docker.
1. Install [`rye`](https://rye.astral.sh/guide/installation/). Choose
the managed Python option and set the version to 3.11.
2. Run the following:
```sh
cd quivr/backend
rye sync
```
3. Activate the `.venv` virtual environment: `source .venv/bin/activate`
4. Run the backend; make sure Redis and the Supabase API are running
(see the Redis note after the Notifier command):
```sh
LOG_LEVEL=debug uvicorn quivr_api.main:app --log-level debug --reload --host 0.0.0.0 --port 5050 --workers 1
```
Worker:
```sh
LOG_LEVEL=debug celery -A quivr_worker.celery_worker worker -l info -E --concurrency 1
```
Notifier:
```sh
LOG_LEVEL=debug python worker/quivr_worker/celery_monitor.py
```
---------
Co-authored-by: chloedia <chloedaems0@gmail.com>
Co-authored-by: aminediro <aminedirhoussi1@gmail.com>
Co-authored-by: Antoine Dewez <44063631+Zewed@users.noreply.github.com>
Co-authored-by: Chloé Daems <73901882+chloedia@users.noreply.github.com>
Co-authored-by: Zewed <dewez.antoine2@gmail.com>