fix: Update LLMEndpoint to include max_tokens parameter (#3201)

# Description

This PR forwards the configured `max_tokens` value to the chat model constructors in `LLMEndpoint`. Previously none of the three branches (Azure, `ChatAnthropic`, `ChatOpenAI`) passed the setting through, so each model silently fell back to its library default output-token limit.
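
For context, `max_tokens` is a standard keyword on the langchain chat models involved here. The sketch below (illustrative values only, not the project's actual config plumbing) shows what the plain-OpenAI branch effectively constructs after this change:

```python
from langchain_openai import ChatOpenAI

# Illustrative stand-ins for the config fields referenced in the diff
# (config.model, config.llm_api_key, config.llm_base_url, config.max_tokens).
llm = ChatOpenAI(
    model="gpt-4o-mini",
    api_key="sk-placeholder",  # None when no API key is configured
    base_url=None,             # None falls back to the default OpenAI endpoint
    max_tokens=2048,           # previously dropped; now forwarded from config
)
```

Without the forwarded value, the constructor falls back to its own default and the configured limit never reaches the provider.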

## Checklist before requesting a review


- [ ] My code follows the style guidelines of this project
- [ ] I have performed a self-review of my code
- [ ] I have commented hard-to-understand areas
- [ ] I have added tests that prove my fix is effective or that my feature
works
- [ ] New and existing unit tests pass locally with my changes
- [ ] Any dependent changes have been merged


```diff
@@ -42,6 +42,7 @@ class LLMEndpoint:
                 if config.llm_api_key
                 else None,
                 azure_endpoint=azure_endpoint,
+                max_tokens=config.max_tokens
             )
         elif config.model.startswith("claude"):
             _llm = ChatAnthropic(
@@ -50,6 +51,7 @@ class LLMEndpoint:
                 if config.llm_api_key
                 else None,
                 base_url=config.llm_base_url,
+                max_tokens=config.max_tokens
             )
         else:
             _llm = ChatOpenAI(
@@ -58,6 +60,7 @@ class LLMEndpoint:
                 if config.llm_api_key
                 else None,
                 base_url=config.llm_base_url,
+                max_tokens=config.max_tokens
             )
         return cls(llm=_llm, llm_config=config)
```
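
For reviewers, a minimal sketch of exercising the fix end to end. The config class name, the import paths, and the `from_config` classmethod name are assumptions inferred from the hunk context; the diff itself only shows `cls(llm=_llm, llm_config=config)`:

```python
# Assumed names: LLMEndpointConfig and LLMEndpoint.from_config are not
# visible in this diff; the import paths are likewise guesses.
from quivr_core.config import LLMEndpointConfig
from quivr_core.llm import LLMEndpoint

config = LLMEndpointConfig(
    model="claude-3-haiku-20240307",  # startswith("claude") -> ChatAnthropic branch
    max_tokens=1024,
)
endpoint = LLMEndpoint.from_config(config)
# The wrapped chat model is now created with max_tokens=1024 rather than
# the library default.
```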