Add max_input and max_tokens parameters to KnowledgeBrainQA constructor (#2247)

This pull request adds the `max_input` and `max_tokens` parameters to
the `KnowledgeBrainQA` constructor. These parameters set the maximum
input length and the maximum number of tokens for the `KnowledgeBrainQA`
model, giving callers more control over input size and token limits.
Stan Girard 2024-02-22 15:38:25 -08:00 committed by GitHub
parent 81072b3841
commit e4f920120f


@@ -181,6 +181,8 @@ class KnowledgeBrainQA(BaseModel, QAInterface):
 brain_id=brain_id,
 chat_id=chat_id,
 streaming=streaming,
+max_input=self.max_input,
+max_tokens=self.max_tokens,
 **kwargs,
 )
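
The hunk above forwards the values stored on the instance into the inner QA object it constructs. A minimal, self-contained sketch of that forwarding pattern follows; apart from `max_input` and `max_tokens`, the class and field names are stand-ins rather than the actual Quivr classes.

from pydantic import BaseModel


class InnerQASketch(BaseModel):
    # Stand-in for the component that ultimately consumes the limits.
    brain_id: str
    chat_id: str
    streaming: bool
    max_input: int
    max_tokens: int


class KnowledgeBrainQASketch(BaseModel):
    brain_id: str
    chat_id: str
    streaming: bool = False
    max_input: int = 2000   # illustrative default, not from the PR
    max_tokens: int = 256   # illustrative default, not from the PR

    def build_inner_qa(self, **kwargs) -> InnerQASketch:
        # Mirrors the diff: limits set on the outer model are passed through
        # to the inner component alongside the existing arguments.
        return InnerQASketch(
            brain_id=self.brain_id,
            chat_id=self.chat_id,
            streaming=self.streaming,
            max_input=self.max_input,
            max_tokens=self.max_tokens,
            **kwargs,
        )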