mirror of
https://github.com/zed-industries/zed.git
synced 2024-11-10 05:37:29 +03:00
f1778dd9de
### Pull Request Title

Introduce `max_output_tokens` Field for OpenAI Models

https://platform.deepseek.com/api-docs/news/news0725/#4-8k-max_tokens-betarelease-longer-possibilities

### Description

This commit introduces a new field, `max_output_tokens`, to the OpenAI models, which specifies the maximum number of tokens that can be generated in the output. The field is integrated into request handling across multiple crates, ensuring that the output token limit is respected during language model completions. This enhancement provides more control over the output length of OpenAI model responses, improving the flexibility and accuracy of language model interactions.

### Changes

- Added `max_output_tokens` to the `Custom` variant of the `open_ai::Model` enum.
- Updated the `into_open_ai` method in `LanguageModelRequest` to accept and use `max_output_tokens`.
- Modified the `OpenAiLanguageModel` and `CloudLanguageModel` implementations to pass `max_output_tokens` when converting requests.
- Ensured that the `max_output_tokens` field is correctly serialized and deserialized in relevant structures.

### Related Issue

https://github.com/zed-industries/zed/pull/16358

### Screenshots / Media

N/A

### Checklist

- [x] Code compiles correctly.
- [x] All tests pass.
- [ ] Documentation has been updated accordingly.
- [ ] Additional tests have been added to cover new functionality.
- [ ] Relevant documentation has been updated or added.
### Release Notes

- Added `max_output_tokens` field to OpenAI models for controlling output token length.