Description: I am receiving an error when trying to use the GPT-5-mini model. The system is attempting to use the legacy max_tokens parameter, but the GPT-5 API now requires max_completion_tokens instead.
Error Message: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
Technical Context: With the release of GPT-5, OpenAI has shifted to a reasoning-based architecture. These models require max_completion_tokens so that a single limit covers both the hidden "reasoning tokens" (Chain of Thought) and the "visible output tokens". Any API call that still uses the old max_tokens key is rejected by the server.
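For reference, here is a minimal sketch of what the corrected request looks like, assuming the official openai Python SDK (1.x); the model name comes from this report and the token limit is illustrative:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-5-mini",
    messages=[{"role": "user", "content": "Hello"}],
    # max_tokens=1024             # rejected by the reasoning models
    max_completion_tokens=1024,   # caps reasoning + visible output tokens together
)
print(response.choices[0].message.content)
```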
Requested Fix: Please update the model configuration to use max_completion_tokens whenever any GPT-5 model is selected to ensure compatibility with the new API requirements.
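One possible shape for that change, as a rough sketch only: the helper name and the model-prefix check are assumptions about how the configuration code is organized, not its actual structure.

```python
# Hypothetical sketch of the requested fix: pick the token-limit parameter
# based on the selected model name. build_completion_kwargs is illustrative,
# not an existing function in the codebase.
def build_completion_kwargs(model: str, token_limit: int) -> dict:
    kwargs = {"model": model}
    if model.lower().startswith("gpt-5"):
        # GPT-5 family: the server rejects `max_tokens` and expects `max_completion_tokens`.
        kwargs["max_completion_tokens"] = token_limit
    else:
        # Older chat models keep the legacy parameter.
        kwargs["max_tokens"] = token_limit
    return kwargs

# Usage (illustrative):
# client.chat.completions.create(messages=messages,
#                                **build_completion_kwargs("gpt-5-mini", 1024))
```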

In Review
Bug Reports
15 days ago

Leandro Teixeira