Adjust baibot's openai-config.yml.j2 to avoid max_response_tokens if unspecified

Reasoning models such as `o1` and `o3` and their `-mini` variants
return an error if we configure `max_response_tokens` (which
ultimately sets the `max_tokens` field in the API request):

> invalid_request_error: Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead. (param: max_tokens) (code: unsupported_parameter)

`max_completion_tokens` is not yet supported by baibot, so the best we
can do for now is omit `max_response_tokens` (and thus `max_tokens`) from
the generated config when it is not explicitly set (see the sketch below).

Ref: db9422740c
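
With this change, a host that uses a reasoning model can simply leave the variable empty so the key is dropped from the rendered config. A minimal sketch of the relevant vars.yml line (the variable name comes from the template below; using an empty value to disable it is an assumption, not something this commit prescribes):

```yaml
# vars.yml (sketch): an empty value makes the new {% if %} guard skip
# the max_response_tokens line in the rendered openai-config.yml.
matrix_bot_baibot_config_agents_static_definitions_openai_config_text_generation_max_response_tokens: ''
```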
Author: Slavi Pantaleev
Date: 2025-02-01 07:56:06 +02:00
Parent: efcac431bd
Commit: 8889b018f3


@@ -8,7 +8,9 @@ text_generation:
   model_id: {{ matrix_bot_baibot_config_agents_static_definitions_openai_config_text_generation_model_id | to_json }}
   prompt: {{ matrix_bot_baibot_config_agents_static_definitions_openai_config_text_generation_prompt | to_json }}
   temperature: {{ matrix_bot_baibot_config_agents_static_definitions_openai_config_text_generation_temperature | to_json }}
+{% if matrix_bot_baibot_config_agents_static_definitions_openai_config_text_generation_max_response_tokens %}
   max_response_tokens: {{ matrix_bot_baibot_config_agents_static_definitions_openai_config_text_generation_max_response_tokens | int | to_json }}
+{% endif %}
   max_context_tokens: {{ matrix_bot_baibot_config_agents_static_definitions_openai_config_text_generation_max_context_tokens | int | to_json }}
 {% endif %}
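
For illustration, a sketch of what the guarded section renders to when the variable is left empty (the values shown are illustrative, not defaults from this role):

```yaml
# Rendered openai-config.yml fragment with max_response_tokens unset (sketch):
text_generation:
  model_id: "o3-mini"                     # illustrative value
  prompt: "You are a helpful assistant."  # illustrative value
  temperature: 1.0
  max_context_tokens: 128000
  # max_response_tokens is omitted entirely, so no max_tokens is sent to the API
```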