AgentScope-Java is an open-source project. To involve a broader community, we recommend asking your questions in English.
Describe the bug
Due to the change in vllm-project/vllm#27755, vLLM renamed the reasoning response field from "reasoning_content" to "reasoning" to follow the latest OpenAI specification. As a result, field parsing in the OpenAIMessage class breaks and the reasoning content cannot be found.
To Reproduce
Steps to reproduce the behavior:
Deploy qwen3.5 with the latest version of vLLM.
Expected behavior
The reasoning content should be returned correctly.
Error messages
Response from older vLLM versions:
"choices":[{"index":0,"delta":{"content":"\n","reasoning_content":null},"logprobs":null,"finish_reason":null,"token_ids":null}]
Response from the new vLLM version:
"choices":[{"index":0,"delta":{"reasoning":"\n"},"logprobs":null,"finish_reason":null,"token_ids":null}]
Environment (please complete the following information):
- AgentScope-Java Version: 1.0.9
- Java Version: 17
- OS: macos