
[Bug]: When deploying qwen3.5 using the latest version of vllm, the thought content cannot be displayed. #848

@sunxiang0918

Description

AgentScope-Java is an open-source project. To involve a broader community, we recommend asking your questions in English.

Describe the bug
Due to the change in vllm-project/vllm#27755, vLLM now follows the latest OpenAI specification and renamed the reasoning response field from "reasoning_content" to "reasoning". This breaks field parsing in the OpenAIMessage class: the reasoning content can no longer be found.

To Reproduce
Steps to reproduce the behavior:

Deploy qwen3.5 with the latest version of vLLM.

Expected behavior
The reasoning content should be returned correctly.

Error messages

Response from older vLLM versions:
"choices":[{"index":0,"delta":{"content":"\n","reasoning_content":null},"logprobs":null,"finish_reason":null,"token_ids":null}]

Response from the new vLLM version:
"choices":[{"index":0,"delta":{"reasoning":"\n"},"logprobs":null,"finish_reason":null,"token_ids":null}]
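Until OpenAIMessage accepts the new field name, one way to stay compatible with both server versions is to read the old key first and fall back to the new one. The sketch below is only an illustration: the helper name and the Map-based delta are assumptions, not AgentScope-Java API. If the project deserializes with Jackson, annotating the existing reasoning field with `@JsonAlias("reasoning")` would likewise accept both spellings.

```java
import java.util.Map;

public class ReasoningFieldCompat {

    // Hypothetical helper: older vLLM emits "reasoning_content" in each
    // streamed delta; newer vLLM (after vllm-project/vllm#27755) emits
    // "reasoning". Prefer the old key, fall back to the new one.
    static String extractReasoning(Map<String, Object> delta) {
        Object value = delta.get("reasoning_content");
        if (value == null) {
            value = delta.get("reasoning");
        }
        return value == null ? null : value.toString();
    }

    public static void main(String[] args) {
        // Delta shaped like the new vLLM response above.
        Map<String, Object> newDelta = Map.of("reasoning", "\n");
        System.out.println(extractReasoning(newDelta) != null); // true
    }
}
```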

Environment (please complete the following information):

  • AgentScope-Java Version: 1.0.9
  • Java Version: 17
  • OS: macos

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)


    Status

    Done
