[Bug] Adding an embedding model via vLLM fails with an error, while adding an LLM the same way works #4990

@Markfaye

Description

Contact Information

No response

MaxKB Version

v2.10.7

Problem Description

The vLLM deployment uses separate ports: 8000 serves the LLM and 8005 serves the embedding model.
Tested from inside the MaxKB container; the vLLM server itself responds correctly (see the log output below), and there are no network-level restrictions.
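Listing /v1/models only proves the server is reachable; it does not exercise the embeddings path that MaxKB actually calls. A direct check of that endpoint (assuming the same host and port as in this report, and the standard OpenAI-compatible request shape vLLM serves) could look like:

```shell
# Call the OpenAI-compatible embeddings endpoint directly from inside the
# MaxKB container; host/port taken from this report.
curl -s http://10.112.76.90:8005/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{"model": "BAAI/bge-m3", "input": "connectivity test"}'
# A healthy server returns a JSON body containing an "embedding" array;
# an error here would point at vLLM rather than MaxKB.
```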

Steps to Reproduce

(screenshots attached in the original issue)

The expected correct result

No response

Related log output

[devadm@VDPVNCAIAPPS02 ~]$ curl http://10.112.76.90:8005/v1/models
{"object":"list","data":[{"id":"BAAI/bge-m3","object":"model","created":1773817148,"owned_by":"vllm","root":"BAAI/bge-m3","parent":null,"max_model_len":8192,"permission":[{"id":"modelperm-9fb80455a9e99d23","object":"model_permission","created":1773817148,"allow_create_engine":false,"allow_sampling":true,"allow_logprobs":true,"allow_search_indices":false,"allow_view":true,"allow_fine_tuning":false,"organization":"*","group":null,"is_blocking":false}]}]}
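As a sanity check, the /v1/models response above can be parsed to confirm the model id MaxKB must match exactly when the embedding model is added. A minimal sketch, using the response body captured in this log:

```python
import json

# The /v1/models response captured from the vLLM embedding server (port 8005).
response_body = '''{"object":"list","data":[{"id":"BAAI/bge-m3","object":"model","created":1773817148,"owned_by":"vllm","root":"BAAI/bge-m3","parent":null,"max_model_len":8192,"permission":[{"id":"modelperm-9fb80455a9e99d23","object":"model_permission","created":1773817148,"allow_create_engine":false,"allow_sampling":true,"allow_logprobs":true,"allow_search_indices":false,"allow_view":true,"allow_fine_tuning":false,"organization":"*","group":null,"is_blocking":false}]}]}'''

models = json.loads(response_body)["data"]
# The "base model" field configured in MaxKB must match one of these ids exactly.
for m in models:
    print(m["id"], m["owned_by"], m["max_model_len"])
```

This prints `BAAI/bge-m3 vllm 8192`, so the server advertises the model correctly; the failure must occur on the MaxKB side of the request.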

Additional Information

No response
