[Bug] Adding an embedding model served by vLLM fails with an error; adding an LLM the same way works #4990
Closed
Description
Contact Information
No response
MaxKB Version
v2.10.7
Problem Description
The vLLM deployment splits models by port: 8000 serves the LLM, 8005 serves the embedding model.

Testing from inside the MaxKB container shows the vLLM server itself is fine:
There are no network-level restrictions.
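For reference, the `/v1/models` response captured in the related log output can be sanity-checked programmatically. A minimal sketch, with the JSON copied from the reporter's curl output (the `permission` details are trimmed for brevity):

```python
import json

# /v1/models response as returned by the vLLM embedding endpoint
# (copied from the curl output in the log; "permission" fields omitted).
response = json.loads("""
{"object": "list",
 "data": [{"id": "BAAI/bge-m3", "object": "model", "created": 1773817148,
           "owned_by": "vllm", "root": "BAAI/bge-m3", "parent": null,
           "max_model_len": 8192}]}
""")

# Map each served model id to its context length.
served = {m["id"]: m["max_model_len"] for m in response["data"]}
print(served)  # {'BAAI/bge-m3': 8192}
```

This confirms the embedding model `BAAI/bge-m3` is listed by the server, so the failure appears to be on the MaxKB side rather than in vLLM.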
Steps to Reproduce
The expected correct result
No response
Related log output
[devadm@VDPVNCAIAPPS02 ~]$ curl http://10.112.76.90:8005/v1/models
{"object":"list","data":[{"id":"BAAI/bge-m3","object":"model","created":1773817148,"owned_by":"vllm","root":"BAAI/bge-m3","parent":null,"max_model_len":8192,"permission":[{"id":"modelperm-9fb80455a9e99d23","object":"model_permission","created":1773817148,"allow_create_engine":false,"allow_sampling":true,"allow_logprobs":true,"allow_search_indices":false,"allow_view":true,"allow_fine_tuning":false,"organization":"*","group":null,"is_blocking":false}]}]}
Additional Information
No response
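To narrow the bug down further, the embeddings route itself can be probed directly, bypassing MaxKB. A hypothetical sketch, assuming the OpenAI-compatible `/v1/embeddings` path that vLLM exposes on the same host and port as `/v1/models`; the model id and URL come from the report, the input text is made up:

```python
import json

# Assumed endpoint: same host/port as the /v1/models check above.
url = "http://10.112.76.90:8005/v1/embeddings"

# OpenAI-compatible embeddings request body.
payload = {"model": "BAAI/bge-m3", "input": ["connectivity test"]}

# Equivalent probe from inside the MaxKB container (not executed here):
#   curl http://10.112.76.90:8005/v1/embeddings \
#     -H "Content-Type: application/json" \
#     -d '{"model": "BAAI/bge-m3", "input": ["connectivity test"]}'
print(json.dumps(payload))
```

If this request succeeds from inside the container while MaxKB still reports an error, the problem is likely in how MaxKB builds or routes the embedding request, not in the vLLM service.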