6 changes: 3 additions & 3 deletions pyproject.toml
Original file line number Diff line number Diff line change
Expand Up @@ -28,9 +28,9 @@ dependencies = [
# Used by authentication/k8s integration
"kubernetes>=30.1.0",
# Used to call Llama Stack APIs
"llama-stack==0.4.3",
"llama-stack-client==0.4.3",
"llama-stack-api==0.4.4",
"llama-stack==0.5.2",
"llama-stack-client==0.5.2",
"llama-stack-api==0.5.2",
Comment on lines +31 to +33

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

llama-stack-api 0.5.2 PyPI package availability

💡 Result:

As of March 6, 2026, llama-stack-api==0.5.2 is not available on PyPI. The PyPI release history for llama-stack-api shows the latest published version as 0.5.1 (Feb 19, 2026), with earlier releases like 0.5.0, 0.4.5, etc., and no 0.5.2 entry. [1]

What you can do

  • Install the latest available: pip install llama-stack-api==0.5.1 (or omit the version to get latest). [1]
  • If you specifically need 0.5.2, it likely hasn’t been published to PyPI yet (you’d need to install from source/tag instead). [1]

Sources
[1] PyPI project page / release history for llama-stack-api (shows latest as 0.5.1, no 0.5.2) (pypi.org)

Fix version pinning: llama-stack-api==0.5.2 does not exist on PyPI.

llama-stack-api==0.5.2 is not available on PyPI. The latest published version is 0.5.1 (Feb 19, 2026). Update line 33 to use llama-stack-api==0.5.1 or check if the other llama-stack packages (llama-stack and llama-stack-client) should also be downgraded to 0.5.1 for compatibility.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@pyproject.toml` around lines 31 - 33, The pyproject dependency list pins
"llama-stack-api==0.5.2" which does not exist on PyPI; update the package spec
to a valid version (e.g., change "llama-stack-api==0.5.2" to
"llama-stack-api==0.5.1") and decide whether to align "llama-stack==0.5.2" and
"llama-stack-client==0.5.2" to 0.5.1 for compatibility—ensure all three package
entries ("llama-stack", "llama-stack-client", "llama-stack-api") reference
existing matching versions.

# Used by Logger
"rich>=14.0.0",
# Used by JWK token auth handler
Expand Down
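The review comment asks that all three llama-stack packages reference existing, matching versions. A minimal sketch of that alignment check, using the dependency pins exactly as they appear in this PR's `pyproject.toml` (the regex-based parser is illustrative, not part of the project):

```python
# Sketch: verify the three llama-stack pins share one version, as the
# review comment recommends. Dependency strings copied from this PR.
import re

dependencies = [
    "llama-stack==0.5.2",
    "llama-stack-client==0.5.2",
    "llama-stack-api==0.5.2",
]

def pinned_versions(deps: list[str]) -> dict[str, str]:
    """Map package name -> exact version for '==' pins only."""
    pins = {}
    for spec in deps:
        m = re.fullmatch(r"([A-Za-z0-9_.-]+)==([0-9][0-9A-Za-z.]*)", spec)
        if m:
            pins[m.group(1)] = m.group(2)
    return pins

pins = pinned_versions(dependencies)
# All three packages should sit on the same release line.
assert len(set(pins.values())) == 1, f"mismatched llama-stack pins: {pins}"
```

A check like this could run in CI so a partial bump (e.g. two packages on 0.5.2 and one left behind) fails fast instead of surfacing as an install error.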
2 changes: 1 addition & 1 deletion src/constants.py
Original file line number Diff line number Diff line change
Expand Up @@ -2,7 +2,7 @@

# Minimal and maximal supported Llama Stack version
MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.4.3"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.2"

UNABLE_TO_PROCESS_RESPONSE = "Unable to process this request"

Expand Down
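The two constants above define an inclusive supported range. A hedged sketch of how such a gate can be enforced; the tuple-based parser is illustrative, not the service's actual comparison code:

```python
# Sketch: range check against the min/max constants from src/constants.py.
# The version_tuple helper is a hypothetical stand-in for a real comparator.
MINIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.2.17"
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.2"

def version_tuple(version: str) -> tuple[int, ...]:
    """Turn 'X.Y.Z' into a tuple of ints for numeric comparison."""
    return tuple(int(part) for part in version.split("."))

def is_supported(version: str) -> bool:
    """True when version falls inside the supported [min, max] range."""
    return (
        version_tuple(MINIMAL_SUPPORTED_LLAMA_STACK_VERSION)
        <= version_tuple(version)
        <= version_tuple(MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION)
    )
```

Numeric tuples avoid the classic string-comparison trap where `"0.10.0" < "0.9.0"` lexicographically.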
7 changes: 4 additions & 3 deletions src/utils/responses.py
Original file line number Diff line number Diff line change
Expand Up @@ -1079,15 +1079,16 @@ def _extract_text_from_content(

text_fragments: list[str] = []
for part in content:
if part.type == "input_text":
part_type = getattr(part, "type", None)
if part_type == "input_text":
input_text_part = cast(InputTextPart, part)
if input_text_part.text:
text_fragments.append(input_text_part.text.strip())
elif part.type == "output_text":
elif part_type == "output_text":
output_text_part = cast(OutputTextPart, part)
if output_text_part.text:
text_fragments.append(output_text_part.text.strip())
elif part.type == "refusal":
elif part_type == "refusal":
refusal_part = cast(ContentPartRefusal, part)
if refusal_part.refusal:
text_fragments.append(refusal_part.refusal.strip())
Expand Down
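The diff above replaces direct `part.type` access with `getattr(part, "type", None)`, so content parts lacking a `type` attribute are skipped instead of raising `AttributeError`. A minimal sketch of the effect, using stand-in part classes rather than the real llama-stack types:

```python
# Sketch: why getattr(part, "type", None) is safer than part.type.
# These dataclasses are hypothetical stand-ins for llama-stack content parts.
from dataclasses import dataclass

@dataclass
class InputTextPart:
    type: str
    text: str

@dataclass
class OpaquePart:
    payload: bytes  # note: no .type attribute at all

def extract_text(content) -> str:
    fragments = []
    for part in content:
        # Returns None for untyped parts instead of raising AttributeError
        part_type = getattr(part, "type", None)
        if part_type == "input_text" and part.text:
            fragments.append(part.text.strip())
        # unknown / untyped parts are silently ignored
    return " ".join(fragments)
```

With the old `part.type` access, the `OpaquePart` instance would abort the whole extraction loop; with `getattr` it is simply skipped.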
2 changes: 1 addition & 1 deletion tests/e2e/features/info.feature
Original file line number Diff line number Diff line change
Expand Up @@ -16,7 +16,7 @@ Feature: Info tests
When I access REST API endpoint "info" using HTTP GET method
Then The status code of the response is 200
And The body of the response has proper name Lightspeed Core Service (LCS) and version 0.4.1
And The body of the response has llama-stack version 0.4.3
And The body of the response has llama-stack version 0.5.2

@skip-in-library-mode
Scenario: Check if info endpoint reports error when llama-stack connection is not working
Expand Down