
LCORE-1285: update llama stack to 0.5.0#1112

Open
jrobertboos wants to merge 8 commits into lightspeed-core:main from jrobertboos:lcore-1285

Conversation


@jrobertboos jrobertboos commented Feb 6, 2026

Description

Updated Llama Stack to 0.5.0 to enable network configuration on providers, so that TLS and proxy support can be added.

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up service version
  • Bump-up dependent library
  • Bump-up library or tool used for development (does not change the final image)
  • CI configuration change
  • Konflux configuration change
  • Unit tests improvement
  • Integration tests improvement
  • End to end tests improvement
  • Benchmarks improvement

Tools used to create PR

Identify any AI code assistants used in this PR (for transparency and review context)

  • Assisted-by: N/A
  • Generated by: N/A

Related Tickets & Documents

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Testing

  • Please provide detailed steps to perform tests related to this code change.
  • How were the fix/results from this change verified? Please provide relevant screenshots or results.

Summary by CodeRabbit

  • Chores
    • Updated Llama Stack framework dependencies to v0.5.2 and aligned compatibility checks.
  • Bug Fixes
    • Improved robustness of response handling to avoid errors when response parts lack type information.
  • Tests
    • Updated end-to-end test expectations to reflect the updated framework version.
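The "aligned compatibility checks" mentioned above refer to the service guarding against llama-stack versions newer than it supports. A minimal, hypothetical sketch of such a guard is shown below; the constant mirrors the one bumped in src/constants.py, but the function names and comparison logic are illustrative, not the project's actual code:

```python
# Mirrors the constant updated in src/constants.py (0.4.3 -> 0.5.2).
MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION = "0.5.2"


def parse_version(version: str) -> tuple[int, ...]:
    """Split a dotted version string like '0.5.2' into comparable integers."""
    return tuple(int(part) for part in version.split("."))


def is_supported(installed: str) -> bool:
    """True if the installed llama-stack does not exceed the supported maximum."""
    return parse_version(installed) <= parse_version(
        MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION
    )
```

Tuple comparison handles multi-digit components correctly (e.g. "0.10.0" > "0.9.0"), which naive string comparison would get wrong.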


coderabbitai bot commented Feb 6, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 65adbce5-493c-4446-be9a-b53627711538

📥 Commits

Reviewing files that changed from the base of the PR and between 74eb52c and 403e6c3.

📒 Files selected for processing (1)
  • src/utils/responses.py

Walkthrough

Updated llama-stack dependency versions and corresponding version constant and test expectation; also made part type access in response formatting safer by using getattr.

Changes

  • Dependency Updates (pyproject.toml): Bumped llama-stack, llama-stack-client, and llama-stack-api to 0.5.2 (was 0.4.3 / 0.4.4).
  • Version Constants (src/constants.py): Updated MAXIMAL_SUPPORTED_LLAMA_STACK_VERSION from "0.4.3" to "0.5.2".
  • Test Updates (tests/e2e/features/info.feature): Adjusted the expected llama-stack version in the info endpoint test to 0.5.2.
  • Response Handling (src/utils/responses.py): Made access to a part's type safer by using getattr(part, "type", None) and comparing against "input_text", "output_text", and "refusal".

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

  • Title check ⚠️ Warning: The title states 'update llama stack to 0.5.0' but the actual changes bump to version 0.5.2, not 0.5.0 as claimed. Resolution: update the PR title to accurately reflect the target version: 'LCORE-1285: update llama stack to 0.5.2'.

✅ Passed checks (2 passed)

  • Description Check ✅ Passed: Check skipped because CodeRabbit's high-level summary is enabled.
  • Docstring Coverage ✅ Passed: Docstring coverage is 100.00%, which meets the required threshold of 80.00%.


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `@pyproject.toml`:
- Around line 31-33: Update the mismatched dependency for llama-stack-api:
replace "llama-stack-api==0.5.0" with the latest published version
"llama-stack-api==0.4.3" (or align all three to a consistent, released version)
so installations won't fail; locate the dependency entry for llama-stack-api in
the pyproject.toml dependency list and change the version string accordingly.

@jrobertboos jrobertboos marked this pull request as draft February 9, 2026 20:13
@jrobertboos jrobertboos marked this pull request as ready for review March 3, 2026 18:09
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/utils/responses.py`:
- Around line 1082-1084: The code currently appends part.strip() to
text_fragments unconditionally inside the branch that checks isinstance(part,
str); change this to only append when the stripped string is non-empty to avoid
inserting empty fragments from whitespace-only parts—update the branch handling
the variables part and text_fragments (where the if isinstance(part, str): ...
continue logic appears) so that you compute stripped = part.strip() and append
stripped only if stripped is truthy.

ℹ️ Review info

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 2e001ab and 60efb35.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (4)
  • pyproject.toml
  • src/constants.py
  • src/utils/responses.py
  • tests/e2e/features/info.feature
🚧 Files skipped from review as they are similar to previous changes (2)
  • tests/e2e/features/info.feature
  • pyproject.toml

Comment on lines +1082 to +1084:

```python
if isinstance(part, str):
    text_fragments.append(part.strip())
    continue
```

⚠️ Potential issue | 🟡 Minor

Avoid appending empty fragments from whitespace-only string parts.

Line 1083 appends part.strip() unconditionally. If part is whitespace-only, this adds "" and can create spacing artifacts in the final joined text.

Suggested fix:

```diff
-        if isinstance(part, str):
-            text_fragments.append(part.strip())
+        if isinstance(part, str):
+            stripped_part = part.strip()
+            if stripped_part:
+                text_fragments.append(stripped_part)
             continue
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/utils/responses.py` around lines 1082 - 1084, The code currently appends
part.strip() to text_fragments unconditionally inside the branch that checks
isinstance(part, str); change this to only append when the stripped string is
non-empty to avoid inserting empty fragments from whitespace-only parts—update
the branch handling the variables part and text_fragments (where the if
isinstance(part, str): ... continue logic appears) so that you compute stripped
= part.strip() and append stripped only if stripped is truthy.
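The spacing artifact described in this comment is easy to reproduce. A minimal, self-contained demonstration contrasting the reported behaviour with the suggested fix (function and variable names here are illustrative, not the project's actual code):

```python
def join_parts_naive(parts: list) -> str:
    """Appends every stripped string, including empty ones -- the reported bug."""
    fragments = []
    for part in parts:
        if isinstance(part, str):
            fragments.append(part.strip())
    return " ".join(fragments)


def join_parts_fixed(parts: list) -> str:
    """Skips whitespace-only parts, as the suggested fix does."""
    fragments = []
    for part in parts:
        if isinstance(part, str):
            stripped = part.strip()
            if stripped:
                fragments.append(stripped)
    return " ".join(fragments)


parts = ["hello", "   ", "world"]
# naive yields "hello  world" (double space); fixed yields "hello world"
```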

- updated from 0.5.0 -> 0.5.2
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@pyproject.toml`:
- Around line 31-33: The pyproject dependency list pins "llama-stack-api==0.5.2"
which does not exist on PyPI; update the package spec to a valid version (e.g.,
change "llama-stack-api==0.5.2" to "llama-stack-api==0.5.1") and decide whether
to align "llama-stack==0.5.2" and "llama-stack-client==0.5.2" to 0.5.1 for
compatibility—ensure all three package entries ("llama-stack",
"llama-stack-client", "llama-stack-api") reference existing matching versions.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 3c97ce73-2d75-4d10-94cb-06ad82568e92

📥 Commits

Reviewing files that changed from the base of the PR and between 60efb35 and 74eb52c.

⛔ Files ignored due to path filters (1)
  • uv.lock is excluded by !**/*.lock
📒 Files selected for processing (3)
  • pyproject.toml
  • src/constants.py
  • tests/e2e/features/info.feature
🚧 Files skipped from review as they are similar to previous changes (1)
  • src/constants.py

Comment on lines +31 to +33:

```toml
"llama-stack==0.5.2",
"llama-stack-client==0.5.2",
"llama-stack-api==0.5.2",
```

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🌐 Web query:

llama-stack-api 0.5.2 PyPI package availability

💡 Result:

As of March 6, 2026, llama-stack-api==0.5.2 is not available on PyPI. The PyPI release history for llama-stack-api shows the latest published version as 0.5.1 (Feb 19, 2026), with earlier releases like 0.5.0, 0.4.5, etc., and no 0.5.2 entry. [1]

What you can do

  • Install the latest available: pip install llama-stack-api==0.5.1 (or omit the version to get latest). [1]
  • If you specifically need 0.5.2, it likely hasn’t been published to PyPI yet (you’d need to install from source/tag instead). [1]

Sources
[1] PyPI project page / release history for llama-stack-api (shows latest as 0.5.1, no 0.5.2) (pypi.org)


Fix version pinning: llama-stack-api==0.5.2 does not exist on PyPI.

llama-stack-api==0.5.2 is not available on PyPI. The latest published version is 0.5.1 (Feb 19, 2026). Update line 33 to use llama-stack-api==0.5.1 or check if the other llama-stack packages (llama-stack and llama-stack-client) should also be downgraded to 0.5.1 for compatibility.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@pyproject.toml` around lines 31 - 33, The pyproject dependency list pins
"llama-stack-api==0.5.2" which does not exist on PyPI; update the package spec
to a valid version (e.g., change "llama-stack-api==0.5.2" to
"llama-stack-api==0.5.1") and decide whether to align "llama-stack==0.5.2" and
"llama-stack-client==0.5.2" to 0.5.1 for compatibility—ensure all three package
entries ("llama-stack", "llama-stack-client", "llama-stack-api") reference
existing matching versions.
