LCORE-1285: update llama stack to 0.5.0 #1112
jrobertboos wants to merge 8 commits into lightspeed-core:main
Conversation
…api; adjust constants and tests accordingly
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
Walkthrough
Updated llama-stack dependency versions and the corresponding version constant and test expectation; also made part type access in response formatting safer by using getattr.

Changes
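The "safer part type access by using getattr" mentioned in the walkthrough can be sketched as follows. This is a minimal illustration, not the project's actual code: the class names and the `type` attribute are assumptions stand-ing in for whatever part objects `src/utils/responses.py` actually handles.

```python
# Hypothetical sketch: reading an optional attribute defensively.
# getattr with a default degrades gracefully instead of raising
# AttributeError when a part object lacks the attribute.
class TextPart:
    type = "output_text"

class UnknownPart:
    pass  # no "type" attribute at all

def part_type(part):
    # Returns None instead of raising when "type" is missing.
    return getattr(part, "type", None)

print(part_type(TextPart()))     # → output_text
print(part_type(UnknownPart()))  # → None
```

The benefit over `part.type` is that parts of unexpected shape fall through to the `None` branch rather than crashing response formatting.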
Estimated code review effort: 🎯 2 (Simple) | ⏱️ ~10 minutes
🚥 Pre-merge checks: ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 1
🤖 Fix all issues with AI agents
In `@pyproject.toml`:
- Around lines 31-33: Update the mismatched llama-stack-api dependency: replace "llama-stack-api==0.5.0" with the latest published version, "llama-stack-api==0.4.3" (or align all three packages to a consistent, released version) so installations won't fail. Locate the llama-stack-api entry in the pyproject.toml dependency list and change the version string accordingly.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/utils/responses.py`:
- Around lines 1082-1084: The code currently appends part.strip() to text_fragments unconditionally inside the branch that checks isinstance(part, str). Change this to append only when the stripped string is non-empty, to avoid inserting empty fragments from whitespace-only parts. In the branch handling part and text_fragments (where the `if isinstance(part, str): ... continue` logic appears), compute `stripped = part.strip()` and append `stripped` only if it is truthy.
ℹ️ Review info
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (4)
- pyproject.toml
- src/constants.py
- src/utils/responses.py
- tests/e2e/features/info.feature
🚧 Files skipped from review as they are similar to previous changes (2)
- tests/e2e/features/info.feature
- pyproject.toml
src/utils/responses.py
Outdated
```python
if isinstance(part, str):
    text_fragments.append(part.strip())
    continue
```
Avoid appending empty fragments from whitespace-only string parts.
Line 1083 appends part.strip() unconditionally. If part is whitespace-only, this adds "" and can create spacing artifacts in the final joined text.
Suggested fix

```diff
 if isinstance(part, str):
-    text_fragments.append(part.strip())
+    stripped_part = part.strip()
+    if stripped_part:
+        text_fragments.append(stripped_part)
     continue
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/utils/responses.py` around lines 1082-1084: The code currently appends part.strip() to text_fragments unconditionally inside the branch that checks isinstance(part, str). Change this to append only when the stripped string is non-empty, to avoid inserting empty fragments from whitespace-only parts. In the branch handling part and text_fragments (where the `if isinstance(part, str): ... continue` logic appears), compute `stripped = part.strip()` and append `stripped` only if it is truthy.
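As a runnable illustration of the suggested behavior, here is a minimal sketch. The function name and the surrounding loop are assumptions for the example; only the variables `part` and `text_fragments` and the strip-and-filter logic come from the review comment.

```python
# Sketch of the suggested fix: collect only non-empty stripped string
# parts, so whitespace-only parts never add "" to the joined output.
def collect_text_fragments(parts):
    text_fragments = []
    for part in parts:
        if isinstance(part, str):
            stripped = part.strip()
            if stripped:  # skip whitespace-only parts
                text_fragments.append(stripped)
            continue
        # non-string parts would be handled here in the real code
    return text_fragments

print(collect_text_fragments(["hello", "   ", "\n", " world "]))
# → ['hello', 'world']
```

With the unconditional `append(part.strip())`, the same input would yield `['hello', '', '', 'world']`, and joining with a separator would produce visible spacing artifacts.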
- updated from 0.5.0 -> 0.5.2
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@pyproject.toml`:
- Around lines 31-33: The pyproject dependency list pins "llama-stack-api==0.5.2", which does not exist on PyPI. Update the package spec to a valid version (e.g., change "llama-stack-api==0.5.2" to "llama-stack-api==0.5.1") and decide whether to align "llama-stack==0.5.2" and "llama-stack-client==0.5.2" to 0.5.1 for compatibility. Ensure all three package entries ("llama-stack", "llama-stack-client", "llama-stack-api") reference existing, matching versions.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 3c97ce73-2d75-4d10-94cb-06ad82568e92
⛔ Files ignored due to path filters (1)
uv.lock is excluded by !**/*.lock
📒 Files selected for processing (3)
- pyproject.toml
- src/constants.py
- tests/e2e/features/info.feature
🚧 Files skipped from review as they are similar to previous changes (1)
- src/constants.py
| "llama-stack==0.5.2", | ||
| "llama-stack-client==0.5.2", | ||
| "llama-stack-api==0.5.2", |
🧩 Analysis chain
🌐 Web query:
llama-stack-api 0.5.2 PyPI package availability
💡 Result:
As of March 6, 2026, llama-stack-api==0.5.2 is not available on PyPI. The PyPI release history for llama-stack-api shows the latest published version as 0.5.1 (Feb 19, 2026), with earlier releases like 0.5.0, 0.4.5, etc., and no 0.5.2 entry. [1]
What you can do
- Install the latest available: `pip install llama-stack-api==0.5.1` (or omit the version to get the latest). [1]
- If you specifically need 0.5.2, it likely hasn't been published to PyPI yet (you'd need to install from source/tag instead). [1]
Sources
[1] PyPI project page / release history for llama-stack-api (shows latest as 0.5.1, no 0.5.2) (pypi.org)
Fix version pinning: llama-stack-api==0.5.2 does not exist on PyPI.
llama-stack-api==0.5.2 is not available on PyPI. The latest published version is 0.5.1 (Feb 19, 2026). Update line 33 to use llama-stack-api==0.5.1 or check if the other llama-stack packages (llama-stack and llama-stack-client) should also be downgraded to 0.5.1 for compatibility.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@pyproject.toml` around lines 31-33: The pyproject dependency list pins "llama-stack-api==0.5.2", which does not exist on PyPI. Update the package spec to a valid version (e.g., change "llama-stack-api==0.5.2" to "llama-stack-api==0.5.1") and decide whether to align "llama-stack==0.5.2" and "llama-stack-client==0.5.2" to 0.5.1 for compatibility. Ensure all three package entries ("llama-stack", "llama-stack-client", "llama-stack-api") reference existing, matching versions.
Description
Updated Llama Stack to 0.5.0 in order to enable the network configuration on providers so that TLS and Proxy support can be added.
Type of change
Tools used to create PR
Identify any AI code assistants used in this PR (for transparency and review context)
Related Tickets & Documents
Checklist before requesting a review
Testing
Summary by CodeRabbit