OLS-1958: Correct OpenAPI spec violations in docs/openapi.json, add OpenAPI linting to CI #1565
syedriko wants to merge 1 commit into lightspeed-core:main
Conversation
Walkthrough

Adds Spectral OpenAPI linting (CI workflow and Make target), configures Spectral rules, centralizes OpenAPI tags in the app and includes them in the generated schema, and splits the /a2a endpoint into dedicated GET and POST handlers.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client as Client
    participant Route as FastAPI Route (GET/POST /a2a)
    participant Auth as Auth Dependency
    participant MCP as mcp_headers Dependency
    participant Handler as _handle_a2a_jsonrpc
    participant Response as Response / StreamingResponse
    Client->>Route: HTTP GET/POST /a2a
    Route->>Auth: resolve auth dependency
    Auth-->>Route: AuthTuple
    Route->>MCP: resolve mcp headers
    MCP-->>Route: McpHeaders
    Route->>Handler: call _handle_a2a_jsonrpc(request, AuthTuple, McpHeaders)
    Handler->>Handler: process JSON-RPC, detect streaming or non-streaming
    Handler-->>Response: return Response or StreamingResponse
    Response-->>Client: HTTP response (JSON or text/event-stream)
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes

🚥 Pre-merge checks | ✅ 5/5 passed
Actionable comments posted: 5
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/openapi_spectral.yaml:
- Around line 3-5: The workflow is triggering twice because the top-level "on"
includes both "push" and "pull_request" with no branch filters; update the "on"
configuration so "push" is scoped (e.g., set "push" to only run on protected
branches like main or tags) while keeping "pull_request" for PR checks to avoid
duplicate runs — modify the "on" block (the "push" and "pull_request" keys in
.github/workflows/openapi_spectral.yaml) to include appropriate branch filters
such as push: branches: [main] or pull_request: branches: [main], or use
branches-ignore on push to exclude feature branches.
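One possible shape for that scoped trigger block, as a sketch (the branch name is illustrative; adjust it to the repository's protected branches):

```yaml
# .github/workflows/openapi_spectral.yaml — trigger section only
on:
  push:
    branches: [main]   # run on pushes to protected branches only
  pull_request:        # keep unfiltered PR checks
```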
In `@CONTRIBUTING.md`:
- Around line 221-223: Update the Prerequisites / Tooling section to list
Node.js (and npx) as a required prerequisite so contributors know it's needed
for make verify; mention that make verify runs the lint-openapi target which
invokes `npx @stoplight/spectral-cli@6`, and add a short note or link on how to
install Node/npx (or the minimum Node version) so contributors won't see "npx:
command not found" when running make verify.
In `@docs/contributing/openapi-tags-and-spectral.md`:
- Line 5: The sentence in docs/contributing/openapi-tags-and-spectral.md is too
dense; split it into two sentences for clarity: first explain that FastAPI
assigns route tags via APIRouter(tags=[...]) in src/app/endpoints/, then explain
the OpenAPI requirement and our solution—maintaining _OPENAPI_TAGS in
src/app/main.py and passing it as openapi_tags into the FastAPI app so
Spectral's operation-tag-defined rule passes. Keep references to
APIRouter(tags=[...]), src/app/endpoints/, _OPENAPI_TAGS, src/app/main.py, and
openapi_tags in the rewritten text.
In `@docs/openapi.json`:
- Around line 9938-9939: Update the public OpenAPI texts to use standard
JSON-RPC casing and remove implementation details: change the summary field
"Handle A2A Jsonrpc Get" to "Handle A2A JSON-RPC GET" and replace the
description that says "Handle GET on ``/a2a`` for A2A JSON-RPC (same handler as
POST)." with a concise client-focused sentence such as "Handle GET on /a2a to
perform a JSON-RPC request; returns JSON-RPC formatted responses."—edit the
summary and description values (the JSON keys shown in the diff) and apply the
same change at the other occurrence around lines 9956–9958.
In `@Makefile`:
- Around line 116-126: The verify target now depends on lint-openapi which
shells out to npx, adding an implicit Node/npx requirement; either document
Node/npx in CONTRIBUTING.md tooling or make lint-openapi tolerate missing npx so
local make verify won't fail—update the Makefile's lint-openapi target to check
for npx (using command -v npx or similar) and skip with an explanatory message
if not found, while leaving the CI workflow that runs the Spectral lint
unchanged so CI still enforces the check.
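One way to implement that guard, as a sketch (the Spectral invocation and `--fail-severity` flag mirror the error-severity lint command described elsewhere in this review; the exact recipe in the real Makefile may differ):

```make
# Skip gracefully when Node/npx is unavailable locally; CI still enforces the check.
lint-openapi:
	@if command -v npx >/dev/null 2>&1; then \
		npx --yes @stoplight/spectral-cli@6 lint --fail-severity=error docs/openapi.json; \
	else \
		echo "npx not found; skipping OpenAPI lint (CI will run it)"; \
	fi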
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: e9710063-bf71-404d-a65e-296270bb9913
📒 Files selected for processing (10)
.github/workflows/openapi_spectral.yaml, .spectral.yaml, CONTRIBUTING.md, Makefile, docs/contributing/openapi-tags-and-spectral.md, docs/openapi.json, scripts/generate_openapi_schema.py, src/app/endpoints/a2a.py, src/app/main.py, src/models/responses.py
📜 Review details
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (14)
- GitHub Check: unit_tests (3.12)
- GitHub Check: mypy
- GitHub Check: radon
- GitHub Check: black
- GitHub Check: bandit
- GitHub Check: build-pr
- GitHub Check: E2E: server mode / ci / group 1
- GitHub Check: E2E: server mode / ci / group 3
- GitHub Check: E2E: server mode / ci / group 2
- GitHub Check: E2E: library mode / ci / group 1
- GitHub Check: E2E Tests for Lightspeed Evaluation job
- GitHub Check: E2E: library mode / ci / group 2
- GitHub Check: Pylinter
- GitHub Check: E2E: library mode / ci / group 3
🧰 Additional context used
📓 Path-based instructions (3)
**/*.py

📄 CodeRabbit inference engine (AGENTS.md)

**/*.py: Use absolute imports for internal modules: from authentication import get_auth_dependency
Import FastAPI dependencies with: from fastapi import APIRouter, HTTPException, Request, status, Depends
Import Llama Stack client with: from llama_stack_client import AsyncLlamaStackClient
Check constants.py for shared constants before defining new ones
All modules start with descriptive docstrings explaining purpose
Use logger = get_logger(__name__) from log.py for module logging
Type aliases defined at module level for clarity
Use Final[type] as type hint for all constants
All functions require docstrings with brief descriptions
Complete type annotations for parameters and return types in functions
Use typing_extensions.Self for model validators in Pydantic models
Use modern union type syntax str | int instead of Union[str, int]
Use Optional[Type] for optional type hints
Use snake_case with descriptive, action-oriented function names (get_, validate_, check_)
Avoid in-place parameter modification anti-patterns; return new data structures instead
Use async def for I/O operations and external API calls
Handle APIConnectionError from Llama Stack in error handling
Use standard log levels with clear purposes: debug, info, warning, error
All classes require descriptive docstrings explaining purpose
Use PascalCase for class names with standard suffixes: Configuration, Error/Exception, Resolver, Interface
Use ABC for abstract base classes with @abstractmethod decorators
Use @model_validator and @field_validator for Pydantic model validation
Complete type annotations for all class attributes; use specific types, not Any
Follow Google Python docstring conventions with Parameters, Returns, Raises, and Attributes sections

Files: scripts/generate_openapi_schema.py, src/models/responses.py, src/app/main.py, src/app/endpoints/a2a.py
src/**/*.py

📄 CodeRabbit inference engine (AGENTS.md)

Pydantic models extend ConfigurationBase for config, BaseModel for data models

Files: src/models/responses.py, src/app/main.py, src/app/endpoints/a2a.py

src/app/endpoints/**/*.py

📄 CodeRabbit inference engine (AGENTS.md)

Use FastAPI HTTPException with appropriate status codes for API endpoints

Files: src/app/endpoints/a2a.py
🧠 Learnings (7)
📚 Learning: 2026-04-15T18:53:54.785Z
Learnt from: Lifto
Repo: lightspeed-core/lightspeed-stack PR: 1510
File: src/models/requests.py:742-769
Timestamp: 2026-04-15T18:53:54.785Z
Learning: In `src/models/requests.py`, keep `ResponsesRequest.validate_body_size` using `json.dumps(values)` with Python defaults (including ASCII escaping and the default separators). This default formatting is intentional to make the 65,536-character limit conservatively strict (accounting for small encoding/format overhead). Do not recommend changing it to `ensure_ascii=False` or using compact separators for this validator, since an exact wire-format byte match with the client payload is not achievable via `json.dumps` anyway.
Applied to files:
.spectral.yaml
📚 Learning: 2026-04-20T15:09:45.119Z
Learnt from: major
Repo: lightspeed-core/lightspeed-stack PR: 1548
File: src/app/endpoints/rlsapi_v1.py:56-56
Timestamp: 2026-04-20T15:09:45.119Z
Learning: In `src/app/endpoints/rlsapi_v1.py`, treat the line `_get_rh_identity_context = get_rh_identity_context` as an intentional temporary backward-compatibility shim (introduced for PR `#1548`, Splunk HEC telemetry work). Do not flag it as dead/unnecessary/unused code during review until the planned part 3 is merged and the responses endpoint is fully wired up such that no tests or external consumers reference the underscore-prefixed name.
Applied to files:
.spectral.yaml
📚 Learning: 2026-03-02T15:57:15.929Z
Learnt from: asamal4
Repo: lightspeed-core/lightspeed-stack PR: 1250
File: AGENTS.md:11-12
Timestamp: 2026-03-02T15:57:15.929Z
Learning: In lightspeed-stack's AGENTS.md, the Linting Tools section lists actual tool names (e.g., mypy, pylint, pyright, ruff, black), not make target names. For example, `check-types` is a make target that runs mypy, but mypy is the tool name documented in the Linting Tools section.
Applied to files:
Makefile, CONTRIBUTING.md
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv run make verify` to run all linters (black, pylint, pyright, ruff, docstyle, check-types)
Applied to files:
Makefile, CONTRIBUTING.md
📚 Learning: 2026-01-12T10:58:40.230Z
Learnt from: blublinsky
Repo: lightspeed-core/lightspeed-stack PR: 972
File: src/models/config.py:459-513
Timestamp: 2026-01-12T10:58:40.230Z
Learning: In lightspeed-core/lightspeed-stack, for Python files under src/models, when a user claims a fix is done but the issue persists, verify the current code state before accepting the fix. Steps: review the diff, fetch the latest changes, run relevant tests, reproduce the issue, search the codebase for lingering references to the original problem, confirm the fix is applied and not undone by subsequent commits, and validate with local checks to ensure the issue is resolved.
Applied to files:
src/models/responses.py
📚 Learning: 2026-02-25T07:46:33.545Z
Learnt from: asimurka
Repo: lightspeed-core/lightspeed-stack PR: 1211
File: src/models/responses.py:8-16
Timestamp: 2026-02-25T07:46:33.545Z
Learning: In the Python codebase, requests.py should use OpenAIResponseInputTool as Tool while responses.py uses OpenAIResponseTool as Tool. This difference is intentional due to differing schemas for input vs output tools in llama-stack-api. Apply this distinction consistently to other models under src/models (e.g., ensure request-related tools use the InputTool variant and response-related tools use the ResponseTool variant). If adding new tools, choose the corresponding InputTool or Tool class based on whether the tool represents input or output, and document the rationale in code comments.
Applied to files:
src/models/responses.py
📚 Learning: 2026-04-06T20:18:07.852Z
Learnt from: major
Repo: lightspeed-core/lightspeed-stack PR: 1463
File: src/app/endpoints/rlsapi_v1.py:266-271
Timestamp: 2026-04-06T20:18:07.852Z
Learning: In the lightspeed-stack codebase, within `src/app/endpoints/` inference/MCP endpoints, treat `tools: Optional[list[Any]]` in MCP tool definitions as an intentional, consistent typing pattern (used across `query`, `responses`, `streaming_query`, `rlsapi_v1`). Do not raise or suggest this as a typing issue during code review; changing it in isolation could break endpoint typing consistency across the codebase.
Applied to files:
src/app/endpoints/a2a.py
🪛 checkmake (0.2.2)
Makefile
[warning] 119-119: Target body for "verify" exceeds allowed length of 5 (7).
(maxbodylength)
🪛 Checkov (3.2.519)
docs/openapi.json
[high] 1-19584: Ensure that the global security field has rules defined
(CKV_OPENAPI_4)
[high] 1-19584: Ensure that security operations is not empty.
(CKV_OPENAPI_5)
🔇 Additional comments (8)
src/app/main.py (1)

35-61: The tag list in _OPENAPI_TAGS is complete and synchronized. Verification confirms all 24 router tags used across the codebase are declared in _OPENAPI_TAGS. However, the lack of automated checking remains a concern—consider adding a pre-commit hook or CI check that validates tag declarations against actual router usage to prevent drift as new routers are added.

docs/contributing/openapi-tags-and-spectral.md (3)
9-13: Clear regeneration instructions. The schema generation command and workflow are well-documented, with appropriate context about when regeneration is needed.

25-25: Excellent justification for the disabled rule. The explanation clearly documents why oas3-valid-media-example is disabled and provides guidance for when it should be re-enabled. This will help future maintainers understand the trade-off.

23-23: The documentation statement is accurate. The Makefile confirms that lint-openapi is called as part of the verify target (line 126 of the Makefile shows $(MAKE) lint-openapi within the verify recipe).
docs/openapi.json (4)

20-20: LGTM — server base URL is cleaner without the trailing slash. This avoids accidental double slashes when tools concatenate path templates.

8583-8585: LGTM — SSE behavior is documented at the response level. This keeps the OpenAPI document valid while still describing the text/event-stream payload semantics.

8638-8642: LGTM — SSE example is concrete and useful. The sample shows event framing, payload shape, completion, and the terminal [DONE] marker.

19486-19584: LGTM — top-level tag metadata improves generated API docs. The tag list is valid OpenAPI metadata and should make grouped documentation clearer.
55a492e to ffcecf4
Actionable comments posted: 1
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/app/main.py (1)
3-7: 🛠️ Refactor suggestion | 🟠 Major

Mark _OPENAPI_TAGS as Final.

This is a new module-level constant, but it is not annotated with Final[...].

♻️ Proposed fix

```diff
 import os
 from collections.abc import AsyncIterator
 from contextlib import asynccontextmanager
+from typing import Final
@@
-_OPENAPI_TAGS: list[dict[str, str]] = [
+_OPENAPI_TAGS: Final[list[dict[str, str]]] = [
```

As per coding guidelines, use Final[type] as the type hint for all constants.
Also applies to: 35-61
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/app/main.py` around lines 3 - 7, Add a Final annotation to the module-level constant _OPENAPI_TAGS and any other constants defined in the same region (lines ~35-61): import Final from typing (or typing_extensions if needed) and change declarations like _OPENAPI_TAGS = ... to include an explicit Final[...] type hint (e.g., _OPENAPI_TAGS: Final[list[dict[str, str]]]) so the constants are properly typed as immutable; apply the same pattern to the other module-level constants in that block using accurate concrete types for each constant.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/openapi.json`:
- Around line 19486-19584: CI currently lints the committed docs/openapi.json
but doesn't regenerate it, so stale tags can slip through; update the
openapi_spectral workflow to run scripts/generate_openapi_schema.py (using the
same Python environment/requirements as local dev), write its output to
docs/openapi.json (or a temp file) and then fail the job if the generated file
differs from the committed docs/openapi.json (e.g., by running a git diff
--exit-code or comparing the two files); reference the module/entrypoint
scripts/generate_openapi_schema.py and the constant _OPENAPI_TAGS in
src/app/main.py when implementing this step.
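The regenerate-and-compare step can be sketched as below. This is a minimal illustration of only the comparison logic, with file contents inlined so the snippet is self-contained; it uses a structural JSON comparison, which is looser than the byte-for-byte `git diff --exit-code` the comment mentions (in CI the two strings would come from docs/openapi.json and the generator's output):

```python
import json

def schemas_in_sync(committed: str, generated: str) -> bool:
    """Structurally compare two OpenAPI documents.

    Parsing both sides ignores key order and whitespace,
    so only real schema drift fails the check.
    """
    return json.loads(committed) == json.loads(generated)

# Stand-ins for docs/openapi.json and the freshly generated schema.
committed = '{"openapi": "3.1.0", "tags": [{"name": "a2a"}]}'
generated = '{"tags": [{"name": "a2a"}], "openapi": "3.1.0"}'

if not schemas_in_sync(committed, generated):
    raise SystemExit("docs/openapi.json is stale; regenerate and commit it")
```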
---
Outside diff comments:
In `@src/app/main.py`:
- Around line 3-7: Add a Final annotation to the module-level constant
_OPENAPI_TAGS and any other constants defined in the same region (lines ~35-61):
import Final from typing (or typing_extensions if needed) and change
declarations like _OPENAPI_TAGS = ... to include an explicit Final[...] type
hint (e.g., _OPENAPI_TAGS: Final[list[dict[str, str]]]) so the constants are
properly typed as immutable; apply the same pattern to the other module-level
constants in that block using accurate concrete types for each constant.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 5e970a94-b60e-4c0f-9645-5b89db14baf9
📒 Files selected for processing (10)
.github/workflows/openapi_spectral.yaml, .spectral.yaml, CONTRIBUTING.md, Makefile, docs/contributing/openapi-tags-and-spectral.md, docs/openapi.json, scripts/generate_openapi_schema.py, src/app/endpoints/a2a.py, src/app/main.py, src/models/responses.py
📜 Review details
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (8)
- GitHub Check: build-pr
- GitHub Check: E2E: server mode / ci / group 3
- GitHub Check: E2E: server mode / ci / group 1
- GitHub Check: E2E: library mode / ci / group 2
- GitHub Check: E2E: server mode / ci / group 2
- GitHub Check: E2E: library mode / ci / group 1
- GitHub Check: E2E Tests for Lightspeed Evaluation job
- GitHub Check: E2E: library mode / ci / group 3
🧰 Additional context used
🧠 Learnings (11)
📚 Learning: 2026-04-15T18:53:54.785Z
Learnt from: Lifto
Repo: lightspeed-core/lightspeed-stack PR: 1510
File: src/models/requests.py:742-769
Timestamp: 2026-04-15T18:53:54.785Z
Learning: In `src/models/requests.py`, keep `ResponsesRequest.validate_body_size` using `json.dumps(values)` with Python defaults (including ASCII escaping and the default separators). This default formatting is intentional to make the 65,536-character limit conservatively strict (accounting for small encoding/format overhead). Do not recommend changing it to `ensure_ascii=False` or using compact separators for this validator, since an exact wire-format byte match with the client payload is not achievable via `json.dumps` anyway.
Applied to files:
.spectral.yaml
📚 Learning: 2026-04-20T15:09:45.119Z
Learnt from: major
Repo: lightspeed-core/lightspeed-stack PR: 1548
File: src/app/endpoints/rlsapi_v1.py:56-56
Timestamp: 2026-04-20T15:09:45.119Z
Learning: In `src/app/endpoints/rlsapi_v1.py`, treat the line `_get_rh_identity_context = get_rh_identity_context` as an intentional temporary backward-compatibility shim (introduced for PR `#1548`, Splunk HEC telemetry work). Do not flag it as dead/unnecessary/unused code during review until the planned part 3 is merged and the responses endpoint is fully wired up such that no tests or external consumers reference the underscore-prefixed name.
Applied to files:
.spectral.yaml
📚 Learning: 2025-07-28T13:15:54.895Z
Learnt from: radofuchs
Repo: lightspeed-core/lightspeed-stack PR: 306
File: .github/workflows/e2e_tests.yaml:4-4
Timestamp: 2025-07-28T13:15:54.895Z
Learning: The user radofuchs prefers unrestricted pull_request_target workflows with secrets access for comprehensive PR testing, accepting the security trade-offs to ensure all problems are caught during the PR process.
Applied to files:
.github/workflows/openapi_spectral.yaml
📚 Learning: 2026-01-12T10:58:40.230Z
Learnt from: blublinsky
Repo: lightspeed-core/lightspeed-stack PR: 972
File: src/models/config.py:459-513
Timestamp: 2026-01-12T10:58:40.230Z
Learning: In lightspeed-core/lightspeed-stack, for Python files under src/models, when a user claims a fix is done but the issue persists, verify the current code state before accepting the fix. Steps: review the diff, fetch the latest changes, run relevant tests, reproduce the issue, search the codebase for lingering references to the original problem, confirm the fix is applied and not undone by subsequent commits, and validate with local checks to ensure the issue is resolved.
Applied to files:
src/models/responses.py
📚 Learning: 2026-02-25T07:46:33.545Z
Learnt from: asimurka
Repo: lightspeed-core/lightspeed-stack PR: 1211
File: src/models/responses.py:8-16
Timestamp: 2026-02-25T07:46:33.545Z
Learning: In the Python codebase, requests.py should use OpenAIResponseInputTool as Tool while responses.py uses OpenAIResponseTool as Tool. This difference is intentional due to differing schemas for input vs output tools in llama-stack-api. Apply this distinction consistently to other models under src/models (e.g., ensure request-related tools use the InputTool variant and response-related tools use the ResponseTool variant). If adding new tools, choose the corresponding InputTool or Tool class based on whether the tool represents input or output, and document the rationale in code comments.
Applied to files:
src/models/responses.py
📚 Learning: 2026-03-02T15:57:15.929Z
Learnt from: asamal4
Repo: lightspeed-core/lightspeed-stack PR: 1250
File: AGENTS.md:11-12
Timestamp: 2026-03-02T15:57:15.929Z
Learning: In lightspeed-stack's AGENTS.md, the Linting Tools section lists actual tool names (e.g., mypy, pylint, pyright, ruff, black), not make target names. For example, `check-types` is a make target that runs mypy, but mypy is the tool name documented in the Linting Tools section.
Applied to files:
Makefile, CONTRIBUTING.md
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv run make verify` to run all linters (black, pylint, pyright, ruff, docstyle, check-types)
Applied to files:
Makefile, CONTRIBUTING.md
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv run make format` to auto-format code with black + ruff
Applied to files:
Makefile
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv` package manager with `uv run` prefix for all commands
Applied to files:
CONTRIBUTING.md
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv sync --group dev --group llslibdev` for dependencies
Applied to files:
CONTRIBUTING.md
📚 Learning: 2026-04-06T20:18:07.852Z
Learnt from: major
Repo: lightspeed-core/lightspeed-stack PR: 1463
File: src/app/endpoints/rlsapi_v1.py:266-271
Timestamp: 2026-04-06T20:18:07.852Z
Learning: In the lightspeed-stack codebase, within `src/app/endpoints/` inference/MCP endpoints, treat `tools: Optional[list[Any]]` in MCP tool definitions as an intentional, consistent typing pattern (used across `query`, `responses`, `streaming_query`, `rlsapi_v1`). Do not raise or suggest this as a typing issue during code review; changing it in isolation could break endpoint typing consistency across the codebase.
Applied to files:
src/app/endpoints/a2a.py
🪛 checkmake (0.2.2)
Makefile
[warning] 123-123: Target body for "verify" exceeds allowed length of 5 (7).
(maxbodylength)
🪛 Checkov (3.2.519)
docs/openapi.json
[high] 1-19584: Ensure that the global security field has rules defined
(CKV_OPENAPI_4)
[high] 1-19584: Ensure that security operations is not empty.
(CKV_OPENAPI_5)
🪛 LanguageTool
docs/contributing/openapi-tags-and-spectral.md
[style] ~25-~25: Consider using the synonym “brief” (= concise, using a few words, not lasting long) to strengthen your wording.
Context: ...akefile skips Spectral and prints a short message so make verify can still ...
(QUICK_BRIEF)
[uncategorized] ~25-~25: The official name of this software platform is spelled with a capital “H”.
Context: ...de.js to run the check locally. CI (.github/workflows/openapi_spectral.yaml) alway...
(GITHUB)
CONTRIBUTING.md
[uncategorized] ~52-~52: The official name of this software platform is spelled with a capital “H”.
Context: ...check. CI always runs Spectral (see .github/workflows/openapi_spectral.yaml). The...
(GITHUB)
[uncategorized] ~62-~62: The official name of this software platform is spelled with a capital “H”.
Context: ...work. CI uses Node 22 for Spectral (see .github/workflows/openapi_spectral.yaml). #...
(GITHUB)
🔇 Additional comments (12)
.github/workflows/openapi_spectral.yaml (2)

3-5: Avoid duplicate Spectral runs for same-repo PR branches. With both unfiltered push and pull_request, same-repo PRs run this workflow twice. Scope push to protected branches or tags if that duplication is not intentional.

🧹 Proposed trigger scope

```diff
 on:
-  - push
-  - pull_request
+  push:
+    branches: [main]
+  pull_request:
```
7-20: LGTM — the Spectral job is minimal and read-only. The workflow uses least-privilege read permissions and runs the same error-severity lint command as the local target.

src/models/responses.py (1)

1707-1714: LGTM — SSE documentation is now OpenAPI-compliant. The response-level description keeps the text/event-stream clarification without using unsupported media-type description metadata.

src/app/main.py (1)

132-135: LGTM — OpenAPI metadata is wired into the app. The server URL normalization and openapi_tags wiring align the generated schema with the new Spectral tag checks.

.spectral.yaml (1)

1-9: LGTM — Spectral configuration is clear and scoped. The file enables the OAS ruleset and documents why media example validation is temporarily disabled.

scripts/generate_openapi_schema.py (1)

78-89: LGTM — generated schemas now include global tags. Passing app.openapi_tags keeps docs/openapi.json in sync with the FastAPI tag declarations.

Makefile (1)

116-130: LGTM — local verification stays usable while CI enforces OpenAPI linting. The npx guard avoids surprising local failures, and adding the target to verify makes the check discoverable.

CONTRIBUTING.md (1)
20-25: LGTM — contributor docs cover the new OpenAPI lint workflow. The prerequisites, install guidance, and Spectral section now explain Node/npx, local skip behavior, CI enforcement, and tag-list maintenance.

Also applies to: 47-62, 219-225

docs/contributing/openapi-tags-and-spectral.md (1)

1-27: LGTM — tag ownership and Spectral behavior are well documented. The page clearly states where tags come from, where to mirror them, how schema generation preserves them, and how local/CI linting behaves.

docs/openapi.json (1)

9938-9939: Past polish applied — LGTM. The summaries now use proper "JSON-RPC" casing and the descriptions no longer reference the other handler, addressing the earlier review feedback on both GET and POST operations.

src/app/endpoints/a2a.py (2)

695-731: LGTM — clean split into dedicated GET/POST handlers. The split preserves the original behavior (both paths delegate to _handle_a2a_jsonrpc), gives each operation a distinct operation_id, summary, and description for OpenAPI uniqueness, and keeps @authorize(Action.A2A_JSONRPC) enforcement on both. This directly addresses the duplicate operationId violation called out in the PR objective.

733-737: Helper extraction is appropriate. Moving the shared ASGI bridging/streaming logic into _handle_a2a_jsonrpc avoids duplication between the GET and POST handlers. Signature and return type are consistent with the public handlers.
ffcecf4 to 27f69f5
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
scripts/generate_openapi_schema.py (1)

78-89: ⚠️ Potential issue | 🟠 Major

Regenerate and commit the OpenAPI artifacts.

The new tags=app.openapi_tags changes generated schema output, and CI reports docs/openapi.json differs from uv run python scripts/generate_openapi_schema.py /tmp/openapi-generated.json. Regenerate docs/openapi.json and the derived openapi2.md, then commit both.

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@scripts/generate_openapi_schema.py` around lines 78 - 89, Run the OpenAPI generation script to update committed artifacts: execute the existing script (python scripts/generate_openapi_schema.py /tmp/openapi-generated.json) which uses get_openapi with the new tags=app.openapi_tags, then overwrite docs/openapi.json with the generated output and regenerate the derived openapi2.md (the project's conversion step that consumes /tmp/openapi-generated.json), verify diffs, and commit both docs/openapi.json and openapi2.md to the repo.
♻️ Duplicate comments (1)
.github/workflows/openapi_spectral.yaml (1)
7-9: ⚠️ Potential issue | 🟡 Minor — Scope the `push` trigger to avoid duplicate PR runs.
With unfiltered `push` and `pull_request`, same-repo PR branches run this workflow twice. Keep PR checks, but restrict push runs to protected branches such as `main`.
🧹 Proposed workflow trigger cleanup
```diff
 on:
-  - push
-  - pull_request
+  push:
+    branches: [main]
+  pull_request:
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/openapi_spectral.yaml around lines 7 - 9, The workflow trigger currently fires on both unfiltered push and pull_request causing duplicate runs for PR branches; change the top-level trigger so push is restricted to protected branches (e.g., replace "- push" with "push: branches: - main" or similar protected branch names) while keeping "pull_request" intact so PR checks still run; update the YAML "on:" block to list "pull_request" and a scoped "push: branches: [main]" to avoid duplicate PR runs.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@docs/contributing/openapi-tags-and-spectral.md`:
- Line 25: The long sentence that begins "This is part of **`make verify`**. If
**`npx`** is not on your **`PATH`**, the Makefile **skips** Spectral and prints
a short message so **`make verify`** can still pass; install Node.js to run the
check locally. **CI** (`.github/workflows/openapi_spectral.yaml`) always runs
Spectral. Failures are driven by **error**-severity rules." should be split into
shorter sentences for clarity: one stating Spectral is part of make verify, one
explaining the local fallback when npx is missing and the suggested fix (install
Node.js), one noting CI always runs Spectral, and a final short sentence stating
failures are driven by error-severity rules; update the paragraph in
docs/contributing/openapi-tags-and-spectral.md accordingly.
In `@docs/openapi.json`:
- Around line 19486-19584: The committed docs/openapi.json is out of sync with
the generator; run the OpenAPI generator and recommit the regenerated file.
Execute the script scripts/generate_openapi_schema.py (as in uv run
scripts/generate_openapi_schema.py docs/openapi.json) to regenerate
docs/openapi.json so it matches get_openapi(...) output derived from
_OPENAPI_TAGS and the updated ResponsesResponse.openapi_response(); verify the
top-level tags ordering/content and any SSE/response metadata match, then commit
the regenerated file.
In `@src/app/main.py`:
- Around line 35-61: The module-level constant _OPENAPI_TAGS should be annotated
with typing.Final to follow the repo's typing convention: import Final from
typing and change the annotation to use Final[...] (e.g., Final[list[dict[str,
str]]]) for the _OPENAPI_TAGS declaration so the constant is statically
recognized as immutable; update the import if necessary and keep the same value
assignment.
---
Outside diff comments:
In `@scripts/generate_openapi_schema.py`:
- Around line 78-89: Run the OpenAPI generation script to update committed
artifacts: execute the existing script (python
scripts/generate_openapi_schema.py /tmp/openapi-generated.json) which uses
get_openapi with the new tags=app.openapi_tags, then overwrite docs/openapi.json
with the generated output and regenerate the derived openapi2.md (the project’s
conversion step that consumes /tmp/openapi-generated.json), verify diffs, and
commit both docs/openapi.json and openapi2.md to the repo.
---
Duplicate comments:
In @.github/workflows/openapi_spectral.yaml:
- Around line 7-9: The workflow trigger currently fires on both unfiltered push
and pull_request causing duplicate runs for PR branches; change the top-level
trigger so push is restricted to protected branches (e.g., replace "- push" with
"push: branches: - main" or similar protected branch names) while keeping
"pull_request" intact so PR checks still run; update the YAML "on:" block to
list "pull_request" and a scoped "push: branches: [main]" to avoid duplicate PR
runs.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 95259f47-886a-4b48-a46f-d15c3832365e
📒 Files selected for processing (10)
- `.github/workflows/openapi_spectral.yaml`
- `.spectral.yaml`
- `CONTRIBUTING.md`
- `Makefile`
- `docs/contributing/openapi-tags-and-spectral.md`
- `docs/openapi.json`
- `scripts/generate_openapi_schema.py`
- `src/app/endpoints/a2a.py`
- `src/app/main.py`
- `src/models/responses.py`
📜 Review details
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (12)
- GitHub Check: Pylinter
- GitHub Check: integration_tests (3.12)
- GitHub Check: unit_tests (3.12)
- GitHub Check: unit_tests (3.13)
- GitHub Check: build-pr
- GitHub Check: E2E: library mode / ci / group 1
- GitHub Check: E2E: server mode / ci / group 1
- GitHub Check: E2E: library mode / ci / group 3
- GitHub Check: E2E: server mode / ci / group 3
- GitHub Check: E2E: library mode / ci / group 2
- GitHub Check: E2E: server mode / ci / group 2
- GitHub Check: E2E Tests for Lightspeed Evaluation job
🧰 Additional context used
📓 Path-based instructions (3)
**/*.py
📄 CodeRabbit inference engine (AGENTS.md)
**/*.py: Use absolute imports for internal modules: `from authentication import get_auth_dependency`
Import FastAPI dependencies with: `from fastapi import APIRouter, HTTPException, Request, status, Depends`
Import Llama Stack client with: `from llama_stack_client import AsyncLlamaStackClient`
Check `constants.py` for shared constants before defining new ones
All modules start with descriptive docstrings explaining purpose
Use `logger = get_logger(__name__)` from `log.py` for module logging
Type aliases defined at module level for clarity
Use `Final[type]` as type hint for all constants
All functions require docstrings with brief descriptions
Complete type annotations for parameters and return types in functions
Use `typing_extensions.Self` for model validators in Pydantic models
Use modern union type syntax `str | int` instead of `Union[str, int]`
Use `Optional[Type]` for optional type hints
Use snake_case with descriptive, action-oriented function names (get_, validate_, check_)
Avoid in-place parameter modification anti-patterns; return new data structures instead
Use `async def` for I/O operations and external API calls
Handle `APIConnectionError` from Llama Stack in error handling
Use standard log levels with clear purposes: debug, info, warning, error
All classes require descriptive docstrings explaining purpose
Use PascalCase for class names with standard suffixes: Configuration, Error/Exception, Resolver, Interface
Use ABC for abstract base classes with `@abstractmethod` decorators
Use `@model_validator` and `@field_validator` for Pydantic model validation
Complete type annotations for all class attributes; use specific types, not `Any`
Follow Google Python docstring conventions with Parameters, Returns, Raises, and Attributes sections
Files:
`scripts/generate_openapi_schema.py`, `src/models/responses.py`, `src/app/main.py`, `src/app/endpoints/a2a.py`
src/**/*.py
📄 CodeRabbit inference engine (AGENTS.md)
Pydantic models extend `ConfigurationBase` for config, `BaseModel` for data models
Files:
`src/models/responses.py`, `src/app/main.py`, `src/app/endpoints/a2a.py`
src/app/endpoints/**/*.py
📄 CodeRabbit inference engine (AGENTS.md)
Use FastAPI `HTTPException` with appropriate status codes for API endpoints
Files:
src/app/endpoints/a2a.py
🧠 Learnings (11)
📚 Learning: 2026-04-15T18:53:54.785Z
Learnt from: Lifto
Repo: lightspeed-core/lightspeed-stack PR: 1510
File: src/models/requests.py:742-769
Timestamp: 2026-04-15T18:53:54.785Z
Learning: In `src/models/requests.py`, keep `ResponsesRequest.validate_body_size` using `json.dumps(values)` with Python defaults (including ASCII escaping and the default separators). This default formatting is intentional to make the 65,536-character limit conservatively strict (accounting for small encoding/format overhead). Do not recommend changing it to `ensure_ascii=False` or using compact separators for this validator, since an exact wire-format byte match with the client payload is not achievable via `json.dumps` anyway.
Applied to files:
.spectral.yaml
📚 Learning: 2026-04-20T15:09:45.119Z
Learnt from: major
Repo: lightspeed-core/lightspeed-stack PR: 1548
File: src/app/endpoints/rlsapi_v1.py:56-56
Timestamp: 2026-04-20T15:09:45.119Z
Learning: In `src/app/endpoints/rlsapi_v1.py`, treat the line `_get_rh_identity_context = get_rh_identity_context` as an intentional temporary backward-compatibility shim (introduced for PR `#1548`, Splunk HEC telemetry work). Do not flag it as dead/unnecessary/unused code during review until the planned part 3 is merged and the responses endpoint is fully wired up such that no tests or external consumers reference the underscore-prefixed name.
Applied to files:
.spectral.yaml
📚 Learning: 2026-01-12T10:58:40.230Z
Learnt from: blublinsky
Repo: lightspeed-core/lightspeed-stack PR: 972
File: src/models/config.py:459-513
Timestamp: 2026-01-12T10:58:40.230Z
Learning: In lightspeed-core/lightspeed-stack, for Python files under src/models, when a user claims a fix is done but the issue persists, verify the current code state before accepting the fix. Steps: review the diff, fetch the latest changes, run relevant tests, reproduce the issue, search the codebase for lingering references to the original problem, confirm the fix is applied and not undone by subsequent commits, and validate with local checks to ensure the issue is resolved.
Applied to files:
src/models/responses.py
📚 Learning: 2026-02-25T07:46:33.545Z
Learnt from: asimurka
Repo: lightspeed-core/lightspeed-stack PR: 1211
File: src/models/responses.py:8-16
Timestamp: 2026-02-25T07:46:33.545Z
Learning: In the Python codebase, requests.py should use OpenAIResponseInputTool as Tool while responses.py uses OpenAIResponseTool as Tool. This difference is intentional due to differing schemas for input vs output tools in llama-stack-api. Apply this distinction consistently to other models under src/models (e.g., ensure request-related tools use the InputTool variant and response-related tools use the ResponseTool variant). If adding new tools, choose the corresponding InputTool or Tool class based on whether the tool represents input or output, and document the rationale in code comments.
Applied to files:
src/models/responses.py
📚 Learning: 2026-03-02T15:57:15.929Z
Learnt from: asamal4
Repo: lightspeed-core/lightspeed-stack PR: 1250
File: AGENTS.md:11-12
Timestamp: 2026-03-02T15:57:15.929Z
Learning: In lightspeed-stack's AGENTS.md, the Linting Tools section lists actual tool names (e.g., mypy, pylint, pyright, ruff, black), not make target names. For example, `check-types` is a make target that runs mypy, but mypy is the tool name documented in the Linting Tools section.
Applied to files:
`Makefile`, `CONTRIBUTING.md`
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv run make verify` to run all linters (black, pylint, pyright, ruff, docstyle, check-types)
Applied to files:
`Makefile`, `CONTRIBUTING.md`
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv run make format` to auto-format code with black + ruff
Applied to files:
Makefile
📚 Learning: 2025-07-28T13:15:54.895Z
Learnt from: radofuchs
Repo: lightspeed-core/lightspeed-stack PR: 306
File: .github/workflows/e2e_tests.yaml:4-4
Timestamp: 2025-07-28T13:15:54.895Z
Learning: The user radofuchs prefers unrestricted pull_request_target workflows with secrets access for comprehensive PR testing, accepting the security trade-offs to ensure all problems are caught during the PR process.
Applied to files:
.github/workflows/openapi_spectral.yaml
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv` package manager with `uv run` prefix for all commands
Applied to files:
CONTRIBUTING.md
📚 Learning: 2026-04-19T15:40:25.624Z
Learnt from: CR
Repo: lightspeed-core/lightspeed-stack PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-04-19T15:40:25.624Z
Learning: Use `uv sync --group dev --group llslibdev` for dependencies
Applied to files:
CONTRIBUTING.md
📚 Learning: 2026-04-06T20:18:07.852Z
Learnt from: major
Repo: lightspeed-core/lightspeed-stack PR: 1463
File: src/app/endpoints/rlsapi_v1.py:266-271
Timestamp: 2026-04-06T20:18:07.852Z
Learning: In the lightspeed-stack codebase, within `src/app/endpoints/` inference/MCP endpoints, treat `tools: Optional[list[Any]]` in MCP tool definitions as an intentional, consistent typing pattern (used across `query`, `responses`, `streaming_query`, `rlsapi_v1`). Do not raise or suggest this as a typing issue during code review; changing it in isolation could break endpoint typing consistency across the codebase.
Applied to files:
src/app/endpoints/a2a.py
🪛 checkmake (0.2.2)
Makefile
[warning] 123-123: Target body for "verify" exceeds allowed length of 5 (7).
(maxbodylength)
🪛 Checkov (3.2.519)
docs/openapi.json
[high] 1-19584: Ensure that the global security field has rules defined
(CKV_OPENAPI_4)
[high] 1-19584: Ensure that security operations is not empty.
(CKV_OPENAPI_5)
🪛 GitHub Actions: OpenAPI (Spectral)
scripts/generate_openapi_schema.py
[error] 1-1: Generation command was run and produced a schema different from the committed docs. Command: 'uv run python scripts/generate_openapi_schema.py /tmp/openapi-generated.json'.
docs/openapi.json
[error] 1-1: OpenAPI schema is out of date. Diff failed between 'docs/openapi.json' and '/tmp/openapi-generated.json'. Regenerate with: 'uv run scripts/generate_openapi_schema.py docs/openapi.json'.
🪛 LanguageTool
docs/contributing/openapi-tags-and-spectral.md
[style] ~25-~25: Consider using the synonym “brief” (= concise, using a few words, not lasting long) to strengthen your wording.
Context: ...akefile skips Spectral and prints a short message so make verify can still ...
(QUICK_BRIEF)
[uncategorized] ~25-~25: The official name of this software platform is spelled with a capital “H”.
Context: ...de.js to run the check locally. CI (.github/workflows/openapi_spectral.yaml) alway...
(GITHUB)
CONTRIBUTING.md
[uncategorized] ~52-~52: The official name of this software platform is spelled with a capital “H”.
Context: ...check. CI always runs Spectral (see .github/workflows/openapi_spectral.yaml). The...
(GITHUB)
[uncategorized] ~62-~62: The official name of this software platform is spelled with a capital “H”.
Context: ...work. CI uses Node 22 for Spectral (see .github/workflows/openapi_spectral.yaml). #...
(GITHUB)
🔇 Additional comments (10)
src/models/responses.py (1)
1708-1711: LGTM. The SSE note is now placed on the OpenAPI response description instead of the media-type object, which keeps the generated spec valid while preserving the documentation.
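The valid placement can be illustrated with a small sketch (the description text below is invented; the real note lives in `src/models/responses.py`): OpenAPI's Media Type Object has no `description` field, so the SSE note belongs one level up on the Response Object, which does.

```python
# Hypothetical sketch of the fix: the "description" key sits on the Response
# object (valid per the OpenAPI spec), not on the text/event-stream media-type
# object (which would be the spec violation the linter flagged).

def openapi_response() -> dict:
    """Build a 200 response entry with the SSE note in its valid location."""
    return {
        "200": {
            # Valid: Response objects accept "description".
            "description": "Streaming response delivered as Server-Sent Events.",
            "content": {
                # Invalid would be: {"text/event-stream": {"description": ...}}
                "text/event-stream": {"schema": {"type": "string"}},
            },
        }
    }
```

Spectral's OAS ruleset rejects unknown keys on media-type objects, which is why moving the note preserves the docs while making the schema lint-clean.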
src/app/endpoints/a2a.py (2)
695-730: LGTM. The split GET/POST handlers give `/a2a` unique OpenAPI operation IDs while preserving the shared authorization and request handling path.
733-913: LGTM. The extracted helper keeps the JSON-RPC forwarding logic centralized, reducing drift between the GET and POST handlers.
.spectral.yaml (1)
1-9: LGTM. The config enables the OAS ruleset while explicitly documenting why media-example validation is temporarily disabled.
src/app/main.py (1)
132-135: LGTM. Passing `openapi_tags=_OPENAPI_TAGS` correctly exposes the declared operation tags to schema generation.
Makefile (1)
116-130: LGTM. The new target makes OpenAPI linting available locally, and the `npx` guard keeps `make verify` usable on Python-only developer setups while CI still enforces Spectral.
CONTRIBUTING.md (1)
21-62: LGTM. The contributor docs now cover the Node/npx prerequisite, local skip behavior, CI enforcement, and the `_OPENAPI_TAGS` update requirement.
Also applies to: 223-225
.github/workflows/openapi_spectral.yaml (1)
23-52: LGTM. The workflow validates generated OpenAPI drift, derived markdown drift, and Spectral linting in one focused CI job.
docs/contributing/openapi-tags-and-spectral.md (1)
1-27: Well-structured documentation with clear instructions.The documentation effectively explains the OpenAPI tag synchronization requirement, provides actionable guidance for developers, and includes necessary context about CI behavior and disabled rules. The separation into logical sections (tag management and linting) makes it easy to navigate.
docs/openapi.json (1)
9938-9958: Past feedback addressed. A2A GET/POST summaries now use standard "JSON-RPC" casing and the descriptions no longer leak handler-implementation details.
```
make lint-openapi
```

This is part of **`make verify`**. If **`npx`** is not on your **`PATH`**, the Makefile **skips** Spectral and prints a short message so **`make verify`** can still pass; install Node.js to run the check locally. **CI** (`.github/workflows/openapi_spectral.yaml`) always runs Spectral. Failures are driven by **error**-severity rules.
🧹 Nitpick | 🔵 Trivial
Consider splitting the complex sentence for better readability.
Line 25 combines five distinct concepts (make verify integration, npx availability, Makefile fallback behavior, CI execution, and failure criteria) into a single sentence. Breaking this into separate sentences would improve clarity and reduce cognitive load.
📝 Suggested rewording
-This is part of **`make verify`**. If **`npx`** is not on your **`PATH`**, the Makefile **skips** Spectral and prints a short message so **`make verify`** can still pass; install Node.js to run the check locally. **CI** (`.github/workflows/openapi_spectral.yaml`) always runs Spectral. Failures are driven by **error**-severity rules.
+This is part of **`make verify`**. If **`npx`** is not on your **`PATH`**, the Makefile **skips** Spectral and prints a short message so **`make verify`** can still pass. Install Node.js to run the check locally. **CI** (`.github/workflows/openapi_spectral.yaml`) always runs Spectral. Failures are driven by **error**-severity rules.🧰 Tools
🪛 LanguageTool
[style] ~25-~25: Consider using the synonym “brief” (= concise, using a few words, not lasting long) to strengthen your wording.
Context: ...akefile skips Spectral and prints a short message so make verify can still ...
(QUICK_BRIEF)
[uncategorized] ~25-~25: The official name of this software platform is spelled with a capital “H”.
Context: ...de.js to run the check locally. CI (.github/workflows/openapi_spectral.yaml) alway...
(GITHUB)
…penAPI linting to CI
27f69f5 to 785d35e
Description
Removed OpenAPI spec violations in docs/openapi.json: duplicate operation IDs and an unexpected "description" attribute.
Added OpenAPI linting to CI as a GitHub Action via the `make lint-openapi` target.
Type of change
Tools used to create PR
Identify any AI code assistants used in this PR (for transparency and review context)
Related Tickets & Documents
Checklist before requesting a review
Testing
Please provide detailed steps to perform tests related to this code change.
Run 'make lint-openapi', which should produce no errors.
How were the fix/results from this change verified? Please provide relevant screenshots or results.
Summary by CodeRabbit
New Features
Refactor
Documentation