
feat(ui): add message timing metrics #357

Open
pascalandr wants to merge 1 commit into NeuralNomadsAI:dev from Pagecran:feat/message-timing-metrics

Conversation

@pascalandr
Contributor

Fixes #297

Summary

  • show total assistant response duration next to the existing message timestamp
  • show thinking duration on reasoning cards when part timing is explicit or can be inferred from the next timed part
  • keep the timing logic in one helper module with focused coverage for the inference rules
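The inference rules described above could be sketched roughly as follows. This is an illustrative sketch only: the `TimedPart` shape and the `inferThinkingDuration` name are assumptions for this example, not the actual API in `packages/ui/src/lib/message-timing.ts`.

```typescript
// Hypothetical shape of a timed message part; field names are illustrative.
interface TimedPart {
  type: "reasoning" | "text" | "tool";
  startedAt: number;   // epoch ms
  durationMs?: number; // explicit server-provided duration, when available
}

// Prefer an explicit duration; otherwise infer it from the start of the next
// timed part; finally fall back to the message completion time, if known.
function inferThinkingDuration(
  parts: TimedPart[],
  index: number,
  completedAt?: number,
): number | undefined {
  const part = parts[index];
  if (part.durationMs !== undefined) return part.durationMs;
  const next = parts
    .slice(index + 1)
    .find((p) => p.startedAt > part.startedAt);
  if (next) return next.startedAt - part.startedAt;
  if (completedAt !== undefined && completedAt > part.startedAt) {
    return completedAt - part.startedAt;
  }
  return undefined;
}
```

Keeping this in one helper, as the PR does, means components only receive a number (or nothing) and never reimplement the fallback ordering.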

Why

The conversation UI already exposes when a response started, but it does not show how long the assistant spent responding or thinking.

Adding those metrics makes long responses easier to read without changing the streaming flow.

Validation

  • npm run build --workspace @codenomad/ui
  • node --experimental-strip-types --test "packages/ui/src/lib/message-timing.test.ts"

@pascalandr
Contributor Author

@CodeNomadBot

I want you to perform the gatekeeper checks for the latest PR state

check for regressions
check for better possible implementations
Be the PR gatekeeper
Be ruthless about code quality

@codenomadbot
Contributor

codenomadbot Bot commented Apr 23, 2026

Gatekeeper review for the latest PR state:

What I reviewed

  • Request understood as: do a ruthless PR gatekeeper pass on the current branch, focused on regressions and better implementation options for the new message timing metrics.
  • Reviewed the changed code paths in packages/ui/src/lib/message-timing.ts, packages/ui/src/lib/message-timing.test.ts, packages/ui/src/components/message-item.tsx, packages/ui/src/components/message-block.tsx, and packages/ui/src/styles/messaging/message-base.css.

Findings

  • No blocking regressions found in the current implementation.
  • No clearly better implementation jumped out that would justify changing this PR. The timing logic is kept in one helper module as requested, and the UI integration stays narrow.

What I checked closely

  • packages/ui/src/lib/message-timing.ts:17-95
    Centralized timing extraction is small and coherent. The completion fallback rules are conservative, and reasoning duration inference prefers explicit values before inferring from the next timed part or message completion.
  • packages/ui/src/components/message-item.tsx:142-145 and :565-569
    Assistant-only total duration display is scoped correctly and does not affect user messages.
  • packages/ui/src/components/message-block.tsx:760-779, :1398-1405, and :1600-1604
    Reasoning cards receive inferred durations without spreading timing logic into the component tree.

Validation

  • node --experimental-strip-types --test "packages/ui/src/lib/message-timing.test.ts" passed.
  • npm run build --workspace @codenomad/ui could not be completed in this workspace because the environment is missing the vite package, so I could not fully verify build-level regressions from here.

Residual risk

  • The helper tests cover the inference rules well, but component-level rendering is still implicitly covered rather than directly tested. If you want this even tighter, the next incremental improvement would be a UI-level test around assistant header timing and reasoning-card timing display.

Gatekeeper verdict

  • Approved from a code-quality and regression-review standpoint, with the only unresolved item being build verification blocked by missing local dependencies in this environment.

--
Yours,
CodeNomadBot

@shantur
Collaborator

shantur commented Apr 26, 2026

@pascalandr - Are these times provided by OpenCode, or are they calculated in CodeNomad from the request and response?
If it's the latter, then these times have no value at all: they aren't thinking time or anything like it, since they can include the time the request spends reaching the server, queuing, and processing.

@pascalandr
Contributor Author

pascalandr commented Apr 26, 2026

> @pascalandr - Are these times provided by OpenCode or its being calculated in CodeNomad based on request and response? If it's latter, then these times have no value at all, as its not thinking time or anything, it can be the time for the request reaching, queuing, and processing.

It does use an explicit part duration when one is present, but otherwise the value is derived from the timing metadata we have in the stream.
A better version would be to show only explicit server-side timings, right?
OpenCode already provides some server-side timing data for messages and parts, so a basic elapsed duration could be meaningful.
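A server-timings-only version might look something like the sketch below. This is an assumption-laden illustration: the `MessageTime` fields (`created`, `completed`) and the `formatElapsed` helper are hypothetical names, not OpenCode's actual schema, and the point is only that the duration is rendered solely from server-provided timestamps.

```typescript
// Hypothetical server-provided message timing metadata (epoch ms).
interface MessageTime {
  created: number;
  completed?: number;
}

// Render a duration only when both endpoints came from the server;
// otherwise show nothing rather than a locally measured round-trip time.
function formatElapsed(time: MessageTime): string | undefined {
  if (time.completed === undefined) return undefined;
  const ms = time.completed - time.created;
  if (ms < 0) return undefined;
  const seconds = ms / 1000;
  return seconds < 60
    ? `${seconds.toFixed(1)}s`
    : `${Math.floor(seconds / 60)}m ${Math.round(seconds % 60)}s`;
}
```

Returning `undefined` when the completion timestamp is missing keeps the UI from ever displaying a locally inferred number, which addresses the objection above.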

@shantur
Collaborator

shantur commented Apr 26, 2026

Agreed, we can look at this PR once OC times are used

@shantur shantur added the needs-work PR needs more work label Apr 26, 2026