refactor(demo): use stock OtlpHttpSpanExporter, drop custom JSON marshaler #7
… marshaler

The custom JamjetOtlpJsonSpanExporter (~100 LOC) plus its OtlpJsonMarshaler (~80 LOC) existed because:

1. Published Koog 0.8.0 doesn't ship a JSON OTLP exporter.
2. The Java OTel SDK's HTTP exporter defaults to OTLP/protobuf, and the JamJet intake was JSON-only at the time.

Both constraints fall away with jamjet-cloud PR #16, which added OTLP/protobuf intake at /v1/otlp/v1/traces. So we drop the custom marshaler in favor of the stock OtlpHttpSpanExporter, which is already on the classpath transitively via Koog's agents-features-opentelemetry-jvm:0.8.0 → opentelemetry-exporter-otlp:1.49.0.

The resulting JamjetCloudExporter.kt is wiring only: no HTTP client, no Jackson marshaling, no companion classes. Same public API (addJamjetCloudExporter(apiKey, apiUrl, timeout)).

What changed:

- JamjetCloudExporter.kt: -186 / +36. Drop JamjetOtlpJsonSpanExporter + OtlpJsonMarshaler; keep only the addJamjetCloudExporter extension, which now builds OtlpHttpSpanExporter directly.
- pom.xml: update comment to reflect the transitive OTel dep (description text + dep comment only; no version changes).
- README + MemoryAgent.kt: replace "OTLP/JSON" mentions with "OTLP/HTTP-protobuf"; update LOC claim from 150 to ~50.

Verified:

- mvnw clean compile (release 21)
- mvnw spring-boot:run starts in 1.0s, all OTel autoconfig wires up cleanly, PreflightCheck fails on Engram (expected without Docker)

Net diff: -188 / +36.
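As a small illustration of the intake-path wiring the commit describes, the full endpoint the exporter posts to composes from the configurable base URL and the fixed intake path. The helper name below is hypothetical; only the path `/v1/otlp/v1/traces` comes from the PR.

```kotlin
// Hypothetical helper: compose the OTLP/protobuf intake URL from the
// configurable apiUrl. Tolerates a trailing slash on the base URL.
fun jamjetTracesEndpoint(apiUrl: String): String =
    apiUrl.trimEnd('/') + "/v1/otlp/v1/traces"

// jamjetTracesEndpoint("https://api.jamjet.dev")
//   == "https://api.jamjet.dev/v1/otlp/v1/traces"
```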
## Summary

Drops the custom `JamjetOtlpJsonSpanExporter` (~100 LOC) + `OtlpJsonMarshaler` (~80 LOC) in favor of the stock `OtlpHttpSpanExporter` from `io.opentelemetry:opentelemetry-exporter-otlp`, which Koog already pulls in transitively via `agents-features-opentelemetry-jvm:0.8.0`. Mirrors jamjet-runtime-java PR #6 (Spring AI demo simplification).
## Why now

The custom marshaler was needed because:

1. Published Koog 0.8.0 doesn't ship a JSON OTLP exporter.
2. `OtlpHttpSpanExporter` defaults to OTLP/protobuf, and JamJet's intake was JSON-only.

Both constraints fall away with jamjet-cloud PR #16, which added OTLP/protobuf intake at `/v1/otlp/v1/traces` (deployed to `api.jamjet.dev` 2026-05-08). So we drop the custom marshaler and just use the stock exporter at its default protobuf encoding.

## What's left in `JamjetCloudExporter.kt`

The new file is wiring only:
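A minimal sketch of what that wiring might look like. The standalone helper, the auth header name, and the default values are assumptions for illustration; the shipped code exposes this logic through the `addJamjetCloudExporter` extension, and only the endpoint path comes from the PR.

```kotlin
import io.opentelemetry.exporter.otlp.http.trace.OtlpHttpSpanExporter
import io.opentelemetry.sdk.trace.export.SpanExporter
import java.time.Duration

// Sketch: build the stock OTLP/HTTP-protobuf exporter pointed at the
// JamJet intake. For the HTTP exporter, setEndpoint takes the full URL
// including the traces path.
fun buildJamjetSpanExporter(
    apiKey: String,
    apiUrl: String = "https://api.jamjet.dev",   // assumed default
    timeout: Duration = Duration.ofSeconds(10),  // assumed default
): SpanExporter =
    OtlpHttpSpanExporter.builder()
        .setEndpoint(apiUrl.trimEnd('/') + "/v1/otlp/v1/traces")
        .addHeader("Authorization", "Bearer $apiKey") // header name is an assumption
        .setTimeout(timeout)
        .build()
```

Call sites keep using `addJamjetCloudExporter(apiKey, apiUrl, timeout)` as before; only the internals changed.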
Same public API (`addJamjetCloudExporter(apiKey, apiUrl, timeout)`). No HTTP client, no Jackson, no companion classes.

## Verified

- `mvnw clean compile` is clean (release 21)
- `mvnw spring-boot:run` boots in 1.0s; OpenAI client + Engram autoconfig + Koog OpenTelemetry feature all wire up cleanly; PreflightCheck fails at the Engram health check, as expected without Docker
- Confirmed via `mvn dependency:tree` that `opentelemetry-exporter-otlp:1.49.0` is already on the classpath transitively; no new dep declared

## Net diff
+94 / −246 (the rewrite of `JamjetCloudExporter.kt` accounts for most of it).

Files changed:

- `cloud/JamjetCloudExporter.kt`
- `pom.xml`
- `README.md`
- `MemoryAgent.kt`

## Followups

- Upstream `addJamjetCloudExporter` into Koog as `ai.koog.agents.features.opentelemetry.integration.jamjet.addJamjetCloudExporter` (mirrors `addDatadogExporter`). Draft issue text saved at `/tmp/koog_youtrack_issue.md` in the working session.
- Verify spans land at `cloud.jamjet.dev/dashboard/graph` tagged `service.name=kotlin-koog-engram-demo` (CI doesn't have OpenAI access, so this is manual).