An Android 16 fork where AI is a platform primitive, not an app feature.
Twelve AI capabilities — text completion, translation, embeddings, classification, rerank, transcription, synthesis, VAD, image embeddings, description, detection, OCR — exposed to every app on the device through a single binder AIDL. One system service. One native daemon. Four pluggable backends. Pooled residency. Per-UID rate limits. Priority-aware scheduling. OEM-configurable per capability.
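The per-UID rate limiting named above is classically implemented as a token bucket keyed by calling UID. A minimal Kotlin sketch of that idea — `PerUidRateLimiter` and `tryAcquire` are illustrative names, not the actual OIR service API:

```kotlin
import kotlin.math.min

// Sketch of a per-UID token bucket: each UID gets a burst allowance
// (capacity) that refills at a sustained rate. Illustrative only.
class PerUidRateLimiter(
    private val capacity: Double,      // max burst per UID
    private val refillPerSec: Double,  // sustained requests/sec per UID
) {
    private data class Bucket(var tokens: Double, var lastNs: Long)
    private val buckets = HashMap<Int, Bucket>()

    /** Returns true if the given UID may issue one more request now. */
    fun tryAcquire(uid: Int, nowNs: Long): Boolean {
        val b = buckets.getOrPut(uid) { Bucket(capacity, nowNs) }
        val elapsedSec = (nowNs - b.lastNs) / 1e9
        b.tokens = min(capacity, b.tokens + elapsedSec * refillPerSec)
        b.lastNs = nowNs
        return if (b.tokens >= 1.0) { b.tokens -= 1.0; true } else false
    }
}
```

Keeping one bucket per UID (rather than per app process) matches how binder identifies callers, so a multi-process app shares one budget.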
Models load once at the platform tier and are shared across every app that asks. Runtime infrastructure, not a chatbot. The closest mental model is "a kernel for on-device inference."
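"Load once, share across every app" can be sketched as a refcounted pool: the loader runs at most once per model, every client gets the same resident instance, and the model is evicted when the last holder releases it. `ModelPool`, `acquire`, and `release` are hypothetical names for illustration, not the OIR interface:

```kotlin
// Hedged sketch of pooled residency: one load per model, shared by all
// clients, evicted when the refcount drops to zero.
class ModelPool<M>(private val loader: (String) -> M) {
    private data class Entry<M>(val model: M, var refs: Int)
    private val resident = HashMap<String, Entry<M>>()
    var loads = 0          // how many times the loader actually ran
        private set

    @Synchronized
    fun acquire(id: String): M {
        val e = resident.getOrPut(id) { loads++; Entry(loader(id), 0) }
        e.refs++
        return e.model
    }

    @Synchronized
    fun release(id: String) {
        val e = resident[id] ?: return
        if (--e.refs == 0) resident.remove(id)  // no app holds it: evict
    }
}
```

A real daemon would add memory-pressure eviction and priority-aware preemption on top; the refcount is just the core invariant.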
Named after Puerto Rico's jíbaros — rural folk, known for resilience and self-sufficiency. Models and runtime live on the device, work offline, no cloud account required.
⭐ Star the main repo — it's the cheapest signal that on-device AI belongs at the platform tier.
👉 github.com/Jibar-OS/JibarOS — the main repo. README, architecture diagram, default.xml manifest, and the docs/ directory with capability/knob/SDK/model/build guides.
OIR (Open Intelligence Runtime) is the inference layer. Twelve capabilities, each permission-gated:
| Capability | Shape | Reference backend |
|---|---|---|
| text.complete / text.translate | TokenStream | Qwen 2.5 (llama.cpp) |
| text.embed | Vector | MiniLM-L6-v2 (llama.cpp) |
| text.classify / text.rerank | Vector | OEM-supplied ONNX |
| audio.transcribe | TokenStream | whisper.cpp |
| audio.synthesize | AudioStream | Piper (ONNX Runtime) |
| audio.vad | RealtimeBoolean | Silero (ONNX Runtime) |
| vision.embed | Vector | SigLIP (ONNX Runtime) |
| vision.describe | TokenStream | libmtmd (LLaVA / SmolVLM / …) |
| vision.detect | BoundingBoxes | RT-DETR (ONNX Runtime) |
| vision.ocr | BoundingBoxes | OEM-supplied det+rec pair |
Full details in JibarOS/docs/CAPABILITIES.md.
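The Shape column above names five result shapes. One natural way for a Kotlin SDK to surface them is a sealed hierarchy apps can match on exhaustively — the types below are an illustrative sketch, not the actual oir-sdk classes:

```kotlin
// Hypothetical modeling of the five OIR result shapes as a sealed
// hierarchy; names mirror the table, not a published API.
sealed interface OirResult
data class TokenStream(val tokens: List<String>) : OirResult
data class Vector(val values: FloatArray) : OirResult
data class AudioStream(val pcm: ShortArray) : OirResult
data class RealtimeBoolean(val active: Boolean) : OirResult
data class BoundingBoxes(val boxes: List<FloatArray>) : OirResult

// The compiler forces this `when` to cover every shape, so adding a
// thirteenth capability with a new shape breaks callers at compile time.
fun summarize(r: OirResult): String = when (r) {
    is TokenStream -> "tokens=" + r.tokens.size
    is Vector -> "dims=" + r.values.size
    is AudioStream -> "samples=" + r.pcm.size
    is RealtimeBoolean -> "speech=" + r.active
    is BoundingBoxes -> "boxes=" + r.boxes.size
}
```

Exhaustive matching is one reason to keep the shape count small and stable even as the capability list grows.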
Core
- JibarOS — main: manifest + docs
- oird — native inference daemon (C++)
- oir-framework-addons — platform service + AIDL (Java)
- oir-patches — 5 small patches to upstream AOSP (69 lines)
- oir-sdk — Kotlin SDK for apps
- oir-demo — OirDemo Mission Control reference app
- oir-vendor-models — reference model bundle + fetch script
- device_google_cuttlefish — reference device tree
External backend forks
Builds on AAOSP, the earlier Android 15 fork that first put an LLM inside system_server as a platform service and introduced MCP tool-calling at the manifest layer. JibarOS extends that "AI at the platform tier" pattern from one-LLM-for-tool-calling to a general multi-backend inference layer on Android 16.
Apache 2.0. Pre-1.0. Validated on Android 16 Cuttlefish with SELinux Enforcing.
