
JibarOS

Android 16 fork. AI as a platform primitive. Twelve capabilities, one shared runtime, every app. OEM-pluggable. Apache 2.0.


v0.6.9 · Fire All demo (Loom)


An Android 16 fork where AI is a platform primitive, not an app feature.

Twelve AI capabilities — text completion, translation, embeddings, classification, rerank, transcription, synthesis, VAD, image embeddings, description, detection, OCR — exposed to every app on the device through a single Binder AIDL interface. One system service. One native daemon. Four pluggable backends. Pooled model residency. Per-UID rate limits. Priority-aware scheduling. OEM-configurable per capability.

Models load once at the platform tier and are shared across every app that asks. Runtime infrastructure, not a chatbot. The closest mental model is "a kernel for on-device inference."
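To make "OEM-configurable per capability" concrete, a per-capability override might look something like the sketch below. This is purely illustrative: the element and attribute names here are assumptions, not the real schema, which is documented in the JibarOS docs/ knob and model guides.

```xml
<!-- Illustrative sketch only; not the actual OIR config schema. -->
<!-- Each capability can be pointed at an OEM backend and model,  -->
<!-- tuned, or disabled outright, without touching client apps.   -->
<oir-config>
  <capability name="text.complete"
              backend="llamacpp"
              model="/vendor/etc/oir/models/qwen2.5.gguf"
              max-concurrent="2" />
  <capability name="audio.vad" enabled="false" />
</oir-config>
```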

Named after Puerto Rico's jíbaros — rural folk, known for resilience and self-sufficiency. Models and runtime live on the device, work offline, no cloud account required.

Star the main repo — it's the cheapest signal that on-device AI belongs at the platform tier.

Start here

👉 github.com/Jibar-OS/JibarOS — the main repo. README, architecture diagram, default.xml manifest, and the docs/ directory with capability/knob/SDK/model/build guides.

The runtime surface

OIR (Open Intelligence Runtime) is the inference layer. Twelve capabilities, permission-gated:

| Capability | Shape | Reference backend |
| --- | --- | --- |
| `text.complete` / `text.translate` | TokenStream | Qwen 2.5 (llama.cpp) |
| `text.embed` | Vector | MiniLM-L6-v2 (llama.cpp) |
| `text.classify` / `text.rerank` | Vector | OEM-supplied ONNX |
| `audio.transcribe` | TokenStream | whisper.cpp |
| `audio.synthesize` | AudioStream | Piper (ONNX Runtime) |
| `audio.vad` | RealtimeBoolean | Silero (ONNX Runtime) |
| `vision.embed` | Vector | SigLIP (ONNX Runtime) |
| `vision.describe` | TokenStream | libmtmd (LLaVA / SmolVLM / …) |
| `vision.detect` | BoundingBoxes | RT-DETR (ONNX Runtime) |
| `vision.ocr` | BoundingBoxes | OEM-supplied det+rec pair |

Full details in JibarOS/docs/CAPABILITIES.md.
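As a rough sketch of what a single-Binder surface over capabilities like these can look like, consider the AIDL below. All names here are hypothetical, chosen only to illustrate the pattern of one interface multiplexing capabilities by string name; the actual interface lives in oir-framework-addons and is described in docs/CAPABILITIES.md.

```aidl
// Hypothetical sketch only: not the real OIR AIDL surface.
package org.jibaros.oir;

import android.os.Bundle;

interface IIntelligenceRuntime {
    // Streaming shapes (TokenStream, AudioStream): results arrive
    // incrementally through a caller-supplied callback interface
    // (IStreamListener would be a companion .aidl, also hypothetical).
    void requestStream(String capability, in Bundle request, IStreamListener listener);

    // One-shot shapes (Vector, BoundingBoxes): a single result Bundle.
    Bundle requestSync(String capability, in Bundle request);
}
```

The design point the sketch tries to capture is that apps bind once to one system service, name a capability ("text.embed", "vision.ocr", …), and the platform routes the call to whichever pooled backend the OEM configured.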

What's in this org

Core

External backend forks

AAOSP → JibarOS

Builds on AAOSP, the earlier Android 15 fork that first put an LLM inside system_server as a platform service and introduced MCP tool-calling at the manifest layer. JibarOS extends that "AI at the platform tier" pattern from one-LLM-for-tool-calling to a general multi-backend inference layer on Android 16.

Apache 2.0. Pre-1.0. Validated on Android 16 Cuttlefish with SELinux Enforcing.

Popular repositories

  1. JibarOS (Shell)

     Android 16 fork. AI as a platform primitive. Twelve capabilities, one shared runtime, every app. OEM-pluggable. Apache 2.0.

  2. platform_external_llamacpp (C++)

     JibarOS fork of ggml-org/llama.cpp with Android.bp for AOSP integration.

  3. oir-vendor-models (Shell)

     Reference model bundle (Apache 2.0 + MIT), populated by tools/fetch-models.sh.

  4. platform_external_piper

     Reserved placeholder. Piper TTS runs via platform_external_onnxruntime today; this repo is for future native Piper integration.

  5. oir-patches

     Five small patches to upstream AOSP that wire OIR into the platform (~69 lines total).

  6. oir-framework-addons (Java)

     OIRService + AIDL surface — the platform-service half of Open Intelligence Runtime. Lives under frameworks/base/.
