gh0stkey/Karma

README Version: [English | 简体中文]

Project Introduction

Karma is a privacy-first desktop application for detecting and redacting Personally Identifiable Information (PII), built on the OpenAI Privacy Filter model. All AI inference runs entirely on-device: your data never leaves your machine.

  • macOS (Apple Silicon): Powered by MLX for native high-performance inference
  • Windows / Linux: Powered by ONNX Runtime with automatic GPU acceleration (CUDA / DirectML / CPU fallback)

Karma identifies 8 types of PII in text and replaces them with labeled placeholders:

| PII Type | Placeholder |
| --- | --- |
| Person Name | [PERSON] |
| Email Address | [EMAIL] |
| Phone Number | [PHONE] |
| Physical Address | [ADDRESS] |
| Date | [DATE] |
| URL | [URL] |
| Account Number | [ACCOUNT_NUMBER] |
| Secret / Password | [SECRET] |
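
For example, a sentence containing several PII types would be rewritten with the placeholders above (illustrative input and output; the exact spans detected depend on the model):

```
Input:  Contact Jane Doe at jane@example.com or 555-0123.
Output: Contact [PERSON] at [EMAIL] or [PHONE].
```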

Features

  • Real-time PII Detection: Token-level classification with platform-optimized inference
  • Cross-platform: macOS (MLX) / Windows (ONNX + CUDA/DirectML) / Linux (ONNX + CUDA/CPU)
  • Text Redaction: One-click redaction with auto-copy to clipboard
  • HTTP API Server: Built-in REST API server (/health, /redact) for integration with other tools
  • Redaction History: SQLite-backed history with infinite scroll browsing
  • Global Shortcut: Quick access via configurable keyboard shortcut (default: Cmd+Shift+K)
  • Bilingual UI: English and Simplified Chinese
  • Zero Cloud Dependency: All processing happens locally for complete privacy

Tech Stack

| Component | Technology |
| --- | --- |
| Desktop Framework | Tauri 2 (Rust) |
| Frontend | React + TypeScript + Tailwind CSS |
| State Management | Zustand |
| AI Inference (macOS) | MLX + MLX Embeddings |
| AI Inference (Windows/Linux) | ONNX Runtime (CUDA / DirectML / CPU) |
| HTTP Server | Axum |
| Database | SQLite (rusqlite) |
| Build Tool | Vite |

Installation

Download the latest release from the Releases page:

| Platform | File | Requirements |
| --- | --- | --- |
| macOS | .dmg | macOS 11.0+ with Apple Silicon (M1/M2/M3/M4) |
| Windows | .exe | Windows 10+ (x64) |
| Linux | .deb | Ubuntu 22.04+ (x64) |

Usage

Model Download

| Platform | Format | Model |
| --- | --- | --- |
| macOS (Apple Silicon) | MLX | mlx-community/openai-privacy-filter-bf16 |
| Windows / Linux | ONNX | yasserrmd/privacy-filter-ONNX |
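
Both models are hosted on Hugging Face. One way to fetch them is with the `huggingface-cli` tool (an assumption — any method that produces a local model directory works):

```shell
# Sketch: fetch the platform-appropriate model with huggingface-cli
# (requires `pip install -U "huggingface_hub[cli]"`; network access assumed).
REPO="mlx-community/openai-privacy-filter-bf16"   # macOS (Apple Silicon)
# REPO="yasserrmd/privacy-filter-ONNX"            # Windows / Linux
DEST="$HOME/models/$(basename "$REPO")"
# Print the command rather than running it, so this sketch has no side effects:
echo huggingface-cli download "$REPO" --local-dir "$DEST"
```

After the download finishes, point Karma's Model page at the resulting directory.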

Quick Start

  1. Open Karma and navigate to the Model page
  2. Select your local model directory (MLX or ONNX depending on your platform)
  3. Wait for the model to load (status indicator turns green)
  4. Switch to the Redactor page, paste your text, and click Redact

Global Shortcut

Press Cmd+Shift+K (configurable in Settings) to instantly open Karma and jump to the Redactor page.

HTTP API

Enable the built-in HTTP server on the Server page to expose the redaction API:

```shell
# Health check
curl http://127.0.0.1:8000/health

# Redact text
curl -X POST http://127.0.0.1:8000/redact \
  -H "Content-Type: application/json" \
  -d '{"text": "My name is John and my email is john@example.com"}'
```
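
Text interpolated directly into a `-d` string breaks as soon as it contains quotes or newlines, so it is worth JSON-encoding arbitrary input first. A minimal sketch, using `python3` for the encoding (`jq` would work equally well):

```shell
# Safely JSON-encode arbitrary text for the /redact endpoint.
TEXT='My name is "John" and my email is john@example.com'
PAYLOAD=$(printf '%s' "$TEXT" | python3 -c 'import json,sys; print(json.dumps({"text": sys.stdin.read()}))')
echo "$PAYLOAD"
# Then send it:
# curl -s -X POST http://127.0.0.1:8000/redact \
#   -H "Content-Type: application/json" -d "$PAYLOAD"
```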

Build from Source

Prerequisites

  • Node.js >= 20
  • Rust (stable)
  • uv (Python package manager)

Steps

```shell
# Install frontend dependencies
npm install

# Build sidecar binary (auto-selects MLX or ONNX by platform)
make sidecar

# Build Tauri app
make app
```

Or run the full pipeline:

```shell
make all
```

Interface

| Page | Description |
| --- | --- |
| Redactor | Input text, detect PII, view redacted output and detected spans |
| Model | Configure model path, view loaded model metadata |
| Server | Enable/configure HTTP API server, view API reference and request logs |
| History | Browse past redaction records with copy and delete actions |
| Settings | Global shortcut, language, auto-copy, history limit, and more |
| About | Version info and tech stack credits |

Acknowledgements
