Next-generation open-source AI agent development framework and runtime platform
Event-driven Runtime · Multi-provider LLM · RoleX Integration · TypeScript First
Build an AI agent in a few lines of TypeScript:

```typescript
import { createAgentX } from "agentxjs";
import { nodePlatform } from "@agentxjs/node-platform";
import { createMonoDriver } from "@agentxjs/mono-driver";

const createDriver = (config) => createMonoDriver({
  ...config,
  apiKey: process.env.ANTHROPIC_API_KEY,
  provider: "anthropic",
});

const platform = await nodePlatform({ createDriver }).resolve();
const ax = createAgentX({ platform, createDriver });

// Create container → image → agent → chat
await ax.container.create("my-app");
const { record: image } = await ax.image.create({
  containerId: "my-app",
  systemPrompt: "You are a helpful assistant.",
});
const { agentId } = await ax.agent.create({ imageId: image.imageId });

ax.on("text_delta", (e) => process.stdout.write(e.data.text));
await ax.session.send(agentId, "Hello!");
```

Expose your agent as a WebSocket server:
```typescript
import { createServer } from "agentxjs";

const server = await createServer({
  platform,
  createDriver,
  port: 5200,
});
await server.listen();
```

Connect to a running AgentX server:
```typescript
import { createAgentX } from "agentxjs";

const ax = createAgentX();
const client = await ax.connect("ws://localhost:5200");

// Same API as local mode
await client.agent.create({ imageId: "..." });
```

Interactive terminal chat:
```shell
cd apps/cli
cp .env.example .env.local  # Set DEEPRACTICE_API_KEY
bun run dev
```

| Package | Description |
|---|---|
| agentxjs | Client SDK — local, remote, and server modes |
| @agentxjs/core | Core abstractions — Container, Image, Session, Driver |
| @agentxjs/node-platform | Node.js platform — SQLite persistence, WebSocket |
| @agentxjs/mono-driver | Multi-provider LLM driver (Anthropic, OpenAI, Google, etc.) |
| @agentxjs/claude-driver | Claude-specific driver with extended features |
| @agentxjs/devtools | BDD testing tools — MockDriver, RecordingDriver, Fixtures |
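The devtools package centers on test doubles for drivers. The sketch below illustrates the mock/recording pattern in a self-contained form; the `Driver` interface and class shapes here are simplified assumptions for illustration, not the actual `@agentxjs/devtools` API:

```typescript
// A deliberately tiny driver contract: send a prompt, get streamed chunks back.
interface Driver {
  send(prompt: string): string[];
}

// Recording pattern: wrap any driver and capture every exchange,
// so a BDD test can assert on (or later replay) what was sent.
class RecordingDriver implements Driver {
  readonly recorded: { prompt: string; chunks: string[] }[] = [];
  constructor(private inner: Driver) {}

  send(prompt: string): string[] {
    const chunks = this.inner.send(prompt);
    this.recorded.push({ prompt, chunks });
    return chunks;
  }
}

// Mock pattern: canned output makes tests deterministic and offline.
const mock: Driver = { send: () => ["Hel", "lo"] };
const rec = new RecordingDriver(mock);
rec.send("Hi");
```

Composing the two (a recorder around a mock) is the usual shape of a driver-level unit test: deterministic output in, full transcript out.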
MonoDriver supports multiple LLM providers via the Vercel AI SDK:

- Anthropic (Claude) — `provider: "anthropic"`
- OpenAI (GPT) — `provider: "openai"`
- Google (Gemini) — `provider: "google"`
- DeepSeek — `provider: "deepseek"`
- Mistral — `provider: "mistral"`
- xAI (Grok) — `provider: "xai"`
- OpenAI-compatible — `provider: "openai-compatible"`
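Switching providers is then a matter of changing the `provider` id and supplying the matching API key. A minimal sketch of that wiring — the env-var names other than `ANTHROPIC_API_KEY` are conventions assumed from the Vercel AI SDK, not confirmed by AgentX:

```typescript
// Assumed mapping from MonoDriver provider ids to the env vars that
// conventionally hold their API keys in the Vercel AI SDK ecosystem.
const PROVIDER_ENV = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  google: "GOOGLE_GENERATIVE_AI_API_KEY",
  deepseek: "DEEPSEEK_API_KEY",
  mistral: "MISTRAL_API_KEY",
  xai: "XAI_API_KEY",
} as const;

// Build the { provider, apiKey } fragment for createMonoDriver from the
// environment; falls back to an empty string if the key is unset.
function driverConfig(provider: keyof typeof PROVIDER_ENV) {
  return { provider, apiKey: process.env[PROVIDER_ENV[provider]] ?? "" };
}
```

A `createDriver` factory can then spread `driverConfig("openai")` (or any other id) into the `createMonoDriver` options instead of hard-coding one provider.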
MonoDriver integrates with RoleX for AI role management — identity, goals, knowledge, and cognitive growth cycles:
```typescript
import { localPlatform } from "@rolexjs/local-platform";

const driver = createMonoDriver({
  ...config,
  rolex: {
    platform: localPlatform(),
    roleId: "my-role",
  },
});
```

Event-driven architecture with layered design:
```
SERVER SIDE                        SYSTEMBUS              CLIENT SIDE
═══════════════════════════════════════════════════════════════════════════
                                       ║
┌─────────────────┐                    ║
│ Environment     │                    ║
│ • LLMProvider   │       emit        ║
│ • Sandbox       │─────────────────>║
└─────────────────┘                    ║
                                       ║
                                       ║
┌─────────────────┐    subscribe       ║
│ Agent Layer     │<─────────────────║
│ • AgentEngine   │                    ║
│ • Agent         │       emit        ║
│                 │─────────────────>║          ┌─────────────────┐
│ 4-Layer Events  │                    ║          │                 │
│ • Stream        │                    ║ broadcast │   WebSocket     │
│ • State         │                    ║════════>│ (Event Stream)  │
│ • Message       │                    ║<════════│                 │
│ • Turn          │                    ║  input   │   AgentX API    │
└─────────────────┘                    ║          └─────────────────┘
                                       ║
                                       ║
┌─────────────────┐                    ║
│ Runtime Layer   │                    ║
│                 │       emit        ║
│ • Persistence   │─────────────────>║
│ • Container     │                    ║
│ • WebSocket     │<─────────────────╫
│                 │─────────────────>║
└─────────────────┘                    ║
                                       ║
                                 [ Event Bus ]
                                [ RxJS Pub/Sub ]
```

**Event Flow:**

- Input: Client → WebSocket → BUS → LLM Driver
- Output: Driver → BUS → AgentEngine → BUS → Client
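The emit/subscribe flow at the center of the diagram can be sketched with a minimal hand-rolled bus. AgentX uses RxJS for this; the class below only illustrates the contract, it is not the real API:

```typescript
type Handler<T> = (event: T) => void;

// Toy event bus: typed events keyed by a string topic, with
// subscribe returning an unsubscribe function (RxJS-subscription style).
class EventBus {
  private handlers = new Map<string, Set<Handler<any>>>();

  subscribe<T>(type: string, handler: Handler<T>): () => void {
    if (!this.handlers.has(type)) this.handlers.set(type, new Set());
    this.handlers.get(type)!.add(handler);
    return () => { this.handlers.get(type)?.delete(handler); };
  }

  emit<T>(type: string, event: T): void {
    this.handlers.get(type)?.forEach((h) => h(event));
  }
}

// Output direction from the diagram: the driver emits a stream event,
// and a client-side subscriber receives it via the bus.
const bus = new EventBus();
const received: string[] = [];
bus.subscribe<{ text: string }>("text_delta", (e) => received.push(e.text));
bus.emit("text_delta", { text: "Hello" });
```

Every arrow in the diagram is one of these two calls: layers never reference each other directly, only the bus.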
AgentX is in active development. We welcome your ideas, feedback, and contributions!
Part of the Deepractice AI infrastructure:
- RoleX — AI role management system (identity, cognition, growth)
- ResourceX — Unified resource manager
- IssueX — Structured issue tracking for AI collaboration
Built with ❤️ by Deepractice
