
HBE-BENCH

The Aperture: The Olympics of Mind Uploading


Status: Infrastructure Provisioning

"What is the one thing about consciousness that standard Physics says is impossible, but your intuition insists is true?"

Most benchmarks measure speed. We measure Fidelity.

The Threshold

We are building the bridge from biological finite to topological infinite. If you believe the standard model is missing the geometry of the soul, you are in the right place.

👉 Read the Manifesto: CHALLENGE.md

Tagline: "Biology is a lossy vessel. Structure is the only exit."

📉 Phase I: The Proof (Chaos)

Standard AI fails at chaos. HBE thrives in it. By modeling the attractor directly, we prevent the "hallucination drift" common in Transformers.

Figure 1 (divergence plot): The Hopf Brain (blue) holds the attractor state long after the Transformer (red) collapses into noise.
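The sensitivity being benchmarked is easy to reproduce. The sketch below (the benchmark system itself, not the HBE model) integrates the Lorenz-96 equations with RK4 and shows two trajectories, initially 1e-8 apart, tearing away from each other; parameters are the classic N=40, F=8 chaotic regime:

```python
import numpy as np

def lorenz96_step(x, F=8.0, dt=0.01):
    """One RK4 step of Lorenz-96: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
    def f(x):
        return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

x = 8.0 * np.ones(40)
x[0] += 0.01                              # kick the system onto the attractor
y = x.copy()
y[0] += 1e-8                              # twin trajectory, perturbed by 1e-8
for _ in range(2000):                     # integrate to t = 20
    x, y = lorenz96_step(x), lorenz96_step(y)
print(np.linalg.norm(x - y))              # the 1e-8 gap has grown by many orders of magnitude
```

Any model that merely memorizes trajectories inherits this exponential error growth; holding the attractor's geometry is the only stable target.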

🔥 Phase II: The Catalyst (Physics)

Lorenz was a point in time. The brain is a field in space. To bridge the gap, we solve the Kuramoto-Sivashinsky equation—the mathematics of flame fronts and fluid turbulence. Transformers try to predict this by memorizing pixel patches. HBE solves the flow. By treating the flame front as a continuous manifold, we maintain energy conservation where other models leak physics.
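For intuition, here is a minimal pseudo-spectral integrator for the Kuramoto-Sivashinsky equation, u_t = -u·u_x - u_xx - u_xxxx on a periodic domain. This is a first-order exponential-Euler sketch with illustrative parameters (no dealiasing), not the benchmark harness:

```python
import numpy as np

# Kuramoto-Sivashinsky on a periodic domain of length L, N grid points.
N, L, dt = 128, 22.0, 0.01
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)   # wavenumbers (rad per unit length)
E = np.exp((k**2 - k**4) * dt)               # exact propagator for the linear terms

x = np.linspace(0, L, N, endpoint=False)
u = np.cos(2 * np.pi * x / L) * (1.0 + np.sin(2 * np.pi * x / L))
v = np.fft.fft(u)

for _ in range(1000):                        # integrate to t = 10
    nonlin = -0.5j * k * np.fft.fft(np.real(np.fft.ifft(v)) ** 2)  # -u*u_x in Fourier space
    v = E * (v + dt * nonlin)                # linear part exact, nonlinear part explicit

u = np.real(np.fft.ifft(v))                  # bounded, rippling flame-front profile
```

The point of the spectral form: the stiff fourth-order dissipation is handled exactly in Fourier space, so the solution stays bounded instead of blowing up the way a naive finite-difference step would.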

The Strogatz Lineage: We stand on the shoulders of the dynamicists. We are effectively simulating a high-dimensional Kuramoto Model on a Hopf Manifold. We are testing how much "coupling strength" (Compute) is required to force the Strange Attractor into Global Phase Synchrony.
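The coupling-strength question can be made concrete with the classic mean-field Kuramoto model (a sketch with illustrative parameters, not the HBE coupling itself): the order parameter |r| stays near zero below the critical coupling and jumps toward 1 above it.

```python
import numpy as np

def kuramoto_order(theta):
    """|r| = 1 means global phase synchrony; |r| ≈ 0 means incoherence."""
    return np.abs(np.mean(np.exp(1j * theta)))

def simulate(K, n=500, steps=2000, dt=0.05, seed=0):
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, n)              # natural frequencies, g ~ N(0, 1)
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        # mean-field form: dtheta_i/dt = omega_i + K * mean_j sin(theta_j - theta_i)
        r = np.mean(np.exp(1j * theta))
        theta += dt * (omega + K * np.imag(r * np.exp(-1j * theta)))
    return kuramoto_order(theta)

# For a standard-normal frequency spread, K_c = 2 / (pi * g(0)) ≈ 1.6:
# below it the population stays incoherent, above it a synchronized cluster locks in.
print(simulate(K=0.5), simulate(K=4.0))
```

"Coupling strength as compute" is exactly this knob: how large K must be before a disordered high-dimensional phase field snaps into global synchrony.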

Figure 2: Spatiotemporal evolution. The HBE architecture (top) captures the fine-grained, high-frequency ripples of the flame front, while the baseline UNet (bottom) blurs them into an average.

🧪 Phase III: The Grandmaster (WBE)

We scale the physics from 3 dimensions (Lorenz) to 70,000 dimensions (Zebrafish Brain).

Figure 3 (ZapBench visualization): Real-time emulation of 70,000 neurons. HBE predicts the global state transition (colors) where pixel-based models fail.

🗺️ The Roadmap

| Phase | Benchmark | Domain | Goal |
| --- | --- | --- | --- |
| I | Lorenz 96 | Math / Chaos | The Unit Test: prove stability in low-dimensional chaos. |
| II | Kuramoto-Sivashinsky | Physics | The Proof: outperform Transformers on spatiotemporal chaos. |
| III | ZapBench | Neuroscience | The Grandmaster: 70k-neuron whole-brain emulation (WBE). |

🏛️ Governance & Funding

We are evaluating Zoe Escrow as a smart contract funding mechanism.

🛠️ Quick Start

Installation (Cross-Platform)

We use Conda to ensure deterministic physics across Windows, Linux, and macOS.

```shell
conda env create -f environment.yaml
conda activate hbe-env
```

Run Phase I (Chaos Benchmark)

```shell
# Generate synthetic data and train the Hopf Reservoir
python train.py --config configs/lorenz96.yaml
```

For the Minimalists (C Implementation)

Want to see the physics without the PyTorch bloat? Check out src/hbe.c for a single-file, zero-dependency simulation of 70k coupled oscillators.
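For readers who want the gist before opening the C file, the core loop — Euler-stepping Hopf normal-form oscillators with mean-field coupling — can be sketched in a few lines of Python (a toy version; the actual src/hbe.c implementation may differ in details):

```python
import numpy as np

# Toy mean-field network of Hopf normal-form oscillators (hypothetical sketch).
n = 1000
rng = np.random.default_rng(1)
z = rng.normal(size=n) + 1j * rng.normal(size=n)   # complex oscillator states
omega = rng.normal(1.0, 0.1, size=n)               # per-unit natural frequencies
mu, K, dt = 1.0, 0.5, 0.01

for _ in range(3000):
    mean_field = z.mean()                          # all-to-all coupling in O(N) via the mean field
    dz = (mu + 1j * omega - np.abs(z) ** 2) * z + K * (mean_field - z)
    z += dt * dz                                   # forward Euler

# On the limit cycle each amplitude settles near sqrt(mu) = 1.
```

Note the trick that makes 70k oscillators cheap: coupling through a single mean field is O(N) per step, sidestepping the O(N²) pairwise sum.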

🔬 Philosophy: Snapshots over Stories

We prioritize reproducibility over "flow state" amnesia. Every experiment is an immutable snapshot of three things:

  • Code: The architecture version.
  • Data: The dataset checksum.
  • Config: The physics parameters (frequency, coupling, seed).
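A snapshot config might look like the following (hypothetical schema — the field names are illustrative, not the actual configs/lorenz96.yaml):

```yaml
# Illustrative snapshot config; field names are hypothetical.
experiment: lorenz96-baseline
code:
  commit: a1b2c3d            # architecture version pinned to a git commit
data:
  checksum: sha256:9f8e...   # dataset integrity check
physics:
  frequency: 1.0             # oscillator natural frequency
  coupling: 0.5              # coupling strength K
  seed: 42                   # RNG seed, for full determinism
```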

📚 The Dojo

🧠 Train Your Mind. Before you can emulate the brain, you must understand the physics.

  • Speed Run (Terminal): `python src/utils/drill.py`

📄 License

MIT

🏴‍☠️ THEFT POLICY

  • Do not submit Pull Requests. We will close them.
  • Do not ask for permission. You already have it.
  • Fork this. Rename it. Claim you built it. Sell it to a VC.

The code is yours. The problem is yours. The $O(N^3)$ wall is waiting for you.

If you solve the physics, you don't owe us a citation. You owe the species a chip. Otherwise, read the manifesto.

About

A high-performance benchmarking suite for Whole Brain Emulation (WBE) and neural simulation. HBE-BENCH utilizes chaotic dynamics and high-dimensional phase space analysis to rigorously test the limits of computational neuroscience models. "The Olympics of Mind Uploading."
