A frontier AI research laboratory

Neuromorphic AGI. A new substrate
for intelligence.

01 — The thesis

One motif:
compute that
behaves like
a brain.

The neocortex is not a monolith of 175 billion parameters. It is roughly 150,000 small, repeating cortical columns, each learning locally, each growing and pruning its own connections — a universe of connections that reshapes itself moment to moment. There is no global error signal. There is no frozen checkpoint. Memory is updated through the act of being used.

FractalBrain is built on this principle. Its compute fabric behaves more like an FPGA, or a cortex, than like a fixed ANN: it grows relevant connections when confronted with novel tasks, and recycles unused ones. Learning is local and temporal, and does not require the task to be differentiable. The model of the world is causal rather than statistical. The agent chooses what to sense.
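
In code, the grow-and-recycle dynamic might look something like the sketch below. This is a toy illustration under our own simplifying assumptions, not the FractalBrain fabric itself: the surprise measure, the grow_threshold, and the usage-trace pruning rule are all placeholders for the real mechanisms.

```python
import random

class ToyFabric:
    """Illustrative only: a sparse fabric whose topology is data-dependent.
    Connections carry a weight and a usage trace; every update is local."""

    def __init__(self, grow_threshold: float = 0.5,
                 decay: float = 0.99, prune_below: float = 0.05):
        self.grow_threshold = grow_threshold  # surprise level that triggers growth
        self.decay = decay                    # usage-trace decay per step
        self.prune_below = prune_below        # recycle connections below this trace
        self.weights: dict[tuple[int, int], float] = {}  # sparse, growing topology
        self.trace: dict[tuple[int, int], float] = {}

    def step(self, active: set[int]) -> set[int]:
        # Predict: a unit fires when its active inputs sum past a local threshold.
        drive: dict[int, float] = {}
        for (src, dst), w in self.weights.items():
            if src in active:
                drive[dst] = drive.get(dst, 0.0) + w
        predicted = {unit for unit, d in drive.items() if d >= 0.5}

        # Surprise: the share of current activity the topology failed to predict.
        surprise = len(active - predicted) / max(len(active), 1)

        # Grow: wire new connections toward each unpredicted active unit.
        if surprise > self.grow_threshold:
            for dst in active - predicted:
                sources = list(active - {dst})
                if sources:
                    edge = (random.choice(sources), dst)
                    self.weights.setdefault(edge, 0.2)
                    self.trace.setdefault(edge, 1.0)

        # Reinforce connections just used; decay and recycle the rest.
        for edge in list(self.weights):
            src, dst = edge
            if src in active and dst in active:
                self.trace[edge] = 1.0
                self.weights[edge] = min(self.weights[edge] + 0.1, 1.0)
            else:
                self.trace[edge] *= self.decay
                if self.trace[edge] < self.prune_below:
                    del self.weights[edge], self.trace[edge]  # recycled
        return predicted

fabric = ToyFabric()
for _ in range(20):
    fabric.step({1, 2, 3})  # a repeated pattern: connections grow, then stabilise
```

Connections that keep being used are reinforced; connections that fall idle decay below the trace threshold and are recycled. That is the FPGA-like reconfiguration described above, in miniature.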

We are not scaling a known thing. We are proposing a different one.

02 — Measured outcomes

Disruptive,
measurably so.

20×
Better data efficiency on RL tasks

Active sensing lets the agent solve tasks with a fraction of the interactions required by conventional deep RL; a toy sketch of the idea closes this section.

10×
Better accuracy on sequence modelling

Unbounded temporal credit assignment over genomics and other long-range structured data, where transformer context windows truncate.

500×
Better power efficiency

Fractal networks run on a single CPU core. No GPUs, no data centres, no backpropagation. A significantly lower energy footprint at comparable performance.

Unlimited context, parameters, lifespan

No attention window. No fixed parameter count. No boundary between training and inference. The model grows as it is used.
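
To illustrate the principle behind the active-sensing claim above (the principle only, not the 20× benchmark), here is a self-contained toy: an agent that chooses where to probe needs far fewer interactions than one that passively sweeps its input. The environment and both agents are hypothetical constructions for this sketch.

```python
import random

class HotSpotEnv:
    """Toy 1-D world: a heat source sits in one of n cells. Probing cell c
    reports the direction of the source (-1, 0, +1), like a thermal gradient."""
    def __init__(self, n: int = 1024):
        self.n = n
        self.source = random.randrange(n)
        self.probes = 0

    def probe(self, cell: int) -> int:
        self.probes += 1
        return (self.source > cell) - (self.source < cell)

def passive_agent(env: HotSpotEnv) -> int:
    """Consumes observations in a fixed order, like an agent fed full frames."""
    for cell in range(env.n):
        if env.probe(cell) == 0:
            return cell

def active_agent(env: HotSpotEnv) -> int:
    """Chooses every probe to halve the remaining uncertainty: active sensing."""
    lo, hi = 0, env.n - 1
    while True:
        mid = (lo + hi) // 2
        direction = env.probe(mid)
        if direction == 0:
            return mid
        lo, hi = (mid + 1, hi) if direction > 0 else (lo, mid - 1)

a, b = HotSpotEnv(), HotSpotEnv()
passive_agent(a)
active_agent(b)
print(f"passive sweep: {a.probes} probes; active sensing: {b.probes} probes")
```

With n = 1024 the active agent needs about 10 probes, against roughly 512 on average for the passive sweep. Choosing what to sense is where the interaction savings come from.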

03 — The architecture in three views

Built for causality,
for language, for
lifelong learning.

— causal
A → B → C → D

Causal. Not probabilistic.

Knowledge is stored as explainable causal chains, not as an opaque parameter vector. No hallucinations by design.
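
As a rough picture of what knowledge stored as causal chains could mean in practice, consider the toy store below. The names (CausalLink, observe, explain) are our own illustrative inventions, not FractalBrain's representation.

```python
from dataclasses import dataclass, field

@dataclass
class CausalLink:
    """One explainable edge: cause leads to effect, plus the evidence count
    behind it, so every answer can be traced back along its chain."""
    cause: str
    effect: str
    support: int = 1  # observations consistent with this link

@dataclass
class CausalStore:
    links: list[CausalLink] = field(default_factory=list)

    def observe(self, cause: str, effect: str) -> None:
        for link in self.links:
            if link.cause == cause and link.effect == effect:
                link.support += 1
                return
        self.links.append(CausalLink(cause, effect))

    def explain(self, start: str, goal: str, seen=frozenset()):
        """Return a cause-to-effect chain from start to goal, or None.
        An absent chain means 'do not know', never a confident guess."""
        if start == goal:
            return [start]
        for link in self.links:
            if link.cause == start and link.effect not in seen:
                rest = self.explain(link.effect, goal, seen | {start})
                if rest:
                    return [start] + rest
        return None

store = CausalStore()
store.observe("rain", "wet ground")
store.observe("wet ground", "slippery road")
print(store.explain("rain", "slippery road"))  # ['rain', 'wet ground', 'slippery road']
print(store.explain("rain", "traffic jam"))    # None: no chain, nothing invented
```

The failure mode is the point: when no chain exists, explain returns None instead of a plausible-sounding fabrication. Hallucination-free behaviour falls out of the representation.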

— flame · hebbian
FIRE TOGETHER, WIRE TOGETHER

Flame. A Hebbian language system.

Our internal language model runs on a fractal substrate. Local updates, no backpropagation, a fraction of the energy of GPU-based LLMs.
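
In its textbook form, the motif above reads as a purely local update rule. The sketch below is plain Hebbian learning with passive decay, assumed here for illustration; it is not Flame's actual rule.

```python
import numpy as np

def hebbian_step(W: np.ndarray, pre: np.ndarray, post: np.ndarray,
                 lr: float = 0.01, decay: float = 0.001) -> np.ndarray:
    """One local Hebbian update: strengthen W[i, j] when pre-unit j and
    post-unit i fire together, and let every weight decay slowly. No error
    is propagated anywhere; each synapse sees only its two endpoints."""
    W += lr * np.outer(post, pre)  # fire together, wire together
    W -= decay * W                 # slow forgetting keeps weights bounded
    return W

# Toy usage: co-activate pre-unit 2 and post-unit 0 repeatedly.
W = np.zeros((4, 4))
pre = np.array([0.0, 0.0, 1.0, 0.0])
post = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(100):
    W = hebbian_step(W, pre, post)
print(W[0, 2])  # the co-active synapse has grown; every other weight is still zero
```

Because each update reads only the two endpoint activities, the rule costs one multiply-add per synapse and parallelises trivially, which is one reason a Hebbian substrate can run cheaply.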

— continual

Learns. Permanently.

Knowledge acquired during deployment is kept, forever. No retraining, no forgetting. The model grows as it is used.
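
A minimal sketch of what no boundary between training and inference means: every call both predicts and updates. The growing transition table below is our illustrative stand-in for the fractal substrate, not the real model.

```python
from collections import defaultdict

class GrowingPredictor:
    """Stand-in for a continually learning model: a transition table that
    grows new entries as it is used and is never frozen or retrained."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))
        self.prev = None

    def step(self, symbol: str):
        # Inference: predict the likeliest successor from current memory.
        successors = self.counts[symbol]
        prediction = max(successors, key=successors.get) if successors else None
        # Learning, in the same call: record the transition just witnessed.
        if self.prev is not None:
            self.counts[self.prev][symbol] += 1
        self.prev = symbol
        return prediction

model = GrowingPredictor()
for s in "abcabcabc":
    model.step(s)
print(model.step("a"))  # 'b': acquired during use, kept without any retraining
```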

04 — Two paradigms, side by side

Fractal networks vs.
neural networks.

Four head-to-head differences. The fractal column is what we believe; the neural column is the dominant paradigm we are departing from.

Topic            Fractal networks          Neural networks
Knowledge        Continually learning.     Frozen after training.
Context          Unlimited.                Bounded attention window.
Hallucinations   Eliminated by design.     Inherent to probabilistic generation.
Hardware         Single CPU core + RAM.    Heavy parallel GPU computation.

05 — The laboratory

An independent
research
laboratory.

FractalBrain LTD is a team of industry-hardened PhDs and engineers from DeepMind, Google, IBM, CERN, and DESY. The lab is the result of more than a decade of our own R&D across AI/ML, fractal theory, theoretical physics, and quantum computing.

London
Theory · World models
Vienna
Research · Abstraction
Bielsko-Biała
Engineering · Data centre

06 — Research roadmap

The road to
fractal networks.

2006
Hierarchical Temporal Memory

The first cortically inspired model. Sequence learning over a hierarchy of repeating regions.

2009
Cortical Learning Algorithm

Local, sparse, online learning at the column level. No backprop, no batches.

2011
Pattern Recognition Theory of Mind

A unifying theory: the neocortex as a hierarchy of pattern recognisers operating in parallel.

2015
Augmented Hierarchical Temporal Memory

HTM extended with attention, active sensing, and hierarchical credit assignment.

2017
Distributed Sequence Controllers

Options-based planning across many cortical columns: concurrent, hierarchical, asynchronous.

2022
Fractal Networks

One self-similar substrate. Growing topology, local Hebbian updates, causal world model — unified.

If a continually learning substrate sounds like the problem you want to spend the next decade on, get in touch.

Join the team