LGR
FOUNDING_STAGE // BERLIN

THE BRAIN RUNS ON 20 WATTS. WHY CAN'T AI?

We're building a company to solve the most important problem in AI infrastructure: energy efficiency. Our approach—analog in-memory computing—could deliver 100x better efficiency than today's GPUs. We're assembling the team to make it real.

THE_PROBLEM

AI is on track to consume 10% of global electricity by 2030. Current chips waste 90% of their energy just moving data around. We're running 21st-century AI on a 1940s computer architecture. This doesn't scale.

THE_BENCHMARK
20W

WHAT THE HUMAN BRAIN USES

THE_INSIGHT

Biology doesn't separate memory from computation—it does both in the same place. Analog in-memory computing uses the same principle. Compute happens where data lives. No shuttling bits. No wasted energy.
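The principle above can be sketched in a few lines. In a resistive crossbar, weights are stored as device conductances and inputs arrive as voltages; Ohm's law does the multiplies and Kirchhoff's current law does the sums, in place. The matrix sizes and values below are illustrative, not real device parameters:

```python
# Toy model of an analog in-memory matrix-vector multiply.
# G[i][j] is a stored conductance (the "weight"); v[j] is an input voltage.
# Each cell sources current G[i][j] * v[j] (Ohm's law), and the bit line
# sums those currents (Kirchhoff's current law) -- multiply and accumulate
# happen where the data lives, with no bits shuttled to a processor.

def crossbar_mvm(G, v):
    """Bit-line output currents: i[row] = sum_j G[row][j] * v[j]."""
    return [sum(g * x for g, x in zip(row, v)) for row in G]

# Example: a 2x3 conductance array driven by a 3-element voltage vector.
G = [[1.0, 0.5, 0.0],
     [0.2, 0.1, 0.3]]
v = [0.8, 0.4, 0.6]

print(crossbar_mvm(G, v))  # two output currents, one per bit line
```

The same operation on a digital chip would fetch every weight from memory before each multiply; here the "fetch" never happens.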

THE_OPPORTUNITY
100x

POTENTIAL EFFICIENCY GAIN

WHAT_WE'RE_BUILDING
APPROACH: ANALOG COMPUTE
ARCHITECTURE: IN-MEMORY
PROCESS: MATURE NODES
STATUS: FOUNDING
THE_OPPORTUNITY

AI IS HITTING AN ENERGY WALL.

Training GPT-4 consumed the equivalent of 50 million smartphone charges. Data centers are projected to consume 10% of global electricity by 2030. The industry is scrambling for solutions, but it's optimizing the wrong architecture.

Digital chips waste 90% of their energy moving data between memory and processors. Analog in-memory computing eliminates this bottleneck entirely. The physics has been proven in labs. The question is: who will build the company to commercialize it?
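The data-movement tax is easy to see on the back of an envelope. The figures below are commonly cited order-of-magnitude estimates for an older (~45 nm) process, used here only as assumptions; exact numbers vary by process node and design:

```python
# Rough energy budget per operation (assumed figures, not measured silicon):
DRAM_ACCESS_PJ = 640.0  # fetch one 32-bit word from off-chip DRAM (assumption)
FP32_MULT_PJ = 3.7      # one 32-bit floating-point multiply (assumption)

ratio = DRAM_ACCESS_PJ / FP32_MULT_PJ
print(f"One DRAM fetch costs as much as ~{ratio:.0f} multiplies")
```

Under these assumptions a single memory fetch costs two orders of magnitude more than the arithmetic it feeds, which is why an architecture that never moves the weights can win so much.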

WHERE_WE_ARE
STAGE: PRE-SEED / FOUNDING
THESIS: ANALOG IN-MEMORY COMPUTE
NETWORK: WORLD-CLASS RESEARCHERS
LOCATION: EUROPE (BERLIN)
STATUS: ASSEMBLING TEAM
WHY_NOW

The science is ready. Analog in-memory computing has been demonstrated in research labs at IBM, Stanford, and dozens of universities. The materials exist. The architectures are understood. What's missing is a focused team with the conviction to build a company around it—and the runway to get to first silicon.

BERLIN // EUROPE