THE BRAIN RUNS ON 20 WATTS. WHY CAN'T AI?
We're building a company to solve the most important problem in AI infrastructure: energy efficiency. Our approach—analog in-memory computing—could deliver 100x better efficiency than today's GPUs. We're assembling the team to make it real.
AI is on track to consume 10% of global electricity by 2030. Current chips waste 90% of their energy just moving data around. We're running 21st century AI on a 1940s computer architecture. This doesn't scale.
20 WATTS: WHAT THE HUMAN BRAIN USES
Biology doesn't separate memory from computation—it does both in the same place. Analog in-memory computing uses the same principle. Compute happens where data lives. No shuttling bits. No wasted energy.
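The "compute where data lives" idea can be made concrete with a small simulation. In an analog crossbar, weights are stored as conductances and inputs are applied as voltages; Ohm's law and Kirchhoff's current law then perform a matrix-vector multiply in a single physical step, with no data movement between memory and processor. The sketch below models that in NumPy; all values (conductance ranges, voltage levels, noise magnitude) are illustrative assumptions, not measured hardware figures.

```python
import numpy as np

# Simulated analog in-memory matrix-vector multiply.
# Weights live in place as a conductance matrix G (siemens);
# inputs are applied as voltages V. Each cell passes current
# G[i, j] * V[j] (Ohm's law), and the column wire sums those
# currents (Kirchhoff's current law), so the crossbar computes
# I = G @ V without moving the weights anywhere.

rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-4, size=(4, 8))   # stored weights as conductances
V = rng.uniform(0.0, 0.2, size=8)          # input activations as voltages

I_ideal = G @ V                            # currents read at the columns

# Real analog cells drift and vary; model that as small
# multiplicative noise on the stored conductances.
noise = 1.0 + rng.normal(0.0, 0.02, size=G.shape)
I_analog = (G * noise) @ V

print(I_ideal)
print(I_analog)
```

The noise term is the engineering trade-off: the multiply is nearly free in energy, but the result is approximate, which is why analog approaches target error-tolerant workloads like neural network inference.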
100X: POTENTIAL EFFICIENCY GAIN
AI IS HITTING AN ENERGY WALL.
Training GPT-4 consumed the equivalent of 50 million smartphone charges. Data centers are projected to consume 10% of global electricity by 2030. The industry is scrambling for solutions—but they're optimizing the wrong architecture.
Digital chips waste 90% of their energy moving data between memory and processors. Analog in-memory computing eliminates this bottleneck entirely. The physics has been proven in labs. The question is: who will build the company to commercialize it?
The science is ready. Analog in-memory computing has been demonstrated in research labs at IBM, Stanford, and dozens of universities. The materials exist. The architectures are understood. What's missing is a focused team with the conviction to build a company around it—and the runway to get to first silicon.