LGR
KNOWLEDGE_BASE

RESEARCH

Analog in-memory computing isn't new—it's been developed in research labs for decades. Here's the science we're building on and the people doing the work.

WHY_THIS_PAGE

We believe in transparency about where we're starting from. This page documents the foundational research, key papers, and leading groups in analog computing. These are the shoulders we're standing on—and in many cases, the people we're talking to about joining or advising.

FOUNDATIONAL PAPERS

"Memory devices and applications for in-memory computing"

Sebastian, A., Le Gallo, M., Khaddam-Aljameh, R. & Eleftheriou, E.
Nature Nanotechnology, 2020

The definitive review from IBM Research Zurich. Covers memristive devices, phase-change memory, and the computational primitives they enable.
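The computational primitive at the heart of these devices is analog matrix-vector multiplication: weights stored as conductances, inputs applied as voltages, outputs read as summed currents via Ohm's and Kirchhoff's laws. A toy NumPy sketch of the idea — the differential-pair encoding and Gaussian read-noise model below are illustrative choices, not taken from any specific device:

```python
import numpy as np

# Toy model of the analog in-memory primitive: a crossbar computes
# y = W @ x in one step by storing W as conductances and applying x
# as voltages; Ohm's law gives per-device currents and Kirchhoff's
# current law sums them along each column. The differential-pair
# encoding and Gaussian read noise below are illustrative choices.

def crossbar_matvec(weights, x, g_max=1e-4, noise_std=0.01, rng=None):
    """Simulate y = weights @ x on a noisy analog crossbar.

    weights: (m, n) matrix, assumed pre-scaled to [-1, 1]
    x:       (n,) input vector, applied as voltages
    """
    rng = np.random.default_rng() if rng is None else rng
    # Physical conductances cannot be negative, so each weight is
    # split across a positive and a negative device column.
    g_pos = np.clip(weights, 0, None) * g_max
    g_neg = np.clip(-weights, 0, None) * g_max
    # Per-device read noise.
    g_pos = g_pos + rng.normal(0.0, noise_std * g_max, g_pos.shape)
    g_neg = g_neg + rng.normal(0.0, noise_std * g_max, g_neg.shape)
    # Column current summation, then rescale back to weight units.
    return (g_pos @ x - g_neg @ x) / g_max

W = np.array([[0.5, -0.25], [0.1, 0.9]])
x = np.array([1.0, 2.0])
print(crossbar_matvec(W, x, noise_std=0.0))  # exact: matches W @ x = [0.0, 1.9]
print(crossbar_matvec(W, x))                 # noisy analog readout
```

The appeal of the analog approach is that the multiply-accumulate happens in the physics of the array itself, rather than as a sequence of digital operations.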

"Equivalent-accuracy accelerated neural-network training using analogue memory"

Ambrogio, S., Narayanan, P., Tsai, H., et al.
Nature, 2018

IBM's demonstration that analog crossbar arrays can train neural networks with software-equivalent accuracy. A key milestone for the field.

"Fully hardware-implemented memristor convolutional neural network"

Yao, P., Wu, H., Gao, B., et al.
Nature, 2020

Tsinghua-led collaboration demonstrating a full CNN on memristor hardware. Achieved image recognition with real analog devices.

"A 64-core mixed-signal in-memory compute chip based on phase-change memory"

Le Gallo, M., et al.
Nature Electronics, 2023

IBM's most recent chip: 64 analog compute cores achieving record efficiency. The closest thing to a commercial-ready analog AI processor.

EVENTS & CHALLENGES

IROS 2025 WORKSHOP ON EVENT-BASED VISION

Latest advances in event-based vision for robotics applications.

DETAILS →

EVENT-BASED STEREO SLAM CHALLENGE

DEADLINE: SEP 30, 2025

IROS 2025 Competition

DETAILS →

ESA ELOPE: LUNAR OPTICAL FLOW CHALLENGE

DEADLINE: AUG 31, 2025

Space applications of event-based vision.

DETAILS →

HARDWARE & DEVICES

EVENT_CAMERAS

INIVATION (DVS, DAVIS)

Pioneering sensors from the Institute of Neuroinformatics. DVS128 and DAVIS240/346 series.

PROPHESEE METAVISION

High-resolution event cameras with advanced software ecosystem. Up to 1280x720 resolution.

CELEPIXEL CELEX

High-resolution sensors including CeleX-V 1 Megapixel event camera.

WEBSITE →

NEUROMORPHIC_PROCESSORS

INTEL LOIHI

Spiking neural networks.

DYNAP (aiCTX AG)

256 neurons / 128K synapses.

IBM TRUENORTH

Large-scale neuromorphic computing.

SPINNAKER

Event-driven processing.
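All of these chips compute with spiking neurons rather than clocked matrix math. A minimal leaky integrate-and-fire (LIF) neuron, the basic unit of this style of processing, can be sketched in a few lines; the weight, leak, and threshold values below are illustrative, not those of any particular chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron. Weight, leak, and
# threshold values are illustrative, not those of any specific chip.

def lif_run(input_spikes, weight=0.6, leak=0.9, threshold=1.0):
    """Step one LIF neuron over a binary input spike train."""
    v = 0.0
    out = []
    for s in input_spikes:
        v = leak * v + weight * s   # leaky integration of weighted input
        if v >= threshold:          # fire and reset on threshold crossing
            out.append(1)
            v = 0.0
        else:
            out.append(0)
    return out

print(lif_run([1, 1, 0, 1, 1, 1, 0, 0]))  # → [0, 1, 0, 0, 1, 0, 0, 0]
```

Because the neuron only does work when a spike arrives, computation on these processors is naturally sparse and event-driven — the same property that makes them a good match for event cameras.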

RESEARCH APPLICATIONS

APPLICATION_01

SLAM & VISUAL ODOMETRY

Event-based simultaneous localization and mapping using temporal contrast changes for robust navigation.

METHODS: direct methods, feature tracking, visual-inertial fusion

APPLICATION_02

OPTICAL FLOW & MOTION

High-speed motion field estimation using event streams for real-time perception.

USES: drone navigation, collision avoidance, motion segmentation
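One classic way to recover motion from events is local plane fitting: events fired by a moving edge lie on an inclined plane t(x, y) in space-time, and the plane's slope encodes the edge velocity. A toy least-squares sketch on synthetic events — function name and data are ours, and real systems fit planes in small spatio-temporal neighborhoods rather than globally:

```python
import numpy as np

# Local plane fitting on event timestamps: events fired by a moving
# edge lie on an inclined plane t(x, y) in space-time, and the plane's
# slope encodes the edge velocity. Synthetic data; a toy sketch only.

def plane_fit_flow(xs, ys, ts):
    """Fit t = a*x + b*y + c by least squares; return velocity (vx, vy).

    Moving along the timestamp gradient (a, b) by 1/|grad| advances t
    by one unit, so the image velocity is v = (a, b) / (a^2 + b^2).
    """
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    (a, b, _), *_ = np.linalg.lstsq(A, ts, rcond=None)
    n2 = a * a + b * b
    return float(a / n2), float(b / n2)

# Edge moving at 2 px/s in +x: events occur at x = 2 * t.
ts = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
xs = 2.0 * ts
ys = np.array([0.0, 1.0, 0.0, 1.0, 0.0])
print(plane_fit_flow(xs, ys, ts))  # ≈ (2.0, 0.0)
```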

APPLICATION_03

3D RECONSTRUCTION

Monocular and stereo depth estimation using event cameras with structured light fusion.

TECHNIQUES: contrast maximization, deep learning, semi-dense
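Contrast maximization, listed above, warps events along a candidate motion and scores the candidate by how sharp the resulting event image is: the true motion stacks events from the same edge into the same pixels. A 1-D toy sketch with synthetic events — real implementations optimize richer warp models than a single velocity, and all names and values here are illustrative:

```python
import numpy as np

# Toy 1-D contrast maximization: warp events back along a candidate
# velocity, accumulate them into an image, and score the candidate by
# image variance; the true motion stacks events from the same edge
# into the same pixels. All data here is synthetic.

def contrast(events, v, n_bins=80):
    xs, ts = events
    warped = xs - v * ts                      # undo candidate motion
    hist, _ = np.histogram(warped, bins=n_bins, range=(-5.0, 15.0))
    return hist.var()                         # sharper image -> higher variance

rng = np.random.default_rng(0)
true_v = 3.0
ts = rng.uniform(0.0, 1.0, 200)
xs = rng.choice([2.1, 7.1], 200) + true_v * ts  # two edges moving at 3 px/s

candidates = np.linspace(0.0, 6.0, 61)
scores = [contrast((xs, ts), v) for v in candidates]
best = candidates[np.argmax(scores)]
print(best)  # close to true_v = 3.0
```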

APPLICATION_04

OBJECT RECOGNITION

Pattern recognition and object tracking using sparse event representations.

METHODS: HOTS, graph neural networks, attention mechanisms
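HOTS, listed above, builds its features from time surfaces: each pixel holds an exponentially decayed trace of its most recent event, turning a sparse event stream into a dense map that standard classifiers can consume. A minimal sketch — the sensor size, decay constant, and event list are toy values:

```python
import numpy as np

# Time surface in the spirit of HOTS: each pixel stores an
# exponentially decayed trace of its most recent event, turning a
# sparse event stream into a dense feature map. The sensor size, tau,
# and event list below are toy values.

def time_surface(events, t_query, shape=(4, 4), tau=0.1):
    """events: iterable of (x, y, t); returns an (H, W) surface at t_query."""
    last = np.full(shape, -np.inf)         # timestamp of latest event per pixel
    for x, y, t in events:
        if t <= t_query:
            last[y, x] = t                 # assumes events arrive in time order
    return np.exp((last - t_query) / tau)  # 1.0 = just fired, 0.0 = never fired

events = [(0, 0, 0.00), (1, 0, 0.05), (1, 1, 0.10), (0, 0, 0.10)]
S = time_surface(events, t_query=0.10)
print(np.round(S, 3))
```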

SPECIALIZED_DOMAINS
SPACE

Satellite tracking, debris monitoring, lunar navigation

AUTOMOTIVE

Driver monitoring, lane detection, collision avoidance

BIOMEDICAL

Retinal implants, eye tracking, neural interfaces

INDUSTRIAL

Quality control, vibration analysis, monitoring

KEY DATASETS

DRIVING_&_AUTOMOTIVE

DDD20

End-to-End Event Camera Driving Dataset

LINK →

DSEC

Stereo Event Camera Dataset

LINK →

PROPHESEE AUTOMOTIVE

Large-scale detection dataset

LINK →

RECOGNITION_&_CLASSIFICATION

N-MNIST & N-CALTECH101

Neuromorphic versions of classic datasets

LINK →

N-CARS

Real-world event-based car classification

LINK →

DVS128 GESTURE

IBM neuromorphic gesture recognition

LINK →

SOFTWARE & TOOLS

FRAMEWORKS

jAER

Java-based framework for real-time event processing

PROJECT →

PROPHESEE OPENEB

Open source event-based vision library

GITHUB →

TONIC

Event datasets and transforms (Torchvision-style)

GITHUB →

ALGORITHMS

EV-FLOWNET

Self-supervised optical flow estimation

GITHUB →

E2VID

Video reconstruction from event cameras

GITHUB →

RPG DVS ROS

ROS driver and tools for DVS

GITHUB →

LEADING RESEARCH GROUPS

EUROPE
INI

UZH & ETH Zurich

RPG

University of Zurich

EDPR

IIT Italy

NORTH AMERICA
GRASP LAB

UPenn

INTEL LABS

Loihi Development

UMD

Perception & Robotics

ASIA-PACIFIC
ICNS

Western Sydney

PEKING UNIVERSITY

Camera Intelligence Lab

KAIST

Visual Intelligence Lab

EDUCATIONAL RESOURCES

COURSES & TUTORIALS

EVENT-BASED ROBOT VISION (TU BERLIN)

Comprehensive course with videos and slides

YOUTUBE →

DVS TUTORIAL (TOBI DELBRUCK)

Introductory video series about DVS technology

YOUTUBE →

TELLURIDE NEUROMORPHIC WORKSHOPS

Annual workshops on neuromorphic engineering

WEBSITE →

RESOURCES

This directory draws from the community-driven Event-based Vision Resources repository.