Join Pascal as he explores the groundbreaking world of generic neuromotor interfaces with Jesse, Lauren, and Sean. Discover how these technologies enable control of devices with just a flick of the wrist, or even a simple intention to move. We'll discuss the role of AI in eliminating the need for personalised training, how non-invasive interfaces differ from their invasive predecessors, and the exciting implications for accessibility. Don't miss this deep dive into the future of human-computer interaction.
Got feedback? Send it to us on Threads (https://threads.net/@metatechpod) or Instagram (https://instagram.com/metatechpod), and don't forget to follow our host Pascal (https://mastodon.social/@passy, https://threads.net/@passy_). Fancy working with us? Check out https://www.metacareers.com/.
Timestamps
Intro 0:06
Jesse introduction 1:29
Lauren introduction 2:42
Sean introduction 3:29
Team's mission statement 3:49
What's a neuromotor interface? 4:24
Paper overview 5:29
Non-invasive interfaces 7:50
How to make it generic 9:42
Design tradeoffs 11:29
Real-world model performance 14:21
Feedback cycle 16:22
LLMs and EMG 17:22
Handwriting vision 18:39
Working with product 20:55
EMG for accessibility 22:25
How Meta helps 25:53
Open-source repos 28:02
What's next? 28:45
Outro 30:51
Links
A generic non-invasive neuromotor interface for human-computer interaction - Nature - https://www.nature.com/articles/s41586-025-09255-w
How the low-vision community embraced AI smart glasses - The Verge - https://www.theverge.com/the-vergecast/701018/ray-ban-meta-smart-glasses-be-my-eyes-ceo-accessibility-tech
MKBHD on Orion - https://www.youtube.com/watch?v=G0eKzU_fV00