33.2 A Sub-1μJ/class Headset-Integrated Mind Imagery and Control SoC for VR/MR Applications with Teacher-Student CNN and General-Purpose Instruction Set Architecture

Zhiwei Zhong, Yijie Wei, Lance Christopher Go, Jie Gu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Virtual Reality (VR) and Mixed Reality (MR) systems, e.g., Meta Quest and Apple Vision Pro, have recently gained significant interest in consumer electronics, creating a new wave of metaverse developments for gaming, social networking, workforce assistance, online shopping, etc. Strong technological innovations in AI computing and multi-modal human activity tracking and control have produced immersive, realistic virtual user experiences. However, most existing VR headsets rely only on traditional joysticks or camera-based user gestures for input control and human tracking, missing an important source of information, namely brain activity. Hence, there is growing interest in incorporating brain-machine interfaces (BMIs) into VR/MR systems for consumer and clinical applications [1]. As illustrated in Fig. 33.2.1, an existing VR/MR system integrated with EEG channels typically consists of a VR headset, a 16/32-channel EEG cap, a neural-recording analog frontend, and a PC for signal classification. Major drawbacks of such systems include: (1) cumbersome wear and poor user appearance, (2) lack of in situ computing support for low-latency operation, (3) inability to provide real-time mind-imagery control and feedback based on brain activity, and (4) high power consumption due to AI classification. To overcome these challenges, this work introduces a mind-imagery device that integrates into existing VR headsets without extra wearing burden, enabling mind-controlled BMI for VR/MR systems. The contributions of this work include: (1) an SoC supporting in situ mind-imagery control for VR/MR systems, (2) seamless integration with existing VR headsets and optimized selection of EEG channels to enhance user acceptance and experience, (3) a general-purpose instruction set architecture (ISA) with flexible dataflow, supporting a broad range of mind-imagery operations, (4) a confusion-matrix-guided teacher-student CNN scheme to save power during AI operations, and (5) sparsity enhancement on EEG signals to reduce energy. A 65nm SoC test chip is fabricated with in situ demonstrations of various mind-imagery-based VR controls. While prior works address EEG-based seizure detection or similar biomedical applications [2]-[6], this work focuses on emerging BMI in a VR/MR environment. Thanks to the low-power features and system-level optimizations of the design, the digital core of the SoC achieves an energy consumption below 1μJ/class for compute-intensive CNN operations.
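To make the confusion-matrix-guided teacher-student idea concrete, the sketch below shows one plausible interpretation, not the authors' implementation: a small student classifier handles every EEG window, and a larger teacher is invoked only when the student predicts a class that its validation confusion matrix flags as frequently confused. The student_infer/teacher_infer stand-ins, the example confusion matrix, and the ERROR_THRESHOLD parameter are assumptions introduced here purely for illustration.

# Illustrative sketch (assumptions noted above): confusion-matrix-guided
# teacher-student inference. The student runs on every window; the teacher
# is called only for classes the student is often wrong about.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLASSES = 4  # e.g., four imagined-movement classes (hypothetical)

def student_infer(eeg_window):
    """Cheap stand-in for the small student CNN: returns class probabilities."""
    logits = rng.normal(size=NUM_CLASSES)
    return np.exp(logits) / np.exp(logits).sum()

def teacher_infer(eeg_window):
    """Stand-in for the larger teacher CNN, used only on hard cases."""
    logits = rng.normal(size=NUM_CLASSES)
    return np.exp(logits) / np.exp(logits).sum()

# Hypothetical validation confusion matrix for the student
# (rows = true class, columns = predicted class).
confusion = np.array([
    [0.90, 0.05, 0.03, 0.02],
    [0.04, 0.70, 0.24, 0.02],
    [0.03, 0.25, 0.70, 0.02],
    [0.02, 0.03, 0.02, 0.93],
])

# Flag predicted classes whose column collects significant off-diagonal mass,
# i.e., classes the student frequently confuses with others.
ERROR_THRESHOLD = 0.10
hard_classes = {
    c for c in range(NUM_CLASSES)
    if (confusion[:, c].sum() - confusion[c, c]) > ERROR_THRESHOLD
}

def classify(eeg_window):
    """Run the student first; escalate to the teacher only for 'hard' classes."""
    pred = int(np.argmax(student_infer(eeg_window)))
    if pred in hard_classes:
        pred = int(np.argmax(teacher_infer(eeg_window)))
    return pred

if __name__ == "__main__":
    windows = [rng.normal(size=(16, 256)) for _ in range(8)]  # 16-ch EEG windows
    print([classify(w) for w in windows])

In a hardware realization, such gating would presumably decide whether the larger network's weights are fetched and its MAC operations executed at all, which is where the per-classification energy saving would come from.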

Original language: English (US)
Title of host publication: 2024 IEEE International Solid-State Circuits Conference, ISSCC 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 544-546
Number of pages: 3
ISBN (Electronic): 9798350306200
DOIs
State: Published - 2024
Event: 2024 IEEE International Solid-State Circuits Conference, ISSCC 2024 - San Francisco, United States
Duration: Feb 18, 2024 - Feb 22, 2024

Publication series

Name: Digest of Technical Papers - IEEE International Solid-State Circuits Conference
ISSN (Print): 0193-6530

Conference

Conference: 2024 IEEE International Solid-State Circuits Conference, ISSCC 2024
Country/Territory: United States
City: San Francisco
Period: 2/18/24 - 2/22/24

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Electrical and Electronic Engineering
