The SPQR Group at the University of Michigan works broadly on research problems pertaining to embedded security. We explore the research frontiers of computer science, electrical and computer engineering, and healthcare. Our latest projects examine how to protect analog sensors from intentional electromagnetic interference, acoustic interference, and light injection.
Light Commands is a vulnerability of MEMS microphones that allows attackers to use light to remotely inject inaudible and invisible commands into voice assistants such as Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri. In our paper we demonstrate this effect, successfully using light to inject malicious commands into several voice-controlled devices such as smart speakers, tablets, and phones, across large distances and through glass windows.
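To make the mechanism concrete, the sketch below amplitude-modulates a recorded voice command onto a laser diode's drive current, which is the conceptual core of the attack; the file name, bias current, and modulation depth are hypothetical placeholders rather than values from the paper.

```python
# Illustrative sketch (not the authors' code): amplitude-modulate a recorded
# voice command onto a laser diode's drive current so that a MEMS microphone
# transduces the light-intensity variations as if they were sound.
import numpy as np
from scipy.io import wavfile

BIAS_MA = 200.0        # hypothetical DC bias keeping the laser above threshold
MOD_DEPTH_MA = 150.0   # hypothetical peak modulation amplitude

def command_to_drive_current(wav_path):
    """Return (sample_rate, drive current in mA) for a voice-command WAV file."""
    rate, audio = wavfile.read(wav_path)
    if audio.ndim > 1:                      # mix stereo down to mono
        audio = audio.mean(axis=1)
    audio = audio.astype(np.float64)
    audio /= np.max(np.abs(audio)) or 1.0   # normalize to [-1, 1]
    # Amplitude modulation: the light intensity tracks the audio waveform
    # around a constant bias, so the microphone "hears" the command.
    current = BIAS_MA + MOD_DEPTH_MA * audio
    return rate, current

# Example (hypothetical file): rate, i_t = command_to_drive_current("command.wav")
```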
As cameras become increasingly pervasive, the security and privacy of computer vision systems are a focus of SPQR's research. Our work explores how different physical processes such as acoustics and motion are captured and interpreted by cameras, and how malicious parties may exploit these effects to intentionally adulterate vision systems' output or exfiltrate sensitive information from them. Our recent research shows that acoustic injection into cameras can cause accidents in autonomous vehicles.
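As a toy illustration of the effect (our own sketch, not the published attack code), the snippet below applies a horizontal motion-blur kernel to a grayscale frame, standing in for the blur that acoustically resonating a camera's stabilization hardware could impose on the input of a downstream detector; the kernel length is a made-up proxy for oscillation amplitude.

```python
# Toy simulation only: larger "oscillation amplitude" (kernel length) -> more blur.
import numpy as np

def motion_blur(frame, length=15):
    """Convolve each row of a 2-D grayscale frame with a horizontal box kernel."""
    kernel = np.ones(length) / length
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, frame)

# Example: blurred = motion_blur(gray_frame, length=25)
```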
Over the last six years, several papers have demonstrated how intentional analog interference based on acoustics, RF, lasers, and other physical modalities can induce faults in, influence, or even control the output of sensors. Damage to the availability and integrity of sensor output carries significant risks for safety-critical systems that make automated decisions based on trusted sensor measurements. Established signal processing models use transfer functions to express...
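As a minimal sketch of that transfer-function view, in our own notation rather than any one paper's, the measured output is the intended stimulus plus any attacker-injected interference, filtered by the sensor's frequency response:

```latex
% Notation assumed for illustration: X(s) is the intended physical stimulus,
% A(s) the attacker-injected interference, H(s) the sensor's transfer function,
% and Y(s) the digitized output the system ultimately trusts.
Y(s) = H(s)\,\bigl(X(s) + A(s)\bigr)
```

Under this simple model, an attacker who can place the energy of A(s) at frequencies where |H(s)| is large can dominate Y(s) with modest transmitted power.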