What is it about?
Modern smart homes can automatically detect things like doors opening, lights turning on, or water running—but today this usually requires electronics such as cameras, microphones, or smart devices that must be installed, powered, and connected to the internet. These approaches can be costly, complex for users, and raise privacy concerns. In this work, we present a new type of tiny metal “tag” that sticks onto everyday objects—such as doors, drawers, windows, and faucets—and makes them “smart” without using any electronics at all. When a tagged object moves, the tag naturally makes a quiet ultrasonic sound that people cannot hear. A wearable device, such as a watch with an ultrasonic microphone, can detect these sounds and recognize which object was used. This lets a home understand basic activities—opening a window, turning on the faucet, or using a cabinet—without recording speech, video, or any personal information. The tags cost only a few cents to make, require no power or batteries, and work with existing furniture. We also show how computer simulations can automatically design thousands of unique tags so each object can be recognized individually. This makes it possible to turn many ordinary objects into useful sensors, enabling more accessible, affordable, and privacy-friendly smart homes.
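To make the idea concrete, here is a minimal sketch of how a wearable might tell tags apart by their ultrasonic signatures. This is an illustration, not the paper's actual pipeline: the tag frequencies, sample rate, and frame size below are hypothetical values chosen for the example, and the detector is a simple FFT peak match standing in for whatever lightweight classifier the system uses.

```python
import numpy as np

# Hypothetical tag signature frequencies in Hz (illustrative only;
# the real tags' signatures are set by their designed geometry).
TAG_FREQS = {"door": 22_000.0, "faucet": 25_000.0, "cabinet": 28_000.0}

FS = 96_000   # sample rate of an ultrasound-capable microphone (assumed)
FRAME = 4_096 # samples per analysis frame (assumed)

def classify_frame(frame, fs=FS, tol_hz=500.0):
    """Match the strongest ultrasonic spectral peak in an audio frame
    to the nearest known tag frequency; return None if nothing matches."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    band = freqs >= 20_000  # ignore the audible band entirely
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    name, f0 = min(TAG_FREQS.items(), key=lambda kv: abs(kv[1] - peak_hz))
    return name if abs(f0 - peak_hz) <= tol_hz else None

# Simulate a frame in which the "faucet" tag rings at its signature
# frequency, buried in a little background noise.
rng = np.random.default_rng(0)
t = np.arange(FRAME) / FS
frame = 0.1 * np.sin(2 * np.pi * 25_000.0 * t) + 0.01 * rng.standard_normal(FRAME)
print(classify_frame(frame))  # prints "faucet"
```

The point of the sketch is that no learned model is strictly needed: a single FFT and a nearest-frequency lookup are enough to separate tags whose signatures are spaced well apart, which is what makes this kind of sensing feasible on low-power wearables.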
Featured Image
Photo by Jakub Żerdzicki on Unsplash
Why is it important?
Current smart-home sensing solutions are reaching a limit: they rely on electronics, wireless connectivity, and cloud-based systems that are increasingly difficult to trust, maintain, and scale. At the same time, society is demanding more ambient intelligence in homes—especially for aging-in-place, health monitoring, energy use, and everyday convenience—yet without sacrificing privacy or adding more devices that require maintenance. This work matters now because it points in a different direction: instead of adding more electronics, we show that physical object geometry itself can be the sensing interface. This transforms smart-home sensing from a hardware-and-machine-learning problem into a materials and design problem, enabling deployment at a scale that traditional IoT approaches cannot match. Our technique also challenges the assumption that sophisticated machine learning is required for activity recognition. By embedding distinguishable signals directly into each tag's physical form, we demonstrate that tens or even hundreds of different tags can be detected using lightweight computation, opening new opportunities for edge devices and low-power wearables. At a moment when society is questioning surveillance and data collection, our results highlight an alternative path where intelligence emerges from passive physical design rather than invasive sensing.
Perspectives
For me, this project started from a simple frustration with how complicated “smart” technology has become—too many devices, too many subscriptions, too much setup, and too many ways for personal data to be collected without notice. I wanted to explore whether a home could be made more helpful and intelligent without adding more electronics and without asking the user to trust another camera, microphone, or company. What surprised me during this research was how far we could push a very old idea—mechanical vibration—into something that feels futuristic. My hope is that readers take away a broader question: What else in our physical environment already contains, or could naturally generate, useful information waiting to be revealed and put to use? If this work can spark new directions toward more humane, quiet, and respectful technologies, then I will feel we contributed something meaningful beyond the technical results.
Yibo Fu
Georgia Institute of Technology
Read the Original
This page is a summary of: SoundOff: Low-cost Passive Ultrasound Tags for Non-invasive and Non-Intrusive Smart Home Sensing, Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, December 2025, ACM (Association for Computing Machinery), DOI: 10.1145/3770666.