We put smart devices in our homes to make things easier. I have an Echo Dot hooked up to an Amazon Smart Plug in my kitchen to voice-control a lamp that is inconveniently far from the front door. Life, hacked. But that Echo Dot doesn’t ease my mind. I consider it more of a concession to the Technostate and its growing body of information about me. This is just the way of it in 2019: We’re thankful for our smart gadgets, with their calming voices and cute quirks, while at the same time we give them a wary side-eye, making strained jokes about FBI wiretaps and the Terminator. When my Echo Dot randomly starts playing music, or utters a ghostly “Yes” into a silent kitchen, I tell it to shut up and think, not today, robots.
We suspect they’re always listening in and sometimes recording, and we have proof to confirm our suspicions. A Google search digs up endless articles on how to stop Alexa and Google from spying. You can hit mute buttons, change your wake word, and prevent your contacts list from syncing, even though these actions make it more annoying to use the device as intended. Or, you can trust that Alexa and Google only record and upload information to the cloud after you utter their wake words—“Alexa” or “O.K., Google”—like they are supposed to. No matter what you think, Amazon and Google are coy as hell about their privacy practices.
Ironically, it’s a robot that might make me feel calmer.
This week, Fast Company reported on Project Alias, a physical device that prevents Alexa and Google from eavesdropping on you. Created by designers Bjørn Karmann and Tore Knudsen, Alias sits on top of your Amazon or Google speaker, covering its microphones completely like a parasitic fungus, on which it is modeled. It plays a constant stream of white noise to prevent Alexa or Google from hearing anything; the white noise sounds kind of like all those garbled, staticky newscasts from spooky sci-fi movies. (It won’t be audible to you.) You train Alias via an app to recognize a wake word of your choice, which can be anything from “Dude” to “Hey, hamberders.” Then, when you use your wake word, Alias stops playing white noise and quietly whispers “Alexa” or “Hey, Google” to your device, prompting it to tune in from below. Once the device completes your task, the white noise starts back up.
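If you like to think in code, the jam-pause-whisper-resume loop described above can be sketched as a toy state machine. Everything here is a hypothetical stand-in: the real Project Alias runs on dedicated hardware with actual microphones, speakers, and on-device wake-word detection, not string matching.

```python
# Toy sketch of the Alias interception loop. All names are hypothetical;
# this only models the logic, not real audio processing.

class AliasSketch:
    def __init__(self, custom_wake_word, assistant_wake_word):
        self.custom_wake_word = custom_wake_word.lower()
        self.assistant_wake_word = assistant_wake_word
        self.noise_on = True  # white noise jams the speaker's mics by default

    def hear(self, utterance):
        """Process one utterance; return what the smart speaker's mics receive."""
        if utterance.lower().startswith(self.custom_wake_word):
            self.noise_on = False  # pause the jamming noise...
            command = utterance[len(self.custom_wake_word):].strip(" ,")
            # ...whisper the real wake word plus the user's request downward
            heard = f"{self.assistant_wake_word} {command}"
            self.noise_on = True  # resume jamming once the task is handed off
            return heard
        return "<white noise>"  # everything else is masked from the speaker

alias = AliasSketch("dude", "Alexa")
print(alias.hear("Dude, turn on the lamp"))       # forwarded to the speaker
print(alias.hear("private dinner conversation"))  # never reaches the speaker
```

The key design point survives even in this toy version: the commercial assistant only ever hears what Alias deliberately passes through, and the noise state flips back on as soon as the hand-off is done.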
With its white noise, Alias blocks Alexa and Google from listening to and potentially recording any conversations that aren’t directed at them. And it isn’t connected to the cloud, either. It’s like a fortress around your private info, a middleman standing between you and Big Tech. Or, just think of it like cool, high-tech fungus.
“[Fungus] is a vital part of the rain forest, since whenever a species gets too dominant or powerful it has higher chances of getting infected, thus keeping the diversity in balance,” Knudsen told Fast Company. “We wanted to take that as an analogy and show how DIY and open source can be used to create ‘viruses’ for big tech companies.”
That’s the downside: Currently, Alias only exists as open-source hardware and software, with no product you can actually buy. (You can download plans to build your own on the website.) But its designers are open to investment opportunities, and because one good idea spreads like, yeah, fungus, I’m guessing more parasitic products like it will start popping up. There’s definitely a market for them: in one survey conducted last year, 18 percent of households said they didn’t own a digital-assistant device, and half of those households said they didn’t trust these devices to protect privacy.
News about smart devices is rarely heartening. One family in Oregon had an Amazon device that recorded their conversations and sent them to someone on their contacts list without prompting, in a story that went viral in 2018. In 2017, a batch of Google Home Minis started recording everything they heard, also without prompting. Our trust in giant technology corporations like Amazon, Google, and Facebook is getting daily kicks to the gut. (Did you hear? That cute Facebook challenge to post a photo of yourself in 2009 next to a photo of yourself in 2019, because you glammed the fuck up, could hypothetically be a training exercise for artificial intelligence facial recognition software, Wired reports.)
Reluctant acceptance that the robots are going to be in charge soon isn’t a solution. Any way to claw back some sense of security is a relief…even if it means one more flipping device I need to buy for my home.