It’s been a rough year for the Internet of Things. Security researchers uncovered terrifying vulnerabilities in products ranging from cars to garage doors to skateboards. Outages at smart home services Wink and Google’s Nest rendered customers’ gadgets temporarily useless. And the Volkswagen emissions scandal, though not precisely an Internet of Things issue, exposed yet another risk of “smart” physical goods: manufacturers embedding software in their products designed to skirt regulations.
And those are only the most immediate concerns. The Internet of Things brings with it privacy concerns and compatibility headaches. There’s also the potential for the companies that make this stuff to go belly-up at any moment—as Wink’s parent company Quirky just did. In the worst case scenario, customers could be left with a house full of expensive, not-so-smart gadgets.
It’s enough to make you wonder whether it’s time to scrap the whole idea of smart things and get back to basics. After all, having to get out of bed to turn the heat down or switch off the lights is the ultimate First World problem. That could be part of why consumer interest in smart home products has been sinking, at least according to one report.
But the Internet of Things also holds tremendous potential to improve our health, make our cars safer and more efficient, and conserve both water and energy. The Internet of Things doesn’t have to be a nightmare of deceit, outages, and self-interested black boxes. To protect consumers and realize its true promise, the Internet of Things must go the way of the software and hardware that support the Internet itself: it must open up.
The Safety of Objects
Today, the vast majority of smart home gadgets, connected cars, wearable devices, and other Internet of Things inhabitants are profoundly closed. Independent researchers can’t inspect the code that makes them run. You can’t wipe the factory-loaded software and load alternative software instead. In many cases you can’t even connect them to other devices unless the manufacturers of each product have worked out a deal with each other.
Ostensibly, this is for your own protection. If you can’t load your own software, you’re less likely to infect your car, burglar alarm, or heart monitor with a virus. But this opacity is also what helped Volkswagen get away with hiding the software it used to subvert emissions tests. It makes it harder to trust that your thermostat isn’t selling your personal info to door-to-door salesmen or handing it out to the National Security Agency.
One of the biggest ironies of the Volkswagen case is that the Environmental Protection Agency actually fought rules that could have made it easier for independent researchers to catch the company’s cheating. The EPA reasoned that making it easier for the public to experiment with the software that runs emissions systems would make it easier for consumers to circumvent pollution controls. Clearly that approach backfired. We can’t know for sure that researchers would have found the Volkswagen defeat device earlier if the software had been more open, but it surely wouldn’t have hurt.
Critics could point to long-standing bugs in open source software as evidence that open source software can be less secure than the proprietary kind. After all, the Shellshock bug in Bash, a standard part of Linux and other open source operating systems, went undiscovered for 22 years. The problem, however, isn’t inherent in the openness; it’s that in many cases open source software has received less scrutiny from researchers because the financial incentive to do so just wasn’t there. No one was posting big bug bounties for Bash. By that same logic, merely putting code out in the open doesn’t make it more secure. It needs to be examined by people who know what they’re doing. As the Volkswagen case shows, openness and vigilance go hand-in-hand.
Companies argue that publishing the code that powers their products will hurt their competitiveness in the marketplace. Competitors, they say, will be able to copy what they do. But there are differing degrees of openness. Writing about the Volkswagen fiasco, sociologist and New York Times contributor Zeynep Tufekci recently pointed out that while the code that powers casino slot machines isn’t open to the public, it is audited by regulators. The real trick, as Tufekci points out, will be getting regulators more involved in auditing devices under real-world conditions, as opposed to in labs. While open source zealots would argue that providing the software that cars run on only to regulators and the researchers they hire isn’t enough, it would certainly be more open than what’s going on today.
“It’s a pity that casinos have better scrutiny of their software than the code running our voting machines, cars, and many other vital objects, including medical devices and even our infrastructure,” Tufekci wrote.
Making Internet of Things products more open would also make them better products. As the Wink and Nest outages have shown us, if the manufacturer’s cloud service goes down, often these products simply stop working. That’s particularly scary in light of Wink parent company Quirky’s bankruptcy. The Wink line will likely be sold quickly—supply chain company Flextronics has already made a bid—so customers will hopefully never even notice. But the tumult shows just how precarious the Internet of Things really is right now.
That fragility could disappear, however, if customers had more control over their gadgets. Even if a manufacturer disappears, consumers could re-program their devices with open source software that keeps them functioning. We’ve already seen this model succeed in the home networking world. OpenWrt is an open source operating system that consumers can install on network gear, such as WiFi routers. Since manufacturers aren’t always diligent about providing security updates for older products, OpenWrt not only adds features but also keeps those devices patched, improving security.
Most vendors probably aren’t keen on allowing customers to replace the operating systems on their products, but again, openness varies by degree. Simply providing an application programming interface—API for short—that developers and hobbyists can use to build links between one product and another would go a long way toward making the Internet of Things more robust.
Many smart home gadgets, such as the Amazon Echo, already provide some sort of API. But as software developer Matthew Garrett found out, those APIs are often limited. As he explained in a recent blog post, Garrett wanted to write software to make his Echo work with LIFX smart bulbs, but found that because the Echo’s API only works over the Internet, it couldn’t talk to the bulbs on his home network. He eventually found a workaround—essentially tricking the Echo into thinking his smart bulbs were a different brand that Amazon did support—but his experience shows how making products even slightly more open could increase their usefulness.
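Garrett’s actual code isn’t reproduced here, but the general shape of such a workaround is easy to sketch: a small shim that accepts commands in the format of a brand the hub already supports and translates them into calls for the unsupported device. The class names and command format below are invented for illustration and don’t correspond to any real product API.

```python
# Hypothetical sketch of a "brand emulation" shim. A local bridge accepts
# commands formatted for a supported device and forwards them to an
# unsupported one. All names and formats here are made up for illustration.

class UnsupportedBulb:
    """Stand-in for a smart bulb the hub doesn't officially support."""
    def __init__(self, label):
        self.label = label
        self.powered_on = False
        self.brightness = 0  # percent, 0-100

    def set_power(self, on):
        self.powered_on = on

    def set_brightness(self, percent):
        self.brightness = max(0, min(100, percent))


class SupportedBrandBridge:
    """Pretends to be a bulb of a brand the hub recognizes, and forwards
    each incoming command to the real (unsupported) bulb."""
    def __init__(self, real_bulb):
        self.real_bulb = real_bulb

    def handle_command(self, command):
        # Commands arrive in the supported brand's (invented) format,
        # e.g. {"action": "on"} or {"action": "dim", "level": 40}.
        if command["action"] == "on":
            self.real_bulb.set_power(True)
        elif command["action"] == "off":
            self.real_bulb.set_power(False)
        elif command["action"] == "dim":
            self.real_bulb.set_brightness(command["level"])
        else:
            raise ValueError("unrecognized command: %r" % command)


# The hub issues commands as if talking to a supported bulb; the bridge
# quietly translates them, and the hub never knows the difference.
bulb = UnsupportedBulb("living room")
bridge = SupportedBrandBridge(bulb)
bridge.handle_command({"action": "on"})
bridge.handle_command({"action": "dim", "level": 40})
```

The point of the sketch is how little glue is needed: with even a minimal local API on each side, interoperability is a short translation layer rather than a business deal between manufacturers.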
Until there’s consumer demand for more openness in the Internet of Things, companies will keep trying to exert control over their products. In some cases, they’re hoping to keep customers dependent, forcing them to pay monthly fees to keep their services working. In other cases, manufacturers are making misguided attempts to keep customers safe from themselves.
But the more trouble that emanates from the closed-off code embedded in an ever-increasing number of physical objects, the more the makers of those objects will struggle to shield themselves from calls for transparency. When code let loose in the physical world inflicts real harm, such as Volkswagen’s polluting cars, the creators of that code must be held accountable. It’s time to start demanding that smart things finally open up.