The sorry state of IoT security needs regulating, not only because it's disruptive to the Internet, but because it poses a threat to public safety.
Photograph by Rama, Wikimedia Commons, licensed under CC BY-SA 2.0 FR
With the Internet of Things already flexing its muscle and showing its potential to be a security nightmare, has the time come for governments to step into the fray and begin regulating the Internet? Security guru Bruce Schneier thinks that may be an inevitability, and says the development community might want to go ahead and start leading the way to ensure that regulations aren't put in place by people who don't understand tech.
"As everything turns into a computer, computer security becomes 'everything security,'" he explained, "and there are two very important ramifications of that. The first is that everything we know about computer security becomes applicable to everything. The second is the restrictions and regulations that the real world puts on itself are going to come into our world, and I think that has profound implications for us in software and especially in open source."
He was speaking on February 14, Valentine's Day, at the Open Source Leadership Summit in Lake Tahoe, giving his presentation by way of a Skype hookup. He couldn't be there in person, as he was in San Francisco to give a presentation at the RSA security conference.
These days, he noted, computers are about much more than desktops, laptops and servers. Everything from microwave ovens to thermostats to ATMs to automobiles is a computer, with most of them connected to the Internet. And although much of the media attention on security issues associated with IoT centers on massive DDoS attacks, script kiddies and email spam, these were barely mentioned in his talk. He sees much larger issues looming.
"Because the Internet is naturally empowering, it allows all good things to scale," he explained. "It [also] allows attacks to scale. The notion of a class break, that you could actually simultaneously hack a million devices and that things can work until they fail all at once, doesn't happen with noncomputerized systems in the same way -- and a lot of the intuition regular people have is based on that world. Of course, this is more dangerous as these systems get more critical. So we actually have to worry about crashing all the cars [or] disabling all the power plants."
The talk was peppered with this distinction between "us" -- meaning IT workers -- and "regular people," who are going to have to learn to adjust to a new paradigm of security brought about by technology's intrusion into "the real world."
"And unlike the real world," he continued, "we're not concerned about the average attacker. We're concerned about the Phi Sigma guy, the one smart person who will figure it out and write the software and distribute it to everybody else. That doesn't happen with lock picking in the same way."
He pointed out that many of IoT's security problems are well known, like the fact that IoT devices are designed with security as an afterthought. "A team will grab some library, possibly open source, probably some binary blob that no one knows what it does, get it working and then disband. A lot of these IoT devices can't be patched, even if there were engineers who could work on patches, which they're not."
Other issues might be less obvious, such as the intended lifetime of IoT devices, which is much longer than that of phones and computers, devices that tend to be replaced every two to five years. "You replace your DVR every five to 10 years and your refrigerator every 25 years. You replace your thermostat approximately never. So we have big market failures here. The buyers and sellers of that DVR don't care about security. They don't care that it's in a botnet. If it's cheap and it works, it's fine."
Add to this the fact that an estimated 5.5 million new devices go online each day -- that comes to about two billion per year -- and it's clear that, to paraphrase Apollo 13 astronaut Jack Swigert, "we have a problem here."
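That back-of-the-envelope figure checks out (the 5.5 million-per-day estimate itself is external, not something verified here):

```python
# Sanity check on the cited device-growth estimate:
# 5.5 million new devices per day, scaled to a year.
devices_per_day = 5_500_000
devices_per_year = devices_per_day * 365

print(f"{devices_per_year:,}")  # just over 2 billion
```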
"What we are going to see is increased government involvement," Schneier said. "We are going to see, in our world. people passing laws and putting restrictions on us, because that's what happens in the world of dangerous things."
Although Schneier calls the avenues that are open to governments a "limited toolbox," he points to a broad range of actions that could be taken. These include the regulation of products as well as product categories, the institution of licensing requirements for both individual developers and products, and the introduction of testing requirements. After products make it to market, they can be covered under liability laws, and fines can be levied against companies that try to skirt regulations.
The European Union has already begun dealing with the IoT issue. Schneier pointed to the adopted "CE" standard for vulnerability disclosure and noted that the EU is working on another for patch management. In the U.S., however, where regulation is regarded less favorably, Washington has been hesitant to deal with the issue.
Schneier proposes "a new government regulatory agency. That we in the United States need a new agency to figure this all out." He's calling for developers, specifically developers of open source software, "to start getting involved in policy," both to insure that regulations aren't passed that effectively put an end to open source, which he sees as a possibility, and to insure that unworkable regulations aren't put in place by politicians who don't understand technology.
"I know I'm speaking to programmers, but for the past bunch of decades we've all had this special right to code the world as we saw fit," he said. "My guess is that we're going to lose that right, because it is too dangerous to give to a bunch of techies. That means we need to get involved in policy. As Internet security becomes 'everything security,' Internet security technology becomes more important and security policy becomes more important. The policy issues, I think, are more important than the tech, and we will never get the policy right if policy makers get the technology wrong."