Biometric identification. (Image: Thinkstock)

IoT privacy and the tricky question of data ownership

IoT privacy advocates ask: If data is the new currency, why are so many people giving it away for free?

“The Party seeks power entirely for its own sake. We are not interested in the good of others; we are interested solely in power, pure power.” — George Orwell in 1984

Collect data first, ask questions later. That seems to be the unofficial motto for many organizations when it comes to data gleaned from IoT devices and other connected hardware like smartphones.

But the fact is, we now live in a big data world, and organizations everywhere are realizing unexpected benefits from data gleaned from IoT devices. “My view generally is all of this data is being used against us,” said Ari Scharg, a Chicago-based partner at the law firm Edelson PC. “Companies might say they are using it to benefit their customers, but that’s just a tagline. What they really mean is that they want to know everything they can about their customers’ personal lives so that they can predict and capitalize on their needs and behaviors.”

The market for such insights is considerable. The U.S. data brokerage industry generated some $202 billion in revenue in 2014, according to the Direct Marketing Association, and it’s poised to expand in tandem with the Internet of Things. “Just think about how valuable the data collected by your smart refrigerator about what you eat and drink every day would be to a life insurance company that is pricing your policy,” Scharg explained. “In many ways, personal data is being weaponized and used to help corporations manage their own risk.”

IoT data, crime and punishment 

One example of IoT data being used in unexpected ways comes courtesy of Ross Compton, an Ohio man whom police have charged with aggravated arson and insurance fraud. Compton claimed a fire broke out in his house and that he quickly grabbed a few things before escaping. Police were suspicious early on, according to a Washington Post article: he didn’t just grab a few things, he grabbed 15, which seemed like a high number for someone fleeing a fire. Investigators smelled gasoline at the site where Compton’s $400,000 house had burned and also identified gasoline on his shoes, hands and shirt. They further noted that the fire appeared to have broken out in multiple locations across the house, potential additional evidence of arson.

But the most damning evidence against Compton came courtesy of biometric data gathered from his pacemaker, a medical device that is an early example of the Internet of Things in medicine. Investigators got a search warrant to download data from the connected device. According to court documents cited by the Middletown Journal-News, a cardiologist, after examining the pacemaker data, testified that “it is highly improbable Mr. Compton would have been able to collect, pack and remove the number of items from the house, exit his bedroom window and carry numerous large and heavy items to the front of his residence during the short period of time he has indicated due to his medical conditions.”


Compton has pleaded not guilty, and the judge in the case has ruled that the pacemaker data can be used at trial. The defendant’s attorney had argued that, in obtaining the data, police violated Compton’s Fourth Amendment right against unreasonable searches and seizures.

No matter the outcome, the case could be a harbinger of things to come in the legal field as IoT data becomes an important class of evidence, explained Peter Tran, RSA's Advanced Cyber Defense general manager and senior director. “There's a forensics principle known as Locard's exchange principle that applies across the board in cyber, including IoT,” Tran said. “The principle basically states that whenever criminals enter or leave a room, they leave something behind, whether they see it or not. In essence, the cyber version of the principle states that whenever data or a connection is made or transmitted, there is trace evidence left behind, whether it is at rest or in transit, stored or in other memory.”
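As a toy illustration of the digital trace evidence Tran describes, consider how much a single connection leaves behind in an ordinary web-server access log. The log line, field names and regular expression below are hypothetical, invented for this sketch, not drawn from any real investigation:

```python
import re

# A single hypothetical access-log entry: even one connection leaves
# behind an IP address, a timestamp and the resource requested.
LOG_LINE = '203.0.113.7 - - [12/Sep/2017:10:03:44 +0000] "GET /device/sync HTTP/1.1" 200 512'

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d+) (?P<size>\d+)'
)

def extract_traces(line):
    """Pull the 'trace evidence' fields out of one access-log entry."""
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None

traces = extract_traces(LOG_LINE)
print(traces["ip"], traces["timestamp"], traces["path"])
```

Multiply one such line across every router, cloud service and device a person's data touches, and Locard's principle in its cyber form becomes clear.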

The threat of capturing extremely personal information

Another notable case study of what can go wrong in terms of IoT privacy comes courtesy of the We-Vibe, an internet-connected vibrator. Its manufacturer, Standard Innovation, recently agreed to a $3.75 million class action settlement related to the data it had collected from tens of thousands of customers. The lawsuit alleged that the product logged the date, time and duration of each session, along with the device’s temperature, on its cloud server, correlating the data with the email addresses of registered users. The manufacturer denies any wrongdoing.

Scharg helped represent the Chicago-area woman who brought that lawsuit.

The We-Vibe device was also featured at the 2016 Defcon security conference in a session titled “Breaking the Internet of Vibrating Things.” A pair of security researchers who went by the names g0ldfisk and follower explained that they had succeeded in reverse-engineering the wireless functionality of the We-Vibe device. The duo was also able to launch a man-in-the-middle attack to control the device and intercept data from it. The researchers pointed out that the device’s geo-tracking capability posed a risk to users where such products are illegal, which is the case in a handful of countries and, in the U.S., in Mississippi and in one town in Georgia. The two noted that the privacy policy for the device states that the manufacturer “reserve[s] the right to disclose your personally identifiable information if required to by law.”

Facial recognition in the public and private sectors 

A further hot-button IoT privacy issue is facial recognition, which works by matching patterns in faces captured by IP cameras found throughout public spaces, from airports and train stations to retail and corporate locations.
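A minimal sketch of the matching step, assuming faces have already been reduced to embedding vectors by an upstream model. The names, vectors and 0.9 threshold below are invented for illustration; production systems use far higher-dimensional embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hypothetical 4-dimensional embeddings (real systems use 128+ dimensions).
enrolled = {
    "alice": [0.9, 0.1, 0.3, 0.0],
    "bob":   [0.1, 0.8, 0.0, 0.5],
}
probe = [0.88, 0.12, 0.28, 0.05]  # embedding computed from a camera frame

# Identify the probe as the closest enrolled face, if it clears a threshold.
name, score = max(
    ((n, cosine_similarity(probe, e)) for n, e in enrolled.items()),
    key=lambda t: t[1],
)
match = name if score > 0.9 else None
print(match, round(score, 3))
```

The privacy concern follows directly from the mechanism: once a face is enrolled, every camera frame anywhere can be scored against it.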

The potential privacy implications of the technology, whose roots stretch back decades, came to the fore recently in a pilot project at a train station in Berlin, Germany. While officials stated that the program was intended to fight terrorism and crack down on graffiti, the technology “holds an enormous potential for misuse,” according to Maja Smoltczyk, Berlin’s commissioner for data protection.

Former Rep. Jason Chaffetz (R-Utah) reached a similar conclusion at a March hearing of the full House Committee on Oversight and Government Reform. While acknowledging that facial recognition could be a powerful law enforcement tool, he warned that it could be used by bad actors to “harass or stalk” individuals, that it could “chill free speech” and that it could potentially “track people’s location throughout the day.”

According to the Guardian, roughly half of U.S. adults are in the FBI's facial recognition database "without their knowledge or consent." “Imagine how valuable that database would be if [the FBI] had the facial recognition of every single American in their system,” Chaffetz said at the March hearing. “What does it mean if that information was to get into the wrong hands?”

In the rough-and-tumble world of the internet, the answer to such questions may depend on whom you ask.

Data brokerage and biometrics set to grow

Given the proliferation of data generated by IoT devices, smartphones and elsewhere, data brokerage seems destined to play an even more important role in the economy. A recent report summary from Transparency Market Research acknowledges that “customers are unaware of data being collected, thus privacy issues occur consistently. This is expected to provide hurdles in the growth of the data brokers market.”

According to Gartner, three-fourths of analytics technologies “will incorporate 10 or more data sources from second-party partners or third-party providers.”

For IoT globally, the major challenge for biometric data is the massive volume of collection involved, which complicates maintaining appropriate access, authorization and authentication, Tran said. “If the collection sample isn’t sufficient and accurate at the point of collection, the biometric ‘compass’ becomes fuzzy and gets far off course very quickly as biometric data repositories build over time, with IoT being a force multiplier,” he added. “Error rates, misidentification and exposure to nefarious manipulation of data become serious problems.”
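Tran's warning about error rates compounding as repositories grow can be made concrete. Assuming, purely for illustration, independent comparisons and a fixed per-comparison false match rate, the chance that a one-to-many search returns at least one false match climbs rapidly with database size:

```python
def prob_false_match(fmr, db_size):
    """Probability that a one-to-many biometric search returns at least
    one false match, assuming independent comparisons with a fixed
    per-comparison false match rate `fmr` (a simplifying assumption)."""
    return 1 - (1 - fmr) ** db_size

fmr = 1e-6  # an optimistic, illustrative per-comparison false match rate
for db_size in (1_000, 1_000_000, 100_000_000):
    print(db_size, round(prob_false_match(fmr, db_size), 4))
```

Even at a one-in-a-million per-comparison rate, searching a database the size of the FBI's becomes near-certain to surface false matches, which is why collection quality at the point of capture matters so much.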

Given the vast amounts of data available, the lines between government and private-sector databases are already blurring in some cases, with law enforcement and government agencies feeding data from social media, for instance, into databases that include biometric data.

There is, of course, the argument that people shouldn’t worry about being tracked if they are not doing anything wrong. “This is something that often comes up in the context of geolocation information,” Scharg said. “Some people might say: why would I care if a company is tracking where I am going? I’m not going anywhere sketchy. Putting aside the major invasion of privacy, the bad guys that hack your data don’t care about where you go. They care about when you leave your house in the morning and when you leave work to come home,” he added. “The idea that we shouldn’t care about being tracked if we’re not doing anything wrong is crazy, and frankly, it’s un-American.”
