In an age of IoT, your objects may be smart, but how smart are you when it comes to legal implications?

According to the Altimeter Group, 87% of people have not heard of the term ‘Internet of Things’ (‘IoT’). However, it is safe to say they are already immersed in it, and that this immersion will only become deeper as the years progress. After all, ATMs were amongst the first IoT objects, making their connection debut in 1974. Today, smart objects range from watches, thermostats and medicines to even liquor, with Johnnie Walker’s launch of interactive bottles just last year.

The IoT comprises all the objects connected to the Internet and/or to each other. Used conceptually, the IoT represents a vision in which the physical world becomes one with the digital; in which objects communicate with their users, other devices and databases to possess an ambient intelligence.[1] According to Gartner, last year we had 4.9 billion things connected to the Internet. By 2020, no fewer than 50 billion things will be connected.

We are virtually swimming in a veritable sea of unseen signals, but how well-versed are we in this ocean’s risks and safety procedures? Our life jackets and buoys can be found in an awareness of the IoT’s legal implications.

The International Data Corporation, Intel and the United Nations estimate that by 2020, there will be 200 billion objects in the IoT ecosystem, representing about 26 smart objects for every person in the world, as well as an increasingly interconnected processing of information. What does this mean? More cyber risk, with wider-reaching consequences should systems be infiltrated; and more competition, for as companies fixate on market share, the security of IoT technologies may be remembered only when it is too late.[2]

Nissan was publicly embarrassed when it was forced to shut down a mobile app for its electric cars after it emerged that the app's interface could be hacked to control a vehicle's functions remotely. Although these were non-critical functions, such as climate control and battery charge, there was potential for historical driving data to be accessed.

The lesson? Secure your devices before they are connected. As the US Federal Trade Commission has noted, this is especially important in a climate where the same chips and drivers can be found in many different products made by many different manufacturers: a weak spot in one product can be targeted across a larger class. And where security is installed, make sure security certificates are not allowed to lapse, as happened with home automation company Wink's device hub.

A standard trip to buy shoes could become an exercise in (unwitting) data generation. Without your needing to connect to the store's Wi-Fi network, your shopping time, number of visits and the items you were near could all be passively measured 'without interrupting [your] shopping experience', as one analytics provider (Euclid Analytics) cheerily puts it. Customers can, of course, actively participate in data collection by downloading a store's app, receiving a tailored shopping experience but allowing their location, online search activity and potentially social network information (if such an account is used to sign in to the store's Wi-Fi) to be shared with the store and its third parties.[3] Even so, a 2015 study by the Altimeter Group found that 48% of consumers were 'highly uncomfortable' with companies using their data in physical spaces ranging from their bodies to public institutions.

Companies offering connected products need to earn consumers' trust before collecting and sharing their data and accessing their networks. This is a crucial step before the benefits of being connected can be reaped by all involved.

In the IoT age, digital mishaps may have very physical ramifications. Smart pills might fail to feed information to a monitoring doctor, self-driving cars might malfunction (or be hacked), water-pipe monitors might glitch and fail to warn of a burst. In these situations, is it the supplier, manufacturer, developer or individual who is at fault? And to what extent? How far down the supply chain should one look to apportion blame? Liability is at large in the IoT.

According to IHS, shipments of wearable technology in the healthcare sector are estimated to reach 210 million by 2018. Providers need to be aware not only of the $30 billion in revenue this will generate, but also of the risks that will accumulate. Indeed, apps have already been recalled for incorrectly calibrating and monitoring medical conditions.[4]

Surrounded by smart objects, how smart do users and companies need to be? And what should we do when the intelligence of data collection is pitted against the intelligence of physical and informational safety?