The open-data economy is growing rapidly, with governments around the world publishing large, anonymised data sets.
While this presents immense opportunities not only for marketers and businesses but for all industries, sharing data also raises serious challenges.
A recent re-identification incident involving fitness app Strava made headlines around the world, bringing the challenges of an open-data economy to the forefront.
The Strava issue
The app’s visualisation of data – in the form of heatmaps – is public and draws on data from users across the globe. These heatmaps led Nathan Ruser, a 20-year-old university student, to identify secret US military bases.
“The Strava map has a beautiful way of showing all these incredibly large number of data points. I thought, maybe those data points showed something interesting,” Nathan Ruser told the ABC.
By looking at routes used by military personnel while exercising and overlaying them onto isolated parts of the Middle East and known US military bases, Ruser discovered he could also identify secret locations, a discovery he then shared on Twitter.
Others went a step further and realised they could identify individuals at those secret locations.
While this was not a conventional data breach (tracking and heatmapping are part of Strava’s terms of service), it certainly raises concerns about the safety and potential consequences of open data and data sharing.
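The mechanism behind the discovery is worth spelling out: a heatmap aggregates anonymous GPS points into grid cells, so no single point names anyone, yet repeated activity in an otherwise empty area lights up unmistakably. The following is a minimal sketch of that effect with entirely invented coordinates; the grid size, point counts, and location are illustrative assumptions, not Strava's actual method.

```python
from collections import Counter
import random

def heatmap(points, cell=0.01):
    """Aggregate GPS points into grid cells, heatmap-style: each cell's
    value is how many points fell inside it. No user identities involved."""
    return Counter((round(lat / cell), round(lon / cell)) for lat, lon in points)

random.seed(0)
# Sparse background activity scattered over a large, mostly empty region
# (made-up coordinates for illustration).
background = [(random.uniform(33.0, 34.0), random.uniform(43.0, 44.0))
              for _ in range(200)]
# Personnel jogging the same short perimeter loop at one site, day after day.
base_lat, base_lon = 33.4567, 43.1234
loop = [(base_lat + random.uniform(-0.003, 0.003),
         base_lon + random.uniform(-0.003, 0.003))
        for _ in range(500)]

grid = heatmap(background + loop)
hottest_cell, count = grid.most_common(1)[0]
# The brightest cell sits over the site, even though every
# individual point in the input was anonymous.
print(hottest_cell, count)
```

The 200 background points are spread across roughly 10,000 cells, so almost every cell holds zero or one point; the 500 loop points land in a handful of cells, which is exactly the kind of hotspot an observer can overlay on satellite imagery.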
The open-data economy promises a world of opportunities especially for business, including improved customer experience, building stronger relationships and increased profitability. However, it can also have profound consequences.
“Something done with the best intentions, and that is quite a clever use of data, will have unintended consequences,” DGA CEO Jodie Sangster told the ACS.
“As much as we think through how data is going to be used, I am not sure we can foresee every use that’s going to come out of it.”
As companies increase their collection of data sets and advance their data practices, it is also becoming increasingly difficult to guarantee that data cannot be re-identified, according to Sangster.
Most Australian companies refer to the Privacy Act for guidance around data collection, which does not necessarily address this issue. To combat this, and to prepare for the open-data economy, Data Governance Australia (DGA) has released a new Code of Practice incorporating provisions around transparency and ethical use of data. The Code of Practice is intended to encourage organisations to think about the consequences of their data practices and the risks posed by re-identification.
The Code of Practice requires companies to take reasonable steps to ensure that their data practices cannot reasonably be considered unethical – that is, the use of personal information:
(1) that causes harm to an individual,
(2) for purposes that are not consistent with the context in which the personal information was collected, or that would be considered inappropriate by reasonable community expectations, or
(3) which does not otherwise accord with reasonable community expectations.
While ‘ethical use of data’ and ‘reasonable community expectations’ remain difficult concepts to define with precision, DGA and its members are leading the charge, with a special Advisory Committee developing a framework for assessing unethical use of data and community expectations.
The Strava incident demonstrates the importance of companies keeping ahead of rapidly changing technologies and data practices to ensure the protection of their data and to minimise unintended consequences that can cause harm to businesses, consumers and governments alike.
Read the DGA Code of Practice here.