
By Alison Donnellan | 27 January 2020 | 7 min read

If data is the new oil, then consumer privacy is the refinery that transforms the product into the safe and useable end result. As we move further into the Digital Revolution, new and innovative solutions are needed to ensure we’re protecting one of our most valuable resources.

 

In 2017, it was revealed that Cambridge Analytica had collected and analysed the Facebook data of millions of individuals without their explicit consent, allegedly using this information to manipulate the outcome of the 2016 United States Presidential election.

While this breach of trust succeeded in bringing the issue of data privacy and misuse to the forefront of company policies, there is no clear and uniform understanding of how to effectively manage and protect user data.  

We spoke to three data privacy experts at CSIRO’s Data61 about why data privacy should be a significant and constant concern for organisations, and how they, along with individuals, can preserve the privacy of users.

 

What is consumer data privacy and data security? 

According to Hugo O’Connor, Senior Engineer at Data61, the terms ‘data privacy’ and ‘data security’ are often conflated. “Data security relates to protecting data, almost as a defensive mechanism, whereas privacy is the selective revelation of personal information and the ability to choose anonymity. It’s always advisable for organisations to implement both security and privacy measures, rather than just one,” explains O’Connor.

Multiple organisations have recently experienced data breaches in which users’ personal information was leaked publicly or stolen for malicious activity. While no security system is completely invulnerable, these breaches reveal gaps in cyber security measures, such as data encryption, and a failure to adequately protect sensitive information.

However, there are a plethora of companies that inappropriately access user data bearing little to no relevance to the product or service they offer, a knowing act of data privacy negligence. It was revealed in 2013 that a host of well-known and regularly used mobile applications were collecting private user data, with the torch app Brightest Flashlight requiring device ID and geolocation information to operate.

According to Dr Adnene Guabtni, Senior Research Scientist at Data61, organisations do not have a business incentive to keep users’ data private. “Organisations could be more private than they are now, but we’re at a point where business models rely on customer data being sold to advertisers, and services are being built on analytics from that data.”

“There needs to be a full-scale revamp of how data is acquired and managed on a broad scale, so organisations are transparent about how, what and why they collect, giving consumers more oversight,” notes Dr Guabtni.

 

Why consumer data privacy is so important for a business and how to protect it

Correctly managing consumer data privacy should be in the business interest of organisations globally. According to Professor Dali Kaafar, Information Security and Privacy Group Leader at Data61, research has shown that two out of three people in Australia would switch to a competitor if they found out that a business had been unsafe with their personal data.

“This goes to show just how much change there would be if individuals really knew how their data was being stored and used, and why it’s in their interest to nail their privacy processes.”  

Organisations should be completely responsible for protecting consumer data, argues O’Connor, a condition that requires not only a change to the framework of a business, but also the culture and day-to-day processes of the workplace.  

“It’s not unheard of for data to be stored in an unencrypted Excel spreadsheet and sent to multiple staff members via email. This is blatantly unsafe for a multitude of reasons, so organisations urgently need to educate their teams on protecting data – including safe methods of sharing it, and only sharing it with approved viewers.”
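As a minimal sketch of the kind of practice O’Connor describes, the snippet below encrypts a data file before it is shared, using the third-party Python cryptography package; the file names are purely illustrative and not from the article. In practice, the key would be passed to approved recipients through a separate, secure channel rather than alongside the data.

```python
# Illustrative only: encrypt a data file before sharing it, instead of
# emailing a plain spreadsheet. Requires the 'cryptography' package.
from cryptography.fernet import Fernet

# Generate a key once; share it with approved recipients via a separate,
# secure channel (never in the same email as the data).
key = Fernet.generate_key()
fernet = Fernet(key)

with open("customer_records.csv", "rb") as f:        # hypothetical file
    encrypted = fernet.encrypt(f.read())

with open("customer_records.csv.enc", "wb") as f:
    f.write(encrypted)

# Only an approved recipient holding the key can recover the data.
original = Fernet(key).decrypt(encrypted)
```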

Organisations also need to improve the methods used to obtain consent from customers to collect their data. The current ‘End User Licensing Agreements’ fall short of accurately explaining to subscribers how their personal information will be used.

“End User Licensing Agreements are often lengthy and long-winded, with users more often than not refusing to read them,” explains O’Connor. “People would be shocked by what they have actually agreed to if they did.” 

“Organisations need to be short, sharp and snappy on what exactly users are consenting to, must provide a receipt reaffirming to users what they’ve agreed to, and afford them an avenue to rescind consent.”
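The article doesn’t prescribe a format for such a receipt, but as an illustrative sketch, a consent record could be as simple as the following Python data structure, with hypothetical field names, recording what was agreed to, when, and whether it has since been rescinded.

```python
# Hypothetical consent receipt: a plain record of what a user agreed to,
# when, and for what purpose, with a way to rescind that consent later.
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentReceipt:
    user_id: str
    purposes: list[str]          # e.g. ["product analytics"]
    data_categories: list[str]   # e.g. ["email", "postcode"]
    granted_at: datetime
    revoked_at: Optional[datetime] = None

    def revoke(self) -> None:
        """Record that the user has withdrawn their consent."""
        self.revoked_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.revoked_at is None

receipt = ConsentReceipt(
    user_id="user-123",
    purposes=["product analytics"],
    data_categories=["postcode"],
    granted_at=datetime.now(timezone.utc),
)
receipt.revoke()          # the consumer exercises their right to withdraw
print(receipt.active)     # False
```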

 

Considerations when building a consumer privacy preserving system 

Collecting user data involves a high degree of risk for all companies, in particular small businesses. “Small companies may not have the ability to build infrastructure that will keep it secure and protected,” says Guabtni, who advocates for a simple solution.  

“The less data you collect, the less vulnerable you’ll be, and the less of a target you’ll be for malicious actors. While the large tech giants would love to collect more data, they have more resources to store it safely and protect it as much as possible. So, if you insist on collecting data, the next step is to consider how granular you need it to be – for instance, do you really need exact GPS coordinates, or can you achieve the same results only collecting postcode, region or city data?

“If you do need granular data for analytics, my recommendation would be to collect the data locally on the device, rather than on a central server, and run analytics at the user level, such as for personalising a website or app. In cases where this isn’t possible and big data analysis is required across users, the data does need to be centralised, so the recommendation there would be to ensure it’s all anonymised, collected with explicit consent, and stored securely.” 
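As a rough illustration of the two steps Dr Guabtni describes, the sketch below coarsens location data and drops direct identifiers before a record is sent to a central server. The field names are hypothetical, and stripping identifiers like this is only one ingredient of anonymisation; explicit consent and secure storage are still required.

```python
# Hypothetical field names. Two steps before any central collection:
# reduce location granularity and drop direct identifiers entirely.
def prepare_for_central_analytics(record: dict) -> dict:
    """Coarsen location and keep only the fields analytics actually needs."""
    return {
        # Rounding to 2 decimal places keeps roughly suburb-level
        # location (about 1 km) instead of an exact GPS fix.
        "lat": round(record["lat"], 2),
        "lon": round(record["lon"], 2),
        "page_views": record["page_views"],
        # Name, email and device ID are simply never sent to the server.
    }

raw = {"name": "Jane Citizen", "email": "jane@example.com",
       "lat": -33.865143, "lon": 151.209900, "page_views": 14}
print(prepare_for_central_analytics(raw))
# {'lat': -33.87, 'lon': 151.21, 'page_views': 14}
```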

Dr Guabtni believes artificial intelligence (AI) and machine learning will play an essential role in the future of data privacy. “As artificial intelligence becomes more widely utilised, I can see it being implemented in relation to data collection, anonymisation and analysis. However, AI in relation to data privacy is a double-edged sword, as it can only be useful if you feed it with lots of training data, which goes against the whole idea of data privacy.  

“At the same time, if you have an AI system that can make accurate predictions about customers, you don’t actually need to collect any additional data about them because you have those predictions. The balance is in collecting the minimum amount of data that will allow AI to predict enough, to not require any further data.” 

Professor Kaafar argues that businesses must also be aware that reidentification using their stored datasets is a possibility. “There are absolutely ways for adversaries to infer things about the original owners of the data – research has shown that even if a machine learning model analysing the data is hidden behind a black box to isolate it, you can still identify common data across different models and infer why the data might be used there, which breaks consumers’ expectations around how their privacy will be maintained.”

“For instance, if a person’s data is being used as part of a cancer trial, and a malicious actor can tell it’s being used there, this would violate the individual’s right to privacy in regard to not having their health information shared.” 
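The article doesn’t detail a specific countermeasure here, but one simple, widely used way to gauge reidentification risk in a released dataset is a k-anonymity check: every combination of quasi-identifiers should be shared by at least k records, otherwise rare combinations can single people out even without names attached. A minimal sketch with hypothetical fields:

```python
# Hypothetical records. Flag quasi-identifier combinations shared by
# fewer than k people; those combinations carry reidentification risk.
from collections import Counter

def k_anonymity_violations(rows: list[dict], quasi_identifiers: list[str], k: int = 5) -> list[tuple]:
    """Return quasi-identifier combinations appearing in fewer than k rows."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return [combo for combo, n in counts.items() if n < k]

records = [
    {"postcode": "2000", "birth_year": 1985, "sex": "F"},
    {"postcode": "2000", "birth_year": 1985, "sex": "F"},
    {"postcode": "2913", "birth_year": 1952, "sex": "M"},   # unique, hence risky
]
print(k_anonymity_violations(records, ["postcode", "birth_year", "sex"], k=2))
# [('2913', 1952, 'M')]
```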

 

The opportunities for Australian businesses

Despite the inherent risks of handling consumer data, there is economic potential in getting privacy right: enabling and maximising data use while guaranteeing its protection, explains Kaafar.

“This opportunity is huge, but it is also a global multidisciplinary challenge between technology, economic value, regulatory implications, and the fundamental human right to privacy. There is great opportunity for Australia to lead the way in data privacy regulation, and industry and government both need to work together to take the lead on this.” 

O’Connor also sees value for organisations that develop tools to enable users to be in control of their own data. “Businesses that allow consumers to control their own data will be raising the level of competition by offering better solutions. These businesses, whether a large organisation or small start-up, will see economic value at their doorstep by challenging the status quo, and the same goes for organisations which reduce their data intake while providing the same services.”

“At Data61, we are working to create systems which guarantee some element of privacy, while enabling some utility from the data to be gained. If we could create an algorithm that protects data but allows insights to be extracted, companies would be able to collect data with informed consent, while providing guarantees that they know how to manage the risks of having such data.” 
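One well-known family of techniques that offers this kind of guarantee, though not necessarily the specific systems Data61 is building, is differential privacy. A minimal sketch of the Laplace mechanism: noise calibrated to the query’s sensitivity and a privacy budget epsilon is added to an aggregate before it is released.

```python
# Illustrative Laplace mechanism: release an aggregate with noise scaled
# to its sensitivity (a count changes by at most 1 per individual) and a
# privacy budget epsilon. Smaller epsilon means more noise, more privacy.
import random

def private_count(values: list, epsilon: float = 1.0) -> float:
    """Release a count with Laplace(0, 1/epsilon) noise added."""
    # The difference of two exponentials with rate epsilon is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return len(values) + noise

# The noisy answer is still useful for insights, but any single
# individual's presence or absence changes it only slightly.
print(private_count(list(range(1000)), epsilon=0.5))
```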

Find out more about our privacy preserving research and technologies here, and why the future of science is data here.
