Digital deceit and data privacy scandals plagued 2018, with everyone trying to keep up with rapidly evolving platforms and technologies. What does data mistrust look like now, and how can we overcome it? Is regulation the answer, and who should take responsibility for it? Can technology be both the problem and the solution? Four experts - Prof Dali Kaafar, CSIRO’s Data61 and Optus Macquarie Cyber Security Hub; Ben Cade, Trustonic; Yaël Eisenstadt, Center for Humane Technology; and Catherine Maxwell, Governance Institute of Australia - tackled these questions at D61+ LIVE 2019.
According to journalist and moderator Paul McIntyre, a consumer study conducted by WE Communications asked for the public’s sentiment towards innovation and technology across eight countries, with alarming results. 84% said they are fearful that their personal data is not secure, while 77% believed that hackers could shut down the power grid, and 72% were worried that their customer service reps will be bots. The survey also reported that 65% of participants were worried about being hit by a self-driving car, 63% think their phone is listening to them, and 54% are worried that AI will take their jobs.
Although these statistics are startling, Dali Kaafar believes there is a disconnect between people’s attitudes towards privacy and technology and their actual behaviour. While 59% of people said they would switch service providers if they thought a provider was breaching their trust, he noted that 59% of Facebook users have not left the platform.
“I think there is a disconnect because notions of trust and privacy are very complicated to capture,” he said. “Each of us has a different definition of what privacy really is. We get so used to and reliant on technology that it becomes an essential part of our daily lives. For example, even though I consider myself a privacy-cautious and tech-savvy guy, when I need to go somewhere, I just open the Google Maps app and use it, without really being concerned that Google knows my every movement, or about protecting my location privacy.”
Yaël Eisenstadt agrees with the general notion of disconnect, explaining that while people are losing trust in companies, they haven’t stopped using them, arguing it’s because they don’t have an alternative for some of these companies.
“At the same time, I think people want to trust that their government - although this is eroding - is actually protecting them. People believe that if companies were absolutely abusing them, then they would be illegal, and they wouldn’t be able to sell their products and services.”
Dali Kaafar explains that there is indeed some level of implicit trust. “I think for ages we’ve been really operating under either a no-trust type of situation, or an implicit-trust type of situation. We implicitly trust the government to process our data in the right way. And yet, in the wake of the recent privacy scandals, we realise that notion of implicit trust is probably not the right approach to how companies and government should handle our data. And so, I think we should really move and try to shift the whole ecosystem into notions of explicit trust.”
What does that really mean? Dali answers: “Essentially, instead of implicitly assuming that our data is well processed, companies should provably show us that our data is in safe hands, or that our data is provably processed in a good, privacy-preserving way.”
But how do we get to that point? Legislation and regulation need to catch up with technological advances, and industry has a huge role to play in that.
According to Catherine Maxwell, private companies have to make privacy and trust a key differentiator in their business. “They really have to invest in that, they really have to put the time and the money to make sure that our privacy is protected.”
However, for that to happen, she points out a significant hurdle to overcome: digital literacy at the board level. According to a survey on innovation conducted by the Australian Institute of Company Directors, only 3% of directors surveyed said they hold science and technology expertise, and only 3% indicated that they had international experience. Just 10% said they brought innovation experience into the boardroom.
There is a need for global standards, and industry, government and technology experts must work together to drive progress.
Yaël suggested that enforcing global standards is possible if driven by large international companies who operate across multiple countries. “If you do have Australia, Europe and the US all having different sorts of regulations around technology and privacy, it's going to actually force the hand of some of these big companies to step up and say, ‘we need a global standard’.”
The reality is that this approach may take years, and, in the meantime, technology continues to evolve. Ben Cade says we need to have other answers. “There are things we can do now, for example assess how secure our data is, so we can take the necessary steps to protect it,” he says. “There also needs to be more awareness. People need to be given a clear choice, e.g. ‘you can have this service for free, and this is what you give up in return; or you can pay for it, but you don’t give up any data’.”
And Catherine Maxwell agrees that companies can’t afford to wait for the government to catch up. “For companies that actually want to do well in this space, they've got to get some good solid governance processes around their technology.”
One small step companies can take today is to provide their customers with transparency around how secure and private their data is, how it is used, and how it is shared with third parties. That would be the first step in building trust between consumers and providers.