Successes of machine learning have catalysed a revolution across the spectrum of human endeavours. Enterprises offer machine learning services to improve their businesses and provide ready-made intelligence for end-user applications. However, there is an increasingly striking tension between the rapid growth of machine learning and the heightened awareness of privacy protection in this era of intelligence. Individual data often carry private information, and models are regarded as intellectual property.
The private and sensitive information carried by data and machine learning models must be protected against potential exposure. This program aims to nurture Australia's next-generation workforce in the field of Privacy-preserving Machine Learning (PPML). PPML aims to protect the privacy of the data and models used in machine learning, both at training and inference time and during system deployment. It is the key enabler of trustworthy ML systems and benefits both model users and model owners.
We plan to explore emerging privacy-enhancing techniques for realising PPML systems, address practical challenges arising during deployment, and adapt PPML systems to Australia's prioritised industry sectors. The outcomes are expected to pave a long-sought path towards broad market adoption of PPML and to form an integral part of the next-generation Australian digital economy and society.