Why 40% of privacy compliance tech will rely on AI by 2023
The rise of regulations like GDPR has thrust personal data security into the spotlight, and artificial intelligence is here to help.
In the next three years, more than 40% of privacy technology will rely on artificial intelligence (AI), up from 5% currently, Gartner found. With privacy laws and data breaches coming into focus in 2019, security leaders are looking for new ways to keep personal information safe.
Data breaches increased by 17% in 2019, and nearly 60% of businesses have suffered a breach in the past three years. The risks became severe enough that governments took action, enacting regulations such as GDPR and the California Consumer Privacy Act (CCPA) to keep users safe.
The heightened conversation around data security has resulted in mounting pressure on privacy professionals, who are ultimately responsible for keeping an organization’s data secure.
AI-powered applications, however, can help, Gartner found.
“There are various reasons [it can help], but speed and repeatability of data-governing actions and the ability to manage large volumes in similar ways are a few drivers behind AI-powered privacy aid tooling,” said Bart Willemsen, research vice president at Gartner.
“This will especially continue to increase with the coming of 5G networks and the unprecedented volume of data exchanged, including, for example, IoT.
“Another reason would be the data in scope, personal data. The identifiability of a record depends on the context and meaning. Personal data is much more than just names, addresses and SSNs. AI (pattern- and principle-based) technology is capable of recognizing patterns and contextualized or identifiable data, discovering that data faster than conventional (rule-based) systems,” Willemsen added.
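The gap Willemsen describes between rule-based and pattern-based discovery is easy to see in a sketch. The snippet below is illustrative only (the patterns and function names are ours, not Gartner's): it shows the conventional regex approach, which catches well-formed identifiers like SSNs and email addresses but misses context-dependent ones.

```python
import re

# Illustrative rule-based detectors: each maps a PII label to a fixed regex.
# AI-based discovery tools learn from context instead of relying on
# hard-coded patterns like these.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def find_pii(text):
    """Return (label, match) pairs found by the fixed rules."""
    hits = []
    for label, pattern in PII_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

sample = ("Contact jane@example.com; SSN on file: 123-45-6789. "
          "Loyalty ID 4471 links back to her account.")
print(find_pii(sample))
```

The rules flag the email address and SSN, but the loyalty ID, which identifies the customer just as surely in context, is invisible to them. That contextual identifiability is exactly what Willemsen says pattern-based AI can recognize and rule-based systems cannot.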
A successful privacy user experience relies on a company’s effectiveness in handling subject rights requests (SRRs). SRRs let individuals make specific requests about their data, which organizations must then answer.
However, Gartner’s 2019 Security and Risk Survey found that many organizations aren’t able to provide precise and efficient answers to these SRRs. Two-thirds of respondents said it takes them at least two weeks to respond to a single SRR. Since these tasks are often done manually, they end up costing an average of $1,400, the report found.
This is where AI comes in. However, “It is not the AI that solves a problem in itself. It is the reliance on AI functions that helps us solve problems faster, and on a larger scale,” Willemsen said.
“You can imagine a wide variety of functions that benefit from speed and the ability to handle large volumes. These include data discovery with ML, attribution to individuals with natural language processing (NLP), or even the usage of chatbot interaction to answer large volumes of SRRs with underlying ML to automate the actual answer to the request.
“It improves both the internal levels of control, operationalizing the personal data lifecycle in an automated way, as it can improve the customer’s Privacy User Experience (UX),” Willemsen said.
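The triage step Willemsen describes, sorting a free-text request into a category so an automated answer can be generated, can be sketched in a few lines. Here simple keyword matching stands in for the NLP/ML intent model he mentions; the categories and phrases are our own illustrative assumptions, not any vendor's taxonomy.

```python
# Minimal SRR triage sketch. In a production tool, an ML intent
# classifier would replace this keyword lookup.
INTENTS = {
    "access": ("copy of my data", "what data do you hold"),
    "deletion": ("delete my", "erase my", "forget me"),
    "correction": ("correct my", "update my"),
}

def triage_srr(request_text):
    """Map a free-text request to an SRR category, or flag it for review."""
    lowered = request_text.lower()
    for intent, phrases in INTENTS.items():
        if any(phrase in lowered for phrase in phrases):
            return intent
    return "manual review"

print(triage_srr("Please delete my account and all personal data."))
```

Even this toy version shows where the savings come from: requests that match a known intent can flow straight into an automated fulfillment pipeline, leaving only the "manual review" residue for the privacy team, rather than every request costing two weeks and $1,400 of hand-work.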
Rise in compliance tools
These AI compliance tools aren’t just talk: Privacy-driven spending on compliance tooling is expected to rise to $8 billion worldwide by 2022, Gartner found.
Privacy spending will shape the purchasing strategies of connected stakeholders, including CIOs, CDOs and CMOs, according to Gartner. This investment is necessary for current and future privacy readiness, however, so the cost will pay off.
As for when to begin investing, “Today is as beautiful a day as any other day,” Willemsen said.
“Especially when organizations operate a complex architecture where personal data is hard to manage across systems, if they are insufficiently able to control the personal data lifecycle, or if they have a high privacy risk exposure (e.g., when expecting to receive large SRR volumes unable to be handled today), organizations may want to investigate AI-based solutions,” he added.