Governance, Risk, and Compliance (GRC) - OpenPages


Remember The Movie 'The Perfect Storm,' When Three Storms Came Together? A Perfect GRC Storm Is On The Horizon

By Tony Perri posted Fri November 15, 2024 03:12 PM

  

Remember the Clooney/Wahlberg movie, "The Perfect Storm," when three massive storms came together? I've been working on a research paper for a client lately, based on a SHARE presentation I did in Orlando. There are parallels between the paper – titled "Managing Governance, Risk, and Compliance in the Wild West of Converging Data Security and Consumer Privacy Policy" – and the movie, in which three massive storms converged and wreaked havoc on a fishing village and the ships out in the ocean. (If you haven't seen the movie, I'll reveal no spoilers here.) What's emerging in data governance is a handful of initiatives related to AI and LLMs that, combined with data security and consumer privacy mandates, are going to keep a lot of CxOs up at night as they struggle to get their arms around all the governance mandates.

Why? First, because the privacy and AI mandates are new. A new AI governance initiative has already launched in Europe: the EU AI Act. With consumer protection already established in Europe (the GDPR), it's only natural that the EU produced the first attempt at governing LLMs. In a nutshell, the AI Act is designed to protect citizens "against the harmful effects of AI systems in the Union." It sets rules for putting AI systems in place and calls for transparency throughout.

Many of you are familiar with the GDPR, and some of you may already be versed in the AI Act, but what might be interesting to you is the origin of these laws designed to protect EU citizens. When I researched and wrote about the GDPR a few years ago, I discovered that it was an update to 1995 legislation in Europe called the Data Protection Directive. What is significant about 1995 is that it was the dawn of the internet, and some visionaries saw the need for this legislation. But what's most interesting to me about the 1995 Directive is a university paper I came across suggesting that the Directive's roots date back to Nazi Germany, when citizens had no data protection as the Nazi regime used records to hunt them down.

Second, the next big problem is going to be how to manage compliance when different people are tasked with complying with the different mandates. In InfoSec, you typically have the IT side of the org deploying, monitoring, and working with ops and business leaders to set policy for data security. For consumer privacy laws and regulations, we are seeing more and more legal teams taking on the policy making and policing of consumer privacy. With AI/LLM regulations impending – NIST has already issued an AI Risk Management Framework, and there is now ISO/IEC 42001 (AI management systems) – it's unclear who will create policy and manage the use of AI, but my guess is that it will also fall somewhere between ops leaders and legal.
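As a thought experiment, the ownership question can be sketched as a simple mapping. The framework names below are real; the team assignments are hypothetical placeholders, not a recommendation – the point is that some mandates have no obvious owner yet:

```python
# Hypothetical mapping of compliance mandates to owning teams.
# Framework names are real; team assignments are illustrative only.
mandate_owners = {
    "GDPR": "legal",
    "CCPA/CPRA": "legal",
    "Data security policy": "it_security",
    "EU AI Act": None,        # ownership not yet decided in many orgs
    "NIST AI RMF": None,      # likewise
    "ISO/IEC 42001": None,
}

# Flag every mandate nobody owns -- these gaps are what keep CxOs up at night.
unowned = [m for m, owner in mandate_owners.items() if owner is None]
print("Mandates with no assigned owner:", unowned)
```

Even a trivial exercise like this forces the conversation about who, between ops and legal, will pick up the AI mandates.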

So, what's the problem, you may be asking? Silos of people and process. IT/IS doesn't communicate much with legal, and the biggest consumers of data in a growing large business are marketing and sales. My experience with sales and marketing is that they don't communicate much with one another, much less with IT.

Third, the big problem is that every year, in whatever breach report you follow (Splunk, Verizon, and IBM, to name a few), the origin of nearly every cyber breach involves a human factor (the Verizon DBIR reported 68%), be it human error or a malicious actor. Over the years, in every enterprise breach I've researched and written about, the breached org had millions invested in cybersecurity software, training, and remediation processes. And in every breach, it took weeks, months, or even longer for remediation to begin.

Here are a few takeaways from my research that may be helpful for you: 

  1. Humans make errors, and errors where data management is concerned are now costly under the GDPR. Soon they will also be costly for U.S. consumer privacy violations. Under the GDPR, you can be personally liable, and it can be career threatening. U.S. state privacy legislation is less costly, but you can still be held liable, which can threaten your reputation and create risk for your career.
  2. Consumer privacy regulations and laws in the U.S. are just coming into play. These are being rolled out state by state, and they seem to be following the lead of California with its CCPA and CPRA laws. You can find a U.S. privacy legislation tracker here: https://iapp.org/resources/article/us-state-privacy-legislation-tracker/.
  3. Just as consumer privacy law in the U.S. followed the GDPR, so too will AI regulations. The EU has issued its AI Act, and NIST now has a framework. It would not surprise me if U.S. state legislators started crafting laws around AI for citizen protection, and those laws will likely vary state by state.
  4. Fines and punishments are likely years away. The GDPR was adopted in May 2016 (and became enforceable in May 2018), but the first major fine did not come until January 2019. The CCPA was signed into law in June 2018 (effective January 2020), but the first fine was not until November of 2022. It takes time, like any legal process, but the fines are there and they are coming.
  5. Working in silos is inevitable and a huge problem for data security. Mainframers tend to be lone wolves, but these regulations should concern you because they affect your livelihood. Aside from large fines, brand reputation is at risk. With most products and services being commoditized these days, the differentiator for competing in crowded marketplaces is data and how your sales/ops/marketing teams are using that data. Product marketing people (and I are one) are awful with data. Like IT people, they are overtasked and understaffed. Like IT people, they are given unrealistic deadlines and are mostly busting ass to finish a job that was due yesterday. I firmly believe this is why so many human errors are linked to cyber breaches. Work silos can be both horizontal and vertical. Chances are you talk to the marketing team about as much as you talk to your boss's boss, or your boss's boss's boss. Are marketing, ops, and HR teams being added to security vulnerability assessments? Or are these assessments done only with the IT security teams, with the understanding that the rest of the org will get the memo?
  6. Data breach vulnerability is highest at the weakest link, and there's no weaker link than a new hire who doesn't know data policy and security best practices and is also rushing to learn and become productive. Therefore, data security awareness must start at onboarding, and HR must be heavily involved. This is why we IT people must talk to marketing people, HR people, ops people, and legal people to help them understand that there is a lot to wrap your collective arms around for data security, consumer privacy, and AI models.
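The enforcement lag in point 4 works out to years, not months. A quick month-level date calculation, using the dates cited above, makes that concrete:

```python
from datetime import date

# Months between a law's enactment and its first (major) fine,
# at month-level precision, using the dates cited in point 4:
# GDPR adopted May 2016, first major fine January 2019;
# CCPA signed June 2018, first fine November 2022.
milestones = {
    "GDPR": (date(2016, 5, 1), date(2019, 1, 1)),
    "CCPA": (date(2018, 6, 1), date(2022, 11, 1)),
}

lag_months = {
    law: (fine.year - start.year) * 12 + (fine.month - start.month)
    for law, (start, fine) in milestones.items()
}

for law, months in lag_months.items():
    print(f"{law}: ~{months} months from enactment to first fine")
```

Roughly two and a half to four and a half years in each case – plenty of time for orgs to assume nothing will happen, right up until it does.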

Half the battle with these laws and regulations will be understanding the playing field. Then you can build a playbook. For now, it is worth noting that where enterprise data is concerned, there are two more things to worry about beyond the data security/compliance thing you've been worrying about for the past 10 years. There are a lot of unknowns, so make yourself aware of the CCPA/CPRA and the NIST AI Risk Management Framework. It's a good idea to understand the EU AI Act as well; something similar will likely come to the U.S. state by state. And try not to rush too much – rushing is contagious. If you feel too much pressure because your boss has a hard time keeping track of your workload and continues to pile on the work, you must push back. Use your PM system to increase or improve your periodic reporting so they have the visibility to understand your work queue and its deadlines. And of course, feel free to share this blog.


#IBMChampion