Episode 22: Data Dave Dives Deeper with Kimberly Zink

Dive into the evolving world of data management, privacy, and compliance with a unique twist in this episode of “Data Dave Dives Deeper.” Join us as we welcome Kimberly Zink, a seasoned lawyer and executive in the data management sphere, for an enlightening conversation that bridges the gap between law, technology, and the pressing challenges of our digital age. From her early days navigating the complexities of privacy and cybersecurity to her pivotal role in shaping information security compliance and navigating the murky waters of AI and data risk, Kimberly’s journey is a testament to the critical interplay between legal expertise and technological advancement. Explore the intricacies of data governance, the impact of emerging legislation, and the proactive steps companies must take to align with legal standards while fostering innovation. Whether you’re a tech enthusiast, legal professional, or simply curious about the future of digital privacy, this episode offers invaluable insights into the challenges and opportunities that lie at the intersection of law and technology. Tune in to “Data Dave Dives Deeper” for a compelling discussion that demystifies the complexities of our digital world and unveils the critical role of legal frameworks in the era of AI.

HAVE A QUESTION?
Ask Data Dave about all things data, cloud, or technology.
We'll be happy to answer your question on the podcast.

or send us an email to: techtalk@d3clarity.com

Published:

April 16, 2024

Duration:

00:25:51

Transcript

Alexis
Hi everyone. Welcome to another episode of Data Dave Dives Deeper. We’re here today with Kimberly Zink and of course, Data Dave to talk to you about all things data, all things cloud, all things technology, and all things D3Clarity. We’re super excited to have Kimberly join us for this Dives Deeper, but of course, I’m here with Dave. So, hey, Dave, how are you today? 

Data Dave
Hey Alexis. I’m very well. How are you? Good morning, and good morning to all our listeners. Fabulous to be here, and I’m absolutely delighted to introduce Kimberly, who’s with us today. I think this is going to be a fabulous conversational topic and very pertinent for a lot of people.

We’re gonna be talking a lot about risk, privacy, and compliance with Kimberly, who is a lawyer and executive in the data management space. I think this will be really exciting, and we’ve had a lot of questions in this area, so Kimberly, welcome.

I’d like you to say a little bit about yourself. How did you get here? What is your background? How did you come into this space, into data management and the legal aspect of data management? And we’ll go from there.

Kimberly Zink
Sounds great. Thank you and thanks to all your listeners for having me here as well. I hope you all enjoy it.  

I like to say that I was doing privacy and cybersecurity before it was cool to do them. That goes back to the early 2000s, when I was an attorney at a private law firm, and while I was doing other areas of the law, I was starting to get questions in this space. I was located in the seat of a state government, and so we had a lot of government and private clients. We were starting to get questions from employers about how to communicate any privacy expectations, or lack thereof, that employees were going to have. That would revolve around, for example, the use of CCTV in employment areas, and a lot of the differences between the privacy rights that government employees had versus the privacy rights that employees of private companies had. There are distinctions there that were very interesting and also challenging sometimes for clients to navigate.

So, along with the cybersecurity and the privacy came questions around policies. “What do we do?” You’re all familiar with the pop-up banners that you see when you turn on your laptop: “There’s no expectation of privacy with an employer. We can look at anything that you do.” Of course, now under certain laws there are some exceptions there. But that’s really how I started to get into it. And over time that evolved, and then I moved into working in-house in government relations, also working in this space, evaluating legislation that was being considered in the cybersecurity, privacy, and R&D space.

So, I started to see very early versions of bills that were coming out around it, some of the benefits that might come, and, more often than not, a lot of the challenges that legislation like that would pose, particularly when it comes to implementation and violation of certain existing rights.

Then, just because I was working in that space, I moved ultimately into an Information Security Compliance Officer role where I was an attorney, but sitting in the IT department reporting to the CISO. And I built an information security compliance program for them. In addition to that, it was the mid-2010s at that point, so you had the GDPR starting to be talked about before going into effect in 2018. The China Cybersecurity Law was coming out around that time as well, along with several other measures and standards from China.

And so that morphed into developing the early-stage privacy program, also with the CCPA, and then a big piece of the data came in, because you can’t manage risk around compliance and new laws if you don’t know what data you have, how you’re using it, and where it’s going. At that point, the company I was working for, along with several thousands of other companies if not more, had those challenges around, “How do we govern our data, and how should we be governing it?”

Data Dave
Excellent. That’s a pretty long and detailed road certainly from a legal and legislation point of view.  

I personally think that’s absolutely fascinating. I did some work in the mid-2010s with the Center for Identity at the University of Texas and with the government, for NIST, trying to define what privacy was and what it meant as it pertains to the US government and, really, to the Department of Homeland Security at that time. So it was fascinating for me.

Alexis
I just love the idea from that story of a lawyer working in an IT office reporting to a CISO. I think that is just a fascinating approach. I came up in higher education, and I’m now thinking, “Oh, what if my institution had done that?” It would have made a huge difference for us to have had that kind of use case.

So that’s an awesome kind of story. That’s an awesome life story. I love it! 

Kimberly Zink
Thank you. I think one important point there: it is a very unique setup. Often, if you are a lawyer but not sitting in the legal department, you can’t make legal decisions that bind. So, I think that’s always a very important consideration for companies: if you’re going to place a lawyer somewhere other than in the law department, what impact does that have on their ability to make legal decisions? Companies just need to keep that in mind. But I’ll tell you, it was an absolute game changer for that company, because what it did was instantly break down a barrier between the business and IT and legal. All of a sudden there’s a lawyer in the midst of the IT folks, and I was one of them. It just immediately erased that “us versus them,” and that tension was gone.

Data Dave
I can relate a little bit. I worked for a while in an IT company that was started by a group of lawyers. It started in the healthcare privacy world, for patient data. So, I was running the engineering development organization for them, and the general counsel became one of my close friends. Because the whole place was built by lawyers, it was very, very compliant. And very, shall we say, precise. And I credit that lawyer… she was a fabulous lawyer. She worked for Bill Clinton for a while. What I credit her with is that she taught me how to read, because it just changed the way we looked at the world. And it changed the way that, as an IT person – as an IT organization – we interacted.

As an engineer, you think you talk with precision. It changes when you’re speaking to lawyers, and bringing that law and IT together is absolutely fascinating.

Alexis
That’s your life, right, Kimberly? You are here to bring law and IT together. I know you specialize in risk, compliance, and privacy, and I’d love to hear more about your thoughts on the risk of data, especially in the AI world that we live in right now.

Kimberly Zink
Yeah, sure. I think a lot of the risks that we see with AI, particularly with generative AI, and the risks that we see with data use in general are similar and initially can be addressed in a similar way. Here’s what I would encourage people to think about. When you’re coming at it from a compliance perspective, my job is to say, “Okay, what does the law say? What is the business impact of the law and of actions that we may or may not take? What effect is that going to have on the company? And then how do I effectively communicate, and translate into meaningful action, the words of the law?”

So, I think of myself as a liaison or a bridge between that legal language, where every “or” and “and,” every semicolon, and, you know, section 3(A)(C)(i) matters. That very precise interpretation is then translated for the business and for IT.

I don’t necessarily use the same language when I’m speaking with the business and when I’m speaking with IT. There are definite commonalities, but I think there are some different languages used there. The business doesn’t necessarily want to hear about the technical controls that we need to put in place, and that’s exactly what IT wants to know. Then IT, I think more at the executive level, wants to hear about the business side of it and the impact that it can have. But the people who are implementing those controls, what they want to hear is, “How do I take this stuff that you’re saying to me and translate it into a tangible action that I can take so the company complies with the law?”

Data Dave
So, you’re saying the risk is all on the business? And yet the rules and the compliance framework are often in IT. So the risk is carried by the line of business, but the rules to protect the data are in IT. Getting those to work together, and work symbiotically, has got to be absolutely crucial.

Kimberly Zink
Absolutely. And I think part of that is initially meeting the company where it is when it comes to, “What’s the risk appetite of the company?” Because every compliance decision, if you’re doing it right, has to be risk-based when you are working in a company. It is external counsel’s job to say, “Regardless of your risk appetite, you can and can’t do this.” But when you’re in-house, you do have to say, “Okay, what’s our risk appetite?” And that’s where the law department comes in, in conjunction with IT, to say, “Okay, the law says we can and can’t do this. Here’s what we are or aren’t doing. What’s the impact of that? And then what do we need to change? Is it that we need to change compliance, or do we need to change something that we’re doing on the business side? What’s reasonable for the company? And what’s reasonable from a compliance perspective?” And then make sure you document that decision. I think that’s missed a lot. We just don’t document how we came up with how we’re going to do something, and that’s what a regulator is going to want to see: “Walk me through your decision-making process.”

Data Dave
“How did you get here? What’s your decision-making process? How did you interpret this? How did you come to the decision that you’re in?”

The law only goes so far, though, because the law is always catching up with reality, with the world. One of the areas that I’ve gotten involved in a couple of times is the legal department, for want of a better phrase, giving counsel to the business organization where the law only went so far. So yes, they’ve got an absolute right to say, “You cannot, under the law, do this,” but there’s also the construct of, “The trends in the law are such that you should not do this.” And this comes back to your risk awareness and your propensity for risk as a business leader.

Can you talk a little bit about that gray area of giving guidance as it pertains to AI and to data management?

Kimberly Zink
Yeah, absolutely. And that’s a great point. You’re right, the law is never going to be what leads these decisions, generally speaking. Privacy is slightly different because it’s so rapid-fire right now, with the laws coming out. But AI is a great example. If we use generative AI as the example: ChatGPT exploded recently. Everyone wants to use it. The business is like, “Let me do this right now. It’s really important to us!” And they’re thinking, perhaps rightly so, “Here’s all of the great potential that we can gain from this,” without thinking about the downside. My job is to think about what that downside is.

And so, I can use a real-life example of how I ultimately started the creation of the Generative AI Governance Framework for a company I was working for, where I was the Chief Privacy Officer and Global Strategy Advisor.

We had the EU AI Act that people were able to read, and similar to the GDPR, it’s supposed to be kind of like the gold standard. But all we knew was that this, at a point in time, might be what the law was going to look like.

When we’re evaluating that risk, we have to turn to other resources. That could be speaking with your network within your industry or within your particular field. Also, I am a huge fan of using existing frameworks. So you’d use, for example, the NIST AI framework that lays out controls. I think that’s very helpful. You could overlay that with ISO standards. You could also overlay that with the OECD principles, for example.

Alexis
Oh, hold on, Kimberly. I gotta stop you right there. The OECD?

Kimberly Zink
Yes, it is the Organisation for Economic Co-operation and Development.

Alexis
Are they just the big governing body who puts standards together? 

Kimberly Zink
I wouldn’t consider them a governing body, and certainly people might disagree with that, but I would consider them more of an advisory body.

Alexis
I’ve heard of the other things before, and I hadn’t heard of that one, so I wanted to ask.

Kimberly Zink
Yeah, I like that because it’s principles-based.  

So, what we’re seeing as a trend and a foundation in AI laws is similar to what we see in the privacy laws. The EU decided to create the AI Act to reflect a lot of the foundational principles that are in their privacy law, the GDPR.  

So if we look then at those principles within the law itself, and this is the second piece that I would look at: “What other laws, even if they’re not in that specific area, can we look at that might help determine how certain regions or countries might be steering their laws?”

Alexis
And so you were mentioning, before I stopped you, the idea that layering all those different kinds of standards on top of one another could create a big picture. Is that where you were headed?

Kimberly Zink
Yes, that’s exactly the way that I think of it. You’ll hear this probably more in IT than you ever would in the legal world, but essentially you crosswalk laws, standards, and frameworks to identify where the commonalities are, so that you can determine, “Where should I start?” Because there are going to be so many risks; some of them are going to be high, some of them are not, and it’s also going to depend on the company. So, if you layer them together, you see those commonalities. “Do we have gaps here?” Those could be low-hanging fruit that you could check off pretty quickly, even though they might be a medium to low risk. And then, simultaneously, you’ll want to look at your really, really high risk, your red category.

Alexis
I love to end these podcasts with a good success story. So, Kimberly, do you have a good success story or even maybe a challenge that you see that could turn into a good success story that you’d want to share with us to bring things to a close? 

Kimberly Zink
Yeah, sure. I think going back to AI and kind of sticking with that theme: I was asked to set up the Generative AI Governance Framework for a multinational organization.

Alexis
I’m so happy you came back to this, because I wanted to ask about it earlier.

Kimberly Zink
No problem! 

So, setting up a Generative AI Governance Framework. And you might say, “Well, why was a lawyer put in charge of something like that?”

And so I want to be clear: I don’t think you can effectively set up a governance framework of any kind with one person in charge. I think it is much, much more effective having co-chairs, for example. So the Head of Governance in IT and myself in legal co-chaired a task force that was truly, truly cross-functional. It included decision-makers across the entire business, globally. With that, we would have meetings every other week to discuss, “What use cases are we looking at?” and, “How are we going to start governing this?”

And instead of jumping into a policy – and this, I think, is part of the success story – people want to jump into a policy. “We have to have a policy around this to try to control it.” Policies, more often than not, are only as good as the paper they’re written on. Unless you have awareness around it…

Data Dave
Unless you follow them. 

Kimberly Zink
And unless it has some teeth, right.  

But because generative AI was so new, there was no law around it yet that we could point to. There were frameworks that we could look at, but it was new for everybody. So, talking to your peers was brainstorming; it wasn’t like, “Oh, how did you do this way back when?” And I think the success story there was that we took the time to take a step back and say, “What is the true problem that we’re trying to solve for right now?” And that’s that we do not want people using something like ChatGPT and, on purpose or inadvertently, inputting confidential and sensitive company information.

The publicity around Samsung, with all of their incidents around that, I think is a great example of things that can go wrong when people are just trying to be innovative and not thinking about those risks. And so, instead of jumping to a policy, what we did was say, “We’re just going to put guidelines in place right now.” And we communicated them to the entire company. We created awareness around it. We required top-down messaging on it, not only from our most senior leaders, but also from everyone on the task force. Part of their responsibility was to communicate decisions that we made in those meetings down to their teams and their functions.

And so I think the two successes there are: we were successfully able to pause this huge rush toward using tools like ChatGPT, and we were able to do that because we clearly communicated the risks associated with it and incidents that we had seen occurring, and then also said, “Yes, we’re going to put some guardrails in place. We’re going to put guidelines in place, but we’re not going to have a policy yet.” Because a policy, in most companies, is a formal, actionable document with discipline associated with it. And we were all in the learning phase at that point, and I don’t think it’s fair to put disciplinary action on someone when we’re all learning at the same time together, when we have other protections for purposeful wrongdoing.

Data Dave
So, I’m going to ask just a couple more questions. There’s one that this brings to mind, which is: for individuals who are in the data governance space, or who deal with data frequently and in a responsible position, what real-world advice would you give them as they’re being pressured to use more and more AI, and they’ve got concerns and issues? You’re obviously very steeped in the background here, the law, but there are a lot of people in this space who are not, who are coming into this and operating on, “I’m nervous. I don’t know why. I’ve got concerns.” What should they do? What real-world thoughts would you give them?

Kimberly Zink
I think first and foremost: trust your gut on that, and I think that goes for anything. If your gut says you shouldn’t go closer to the bear, you probably shouldn’t go closer to the bear.

So essentially, that’s the same here. If AI is the bear and your gut says, “I’m questioning this,” I would follow that. I would have the courage to take that pause and say, “Why am I feeling this way? Let me talk to some other people about this.”

And then I would say, do your research and just move prudently. I think people tend to think of lawyers – and maybe some lawyers are like this – as blockers, that everything’s going to be “no.” But my job is to enable the business. That fundamentally is my job. I just want to make sure we’re going to do it in a way that’s not going to harm the company.

Talk to your lawyer friends, and for any issues that they’ve identified, make sure that there’s an open line of communication, and don’t be afraid to ask people the hard questions. Reach out to the advisors that you have. If you’re using a company that consults with you on different topics, reach out to them as well. And just look for answers. Make sure you’re identifying those risks. And don’t fall victim to that pressure, because that’s when we really get in trouble.

Data Dave
I’ll round this out with a comment from some of our previous episodes, which is: we’ve had a lot of conversations recently about data governance and data management being about building a movement within your organization and recruiting people in to be part of it. I’m going to give some advice from myself, which is: don’t forget your legal organization when you’re recruiting those participants. It is far easier to bring your lawyers on board with, “I would like to do this. How do I do it within the construct of the law?” than it is to run afoul of them saying, “No, you shouldn’t have done that.” It changes the conversation dramatically. So, my advice to anybody doing this is to continue to think about that collaboration across the organization; your lawyers are some very knowledgeable advisors who can give you some deep insight and some deep structure on this. We’ve used them tremendously in a number of different organizations and continue to do so. Bring your lawyers in, help them understand what is happening, and get them to explain the constructs of the risk, rather than you stepping over the line and being slapped with a compliance issue.

Alexis
That is some beautiful advice from Kimberly and Data Dave. Anybody listening to this podcast, you just earned at least an hour’s worth of legal advice from the two of them. So, you’re welcome.  

Thank you both so much for chatting with me this morning. This has been super insightful, and I’m so happy we went down this path. We’re in the midst of trying to figure out how to use AI right now at D3Clarity, and you just put a whole bunch of things into my brain that I had never thought about before. And so thank you. Thank you, thank you. I’ll take the free legal advice. 

It’s been such a good conversation. Anything else, before we round out? Anything else to add to the conversation, Kimberly or Dave? 

Kimberly Zink
I would just add: we’re on the same team. Everyone at that company is on the same team. I think if we can all remind ourselves of that in those more heated moments when we’re trying to decide what to do, I think that that will serve everyone very, very well. 

Data Dave
And there usually is an acceptable outcome to almost any of those heated discussions. People are logical; people are reasonable.

The other thing I would part with is: thank you so much, Kimberly, this has been a fabulous conversation. I would remind our listeners that ChatGPT and other generative AI that isn’t owned by you will never forget anything that you put into it. So, by all means, ask questions, but don’t give it any information, because it will not forget, and it will willingly use it to answer other questions from anywhere.

Alexis
Kimberly, thank you for being on Data Dave Dives Deeper. This has been fabulous. I hope you both have a great day. For the world out there listening to the podcast: if you are interested in being a guest on Data Dave Dives Deeper, we would love to hear from you and have you on the show. Simply reach out to us at talktech@d3clarity.com, and we would be happy to have a wonderful conversation with you.

Thank you both again so much. It’s been wonderful. And I hope you both have a fabulous day.  

Data Dave
You too.  

Kimberly Zink
Thank you both. 
