Chance Conversations Interviews (Odia Kagan) – 2024 08 28 15 55 IST – Recording
[00:00:00] Conor Hogan: Hi, and welcome to another episode of Chance Conversations. I'm Conor,
[00:00:15] Carey Lening: and I'm Carey,
[00:00:16] Conor Hogan: and today we're so excited to have with us a true leader in the world of privacy and data protection, Odia Kagan. Welcome, Odia.
[00:00:23] Odia Kagan: Thank you. Nice to be here.
[00:00:25] Conor Hogan: Odia is a partner and Chair of Data Privacy Compliance and International Privacy at Fox Rothschild.
She specializes in guiding companies through the complexities of privacy laws, including the GDPR, and offers practical advice on tech transactions and third party engagements. With certifications including all of the letters that you can possibly imagine, and recognition as a Fellow of Information Privacy from the IAPP, Odia brings deep expertise to her role.
She's advised…
[00:00:51] Odia Kagan: Sure, now I have to top that intro, right?
[00:00:55] Carey Lening: I will briefly introduce the rules, just for the folks who haven't heard this before. For each category, you'll get to pick a number, one through three, and a question in that category will be chosen at random. There are two questions in each category and three total categories.
The first is industry trends and insights. We'll ask questions about where you think the industry will go in the future, where it might be stuck, some common misconceptions, hype cycles, and a few other surprises, because I keep adding questions. Next, we will talk about career observations.
In this category, we'll talk about your highlights, what you've learned along the way, and what advice you might have for future generations in this space. And then finally, personal questions. This is one of my favorite sections. These are questions about you: who you are, your hopes, your dreams, your aspirations, life lessons, and challenges you've overcome.
Nothing too crazy personal, just a little bit to get to know you better. You can skip one question, and a new question will be asked in its place. You can also throw one question back on Conor and me, and we'll answer it first, but then you need to answer in the end. The goal here is, as always, to broaden the conversation and for serendipity to take over a little, like the conversations you might have with friends. So, sound good?
Cool. Let's do it. No nonsense. All right, I'm going to go first, Conor.
All right, we're going to start with career. Pick a number: one, two, or three.
[00:02:21] Odia Kagan: Okay. I'm going to try two.
[00:02:23] Carey Lening: Okay, two. I should let everyone know I reconfigured my little random number generator thing, and now there are pictures of cats for each, and it looks like there's a playing card.
It's pretty awesome. This is only for my personal edification, but it's our podcast, so I can do it. Okay, the question is: what does success look like to you?
[00:02:44] Odia Kagan: Oh wow. I read about this in one of the books that I'm reading, and it had a really good definition of success.
Let me see if I can remember it. It basically said something like, success is the continued striving towards a worthy goal, or something like that. So I've thought about this a lot recently, because I have spent pretty much all of my career in law firms and big law, and the concept of success in that construct is very clear.
I think my definition, as I'm thinking about it, is basically to continuously do work that is interesting, that is intellectually challenging, and that makes a positive impact on people and on clients. So I really like what I do. I like what I do because I find it interesting, because it changes all the time, because there are challenges, because there are puzzles to solve, because there are new facts to apply to old rules, and there are surprises happening all the time.
There's technology involved. And also because I feel, most of the time, that I am making a positive impact on the people that I work with, right? Be it clients: helping them solve problems, helping them avoid problems, mitigate risk, avoid being sued, avoid having a negative consequence.
And so I think the combination of those together is a definition of success for me.
[00:04:34] Carey Lening: I like it. It's a well-rounded answer. I love, by the way, that you were reading a book on the definition of success. What book is that?
[00:04:42] Odia Kagan: So I have a reading corner where, in the morning at the crack of dawn, I read 15 books simultaneously.
And I read a few pages in each as I drink my coffee. That's my morning thing.
[00:04:53] Carey Lening: 15 books simultaneously?
[00:04:55] Odia Kagan: Yeah, I read a few pages in each. They're all different things, like philosophy, history, I don't know, statistics, different things.
Yeah.
[00:05:05] Carey Lening: How do you do that without like your brain exploding?
[00:05:08] Odia Kagan: Seriously, that is like my time of happiness. I really like Kindle, and I don't do paper at all, everything is digital, but this is like a literal stack of books.
[00:05:20] Conor Hogan: That's brilliant. Like, I'm very similar to that. Not that I have 15 books on the go at any one time, but usually if I'm reading a book, which is pretty regularly, I have more than one and fewer than five on the go. But 15 just blows my mind.
[00:05:31] Carey Lening: I can't keep track of reading a menu without having to focus.
So the idea of fifteen, in small bits and chunks: I don't know how you do it, girl. But that does explain a few things about how prolific you are in responding to and summarizing all these pieces. You must have speed reading down or something.
[00:05:49] Odia Kagan: So I think, to answer your question, it was probably... I have two quote books.
One is Robert Greene's; it's The Daily Laws, like a quote a day, right? And one is The Daily Stoic, the Ryan Holiday one, and I love his books. It's either one of those, or it might be... I'm reading Tal Ben-Shahar, who's this Israeli researcher. He's got a famous course about happiness; I don't remember now if it's Harvard or Yale.
So the book is called Being Happy, and I think that might be where I read it.
[00:06:21] Conor Hogan: Oh, wow. Do you know what, we were talking on an earlier episode about, I think it might have been, the same question, and the answer that the guest in question gave was that there's not something necessarily that they're striving towards, in terms of retirement or the typical measures of success; rather, it was really similar to this, in that it's actually constantly striving to make an impact, and to be better, and to enjoy.
And I just think it's quite revealing when somebody like you, Odia, is of exactly that same mindset. It's actually very encouraging, I would argue, to anybody who's listening and embarking on a career in this space, or who wants to pursue big law and all that sort of stuff: you can still do that, be happy, enjoy, and make an impact. That's what I'm hearing.
[00:07:07] Odia Kagan: Yeah, I really like what I do. I really enjoy it. I think that this type of career is really difficult if you don't really love what you're doing. It's hard, because it's very time-consuming and a lot of investment. And it's also very easy to get embroiled in all your day-to-day stuff and kind of forget that you're supposed to enjoy it, too.
[00:07:28] Conor Hogan: And so your coffee in the morning at dawn with 15 books, is that time to yourself where you're like, you know what, it's not just about a rat race or anything like that?
[00:07:38] Odia Kagan: Like, one of the books I'm reading is Winnie-the-Pooh.
Which is amazing.
It's so cool. There's Rabbit, and Eeyore, and when you read it as an adult, it's, wow, this is really interesting.
[00:07:50] Carey Lening: So you're reading fiction, nonfiction, studies, law, whatever.
[00:07:54] Odia Kagan: Yeah, there's a privacy thing in there. There's a lot of philosophy. There are the quote books. There's Winnie-the-Pooh. I'm reading a book about math and statistics, which, I'm a lawyer, I can't do anything related to graph paper; that stopped a while ago.
So I'm trying. And yeah, I'm doing a thing in, like, Chinese medicine.
[00:08:14] Carey Lening: Oh, geez. I thought I was impressive, because I was trying to pick up Feynman's lectures on physics again.
And I read it, and I can only do a couple of pages, like you. I can't do too much, because my brain is literally going to explode. But you're all over the map. You've got everything going.
[00:08:27] Odia Kagan: That was one of the books on my list; he's got a book about, like, six simple things, right? Or something like that.
But physics was what I avoided in high school. I was in the physics class, all my classmates were doing physics, and I was the person who didn't do physics, but I sat with them and watched them struggle. So my physics knowledge is negligible.
[00:08:47] Carey Lening: He's a good one to read for that. Yeah. Okay, we're going to talk about books the whole time. Conor, you should ask another question.
[00:08:54] Conor Hogan: Alright, okay, I will move to industry; we'll try and bring it back there. Odia, pick a number, one, two, or three again, please.
[00:09:02] Odia Kagan: Let's try three.
[00:09:04] Conor Hogan: Okay, perfect. Ooh, I like this one, actually. What is something that you've rethought in the last year? It doesn't even have to be the last year, but what is something that you've thought about all over again and rethought?
[00:09:18] Odia Kagan: In the privacy industry?
[00:09:20] Conor Hogan: In the industry, yeah.
[00:09:22] Carey Lening: Maybe a conception you had, or an idea that you had about the industry, that has changed or evolved.
[00:09:28] Odia Kagan: Interesting. Oh, why don't I flip that one on you first so I can think about it.
[00:09:33] Conor Hogan: I knew that was coming.
[00:09:35] Carey Lening: I can take this. I came in guns a-blazing with AI. I really did think that it was going to do something phenomenal and that we were going to be able to square the circle as far as the law.
[00:09:49] Odia Kagan: Yeah, I saw your posts, and you were building some stuff yourself and things, right?
Yeah.
[00:09:52] Carey Lening: Yeah. And I still build, don't get me wrong, like the little weird generator thing. It is really good for coding. But my research and what I've been digging into has led me to be very despondent. Not because I don't think that there's utility in these tools, I think there's a lot, but I do think that people aren't appreciating what limitations exist, and the systems themselves are really not compatible with the law that we have. And so my whole thing has become even more jaded and frustrated, because we have this just glaring problem.
And I'm genuinely not optimistic that there's going to be much in the way of large language models as we know them in a couple of years, when everyone finally wraps their head around the stuff of, oh hey, data protection, we can't really do it. Or the idea of consent, and the entire fundamental model around notice and choice and transparency and all these things.
I'm really skeptical that's going to last much longer as well, because I just don't think it can. There's going to be a big conflict that we're going to have to sort out: the law is going to have to change, the technology is going to have to change,
we're going to have to bury our heads in the sand, or something else. I don't know. So that's me.
[00:11:01] Conor Hogan: Yeah, that's really interesting. Actually, my perspective touches on maybe elements of that. I had a piece in draft for such a long time and put it up on LinkedIn yesterday, and it was about how privacy and data protection are always going to be around, but to what end? Because people in general
will quite willingly, and maybe it's a generational thing, and maybe this is a sign of me, I don't know, but people will trade their personal data for access to Wi-Fi on an airplane, or in a public space, or to
[00:11:33] Carey Lening: Oasis concerts.
[00:11:36] Conor Hogan: As I have fallen foul of this morning. I won't lie, Carey,
I was very conflicted. Literally the first record I ever bought was Oasis, back in the nineties, and it was on cassette tape. And they reform and announce, you can get into an early-phase queue if you give us all your details, and then they don't let you proceed unless you agree that they can contact you on your mobile phone, even if you're on a do-not-contact list.
And oh God, I shuddered so badly. And I thought, no, I really want the chance to get the ticket. So what have I rethought? I've rethought privacy over and over again, the whole thing about privacy being a thing, because you're still going to come up against moments like that where you really just want to go to a gig, and it happened to me this morning.
[00:12:21] Odia Kagan: You know what, that actually was what I was going to say in the first place. What you guys said resonated. So I think the issue that I always come back to in connection with privacy or data protection is transparency, and the impact, because we were talking before about making an impact, right?
Some of the stuff that I see around, without naming names or whatever, but some enforcement, some lawsuits, some things: you're looking at it, and you have a conversation with people who do not deal in privacy, and they're like, are you people serious?
This is what you guys are dealing with? This is helping the world? So on the one hand, there's that sort of perception, which we, knowing more, don't necessarily share to the same extent. But on the flip side, there's what you guys just said, and my thought on transparency. Transparency is the thing that I keep coming back to. But from a U.S. kind of influence perspective, the crux of the matter is whether there is a situation where I, as a person, have a good understanding of what is actually happening with my data and what is the potential impact, and I make a decision.
Now, the discussion, though, and that goes back to what you were saying on AI and what Conor was saying about the concert, is that in some situations you aren't able to make, you know, a free choice, maybe because of the decision issues, right?
Like the Thinking, Fast and Slow stuff and all of that. We're influenced by the dark patterns, right? The key is how we get to a situation where people can make educated decisions. Now, in the context of the concert, on the one hand
you're like, okay, I didn't get a choice. But on the other hand, you're looking at it and you're like, okay, what's the impact here? I get spam that I can unsubscribe from, I get a text that I can unsubscribe from. Maybe there are people that are thinking, you know, that's okay.
This morning, when I was reading my privacy update things, I was reading that there are companies that are now advertising data sharing and retargeting for marketing based on "active listening," which is basically eavesdropping on conversations, and using
both content and intent and emotion analysis. Now, that part, about these inferences and opinions and things that are made based on your stuff: one, you don't know what they are; two, they are very likely not to be reliable right now, given the state of the technology; and three, they go off into space to have different impacts that you don't know about.
I'm going to age myself, too, but if you guys remember, there was a commercial, was it an HIV commercial? It's like, your girlfriend, and your girlfriend's girlfriend, and there was this chain of people, right? This chain-reaction thing on TV.
So this type of stuff, right? The concert thing, yes or no, is it ethical for a company to put you in this position? Okay, there's a European discussion on the freely given consent thing. But if you're looking at impact, real-life impact: stuff that is done that I don't know about, that I don't understand, that may not be accurate, that I have no way of knowing if it's accurate, and it's now dispersing into the world, having multiple compound impacts that I also don't know about, that could affect me in a way that I don't know about and can't reverse.
That concerns me. And so that's the thing that I'm trying to figure out when I'm advising companies: how do I get to a situation where I am basically giving the consumer the ability to make the informed decision? And in some situations, right,
to Conor's point: yes, my kids' generation, the younger generation, they maybe have a different concept of what is a reasonable trade-off, right?
Yeah. But
[00:16:32] Odia Kagan: if they're in a position to understand the trade-off and say, you know what, I'll press unsubscribe 50 times because I got 5 percent off, we can make that decision.
[00:16:41] Carey Lening: But it is going to become increasingly harder, especially when there are downstream effects that aren't visible to you. They can't be transparent, because there are just so many extra links in the chain.
I have a book for your reading list, by the way, if you haven't read it already. What is it... Optimal. It's a dystopian sci-fi novel, and it's great, and I highly recommend it. Optimal. Okay.
[00:17:09] Conor Hogan: This is brilliant. I'm actually just writing down all of the book recommendations.
[00:17:13] Carey Lening: Yeah, it's why I've been rethinking consent, but it's also a really important thing to juxtapose the not-too-distant future, I think, given where everything is going, with where we are now. And he didn't write it from a privacy perspective at all. He wrote it from an extremism and academic perspective, but it's just a brilliant book and I highly recommend it.
[00:17:34] Conor Hogan: Do you know what, you just mentioned something really quickly, Carey, there at the end, when you were talking about how deep the chain is: second and third and fourth and fifth and sixth and whatever party organizations buying and selling data in the back end. And that, to me, is the fundamental reason why transparency is in trouble today: just how complex the landscape is.
And so, yes, our kids and their kids, the world that they're going to inhabit when it comes to consent and transparency, and the notion of informed consent, using mechanisms and mediums that probably aren't even imagined yet, but with the infrastructure that sits behind it and how their data will be consumed. And even, you saying "active listening": that's what people freak out about today when they get retargeting ads.
God, my phone is listening to me. No, it isn't. Well, soon it will be. So how can that be transparently communicated to somebody, and how does informed consent work in a transparent manner? My brain hurts.
[00:18:42] Carey Lening: Okay, let's take a pause, take a breath, and I will ask a personal question next.
So, Odia, pick a number, one through three.
[00:18:52] Odia Kagan: All right, let's try one.
[00:18:54] Carey Lening: If you could go back in time, what's one thing that you would change, and why?
[00:18:58] Odia Kagan: Oh, wow. I mean, I'm still working on it, so I'm trying to change it actively, but if I could have had this insight earlier, it would have been more efficient.
It goes back to being able to focus on the present while trusting the future. Basically, have your plan, look at your plan, pursue your plan, trust your ability to pursue your plan, while focusing on what's going on in the present.
And my kids are so much better at that than I am, and I am very happy about that. I don't know if I'm proud of myself for it, I don't know if it's because of me or not, it probably has a lot to do with my husband too, but they're happy kids and they're enjoying the present and everything. I was a five-year-old who was worried about the future.
And that was not helpful.
[00:19:54] Carey Lening: It's hard. It's hard not to freak out. It's the unknown, right? And it's trusting in the future? I can barely trust in the present, for Christ's sake. Yeah,
[00:20:03] Conor Hogan: I really like that, though, because I think being present is something that a lot of people in general struggle with.
And maybe as I've aged, anyway, and I look at my kids: they couldn't give a shite about what's going to happen tomorrow. They wake up on a daily basis and they're just like, what are we doing today, Dad? Where are we going? And they don't seem to have that worry. And maybe as they grow up and get older, that might change.
But at the end of the day, I think that's a remarkable view and perspective to empower kids to have: actually, you know what, be present, enjoy what's around you, and try not to worry too much. Have that perspective that you can really enjoy things, whether it's being outside or the fact that it is raining. Just look at the rain as it bounces off the ground, or look at the sun as it rises, and enjoy the now. I think if more people could learn to enjoy the now, the world would be a much better place.
[00:21:03] Odia Kagan: My daughter did that. She went to sleep-away camp, and she sent us letters, actual letters, because they couldn't use any devices at camp, which is genius. So she sent us letters, and she is so good at this, to be like, hey, this was great, and today was interesting, and I miss you, but I had fun and I did this. They're able to enjoy things.
It's really that. I would have gone back in time and been less stressed out in pre-K; that would have been better.
[00:21:30] Carey Lening: That's interesting too, because I feel like kids today probably should be more stressed out, and yet they're not. I don't know.
There's a, yeah,
[00:21:38] Odia Kagan: There, the slider went all the way to the other side, where it's, who cares if I'm going to have a career? I can just live here with my parents forever. That's also not good. So there should be a balance somewhere in between.
[00:21:48] Carey Lening: Or maybe they're just like, F it, because the whole
global climate change thing is going to wreck everything and there's not going to be much of a world left. I don't know, maybe that's too fatalistic. I don't know. I don't have kids; this is why I'm wildly speculating on all these things.
[00:22:03] Conor Hogan: All right, cool.
I'm going to move on to the career category again, Odia. So if you can give me a number, one, two, or three again, please.
[00:22:13] Odia Kagan: All right, let's try one again.
[00:22:16] Conor Hogan: Here we go. Okay. Where do you see, and this is a career question, in fairness, right, but where do you see the industry going in the next five years?
[00:22:24] Odia Kagan: Interesting. I don't know. I'm hoping that it will pursue a path whereby companies are using data, and people are able to make decisions about the use of data, in a way that gives people agency on the one hand and avoids the real harms, right?
If you want to be in vogue and use the AI Act, right, it's high risk; or Colorado calls them consequential decisions. But avoid the things that can really cause harm, and
maybe avoid the things that currently seem to expend a lot of time, energy, resources, good faith, and patience on things that don't matter, right?
Right now, there seems to be a lot of discussion about this. In Europe, the discussion is, oh, regulators don't have a choice about what they enforce, because they've got to investigate all the complaints. In the U.S., the blame is put on, we can't control what becomes a class action lawsuit, right? But I don't know. So I don't have a solution right now. But I think that, de facto, a lot of the chatter about privacy focuses on things that leave the people whose support we need to advance jaded. Should the cookie banner look like this, or should it look like that? And, what are you people doing? And so then we don't get to the crux. In the meantime, the cookie banner has been perfected, but emotional analysis and active listening are running rampant.
So hopefully that is the direction. Because I think, again, back to the discussion of how we as practitioners, and we as the privacy industry, are making a positive impact on the world: what's our contribution, right?
I read Arnold Schwarzenegger's book, Be Useful, right? How are we useful, right? And I think we are useful when we help mitigate the risk, and also with the whole unfair-and-deceptive concept of the FTC Act, right? Things that people don't know about; they don't know if they care about them, maybe they don't, but they just don't know.
So I think if the industry could somehow go in that direction, that is my hope for where it would go. Because, as you guys said, if we are not able to figure out that direction, the technology is running really fast, and if you stop to change the cookie-banner tire for two weeks, you know, the AI vehicle has left; it's right across the state line.
[00:25:06] Conor Hogan: And just to add one little bit, a part two of that question. What immediately struck me was sort of what Carey was talking about in what she's rethought about AI, and your perspective on all of the really invasive practices that you're reading about and learning about at the moment. In our industry, whether it's big law, the legal space, the legal tech space, or consulting and advisory: do you think we're at risk of being replaced? With firms embracing AI in the way they are, and people's expectations around how quickly information can be obtained, searched, triaged, furnished, provided back, do you see AI, to some extent, replacing the professionals in legal and consulting in this industry?
[00:25:58] Odia Kagan: We were chatting before this briefly, right? I think, and I gave an example of why, I don't think that's the case. Especially with, let's say, the applications that we're seeing, without going off into other dystopias about the robots-that-will-overtake-the-world situation.
With the tech that we're looking at, if it advances in a normal way, without running off and doing completely unregulated things, I think that it can make efficient a lot of the busy work in some parts of the industry.
I'm going to say this in a way that's pithy but hopefully politically correct enough: I think that AI can do away with the mundane tasks that a lot of people do not like to do anyway, or the ones that are very commoditized, right?
The ones that are very standard. And I think that it's either going to be able to replace those or make them much more efficient. The things that are more complex and require the value-add that I try to focus on bringing to clients,
those, I think, it will not. So, for example, I actually did a webinar once about the U.S. privacy laws. And I said, my goal for this webinar is for people to not come away from it thinking this webinar should have been a ChatGPT prompt, right?
Because if the takeaway you're bringing is, there are 19 privacy laws and the threshold is 100,000 people, okay, people can either use AI or Google for that, right? So the things that are mundane, yes, I think AI can maybe do away with those, like Excel did away with the abacus or whatever.
I did an experiment on this, right? I read the Uber decision, the 50-page Uber decision from the Netherlands.
And I asked some chatbot, ChatGPT or whatever, to summarize it and give takeaways or whatever. The summary piece, and I've seen this, colleagues use it to take a court case and do the summary of it. That's the hard slog, for example, right? The thing that we, you know, spat blood over in first year of law school, where we divided up the list and each person summarized a different one, right? Like that. But
[00:28:20] Conor Hogan: then
[00:28:22] Odia Kagan: the takeaways: what does this mean for my clients?
What does this mean for this industry? What does this mean for a company that does X with their loyalty program, or a company that has a website that does Y? There is no way it can do that. I'll give you an example I gave my son, about Google Maps, right? Google Maps in my area, I think everywhere,
but in my area: when I drive back home, there is a point on the road where, in some cases, you can choose either 13 minutes red or 14 minutes green. And Google Maps always picks the 13-minute red, and because of some weird configuration, the 14-minute green is not a route option that is in its bank. But I know it, and I can turn right and right again, and boom.
Now, I think most human beings prefer 14 green to 13 red. Maybe not. But in order to make that determination of, here are the options, here's the significance, here's the path that is more appropriate for you to take: that type of high-level analysis, AI is not going to replace.
And so the summary of that is, I think the commoditized, simple, and routine aspects of our work may be able to be replaced. For some of them, we may be able to rely on AI as a first draft instead of doing the work ourselves. But the high-level stuff? I don't think so.
[00:29:59] Carey Lening: Yeah. I call it the slightly dumb 1L intern. That's how I look at it, as a summarizer. You learn CRAAC and you're like, grand, okay, this is good, here you go. And it can do that, or IRAC, depending on which law school you went to, but IRAC it can do: issue, rule, analysis, conclusion.
It's not going to pull out, like you said...
[00:30:21] Odia Kagan: But all of the, what do you call those, unseen things in the test, right? Like the hypo. It can summarize the 20 cases in your file, but it can't then take the takeaways from the 20 and apply them to a different set of facts and draw a specific conclusion.
[00:30:38] Carey Lening: But I think it is useful to understand what those limits are, to take advantage and leverage the utility, but not to worry too much about AI replacing us. I think I'm with you, Odia. I don't really foresee any huge replacement.
I would be more than happy, though, if AI got better at drafting, say, data protection agreements and processing agreements, and reviewing limitation-of-liability clauses. That's what we were talking about earlier, Conor. Oh my God, the death of me, the death of me. Okay, I'm going to jump to industry now. Pick a number, one through three.
[00:31:16] Odia Kagan: Let's do two.
[00:31:17] Carey Lening: What is one place that you've always wanted to visit but haven't yet?
[00:31:21] Odia Kagan: Wait, you said industry. Wait,
[00:31:22] Carey Lening: that's not right. My
code is super broken.
[00:31:25] Odia Kagan: I want to visit, I want to visit the land of limitation of liability. No. Yeah, sorry.
[00:31:32] Conor Hogan: I'm so glad Odia, you were on that. I was thinking to myself, that's a strange question to ask.
[00:31:36] Carey Lening: That is a super strange question. I'll back up and explain. I had Claude revise some code that I had before, and the algorithm now does a shuffle of all the questions in the question bank. What I thought it was doing was a shuffle within each category. It is not; it is doing a shuffle of all the questions in all of the categories together. So this
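For the curious, the difference between the intended behavior and the bug Carey describes can be sketched roughly like this. The question bank contents and function names here are hypothetical illustrations, not the show's actual code:

```python
import random

# A hypothetical question bank mirroring the show's three categories.
QUESTIONS = {
    "industry": ["Where is the industry headed?", "Common misconceptions?"],
    "career": ["Career highlights?", "Advice for newcomers?"],
    "personal": ["Dream travel destination?", "What guides your path?"],
}


def shuffle_globally(bank):
    # The buggy behavior: pool every question and shuffle across categories,
    # so picking "number two" under "industry" can return a personal question.
    pool = [q for qs in bank.values() for q in qs]
    random.shuffle(pool)
    return pool


def shuffle_within_categories(bank):
    # The intended behavior: shuffle each category independently, so a pick
    # always stays inside the category that was announced.
    return {cat: random.sample(qs, len(qs)) for cat, qs in bank.items()}
```

With the global shuffle, the category-to-question mapping is lost entirely, which is exactly how a travel question ended up under "industry."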
[00:31:58] Odia Kagan: is like this document [00:32:00] I'm drafting now. It's like, this may have hallucinations and inaccuracies. Please make sure that this function's working, and this function's working, and this function's working.
[00:32:08] Carey Lening: So this is fun. Yep, I told you, highly random. All right, so I generated another number two. What are some common misconceptions about data protection and privacy?
[00:32:19] Odia Kagan: Common misconceptions. Oh, so from people that are outside the industry. One is, you do data breaches, right?
[00:32:30] Odia Kagan: So privacy compliance is not data breaches. That's one thing, especially in the U.S. In Europe, this is less and less of a common misconception, obviously, because data protection has been around.
But in the U.S., oh, data breach. The other is, when I say I do data privacy, people think information security. When I say data privacy, people think HIPAA. This is all a very U.S.-centric approach. Once I said I do privacy and security, they thought it was securities, like securities [00:33:00] law.
And then the other one, the big one, I think, and we find it's still an educational process, and I don't know, you guys can tell me if this is maybe also U.S.-centric, but in the U.S., there is still a gap in understanding the scope of personal information or personal data, because there is a very entrenched opinion or view or understanding, which I guess, kudos to the data breach laws, which, they're not that old.
They're from 2003, but they're very prevalent. And so in the U.S., when you say personal information, everybody thinks data breach laws. They think PII. They think driver's license. They think medical. They think health insurance. They think social security number, definitely social security number. And so when I explain that personal information is also email, what do you mean, email?
When I say cookie information or identifiers, what do you mean, identifiers? There are still privacy notices around that say, we collect personal and [00:34:00] non-personal information, parentheses, IP addresses. You can date a privacy disclosure by that statement, like the rings in a tree, right?
And then of course, hashing. The FTC just issued guidance on hashing, that hashing is not anonymization. People: what do you mean? So I think that's one big misconception: what's personal information, what's anonymized information, so what's the scope and what does it actually do?
So I think we still, in the U.S., even though, 19 laws and counting and all that stuff, but what is it and what does it do? I think we still have a ways to go for that to penetrate. Whereas I think in Europe, that's no longer the case.
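Odia's point about the FTC guidance is easy to demonstrate: a deterministic hash is a stable pseudonym, not anonymization, and anyone holding a list of candidate inputs can reverse it. A minimal sketch, with made-up email addresses:

```python
import hashlib


def hash_email(email: str) -> str:
    # Deterministically hash an email address, a common but mistaken
    # "anonymization" step in analytics and ad-tech pipelines.
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()


# The output looks opaque...
hashed = hash_email("alice@example.com")

# ...but because the function is deterministic, anyone with a list of
# candidate emails can rebuild the mapping and re-identify the person.
candidates = ["bob@example.com", "alice@example.com", "carol@example.com"]
reverse_lookup = {hash_email(e): e for e in candidates}

assert reverse_lookup[hashed] == "alice@example.com"
```

Even without reversing it, the hash remains a persistent identifier that links the same person across datasets, which is why it is still personal data.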
Because you've got,
[00:34:41] Conor Hogan: Oh, no, that's, no, that's still,
[00:34:43] Carey Lening: it's still the thing, because a lot of people have the U.S. model in their head. They don't necessarily attribute it to data breach laws, but everyone, especially in the tech sector, is mostly hearing from the privacy teams at their [00:35:00] U.S.-centered companies, your Googles and your Facebooks, and they still call it PII. And they still make it narrower, because I think in some respects it's willful. If they can constrain what is personal data to a subset of things, then engineers feel happy, because they don't have to think about all these other possible things that could identify someone. Because if you try to tell someone, hey, inferential data can be personal data about you, it makes engineers' heads explode. And so they do the narrowing because it's concrete. But it's not accurate.
[00:35:34] Odia Kagan: Yeah. Yeah, but it's not accurate.
You guys are both in Ireland, right?
[00:35:38] Conor Hogan: Yeah.
[00:35:39] Odia Kagan: Yeah. So I'll give you a point of national pride on this. I was traveling back from the IAPP event in Brussels, I don't remember, maybe two years ago or something. And I was flying Aer Lingus, and I had to make a really tight connection. So I asked one of the flight [00:36:00] attendants, there was an empty seat in, like, the front row or something, and I said, can I go there for the descent so I can just hop out and make my connection? So I was sitting there, very close to the staff, right? So they were like, oh, what were you doing here? I said, I was here for a conference. What do you do? I'm a data privacy attorney. He says, oh, we have GDPR now. So there you go.
[00:36:22] Conor Hogan: Yeah, I think, though, the GDPR is doing one thing, and then it's become known, maybe it's a bit infamous, I think, in Europe, at least in Ireland, because a lot of public authorities, and I'm generalizing here, will say, oh, we can't do this because of GDPR, and it's become a sort of bad reason to blame things on.
But I do think that the fuzziness of personal data is something that confounds and confuddles a lot of people, because it's not as narrow as in maybe other jurisdictions. So when somebody thinks about personal data here in Europe, it is such a broad brushstroke, [00:37:00] and people generally have difficulty actually coming to terms with what that is.
So I think it's very interesting, just the points that you raise. But yeah, it's the same
[00:37:09] Odia Kagan: discussion we had, right? I think there is also a gap on what is this data, right? These IP addresses and the identifiers and all that stuff.
What is that, and what's the actual impact? Because once that understanding hits home, once you realize, this goes around with you and it may affect this, and now you're not getting a loan. Once people have that, I am not sure that people who say, oh, I'm going to take a couple of spam texts and unsubscribe, for a concert, I don't think the same people are going to be like, oh, I'm going to give that information, and then maybe I won't get a loan. I don't think it's the same decision. So back to your point: the fact that people don't relate non-PII to part of the program is because they don't really understand how these identifiers work and [00:38:00] what their impact could be.
[00:38:01] Carey Lening: Part of what I end up doing as a consultant is that education piece. Let's not talk about categories of things. Let's talk about what this means in practice. This is how information can be used to target you, to affect you, to change decisions made about you, to infer details about your person. And these are the kinds of impacts that can have. And I feel like when you personalize it, it washes away the technicality. Because most people do not care about the technicals. My sweet, patient husband, who listens to me every day talk about this stuff and rant constantly, he's like, I don't care about the technicals. This is boring. But making it real for them, making it real for him, is something that actually resonates, and I think that's a really important part.
[00:38:47] Odia Kagan: And I think that is the part, that's the value add that I try to bring. And I actually think this is what the regulators are looking for. If you tick boxes of here's a category, here's data, here's purpose, here's legal [00:39:00] basis, and you're reading it, to me, it's like an Ikea instruction manual, because I can't do 3D stuff. So I look at it and I'm like, what's happening here? If you have all of the ingredients that are listed in the law, but at the end of the day the normal human being reads it and has no idea what is happening with their data, then I actually think that is not compliant with transparency, because the whole point is to make the person understand. But I think that's the vague thing that people, like, not lawyers, have a bad gut reaction to: you lost me at notwithstanding or hereinafter. What is this? I don't care. But once you understand what this means for you, at that point, then you're having the right conversation. Some people are like, no, I still don't care. I'm fine with this.
[00:39:46] Conor Hogan: But
[00:39:46] Odia Kagan: at least you're speaking from a place of common ground.
[00:39:49] Conor Hogan: Yeah, that's really interesting.
I'm conscious it's just gone the top of the hour. Oh, yeah, we have one question left. Are you okay to hang on, or do we need to bid you adieu? Oh, excellent. [00:40:00] Carey, I think I'm right in saying it's a personal question to end. It's a personal
[00:40:03] Carey Lening: question, and hopefully the random generator actually gets a personal question here. But if not, I'll let you know.
[00:40:09] Conor Hogan: Exactly. Is it me or you? Okay, it's you. One, two, or three, then, for the last time, on the second one.
[00:40:16] Odia Kagan: Let's do two.
[00:40:17] Conor Hogan: Okay. Oh, fantastic. Here we go.
[00:40:22] Odia Kagan: That sounds ominous.
[00:40:24] Carey Lening: Conor is so excited. Conor loves these questions, by the way. Every single time, he's like, oh, this is good.
[00:40:29] Conor Hogan: I know. I actually genuinely haven't found one, hmm. Oh, for God's sake. Here we go. This is, no, this is genuinely good, Odia, right? Predetermination, chaos, or universal alignment: what do you think guides us on the path that we're on? Is it destiny, random chance, pseudo-anonymization, the stars aligning, a mix of each, or something else entirely?
What a question to end.
[00:40:53] Odia Kagan: I think that it is a mix. I think that people [00:41:00] are born with some sort of predisposition, my Chinese medicine doctor calls it composition, right? There are things that you're born with. And then there are things that you add to your disposition or personality or approach to life, based on nature, nurture, environment, things like that. So part of it, yes, you are actively choosing things. But are the things that you are choosing completely volitional, or are they impacted by, I don't know, the childhood trauma you had, or the completely worry-free childhood that you had? Does anyone have a
[00:41:43] Carey Lening: worry-free childhood? I've never met those people.
[00:41:47] Odia Kagan: I don't know, this reminds me. One of the books that I'm reading is a JFK biography, and they asked JFK what his feeling was about the Great Depression, and he said, I had no concept of it. I just learned about it in [00:42:00] college later.
So it's not the same, but some people have an idyllic kind of childhood. But basically I think nobody makes completely unaffected decisions, because they're all impacted by the things that you have, and your setup, and everything. But I also think, one, if you are consciously looking at that, looking at yourself, looking at other people, trying to learn about it, you are able, not to avoid the tendency, but to choose a different action, right? It's like putting space between the feeling and the action, right?
I think that's one part. And then the other part, that's the part that I am working through, that is more difficult for me, is that there probably is some sort of, I don't know if predetermination, but something more universal, something that you can accept, right? And [00:43:00] not try to force things or act, as opposed to accept and wait. I think that there's also an element to that, and philosophers in different kinds of schools think that. For example, we talked about Ryan Holiday and the Stoics, right?
Especially now, right? You read the news, you get chest pain, you move on, right? But one sentence that he has that keeps repeating is, you don't have to have an opinion about everything. And so you can not accept things as, this is the right thing, but recognize that this is what's happening. I think it is, or it should be, a mix of things, where we take actions that are volitional, but we need to also understand that, one, they're not really 100 percent volitional, because they're affected by who we are, and we see things how we are, and they are also affected by the world. And, I don't want to have it sound vague or weird, but you don't control [00:44:00] the world, right? And whatever you do, it interacts with other systems, right? In the privacy space, right, you can control your own thing, and then you look right and left, and there are other countries doing other things, and so the ecosystem looks different. So I think it's a combination.
If you read the Stoics, Marcus Aurelius, who lived, I don't know, a really long time ago, all the things that he writes in his book are the same things that we deal with every day. Ignore the disinformation. It's like, oh, really? You guys had disinformation in Rome? Okay, cool. If we are able to look at it, understand what's going on, and put your volitional actions on top of it, they will be more free and they will be more impactful. And that actually does tie back to my point about transparency being core. So there we go. You made it full circle.
[00:44:46] Carey Lening: Full circle. Also, I have another book for you now, because you mentioned the whole determined predetermination thing. We were chatting with Jason Kronk a couple of weeks ago, and he finished reading a book called Determined [00:45:00] by Robert Sapolsky.
Sapolsky? Sapolsky, yeah.
[00:45:04] Conor Hogan: Sapolsky, yeah.
[00:45:05] Carey Lening: It's the science of life without free will. So that might be an interesting one for your reading corner, Odia. Is it
[00:45:14] Odia Kagan: for or against?
[00:45:16] Carey Lening: I haven't read it yet. It's on my ever-growing pile, because I can't read 15 books at the same time.
But the way Jason explained it, it was very interesting. It gave him an interesting perspective.
[00:45:28] Conor Hogan: Definitely. I feel like I'm the odd one out now, because I haven't recommended any books yet. And I just picked up the one that I have been reading, on and off, for the last probably two to three weeks. And it's completely not related to anything we've been talking about. It's The Room Where It Happened by John Bolton, the former White House National Security Advisor of the U.S., all about his view of his time there. And, I know, totally different, totally not related, and completely from another field. There you go. Very good, though, very good.
[00:45:56] Carey Lening: Well done, Conor. Okay, thank you. [00:46:00] This was excellent. I'm glad you were able to entertain us and spend a little bit more time. I know we went over a little bit; there was so much engaging conversation. But thank you so much, Odia.
I hope you enjoyed it.
[00:46:12] Odia Kagan: Yeah, thank you very much.
[00:46:15] Carey Lening: Awesome.
[00:46:15] Odia Kagan: All right.
[00:46:16] Conor Hogan: No, really appreciate your time, Odia. As good as ever, and looking forward to some more deep dives on LinkedIn, in the very unique but always fantastic style that you bring. So thank you for your time, and wishing you every success, no matter what way you define it, for the rest of your day, your week, and everything else to follow as well.
[00:46:35] Odia Kagan: Thank you very much.
[00:46:36] Carey Lening: And more privacy FOMO.
Thanks for listening to another episode of Chance Conversations. Come back next time where we'll be interviewing the data diva herself, Debbie Reynolds.
Thank you for subscribing. Leave a comment or share this episode.
Welcome to another insightful episode of Chance Conversations, where Conor Hogan and I unravel the intricacies of privacy and technology with renowned expert Dr. Gabriela Zanfir-Fortuna.
Gabriela, Vice President of Global Privacy at the Future of Privacy Forum, joined us for an engaging dialogue that darted from AI's future impact to the profound importance of history in the law, and why, if we don't start paying attention, we might be doomed to repeat it.
Think someone might be interested in hearing this episode? Why not share it with a friend (or two!)?
Conor and I properly acknowledge that Gabriela is a force in the data protection and privacy world. Her experience straddles both sides of the Atlantic, and with a plethora of influential roles under her belt, Gabriela commands attention, while being gracious, humble and encouraging to others in the community. Her impressive introduction left her a touch embarrassed, but excited to dive into our "chance conversation."
In Privacy, It’s About the Past as Much as the Future
Gabriela graced us with her nuanced perspective on the origins and evolution of data protection laws like the GDPR. Drawing from deep knowledge in this space, including her past stint working for the EDPS, where she advised EU legislators on the GDPR during the drafting process, she reminded us of the why behind the laws, and the necessity for laws and regulations to adapt as technology advances.
We both appreciated Gabriela's insights, especially on the question of whether tech always outpaces law. As technology accelerates, will our comprehension of its effects ever keep pace? Gabriela had some thoughts, and they're definitely worth listening to.
But just as a teaser, her spicy take was that while the legal landscape obviously needs to evolve with technology, many of the challenges we face today in our endless war between new tech and law are actually old battles that have been fought, won, and lost in the past, so maybe we should all be reading a bit more history before trying to reinvent things (again).
Personal Reflections: On Journalism, Impact, and A Secret Dream
Interestingly, Gabriela's journey in data protection isn't her first foray into leading an impactful life. Previously, Gabriela was a journalist, where she wrote about local and national issues in Romania. One poignant memory she shared involved her investigation into early climate-induced desertification in Romania. We bonded over our common goal of seeking truth and finding out the why.
But the more fun part of the conversation was around what she would do if money was no object — let’s just say, it includes many of my favorite things, but you’ll have to listen to find out.
On Cats, Clippy, and Philosophy
But don’t worry — not all of our conversation was heavy with such lofty goals and societal implications. A delightful interlude about Clippy and later one about my insane cat obsession also made it in. She even asked me an important question: will AI replace cats? Looking over the timestamps, honestly we might have spent too much time on cats. Sorry.
As our conversation wrapped up, Gabriela even threw a few questions back on me (Conor had to leave a bit early, sadly), and for once, I felt like I was in the hot seat. We talked about our passions, and my latest writing obsessions. It was probably more of me than anyone wanted to listen to, but hey. That’s what makes it a chance conversation!
Listen and Engage!
We invite you to join this stimulating exploration by tuning into the full episode. Whether you're a privacy enthusiast or just curious about technology's imprint on our lives, Gabriela offered an engaging compass to navigate the privacy world. Stay tuned for our next episode featuring the remarkable insights of Odia Kagan, another luminary in privacy law. Until then, keep pondering where humanity and technology converge—and diverge.
Want to get these podcasts delivered in your inbox? Why not subscribe?
Timestamps:
* 00:00 Introduction and Guest Welcome
* 00:31 Gabriela's Background and Achievements
* 01:44 Explaining the Rules of the Conversation
* 03:06 Career Highlights and Personal Aspirations
* 05:56 Industry Trends and Future of Technology
* 16:27 AI and LLMs: Will They Replace Us?
* 22:09 Inference, Intuition, and Philosophy
* 22:29 Personal Questions: Do You Like Cats?
* 23:13 Growing Up in Romania
* 24:03 The Irish and Their Cats
* 25:13 AI and the Future of Cats
* 26:34 Final Thoughts and Farewells
* 26:57 Career Motivation and Journalism
* 27:59 Climate Change in Romania
* 32:00 Challenges of AI Compliance
* 35:37 Closing Remarks and Upcoming Guests
In this episode of Chance Conversations, Conor and I interview Robert Bateman, a renowned expert in data protection and privacy law. Robert shares his journey from self-professed ‘lazy man’ into data protection powerhouse, after he became unexpectedly obsessed with data protection in 2017.
Robert is the owner of KnowData Ltd, based in Brighton, England, where he provides consultancy, training, writing, and events related to data protection, privacy, and AI regulation. He also works as a data protection consultant for a number of firms, and as a trainer for Act Now Training. He also writes and hosts the bi-weekly Privado Privacy Corner, where he turns the crazy churn of privacy and data protection news into something manageable for all of us.
A Unique Approach to Training & Putting Privacy on the Map: When the topic of building a privacy culture came up as the first question, Robert immediately flipped it back on us, so if you haven’t heard Conor or my takes on the subject, prepare yourselves.
But for Robert, the heart of effective data protection lies in making the process approachable and meaningful. His philosophy can be boiled down into a few key points:
* be helpful & proactive
* be clear, and
* be approachable—cultivate a culture where data protection is second nature rather than a dictated chore.
Part of this mentality comes from Robert's deep commitment to empathy and logical consistency. Robert shared how his search for fairness and his focus on empathy have shaped him both professionally and personally. It’s nice to see that while the world of data protection can sometimes feel a bit like a battleground, Robert’s emphasis on understanding and fairness reminds us that at the core of it, data protection is fundamentally about people and doing right by them.
Compliance Grievances: What would a Chance Conversations episode be without at least a little airing of grievances when it comes to privacy theatre? I'll admit, this is a bit in the weeds and mostly EU-focused, but for some of you, I suspect there will be some nodding in empathy and acknowledgement of our shared pain.
Finding a Passion for Data Protection and Understanding What Drives Him: Robert's journey into data protection wasn't an entirely straightforward one (though he did do the oft-customary legal song-and-dance first).
Candidly, he admitted that he used to be “quite lazy”, and that he’s the kind of person who is less driven towards a specific goal or goals (professional or otherwise), and is instead mostly carried along by what interests him. But after an illness, he did find his passion in data protection. Fortunately for us all, even after so many years, Robert seems pretty committed to making data protection, privacy, and AI understandable and clear. And notwithstanding his own opinion, I personally think Robert is one of the most hardworking folks I know, so I think he successfully dislodged his inner lazy.
For Robert, it’s about living in the moment, prioritizing the people he cares about and the life he wants to lead, rather than being consumed by lofty, long-term (and often unachievable) goals. This mindset not only keeps him present but also adaptable to the twists and turns life throws his way.
Inevitably, AI and The Fault in Our Laws: On the industry side, Robert shared his thoughts about the current buzzword dominating our field: AI. While many see AI as overhyped, Robert offered a nuanced view that it might actually be underappreciated by some in the data protection community. But he also shares the view that the state of expectations (including legal expectations) may have to change. We even get into what might happen if the law gets ahead of technology… oh my.
Still, the constant evolution and the legal intersections with AI make it an endlessly fascinating topic, one that Robert believes demands more thoughtful consideration as we hurtle toward a more automated future.
Timeline:
* 00:00 Welcome and Introduction
* 00:15 Meet Robert Bateman
* 01:45 Conversation Rules and Format
* 03:18 Industry Trends and Insights
* 03:33 Creating a Culture of Data Protection
* 13:22 Career Observations
* 19:54 Personal Questions
* 29:08 AI and Data Protection
* 36:19 Final Thoughts and Farewell
Stay tuned for the next episode of Chance Conversations, where we interview Gabriela Zanfir-Fortuna of the Future of Privacy Forum.
If you liked this episode, consider sharing it with a friend, leaving a comment, or subscribing to the podcast!
In the latest episode of Chance Conversations, Conor and I dive into the intricate world of data privacy with Alexander Hanff, a towering figure (both figuratively and literally) in privacy advocacy and an unwavering champion for data protection rights. With decades of experience shaping the laws and ethics surrounding privacy, Alex's insights are not just invaluable—they're transformative.
Meet Alexander Hanff
We begin the episode by talking about Alex's remarkable journey. For those entrenched in the data protection sphere, Alex barely needs an introduction. His contributions span from the drafting of the GDPR to advising the EU Parliament, all underscored by a genuine passion for advancing fundamental rights. Through his consultancy, Hanff and Co., Alex has been a beacon for privacy advocacy worldwide for decades.
Embracing Change in Data Protection
One standout moment in our conversation emerged when Alex shared his views on fostering a culture of data protection within businesses. According to Alex, the crux lies in a willingness to change—something that, despite its challenges, is pivotal. As he empathetically explained, many organizations are resistant not out of malevolence, but habit. Bringing change requires empathy and effective communication, an ethos Alex carries into his consulting and advisory roles.
The Human Touch vs. Technological Advances
In an era where technology dominates, Alex urges us all to strike a balance. While automation and technology can support privacy efforts, Alex warned against over-reliance on these tools at the risk of losing the critical human element. "We're talking about human rights," he asserts, emphasizing that technology, as sophisticated as it might be, can never fully grasp the nuances of human dignity and rights. Here's Alex's take on privacy notices and how we might be doomed if AI starts generating them:
Personal Insights and Industry Reflections
Turning the conversation personal, Alex shares his unwavering belief in learning from life's challenges. He posits that every experience—every hardship—has shaped him into the person he is today. This perspective really touched us, and highlighted the resilience and dedication that define his approach to data protection and privacy advocacy.
A particularly moving part of the episode was Alex's recollection of an encounter with a WWII veteran during a speech at the London School of Economics—a poignant reminder of the human cost behind the rights many of us take for granted. This moment, among others, fueled Alex's commitment to preserving and advancing these hard-won rights.
Looking Ahead
As discussions about future technological impacts and privacy unfold, Alex remains a steadfast advocate for keeping privacy human-centric. His call to action is both a challenge and a reminder: to protect and prioritize human dignity in the face of ever-evolving technology.
We also reflected on the collegiality of privacy professionals — it was a bit of a love-fest at times.
Conor and I think you’ll really enjoy listening — the episode was truly a masterclass in understanding not just the mechanisms of privacy and data protection, but the heart behind the movement. The conversation leaves listeners inspired, introspective, and eager for our next episode, which features Robert Bateman.
Stay connected with us on our LinkedIn page, where we continue to explore and discuss the dynamic world of data protection!
Timestamps:
* 00:00 Welcome and Introduction
* 01:50 Industry Trends and Insights
* 08:18 Career Observations
* 13:58 Personal Questions
* 18:41 Future of Technology in Privacy
* 23:49 The Role of Technology in Privacy
* 24:31 Challenges with Current LLMs
* 25:39 The Importance of Context in Privacy
* 29:14 The Future of AI in Privacy
* 30:26 Personal Reflections on Privacy Careers
* 33:22 The Impact of Personal Experiences
* 36:55 Final Thoughts and Reflections
* 42:36 Closing Remarks
Heads up: Conor had a reprieve and I ran this one solo. I had the pleasure of speaking with Liz Steininger, CEO of Least Authority — a privacy and security auditing firm based in Berlin, Germany. Least Authority’s focus is primarily on Web3, and I’ve had the good fortune to work with Liz and her brilliant team.
We started out by diving deep into reflections and insights that only come from lived experience. Liz's thoughtful perspective provided a wealth of wisdom, offering up advice that is not only personal but universally resonant.
The Question That Opens Time: I started off by posing an introspective question: "If you could go back in time, what's one piece of advice that you would give a younger version of yourself?" It’s the kind of question we’ve all pondered at least once in our lives, and Liz’s response was both relatable and inspiring.
On Misplaced Energy: Liz didn't hesitate to reflect on her past. She shared that one of the most poignant pieces of advice she’d offer her younger self revolves around the use of energy. Specifically, she would tell herself not to waste so much energy on certain people or situations.
Focusing on What Matters: Instead, she cautioned that it's easy to get caught up in the minutiae of daily life, to let certain situations drain our energy and consume our thoughts. Liz's advice serves as a potent reminder: life's challenges are often temporary, and holding on too tightly can prevent us from progressing toward better opportunities that lie ahead.
Remembering to direct energy toward things that truly matter is a lesson many of us could revisit, and Liz's candidness underscores just how transformative this shift in mindset can be.
A Guidepost for Life's Journey: While much of our talk revolved around time, energy and focus, and was indeed a bit of a therapy session, Liz’s reflections are not just words of wisdom but can be seen as a guidepost for all of us navigating life’s tumultuous journey.
She reminds us that while the challenges of the moment can seem overwhelming, tomorrow often brings new opportunities that are worth our attention and care.
Thank you, Liz, for sharing your insights and reminding us of the power of strategic self-focus.
Timeline:
* 00:00 Introduction and Welcoming Liz
* 03:02 Career Observations: Defining Success
* 08:21 Industry Trends: Learning from Failures
* 15:19 Personal Questions: Dream Destinations
* 21:59 Creating a Culture of Security and Privacy
* 25:57 Advice to Younger Self
* 29:48 Life Lessons and Guilty Pleasures
* 35:37 Conclusion and Next Episode Teaser
Stay tuned for the next episode, where we welcome Alexander Hanff to Chance Conversations and put him in the hot seat.
Thanks for listening — if you liked this episode, consider sharing it with others!
If you liked this, you might also like my blog, Privacat Insights.
Thank you for subscribing. Leave a comment or share this episode.

-
In this episode of Chance Conversations, we welcome R. Jason Cronk, a principal privacy consultant at Enter Privacy Consulting Group. Jason shares insights from his diverse career and deep expertise in privacy engineering, privacy by design practices, and regulatory standards.
True Randomness & Philosophy: We start off with some unprompted philosophy about … randomness. Jason shares his latest read, Determined: A Science of Life Without Free Will by Robert M. Sapolsky, and explains why he created his own pseudo-random number generator to choose a random number. Epic nerdery indeed.
We get into some deep philosophical discussions, including the implications of predetermination on life, our brains, and decisions we make around consent & autonomy, as well as fuzzy / murky consent. I also take the opportunity to shamelessly mention JM Berger’s Optimal and my Gikii presentation (which I dive into more detail on in this post).
Jason’s Million Dollar Idea: You’ll have to listen in, but I promise, this is so good.
Hey Policymakers: Theory is Nice, But We Need Some Practical Guidance: Jason shares how one of the biggest glaring problems in privacy and data protection is the lack of specificity in the law, especially when it comes to implementing and building in privacy by design. High-level, principles-based theory is good, but as he notes, “when the rubber hits the road, what does all of this mean and how do you get people to implement it?” Much ranting occurs concerning Article 25 GDPR.
Changing Personal History & Defining Success: Jason reflects on how a different opportunity might have radically reshaped his life. We then go totally off the rails and we end on a discussion of meteors and Armageddon, before coming back to Earth and hearing his views on what success means for him.
Cognitive Biases & Cultural Influences: We all wax on about the different global views on privacy, and Jason leads us back to how cultural influences and biases often shape (or determine) how we think about privacy, autonomy, obligations, and even what approaches to take. This, he argues, should be considered and should inform practitioners when it comes to training, raising awareness, and informing others about data protection. Jason also shared a new-to-us cognitive bias: the Einstellung Effect, which refers to a person’s predisposition to solve a given problem in a specific manner even though better or more appropriate methods exist. When all you have is a hammer…
Wrapping Up: We close discussing the future of technology and Jason’s dystopian and hopeful futures. Naturally, AI & LLMs came up, as well as overreliance on tools that ‘do it for us’ versus being additive. If you listen in, you might get the inadvertent pun we were dancing around the entire time…
Timeline:
* 00:00 Introduction to Chance Conversations
* 00:17 Meet R. Jason Cronk: Privacy Expert
* 01:08 Explaining the Rules of the Game
* 02:17 Predetermination and Free Will
* 05:13 The Concept of Consent in Privacy
* 09:43 Career Highlights and Aspirations
* 12:38 Challenges in Privacy Training
* 14:41 Policy Makers and Privacy Engineering
* 20:09 Personal Reflections: Decisions That Shaped My Life
* 22:06 Hypothetical Time Travel: Changing History
* 23:46 Career Insights: Defining Success
* 25:02 Global Perspectives on Privacy
* 29:15 Cognitive Biases and Privacy Training
* 31:30 Future of Technology: Dystopian vs. Hopeful
* 37:40 Closing Thoughts and Farewell
Our next guest will be Liz Steininger of Least Authority!
Like what you’re hearing? You also might like Privacat Insights!
Consider sharing this episode with a friend or colleague — or leaving a comment below.
-
In this episode of Chance Conversations, Conor and I interview Shoshana Rosenberg, co-founder of Logical AI Governance and of Women in AI Governance.
Good Bosses Can Guide Us: We kick off by touching on significant career influences in Shoshana’s life, including what she has learned from supportive, inspiring leaders who let her take chances outside of what was listed on the job spec. We then discuss one of Shoshana’s many passions: operationalizing AI governance and her PRISM framework, as well as her passion for fostering an inclusive AI community.
DEI, Digital Agency & Explainability: Shoshana discusses her journey, including the founding of SafePorter, a DEI tool suite with a privacy-by-design approach, and the importance of recognizing a right of ‘digital agency’ when it comes to personal information and data.
For Shoshana, digital agency is interlinked with explainability, particularly with regards to AI and algorithmic decision making. Here we discuss the distinctions between technical ‘explainability’ or understandability (like the model weights, data provenance & architecture), versus explainability when it comes to context, controls, and the decision-making processes that models undertake.
On Writing a Book: Did you know Shoshana wants to write a book? According to Conor’s mom, “we all have at least one book in us”, but in the case of Shoshana, I suspect she’s actually going to write that book (or books). We even mused about a collaborative fiction book with other privacy pros, which sounds like fun. Maybe I can write a cat-privacy themed haiku?
On Fear, Bravery, Trust and Taking Risks: Shoshana offers some sage advice to a younger version of herself about not being afraid to go after things, even if it’s not the ideal (or initial) path you expected. Then she turns the question back on us! We also talk about trust, and what bravery means in many different contexts. It gets very philosophical, y’all.
AI Governance & Recycled Air: She offers insights into the future of AI governance, what it means to her, and the pernicious problem of ‘recycled air’ — where people fall back to saying and relying on the comfortable and familiar, rather than taking a bold stance or saying something genuinely unique or different. As Shoshana reminds us ‘We’re in the Mining Era’ — and no, we’re not talking about gold or crypto.
Finally on to Low Tech Problems, Solutions, and Back to Trust: We end with a brief discussion of the importance of handling low-tech problems (aka, we should be worried less about Terminator AI, and more about social engineering), and the value of low-tech solutions. There may, or may not be a product plug here, but we do go back to the all-important question of trust.
It was a great conversation, and we had a wonderful time chatting with Shoshana. Also, I’d be remiss if I didn’t mention that I recently had the opportunity to take Shoshana’s excellent August AI Leadership bootcamp. Shoshana ran us through the paces, where we learned the value of logical frameworks, including her company’s signature LEARN and PRISM methodologies. Shoshana’s courses are vibrant, interactive, and delightfully intimate, and ones I highly recommend to other AI leaders (or those looking to become AI leaders!). You’ll learn a lot, though I suspect it’s just the tip of the huge iceberg of knowledge that Shoshana has.
Timeline:
* 00:00 Introduction and Guest Welcome
* 00:22 Shoshana's Background and Achievements
* 00:41 Women in AI Governance
* 01:01 SafePorter and Privacy by Design
* 02:00 Ground Rules for the Conversation
* 03:49 Career Highlights and Mentorship
* 10:04 Personal Goals and Writing a Book
* 12:23 Future of Technology and AI Governance
* 17:18 Explainability vs. Understandability
* 20:31 Advice to our Younger Selves
* 21:01 Navigating Career Paths: Trusting the Journey
* 22:21 Embracing Risks and Learning from Mistakes
* 23:18 The Importance of Trust and Vulnerability
* 24:16 The Role of Cynicism and Trust in Professional Life
* 25:48 Personal Growth and Self-Trust
* 28:27 The Value of Diverse Perspectives
* 31:13 Underhyped Aspects of Technology and Security
* 35:45 Final Thoughts and Reflections
-
Welcome to another thrilling episode of Chance Conversations!
In this episode, Conor and I sit down with the dynamic Lisa Forte, a partner at Red Goat Cybersecurity and an awe-inspiring high-altitude climber and caver.
Lisa shares her mountain (get it?) of insights on cybersecurity, personal adventures, and life lessons that will leave you inspired, contemplative, and a little jealous (I know I was).
A little about Lisa: Lisa is not your average cybersecurity expert. Beyond her role at Red Goat Cybersecurity, where she runs cyber crisis simulations and awareness courses, Lisa is an accomplished mountain climber and caver. Her exploits have taken her to some of the world's highest peaks and most exotic locations, and she has starred in several documentaries while regularly contributing to BBC news and national papers.
Navigating High-Stakes Environments: In a light-hearted yet profound moment, Lisa captures the essence of balancing perfectionism and humility. She shared how her type-A personality has evolved to make her a better consultant and person by leveraging the positives of getting things right, while simultaneously putting her ego to the side and really listening to client and business needs. Even if their right and her right don’t match.
As the sole professed Type B in the conversation, Conor really appreciated the humility angle.
Adventures in the Central Asian 'Stans: Lisa also drew perhaps one of the best questions in the bunch for a jet-setter such as herself: What’s one part of the world she would love to visit but hasn’t yet? If you want to find out where she’s been (and where she hopes to go soon), you’ll have to tune in.
Hint: This country is known for its mountainous terrain and breathtaking landscapes. Also, the Pallas’ cat, a famous wildcat in the area, has been celebrated on a postage stamp.
But she also reminds us that her travels are not just leisure but also about practicing resilience in daily life, broadening her perspectives, and connecting to others who have diverse cultural and societal viewpoints.
Lisa’s Most Impactful Person: Lisa shared her thoughts on how an impactful science teacher early on shaped who she was and helped her find a place. We all then spent a good bit of time reminiscing over why we each found value through special science teachers in our lives.
A Brush with Mortality: One of the most impactful moments in the conversation was Lisa's recounting of a life-altering accident she had a decade ago. The severe brain injury she sustained caused her to reassess her approach to life, imbuing her with a 'seize-the-day' mentality. This experience has propelled her to live fully and fearlessly, urging others to embrace life's opportunities without delay.
One of those take-aways? Focus more on the actual experience, and less on taking a selfie of yourself in the experience.
Final Thoughts: As the episode wrapped up, Lisa threw a thought-provoking question back to us:
"If you could see one thing happen in our industry in the next year, what would it be?"
Let’s just say, we had some thoughts. Though none of them were quite as profound as what Lisa had to share.
In short, whether you're interested in cybersecurity, personal development, or thrilling travel tales, this interview has something for everyone.
Timeline:
* 00:00 Welcome and Guest Introduction
* 01:36 Explaining the Rules of the Game
* 03:20 Career Insights: Lisa's Comparative Advantage
* 09:10 Personal Questions: Dream Destinations and Travel Stories
* 15:55 Industry Trends: Overhyped Technologies
* 24:00 Career Reflections: Influential Figures
* 24:59 A Teacher's Influence
* 28:36 A Life-Changing Accident
* 30:53 Living in the Moment
* 36:23 The Importance of Redundancy
* 40:35 Final Thoughts and Reflections
PS: Stay tuned for our next episode, where we'll chat with the remarkable Shoshana Rosenberg, co-founder of Logical AI Governance. Catch you next time on Chance Conversations!
-
Ever wondered how an idea can snowball into something incredible?
I honestly think that’s exactly what happened with our new podcast, Chance Conversations.
In our inaugural episode, we had the honor of speaking with the legendary Ralph O'Brien, principal of Renbo Consulting Limited and a visiting fellow at Maastricht University. With decades of experience in global privacy and security compliance, he’s seen it all, and lived to tell the tale.
We kicked off with Ralph playfully questioning the "chance" aspect of our conversation, and once we put his mind at ease, Ralph shared his insightful views on how policymakers often ignore the voice of data protection pros before diving into how special the data protection and privacy community really is. Right back atcha, Ralph.
Perhaps one of the most inspiring moments came when discussing career advice. Ralph’s dedication to mentoring the new generation of data protection professionals clearly shines through and really made Conor and me smile. As Ralph wisely said:
If I can get paid for doing something that I find morally and ethically good, my motto, if you like, what I'd written on my gravestone is hopefully I’m somebody who made a difference.
And that's really all I want.
For those who love a bit of existential pondering, Ralph even shared his thoughts on predetermination vs random chance. I told you this podcast would be like having a chat with an old friend!
We even dove into some playful banter about his recent London meetup (which was looking to draw quite a motley crew of folks in the English data protection and privacy scene). Let us know in the comments if you were able to attend.
Curious to hear more about Ralph's take on technology's role in data protection, his candid thoughts on the profession, or what fuels his passion? Tune in to our very first episode and join us in celebrating those unforeseen, yet deeply meaningful conversations. Conor and I have both learned so much from Ralph (and our other amazing guests) and we hope you will also benefit.
Transcript Timeline:
* 00:18 Meet Ralph O'Brien: Privacy and Security Expert
* 01:05 Explaining the Rules of the Game
* 01:28 Industry Trends and Insights: Data Protection Challenges
* 09:27 Career Observations: Ralph's Journey and Advice
* 18:29 Personal Questions: Existential Views and Beliefs
* 22:56 Artificial Intelligence: Hype, Reality, and Future
* 26:29 The Role of AI in Our Lives
* 26:43 Tech Bros and Black Mirror
* 27:26 AI's Present and Future Impact
* 28:49 Creativity and AI: A Complex Relationship
* 30:25 The Human Spark of Creativity
* 31:52 The Hero's Journey and Patterns in Storytelling
* 32:55 Mentorship and Community in Data Protection
* 39:08 Personal Reflections and Advice
* 43:39 Conclusion and Farewell
By the way, we’ve got more coming up. Stay tuned for our next episode featuring the incredible Lisa Forte of Red Goat Cybersecurity. Until then, happy listening!
-
A few months ago, a dear friend and colleague (Conor Hogan) and I were musing about creating a podcast. We waffled on concepts for months, but nothing really stuck.
That is, until I came across the amazing Wild Card podcast hosted by Rachel Martin. The premise is novel: Rachel interviews some of the world’s greatest artists, thinkers, actors, and musicians, and asks guests to pick a card across three categories. Neither she nor the guest knows the question on that card, but it’s always interesting. What makes this fun is that it’s so unscripted — so raw and real.
When I heard this, I realized this would be a brilliant vehicle for interviewing people within the data protection, information security, AI, tech and adjacent spaces. Conor loved it, and so far, all six of the guests we’ve interviewed have also really enjoyed this strange foray into the slightly unknown.
Our podcast is a bit less existential, but it still maintains the serendipity and opportunity to learn about people we might know in the industry or online, in a format that’s quite different from most podcasts.
Conor and I have been learning a lot from our guests, and over the next few months, we’ll be sharing these ‘Chance Conversations’ with you.
I hope you enjoy listening as much as we do. Here’s a teaser episode.