
Podcast: Privacy Experts Discuss the One-Year Anniversary of the GDPR

May 25th, 2019, marks the one-year anniversary of the General Data Protection Regulation (GDPR), a regulation enacted by the European Union on data protection and privacy. The GDPR aims to give individuals control over their personal data and to simplify the regulatory environment for international businesses.

The regulation divides companies into data processors and data controllers. Any company that has customers in the EU must comply, regardless of where the company is located. The GDPR sent a shock wave through tech companies that had previously gone unrestrained in data collection. In episode 6 of Tech Lightning Rounds, Beth Kindig of Intertrust speaks to three companies that are going to great lengths to protect privacy.

The press tends to focus on companies that do not follow privacy standards. Instead, the focus of this podcast is on companies that go above and beyond what is required. The first interview is with Robin Andruss, the Director of Privacy at Twilio, a leader in global communications that is uniquely positioned to handle data from text messaging sent inside its applications. We also interview Tomas Sander of Intertrust, the company that invented digital rights management and has been advocating for privacy for nearly 30 years. The third lightning round is with Katryna Dow, CEO of Meeco, a visionary startup that is introducing the concept of data control for your digital life.

Topics discussed:

  • Raising the bar in privacy standards with Binding Corporate Rules (Twilio)
  • Whether data privacy should extend beyond the European Union (Twilio)
  • A no-shenanigans approach to privacy (Twilio)
  • Whether the GDPR has been effective or not (Intertrust)
  • How privacy relates to freedom (Intertrust)
  • The seriousness of profiling and look-alike modeling (Intertrust)
  • How mobile changed privacy standards (Meeco)
  • How to redesign society for digital rights (Meeco)
  • The consequences of digital footprints (Meeco)

Please Subscribe and Leave a Review for the Podcast Here

TRANSCRIPT:

03:00 Beth: Twilio provides the messaging, voice, and video inside mobile and web applications for nearly 40,000 companies, such as Uber, Lyft, Yelp, Airbnb, Salesforce, and many more. You can think of Twilio as a telecom in the cloud, which places Twilio in a unique position when it comes to privacy. I had the opportunity to speak to Robin Andruss, Director of Privacy, about how they handle sensitive customer data, including for people who live outside of the European Union.

03:28 Beth: Can you give us a little bit of background as to what your company does?

03:32 Robin: Sure. So, Twilio is one of the leaders in the communications-platform-as-a-service space, where we power APIs for telecommunication services like SMS and texting. A good example is when you order a Lyft or an Uber and you text with the driver: you'll notice that's not really their phone number. So that's an example of one of our services.
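
To make the masked-number example concrete, here is a minimal sketch of sending an SMS through Twilio's Python helper library, so the recipient sees a Twilio-provisioned number rather than a personal one. The credentials and phone numbers are placeholders, and real number-masking products involve more than a single API call.

```python
# Minimal sketch: sending an SMS via Twilio's Python helper library so the
# recipient sees a Twilio-provisioned number, not a personal one.
# Credentials and phone numbers below are placeholders.
from twilio.rest import Client

ACCOUNT_SID = "ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"  # placeholder
AUTH_TOKEN = "your_auth_token"                       # placeholder

client = Client(ACCOUNT_SID, AUTH_TOKEN)

message = client.messages.create(
    to="+15558675310",      # rider's real number (placeholder)
    from_="+15017122661",   # Twilio-provisioned number (placeholder)
    body="Your driver is arriving now.",
)
print(message.sid)  # Twilio's identifier for the queued message
```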

03:57 Beth: As a global communications platform, you're in an important role for privacy because your APIs handle calls and texts inside applications. What is Twilio's stance on privacy?

04:09 Robin: So Twilio actually has what are called "binding corporate rules," which are a global framework around privacy. Anyone who's been in the privacy space for a long time knows that it's actually very challenging to reach this standard. You need to work with a law firm or consultancy to make sure you're meeting a bar of privacy, and actually have your privacy obligations agreed to and approved by your lead DPA, or Data Protection Authority, in the EU; ours is the Irish DPC. When my manager came here several years ago, it was when Safe Harbor was being brought down in 2015, and Twilio's really only been around for 10 years. She said, "I wanna do this right, I wanna build privacy right from the beginning," and that's why she put binding corporate rules into place. So we treat everyone who uses Twilio services the same under our binding corporate rules. One set of rules; we don't have a different one for the US or the EU. So I'd say that you are getting GDPR-level privacy standards when you use Twilio.

05:15 Beth: What are the privacy laws that were passed in California for 2020? Can you give us some background there?

05:20 Robin: Yes. There are still some items being discussed around the law, but it's the CCPA, the California Consumer Privacy Act. It's basically a GDPR-ish style law: there are certain criteria for whether or not you fall under it, and it's more or less targeted toward advertising companies and companies that might sell data about individuals and make money off of it, like Intelius or Spokeo or those sorts of services. But there's a variety of obligations in the actual law. One of them, for example, is that you put a button on your website that says, "Do not sell my data," and you allow people to opt out. Other obligations are more GDPR-specific rights, where a user could request access to all the PII that the company has processed about them. If you went through GDPR and you are giving those rights to individuals globally like we are at Twilio, then it's not a huge lift. But if, for example, you haven't, then a lot of companies are scrambling to comply with that for California residents.
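
As a rough illustration of the "Do not sell my data" obligation Robin describes, here is a minimal sketch of an opt-out endpoint. The web framework (Flask), route name, and in-memory store are assumptions for illustration; the CCPA specifies the right, not any particular implementation.

```python
# Hypothetical sketch of a CCPA "Do Not Sell My Personal Information"
# opt-out endpoint. The route name and storage are assumptions; the CCPA
# mandates the right, not this implementation.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for a real preferences database, keyed by user ID.
do_not_sell: dict[str, bool] = {}


@app.route("/privacy/do-not-sell/<user_id>", methods=["POST"])
def opt_out_of_sale(user_id: str):
    # Record the opt-out; downstream data-sharing pipelines would be
    # expected to check this flag before selling or sharing the data.
    do_not_sell[user_id] = True
    return jsonify({"user_id": user_id, "do_not_sell": True})


if __name__ == "__main__":
    app.run(debug=True)
```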

06:30 Beth: Because Robin works in data privacy every day, I ask her how concerned the rest of us should be about data and what companies can do internally to improve privacy measures.

06:40 Beth: You’re entrenched in privacy, it’s something that you work with every day. Should… How concerned should people be with their privacy? Like people who aren’t in tech, people who aren’t dealing with data as their occupation?

06:53 Robin: With the recent Facebook-Cambridge Analytica issues, I don't think people fathomed what would happen around doing a personality survey, and then all this information being data-mined and sold to third parties. I just don't know what's out there. I'm afraid I don't really use many social media sites, and I don't post much publicly about myself, because I guess I inherently just don't trust many companies with my data. So just think about what you're really putting out there, and why, and about the third party you're giving your information to when you give it away.

07:38 Beth: What is a no-shenanigans approach to privacy, or do you wanna perhaps mention anything else about binding corporate rules?

07:45 Robin: So Twilio has these values, which I personally really love. A lot of companies have values; I don't know if they talk about them in their day-to-day activities. One of ours is "No shenanigans," and what that really means is, inherently, "Let's do the right thing, the right thing for our end users and our customers." You might be in a meeting, and you can say, "Is that the right thing? Do we really wanna do that? Is that the right thing to do for our customers, or is that shenanigany, does it not feel right?" Another one we have is "Wear your customers' shoes." So when we're building a product or thinking about something, we think about how to do the right thing for our customers. Having a good privacy program in place is good for our customers, because when they use our tools and services, they realize, "Okay, they really care about privacy, they want to do the right thing."

08:37 Beth: My next guest is Tomas Sander, the Data Protection Officer at Intertrust, a company that introduced digital rights management to protect music and video nearly 30 years ago, and is now creating data processing platforms with privacy features. We do a deep dive into the GDPR and discuss where there has been progress in privacy over the last year and where there is still cause for concern.

09:01 Beth: For any listeners who may not be aware, can you explain what the GDPR stands for and, perhaps, provide a few key points about the GDPR?

09:10 Tomas: Sure. GDPR stands for the General Data Protection Regulation. It's a sweeping new European privacy regulation that came into effect, and it gives users broad rights, such as the right of access to their data or the right to be forgotten. It also puts a number of obligations on companies and corporations to really demonstrate compliance. It's no longer enough for them to just say they're doing the right thing; they have accountability requirements to fulfill. They need to demonstrate and put processes in place to ensure that they're actually doing what they say in terms of protecting privacy.
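
One concrete form those accountability requirements take is a record of processing activities (GDPR Article 30). Below is a simplified sketch of such a record as a data structure; the field selection and example values are assumptions for illustration, not a complete legal checklist.

```python
# Simplified sketch of a GDPR Article 30-style record of processing
# activities. Field selection is an assumption; a real register is
# maintained with legal counsel.
from dataclasses import dataclass, field


@dataclass
class ProcessingActivity:
    name: str                   # e.g. "Customer support ticketing"
    purpose: str                # why the data is processed
    lawful_basis: str           # e.g. "consent", "contract"
    data_categories: list[str]  # e.g. ["email", "support messages"]
    recipients: list[str]       # who the data is disclosed to
    retention_period: str       # how long the data is kept
    security_measures: list[str] = field(default_factory=list)


register = [
    ProcessingActivity(
        name="Transactional SMS delivery",
        purpose="Deliver messages on behalf of customers",
        lawful_basis="contract",
        data_categories=["phone number", "message body"],
        recipients=["downstream telecom carriers"],
        retention_period="30 days",
        security_measures=["encryption in transit", "access logging"],
    ),
]
```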

09:50 Beth: Yeah. And we're coming up on the one-year anniversary of the GDPR. How has privacy changed in the last year? Can you describe privacy pre-GDPR and post-GDPR?

10:00 Tomas: Sure. One of the main things about the GDPR is that it has extraterritorial reach. GDPR not only applies to European companies, but to companies worldwide if they provide goods and services to European citizens. GDPR also has huge fines for non-compliance, and that has contributed to it being taken seriously by companies globally. Many US companies, for example, had to comply and change their practices because of GDPR. So it's really made privacy a much more important issue for many organizations. Previously, because of data breaches, security became a boardroom issue for many companies; now privacy has become a boardroom issue as well.

10:53 Beth: And do you think that GDPR has been effective, now that it's becoming more of a boardroom issue? What are your thoughts on that?

11:02 Tomas: Certainly, GDPR has been extremely effective in setting the privacy debate worldwide, determining what is being discussed and setting a very high bar for privacy. Although it's a European regulation, it's been extremely effective through its global impact on organizations and on the thinking of policy makers about what they want to do about privacy in their own countries. So in that sense, yes, it has been effective. In another concrete area, GDPR has been concerned with data breaches, and thousands of data breaches have been reported to the data protection authorities in Europe; there it has also been effective. But the question is whether it has had a positive impact where it really matters, namely on the privacy of end users. There, the jury is still out, because data behemoths such as Google and Facebook, whom the European regulators particularly had in mind when they were making these regulations, are right now collecting data from many, many different sources, aggregating it about users, and creating detailed profiles, usually for the purpose of selling advertising, so for profit.

12:17 Tomas: And this practice of taking all this different data, from location data to smart home data to social media data and so on, and using it for sophisticated user profiling, hasn't recognizably changed yet. I recently heard data protection commissioners speak at a privacy conference in Washington, and they believe that we're going to see some of these investigations conclude this summer. Hopefully there will then be some enforcement, and some of the commissioners certainly believe that there will be fines.

12:50 Beth: Many of my listeners are aware of Cambridge Analytica, which involved an agency using Facebook data to target people politically outside of Facebook. I take the opportunity to ask Tomas about some of the negative effects data and look-alike modeling have had in other global regions as well.

13:07 Beth: You’re entrenched with privacy, you deal with privacy every day, data and privacy every day. People who aren’t in tech, who aren’t working at data companies, who don’t know much about privacy, should they be concerned?

13:22 Tomas: Absolutely. I think people should be deeply concerned about privacy. And this relates to what I said earlier: we're coming to understand that protecting privacy is about many things that we value in society. Let me make this a little clearer. First, today more and more data about people are being collected. It's big data about your web browsing, your searches, location data, the data you share on social media; there might be facial recognition from images, and these days also IoT and smart home data that give intimate insights into what's happening in your home. So more and more data are being collected. But secondly, given machine learning and AI algorithms, it is now possible to infer many sensitive attributes about users from these data sets, even when the data sets are seemingly innocent. For example, you can predict sexual orientation from facial images, and you can predict the upcoming onset of depression from social media data before the user him- or herself may even know about it. More generally, you can predict personality characteristics, such as whether a person is an introvert or an extrovert, from their data on social media. So there is simply no way to hide from that.

14:51 Tomas: So even sharing relatively innocent data allows a lot of sensitive inferences to be made. And yes, we should be very concerned about this, because it could really change the way society works. We could lose a lot of the freedoms that we have enjoyed over centuries, really.
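
To make the kind of inference Tomas describes concrete, here is a minimal sketch that trains a classifier to predict a "sensitive" label from seemingly innocuous behavioral features. The data and features are synthetic inventions for illustration; profiling systems apply the same mechanics at scale to real web, location, and social media data.

```python
# Illustrative sketch: inferring a sensitive attribute from innocuous
# behavioral features. The data is synthetic; real profiling systems
# apply the same mechanics to web, location, and social media data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented features: posts per day, average post length, late-night activity.
n = 2000
X = rng.normal(size=(n, 3))

# A synthetic "sensitive" label correlated with the innocuous features,
# standing in for traits like personality or health status.
y = (0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n)) > 0

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"Accuracy predicting the sensitive label: {model.score(X_test, y_test):.2f}")
```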

15:12 Beth: Can you expand on that a little bit, maybe give me a “worst case scenario” of what big tech companies could do with this much information?

15:20 Tomas: Yes. The examples I just gave are mostly about tech companies observing what you do and perhaps creating a user profile. A next step they could take is to not only observe what you do and predict what you're going to do next, but also to try to manipulate and influence what you do. They would usually do that for profit motives, and that is certainly a major concern, because people may not even realize that they're being influenced. There have been studies done at Facebook showing, for example, that simply changing the website a little bit may impact the percentage of users who turn up for an election. So tech companies may now be not just observing what we do, but actually influencing and shaping what we do. And that is of course a major concern, and it should be, because then it really becomes about our individual freedom. It really becomes about democracy.

16:35 Beth: Yeah, we have heard about people’s personal data being used to target people politically in the United States. However, this also happened in Germany, I believe. Can you tell us about that?

16:46 Tomas: Yes. In Germany there is a far-right party called "Alternative for Germany," or "Alternative für Deutschland." They were able to use a Facebook feature that was created for advertisers to help achieve the best result in a federal election for any far-right party in Germany since World War II. That was certainly fairly shocking to many Germans. The feature being used here was "look-alike audiences." What happened was that Facebook helped this party analyze the characteristics of the 300,000 users who had liked the party. From these users it then created a "look-alike" audience of another 300,000 users who were similar in characteristics to those who had already liked the party, and ads were then specifically targeted at this group.
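
For readers unfamiliar with the mechanics, here is a minimal sketch of how a look-alike audience can be built: score every candidate user by similarity to a "seed" audience and keep the closest matches. The feature vectors and audience sizes are invented for illustration; ad platforms use far richer behavioral signals and models.

```python
# Minimal sketch of look-alike audience expansion: rank candidate users by
# similarity to the centroid of a seed audience. Features are invented.
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(1)

seed_audience = rng.normal(loc=1.0, size=(300, 8))  # users who liked the page
candidates = rng.normal(loc=0.0, size=(5000, 8))    # everyone else

# Represent the seed audience by its average feature vector.
seed_centroid = seed_audience.mean(axis=0, keepdims=True)

# Score each candidate by cosine similarity to the seed centroid and keep
# the 300 most similar users as the "look-alike" audience.
scores = cosine_similarity(candidates, seed_centroid).ravel()
lookalike = np.argsort(scores)[::-1][:300]
print(f"Top look-alike similarity score: {scores[lookalike[0]]:.3f}")
```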

17:51 Beth: In the third "lightning round" I speak to Katryna Dow, the CEO of Meeco, a company building tools that power consent for personal data. Katryna is very knowledgeable about the evolution of data, and she puts into perspective how the mobile device changed the parameters around privacy.

18:09 Beth: What kind of data abuse is most concerning from a consumer’s standpoint, also from your standpoint as a technologist?

18:15 Katryna: Well, look, I think the biggest challenge right now is that people just don't understand what goes on under the surface. You take a picture of your child in the park on Sunday and you upload it to a social media site, and you're not thinking, "Okay, this picture of my five-year-old daughter, and what she's doing and how she's moving, may actually impact her credit rating in the future," her ability to access a service. You do a DNA test because you wanna explore the genealogy of your family, and you haven't read the fine print that says you're actually signing away the rights to that, maybe for two or three generations in the future. And this isn't your child, this isn't your grandchild; maybe it's your great-grandchild who turns up at a medical facility one day and is told, "Sorry, you can't be treated for this because your great-great-grandparent signed away the rights around your DNA." And these aren't "future things." This is not some sci-fi thing that's happening out there; these are the kinds of things that are happening right now. People don't read the terms and conditions. People don't understand the consequences of something they do right now, digitally, and what it might impact some time in the future.

19:38 Katryna: So I think the biggest challenge is: how do we get people digitally aware? How do we start to manage some of those potential downstream consequences? And how do we help people make more informed choices around the services they wanna use, or argue for better rights in terms of those services, so those consequences don't happen?

20:05 Beth: We know some of the players who may not be handling data well; they're in the news frequently. Is there anybody who is handling data well, any industries at the forefront of trust and data privacy that you can think of?

20:25 Katrina: That’s a really good question and you’re right. We know all of the names that are being called out every day. Look, I wanna say there are, and probably in pockets. But I would say those pockets are in this whole emerging trust sector that ironically says, “We can’t trust. We’ve had so many privacy breaches. The only way that we can do this is to take a completely different design philosophy.”

21:00 Beth: Can you give us maybe some background information? How did privacy and digital trust change when we migrated from desktop to mobile?

21:08 Katryna: So I think one of the big things was that stuff went under the bonnet. We wanted this great UX experience, and that meant a lot of stuff had to happen in the background. I think this is probably one of the big challenges for a company like Facebook. When you looked at 2012, when they were doing their IPO (and in fact it's one of the reasons I founded our company), I was looking at the risks they were signalling to the market, and there were four things that jumped out at me: a concern around privacy, a concern around regulation, a concern around what would happen if customers became aware of the value of the data, and the fact that they were in the process of moving from desktop to mobile. Therefore, a lot of what goes on had to become more opaque; it goes under the bonnet. And that's what makes a great UX experience, all this stuff going on in the background.

22:03 Beth: Katryna and I continue the conversation about the UX, or user experience, of the mobile device, and how this complicates privacy for end users. She also discusses one of the principles of the GDPR, which is designing privacy into applications or websites as the foundation of the design, rather than adding privacy as an afterthought.

22:24 Beth: There are already a lot of privacy concerns with how companies collect and share data. This will only increase as we connect more machines and use AI for applications. What is the future of data privacy, fast-forwarding five to 10 years from now? Maybe talk about the worst-case scenario and the best-case scenario.

22:40 Katryna: I love this question, because I really do think there are very simple answers. The worst case scenario is we just keep going. If we just keep going as we are right now, and without sounding Orwellian, we will actually see a generation of children born into digital slavery. If we don't change this and don't course-correct, that is absolutely where we're going. However, we have the ability to hit pause and go, "Okay, this is not working at an architectural level." Every one of us has a digital twin: everything we do right now is creating some kind of data. This podcast, this conversation, how it will be distributed, the fact that we're standing here together right now, the location; there's this digital other around everything we're doing in the physical world. We've spent the last, I don't know, hundred years campaigning for physical rights: the right to vote, the right to take a bus, the right to be paid equally, the idea that privacy is a human right, the right to have access to food and shelter. We've done all of these amazing things for our physical selves; we now need to start thinking about how we do that for our digital selves. And that is a significant change in the way we design. We've redesigned society for physical rights. What we have to do right now is redesign society for digital rights.

24:16 Beth: Do you think that the GDPR, which introduces some level of control, is effective? Why or why not?

24:25 Katrina: It’s early days.

24:26 Beth: Yeah.

24:27 Katrina: It’s not working as intended right now. And in fact…

24:33 Beth: What’s not working about it?

24:35 Katryna: So I think the biggest problem right now is that the UX level is just not working. Organizations that have been smart in terms of creating enormous amounts of friction are using that to their advantage. It's a paradox: they're legally compliant, but they have made that compliance burden so overwhelming that "I agree," or just anything to get this screen out of the way, is driving the behavior. So I think one of the problems, and I was actually speaking with one of the regulators about this yesterday, is how do we think about GDPR at the design level? Part of GDPR is privacy by design, but we haven't seen that surface to the UX level. And I think right now it's just so overwhelming for people to even work out what the choice is: what are they saying yes to, what are they saying no to? So I think the underlying components are there from a legal framework. Now, how do we move that to what we know is the everyday use case, which is how you interact with those frameworks?
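
As a rough sketch of what surfacing "privacy by design" at the UX level could mean, here is a hypothetical per-purpose consent record, where each processing purpose is off by default and opted into individually rather than bundled behind a single "I agree" button. The structure, purpose names, and fields are assumptions for illustration.

```python
# Hypothetical sketch of granular, per-purpose consent, as opposed to one
# all-or-nothing "I agree" screen. Purposes default to off, and every
# decision is timestamped for auditability. Field names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    user_id: str
    # Each purpose is independent and defaults to "not consented".
    purposes: dict[str, bool] = field(default_factory=lambda: {
        "service_delivery": False,
        "analytics": False,
        "personalized_ads": False,
        "third_party_sharing": False,
    })
    history: list[tuple[str, bool, str]] = field(default_factory=list)

    def set_consent(self, purpose: str, granted: bool) -> None:
        self.purposes[purpose] = granted
        stamp = datetime.now(timezone.utc).isoformat()
        self.history.append((purpose, granted, stamp))


record = ConsentRecord(user_id="user-123")
record.set_consent("analytics", True)  # explicit, purpose-specific opt-in
print(record.purposes)
```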

26:00 Beth: Thank you for listening to Tech Lightning Rounds. Please support the production of this podcast by subscribing on iTunes and leaving a review.

26:07 S1: This podcast was brought to you by Intertrust, the inventor of digital rights management. Intertrust's DRM solution ExpressPlay has strong privacy protections built in from the ground up. Go to intertrust.com for more information.


