The Economist’s Dexter Thillien: Privacy in peril amid digital data explosion
IN BRIEF
- "When we’re talking about biometric data and biometric computing it raises a question of what type of data we might be sharing. When it is possible to change an email address or even your financial or other details, it is impossible to change your fingerprint or your DNA."
- "I still don’t think that self-regulation is the answer. While I mentioned the GDPR might not be as well enforced as it should be it still offers a EU citizen much more protection than in many other jurisdictions across the world."
- "What I find interesting is […] younger people telling their parents not to upload pictures of them online. It made me think about the concept of what I might call “online privacy native.” Where maybe the younger generation is less keen to share publicly compared to the previous generation."
In this VISION by Protiviti interview, Joe Kornik, Editor-in-Chief of VISION by Protiviti, sits down with The Economist’s Dexter Thillien. Dexter is the lead analyst for technology and data at The Economist Intelligence Unit, the research arm of The Economist. Dexter is the lead author of numerous reports on AI, cybersecurity, data privacy, technology and regulation, as well as a frequent speaker on the intersection of the digital economy and global business. Here, he discusses how privacy is in peril in the digital economy, the impact of emerging technologies on data protection, regulation vs. innovation, and how the private sector will play a significant role in data privacy in the future.
In this interview:
1:11 – Biggest privacy issues for consumers and companies
3:18 – Emerging tech’s effects on privacy
5:42 – What type of regulation is needed?
7:49 – Who’s taking this seriously?
11:01 – Privacy in 2030
The Economist’s Dexter Thillien: Privacy in peril amid digital data explosion
Joe Kornik: Welcome to the VISION by Protiviti Interview. I’m Joe Kornik, Editor-in-Chief of VISION by Protiviti, our global content resource examining big themes that will impact the C-suite and executive boardrooms worldwide. Today, we’re exploring the future of privacy, and I’m happy to be joined by The Economist’s Dexter Thillien. Dexter is the lead analyst for technology and data at The Economist Intelligence Unit, the research arm of The Economist. Dexter is the lead author of numerous reports on AI, cybersecurity, data privacy, technology, and regulation, as well as a frequent speaker on the intersection of the digital economy and global business. We spoke to Dexter last year about privacy in the metaverse and he has been kind enough to come back. Dexter, thank you so much for joining me today.
Dexter Thillien: Great to be here.
Kornik: Dexter, in a digital economy I don’t think there’s any question that data privacy is now, and probably will continue to be, one of the biggest issues facing consumers and companies for the rest of this decade and probably much further into the future. What do you see as the biggest threats in terms of data privacy for both consumers and companies?
Thillien: Yes, thank you for the question. First of all, you’re right to differentiate between companies and consumers because I think they will have different issues to deal with. Companies are holding more and more personal data as part of their business processes, and the key is to make sure that data is secured. That means putting the right governance system in place internally so that, for instance, only the right people can access sensitive data, and making sure that the company’s own data and any data from suppliers or consumers is being dealt with properly. That’s going to become more and more of an issue because there’s going to be more and more data to deal with. For some companies, it might even become a competitive advantage; we’ve seen Apple trying to do that with its privacy policy compared to its competitors.
For consumers, it’s a different question, a different range of issues. For consumers, it’s important to keep data safe and secure, but that is becoming increasingly difficult because we’re giving away much more personal data at any one time. Giving away personal data is no longer just about filling in a form; it happens any time we see or do something online, and any time we’re on the move, because most of us carry a smartphone and many of the apps on it collect a huge amount of data. We may become a bit blasé about all the data we’re giving away, but it’s also very difficult to operate and use the internet if we’re trying to minimize the amount of data we give away. Meta, as an example, has tried to build a more private platform by charging users and making privacy a premium feature, but so far this has been refused by the European Commission in the European Union. The issue is that advertising remains the cornerstone of Meta’s business, which means the platform is free as long as we give away much of our personal data. With pictures and videos on top of text, and facial recognition entering the fray, we’re starting to give away data that is even more unique and much more difficult to replace if it is ever hacked.
Kornik: I’ve been reading a lot of The EIU’s position papers on AI and really all emerging technologies, which includes quantum and spatial and biometric computing, and how those will ultimately impact data and privacy. How do you see AI and those other emerging technologies I mentioned impacting privacy going forward?
Thillien: I think they will all have an impact. Starting with artificial intelligence: artificial intelligence is all about data, to the point that some are arguing we may run out of web data as early as 2026, on top of all the personal data we have given away so far. One of the major issues with artificial intelligence is the output of a model, as it may give away some personal data as part of an answer because that personal data was part of the input. Sometimes it’s very, very difficult to understand why it does that. We have seen some cases in Europe where this has happened, and privacy regulators are keeping track. There is also a consent issue, which is why Meta says it will not release its most advanced Llama model in Europe: the company is not entirely sure it can comply with the GDPR when using content other than text, such as pictures and videos.
In terms of quantum computing, we saw in August the National Institute of Standards and Technology in the U.S. release its first post-quantum encryption standards, over fears that a quantum computer might break current encryption standards sooner rather than later. It’s still not very clear when that will happen, and at The EIU we do not think it’s going to happen any time soon, at least in the short to medium term, because many, many technical hurdles remain. But it’s better to be safe, especially as some of the data encrypted today will still be very, very valuable when and if a quantum computer becomes operational.
When we’re talking about biometric data and biometric computing, it raises the question of what type of data we might be sharing. While it is possible to change an email address or even your financial or other details, it is impossible to change your fingerprint or your DNA. We may not always be able to identify what we are sharing, but it is something to consider if we want to make sure that data does not fall into the wrong hands.
Kornik: Right. Thanks, Dexter. You mentioned the GDPR, so let me just follow up on that. Globally, Europe has invested significantly in data protection rules with the GDPR. Japan has had very strict privacy laws in place as well. Meanwhile, India and China, not so much. The U.S. is somewhere in the middle but has no federal regulation; a lot of the states have sort of taken the lead on that front. Where is the sweet spot? Who is getting this right? Does too much regulation stifle innovation, or does not enough create chaos? Where do you stand on regulation?
Thillien: I think finding the sweet spot between regulation and innovation is what every policymaker and every regulator is trying to do, and it’s an issue for all tech regulation, not just data privacy. Sometimes regulation becomes more of a box-ticking exercise; we have seen that with cookies in Europe, where it has had little real impact on privacy because active consent is not fully given. I do think we need some level of regulation, because without it any protection will be lacking, and there need to be independent rules put in place.
I think for me there are two main things to consider when we’re talking about regulation. The first is fragmentation. Many, many businesses will be global in nature, whereas regulation very often is not. This means a company has to decide what to do: whether to follow each jurisdiction where it is required, accepting some overlap, or to go with the strictest rule and have only one set of rules to comply with globally. Some companies have already done that with the GDPR.
The second, and probably the most important one, is enforcement. Rules are very nice, but without the right enforcement they can be a bit pointless. We’ve seen, again with the GDPR, that it can take quite a long time before any rulings or judgments, because it can be very, very tough for regulators to make the case. It’s very important to know at what level you can enforce before you start thinking of regulation.
Kornik: Barely a week goes by without hearing about another significant breach, right? I just wonder if consumers specifically are becoming desensitized to these breaches. Do we suffer from low expectations when it comes to our own privacy?
Thillien: I don’t know if we’re desensitized, but I think the issue is not always very visible to the average user. We often hear about an attack where millions, if not billions, of entries have been hacked, but the impact of that attack is very difficult to gauge because in most cases it means we’re going to receive a bit more spam email. It becomes much more personal when there are financial repercussions from an attack, meaning we can be scammed, or our banking details become available and people use them to buy things online. I also want to make the point that companies can try to do as much as they can, and many, many do, but the attacker, in this case the hacker, is favored over the defender: the attacker needs to get it right only once, while the defender has to get it right all the time. Now, as we’re spending more and more time online, the attack surface is only increasing, which means that those breaches will keep happening.
Kornik: Right. We’ve seen big companies, even very big tech companies, playing sort of fast and loose with data and privacy. Even children’s privacy, I think, has been in the news recently. Can we trust the private sector? I mean, we were talking about regulations, so I’m just curious. Can we trust the private sector to do what’s right in terms of privacy? Are boards and the C-suite taking this issue seriously enough, do you think?
Thillien: Some are, but I still don’t think that self-regulation is the answer. While I mentioned the GDPR might not be as well enforced as it should be, it still offers an EU citizen much more protection than many other jurisdictions across the world. You mentioned that the U.S. still doesn’t have federal rules; it is trying to remedy that in terms of children, but that needs to get passed through Congress, which is very difficult as well. The U.S. also has a much bigger, what I would call, third-party market, where data you might have happily given to a provider or a retailer is then sold on to a third party without you knowing about it, and perhaps you wouldn’t want that particular third-party company to have access to your data. Companies do have to take it seriously because it can impact their reputation if they are hacked and it is proven they haven’t done as much as they could have. With the greater penetration of technology in the workplace and the move towards digital information, doing this well can also be a phenomenal advantage for business continuity. I think the CrowdStrike incident in July 2024 has shown how reliant we’ve become on digital technology and how important it is to protect those systems. Whether it could become a competitive advantage is very, very difficult to say, because privacy is one of those areas where doing things right so that nothing happens has a limited visible impact, but not doing things right when something does happen could have a major negative impact. So the upside is not very apparent, but you do need to do as much as possible to make sure that nothing actually happens.
Kornik: Dexter, I appreciate your time today. I really enjoyed our conversation. I just have one more question and it’s forward looking. I’m wondering if you could take me out to the end of the decade, let’s say 2030, and tell me what you see around data and privacy. I’m wondering how we’ll view privacy in 2030.
Thillien: I think for me it’s an evolving concept because the online world has become so prevalent, but the right to privacy is also a fundamental human right, whether it is online or offline. This is Article 12 of the Universal Declaration of Human Rights, and I’m going to quote it: “No one should be subjected to arbitrary interference with their privacy, family, home or correspondence, nor to attacks upon their honor or reputation. Everybody has the right to the protection of the law against such interference or attacks.” I think this is the case in both the online world and the offline world. I’m going to give you a personal example, maybe. I graduated from school in the very late 20th century, and I don’t think I used the internet at all for any of my coursework during high school. When you consider that the iPhone launched in 2007, 17 years ago, and Facebook in 2004, 20 years ago, it shows that many, many younger people are now what we call fully digital native and are going to have maybe a different perspective. What I find interesting is some stories I saw over the last few months and years where kids, where younger people, were telling their parents not to upload pictures of them online. It made me think about the concept of what I might call “online privacy native,” where maybe the younger generation is less keen to share publicly compared to the previous generation. I think we’ll have to wait and see what will actually happen going forward.
Kornik: Yes, that’s interesting. I hadn’t really thought about that, but you’re right. That generation does seem more conscientious about sharing photos.
Thillien: I think they might have a different perspective when it comes to their online persona, their offline self, and what they share online. So they might not have a different vision of privacy more broadly, as a whole generation, but they might in terms of what they’re doing online, because they’re fully digital native and they are online a lot of the time. Everybody is going to have a smartphone; that’s not going to change. We’re still going to be using the internet, and we’re still going to share some data. But the younger generation, which has only known that world, will probably have a different perspective in terms of what they’re willing to share, especially what they’re not willing to share, and what they might get in return for what they’re sharing. I think it’s very early days and we’ll have to wait and see.
Kornik: Right. Very interesting. Thanks, Dexter, for that perspective and your insights. I really enjoyed our conversation today.
Thillien: Thank you very much for having me.
Kornik: Thank you for joining the VISION by Protiviti interview. On behalf of Dexter Thillien, I’m Joe Kornik. We’ll see you next time.