Is Privacy Possible in a High-Tech Data-Driven World?

Privacy is Dead

The idea of privacy has evolved. From the early days of the internet to today’s complex global data ecosystems, privacy concerns have changed, and the rise of AI and biometric data increases those concerns exponentially. Is it even possible to have privacy anymore? Let’s take a look at the current state of digital privacy and where it might go in the future.

See Privacy is Dead with Pam Dixon for a complete transcript of the Easy Prey podcast episode.

Pam Dixon is the founder and executive director of World Privacy Forum, a nonprofit public interest research group. She has been working on privacy since the early 1990s, making her one of the earliest people in the field. As soon as she got on the internet, she realized it would have unique privacy implications. The idea of what the world would look like once everything was fully digital became a passion. In 1993, she co-authored Be Your Own Headhunter Online, one of the first books to talk about internet privacy.

While researching workplace privacy, Pam found significant privacy and security issues in popular online resume databases. She wrote a 90-page report about it. Richard M. Smith, lead director at the Denver School of Law’s Privacy Foundation, was impressed by the report and hired Pam as a principal investigator. There, she learned the technical aspects of privacy. After 9/11, she decided the best thing to do would be to get into broad research. So she founded the World Privacy Forum to do research on privacy and data governance. They research the privacy implications of complex data ecosystems, data brokers, identity, healthcare systems, financial systems, global privacy, AI, genetic databases, machine learning, and more.

The Evolution of Privacy Concerns

The first modern age of privacy was in the 1970s. The German state of Hesse passed the very first privacy law in the world, though it was less than a page long. The United States passed the Fair Credit Reporting Act, the first major piece of privacy legislation in the world. The Organisation for Economic Co-operation and Development (OECD) in Paris took notice and used it as a foundation for the OECD Privacy Principles. These principles then percolated into laws across the world. The European Data Protection Directive, passed in the 1990s, became the first multinational privacy law.

Around 2012, Europe realized that privacy laws written before most people were online no longer matched where the internet had gone. They started the conversation around what is now known as the GDPR. The GDPR has an extraterritorial provision: even though it’s a European law, it applies to any organization that wants to do business in Europe. This incentivized many other countries to pass similar legislation. Now, the only countries that haven’t are small island nations, countries without the economic infrastructure to do so, and the United States.

Laws have been passed in many countries to try to manage digital privacy concerns.

The idea of privacy has to change along with technology. Just look at how much things have changed since the beginning of digitization in the late 1980s. The internet era has completed its purpose, and now we’re moving into machine learning and AI. Everyone’s talking about large language models and confused about how AI impacts everything. We’re seeing extraordinary systemic impacts and changes, even in places we wouldn’t expect. This is a unique transformation we’re lucky to be seeing firsthand.

We are right now in a transition that happens once every thousand years. We’re very lucky people. – Pam Dixon

Consumer Perspectives on Digital Privacy

Initially, consumers were unequivocally excited about internet technology. When Google launched Gmail, though, Pam realized it was letting its ad engine scan people’s emails for keywords, and that wasn’t okay. She co-wrote a passionate letter about how bad it was, and got death threats for years over it. But she was right, and Google eventually stopped the practice because it violated European privacy laws.

This shows how consumers have become more aware of privacy concerns and more ambivalent about the technology. We saw the same thing with social media. People posted everything enthusiastically on Facebook, then migrated to other social media. Now we’re seeing kids suing their parents over sharenting and more concern about what tech is doing to us. Public perception has gone from “This is a great innovation!” to “It has some harms, and we need ways to be safe from those.”

It goes from, yeah, this is a great innovation, to … we need some guardrails. – Pam Dixon

Other parts of the world have different mindsets, too. Asia has less rigid ideas about privacy – more guardrails, but more flexible ones. India has the Aadhaar system, which does fully digital, real-time biometric identification. Even the smallest, poorest villages are digitized. India also has many digital, technical, and legal protections for privacy. But because of how quickly technology is changing, there are still gaps.

Adapting to AI

At this year’s OECD meeting, the University of Melbourne presented a study it had done: a trust index for AI systems. The developing world had the highest measured trust in these systems. This has led to an interesting situation where developing nations are the earliest adopters of new AI tech, while developed nations are more hesitant. There’s real potential for these nations to “leapfrog” over others in the next decade.

These systems are very complex. In the US, the themes in privacy around these systems are control and consent: control the flow of data, and give people an opportunity to consent. But the quantity of data is so great that it’s hard to give people genuine insight into every piece of it. Data free flow with trust (DFFT) might be a solution. It allows data to flow more easily between trusted and verified entities. Privacy advocates tend to want more guardrails than DFFT has, while the innovation side appreciates that it provides fewer roadblocks. But everything is so complex that it’s impossible to come up with a single solution that fits everything.

We are in such a complex situation that it’s almost impossible to characterize everything under one rubric anymore. – Pam Dixon

The Future of Privacy

The privacy principles we know today have flowered, and now they’re reaching the end of their usefulness. New principles are waiting, but we don’t yet know what they are. There are people who will tell you they know. Pam thinks she sees hints of them. But we need to watch the ecosystems as they evolve before we make assumptions. Cars have been around for a while, and we can make them better because it’s tweaking a known system. Managing privacy concerns is a challenge because we don’t know where we’re going or what systems will look like in the future. Some aspects will get better and some will get worse – it’s not an either/or situation and it’s not linear.

We do have hints of where some problems are, especially with machine learning, neural networks, and AI. Geospatial Information Systems (GIS) are used a lot around the world, primarily for good purposes like providing healthcare to nomadic tribes. But a system that can identify and track different ethnicities can also be used for evil. Algorithms have been criticized for bias – racial, age, gender, and more. If everything is tracked, even when the purpose is to help people, it can both help and hurt.

Digital profiling is only one of many privacy concerns and risks - it can be used both to help and to harm.

Privacy Concerns in Tracking and Machine Learning

A policy study found that if you’re a member of an indigenous tribe or another identifiable group, you could be genetically identifiable. If you’ve provided genetic information to a biobank and that data is analyzed with AI or machine learning, there’s a good chance you could be identified even if the data is anonymized. If you’re a member of a Native American tribe, have a rare disease, or belong to any other group without a lot of people in its data set, you could still be identifiable. Current privacy mechanisms don’t cover this concern.
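The mechanism behind this is worth seeing concretely. Even after names are stripped, combinations of remaining attributes (“quasi-identifiers”) can single a person out when their group is small. The sketch below is a toy illustration of that idea, not anything from the episode; the records, field names, and the simple k-anonymity-style group count are all invented for demonstration.

```python
from collections import Counter

# Toy "anonymized" biobank records: names are removed, but quasi-identifiers
# (group membership, a rare-disease flag, birth year) remain. All values
# are hypothetical.
records = [
    {"group": "A", "rare_disease": False, "birth_year": 1980},
    {"group": "A", "rare_disease": False, "birth_year": 1980},
    {"group": "A", "rare_disease": False, "birth_year": 1980},
    {"group": "A", "rare_disease": False, "birth_year": 1980},
    {"group": "B", "rare_disease": True,  "birth_year": 1975},  # only such record
]

def k_anonymity_classes(rows, keys):
    """Group rows by their quasi-identifier values and count each group's size.
    A group of size 1 means that person is uniquely re-identifiable, even
    though the data contains no names."""
    return Counter(tuple(r[k] for k in keys) for r in rows)

classes = k_anonymity_classes(records, ["group", "rare_disease", "birth_year"])
unique = [combo for combo, count in classes.items() if count == 1]
print(unique)  # the (B, True, 1975) record stands alone: "anonymity" fails for it
```

The four identical “group A” records protect each other (any one of them could be any of the four), but the lone record from the small group is exposed. This is why rare diseases and small tribes are exactly where anonymization breaks down.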

Every individual has their own level of concern about this. Some people with a rare disease are concerned about the privacy of their health data and don’t want it out there. Health privacy is an important area of research. Other people are willing to risk potentially being exposed to help other people who also have that disease. There’s vulnerability there, and we have to understand the complexity of privacy. It gets even more complex when AI is involved.

Privacy, when it comes down to it, is highly contextual, and this is what makes it so difficult to actually implement and legislate. – Pam Dixon

Right now, we’re in a transitional era. It’s essential to understand what we’re doing. We need to move slowly and carefully, but we do need to move. AI requires guardrails, even if we’re not sure what those look like right now. We need to develop principles and apply them. Right now, a lot of this technology has no governance systems in place. But those tools are important if we want to address privacy concerns.

The Importance of Testing Solutions

Any kind of tracking needs to have guardrails. Pam isn’t sure what those should look like, and we need a lot of factual work to figure out what they should be. Any proposed solution should be tested. You’d be surprised how many legislative solutions to privacy concerns have never been tested. We’re past the era where that could work. Proposed solutions must be tested to see if they work.

There are all these proposals for fixing privacy that literally have never undergone a single test. – Pam Dixon

Lots of legislators are worried about hurting innovation. But anyone who’s looked at how current privacy law actually applies in the world knows it’s almost inapplicable to any meaningful AI. People want to take advantage of this technology, but they have a vague sense that something could go wrong. It’s hard to see the privacy concerns. Guardrails will actually help people feel more comfortable engaging with these technologies.

Privacy Across the World

In Rwanda in the 1990s, there was a terrible genocide facilitated by ID cards that listed ethnicity. Pam had the privilege to teach at Carnegie Mellon University Africa in Rwanda in 2023. She taught about identity ecosystems in the developing world and invited Josephine Acacia of the Rwandan National Identity Authority to teach part of a class. Josephine spent an hour and a half talking about what she specifically had done to make sure their ID system was never abused like that again.

The Aadhaar system in India is the largest ID system in the world, covering over a billion people. When it was first built, it was a privacy nightmare. The issues were adjudicated by the Supreme Court of India, whose decision struck down the law that allowed some of the problems, mandated more privacy legislation, federated what had been a centralized database, and added significant privacy controls. Today, the system is quite resilient.

No system is perfect. Every system requires good governance and good guardrails. But we can build great systems that do a lot and still address privacy concerns. There are a lot of countries already doing it. Asia is doing really well, and Europe is also doing pretty well. We just have to figure it out.

Privacy in the United States

The United States has the most computing power in the world. We have some of the largest models and strong power centers for AI and machine learning. We also use a lot of quantum capacity compared to the rest of the world. We’re very dominant in that way. Other regions of the world are looking at policy, at how technology and privacy concerns are impacting people, and at new permutations of technology, people, regions, and countries. Civil society, academia, and governments are all involved in this work.

The US is ahead in computing, but behind in understanding and implementing guardrails. There’s a lot of resistance to anything that looks like it might slow innovation. But it’s a balancing act. Studies looking at trust in digital ecosystems show that lack of trust means tech isn’t adopted as quickly. If legislation is too tight, it becomes an innovation problem. But if it’s too loose, it becomes a trust problem. We need to find the sweet spot of fostering innovation while protecting privacy.

Learn more about Pam Dixon and the World Privacy Forum at worldprivacyforum.org. There you can find op-eds, reports, and thousands of articles on various privacy topics.

About The Author

Chris Parker

Chris Parker is the founder of WhatIsMyIPAddress.com, one of the world’s most popular websites for online privacy and security with over 13 million monthly visitors. He is also the host of the Easy Prey podcast, where he interviews experts and survivors to uncover the tactics behind scams, fraud, and digital manipulation. Chris is the author of Privacy Crisis: How to Maintain Your Privacy Without Becoming a Hermit, a practical guide to protecting personal information in today’s surveillance-driven world. His work has been featured on ABC News and numerous podcasts, making him a trusted voice on how to stay safe, secure, and private online.
