
AI literacy - lecture 6.2: Data security in the era of AI

Kenny Easwaran cuts through the hype of generative AI to expose a quieter, more pervasive reality: your daily convenience is built on a data economy where your interests and corporate profits are often misaligned. This isn't just about privacy; it's about the subtle ways algorithms shape your behavior and the geopolitical minefields hidden in your cloud storage. Easwaran argues that for the ordinary person, the stakes of data security are far higher than mere embarrassment—they involve financial exposure, legal coercion, and the erosion of personal autonomy.

The Illusion of "Nothing to Hide"

Easwaran immediately dismantles the common defense that privacy doesn't matter if you have "nothing to hide." He writes, "I'm not sure I fully believe you if you claim that you have no illegal activity," challenging the reader to consider the fragility of their digital footprint. The argument here is that data leakage is rarely about criminal acts; it's about the context collapse that happens when disparate data points merge. He illustrates this with a visceral example: "Your Maps app might highlight businesses that it thinks are of interest to you on the basis of data gathered from your other apps," potentially revealing sensitive information about your health, relationships, or habits to unintended audiences.


This framing is effective because it moves the conversation from abstract surveillance to concrete social consequences. It forces the reader to ask not just "what are they stealing?" but "who might see this?" Easwaran suggests that even if you don't mind a company using your data, you should consider the leverage you might lose. "If you intentionally control the flow of this information, it's possible that you may be able to extract payment or better service from some companies rather than just giving away all the data for free," he notes. This shifts the dynamic from victimhood to negotiation, a perspective often missing from standard privacy lectures.

The Shadow Network of Your Contacts

Perhaps the most distinctive part of Easwaran's analysis is the focus on "shadow profiles"—data gathered about people who never signed up for a service. He explains that even if a friend has turned off location tracking, "a service might have reasonable guesses about their location on the basis of your location." He points to the 2018 revelations regarding Facebook, noting that the platform was "using it to create shadow profiles of people who had never signed up for Facebook" by analyzing contact lists and communication patterns.
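The mechanics Easwaran describes can be made concrete with a small sketch. Everything below is a hypothetical illustration, not code or data from any real platform: the names, phone numbers, and data layout are all invented. The point is simply that a service which only sees its *users'* uploaded contact lists can still merge those entries, keyed on a phone number, into a profile of someone who has no account.

```python
# Toy illustration of shadow-profile assembly (all data invented):
# three users upload their address books; the phone number acts as a
# join key for a person who never signed up for the service.

from collections import defaultdict

# Each user uploads their contact list: phone number -> saved label.
uploaded_contacts = {
    "alice": {"+15550001": "Dana (book club)", "+15550002": "Dr. Reyes"},
    "bob":   {"+15550001": "Dana Smith",       "+15550003": "Gym"},
    "carol": {"+15550001": "D. Smith, 12 Oak St"},
}

# Merge entries by number: each label and each uploader adds detail
# about the number's owner, even though that owner has no account.
shadow_profiles = defaultdict(lambda: {"names": set(), "known_by": set()})
for user, contacts in uploaded_contacts.items():
    for number, label in contacts.items():
        shadow_profiles[number]["names"].add(label)
        shadow_profiles[number]["known_by"].add(user)

profile = shadow_profiles["+15550001"]
print(sorted(profile["names"]))     # three independent labels for one person
print(sorted(profile["known_by"]))  # ['alice', 'bob', 'carol']
```

Notice that the person behind `+15550001` made no choice at any point: the labels "book club" and "12 Oak St" leak a social context and an address, and opting out of the service yourself does nothing, because the data arrives via your contacts.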

This is a critical, often overlooked vulnerability. We tend to think of data security as a personal responsibility, but Easwaran argues it is a collective one. "If you do interact often with anyone who might have elevated concerns about their privacy, you should think about this," he advises. This creates a new layer of ethical obligation for the user: your digital footprint now extends to your entire social circle, including those who are offline or have opted out. Critics might argue that this places an unfair burden on individuals to police their friends' digital safety, but Easwaran's point stands: the architecture of modern data collection makes isolation impossible.

The Geopolitics of Your Data

Easwaran then pivots to the macro level, arguing that where your data is stored matters as much as who holds it. He contrasts the robust protections in the European Union with the broad surveillance capabilities in China. "In general, if your data is stored in Europe, there's going to be a high level of protection of your privacy," he writes, while noting that in China, "even minor crimes can be used to warrant the release of large amounts of data." He describes the systems in India and the United States as "somewhat more chaotic and complex," varying by location and data type.

This section is vital for understanding the current regulatory landscape, particularly the recent legislation targeting apps like TikTok. Easwaran highlights "a bill that has passed the US Congress and been signed into law by the president that gives TikTok and similar apps 12 months to either sell themselves to entities outside of China and keep their data outside of China or else cease operating in the United States." He frames this not as a partisan issue but as a fundamental clash of data sovereignty. "China bans the use of many US-based internet services because they don't want data of their citizens being stored outside the country," he observes, drawing a parallel to South Korea's ban on foreign mapping services.

The core of the issue isn't just that companies want your data; it's that governments are increasingly willing to weaponize that data across borders.

The author's analysis of intergovernmental conflict is particularly sharp. He notes that "you might not be committing any crimes, you might not think that your data is of interest to the government; there might still be reason to worry about potential intergovernmental conflict between countries." This reframes data security as a matter of international relations, where your personal data becomes a pawn in a larger game.

The Algorithmic Feedback Loop

Finally, Easwaran addresses the ultimate purpose of this data collection: the optimization of AI systems. He acknowledges the benefits—"they can show you the music or movies that you like"—but warns of the darker side of engagement optimization. The danger, he argues, is that these systems are not just reflecting your tastes but actively shaping them. "We might wonder whether our music and video feeds have not learned how to cater to our unique tastes but instead have learned how to shape our tastes into the ones that it's easy to cater to," he writes.

This is the most unsettling part of the lecture. It suggests that the "personalization" we crave is actually a feedback loop designed to maximize profit, often by inflaming outrage or negative emotions. "Sometimes these services are optimized for engagement, not in ways that satisfy our authentic desires but in ways that inflame our outrage or otherwise enhance our negative emotions and keep us coming back," Easwaran explains. This challenges the reader to consider whether their digital experience is truly theirs or a curated illusion designed to keep them scrolling.
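The feedback loop Easwaran warns about can be illustrated with a toy simulation. The model below is my own construction, not code or numbers from the lecture, and every quantity in it is an invented assumption: a user starts out preferring "calm" content, but "outrage" content converts taste into engagement more efficiently, and each engagement nudges the user's taste toward whatever was shown.

```python
# Toy simulation of engagement optimization reshaping taste
# (all parameters invented for illustration).

import random

random.seed(0)

taste = {"calm": 0.6, "outrage": 0.4}  # user's current preferences
HOOK = {"calm": 1.0, "outrage": 2.0}   # outrage converts taste to clicks more efficiently
DRIFT = 0.02                           # each engagement nudges taste toward what was shown

def engagement_prob(kind):
    # The platform optimizes this quantity, not the user's underlying taste.
    return min(1.0, taste[kind] * HOOK[kind])

for _ in range(3000):
    shown = max(taste, key=engagement_prob)       # serve the most engaging type
    if random.random() < engagement_prob(shown):  # did the user engage?
        other = "calm" if shown == "outrage" else "outrage"
        taste[shown] = min(1.0, taste[shown] + DRIFT)  # exposure reinforces taste
        taste[other] = max(0.0, taste[other] - DRIFT)  # the neglected type withers

print(taste)  # taste has drifted toward "outrage" despite the initial preference
```

The recommender never lies and never coerces; it greedily serves whatever is most engaging at each step. Yet because exposure feeds back into preference, the user who began preferring calm content ends up preferring outrage: the system has learned, in Easwaran's phrase, "to shape our tastes into the ones that it's easy to cater to."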

Bottom Line

Easwaran's strongest contribution is reframing data security from a technical checklist to a strategic life skill, emphasizing that your data is a currency you are currently giving away for free. The argument's biggest vulnerability lies in the practical difficulty of implementing his advice; for most users, opting out of the data economy means opting out of modern convenience entirely. However, the piece succeeds in making the invisible visible: your data is not just a record of who you are, but a tool that shapes who you become and how you interact with the world. Watch for how the upcoming legislative battles over data sovereignty will reshape the apps you use every day.

Sources

AI literacy - lecture 6.2: Data security in the era of AI

by Kenny Easwaran
