By Paulina Jaar ’27
There’s something unsettling about being watched, but something even stranger about not noticing it. In the past, surveillance belonged in movies, in scenes with dark rooms and flickering screens. It didn’t belong in everyday life, in something as normal as sending a text or checking directions. Now it does, just not in a way you can see.
A recent report from NBC News explains how artificial intelligence (AI) is making it easier for the government to sort through massive amounts of data. Not just data from the people it targets, but data that happens to include ordinary Americans. Messages, locations, habits. Things that feel personal, because they are.
The law at the center of all of this is called Section 702 of the Foreign Intelligence Surveillance Act. It was originally created to monitor foreign threats, which sounds reasonable on its own. The problem is that it doesn’t stop there. When Americans communicate with people overseas, their information can be collected too. Once it’s collected, it can be searched without a warrant.
That might have seemed manageable years ago, when searching through data meant something slower, more limited. Now, AI can move through it in seconds. It can find patterns, connect details and build a picture of someone’s life without ever meeting them.
One lawmaker described it in a way that sticks. Instead of searching for one person, you could “turn AI loose” on entire databases. There would be very little it couldn’t find.
That’s where the conflict begins. On one side, there are people who argue that this kind of access is necessary for safety. Intelligence agencies have pointed to cases where surveillance helped prevent real threats. On the other side, there are people who worry about how much power that gives the government, especially when the rules haven’t caught up with the technology.
It’s not just government databases either. Companies collect data constantly. Location history, browsing patterns and who you spend time with. That information can be sold, and in some cases, government agencies can buy it. AI makes that data even more useful, and at the same time, more invasive.
What makes this conversation different now is how fast everything is changing. Technology is moving forward before there is a clear agreement on where the limits should be. Some lawmakers are pushing for reforms that would require warrants before searching Americans’ data. Others are hesitant, arguing that restrictions could slow down investigations.
What makes it feel more real is how normal it all sounds. The justifications tend to sound like: “We need your information to be able to personalize your experience.”
Most people don’t think about it when they send a message or open an app. There’s no reason to. Life goes on the same way it always has. That’s what makes it easy to ignore.
But there’s still a question worth answering: How much of your life are you okay with someone else being able to see? Not in theory, but in reality. In the ordinary parts of your day that feel like they belong only to you.
The answer isn’t simple, and maybe it isn’t supposed to be. Security matters. Privacy does too. The difficulty comes from trying to hold both at the same time, especially when the tools being used are more powerful than anything before.
In the end, this isn’t just about laws or technology. It’s about trust. Trust that the systems in place will be used carefully. Trust that the boundaries won’t keep shifting without anyone noticing.
Right now, this might not seem urgent to you. Right now, you might be thinking: Well, AI is useful and helps me work more efficiently. Right now, it might seem like you’re not affected much. Right now, it might not matter to you. But one day… it will.
