THE SET-UP: Ubiquitous. There’s your one-word takeaway from today’s trio of tales about the profit-generating Panopticon that is the 21st Century.
Surveillance seems both inescapable and, unlike the bedeviling persistence of religious and political ideas far better suited to the end of the 19th Century, strangely appropriate to this one. It feels like we’re heading into “the future.”
When thinkers and writers imagined “the future,” they often imagined some form of Panopticon. It was usually dystopian, pernicious and imposed.
But the truly dystopian reality of Surveillance Capitalism’s Panopticon is that we walked into it willingly and enthusiastically as we adopted one tech “innovation” after another.
What started with books on Amazon and music on iTunes has turned into thousands of “free” data-gathering apps we load onto a portable two-way mirror that’s become so integral to our lives it almost seems like an appendage. Now we’re surrounded by smartphones and Ring cameras and the “Internet of Things”—a.k.a. the growing roster of everyday “things” like toasters and clothes dryers that perpetually share our information through connections to the internet.
Now those “things” are potential pathways for police state-style surveillance and, as you’ll see below, perhaps even economic segregation. At the same time, we’re using smartphones as disembodied eyes through which we process and filter reality. That, in turn, feeds our feeds on social media, where we curate virtual museums dedicated to our image of ourselves. The “free” space they give us to store our personal exhibits turned us into exhibitionists and emotional streakers. When we exhibit our behaviors online we’re building a dossier, a dataset and psychological profile Surveillance Capitalists use to manipulate us while simultaneously selling the keys to our malleability to more manipulators.
The most dystopian part of the Panopticon is how it slowly robs us of our free will, enticing us deeper into surveillance by dangling the illusion of freedom on the end of a selfie stick. - jp
TITLE: ICE Taps into Nationwide AI-Enabled Camera Network, Data Shows
https://www.404media.co/ice-taps-into-nationwide-ai-enabled-camera-network-data-shows/
EXCERPTS: Data from a license plate-scanning tool that is primarily marketed as a surveillance solution for small towns to combat crimes like carjackings or to find missing people is being used by ICE, according to data reviewed by 404 Media. Local police around the country are performing lookups in Flock’s AI-powered automatic license plate reader (ALPR) system for “immigration” related searches and as part of other ICE investigations, giving federal law enforcement side-door access to a tool that it currently does not have a formal contract for.
The massive trove of lookup data was obtained by researchers who asked to remain anonymous to avoid potential retaliation and shared with 404 Media. It shows more than 4,000 nationwide and statewide lookups by local and state police done either at the behest of the federal government, as an “informal” favor to federal law enforcement, or with a potential immigration focus, according to statements from police departments and sheriff’s offices collected by 404 Media. It shows that, while Flock does not have a contract with ICE, the agency sources data from Flock’s cameras by making requests to local law enforcement. The data reviewed by 404 Media was obtained using a public records request from the Danville, Illinois Police Department, and shows the Flock search logs from police departments around the country.
As part of a Flock search, police have to provide a “reason” they are performing the lookup. In the “reason” field for searches of Danville’s cameras, officers from across the U.S. wrote “immigration,” “ICE,” “ICE+ERO,” which is ICE’s Enforcement and Removal Operations, the section that focuses on deportations; “illegal immigration,” “ICE WARRANT,” and other immigration-related reasons. Although lookups mentioning ICE occurred across both the Biden and Trump administrations, all of the lookups that explicitly list “immigration” as their reason were made after Trump was inaugurated, according to the data.
Flock says its ALPR cameras are “trusted by more than 5,000 communities across the country.” These cameras continuously record the plates, color, and brand of vehicles passing in front of them. Law enforcement can then perform searches to see where exactly a vehicle, and by extension person, was at a certain time or map out their movements across a wide date range. Flock is also developing a new product called Nova which will supplement that ALPR data with people lookup tools, data brokers, and data breaches to “jump from LPR [license plate reader] to person,” 404 Media previously revealed. Law enforcement typically do these lookups without a warrant or court order, something which an ongoing lawsuit argues is unconstitutional.
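The researchers' method, as described above, amounts to scanning the free-text "reason" field of public-records search logs for immigration-related terms. A minimal sketch of that kind of analysis might look like the following (the column names, keyword list, and sample rows are hypothetical illustrations, not the actual Flock log schema):

```python
import csv
import io
import re

# Word-boundary matching avoids false hits like "police" or "service";
# the reasons quoted in the article include "immigration", "ICE",
# "ICE+ERO", and "ICE WARRANT".
PATTERN = re.compile(r"\b(immigration|ice|ero)\b", re.IGNORECASE)

def flag_immigration_lookups(csv_text):
    """Return log rows whose 'reason' field mentions an
    immigration-related keyword."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [row for row in rows if PATTERN.search(row.get("reason", ""))]

# Hypothetical sample log, mimicking the article's description.
sample = """agency,reason,date
Dept A,stolen vehicle,2025-01-10
Dept B,ICE WARRANT,2025-02-03
Dept C,immigration,2025-02-15
"""

for hit in flag_immigration_lookups(sample):
    print(hit["agency"], "->", hit["reason"])
```

Because the "reason" field is free text, any real analysis would need manual review of matches, which is consistent with 404 Media's reporting that it also collected statements from the departments involved.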
TITLE: Black Students Are Being Watched Under AI — and They Know It
https://theatlantavoice.com/ai-surveillance-schools-students-civil-rights/
EXCERPTS: In the classic sci-fi movie “Minority Report,” Tom Cruise plays a cop whose “Precrime” unit uses surveillance and behavior patterns to arrest murderers before they kill. Set in the future, the movie raised tough questions about privacy, due process, and how predicting criminal behavior can destroy innocent lives.
But what once seemed like an action fantasy is now creeping into American classrooms.
Today, across the country, public schools are adopting artificial intelligence tools — including facial recognition cameras, vape detectors, and predictive analytics software — designed to flag students considered “high risk” — all in the name of safety. But civil rights advocates warn that these technologies are being disproportionately deployed in Black and low-income schools, without public oversight or legal accountability.
A recent report from the Center for Law and Social Policy (CLASP) argues that AI programs and mass surveillance aren’t making schools any safer, but rather quietly expanding the school-to-prison pipeline. And according to author Clarence Okoh, the tools don’t just monitor students — they criminalize them.
“The most insidious aspect of youth surveillance in schools is how it deepens and expands the presence of law enforcement in ways that were previously impossible,” says Okoh, a senior associate at the Georgetown Law Center on Privacy and Technology. “Black students are being watched before they even act.”
The rise of school surveillance didn’t begin with AI, but the advancing technology has taken it to a new scale. According to the National Center for Education Statistics, 91% of public schools use security cameras, while more than 80% monitor students’ online activity. Yet there is little evidence that these tools improve safety — and even less to show they’ve been tested for bias.
In fact, a 2023 Journal of Criminal Justice study found that students in “high-surveillance” schools had lower math scores, fewer college admissions, and higher suspension rates — with Black students bearing the greatest impact. These systems include facial recognition, social media monitoring, location tracking, as well as vape and gun detection sensors.
“The line between school and jail is being erased — not metaphorically, but digitally,” Okoh says.
In Pasco County, Florida, for example, an AI program secretly used school records to flag children for future criminal behavior based on grades, attendance, and discipline — leading to home visits, interrogations, and harassment.
“It wasn’t hypothetical,” Okoh said. “Kids were being watched, tracked, and punished — and families were being pushed out.”
Okoh also added that the incident in Pasco wasn’t isolated: “These tools are being marketed across the country, and the schools most likely to say yes are the ones serving Black and low-income students.”
TITLE: We’re not watching prices – prices are watching us
https://sfstandard.com/opinion/2025/05/27/were-not-watching-prices-prices-are-watching-us/
EXCERPT: You’re at home, scrolling online to find a nearby store to get a TV. You find somewhere, drive to the store and park, then pull up the same TV on your phone.
The price jumps from $499 to $599.
Why? The retailer is betting that the closer people are to their doors, the more money they are willing to pay. This actually happened with Target, and the company agreed in 2022 to pay $5 million in civil penalties.
Meet surveillance pricing, a dystopian business model straight out of the show “Severance.” This isn’t the old-school, supply-and-demand model of price fluctuation. This is hyper-personalized, data-driven price manipulation, with corporations using the vast amounts of data they collect on you — your finances, web browsing history, past purchases, even your location — to determine just how much they can squeeze from your wallet. It makes Ticketmaster’s dynamic pricing seem quaint.
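The mechanism described above can be sketched as a toy pricing rule. Every signal, weight, and threshold below is hypothetical, invented only to mirror the $499-to-$599 example; no retailer's actual model is public:

```python
from dataclasses import dataclass

BASE_PRICE = 499.00  # the TV price from the example above

@dataclass
class ShopperSignals:
    """Hypothetical data points a retailer might collect."""
    miles_from_store: float
    device_is_premium: bool        # e.g. browsing on an expensive phone
    past_lastminute_purchases: int

def personalized_price(base, s):
    """Toy surveillance-pricing rule: nudge the price up when the
    collected signals suggest the shopper will pay more."""
    price = base
    if s.miles_from_store < 0.5:   # already in the parking lot
        price += 100.00
    if s.device_is_premium:
        price *= 1.05
    price += 5.00 * min(s.past_lastminute_purchases, 4)
    return round(price, 2)

at_home = ShopperSignals(miles_from_store=12.0,
                         device_is_premium=False,
                         past_lastminute_purchases=0)
in_lot = ShopperSignals(miles_from_store=0.1,
                        device_is_premium=False,
                        past_lastminute_purchases=0)
print(personalized_price(BASE_PRICE, at_home))  # 499.0
print(personalized_price(BASE_PRICE, in_lot))   # 599.0
```

The point of the sketch is that nothing in such a rule is visible to the shopper: the same product, same store, and same moment yield different prices depending on data collected about the buyer, which is exactly what AB 446 targets.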
A new California state Assembly bill, AB 446, introduced by Assemblymember Chris Ward, would make it illegal for businesses to charge different prices based on the personal data they collect from consumers. It’s a crucial move, pushing back against the unchecked power of data-driven price manipulation. The bill recently passed an Assembly floor vote and is now before the Senate.
Businesses have long offered discounts based on demographics — think student or veteran discounts. But such models are transparent. Surveillance pricing is different. It’s secretive, and it’s about you.
Bought a last-minute ticket before? Expect the next one to cost more. Traveling for something you can’t miss? That flight to attend a funeral could cost you. Ordering diapers online with expedited shipping? You’re probably a stressed parent willing to pay more.
Surveillance pricing is not just a “Black Mirror”–style hypothetical: It’s already happening. The travel website Orbitz charged people using Apple products as much as 30% more for hotel rooms. The price for a backpack on Amazon was $7 more on a shopper’s phone than on their laptop. The Princeton Review charged customers in ZIP codes with high Asian populations more money for tutoring packages. And consumers browsing from the Bay Area were charged higher rates for hotel rooms compared with users browsing from places like Kansas City and Phoenix.
For low-income shoppers, surveillance pricing can mean paying more for necessities. When algorithms favor people with better credit scores, more expensive devices, or certain browsing habits, those who are already struggling economically find themselves locked out of the best deals.
Since these pricing structures are invisible, there’s often no way for consumers to challenge or even recognize when they’re being gouged. It takes work to uncover, including switching devices or even ZIP codes. Getting a fair, transparent price shouldn’t be such an ordeal.