TITLE: DEA surveillance program shows why Congress may ‘go slow’ on FISA renewal
https://www.washingtonexaminer.com/news/2918964/dea-surveillance-program-congress-slow-fisa-renewal-overnight/
EXCERPT: The Drug Enforcement Administration collected massive amounts of telephone records for 20 years before the program was shuttered amid the Edward Snowden revelations in 2013. A heavily redacted inspector general’s report into the program was released six years later, a more forthcoming version of which has been obtained by the Washington Examiner.
Among the new details of the DEA program is that the agency refused to comply with parts of the inspector general's investigation for seven months, apparently without facing consequences, even though the Office of the Inspector General exists to provide oversight of government agencies.
“It raises fundamental questions about whether you can really trust publicly released Inspector General reports,” Cato Institute senior fellow Patrick Eddington said. “When federal agencies and the law enforcement and intelligence community have the ability to subvert oversight processes, it makes a mockery of what ultimately comes out.”
Eddington filed a Freedom of Information Act request for the full version of the report in 2019 and did not receive a reply until nearly five years later. The newer version, which was shared first with the Washington Examiner, is more forthcoming but still heavily redacted, and in two cases it redacts material that was not withheld from the original version.
The release of the report coincides with Sunshine Week, a nonpartisan collaboration among groups in the journalism, civic, education, government, and private sectors that shines a light on the importance of public records and open government.
The report also comes as Congress debates reauthorizing Section 702 of the Foreign Intelligence Surveillance Act, or FISA, which will expire on April 19 unless renewed. The Biden administration opposes a warrant requirement to access FISA data on U.S. citizens, without which critics say the program is ripe for abuse.
The DEA surveillance report sheds light on the types of abuses that worry government watchdogs.
In the early 1990s, the agency set up a program to collect or exploit bulk data from telephone companies, ostensibly to go after illicit drug operations relying on telecommunications such as “burner” cellphones. But the scope of the program exceeded its mandate, and many details of how it operated are still not known.
Among the details in the newly released report is that after 2006, the DEA began sharing the information it collected with other government agencies for non-drug-related investigations.
“[Former DEA acting Administrator Robert Patterson] said he did not recall any specifics about this decision other than someone had generally decided that once the government is in proper possession of information it can be used for other federal criminal investigations,” reads an excerpt from page 22.
The DEA subpoenaed telephone companies to obtain their compliance, but the new report reveals that it secured their voluntary cooperation first and paid their costs of compliance, saying it was required by law to do so.
Eddington said this detail raises questions about the nature of the relationship between the government and the phone companies that were handing over information about their customers.
“It tells you how nefarious DEA was in terms of how it operates,” he said. “They only went to companies that would voluntarily cooperate, but in order to make them feel better, they gave them subpoenas as legal cover.”
TITLE: Auto Makers Are Selling Data On Your Driving Habits To Your Insurer Without Properly Informing You
https://www.techdirt.com/2024/03/13/auto-makers-are-selling-data-on-your-driving-habits-to-your-insurer-without-properly-informing-you/
EXCERPT: Last September, Mozilla came out with a privacy study indicating that the auto industry was the worst tech industry the organization tracked. Mozilla found that not only does the industry hoover up a ton of data from your use of vehicles, it also collects and monetizes most of the data on your phone, often without transparency or adequate safeguards:
“All 25 car brands we researched earned our *Privacy Not Included warning label — making cars the official worst category of products for privacy that we have ever reviewed.”
Fast forward to this week, and the New York Times’ Kashmir Hill has a new story exploring how the auto industry is also collecting reams of personal driving data, then sharing it with your insurance company. More specifically, automakers are selling access to the data to LexisNexis, which is then crafting “risk scores” that insurance companies use to adjust rates. Usually upward.
If consumer approval is even obtained, it’s obtained via fine print buried deep in user agreements, either in automakers’ car apps or roadside assistance apps:
“Automakers and data brokers that have partnered to collect detailed driving data from millions of Americans say they have drivers’ permission to do so. But the existence of these partnerships is nearly invisible to drivers, whose consent is obtained in fine print and murky privacy policies that few read.”
Even law professors, who are surely used to reckless treatment of consumer data at this point, were somehow surprised at the cavalier behavior by the auto industry:
“I am surprised,” said Frank Pasquale, a law professor at Cornell University. “Because it’s not within the reasonable expectation of the average consumer, it should certainly be an industry practice to prominently disclose that is happening.”
This is the same auto industry that has been fighting tooth and nail against “right to repair” reforms — in a bid to protect their lucrative repair monopolies — often under the pretense that they’re just that concerned about the consumer privacy ramifications.
Please notice that the absolute bare minimum that the auto industry could be doing here is making this tracking and monetization of your data transparent, and they’re not even doing that. Because they know Congress and U.S. federal regulators, lobbied into apathy over decades, are too corrupt to take meaningful action. At least not at any real, consistent scale.
Again, like countless past scandals, this is the direct byproduct of a country that has proven too corrupt to pass even a baseline privacy law for the internet era. Too corrupt to regulate data brokers. And obsessed with steadily defanging, defunding, under-staffing and curtailing the authority of regulators tasked with overseeing corporations with a broad and reckless disdain for U.S. consumer privacy and safety.
TITLE: Is ‘Uberveillance’ Coming for Us All?
https://www.zocalopublicsquare.org/2024/03/13/uberveillance-surveillance-technology-privacy-humanity/ideas/essay/
EXCERPT: Your cell service provider or smartwatch manufacturer might assure you they’ll only use your data for research. But they may also inform you they have no control over how their partners might use the biometric and other data downstream. Your wearable data could end up in an AI model one day, or used by a prospective employer during a hiring process, or be presented as evidence in a court of law. The wrong data might render you unemployable, uninsurable, and ineligible for government benefits. In an instant you could become persona non grata.
Uberveillance advances the idea of “us” versus a series of “thems”—data brokers, Big Tech companies, government agencies, hackers, secret intelligence, first responders, caregivers, and others. In doing so, “they” have power over how others perceive us and use our data, potentially building multiple black boxes containing intricate profiles—limited accounts of what makes us “us.” This technology is not free, and will not set us free.
Today, as in 2006, this strikes us as technology’s natural trajectory. From the moment the first programmable, general-purpose electronic digital computer, the ENIAC, was dubbed “an electronic brain,” it was always going to fuse with the body at its ultimate technological potential.
The paradox of all this pervasive vigilance is that the more security we hope for, the less we get.
“Nothing was your own except the few cubic centimeters inside your skull,” Orwell ominously wrote in 1984. And yet uberveillance threatens that too: An embedded “smart” black box in the human body would encroach on a last fragment of private space. An internal closed-circuit television feed could bring about the most dehumanizing of prospects—a total loss of control and dignity, if used to surveil thoughts, rituals, habits, activities, appetites, urges, and movements. Such dystopian scenarios are no longer sci-fi imaginings alone.
This has ontological implications, bearing directly on the nature of being. It could represent the consequential deconstruction of what it means to be human, to have agency, and to make choices for oneself. If uberveillance is to expand and forge ahead on its current path, the scenarios are countless and the potential consequences staggering. At that point, we will have surrendered more than just our privacy.


