A Vast Camera System Now Feeds Information to Police on Drivers Across the US
The United States public has been treated to round after round of revelations disclosing that our actions — public and private, digital and physical, criminalized and legal — are tracked and surveilled in a bevy of ways. New realms of egregious privacy violation and excess are regularly broached by corporations and police. One of the most concerning contemporary cases is that of the AI-enabled security camera start-up Flock Safety, which, by surveilling enormous expanses of public terrain and facilitating the tracking of innocent individuals, is both testing the limits of warrantless dragnet surveillance and indulging in extremes of vacuous start-up hype and negligent absurdity.
Flock Safety, founded in 2017, is part of a wave of AI policing and military ventures; the company provides large-scale security camera networks to its clients, drawing on new technologies like cloud computing and AI video analysis. Flock and other companies like it have provided law enforcement and other entities with newly capable AI-enabled automatic license plate readers (ALPRs), in an upgrade to the existing ALPR technology that has long been deployed widely, almost ubiquitously in the U.S. Rather than directly selling surveillance hardware, Flock leases out parts of its massive network of ALPR cameras with AI tracking to clients, earning the company $300 million in sales in 2024. The company won’t disclose the total, and only a fraction of sites have been documented, but the full network is estimated to consist of around 80,000 cameras. Flock systems appear in over 4,000 cities and in 42 states — with its key clients being law enforcement agencies, ranging from major metropolitan forces to small local police units, as well as private entities like retail corporations and homeowner associations.
The Flock system, despite reassurances from the company and much apologia from police, has been steadily creeping outward in both scope and boldness. As the network has proliferated, Flock’s cameras have not only played a role in facilitating local police abuse and overreach; they have also simplified the unauthorized (and in some cases, illegal) provision of police data for use in federal immigration enforcement. The company is flourishing, expanding into new predictive-AI and drone products and securing a partnership with Amazon Ring. In addition, thanks to cartoonishly shoddy security, Flock is leaving disastrously compromised police systems open to exploits by all manner of white hat, black hat, state, and non-state actors. Yet even if Flock’s systems were entirely secure, their proper functioning as promised would already raise grievous concerns around privacy, ethics, civil rights, and the fusion of corporations and the police state.
Converts to the Flock
Flock’s $7.5 billion valuation is propped up by massive capital outlay, much of it coming from investment firm Andreessen Horowitz (known for, among other things, funding companies that commit invidious and incessant ethical and data privacy violations), with additional funding from noted Antichrist enthusiast Peter Thiel.
Its centralized camera feed platform, FlockOS (a “real-time crime center”), places a powerful surveillance infrastructure in the hands of its clients, who can use the company’s ALPR algorithms to image and track every passing car’s license plate on local roads and compare it to police “hotlists” of would-be suspects. That capability, in essence, allows any individual to be followed all around a city.
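In outline, the matching step at the heart of such a system is trivial. The following is a minimal, hypothetical sketch — all names and data here are invented for illustration, not Flock’s actual code — of how OCR’d plate text can be normalized and checked against a hotlist:

```python
# Hypothetical sketch of an ALPR hotlist check.
# All names and plates are illustrative, not Flock's implementation.
import re

HOTLIST = {"ABC1234", "XYZ9876"}  # plates flagged by police (invented examples)

def normalize(plate_text: str) -> str:
    """Uppercase the OCR output and strip separators/whitespace."""
    return re.sub(r"[^A-Z0-9]", "", plate_text.upper())

def check_plate(plate_text: str, hotlist: set) -> bool:
    """Return True (i.e., trigger an alert) if the plate is on the hotlist."""
    return normalize(plate_text) in hotlist

# A camera feed reduces to a stream of plate reads; alerts fall out of a filter.
alerts = [p for p in ["abc-1234", "def 5678"] if check_plate(p, HOTLIST)]
print(alerts)  # ['abc-1234']
```

The hard engineering is in the camera optics and OCR; once plates are text, mass tracking is a set lookup — which is exactly why the scale of the network, not the cleverness of the software, is what raises the civil liberties stakes.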
Flock is far from alone in this pursuit — it has close competitors in the Axon-acquired Fusus ALPR network, Motorola’s Vigilant, and the local Palantir-esque Peregrine, as well as less direct analogues like ShotSpotter. But Flock has rapidly risen to become a major player in the market for private spy systems, in a permutation of capitalist disciplinary measures that George Orwell failed to foresee.
Critics have maintained that these kinds of tracking features represent glaring contradictions of Fourth Amendment constitutional protections against warrantless search and seizure. Indeed, the Supreme Court has already reviewed two cases that touch on comparable forms of domestic surveillance tech: one ruling on police use of a GPS device to track an individual’s car, and the other on police requests for a time range of a person’s cell tower location data. The court noted, in reference to both cases, “that individuals have a reasonable expectation of privacy in the whole of their physical movements.”
The American Civil Liberties Union (ACLU), commenting on these cases in a 2022 report on Flock, explained that the rulings “expressly rejected the argument that the public nature of the targets’ movements meant they had no legally significant expectation of privacy.” Because modern location-tracking techniques enable such invasive, constant and far-reaching spying, the ACLU contends that the use of such data should at the very least require a warrant — in fact, obtaining cell tower tracking data (which collects far less fine-grained location data than Flock’s ALPRs) already does.
But Flock, in the true spirit of disruptive tech — i.e., moving fast, breaking things, not asking permission (and never really asking forgiveness, either) — has plowed ahead anyway, making profitable use of a familiar problem: a gray area in which the law has lagged behind technological advances. If you’re wondering how Flock’s network appeared on our streets seemingly uncontested in the first place, the answer is more than a little ironic: In numerous localities across the country, as Forbes reported last year, Flock simply went ahead without permits and illegally strapped their cameras onto Department of Transportation infrastructure, earning them a ban in two states.
This aggressive and familiar strategy (recalling the unpermitted public testing of driverless cars, or the time that a scooter company illegally rolled out its products on San Francisco sidewalks) has worked as intended; in 2023, Flock boasted that its revenue grew by 2,660 percent over three years. That growth has come at the expense of the public’s democratic input, of state regulation and oversight, and of any form of accountability. And when the inevitable lawsuits over false accusations and police overreach arising from the use of this technology come through, taxpayers will still foot the bill. Privatize the profits; socialize the costs. Unfortunately, the legal concerns around Flock accumulate in layer upon layer: Beyond its shady proliferation tactics and its inherent intrusiveness, Flock facilitates particular abuses as well — stalking, racial profiling, and a host of other problems besides.
Policing the Police, Watching the Watchers
In two separate incidents in Kansas, police officers in different departments (one of them a chief) were caught using Flock to spy on ex-partners. (If National Security Agency staff are tempted to do this, we should expect the same and worse from local cops.)
In another case in Texas earlier this year, 404 Media exposed facts indicating that what both Flock and Texas deputies had claimed was a welfare-concerned search for a “missing person” was in fact a nationwide camera-feed inquiry made in an attempt to collect evidence on, and potentially charge, a woman for having an abortion — which the woman’s partner, who was later charged with domestic violence, had vindictively reported to police.
These and other unflattering facts about Flock have been publicized in part by dogged independent researchers. One noteworthy effort is DeFlock.me, a project by software engineer and concerned citizen Will Freeman, which provides a mapping platform for the public reporting of Flock and other ALPR camera locations. (Flock has responded with a cease-and-desist against Freeman — for using their trademark.)
In another case, as Truthout has previously reported, Flock data compiled in Illinois was made accessible to immigration agencies — a flagrant violation of the state’s TRUST Act, passed in 2021, which prohibits state and local police from assisting with federal immigration enforcement.
In truth, across the country, Immigration and Customs Enforcement (ICE), Customs and Border Protection, additional Homeland Security agencies, the Drug Enforcement Administration, and other feds can access local police’s Flock data by numerous means, both legal and deceptive. Many cases of such backdoor access provision to federal immigration agencies have been documented, and analyses of Flock lookup histories have found regular immigration-related searches — signs that police departments are allowing federal agents access to Flock data without oversight, by falsifying logs or other means. Research from the University of Washington Center for Human Rights found widespread collaboration, with cities in Washington granting Customs and Border Protection direct access to their data — even though Washington is a sanctuary state.
Instead of closing the loopholes used by its lucrative clients, Flock is eager to take on plenty of policing on the police’s behalf. In fact, its CEO Garrett Langley has gone so far as to proclaim to Forbes that Flock cameras can end crime entirely. Langley predicted that “in less than 10 years, Flock’s cameras, airborne and fixed, will eradicate almost all crime in the U.S.,” as the Forbes writer paraphrased. Langley continued: “I think we can have a crime-free city and civil liberties … We can have it all.”
It’s clear that Langley does indeed dream of having it all. Flock, with its burgeoning valuation, is surging towards a potential IPO, which would mean enormous self-enrichment for Langley and shareholders. Now, Flock is rapidly expanding both its hardware and software services, branching out by selling surveillance drones — which it labels as “drone first responders” — and Flock “Nova,” an AI tool for “predictive” policing. The latter purports to use historical, open-source, and other data to train neural networks to predict crime hotspots. (In another irony of Flock’s behavior, these endeavors have made use of data sets that dark web brokers obtained from illegal breaches, as 404 Media reported. In a muddled and evasive statement, Flock seemed to be attempting to simultaneously deny that it had ever used such data while also promising that it would stop using it going forward.)
As has been noted time and time again by experts, “predictive” policing AI relies on racist historical data collection practices. It is far less a miraculously clairvoyant technology than it is a way of police supplying themselves with justificatory data to give the veneer of legitimacy to what they already wanted to do: often involving racial profiling, brutalizing marginalized populations, and other repugnant patterns of misconduct.
In October, Flock announced a new partnership with one of the other largest private surveillance networks: Amazon Ring, which has its own checkered history of privacy malfeasance and of aiding law enforcement under constitutionally dubious premises. The partnership is poised to allow Flock clients to also request video footage from Ring users. Not only does this newfound corporate synergy confer more legitimacy on Flock as it looks towards an IPO, it also could represent a return to greater law enforcement collaboration for Amazon’s Ring, which, having grown more wary of privacy criticism after widespread public outcry, had implemented safeguards on its own police video-sharing program, only to later backtrack. In any case, guardrails or not, it’s not like police have ever had a difficult time finding loopholes for accessing Ring footage before the Flock integration, like simply asking (or pressuring) doorbell-camera owners, from whom cops can obtain video without submitting an official platform request.
Further possible uses and abuses of the now-conjoined Flock-Ring systems are not yet fully clear. (Which is not to say that the elite aren’t already salivating over the prospects. Speaking fondly of the disciplinary future portended by these types of systems, Oracle founder and billionaire Larry Ellison was eager to declare in all sincerity that AI surveillance tools will ensure that “citizens will be on their best behavior.”) But it’s worth noting that Flock, which also counts major corporations and big-box retailers like Home Depot and Lowe’s among its clients, has helped facilitate not just security but customer data collection as well. Stores can track who drives through their parking lots and nearby streets, share camera feeds with police departments, and receive hotlist updates. For example, if a car with a license plate tied to an accused shoplifter arrives, ALPR will read its plate so that FlockOS can search for it on police hotlists, triggering an alert. How retailers are presently making use of this data is unknown, but it’s not hard to imagine the marketing advantages of identifying and logging every person who pulls up at your store.
Secure Only in Ineptitude
Technologist, commentator, AI expert, and musician Benn Jordan, who runs a popular eponymous YouTube channel, has also been cataloging Flock’s faults. Jordan’s video “Breaking The Creepy AI in Police Cameras,” in which Jordan informs viewers of Flock’s quiet proliferation and underscores the privacy concerns that the system raises, is nearing 3 million views; that single video has gone a long way toward drawing attention to the glaring ethical and technological problems around Flock.
Jordan released a follow-up video on November 16, in which he employs his own technical skills and the expertise of security researchers like Jon Gaines of GainSec, a “white hat” cybersecurity firm, to punch holes in Flock’s camera security measures — which have been revealed to be so easily circumventable that they seem to have been engineered by an untalented amateur. GainSec has published a formal white paper examining the technical details of dozens of security vulnerabilities; Jordan’s video concentrates on six of the most egregious.
In advance of his latest release, Jordan spoke with Truthout to share more about his independent research and to explain and expand on the GainSec findings. “You don’t really take data privacy seriously until you do,” he said. “It could be as ambiguous as your car insurance rates going up — because some car companies track your data about your driving habits, and sell it to car insurance companies, or health insurance companies. And those things can actually make your life more difficult — you have less control over your own income and your own behavior.”
Speaking of losing control, here’s a sampling of some of Flock’s most preposterous hardware and software issues, as detected by Jordan, Gaines, and others:
Pressing an easily accessible button on the back of Flock cameras (which, you may recall, are mounted in public across the country) a handful of times in an extremely simple sequence will open a wireless access point, which is easily hijacked to grant root access to the camera’s systems; once you have “root,” you can connect to the device, access its video data, and install whatever you’d like. Flock cameras’ exposed USB ports offer another avenue to gain control of the device to scrape data, insert fake camera feeds or anything else, obtain police information, and generally perform an endless variety of manipulations. Even more startlingly, Flock cameras still run on Android Things 8.1 — an outdated mobile system that, crucially, has been discontinued and is no longer supported by Google with security patches. Unsupported operating systems are essentially undefended, riddled with known exploits.
And there are plenty of other technical routes into Flock cameras, from “evil twin” hijacks to web-accessible API keys and signal interception, as Gaines, Jordan, and others have demonstrated with alacrity. The porous security of these camera systems approaches the comical. Flock even left its internal testing data accessible online: a trove that included police names and phone numbers, patrol areas, suspect hotlists, full license plates and even geographic information systems (GIS) data showing the live location of patrol cars.
But one of the most obvious security flaws is that Flock has never made it mandatory for its clients to use basic two-factor authentication — an everyday, familiar technology — to offer even a modicum of protection against the malicious accessing of camera feeds and private information. If the public can figure out how to use two-factor methods to log into their social media accounts, surely we can expect state security forces to embrace this bare-minimum security protocol. The predictable result of failing to mandate two-factor has been the leak of numerous police logins to Flock systems. In his video, Jordan even describes finding Flock logins for sale by Russian hackers in a dark web forum. The basic vulnerabilities here, of course, would be all too easy for even less experienced hackers to exploit. Any competent state or non-state actors, infiltrators of criminal or foreign intelligence origin, could have a field day.
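For perspective on how low that bar is: time-based one-time passwords, the standard second factor behind most two-factor prompts, are fully specified in RFC 6238 and can be generated with nothing but Python’s standard library. The sketch below is illustrative only — the secret shown is a made-up example — but it is a working TOTP generator, which underscores how little effort mandating two-factor would have required:

```python
# Minimal RFC 6238 TOTP generator using only the standard library,
# illustrating that basic two-factor codes take about a dozen lines.
import base64, hashlib, hmac, struct, time

def totp(secret_b32, for_time=None, digits=6, step=30):
    key = base64.b32decode(secret_b32)
    # Count 30-second intervals since the Unix epoch.
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: pick 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example secret (invented); a real deployment provisions one per user.
print(totp("JBSWY3DPEHPK3PXP"))  # six-digit code that rotates every 30 seconds
```

Verifying a login then amounts to comparing the user-submitted code against `totp()` for the current (and adjacent) time steps — an everyday mechanism that Flock’s clients were never required to turn on.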
In an email to Truthout, Flock disputed that it had been hacked and boasted of its own encryption standards, adding that the device used in Jordan’s video was on factory settings and didn’t have Flock’s security software enabled. The company also said that it “would be alerted if a camera is physically accessed, and would respond appropriately with the authorities,” and that the documented vulnerabilities don’t allow access to the cloud platform, where Flock stores “the vast majority of all evidence and metadata.” In response to the allegation that it had used data acquired from illegal breaches, Flock also said it had already admitted to these security issues and was in the process of mitigating them.
Flock also released a statement responding to the GainSec white paper, reassuring clients that, “Overall, none of the vulnerabilities detailed in the report have an impact on our customers’ ability to carry out their public safety objectives. Exploitation of these vulnerabilities would not only require physical access to a device, but also require intimate knowledge of internal device hardware. No customer action is required in response to this disclosure.”
The failings are farcical for a purported “security” company. Why would Flock be so heedless? Jordan postulates that it could have something to do with cost-cutting amid a broader mandate to charge forth and expand at all costs, reaching for the lodestar of its IPO. Perhaps that blinkered intensity is also the motivation for some of Garrett Langley’s most preposterous statements. One of Flock’s consistent marketing beats is the claim that its network has played a role in solving “10 percent” of all reported U.S. crime. As Jordan rightly notes, that staggering number has not been confirmed by any relevant outside organization. In his videos, Jordan also plays multiple clips from Langley; in one, the CEO touts that the Nova predictive AI that Flock is developing will make police work so simple that it will amount to “One click, one crime solved!” (Langley is also shown speaking to a Forbes journalist and referring to DeFlock.me as a “terroristic organization” whose “primary motivation is chaos. They are closer to antifa than they are anything else.”)
Hype-bloated PR statements like Langley’s seem to betray a serious lack of reflection and concern on Flock leadership’s part for the consequences of the tech they have built, which adds another tool to the arsenal that police have at their disposal to violate rights. And Flock provides police with a workaround to some of the legal protections that individuals have in the face of such potential abuses. As Jordan put it to Truthout, “You have to understand the full scope of the situation and the context in which you’re tracking somebody before you can decide that it’s in the public interest to do so. That’s what a warrant is. We already have that system.”
“There’s just no point in using Flock Safety if you have to talk to a judge to use it,” Jordan said. In other words, Flock’s whole raison d’être is to circumvent existing avenues of oversight and accountability. And if that is the case, then there’s no reason to have such a system at all.
Flock Grounded
When all the manifold concerns around Flock are laid out, they run the gamut from the absurd to the deadly serious. Fortunately, there are heartening signs of public attention and resistance. Momentum may well be building against Flock, thanks in no small part to independent researchers like Jordan and Gaines. The concerns have escalated to Congress: Sen. Ron Wyden (D-Oregon) and Rep. Raja Krishnamoorthi (D-Illinois) have called for a Federal Trade Commission investigation into Flock’s abuses and security negligence; Jordan has helped brief them on technical details.
But perhaps most crucial is the grassroots response. To take just one example: A concerned citizen in Skagit County, Washington, made public records requests for Flock data; at first, the county’s municipalities resisted his request. But ultimately, a state court ruled that ALPR data was indeed a matter of public record and should therefore be publicly accessible.
In recognition of the potential for ICE to access data, the City of Woodburn, Oregon, which has been subjected to intense immigration raids, shut off its Flock cameras for six months; activists hope the shutdown will be made permanent. And in Eugene, Oregon, protesters flocked to the city council to give testimony against the camera systems and police-ICE collaboration.
In fact, resistance has been mounting in quite a few of Flock’s client cities. Denver’s city council shot down efforts by Mayor Mike Johnston to renew a Flock contract without council approval and without input from his own surveillance policy task force; the council cited “serious concerns” over Flock’s “ethics, transparency and credibility” and spoke of the company’s “disregard for honesty and accountability.” Hundreds of Denverites packed a town hall meeting to give testimony pushing back against the continued presence of Flock in their city. Denver’s mayor nevertheless acted unilaterally to extend the contract — with his deceptive statements on the matter (he had said unequivocally that the system had never been used for immigration enforcement, an assertion that researchers found was impossible to substantiate) raising credible questions of ulterior motives, especially in light of Flock’s significant backroom influence. (The company has spent $690,000 on political lobbying in 2025 to date.)
Some laws regulating ALPRs are already on the books, and further legislation and regulatory measures are pending in multiple states, with two bills in Missouri proposing to ban ALPRs outright. But one of the best retorts to Flock’s superlative claims might be the case of a Colorado woman named Chrisanna Elser, accused of package theft on the basis of what police insisted were incontrovertible Flock and Ring camera images.
The Colorado Sun reported that, when delivering Elser her court summons, a Columbine Valley officer had boasted that the Flock-captured video evidence against her was “a lock. One hundred percent. No doubt.” But, remarkably, Elser proceeded to clear her name, by bringing forward evidence from her own cameras and location data from her car and phone. It seems that Flock’s claims to omniscience are far more fraught, subjective, and contestable than the defenders of mass surveillance would care to admit. Perhaps we can take a lesson from the belied confidence of the officer who had told Elser that Flock empowered his small police department to such a degree that there was no room for contention: “You know we have cameras in that town. You can’t get a breath of fresh air in or out of that place without us knowing.”