Flock Group Inc. Exploits Public Need

Flock cameras exploit the slow movement of government with the fast-paced development of LLM-based technology

Fri Mar 20 2026
The Name is Michael

I attended the Troy City Council meeting on March 19, 2026 to speak on the dangers of Flock Group Inc. cameras in our community: specifically, how autonomous, profit-driven LLM bots like Moltbot could exploit insecure cameras, and how data brokers could profit by selling that data to governments. I listened to the conversations involving the council, the administration, and the affairs representative from Flock. I don't have every question and answer written down; I trust the council to deliberate based on those exchanges, and on the more than 40 public testimonies against this technology. What I want to write about is the context in which this happened.

In 2021 the City of Troy entered a contract with Flock Group Inc. for these cameras without review from the council. Yet now we are upset with this choice. Why the change of heart toward a service the city once hoped would become a lasting tool to aid safety? The answer is that the advancement of software from 2021 to 2026 has changed the core business operations behind Flock Group Inc.'s product. When Flock Group Inc. first offered their services to the Troy Police Department, we existed in a pre-LLM age where the software technology for image recognition was less mature. Now we live in an age where LLM technology has leveled up the capabilities of everything.

Another important concern is how we got here: the contract auto-renewed before the council could weigh in on the situation. The combination of 1) a contract that renews without review, and 2) a software service that redefines how it stores and uses public data as LLM tools become core business features, is extremely concerning, because it allows private corporations to exploit the slow movement of government for their own benefit.

When I listened to Mayor Mantello's administration speak jointly with the Troy Police Department, I heard officers who were using a tool to research license plates for ongoing criminal investigations. They picked this system because other municipalities also subscribe, allowing for tracking of stolen vehicles across county and state lines. They picked this system because it helped save people in mental distress from harming themselves. The list goes on, but to say the least, the tool is used throughout their day-to-day operations so that they can do their jobs faster and safer.

I understand that need, especially as one officer said that they have used LPR (License Plate Reader) technology for 17 years to assist them in their jobs. However, Flock Group Inc. has become more than an LPR tool in the span from 2021 to 2026. Flock Group Inc.'s representative was a nontechnical "affairs representative" who punted any difficult technical question about how Flock Group Inc. moves data, retains data, and removes data, or which third parties are used to achieve their feature set. Any corporation worth their salt would have either their sales rep or technical rep available to answer these questions. It's common practice in the business world to have a technical person on hand to explain the hard questions when the customer starts asking. Instead we had navel-gazing from a man who said "I'll get back to you later on that." This is purposeful ignorance to avoid culpability for what they are doing now: implementing LLM technology to replace their image-recognition technology, both as a cost-saving measure and as a feature upgrade.

What the representative and police officers did say is that the cameras send the data to a Flock Group Inc. server, where it is stored for 30 days and then deleted from the city's record. However, the contract with Flock Group Inc. says that they own the data forever for their own internal uses. So that footage is there to be cited and used for any feature the system as a whole can offer. It may be completely true that the Troy Police Department cannot see the data anymore, and at the very same time that Flock Group Inc. retains the data for training and referential purposes across their larger system. This is the problem with companies like Flock Group Inc.: they seek ownership of the data for purposes outside of the contract's obligations, and they misinform their clients.

The image-recognition technology of 2021 was not LLM-based, yet even then it could be used for a myriad of purposes. LLM technology is now the "it" technology for processing large amounts of data, and it is far more capable than the image recognition of 2021. LLM technology is also a black box, especially given whatever corporate partnerships may exist between Flock Group Inc. and providers like OpenAI or Anthropic. Flock's cameras send the full frame of video to the Flock Group Inc. servers, where it is stored with some sort of metadata tagging for vehicles and other objects of interest. The simplest path for Flock Group Inc. to leverage the new LLM technology would be to send the entire stream of frames to an LLM agent for analysis. This means the video would be used to train LLM bots on public images, without the consent of the public.

This brings me to my chief concern: insecure cameras. If these cameras are insecure, then anyone on the internet can access those frames and build their own database. That means anyone with a freely available LLM bot like Moltbot can set it up to commercialize the data as a data broker, to learn about the habits of the public for advertising, or to sell the data to any government. These bots operate at machine speed, making decisions about these images in fractions of a second, in service of whatever goal their operator has defined.

At the end of the day, Troy needs a tool that stays an LPR. A tool that stays the form that we agreed to years ago. Officers trust their vest to stay a vest, their gun to stay a gun, their radio to stay a radio, and so on. This LPR is not an LPR anymore, but a cog in an exploitation machine, due to a lack of care for software quality and for staying true to its mission.