Understanding Flock’s Testing and Development Program
This article explains how Flock tests its technology in real-world environments, strengthens search safeguards, and addresses recent privacy questions about its development practices.
Testing Flock’s Machine Learning and Search Moderation
We are constantly refining our machine learning to ensure we can accurately identify license plates and objects. Here's a concrete example: when the Tesla Cybertruck came out, we had to build a whole new ML algorithm to identify it. Nothing like it had been seen before. This requires testing and training the models in real-world conditions.
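Flock has not published the details of its vision models, but the general pattern here is familiar transfer learning: take a pretrained image backbone and fine-tune it on labeled real-world examples of the new vehicle type. A minimal PyTorch sketch, where the dataset path and class count are hypothetical:

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Pretrained backbone; only the classification head is replaced.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
NUM_CLASSES = 12  # hypothetical: existing vehicle types plus the new Cybertruck class
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Hypothetical folder of labeled real-world captures, one subfolder per class.
data = datasets.ImageFolder(
    "captures/labeled",
    transform=transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ]),
)
loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```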
Similarly, we have instituted a content moderation policy across one of our search features, FreeForm Search. FreeForm allows law enforcement to search in plain terms for the objective evidence they need to solve a case. LPR cameras can use Vehicle FreeForm, allowing police to search for things like "blue car spray-painted with yellow graffiti". Video cameras have additional FreeForm capabilities that allow officers to search for characteristics of people, like "man wearing a cowboy hat". *Note: LPR cameras never allow People FreeForm.
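Flock has not disclosed how FreeForm works internally. One common way to support this kind of plain-language search over camera imagery is a joint text-image embedding model such as CLIP: images are embedded ahead of time, the query is embedded at search time, and the closest images are returned. A rough sketch using the open-source open_clip library, where the model choice and the precomputed image index are assumptions:

```python
import torch
import open_clip

model, _, preprocess = open_clip.create_model_and_transforms("ViT-B-32", pretrained="openai")
tokenizer = open_clip.get_tokenizer("ViT-B-32")

# image_embs: hypothetical precomputed, L2-normalized embeddings of
# vehicle crops from camera footage, shape (num_images, 512).
image_embs = torch.randn(1000, 512)
image_embs /= image_embs.norm(dim=-1, keepdim=True)

query = "blue car spray-painted with yellow graffiti"
with torch.no_grad():
    text_emb = model.encode_text(tokenizer([query]))
    text_emb /= text_emb.norm(dim=-1, keepdim=True)

# Cosine similarity ranks every stored image against the query text.
scores = (image_embs @ text_emb.T).squeeze(1)
best = scores.topk(5).indices  # indices of the five closest matches
```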
We have also designed and implemented a moderation policy to ensure First Amendment protections are baked in: freedom of speech, religion, and assembly.
Some specific guardrails instituted in the moderation policy are the following (a simplified sketch of how such a filter might be structured appears after the list):
- Inappropriate or offensive words or phrases, such as specific racial or antisemitic slurs or sexual innuendo, will not yield results.
- Searches using an individual’s name or other personal information will not yield results.
- Searches for specific content that could potentially implicate First Amendment rights will trigger a pop-up that warns the user and advises that, if they choose to proceed, the search will be sent to a superior for auditing.
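Flock has not published the moderation policy's implementation; the sketch below simply maps the three guardrails above onto a query filter. The term lists and name heuristic are hypothetical placeholders for what would, in practice, be maintained lexicons and learned classifiers:

```python
import re
from enum import Enum

class Verdict(Enum):
    ALLOW = "allow"
    BLOCK = "block"              # offensive terms or personal information
    WARN_AND_AUDIT = "warn"      # potential First Amendment implications

# Hypothetical placeholder lists; a real system would use maintained lexicons.
BLOCKED_TERMS = {"<slur>", "<innuendo>"}
PROTECTED_CATEGORY_TERMS = {"cross", "star of david", "mosque", "protest", "flag"}
# Crude full-name heuristic standing in for real PII detection.
NAME_PATTERN = re.compile(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b")

def moderate(query: str) -> Verdict:
    q = query.lower()
    # Naive substring matching for brevity; production code would need
    # word-boundary handling ("across" should not match "cross").
    if any(term in q for term in BLOCKED_TERMS):
        return Verdict.BLOCK
    if NAME_PATTERN.search(query):
        return Verdict.BLOCK                  # names / personal information
    if any(term in q for term in PROTECTED_CATEGORY_TERMS):
        return Verdict.WARN_AND_AUDIT         # user warned; search routed for review
    return Verdict.ALLOW
```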
The latter requires a complicated training protocol to ensure the moderation policy works as designed. For example, some vehicle bumper stickers contain words and symbols indicating religious affiliations or nationalities, and the moderation policy needs to recognize, flag, and block these searches.
Just like vehicle identification, these algorithms need to be trained in the real world using real-life examples. Those might be training and demo searches for terms like "Cross bumper sticker," "Star of David," or "Japanese flag." These searches, if performed, should not return results.
This is the same principle behind testing in cybersecurity: you simulate the attack to prove the defense holds. If we don't test whether an antisemitic slur gets blocked, we can't guarantee it will be blocked when it matters.
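In engineering terms, that principle is a negative test suite: run the queries that must never return results and assert that the filter catches them. A minimal pytest sketch against the hypothetical moderate() filter from the earlier example:

```python
import pytest
from moderation import Verdict, moderate  # hypothetical module from the sketch above

MUST_NOT_RETURN_RESULTS = ["Cross bumper sticker", "Star of David", "Japanese flag"]
MUST_PASS = ["blue car spray-painted with yellow graffiti", "man wearing a cowboy hat"]

@pytest.mark.parametrize("query", MUST_NOT_RETURN_RESULTS)
def test_protected_searches_are_caught(query):
    # Anything other than ALLOW means the query never reaches the index.
    assert moderate(query) is not Verdict.ALLOW

@pytest.mark.parametrize("query", MUST_PASS)
def test_ordinary_evidence_searches_pass(query):
    assert moderate(query) is Verdict.ALLOW
```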
In Dunwoody, a Flock employee performed a demo of this content moderation policy by searching for both "Star of David", which our search moderation tool blocked, and "Cowboy hat," which the search moderation tool allowed. The test logs from Dunwoody reflect exactly this: authorized content moderation testing, conducted to verify that the system catches what it's supposed to catch.

Public-Private Camera Sharing
Similarly, one of the benefits communities most value in Flock technology is the ability for law enforcement to directly access privately owned cameras for crime-solving and security purposes, if and only if the owning organization allows it. This is also a feature that must be tested and demoed, both to ensure we get everything right on the technical side and so other agencies and businesses understand how the sharing works.
Our testing and demo partners grant us permission to demonstrate these sharing connections.
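Conceptually, that consent gate is a simple access-control check: an agency can query a privately owned camera only if the owner has an explicit, active sharing grant. The data model below is purely illustrative, not Flock's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Camera:
    camera_id: str
    owner_org: str
    # Agencies the owner has explicitly granted access to; empty by default.
    shared_with: set[str] = field(default_factory=set)

def can_search(camera: Camera, agency_id: str) -> bool:
    """An agency may search a privately owned camera only with an explicit grant."""
    return agency_id in camera.shared_with

cam = Camera("cam-001", owner_org="community-center")
cam.shared_with.add("dunwoody-pd")      # the owner opts in
assert can_search(cam, "dunwoody-pd")
assert not can_search(cam, "other-agency")
```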
Modifications We Are Making to Our Testing Partnerships
Fair questions have been asked about conducting demos on cameras in sensitive locations while doing this critical real-world testing. Last week in the City of Dunwoody, questions were raised about a demo on cameras at a local Jewish Community Center, conducted as part of authorized activity approved under the city's demo partner agreement.
Although the camera was only viewed once during a routine demo, we understand that this is a sensitive location for many. We have therefore determined that employees will be trained to only conduct demos in more public locations, like retail parking lots.
What We're Doing
We take every concern about how our technology is used seriously, even when the underlying claims are wrong. That's why every search in the Flock system is permanently logged and auditable, data is deleted after 30 days by default, and local agencies - not Flock - control who can access their data.
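Those controls correspond to standard engineering patterns: an append-only audit record written for every search, and a retention job that purges data past its age limit. A simplified sketch; the field names are assumptions, and the 30-day default comes from the paragraph above:

```python
import datetime as dt
from dataclasses import dataclass

RETENTION = dt.timedelta(days=30)    # default; the retention window is agency-controlled

@dataclass(frozen=True)              # immutable: audit entries are never edited
class AuditEntry:
    user_id: str
    agency_id: str
    query: str
    reason: str                      # e.g. a case number justifying the search
    timestamp: dt.datetime

def purge_expired(detections: list[dict], now: dt.datetime) -> list[dict]:
    """Keep only detection records younger than the retention window."""
    return [d for d in detections if now - d["captured_at"] < RETENTION]
```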
A full overview of our compliance tools is available at flock.com/trust.
A Final Word
Accusing someone of spying on children is not a policy disagreement; it is a life-altering allegation. The individuals being named online are well-intentioned employees who accessed a camera network with the city's explicit permission, as part of their job. They are now being called predators for it.
When false accusations this serious are used to win a political argument, real people pay the price, and transparent tools that actually solve crime get replaced with either no tools or less transparent technology.
We will continue to be transparent about how our technology works, how it's tested, and how it's governed, rather than letting false narratives go unchallenged. Safety is a fundamental right for every community.