“From George Floyd to Digital Redlining: Why Predictive Policing Fails in Los Angeles—and Why GoVia Highlight A Hero Matters”

Authorities say the ambush unfolded in seconds. A Los Angeles County sheriff’s deputy, 30-year-old field training officer Ryan Clinkunbroomer, was stopped at a red light outside the Palmdale station on a Saturday evening when a gray Toyota Corolla pulled alongside his marked patrol car and someone opened fire. A passerby saw the deputy slumped over, raced into the station for help, and triggered a manhunt that would end two days later with a barricaded suspect surrendering to a tactical team.

The shooting and its aftermath

Clinkunbroomer, an eight-year LASD veteran, was found in “medical distress” around 6 p.m. and rushed to Antelope Valley Medical Center, where he died of a gunshot wound. Investigators described the attack as a “cowardly” ambush targeting a uniformed deputy in a clearly marked vehicle, just yards from the station’s front doors. Security cameras at the station captured the shooting and the suspect vehicle, and county leaders quickly posted a reward of roughly $250,000 for information leading to an arrest and prosecution.

Within 36 hours, attention focused on a 29-year-old Palmdale man, Kevin Salazar, who barricaded himself in a home for several hours as sheriff’s special enforcement units, negotiators, and armored vehicles surrounded the property. After prolonged negotiations and the use of de‑escalation tactics, he surrendered and was taken into custody with weapons recovered at the scene, according to sheriff’s officials. The case crystallized two profound tensions in American criminal justice: the acute vulnerability of officers to targeted violence, and the deep, long-running distrust between heavily policed communities and the institutions sworn to protect them.

AI justice: promise and peril

Across the United States, law enforcement agencies are turning to artificial intelligence tools that promise to predict where crime will occur, flag “high‑risk” individuals, and sift millions of data points—from license plate readers to social media posts—at machine speed. Predictive policing systems ingest historical crime data, map patterns, and generate “hot spots” or risk scores that command staff then use to deploy patrols or focus investigations. Proponents argue that these tools offer unprecedented efficiency, letting stretched departments concentrate resources where they are most needed while claiming an aura of algorithmic objectivity.

But the same historical data that feeds these systems often encodes decades of over‑policing in Black, Latino, and poor neighborhoods, creating what scholars describe as a self‑perpetuating feedback loop: more patrols lead to more recorded incidents and arrests, which lead algorithms to send even more officers back to the same streets. Legal analysts warn that when AI‑driven tools disproportionately target certain communities, they raise serious questions under the Fourteenth Amendment’s Equal Protection Clause, which bars discriminatory government practices even when bias is hidden inside “neutral” code. Defense attorneys are already challenging the use of AI‑generated evidence and risk scores in court, arguing that opaque systems can quietly tilt outcomes—from bail to sentencing—without meaningful oversight or transparency.
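The feedback loop described above can be made concrete with a toy simulation. The sketch below is purely illustrative: it assumes two neighborhoods with identical underlying crime rates, a hypothetical starting imbalance in historical records, and a naive allocation rule; none of these numbers describe any real system.

```python
import random

random.seed(0)

# Two neighborhoods with the SAME underlying rate of detectable incidents.
TRUE_RATE = 0.1                  # hypothetical chance an incident is observed per patrol-visit
recorded = {"A": 60, "B": 40}    # historical records: A starts out slightly over-policed

def allocate_patrols(recorded_counts, total_patrols=100):
    """Naive 'predictive' allocation: patrols proportional to recorded incidents."""
    total = sum(recorded_counts.values())
    return {k: round(total_patrols * v / total) for k, v in recorded_counts.items()}

for year in range(10):
    patrols = allocate_patrols(recorded)
    for hood, n_patrols in patrols.items():
        # More patrols -> more incidents *recorded*, even though true rates are equal.
        recorded[hood] += sum(random.random() < TRUE_RATE for _ in range(n_patrols))

print(allocate_patrols(recorded))  # allocation still skews toward A, mirroring the initial bias
```

Even with identical true rates, the initial imbalance in the historical data never washes out: neighborhood A keeps receiving more patrols, which generates more records, which justifies more patrols. That is the self-perpetuating loop scholars describe.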

Community eyes, algorithmic lenses

The Palmdale ambush investigation shows both the power and the limits of technology in real time. Station cameras, likely combined with community surveillance systems, captured the suspect vehicle and helped rapidly focus the investigation. Officials pleaded with residents to review home and business cameras, underscoring how ordinary people and their devices now sit at the center of critical evidence chains—from identifying a car’s trajectory to corroborating a timeline.

In other cases, bystanders’ phones have done what official cameras did not or would not. The murder of George Floyd in Minneapolis was exposed primarily because a teenager’s smartphone video captured the deadly encounter in full, a recording later described in court as the “star witness” against Derek Chauvin. That case became a global shorthand for the idea that community‑generated video can both secure convictions against abusive officers and erode faith in police when accountability fails.

Tech companies are trying to formalize this ecosystem. Platforms have built tools that let residents voluntarily share doorbell or store‑camera footage with investigators, with uploads preserved as evidence and chain‑of‑custody logged automatically. In pilot programs, agencies can push verified updates to nearby residents—warning them to avoid an area, asking for specific video, or correcting rumors as an incident unfolds. Yet critics fear that, without guardrails, these same systems can become an informal, privatized surveillance web that extends policing reach deeper into already monitored communities. 

Where GoVia Highlight A Hero fits

A platform like GoVia Highlight A Hero could sit at the intersection of these tensions, turning community participation from a one‑way extraction of data into a two‑way exchange of recognition, transparency, and control.

In a case like Clinkunbroomer’s killing, GoVia could:

  • Enable controlled evidence sharing
    Residents near the Palmdale station could receive geofenced prompts asking whether they wish to review and share relevant video or photos from the time of the ambush, with clear options to anonymize, time‑limit, or revoke consent. This mirrors emerging community‑evidence tools but makes community control explicit rather than implied.
  • Protect chain of custody and privacy
    Media shared through the app could be encrypted end‑to‑end, time‑stamped, and logged, creating a secure audit trail similar to professional evidence platforms while masking exact addresses or user identities from broad internal access. That helps investigators act quickly while reducing the risk that residents’ data becomes a permanent trove for unrelated investigations or future AI training. 
  • Humanize both sides in real time
    In the aftermath of the deputy’s death, GoVia could host moderated tribute spaces where community members highlight acts of service by officers—mentorship, crisis de‑escalation, lifesaving interventions—alongside stories of residents who have stepped in during emergencies. This “hero” framing offers a counter‑narrative to the binary of “cop vs community,” emphasizing shared stakes in safety.
  • Add community oversight to AI use
    If agencies plan to use AI tools—license plate readers, facial recognition, predictive patrols—GoVia could embed public-facing “AI disclosures” explaining what tools are in use, what data they ingest, and how long information is stored, in language accessible to non‑experts. In‑app surveys could let residents vote on acceptable uses, such as violent‑crime investigations, and flag red lines, such as immigration enforcement or protest monitoring.
  • Build a transparent feedback loop
    When residents submit tips or video, they rarely hear what happened next. GoVia could provide status updates—“evidence received,” “reviewed by detectives,” “used to identify vehicle,” “helped clear an innocent person”—and aggregate those outcomes into public statistics showing how community contributions actually shape cases. That visibility is key to sustaining engagement, especially in neighborhoods skeptical of how information is used.
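Several of the features above—logged chain of custody, auditable status updates—hinge on one mechanism: a tamper-evident log. The sketch below shows the idea using only Python’s standard library; the `CustodyLog` class and the event strings are hypothetical, and a real system would add encryption, access control, and signed timestamps on top.

```python
import hashlib
import json
import time

class CustodyLog:
    """Tamper-evident, hash-chained log for community-submitted media.
    Illustrative sketch only: each entry stores a hash of the previous
    entry, so editing any record breaks every later link in the chain."""

    def __init__(self):
        self.entries = []

    def append(self, event: str, media_hash: str) -> dict:
        prev = self.entries[-1]["entry_hash"] if self.entries else "0" * 64
        entry = {
            "ts": time.time(),
            "event": event,            # e.g. "evidence received", "reviewed by detectives"
            "media_hash": media_hash,  # hash of the uploaded file, not the file itself
            "prev_hash": prev,
        }
        entry["entry_hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; any edited entry invalidates the log."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev_hash"] != prev:
                return False
            body = {k: v for k, v in e.items() if k != "entry_hash"}
            digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if digest != e["entry_hash"]:
                return False
            prev = e["entry_hash"]
        return True

# A resident's clip moves through the pipeline; each status is an auditable link.
log = CustodyLog()
clip = hashlib.sha256(b"doorbell-clip-bytes").hexdigest()
log.append("evidence received", clip)
log.append("reviewed by detectives", clip)
print(log.verify())  # True for an untampered chain
```

Because the log records only hashes, investigators can prove a clip was received at a given time and never altered, without the platform retaining broad access to the footage itself—the property the bullet points above describe as protecting both chain of custody and privacy.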

Asking Los Angeles: would you want this?

For GoVia Highlight A Hero to have legitimacy in a place like Los Angeles County, the question cannot be whether law enforcement wants another tool. The question must be whether residents believe such a platform would serve them as well as it serves the state.

A rigorous community conversation might ask LA County residents:

  • Would you use an app that lets you decide when and how to share video or tips with investigators in cases like the Palmdale ambush—if you could see exactly how your information was used and had the option to withdraw it later?
  • Would you be more willing to cooperate if the same platform also highlighted stories of everyday heroism by both officers and neighbors, and published clear data on AI tools, stops, complaints, and outcomes in your neighborhood?
  • What safeguards would you require—such as independent oversight boards, data‑deletion guarantees, bans on using shared information for immigration or protest monitoring, or strict limits on AI‑driven “risk scores” tied to individuals?
  • In communities with long histories of over‑policing, would a system like this feel like a bridge toward co‑produced safety—or a more polished interface for the same unequal power?

A deep‑dive, multi‑newsroom investigation into AI and justice in Los Angeles today would likely find a city already saturated with cameras and data, where violence against officers and abuses by officers coexist in a fragile, often combustible equilibrium. The challenge for a platform like GoVia Highlight A Hero is not just technical design, but democratic design: building a system in which residents of Palmdale and across LA County can see themselves not merely as sources of evidence, but as co‑authors of what justice, safety, and fairness should look like—online and on the street.
