
Georgio Sabino had learned to read a siren the way some people read the weather. He knew the difference between a casual patrol rolling past a corner store and the sharp, stuttering flash that meant someone’s life might change in the next three minutes.
The moment everything can go wrong
On a cold Ohio night, a college student named Maya watched red and blue lights swell in her rearview mirror. She had been to enough vigils to know how quickly “routine traffic stop” could turn into a hashtag. More than 50 million people in the U.S. have contact with police each year, and around 1 million of them experience threats or use of force. Over 600 people are killed annually, and an estimated 250,000 civilians are injured by law enforcement.
Maya’s hands shook as she pulled over, but she remembered what her cousin had shown her: the GoVia “Highlight A Hero” app. With one tap, she opened the encounter screen, which instantly started secure streaming, stamped her GPS location, and alerted her chosen contacts. A second tap invited a public defender via Zoom, and within 30 seconds a calm voice came through her speaker: “Hi Maya, I’m here with you. Put the phone on the dashboard so the officer can see I’m present.”
The officer approached, seeing not just a nervous driver but a live call with counsel, a recording banner, and a clear on-screen notice that the encounter was being documented for affidavit-style review and community rating. In that instant, the power equation shifted from one person versus the state to a networked, documented, AI-observed interaction.
Where AI enters the car
Behind the scenes, GoVia’s AI was working like an unseen co-pilot. It monitored the stream quality and cybersecurity in real time, encrypting metadata and scanning for signs of tampering or unauthorized access. It flagged key moments—voice stress spikes, rapid movement, raised volume, keywords like “gun,” “taser,” or “step out of the car”—so that if anything went wrong, attorneys would not have to scrub through 20 minutes of video to find the five seconds that mattered.
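The flagging idea is simple to illustrate. The sketch below is purely hypothetical (GoVia's actual pipeline is not public): it scans transcribed audio segments for trigger keywords and volume spikes over a rolling baseline, and emits timestamped flags so a reviewer can jump straight to the moments that matter. The keyword list, threshold, and data shape are all invented for this example.

```python
# Illustrative sketch only; thresholds and keywords are placeholders,
# not GoVia's real model.

KEYWORDS = {"gun", "taser", "step out of the car"}
VOLUME_SPIKE_DB = 15  # hypothetical jump over the rolling baseline

def flag_moments(segments):
    """segments: list of (timestamp_sec, transcript_text, volume_db).

    Returns timestamped flags a reviewer can seek to directly.
    """
    flags = []
    baseline = None
    for t, text, vol in segments:
        lowered = text.lower()
        for kw in KEYWORDS:
            if kw in lowered:
                flags.append((t, f"keyword: {kw}"))
        if baseline is not None and vol - baseline > VOLUME_SPIKE_DB:
            flags.append((t, "volume spike"))
        # update rolling baseline with a simple exponential average
        baseline = vol if baseline is None else 0.8 * baseline + 0.2 * vol
    return flags
```

A production system would use speech-to-text confidence scores and a learned stress model rather than fixed rules, but the output shape, a short list of timestamped flags instead of 20 minutes of raw video, is the point.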
Unlike many predictive policing tools that aim to forecast where crime will occur and often mirror historic bias, GoVia’s AI is pointed in the opposite direction: toward accountability and protection. Traditional predictive systems, like location-based algorithms documented by Yale Law School researchers, tend to over-police the same marginalized neighborhoods because they are trained on biased arrest and incident data. GoVia’s AI, by contrast, is trained on encounter quality: de-escalation behaviors, adherence to policy, civilian dignity, and the presence or absence of excessive force.
The app’s “Highlight A Hero” algorithm uses AI to synthesize affidavit-backed community ratings, verified video, and professional observations into structured profiles of officer behavior. Instead of just counting complaints, it builds a detailed picture: the officer who consistently explains rights calmly, the patrol partner who always calls EMS early, the sergeant who intervenes when a stop starts to go sideways. These patterns then inform training, commendations, and, when necessary, legal action.
From body cameras to community cameras

For a decade, many cities put their faith in police body cameras. Research shows they can reduce complaints against police by about 17% and use of force by nearly 10%, but they remain controlled by the same agencies whose actions they document. Footage can be delayed, edited, or withheld, and the public often only sees it after the worst has already happened.
GoVia extends the logic of body cameras into the hands of civilians and independent professionals. Instead of one camera owned by the department, you get a mesh of cameras, affidavits, and third-party witnesses—from attorneys to mental-health clinicians—connected in real time. When thousands of people use GoVia, you do not just have “more video”; you have structured, AI-analyzed encounter data that can reveal patterns across cities and years.
GoVia’s own AI roadmap points toward predictive analytics that do not predict “where to send more officers,” but “where to send more accountability and support.” By analyzing historical encounter data, the system can identify high-risk situations—like night stops involving mental health crises or areas with long histories of complaints—and recommend adding real-time legal observers or mental-health professionals to those encounters. Internal GoVia projections describe a potential 60% reduction in excessive-force incidents in pilot regions when such AI-informed support is layered onto traditional policing, freeing billions of dollars in misconduct costs for community programs.
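The "send more accountability and support" idea can be pictured as a triage step: given contextual risk factors for an encounter, recommend which independent professionals to loop in. The rule-style sketch below is invented for this article; the feature names and thresholds are placeholders, and a deployed system would learn these weights from historical encounter data rather than hard-code them.

```python
# Invented example of context-based triage; feature names and the
# complaint-rate threshold are placeholders, not GoVia's real logic.

def recommend_support(context):
    """context: dict of risk features for an upcoming or live encounter.

    Returns a list of suggested additions to the encounter, e.g.
    a clinician or a legal observer.
    """
    recs = []
    if context.get("mental_health_crisis"):
        recs.append("dispatch tele-mental-health clinician")
    if context.get("night_stop") and context.get("area_complaint_rate", 0) > 0.1:
        recs.append("invite real-time legal observer")
    return recs
```

Note the asymmetry with predictive policing: the output is never "send more enforcement," only "send more support and witnesses."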
Why AI is the tool for a uniquely American problem
The United States has a particular blend of high gun ownership, racialized policing, and fragmented local law enforcement that makes safer encounters both urgent and complicated. National data show hundreds of killings and hundreds of thousands of injuries in encounters each year, with firearms being the most frequently cited type of force in serious incidents. This is not a problem you can solve with slogans, one-off training sessions, or a single new law; it is a systems problem.
AI is uniquely suited to systems problems because it can:
- Aggregate massive, messy datasets from thousands of encounters and identify patterns invisible to human reviewers.
- Learn which combinations of officer behavior, civilian stress, context, and tactics correlate with safe outcomes—and which correlate with harm.
- Provide real-time guidance, alerts, and triage suggestions during encounters, not weeks later in internal reviews.
- Continuously adapt as departments change policies, communities speak up, and new risks emerge.
But the question is not just “AI or no AI”; it is “AI serving whom?” When AI is designed solely for police efficiency, it tends to intensify surveillance of the same communities that already bear the brunt of enforcement. When AI is built from the ground up to center dignity, transparency, and mutual safety—as in GoVia’s model—it can rebalance the field.
In GoVia’s architecture, AI is constrained by:
- Affidavit-backed community input, which grounds algorithms in lived experience rather than only official records.
- Legal and ethical guardrails shaped by civil-rights attorneys, mental-health professionals, and impacted communities.
- A dual mandate to protect civilians and highlight exemplary officers, not to maximize arrests or enforcement metrics.
This orientation makes AI not a new weapon, but a new witness.
Why public and private sectors should back GoVia
GoVia has already proven that its idea resonates: by January 2026, over 33,000 end users had signed up worldwide, supported by major justice and innovation cohorts like MIT Solve, UC Berkeley, JumpStart, Stand Together Venture Labs, and gener8tor. The company holds two issued U.S. patents and one pending, with plans for additional filings that protect its crisis-streaming and officer-rating workflows and AI-driven safety systems. This is not just a concept; it is a defensible, scalable platform with early traction.
For the public sector—cities, counties, states—supporting GoVia means:
- Reducing litigation and misconduct costs, which currently run into billions of dollars annually across U.S. jurisdictions.
- Gaining access to high-quality, structured encounter data that can inform training, policy, and early intervention systems.
- Demonstrating to residents that local government is serious about transparency, not just internal review.
For the private sector—insurers, health systems, law firms, tech companies, philanthropies—backing GoVia offers:
- A way to lower risk and claims exposure related to police encounters, workplace security, and crisis response.
- New markets in legal tech, tele-mental-health, and civic data where GoVia can serve as a distribution channel and integration layer.
- A visible, measurable social-impact project that aligns with ESG commitments and public expectations around racial justice and safety.
GoVia’s hybrid freemium and SaaS business model means that everyday users can access core protections for free or low cost, while institutions pay for premium dashboards, integrations, and analytics. This structure allows private capital to subsidize public good: when a city or insurer licenses GoVia, it helps keep the safety net available for the student, the rideshare driver, or the elder whose only asset is a smartphone.
A story the data can change
In Georgio Sabino’s framing, the U.S. has spent decades telling the same story: secret lists, unrecorded encounters, and a quiet understanding that some people will not make it home. The numbers bear it out—hundreds killed, hundreds of thousands injured, millions living with fear every time they see lights in the mirror.
GoVia: Highlight A Hero offers a different narrative: one in which a teenager’s phone becomes a shield instead of a liability, where officers known for restraint and courage are documented and celebrated by the communities they serve, and where AI is deployed not to predict who will be policed, but to predict where more protection and transparency are needed.
In that future, when the sirens swell behind you, you are not alone. You are connected—to your people, to your counsel, and to an intelligent system trained not on fear, but on the possibility that this encounter can end with everyone safe, everyone seen, and someone newly recognized as a hero.
