00:00 A man sits at his kitchen table. The morning light cuts across the worn wood.
00:04 He sorts through his mail, a small pile of bills and advertisements. Then he sees it,
00:10 an official-looking envelope from the local police department. His heart beats a little faster.
00:16 He has not been in trouble for years, not since he was much younger. He carefully tears it open.
00:21 The letter inside is short and direct. It informs him that based on a recent analysis,
00:27 he has been identified as a person at high risk of being involved in future criminal activity.
00:33 It offers him a meeting with a social worker. It also warns him that he will be subject to
00:39 increased attention from law enforcement. He reads the words again. High risk. The man did nothing
00:47 wrong. This is the quiet reality of predictive policing. So, how does this technology actually
00:54 work? Predictive policing systems are essentially data-crunching engines. They take in vast
01:00 amounts of information and search for patterns. Think of it like a weather forecast, but for human
01:06 behavior. The system's goal is to make a prediction. That prediction might be a place.
01:13 For example, the software highlights a three-block area as having a high probability of car break-ins this week.
01:19 Police departments can use this to decide where to send patrol cars. This is called place-based
01:26 predictive policing. It seems logical: send officers where crime is most likely.
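To make the place-based idea concrete, here is a minimal sketch of the counting logic such a system might rest on. Everything in it (the grid cells, the incident records, the forecast_hotspots function) is invented for illustration; real vendor software uses more elaborate statistical models, but the core move of treating past recorded incidents as a forecast of future ones is the same.

```python
from collections import Counter

# Hypothetical historical incident records: (grid_cell, offense_type).
# In a real deployment these would come from years of police report data.
historical_incidents = [
    ("cell_12", "car_break_in"), ("cell_12", "car_break_in"),
    ("cell_07", "burglary"),     ("cell_12", "theft"),
    ("cell_03", "car_break_in"), ("cell_07", "car_break_in"),
]

def forecast_hotspots(incidents, top_n=2):
    """Rank grid cells by past incident counts and flag the busiest ones.

    The core intuition of place-based prediction: concentrations of
    *recorded* crime are read as predictions of future crime in the
    same places, and patrols are sent there.
    """
    counts = Counter(cell for cell, _ in incidents)
    return counts.most_common(top_n)

for cell, count in forecast_hotspots(historical_incidents):
    print(f"{cell}: flagged as a hotspot ({count} recorded incidents)")
```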
01:32 Another, more controversial type creates risk scores for people. This is what happened to the
01:38 man with the letter. Person-based systems assign a numerical score to individuals. The score represents a
01:44 statistical estimate of the risk of future involvement in crime. To calculate the score, the algorithm looks at many
01:51 data points. Criminal history, including every arrest, even dropped charges. Age. Known associates.
02:00 Whether they have been a victim of violence. Where they live.
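A toy version of such a score might look like the sketch below. The field names and weights are invented for illustration; real person-based systems are proprietary, and their exact formulas are generally not public.

```python
from dataclasses import dataclass

@dataclass
class PersonRecord:
    # The same kinds of data points named above; the values are illustrative.
    arrests: int               # every arrest, including dropped charges
    age: int
    flagged_associates: int    # "known associates" who already carry high scores
    victim_of_violence: bool
    neighborhood_rate: float   # recorded-crime rate where the person lives

def risk_score(p: PersonRecord) -> float:
    """Toy weighted sum over the inputs; the weights are invented."""
    score = 0.0
    score += 10.0 * p.arrests
    score += 5.0 * p.flagged_associates
    score += 8.0 if p.victim_of_violence else 0.0
    score += 20.0 * p.neighborhood_rate
    score += max(0, 30 - p.age)   # younger people often score higher in such models
    return score

person = PersonRecord(arrests=2, age=24, flagged_associates=1,
                      victim_of_violence=True, neighborhood_rate=0.7)
print(risk_score(person))  # a single number now stands in for a person's future
```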
02:07 And that historical data is not neutral; arrest records reflect where police have patrolled most heavily, not only where crime has occurred. When you feed biased data into an algorithm, the algorithm learns the bias. It does not know the history behind the numbers. It only sees that more
02:16 arrests happen in a specific neighborhood. So the machine concludes that this neighborhood is a high
02:22 crime area. It then tells the police to patrol that area even more. The increased police presence leads
02:30 to more arrests for minor infractions that might go unnoticed elsewhere. These new arrests are fed back
02:37 into the system as more data. The system sees them and becomes more certain it was right. Its prediction
02:44 has been confirmed. This creates a destructive feedback loop.
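The loop is easy to see in a small simulation. The sketch below assumes two neighborhoods with identical underlying offense rates, allocates patrols in proportion to past recorded arrests, and lets more patrols produce more recorded arrests; every number in it is made up purely to illustrate the mechanism described above.

```python
import random

random.seed(0)

# Two neighborhoods with IDENTICAL underlying offense rates (an assumption
# made so the loop itself is the only thing driving the divergence).
TRUE_OFFENSES_PER_WEEK = {"A": 20, "B": 20}
DETECTION_PER_OFFICER = 0.02      # chance one officer records any given offense
TOTAL_OFFICERS = 10

# Historical bias: neighborhood A starts with more recorded arrests.
recorded = {"A": 30, "B": 10}

for week in range(20):
    snapshot = dict(recorded)
    total = sum(snapshot.values())
    for hood, offenses in TRUE_OFFENSES_PER_WEEK.items():
        # Patrols are allocated in proportion to past recorded arrests...
        officers = TOTAL_OFFICERS * snapshot[hood] / total
        # ...and more patrols mean more offenses get recorded, which feeds
        # straight back into next week's allocation.
        p_detect = min(1.0, officers * DETECTION_PER_OFFICER)
        recorded[hood] += sum(random.random() < p_detect for _ in range(offenses))

print(recorded)  # A pulls further ahead, and the system reads that as confirmation
```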
02:52 A neighborhood is labeled a hotspot, not because residents are more criminal, but because they are more
03:00 policed. The data paints a picture of a dangerous place, and that picture becomes the community's
03:07 reality. People there live under constant suspicion. They are stopped more often. They are questioned
03:14 more frequently. Their lives are disrupted by a cycle started by a computer program acting on flawed
03:22 historical information. When we talk about these systems, we must ask a fundamental question about
03:28 fairness. Is it fair to treat a person differently based on a prediction? A fair justice system is
03:36 supposed to be based on evidence. It is supposed to be based on individual actions. It punishes you for
03:43 what you did, not for what a computer thinks you might do. Predictive systems turn this principle on its
03:50 head. They prejudge people. A high-risk score acts like a digital stain. It colors how the world sees
03:58 you before you have even done anything. This can lead to a presumption of guilt, where law enforcement
04:04 is already primed to view someone as a suspect. Furthermore, these systems raise serious concerns about
04:11 privacy. To generate a risk score, the software needs access to a huge amount of personal information.
04:20 It collects data about your past mistakes. It collects data about where you live. It collects data about who
04:27 you know. In some cases, it can even scrape data from your social media posts. This information is compiled
04:35 and analyzed without your knowledge or consent. You become a collection of data points, a profile to be
04:43 assessed for risk. We stand at a crossroads. Technology is offering us powerful new tools for law enforcement.
04:51 But these tools force us to ask hard questions about the kind of society we want to live in.
04:58 Who gets to decide what is just? Should that power be handed over to a private company and its secret
05:04 algorithm? Is justice simply a matter of statistical efficiency, of reducing crime numbers at any cost?
05:12 Or is justice about something more? Is it about dignity, equality, the right to be judged as an
05:17 individual? When a machine predicts a person's future, it is making a profound judgment. We must be
05:24 very careful about giving that power away. So what can we do? We are not powerless in the face of
05:31 these systems. The first step is to demand transparency and accountability. City governments should not
05:38 be allowed to purchase and use these technologies in secret. There must be public debate and community
05:44 input before a system is adopted. Independent audits should be required. The future these systems predict is not
05:51 yet written. We still have the power to shape it.