While these debates are crucial, technology is increasingly becoming part of the criminal justice system. Based on my experience with electronic monitoring, it is clear that all criminal justice technologies need to be regulated, just as medical devices are. Electronic monitoring technologies have an enormous impact on the lives of individuals and entire communities, with the potential for life-altering consequences for freedom, employment, housing, and family and community unity. Just as with other high-impact technologies, we need to ensure open, transparent regulation of, and information about, each specific technology, population and implementation protocol.
While policymakers weigh the arguments for and against electronic monitoring technologies, the information available to them is almost always limited in one of two ways. Either it is biased information about a particular technology that invariably comes from the tech company selling it; or it comes from researchers who, limited by funding, treat electronic monitoring as a single category rather than engaging with the specifics of a particular technology used on a specific population with a specific implementation — an approach akin to researching cancer treatment as if it were one-size-fits-all rather than acknowledging the wide range of treatments for different cancers and unique patients.
Electronic Monitoring Is a Vast Umbrella of Technologies and Goals
The term electronic monitoring applies to a wide range of technologies. For instance, an ankle bracelet that uses radio frequency to confirm you are at home and must be charged four hours a day, a GPS app on a phone that detects location only once a day and requires random photo check-ins, and a chip implanted in someone's skin are all forms of electronic monitoring technology.
And these technologies are being used on a wide and varied population, including teenagers on pretrial supervision who are mandated to attend school, paroled sex offenders who are under house arrest, and asylum seekers who must wear an ankle bracelet for years (and pay for it).
Each kind of technology, and each situation, carries countless possibilities for technical and human failure, positive and negative impacts, and ways of meeting or falling short of its goals. For example, GPS tracking on a phone requires keeping the phone charged, maintaining both a cell signal and a GPS signal, and performing random biometric check-ins (to confirm that the mandated individual is with the phone) that can be disruptive and cause anxiety. An ankle bracelet does not require biometric check-ins, but it is far more cumbersome and may cause injuries and prevent the wearing of certain clothes — winter boots, for example.
The specific technical setup — frequency of GPS pings, biometric check-ins and alerts — has a big impact on battery life, which in turn could require individuals to recharge as often as multiple times a day, or as infrequently as a couple of times a week. The specific implementation protocols of electronic monitoring technologies also have a big impact on the cost to the individual and their community — for example, the consequences for an individual whose monitor runs out of battery, or who does not respond to a biometric check-in within 10 minutes, can be significant. The bottom line is that each of these technologies, uses and protocols needs to be analyzed and assessed separately.
Regulation Provides Neutral Information
Electronic monitoring technologies need to be regulated. Regulation of technologies means providing transparent information to the public and holding companies accountable to ensure that their equipment is achieving its goals and performing as promised.
For regulation of any technology to be effective, we need to be clear about what we want it to achieve. In the case of electronic monitoring, is the goal in a particular case to reduce failure-to-appear rates? Or is it to improve public safety? In order to assess performance, the first step is to be clear about what we hope the technology’s human outcomes will be.
Regulators would look not only at whether a technology achieves its stated goals, but also at its cost (in terms of time, money and resources), what the risks and consequences might be, and how easy it is to use, in both perfect and real-world conditions. For example, the Therac-25 radiation therapy machine worked perfectly in laboratory conditions, but burned or killed at least six patients in real-world use, where operators entering commands too quickly triggered a fatal software flaw. A recent study showed that asthma inhalers are used incorrectly up to 84% of the time, although when used correctly they are perfectly effective.
As with medical devices, different uses of electronic monitoring technologies require different tests.
If the goal is knowing where a person is 24/7, regulation would include analyzing how easy it is to keep the device on and charged at all times given the real-life circumstances of the particular population — how easy it is to regularly access a charger and remain beside that charger for the required number of hours, and how quickly the device's battery will drain.
If the goal is to allow an individual to continue to participate in positive life activities, we would want to look at whether an individual is able to play sports with an ankle bracelet, or watch a movie with alerts and check-ins beeping on their phone.
And if the goal is to keep someone out of jail, we would want to see how limits of the technology — for instance, a battery dying — play out in real-world conditions, and whether the implementation protocols result in more jail time when the device runs out of batteries.
One Real-World Experience
When we examined a specific form of electronic monitoring technology with young people, we found significant gaps between what was expected, and the on-the-ground reality. In 2015, the Center for Court Innovation, in partnership with the District Attorney of New York and Open Society Foundations, conducted a pilot with a small number of young adults who were released from custody on Rikers Island.
For this study, we looked at a two-piece electronic monitor — a Bluetooth ankle bracelet tethered to a smartphone that tracked a participant's location via GPS. We focused on the human impact. For example, when we found that the devices were running out of charge, we pored over data, spoke to parents and teachers, and watched the young people charge the devices. In some cases the cause was faulty technology; in others it was common teenage behavior — doing activities that rapidly drained the batteries, like watching YouTube or frequent texting, or simply forgetting to charge the device amid other instabilities in their lives.
We repeatedly asked the question: How is this technology able to meet the stated goals — knowing a young person’s location and whether they were in compliance with their mandates — given the realities of their lives? And at what cost?
We found significant gaps between our expectations and the technology’s performance. Only 5% of the alerts we received communicated accurate locations. The other 95% were either false location alerts or caused by low batteries, dead batteries, lost GPS, lost cell signal or tether breaks (when the device recorded that they walked away from the phone).
One of our participants averaged 30 alerts a day (alerts sent both to their phone and to the supervisors of the monitor). Sometimes participants had to wake up at 3 or 4 in the morning to reboot their phones to fix bad GPS signals. One participant got ankle blisters. Another got kicked out of class for the disruption caused by their alerts. To make up for the approximate accuracy of GPS, we had to set "location zones" (the perimeter where the young person is supposed to be) around home and school that were so large as to be meaningless. We found that with this particular technology being used by teenagers and young adults, batteries lasted half a day on average, leading to either major disruptions during school or a complete inability to get location information for significant parts of the day.
Based on this experience, I believe we need a regulatory body to ensure, in a methodical way, that criminal justice technologies adhere to basic principles of humane technology. Such a body would assess technologies from a neutral perspective and make its assessments and data transparently available to the public.
There have been many issues with regulatory bodies, and in this case there is an additional layer of complexity: the consumers of these technologies are not the only people impacted by their use. So any regulatory body — whether a government, private or nonprofit organization — would need to keep justice-involved individuals and communities at the forefront when assessing the impact of justice technologies. Regardless of the specific structure, regulators should test new technology for the specific proposed audience and implementation policies, looking at real-world best-case and worst-case situations. They should also conduct replicable tests, answering a set of well-defined questions such as:
- How effective is the technology?
- How easy is the technology to use?
- What are the potential risks to the participating individuals and communities if the technology is used correctly? If used incorrectly?
- How safe and secure is the data?
It is impossible to view any electronic monitoring technology as a simple "set it and forget it" alternative to jail. We need to acknowledge that the potential outcomes of any criminal justice technology are so significant — both for individuals and entire communities — that we can no longer leave the job of communicating about them to the technology companies that derive a profit from their use. The time has come for a rigorous regulatory approach that will help ensure that we aren't using technology to harm vulnerable populations.
Shubha Balasubramanyam is a director of technology at the Center for Court Innovation.
"Perspectives" is a regular feature written by guest authors on access to justice issues. To pitch article ideas, email firstname.lastname@example.org.
The opinions expressed are those of the author and do not necessarily reflect the views of the organization, or Portfolio Media Inc., or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.
Nancy G. Leveson and Clark S. Turner, "An Investigation of the Therac-25 Accidents," Computer 26, no. 7 (July 1993): 18-41, https://web.stanford.edu/class/cs240/old/sp2014/readings/therac-25.pdf
Perri Klass, "Using an Asthma Inhaler Correctly," The Checkup, The New York Times, March 11, 2019, https://www.nytimes.com/2019/03/11/well/family/using-an-asthma-inhaler-correctly.html
Shubha Balasubramanyam and Jethro Antoine, "Young Offenders, Electronic Monitoring, Cell Phones, and Battery Life," The Journal of Offender Monitoring 31, no. 1 (Spring/Summer 2018): 4-8, https://www.courtinnovation.org/publications/young-offenders-electronic-monitoring-cell-phones-and-battery-life
Shubha Balasubramanyam, "10 Principles for Humane Justice Technology," Center for Court Innovation, February 2019, https://www.courtinnovation.org/publications/10-principles-humane-justice-technology