Working Where Statistics and Human Rights Meet

When we tell people that we work at the intersection of statistics and human rights, the reaction is often surprise. Everyone knows that lawyers and journalists think about human rights problems … but statisticians? Yet documenting and proving human rights abuses frequently requires quantification.

In the case of war crimes and genocide, guilt or innocence can hinge on questions of whether violence was systematic and widespread or one group was targeted at a differential rate compared to others. Similar issues can arise in assessing violations of civil, social, and economic rights. Sometimes the questions can be answered through simple tabulations, but often, more-complex methods of data collection and analysis are required. This is something Richard Savage, former chair of the Department of Statistics at Yale, knew well when he called for more statisticians to get involved in human rights data analysis in his 1984 presidential address to the Joint Statistical Meetings.

When Scott Evans approached us about helping to curate a special issue of CHANCE that would look at the use of statistical analyses in human rights investigations, the request came at a particularly opportune time. Like all researchers, we both focus on specific areas, including estimating war deaths and developing statistical underpinnings for forensic science (with the goal of reducing wrongful convictions). In the wake of a 2016 conference on quantitative data and human rights at Duke University, though, we had been discussing the range of arenas in which statisticians can contribute to investigating potential human rights abuses. Current events have further highlighted the need for this kind of thinking.

Formally, the term “human rights” covers a range of individual rights—rights intrinsically due to all individuals simply because those individuals exist. The most famous document to outline these core rights is the Universal Declaration of Human Rights, adopted by the United Nations on December 10, 1948. These include the right to equality before the law; presumption of innocence; freedom from arbitrary detention; access to education, work, and housing; and many others.

The Declaration was a nonbinding statement; however, those rights have since been codified in multilateral treaties, primarily the International Covenant on Civil and Political Rights and the International Covenant on Economic, Social, and Cultural Rights. Many countries have national laws addressing economic, social, and civil rights as well.

Additionally, many group rights have been established through a series of international agreements that codify what governments can and cannot do. Starting with the Geneva Conventions in 1949, a body of international law has developed that includes agreements on how war can be fought. Commonly referred to as international humanitarian law, these treaties attempt to limit the violation of human rights even in the context of military action.

The substantive range of the stories in this issue highlights the breadth of issues that are, at their core, about human rights. While the pieces may at first seem unrelated, an underlying theme comes up repeatedly: the need for good data and analyses to assess whether violations of rights occur, and to direct efforts to make change.

David Hemenway tackles a topic that’s been in the news: gun violence. He highlights the need for better surveillance systems, the special interests that fight against them, and the ways better data collection can upend myths and document problems that deserve a public health response.

Matt Jans and colleagues show us how statisticians working on large public surveys are trying to improve measurement of sexual orientation and develop measurements of gender identity, noting that the first step in ensuring rights is often documenting the size and existence of a minority community.

Art Kendall tells the story of a consulting project in which he helped Maryland Legal Aid conduct a quantitative analysis of cases in the Maryland Rent Court System, an analysis that showed high levels of rights violations in failure-to-pay-rent cases. Kendall came to the project through the AAAS Scientists on Call program, and he also showcases Statistics Without Borders, an organization that pairs statisticians with clients who need pro bono consulting.

Looking beyond U.S. borders, Phuong Pham and Patrick Vinck explain how their use of qualitative and quantitative methods, including embedding qualitative questions in sample surveys, has led to insights in conflict and post-conflict settings that they would have otherwise missed.

Finally, one of us—Megan Price—highlights how statistical analyses can play an important role in international war crimes trials. She describes the use of multiple systems estimation to estimate homicide rates during Guatemala’s civil war and how that work played out in the trial of former military dictator Efraín Ríos Montt.
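Multiple systems estimation builds on the classic capture-recapture idea: when several independent lists each document some of the same deaths, the overlap between lists tells us about the deaths no list recorded. A minimal two-list sketch, using toy numbers rather than any real casualty data, looks like this:

```python
def chapman_estimate(n1, n2, m):
    """Chapman's nearly unbiased variant of the Lincoln-Petersen
    capture-recapture estimator.

    n1, n2 -- number of deaths documented on each list
    m      -- number of deaths found on both lists after record linkage
    """
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Toy numbers: list A documents 400 deaths, list B documents 300,
# and matching finds 100 deaths that appear on both lists.
estimated_total = chapman_estimate(400, 300, 100)
print(round(estimated_total))  # prints 1194
```

Real applications, such as the Guatemala analysis, use more than two lists and model dependence between them, since lists of conflict deaths are rarely independent; this two-list version only illustrates the core logic.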

As this issue goes to press, we are surrounded by reminders that if one cares about truth, good data matter, and we cannot afford to take such data for granted.

Last summer, the Brookings Institution and Rolling Stone both published editorials worrying that the 2020 Census would fail. Robert Shapiro, who as under secretary of Commerce for Economic Affairs under President Clinton oversaw the 2000 Census, wrote the Brookings piece. He highlighted a range of actions by the current administration, noting that “[i]t is no surprise that the Government Accountability Office recently designated the 2020 Census as one of a handful of federal programs at ‘High Risk’ of failure.” The administration’s decisions to limit crucial funding, and its failures to fill leadership roles, have been widely documented.

Yet another important issue may play a role as well. The U.S. Census asks residents to provide information about characteristics such as their race. These data are what allow researchers to document disparities and monitor progress; however, history has shown us how population data systems can be used to target minority communities as well.

For a census to achieve full and accurate responses, people must trust their government. Many commentators have noted that the most likely problems in 2020 will include undercounting of disadvantaged communities, including minorities, since funding needed to reach them is limited at the same time trust in government is eroding. This will have consequences even if the data are never directly misused: biased counts could, for example, be used to draw congressional districts and to apportion funding for services.

We cannot afford to take good data for granted, and this issue reminds us of the value that good data and careful research can provide.

About the Authors

Robin Mejia, PhD, MPH, holds a special faculty appointment in the Department of Statistics and manages the Statistics and Human Rights Program at the Center for Human Rights Science at Carnegie Mellon University. Before coming to Carnegie Mellon, she worked both as a statistician and as a journalist for outlets including the Los Angeles Times, Washington Post, Science, Nature, and CNN. Her current research focuses on understanding risk factors for human rights violations and estimating the prevalence of conflict events, as well as improving statistical practices in forensic science applications.

Megan Price is the executive director of the Human Rights Data Analysis Group, where she designs strategies and methods for statistical analysis of human rights data for projects in a variety of locations, including Guatemala, Colombia, and Syria.

*The authors are writing a book about statistics and human rights, due out later this year from Taylor & Francis.
