PERSPECTIVE: Government Algorithms and the Public’s Right to Know
By Mitchell W. Pearlman

In many respects, computers have made life easier. But they have also made life quite a bit more complicated. For example, before the computer age most government documents were on paper. Today, people need access not only to government information on computer media and in computer-readable formats, but also to the computer programs and systems government uses to make policy and other important decisions. Yale Law Professor Jack Balkin calls this “algorithmic transparency.”
An algorithm is a step-by-step procedure for performing a task. People use simple algorithms in their lives every day, such as for adding numbers or sorting books. By contrast, the algorithms used in computers are often highly complex, requiring sophisticated logic and advanced mathematical modeling in the design and functioning of the applications in which they are embedded. Because of this, algorithms are considered intellectual property and deemed trade secrets by most private enterprises and many government agencies that create and use them.
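To make the idea concrete, the short Python sketch below spells out one everyday algorithm, sorting a shelf of books by title, as an explicit sequence of steps. The code and the titles are offered purely as an illustration, not as anything drawn from an actual government system.

# A simple step-by-step procedure (an algorithm) for sorting book titles.
# Illustrative only; the titles are placeholders.
def sort_books(titles):
    shelf = list(titles)  # work on a copy of the original list
    for i in range(len(shelf)):
        # Find the alphabetically first title among the still-unsorted books.
        smallest = i
        for j in range(i + 1, len(shelf)):
            if shelf[j] < shelf[smallest]:
                smallest = j
        # Move that title into its proper place on the shelf.
        shelf[i], shelf[smallest] = shelf[smallest], shelf[i]
    return shelf

print(sort_books(["Walden", "Beloved", "Middlemarch"]))
# Prints: ['Beloved', 'Middlemarch', 'Walden']

Each step is explicit and can be inspected, which is precisely what becomes difficult when far more elaborate procedures are locked away as proprietary code.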
Governments now gather almost incomprehensible amounts of information, organize it into vast databases, and make and implement important decisions using the algorithms they create or purchase. Governments use computer algorithms in making tax policy and budget decisions; they use them in forecasting various transportation and infrastructure needs; and they use them in analyzing public health and environmental issues and formulating policy based on those analyses. Of course, if the data used are less than complete or accurate, or if the algorithms themselves are based on flawed reasoning or assumptions, then government policies and decisions based on them will likewise be flawed. Such errors can lead not only to unsound decisions but also to an enormous waste of public resources and even to significant loss of life.
In Connecticut, as elsewhere, various government agencies forecast income and expenditures to help guide lawmakers in constructing state budgets. Each of these agencies has access to the same data sets. But the assumptions programmed into their algorithms can differ significantly, leading to different conclusions about whether a budget will or will not be in balance. The 2018 state budget was out of balance by several hundred million dollars just weeks after it was enacted. How did this happen? Was it because the data were faulty? Or was it because the algorithms, and the assumptions built into them, were wrong?
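The point can be illustrated with a deliberately simplified sketch. In the hypothetical Python example below, two forecasting offices are given the identical revenue figure, but each office’s algorithm embeds a different growth assumption; the numbers are invented for illustration and are not Connecticut’s actual figures.

# Hypothetical illustration: the same data, two different built-in assumptions.
def project_revenue(last_year_revenue, assumed_growth_rate):
    # Each office's algorithm embeds its own assumption about revenue growth.
    return last_year_revenue * (1 + assumed_growth_rate)

shared_revenue_data = 19_000   # last year's revenue, in millions (hypothetical)
planned_spending = 19_500      # proposed spending, in millions (hypothetical)

for office, growth in [("Office A", 0.035), ("Office B", 0.015)]:
    projected = project_revenue(shared_revenue_data, growth)
    balance = projected - planned_spending
    verdict = "in balance" if balance >= 0 else f"short by {-balance:.0f} million"
    print(f"{office}, assuming {growth:.1%} growth, finds the budget {verdict}.")

# Office A, assuming 3.5% growth, finds the budget in balance.
# Office B, assuming 1.5% growth, finds the budget short by 215 million.

Same data, two plausible-sounding assumptions, two opposite conclusions. Which assumption is closer to reality is exactly the kind of question the public cannot ask if the algorithm is hidden.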
The first shots in the battle for algorithmic transparency have already been fired. Recently the New York City Council passed an algorithmic accountability bill, which establishes a task force to study how city agencies use algorithms to make decisions and to report its findings within 18 months of passage. The bill was enacted in the wake of revelations that an algorithm used to assess the risk posed by criminal defendants was racially biased. The algorithm’s source code was confidential until a federal judge ordered it disclosed, and only then was the bias identified.
Trade secrets and confidential commercial information – which are exempt from public disclosure under current Freedom of Information laws – often represent a significant financial investment by those enterprises and organizations that create or own them. On the other hand, computer algorithms are now – and increasingly will be – vital components in government policy and other decision making. To prevent significant errors or miscalculations in the future, many government algorithms need to be transparent so they can be publicly vetted before policy decisions are made or legislation becomes law.
Thus, proprietary rights face an important competing value when they would prevent the disclosure of information in which there is a legitimate public interest. The notion of an informed and knowledgeable electorate is one of the cornerstones of our country’s democratic tradition. To paraphrase the Connecticut Supreme Court in another context, trade secrets and confidential commercial information must give way when balanced against the publication of matters of public interest, in order to ensure the “uninhibited, robust and wide-open discussion of legitimate public issues.”
So in this case, as in others before it, the balance of competing interests must be resolved in favor of algorithmic transparency to the greatest extent possible. This is not to say that government should be excused from providing some measure of just compensation when it discloses secret or confidential proprietary information. But the bottom line is that algorithmic transparency is essential to the continuance of our democratic system of governance.
___________________________
Mitchell Pearlman is the former executive director of the Connecticut Freedom of Information Commission. He currently teaches law in the Journalism Department of the University of Connecticut at Storrs. Rogers Epstein, MIT class of 2019, also contributed to this article. The article is abstracted from a “White Paper” published by the Connecticut Foundation for Open Government (CFOG), which can be found in its entirety at www.ctfog.org.
PERSPECTIVE columns from contributing writers appear each weekend on Connecticut by the Numbers.