A Blog by Jonathan Low


Sep 20, 2017

The Movement To Expose Algorithms Determining Government Decisions and Services

Citizens, increasingly, are demanding the right to know what assumptions go into designing the algorithms that may govern important aspects of their lives - the proverbial black box.

This will get really interesting if they then demand the same information regarding their commercial transactions, as well. JL

Jim Dwyer reports in the New York Times:

Algorithms can decide where kids go to school, how often garbage is picked up, which police precincts get the most officers, where building code inspections should be targeted, an individual's liberty (as when they are used by the criminal justice system to predict future criminality), and even what metrics are used to rate a teacher. Few, if any, major cities in the United States require transparency for those computer instructions, or algorithms.
Let us say that James Vacca is not necessarily the first person you’d think would begin a deeply necessary revolution to peel away some of the secrecy around technology that shapes government decisions. In the 1980s, Mr. Vacca admitted, he told an aide that it would be a waste of money to replace office typewriters with word processors.
Yet on Thursday, Mr. Vacca, 62, a Democratic City Council member from the Bronx, introduced a bill that would require the city to make public the computer instructions that are used, invisibly, in all kinds of government decision-making. Experts say that few, if any, major cities in the United States require transparency for those computer instructions, or algorithms.
If the principles in Mr. Vacca’s bill become law, it could turn out to be as important to public life in the city and around the country as the smoking ban signed into law by Mayor Michael R. Bloomberg in 2002.
“I think I’m on to something that many people have spoken about, but have been unable to get their hands around,” Mr. Vacca, chairman of the council’s technology committee, said. “I’m trying to get my hands around something that affects millions of New Yorkers every day.”
In commercial life, algorithms help businesses use data collected from our digital footprints — information that is revealed from a search on the web, the use of an app, a subscription to a particular music service, shopping habits and so on. Algorithms can take into account the kind of phone you use or your ZIP code before you see a price while shopping online, as investigative journalists reported last year for ProPublica.
Governments also have access to oceans of data. Algorithms can decide where kids go to school, how often garbage is picked up, which police precincts get the most officers, where building code inspections should be targeted, and even what metrics are used to rate a teacher.
Mr. Vacca, who represents the East Bronx, said a mother might come to him when the Education Department’s algorithm sends her son to a distant high school.
She may say, “‘This school is his seventh choice. I work two jobs — how does he get to this school that he was assigned to when I can’t drive him?’” Mr. Vacca said.
Andrew Nicklin, who ran open data projects for the city and state and is now at Johns Hopkins University, said experts were still learning how algorithms affect society.
“New York City’s attempt to increase transparency is noble,” Mr. Nicklin said.
Naked algorithms are just bunches of code, and even experts can find it challenging to discern what values they express. So researchers are discussing ways to include public participation before they are written. “We can formalize certain notions of fairness and nondiscrimination, affirmatively, at the outset,” said Solon Barocas, a professor of information science at Cornell University.
At their most powerful, algorithms can decide an individual’s liberty, as when they are used by the criminal justice system to predict future criminality. ProPublica reporters examined the risk scores of 7,000 people assigned by a private company’s algorithm. The recidivism rankings were wrong about 40 percent of the time, with blacks more likely to be falsely rated as future criminals at almost twice the rate of whites, according to Julia Angwin, who led the investigation.
In testimony to a committee of the European Parliament, Danah Boyd, a principal researcher at Microsoft and the founder of the think tank Data & Society, cited a scheduling algorithm for big retailers that was designed to spread workers out as widely as possible — without taking into account the need for predictable work hours to take care of children, for instance, or the burden of double shifts. If programmers “don’t have clear direction, they’re going to build something that affects people’s lives in unexpected ways,” Ms. Boyd said.
Because some algorithms used by the city are leased from private companies, Mr. Vacca’s bill would require them to be available for “algorithmic audits.” These would allow the public to submit test data to see how the algorithm handles it. One analysis of the city’s teacher rating algorithm in 2009 and 2010 found a pattern of bizarre results, like an individual teacher who scored 97 in teaching sixth-grade math but only a 6 for seventh graders.
Mr. Vacca said he is not claiming that algorithms used by the city are necessarily flawed by bias, but their power cannot be ignored. As a committee chairman, he plans to convene hearings before he leaves office in December.
“Going forward in this technological age requires that we have this discussion,” Mr. Vacca said. “Is there public sentiment to examine these issues? I say yes.”
