The Harvard Crimson writes, “Two researchers discussed the potential for innovations in the use of artificial intelligence and digital phenotyping to advance social justice causes at a Harvard Law School panel Wednesday.”
The panel, titled “Computational Justice,” was the latest installment of the Project on Law and Applied Neuroscience, an event series by the Center for Law, Brain, and Behavior at Massachusetts General Hospital and the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School.
The goal of the panel was to help legal professionals “improve their paved route to justice” by using neuroscience technology to tailor approaches to individual cases, according to moderator Francis X. Shen, the executive director of the Center for Law, Brain, and Behavior.
Panelist Rediet Abebe ’13, a computer science Ph.D. candidate at Cornell and Junior Fellow at Harvard, discussed how she uses machine learning to study the way poverty, disease, inequality, and other factors impact underrepresented groups like pregnant women.
“A lot of the data is very hard to come by,” Abebe said. “It’s not collected in a sort of systematic manner, it’s not comprehensive, you cannot regularly sort of rely on it.”
“And what this does is it basically makes it very challenging to identify sort of gaps in our policies in terms of existing communities and also where there might be interest that we haven’t thought about,” she added.
Later in the event, Harvard Medical School assistant professor Justin T. Baker discussed his research on psychiatric conditions like bipolar disorder and schizophrenia, explaining how he designs technological systems tailored to people’s specific behaviors and needs.
In one experiment, Baker used geospatial recording to track the locations of a patient with bipolar disorder for more than two years, monitoring when the patient ended up in the hospital and the patterns of behaviors they exhibited beforehand.
Audience member Lauren M. Chambers questioned that kind of surveillance during the discussion portion of the talk. Chambers, who serves as a technology fellow with the American Civil Liberties Union of Massachusetts, referred to a hearing she attended at the Massachusetts State House on a bill that would impose a moratorium on the government's use of facial surveillance technology.
“Some of the concerns that we have about facial recognition are that, as much as we might want to bring this objective algorithm into our systems to reduce human bias in decision-making processes, the technology is not up to those standards,” Chambers said. “How are you making sure that the things you’re measuring for these kinds of processes are free from different sorts of biases?”
In response, Baker said that researchers developing new systems for tracking and treating mental illness should strive to consider data sets "within a particular cultural context."
“Part of the rationale for developing objective measures is to develop things that would be at least applicable across cultures,” Baker said.
He added that his tracking methods focus not on broad diagnoses, but on individual patients’ behavior and how it changes over time.
“In mental health especially, diagnosis may be of marginal value,” Baker said. “It’s really about detecting what someone’s state is, and then detecting how the state changes over time.”