Lecture | February 24 | 1:10-2:30 p.m. | 190 Doe Library
Aaron Halfaker, Principal Research Scientist, Wikimedia Foundation; Senior Scientist, University of Minnesota
Wikipedia has become a dominant source of reference information for more than half a billion people every month. Through its improbable rise to popularity, this "free encyclopedia that anyone can edit" has become a synecdoche for open production communities online. In order to operate at massive scale (~160k edits per day), Wikipedians have embraced algorithmic technologies that bring efficiency and consistency to the wiki's complex, distributed processes. These algorithms mediate social processes, governance decisions, and editors' perceptions of each other. In particular, so-called "black box" artificial intelligences have proven invaluable for supporting curation activities at scale, but they also have the potential to silence voices and introduce ideologically founded biases in insidious ways. Despite Wikipedians' open, auditable processes, that's exactly what's been happening. In this talk, I'll introduce "ORES," an open AI platform designed to enable Wikipedia's technologists to enact alternative ideological visions and to enable researchers to easily perform audits. I'll share some lessons we've learned maintaining a large-scale, generalized AI service and discuss a call to action directed toward critical algorithms researchers to take advantage of this platform for their studies.