Last fall, MIT Technology Review invited its readers to play a courtroom algorithm game, asking, “can you make AI [Artificial Intelligence] fairer than a judge?” The answer, for the moment, is no. Algorithms, like judges, make trade-offs as they assess risk, make predictions, and allocate resources. Unlike judges, however, algorithms are calibrated on training data sets, which often contain bias, and run on software that may not be suited to the task. These shortcomings present students of computer engineering and public policy with a formidable challenge: how can we make AI designed for courtroom settings more fair?
In CS182W, “Ethics, Public Policy, and Technological Change,” students who have spent much of their undergraduate career studying engineering learn principles and frameworks from philosophy and political science that help them wrestle with the value trade-offs of any design. The students who come to this class without the engineering training of CS majors are also asked to stretch, learning enough about the affordances and limitations of machine learning algorithms to assess, among other things, their use in decision making. This deep commitment to interdisciplinary learning and teaching is apparent in every aspect of the course, from the composition of the teaching team to the varied lecture formats and writing assignments. As Professor of Computer Science Mehran Sahami puts it, the course helps students “build bridges between disciplines and affirms their agency.” In all its facets, CS182W challenges students to think of themselves as enablers and makers of technological change.
Integrating Disciplines Pedagogically
About 300 students, most of whom are CS majors, are taking CS182W in winter 2020; almost 95 will take it to fulfill their WIM requirement. The 5-unit class meets three times a week for lectures, panel discussions, and interactive sessions dubbed “tensions and trade-offs” [download syllabus here]. Students also meet with their graduate teaching assistants in a discussion section each week.
Lectures are given by Sahami and Professors Rob Reich and Jeremy Weinstein. Reich is the Director of the McCoy Center for Ethics in Society; Weinstein has worked at the highest levels of American government. The three work as a team, showing up to all lectures to facilitate an interdisciplinary and sustained conversation about four thematic units related to algorithmic decision-making, data collection and privacy, autonomous systems, and private platforms. Their lectures introduce students to the relevant computer science, the competing values at stake, and the policy implications. In discussion section and through their writing, students then make choices, designing a product, system, or policy that adjudicates the competing values.
Lectures are complemented by panel discussions so that students can hear from people making these trade-offs in their day-to-day work. For example, in the unit on algorithmic decision-making, students heard from a panel made up of the faculty leaders and Frida Polli, PhD, CEO and Co-Founder, Pymetrics, and Jeff Thamkittikasem, Director of the NYC Mayor’s Office of Operations. In the Tensions & Tradeoffs sessions, held in a large hall four times a quarter, students are given the chance to wrestle with hard questions in small groups. For instance, with reference to the recent Google employee resistance to the company’s work with U.S. Customs and Border Protection, students are asked to imagine themselves as employees of a company that has just announced a new Department of Defense contract for new AI technologies, not for weapons systems, but for image recognition and data analysis. Professor Reich asked, “Would you support or oppose your company working on this project? As a company leader, how would you weigh the concerns of vocal employees in the decision about whether or not to do this work? As a company based in the U.S., what obligations does the company have to the country?” In a small group, students practice articulating their stance and the values behind it. If ready, they might even call for the mic and go public with their ideas, a step that they will be asked to take in their writing, too.
Readings are robust and carefully curated, assigned to build students’ background knowledge, model interdisciplinary argumentation, and spark responses. Students encounter well-known scholars--John Rawls, CP Snow, and Cass Sunstein, for example--as well as pieces from Wired, Harvard Business Review, and MIT Technology Review. For those students new to these issues or for those who just want more, supplementary reading is suggested; for example, before a lecture on data collection and civil liberties, students might read “The End of Trust” from McSweeney’s and the Electronic Frontier Foundation. A prompt in the syllabus encourages students to read actively. A student prepared for section is one who is able to articulate, for each assigned reading:
● what the main claims offered in the texts or case studies are;
● the arguments offered in favor of these claims;
● whether these are good or plausible arguments;
● whether the claim is, all things considered, strong or plausible;
● what alternatives to the claims and arguments exist; and
● whether some alternative is superior to the claim under discussion.
Graduate teaching assistants from computer science, philosophy, political science, law, and sociology lead discussion section, focusing on student engagement with key concepts in the readings. This approach to reading prepares students to write, to join the conversation and contribute an argument that will be heard.
A senior CS major reports that prior to taking CS182W, she had not had to read or write academic papers in her CS classes. While she knew little writing was expected in the undergraduate curriculum, she also knew that a career in CS would require writing. CS182W was her first exposure to articles explaining technical topics to a more generalist audience, and she learned a lot about communication strategies just by reading what was assigned. Allison Tielking, who also took the class in 2019, especially appreciated a podcast on the labor costs behind the smiles on Amazon’s brown boxes. Encountering so many voices in the reading helped her think on her own, she says.
Beyond Optimization: Writing to Find Better Answers
In CS182W, students write approximately 30 pages in total over the course of the quarter: four writing assignments plus a final exam. In all of the assignments, students identify, confront, and referee value trade-offs. They write to authentic audiences in varied genres and through distinct disciplinary lenses, evaluating fairness in a real-world setting. There is no singularly correct answer, just better and worse answers. As Professor Reich explains, “Some domains of life resist optimization. For example, your life partner, is there a single right person? No! But you will agree there are better and worse [people]” and that you need to be able to articulate the values that are driving your selection of a life partner.
Like the readings, lectures, and section discussions, the writing assignments ask students to integrate disciplinary knowledge. Students write two technical briefs, about four pages each; a collaborative ten-page policy memo; and a ten-page philosophy paper. When asked why students majoring in CS should learn to write a policy or philosophy paper, Professor Reich replied, “the great power of technologists to create technologies of enormous scale and impact mean that there are ethical and social dimensions in their creations that deserve as much attention as the technical creation itself. There are value trade-offs to be confronted in all the technologies of our lives. Engineers alone shouldn’t be making the tradeoffs on their own for the rest of us. And the policy makers and the philosophers also need to be fully informed citizens,” able to engage meaningfully with new technologies.
In the first technical writing assignment, students assess a decision-making algorithm that predicts criminal recidivism and informs courtroom sentencing. This assignment is inspired by a reading from the class: the widely read 2016 ProPublica article “How We Analyzed the COMPAS Recidivism Algorithm.” Students are first asked whether or not the provided code is biased. They then revise the code (they may use Java or Python) to make its results less biased and write paragraph-long explanations for each of their revisions, drawing on concepts such as classification parity, calibration, and disparate impact from the readings and lectures. Finally, they defend the “fairness” of their coding decisions in a memo to the Chief Technology Officer of a company selling the software. In this assignment, the writing of code, philosophy, and policy are intertwined. The second technical writing assignment is similar to the first, but it aligns with the final thematic unit on the power of networks. In this assignment, students analyze how the parameters of a social network recommendation system can affect political polarization.
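To give a flavor of the kind of analysis the assignment calls for, the fairness concepts named above can be made concrete in a few lines of Python. The sketch below is illustrative only, not the course’s actual starter code: the record fields, risk scores, and two toy defendant groups are invented for demonstration. It computes one measure of classification parity (comparing false positive rates across groups) and a simple calibration check (the observed reoffense rate among defendants who received the same risk score).

```python
# Illustrative sketch (not the course's starter code) of two fairness
# metrics from the readings: classification parity and calibration.

def false_positive_rate(records):
    """Fraction of defendants who did NOT reoffend but were flagged high risk."""
    negatives = [r for r in records if not r["reoffended"]]
    if not negatives:
        return 0.0
    return sum(r["high_risk"] for r in negatives) / len(negatives)

def calibration_by_score(records, score):
    """Observed reoffense rate among defendants assigned a given risk score."""
    scored = [r for r in records if r["score"] == score]
    if not scored:
        return 0.0
    return sum(r["reoffended"] for r in scored) / len(scored)

# Toy data: each record is one hypothetical defendant.
group_a = [
    {"high_risk": True,  "reoffended": False, "score": 7},
    {"high_risk": True,  "reoffended": True,  "score": 8},
    {"high_risk": False, "reoffended": False, "score": 3},
    {"high_risk": False, "reoffended": False, "score": 2},
]
group_b = [
    {"high_risk": False, "reoffended": False, "score": 3},
    {"high_risk": True,  "reoffended": True,  "score": 9},
    {"high_risk": False, "reoffended": False, "score": 2},
    {"high_risk": False, "reoffended": True,  "score": 4},
]

# Classification parity asks: do non-reoffenders in each group face the
# same chance of being flagged high risk? Here group A's non-reoffenders
# are flagged at a higher rate than group B's -- a disparity a student
# revision would aim to reduce.
fpr_a = false_positive_rate(group_a)  # 1 of 3 non-reoffenders flagged
fpr_b = false_positive_rate(group_b)  # 0 of 2 non-reoffenders flagged
print(f"FPR group A: {fpr_a:.2f}, FPR group B: {fpr_b:.2f}")
```

The tension students must then write about is that metrics like these can conflict: equalizing false positive rates across groups can worsen calibration, and vice versa, so the “fair” choice is a value judgment that the memo to the CTO has to defend.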
The group policy paper also asks students to formulate and defend a recommendation. The prompt asks, “Should Stanford adopt data-driven personalized advising?” In this hypothetical scenario, Stanford is considering adopting software that surveils students closely to make advising recommendations. The software aims to identify students most at-risk of an academic or personal crisis and help them quickly. Based on research, including interviews with varied stakeholders, from students to teachers and administrators, each group formulates a policy recommendation for Stanford Provost Persis Drell. Their recommendations must consider the technologies and articulate their values. For example, a group might argue for end-to-end encryption because they believe in privacy (making the surveillance project and approach to advising impossible) or they might want to advocate for more decentralized AI architectures with much less access to data, greater anonymity and protections, and a limited personalized approach to advising. In other words, students are invited to consider the ways the technology informs the larger tradeoffs. A senior reports the policy paper was “fun to do. Working in a group was unique and fun. I got to know my classmates.”
In the philosophy paper, students address morally informed policy questions related to autonomous systems. Drawing on the assigned reading, they craft a closely reasoned answer to one of three challenging prompts. One, for example, asks them to weigh the trade-offs in designing autonomous vehicles for safety: should vehicles be optimized for passenger safety or global human welfare? Should a safer roadway for motorists be prioritized over a marginally less safe roadway for bicyclists? A workshop run by graduate students in philosophy helps students respond to these questions in a discipline-appropriate way. Moreover, those taking CS182W for their WIM requirement revise their philosophy paper in response to feedback from both their TAs and a Technical Communication Program tutor.
In his handout to students on writing a successful philosophy paper, Professor Reich articulates a cornerstone of writing pedagogy: the process of writing and revising your writing is a way to get clear about what you think. That’s because, as he says, “there are not uniquely correct answers in a philosophy paper the way there are in a problem set. You have to enhance the clarity of your thinking which you can only do in revision. So it’s not just an added bonus--like an extra credit problem on a problem set--it’s better thinking.” A senior remembers, “as an engineer, people don’t push you to elucidate what your work means, but being asked to think critically about what this [work] means in a philosophy paper, not just for you but for everyone” was enlightening.
Separate rubrics are developed for each of the assignments, and the TAs meet ahead of deadlines to discuss grade norming. The TAs with a technical background, typically in computer science, grade the technical papers. The TAs from philosophy, political science, or law grade the philosophy paper. All TAs grade the policy paper. The teaching team looks forward to a day when all TAs are as comfortable with technical writing as they are with humanities and social science argumentation, and vice versa.
Another key feature of the assessment in this class is anonymized grading, to which the teaching team is “really committed,” says a former student. All writing assignments are submitted with an ID number only. This procedure reduces favoritism and inconsistent grading. The senior had never seen anonymized grading in a class at Stanford and deemed it “a good idea” as it “took away a lot of biases.”
Thinking about Impact
This course, which evolved from the ethics, policy, and communications concerns of legendary Stanford Computer Science Professors Terry Winograd and Eric Roberts, confronts the ethical issues of AI that have arisen with the ascendancy of machine learning in the last five years. CS182W represents a historic educational shift because it asks students to analyze algorithms for ethical problems as well as practice three different genres of writing about ethics and computing.
One student observed that the class got people in technology majors “excited about writing. Assignments played to everyone’s strengths and gave them an opportunity to expand [their skills].” “Writing” in CS is often a presentation, the student reports. Writing in new genres in CS182W, however, taught her to think critically about issues at “the intersection of tech and society.” She says that in the “next few years policy and tech will become ever more intertwined so knowing how to do the research to defend a point of view will become ever more valuable.” She adds she grew “even more as an engineer.” She now thinks carefully about “societal context and impact,” and is especially interested in sociology and “thinking more critically about how technology impacts vulnerable communities.” As a soon-to-be graduate looking for a job at a technology company, she is asking, “as a junior engineer, could I speak up? Is the product ethical?”
Allison was inspired by a class focused on the Uber whistleblower Susan Fowler. She felt the ideas “swimming around in her head,” and “planted them” in a Stanford Daily article that critiqued Lyft’s safety protocols. Because of her piece, which invited Stanford women to share their stories and garnered national attention, both Lyft and Uber made big changes to their apps. Allison says, “[CS182W] makes you feel more confident about speaking out” and “crystallized her values.”
Winter 2020 is the second time the teaching team has taught the class. From their first experience, they learned they needed to specify the writing expectations further. Because the class was made up predominantly of CS majors, students weren’t familiar with communication conventions in policy or philosophy. Two things have made a difference here: assignment sheets that name the expectations more carefully, and workshops outside of class that describe the why, what, and how of writing a policy memo and framing an argument in philosophy. Even though the topics and readings are highly interdisciplinary, the writing itself conforms to many of the dominant conventions of particular fields. In CS182W, students thus practice the qualities of clear thinking that transcend discipline as well as the distinctive rhetorical moves of a technical brief, philosophy paper, and policy memo.
For Sahami, the course is important for computer scientists who are also interested in democratic decision making. And he wants engineers to learn that philosophy can have direct relevance to decision making. Reich hopes that the interdisciplinary team’s success teaching a large course shows the model could be used in other majors such as bio-engineering or earth systems. He’s pleased with the outcomes he’s seen in the student writing. As he says, “when expectations are appropriately set, students respond with enormous appreciation for feedback on their writing. With sustained attention to their writing, the typical student recognizes they’ve grown through revision.”