Ethical Reflection Modules for CS 1
- Evan M. Peck, Associate Prof. of Computer Science, Bucknell University
Image by Balu Ertl
| Activity Quick Link | Programming Topic |
|---|---|
| Developers as Decision-Makers | Conditionals |
| Developers as Gatekeepers | Functions & Data types |
| Developers as Future Makers | For Loops & Lists |
| Developers as Image Manipulators | Nested Loops & 2D Lists |
| Developers as Prioritizers | OOP / APIs |
In Fall 2019, I redesigned our CS 1 course to integrate practice-based (coding!) reflection directly with technical concepts. This is a space to share those activities. Their goals are to:
- Introduce a deeper level of reflection in CS 1 courses. I want students to see that their actions either directly or indirectly impact people, communities, and cultures, and that this impact is often not felt equally by different groups of people (along lines of gender, race, class, geography, etc.).
- Develop reflection habits alongside coding habits - all modules involve programming! I believe that habits are formed early in CS and must be tightly coupled with technical concepts in order for them to stick.
- Pair directly with existing CS 1 curriculum - CS 1 is already a busy course. You don’t need to set aside a month of new material. I believe that reflection and responsible computing pair directly with technical concepts already taught (conditionals, for loops, etc.).
What these activities are not:
- They are not a replacement for teaching students issues of cultural competency and identity. While computer scientists can (and should) point to those issues in class, most of us are not the experts. Students should be taking courses that directly speak to the structures of power that they will be introducing systems into (including gender / race / ethnicity / class / geography / etc.)
- They do not teach students what the correct design is. They prompt students to reflect on the human consequences of their decisions. Sometimes, students answer, "I'm not sure I can design this well enough to prevent harm." That's a great answer too. Choosing not to build something is okay.
Note: If you are looking for the old homepage of this site, click this link
Programming + Reflection Activities
[Conditionals] Developers as Decision-Makers
What are the consequences when we turn people into numeric scores for algorithms? Who benefits and who is disadvantaged by our decisions?
- Scenario: Develop a scoring algorithm to determine which classmates are prioritized for housing on campus. Students use a human-centered design process to reflect on the ways in which different scoring algorithms can advantage or harm different groups of people (a sketch of this kind of scoring code appears after this list).
- Practice: Conditionals (`if`/`elif`/`else`), input (`input()`), differences between strings and ints
- Material: Google Doc assn (2021) | Nifty Assignments 2020 Page
- Author: Evan Peck (Bucknell University)
- Context: 2 hour lab setting. Small student groups.
- Instructor Guidance: Guidance provided by Jaye Nias and Marty Wolf
- Supplementary Reading:
This assignment appeared as part of ACM SIGCSE’s Nifty Assignments track. You can cite that work with:
Nick Parlante, Julie Zelenski, John DeNero, Christopher Allsman, Tiffany Perumpail, Rahul Arya, Kavi Gupta, Catherine Cang, Paul Bitutsky, Ryan Moughan, David J. Malan, Brian Yu, Evan M. Peck, Carl Albing, Kevin Wayne, and Keith Schwarz. 2020. Nifty Assignments. In Proceedings of the 51st ACM Technical Symposium on Computer Science Education (SIGCSE ‘20). Association for Computing Machinery, New York, NY, USA, 1270–1271. DOI:https://doi.org/10.1145/3328778.3372574
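To give a flavor of the lab, here is a minimal sketch of the kind of scoring program students end up writing. The criteria, point values, and prompts below are hypothetical illustrations, not the assignment's rubric:

```python
# Hypothetical housing-priority scorer: criteria and weights are illustrative only.

def housing_score(class_year, credits_completed, miles_from_campus):
    """Return a numeric priority score for on-campus housing."""
    score = 0

    # Seniority: later class years earn more points.
    if class_year == "senior":
        score += 30
    elif class_year == "junior":
        score += 20
    elif class_year == "sophomore":
        score += 10

    # Academic progress.
    if credits_completed >= 90:
        score += 15
    elif credits_completed >= 60:
        score += 10

    # Students whose homes are far away get extra priority.
    if miles_from_campus > 100:
        score += 20

    return score


# input() returns a string, so numeric answers must be converted with int().
year = input("Class year (senior/junior/sophomore/first-year): ")
credits = int(input("Credits completed: "))
distance = int(input("Miles from campus to home: "))
print("Priority score:", housing_score(year, credits, distance))
```

The reflection then asks who this particular rubric rewards, who it quietly penalizes, and what a different (equally "reasonable") rubric would do instead.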
[Functions & Data types] Developers as Gatekeepers
What assumptions do we make about the people using our technology? What are the consequences of those assumptions, and who might we exclude? How do we capture diversity through design?
- Scenario: Collect and validate personal information of people visiting a university. Through designing form input and validation, students uncover assumptions they have made about the diversity of different aspects of identity, including name, address, and gender (a sketch of this kind of validation code appears after this list).
- Practice: data types, string and integer operations, Python functions, conditionals (`if`/`elif`/`else`)
- Material: Google Doc assn (2021) | old web-based assn (2019)
- Author: Justin Li (Occidental College), Adaptation by Evan Peck (Bucknell University)
- Instructor Guidance: Guidance provided by Colleen Greer
- Supplementary Reading:
- Falsehoods Programmers Believe about Names
- Falsehoods Programmers Believe about Addresses
- Falsehoods Programmers Believe about Geography
- Facebook suspends Native Americans over ‘real name’ policy
- Airport body scan machines flag transgender passengers as threats
- “Why are they all obsessed with gender?” - (Non)Binary Navigations Through Technological Infrastructures - by Katta Spiel
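For a sense of the coding work, here is a minimal sketch of a visitor-form validator in the spirit of this lab. The rules below are deliberately naive stand-ins for the assumptions students are asked to uncover; the function names and accepted options are illustrative, not part of the assignment:

```python
# Hypothetical visitor-information validator with deliberately naive rules.

def validate_name(name):
    """Naive check: assumes a first and last name made only of letters."""
    parts = name.split()
    return len(parts) >= 2 and all(part.isalpha() for part in parts)

def validate_zip_code(zip_code):
    """Naive check: assumes a five-digit US ZIP code."""
    return len(zip_code) == 5 and zip_code.isdigit()

def validate_gender(gender):
    """Naive check: assumes a fixed, binary list of options."""
    return gender.lower() in ("male", "female")

visitor_name = input("Full name: ")
visitor_zip = input("ZIP code: ")
visitor_gender = input("Gender: ")

if not validate_name(visitor_name):
    print("Rejected: name did not match our expectations.")
elif not validate_zip_code(visitor_zip):
    print("Rejected: address did not match our expectations.")
elif not validate_gender(visitor_gender):
    print("Rejected: gender did not match our expectations.")
else:
    print("Visitor record created for", visitor_name)
```

Students then identify who each rule excludes (mononyms, non-Latin scripts, non-US addresses, nonbinary visitors) and redesign the form accordingly.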
[For Loops & Lists] Developers as Future Makers
What does it mean to design a fair algorithm? What is the human cost of efficiency? What systemic advantages/disadvantages are your algorithms likely to amplify?
- Scenario: Develop an algorithm that filters job applications based on student grades. Students reflect on specific cases in which a human would very likely make a different decision than the algorithm. What was the cost of automation? (A sketch of this kind of filtering code appears after this list.)
- Practice: `for` loops, Python `list` operations
- Material: updated Google Doc assn (2021) | old web-based assn (2019)
- Author: Evan Peck (Bucknell University)
- Instructor Guidance: Guidance provided by Patrick Anderson and Jaye Nias
- Writeup: Ethical Design in CS 1: Building Hiring Algorithms in 1 Hour (Evan Peck)
- Supplementary Reading:
This assignment appeared as part of ACM SIGCSE’s Assignments that Blend Ethics and Technology special session. You can cite that work with:
Stacy A. Doore, Casey Fiesler, Michael S. Kirkpatrick, Evan Peck, and Mehran Sahami. 2020. Assignments that Blend Ethics and Technology. In Proceedings of the 51st ACM Technical Symposium on Computer Science Education (SIGCSE ‘20). Association for Computing Machinery, New York, NY, USA, 475–476. DOI:https://doi.org/10.1145/3328778.3366994
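Below is a minimal sketch of the kind of filtering loop students write. The applicant data format (a name paired with a list of grades) and the selection rule are illustrative assumptions, not the assignment's specification:

```python
# Hypothetical resume filter: select top applicants by average grade.

applicants = [
    ["Applicant A", [3.9, 3.7, 3.8, 4.0]],
    ["Applicant B", [2.1, 3.9, 4.0, 4.0]],   # rough first semester, then straight A's
    ["Applicant C", [3.5, 3.5, 3.4, 3.6]],
]

NUM_INTERVIEWS = 2

# Score each applicant by their average grade.
scored = []
for name, grades in applicants:
    total = 0
    for grade in grades:
        total += grade
    average = total / len(grades)
    scored.append([average, name])

# Sort from highest to lowest average and keep only the top candidates.
scored.sort(reverse=True)
selected = scored[:NUM_INTERVIEWS]

for average, name in selected:
    print(name, "selected with average", round(average, 2))
```

Edge cases like Applicant B drive the reflection: one rough semester drags down an otherwise strong record, and the average-based filter drops a candidate a human reviewer might well have interviewed.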
[Nested Loops & 2D Lists] Developers as Image Manipulators
How does representation in a dataset impact an algorithm’s outcome? Is it possible to create a representation that treats all people fairly? What are the possible implications of facial recognition software when it is used on historically marginalized groups?
- Scenario: This activity starts as a classic media-manipulation lab (changing RGB values in pixels). In the last portion of the lab, students are given a series of face images and write code to generate the average face of those images (a sketch of that averaging step appears after this list). In the following lecture, students reflect on what happens when we analyze the demographics of the data underlying our face-averaging algorithm. We use it as an introductory analogy for the shortcomings of training data in machine learning, and as an entry point for discussing facial recognition.
- Practice: 2D Python `list`s, nested `for` loops
- Material: Google Doc assn (2021)
- Author: Evan Peck (Bucknell University)
- Instructor Guidance: Guidance provided by Emanuelle Burton and Darakhshan Mir
- Supplementary Reading: I use some of the following material in a subsequent lecture where we reflect on the lab. Click this link to get a sense of that material
- Gender Shades - by Joy Buolamwini
- ACM US Technology Policy Committee Urges Suspension of Private and Governmental Use of Facial Recognition Technologies
- Facial Recognition Is Accurate, if You’re a White Guy
- Teachable Machine
- An Ethics of Artificial Intelligence Curriculum for Middle School Students
- Face Averager by Lisa DeBruine and Ben Jones
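Here is a minimal sketch of the face-averaging step, assuming each image has already been loaded as a 2D list of grayscale values (0-255) and that all images share the same dimensions; the lab itself works with RGB pixel values rather than this simplified form:

```python
# Hypothetical face-averaging step over grayscale images stored as 2D lists.

def average_images(images):
    """Return a 2D list where each pixel is the mean of that pixel across all images."""
    height = len(images[0])
    width = len(images[0][0])
    averaged = []

    for row in range(height):
        new_row = []
        for col in range(width):
            # Sum this pixel position across every image in the dataset.
            total = 0
            for image in images:
                total += image[row][col]
            new_row.append(total // len(images))
        averaged.append(new_row)

    return averaged

# Two tiny 2x2 "images" stand in for the face dataset.
faces = [
    [[100, 120], [140, 160]],
    [[200, 220], [60, 80]],
]
print(average_images(faces))   # [[150, 170], [100, 120]]
```

The reflection hinges on the `images` list itself: whatever demographics dominate that dataset dominate the "average" face the code produces.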
[Intro OOP] Developers as Prioritizers
What is ‘moral’ behavior in the context of a computer? How do we write code that is forced to assign value to people? What are the implications of our representation decisions?
- Scenario: Program a disaster-relief robot to prioritize which distressed people to save. This reframing of the Trolley Problem nudges students to reflect on issues of representation in their code (what are the problems with male/female representation? Should we even represent weight?), and to consider how individual decisions could amplify systemic biases if they were applied at scale (a sketch of this kind of prioritization code appears after this list).
- Practice: conditionals, use of APIs and objects, dictionaries (in optional last part)
- Material: updated Google Doc assn (2021) | old website version (2019)
- Author: Evan Peck (Bucknell University), parts of activity by Vinesh Kannan (Mimir HQ)
- Instructor Guidance: Guidance provided by Judy Goldsmith and Patrick Anderson
- Write-ups: Note: these reflections are based on an earlier version of the assignment, but should still communicate the philosophy.
- Supplementary Reading:
While not peer-reviewed, people have pointed to my reflection on Medium when looking to cite this work:
Evan Peck. 2017. The Ethical Engine: Integrating Ethical Design into Intro Computer Science. https://medium.com/bucknell-hci/the-ethical-engine-integrating-ethical-design-into-intro-to-computer-science-4f9874e756af
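Below is a minimal sketch of the prioritization decision at the heart of this activity. The `Person` class, its attributes, and the scoring rule are hypothetical stand-ins for the object API the assignment actually provides:

```python
# Hypothetical disaster-relief prioritizer: attributes and weights are illustrative only.

class Person:
    def __init__(self, age, is_injured):
        self.age = age
        self.is_injured = is_injured

def choose_group_to_save(group_one, group_two):
    """Return the group the robot prioritizes, based on a simple scoring rule."""
    def group_score(group):
        score = 0
        for person in group:
            score += 10 if person.is_injured else 5   # weight injured people more
            if person.age < 18:
                score += 5                             # weight children more
        return score

    return group_one if group_score(group_one) >= group_score(group_two) else group_two

left = [Person(age=8, is_injured=False), Person(age=40, is_injured=False)]
right = [Person(age=70, is_injured=True)]
saved = choose_group_to_save(left, right)
print("Robot chose the group with", len(saved), "people")
```

Every line of `group_score` encodes a value judgment, which is exactly what students are asked to surface: what the code chooses to represent about a person, and how those choices play out when the same rule is applied to everyone.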
License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License