
Commit 6ee5478: "Near-final update"
1 parent 575ef07 commit 6ee5478

File tree: 7 files changed (+106 / -149 lines)

_site/feed.xml

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
-<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.7.3">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2021-06-17T11:01:11-04:00</updated><id>http://localhost:4000/</id><title type="html">Ethical CS</title><subtitle></subtitle><entry><title type="html">Welcome to Jekyll!</title><link href="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html" rel="alternate" type="text/html" title="Welcome to Jekyll!" /><published>2017-08-04T10:04:48-04:00</published><updated>2017-08-04T10:04:48-04:00</updated><id>http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll</id><content type="html" xml:base="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html">&lt;p&gt;You’ll find this post in your &lt;code class=&quot;highlighter-rouge&quot;&gt;_posts&lt;/code&gt; directory. Go ahead and edit it and re-build the site to see your changes. You can rebuild the site in many different ways, but the most common way is to run &lt;code class=&quot;highlighter-rouge&quot;&gt;jekyll serve&lt;/code&gt;, which launches a web server and auto-regenerates your site when a file is updated.&lt;/p&gt;
+<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.7.3">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2021-06-30T12:47:00-04:00</updated><id>http://localhost:4000/</id><title type="html">Ethical CS</title><subtitle></subtitle><entry><title type="html">Welcome to Jekyll!</title><link href="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html" rel="alternate" type="text/html" title="Welcome to Jekyll!" /><published>2017-08-04T10:04:48-04:00</published><updated>2017-08-04T10:04:48-04:00</updated><id>http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll</id><content type="html" xml:base="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html">&lt;p&gt;You’ll find this post in your &lt;code class=&quot;highlighter-rouge&quot;&gt;_posts&lt;/code&gt; directory. Go ahead and edit it and re-build the site to see your changes. You can rebuild the site in many different ways, but the most common way is to run &lt;code class=&quot;highlighter-rouge&quot;&gt;jekyll serve&lt;/code&gt;, which launches a web server and auto-regenerates your site when a file is updated.&lt;/p&gt;
 
 &lt;p&gt;To add new posts, simply add a file in the &lt;code class=&quot;highlighter-rouge&quot;&gt;_posts&lt;/code&gt; directory that follows the convention &lt;code class=&quot;highlighter-rouge&quot;&gt;YYYY-MM-DD-name-of-post.ext&lt;/code&gt; and includes the necessary front matter. Take a look at the source for this post to get an idea about how it works.&lt;/p&gt;
 
_site/img/banner.png (72.1 KB)

_site/img/top.png (41.8 KB)

_site/index-new.html

Lines changed: 67 additions & 77 deletions
Large diffs are not rendered by default.

img/banner.png (72.1 KB)

img/top.png (41.8 KB)

index-new.md

Lines changed: 38 additions & 71 deletions
@@ -5,34 +5,39 @@
 layout: home
 exclude: true
 ---
-
 # Ethical Reflection Modules for CS 1
 - [Evan M. Peck](http://www.eg.bucknell.edu/~emp017/), Associate Prof. of Computer Science, Bucknell University
 - [email me](mailto:evan.peck@bucknell.edu) \| [find me on Twitter](https://twitter.com/evanmpeck) \| [visit my website](http://www.eg.bucknell.edu/~emp017/)
 
+![top logo](img/banner.png "diagram depicting the step-by-step process of replacing pairs of items during the shell sorting algorithm")
+<sup>Image by [Balu Ertl](https://commons.wikimedia.org/w/index.php?curid=38531293)</sup>
+
+| Activity Quick Link | Programming Topic |
+| ----------- | ----------- |
+| [Developers as Decision-Makers](#decision-makers) | Conditionals |
+| [Developers as Gatekeepers](#gatekeepers) | Functions & Data types |
+| [Developers as Future Makers](#future-makers) | For Loops & Lists |
+| [Developers as Image Manipulators](#manipulators) | Nested Loops & 2D Lists |
+| [Developers as Prioritizers](#prioritizers) | OOP / APIs |
+
 In Fall 2019, I redesigned our CS 1 course to integrate practice-based (coding!) reflection directly with technical concepts. This is a space to share those activities. Their goal is to:
 1. Introduce **a deeper level of reflection in CS 1 courses**. I want students to see that their actions either directly or indirectly impact people, communities, and cultures, and that this impact is often not felt equally by different groups of people (along lines of gender, race, class, geography, etc.).
 2. **Develop reflection habits _alongside_ coding habits** - all modules involve programming! I believe that habits are formed early in CS and must be tightly coupled with technical concepts in order for them to stick.
 3. **Pair directly with _existing_ CS 1 curriculum** - CS 1 is already a busy course. You don't need to set aside a month of new material. I believe that reflection and responsible computing pair directly with technical concepts already taught (conditionals, for loops, etc.)
 
 What these activities are **not**:
-- They are **not** a replacement for teaching students issues of cultural competency and identity. While computer scientists can (and should) point to those issues in class, we are _not_ the experts. Students should be taking courses that directly speak to the structures of power that they will be introducing systems to (including gender / race / ethnicity / class / geography / etc.)
-- They do **not** teach students what the _correct_ design is. They prompt students to reflect on the human consequences of their decisions. Sometimes, students answer _I'm not sure I can design this well enough to prevent harm_. That's a great answer too. Choosing to _not_ build something is okay.
+- They are **not** a replacement for teaching students issues of [cultural competency](https://dl.acm.org/doi/abs/10.1145/3328778.3366792) and identity. While computer scientists can (and should) point to those issues in class, most of us are _not_ the experts. Students should be taking courses that directly speak to the structures of power that they will be introducing systems into (including gender / race / ethnicity / class / geography / etc.)
+- They do **not** teach students what the _correct_ design is. They prompt students to reflect on the human consequences of their decisions. Sometimes, students answer _I'm not sure I can design this well enough to prevent harm_. That's a great answer too. Choosing _not_ to build something is okay.
 
 _Note: If you are looking for the old homepage of this site, [click this link](archive/old-index.html)_
 
-## Activity List
-
-- [[Conditionals] Developers as Decision-Makers](#decision-makers)
-  - characteristics
-  - test
 
 ------------
 
 # Programming + Reflection Activities
 
 ## <a name="decision-makers">**[Conditionals]** Developers as Decision-Makers</a>
-![housing algorithms](img/housing.png)
+![housing algorithms](img/housing.png "photo of a row of houses")
 _What are the consequences when we turn people into numeric scores for algorithms? Who benefits and who is disadvantaged by our decisions?_
 
 - **Scenario:** Develop a scoring algorithm to determine which classmates are prioritized for housing on campus. Students use a human-centered design process to reflect on the ways in which different scoring algorithms can advantage or harm different groups of people.
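The scoring algorithms students build in this activity are ordinary conditional logic. A minimal sketch of the idea: the function name, the attributes, and every weight below are invented for illustration (the activity deliberately does not prescribe a rubric, because debating the rubric is the point):

```python
def housing_score(distance_from_home, class_year, gpa):
    """Toy rubric: one of many possible (and contestable) scoring designs."""
    score = 0
    # Favor students whose permanent home is far from campus (miles)
    if distance_from_home > 500:
        score += 3
    elif distance_from_home > 100:
        score += 1
    # Favor seniority
    if class_year >= 4:
        score += 2
    # Should academic performance affect housing at all? Students debate this.
    if gpa >= 3.5:
        score += 1
    return score

print(housing_score(distance_from_home=800, class_year=2, gpa=3.0))  # 3
```

Each threshold and weight advantages some group of students and disadvantages another, which is exactly the reflection the activity asks for.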
@@ -51,8 +56,8 @@ This assignment appeared as part of [_ACM SIGCSE'S Nifty Assignments_](https://d
 
 --------------------
 
-## **[Functions & Data-types]** Developers as Gatekeepers
-![input validation](img/university.jpg)
+## <a name="gatekeepers">**[Functions & Data types]** Developers as Gatekeepers</a>
+![input validation](img/university.jpg "photo of a worn page with large typed words - the word 'university' is focused")
 _What assumptions do we make about the people using our technology? What are the consequences of those assumptions? Who might we exclude? How do we capture diversity through design?_
 - **Scenario:** Collect and validate personal information of people visiting a university. Through designing form input and validation, students uncover assumptions they have made about the diversity of different aspects of identity, including name, address, and gender.
 - **Practice:** data types, string and integer operations, python functions, conditionals (`if/elif/else`)
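To make the "assumptions" concrete: validation code of the kind students write here encodes beliefs about what a name or a gender can look like. A hypothetical sketch (these helper functions are not from the assignment):

```python
def validate_name(name):
    # A "reasonable-looking" rule that silently excludes many real names:
    # apostrophes (O'Neil), hyphens (Smith-Jones), spaces, and
    # single-character names all fail this check.
    return name.isalpha() and len(name) >= 2

def validate_gender(gender):
    # Accepting free text (rather than an M/F dropdown) avoids forcing
    # every visitor into a binary category.
    return len(gender.strip()) > 0

print(validate_name("O'Neil"))        # False: the rule rejects a common surname
print(validate_gender("non-binary"))  # True
```

The interesting classroom question is not whether the code runs, but who the first function turns away.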
@@ -64,11 +69,12 @@ _What assumptions do we make about the people using our technology? What are the
   - [Falsehoods Programmers Believe about Geography](https://wiesmann.codiferes.net/wordpress/?p=15187)
   - [Facebook suspends Native Americans over 'real name' policy](https://www.theguardian.com/technology/2015/feb/16/facebook-real-name-policy-suspends-native-americans)
   - [Airport body scan machines flag transgender passengers as threats](http://time.com/4044914/transgender-tsa-body-scan/)
+  - ["Why are they all obsessed with gender?" - (Non)Binary Navigations Through Technological Infrastructures - by Katta Spiel](https://www.youtube.com/watch?v=ISTWLqChfkg)
 
 --------------------
 
-## **[For Loops & Lists]** Developers as Future Makers
-![ethical hiring](img/hiring.jpg)
+## <a name="future-makers">**[For Loops & Lists]** Developers as Future Makers</a>
+![ethical hiring](img/hiring.jpg "photo of a row of sitting people in suits who appear to be waiting for something")
 _What does it mean to design a fair algorithm? What is the human cost of efficiency? What systemic advantages/disadvantages are your algorithms likely to amplify?_
 - **Scenario:** Develop an algorithm that filters job applications based on student grades. Students reflect on specific cases in which a human would very likely make a different decision than the algorithm. What was the cost of automation?
 - **Practice:** `for` loops, python `list` operations
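The filtering algorithm at the heart of this scenario is a few lines of list iteration. A minimal sketch under invented names and data (the assignment's actual applicant records and cutoff differ):

```python
def filter_applicants(applicants, cutoff):
    """Keep applicants whose GPA meets the cutoff: efficient, but blunt."""
    selected = []
    for name, gpa in applicants:
        if gpa >= cutoff:
            selected.append(name)
    return selected

# Grace misses the cutoff by 0.01; a human reader might well decide differently.
applicants = [("Ada", 3.9), ("Grace", 3.49), ("Alan", 3.6)]
print(filter_applicants(applicants, 3.5))  # ['Ada', 'Alan']
```

The hard-edged `>=` is exactly where students find the cases a human would have judged differently.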
@@ -77,6 +83,7 @@ _What does it mean to design a fair algorithm? What is the human cost of efficie
 - **Writeup:** [Ethical Design in CS 1: Building Hiring Algorithms in 1 Hour (Evan Peck)](https://medium.com/bucknell-hci/ethical-design-in-cs-1-building-hiring-algorithms-in-1-hour-41d8c913859f)
 - **Supplementary Reading:**
   - [Amazon scraps secret AI recruiting tool that showed bias against women](https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G)
+  - [Fired by Bot at Amazon: 'It’s You Against the Machine'](https://www.bloomberg.com/news/features/2021-06-28/fired-by-bot-amazon-turns-to-machine-managers-and-workers-are-losing-out)
   - [Hiring Algorithms are Not Neutral (Gideon Mann and Cathy O'Neil)](https://hbr.org/2016/12/hiring-algorithms-are-not-neutral)
   - [Can an Algorithm Hire Better Than a Human?](https://www.nytimes.com/2015/06/26/upshot/can-an-algorithm-hire-better-than-a-human.html)
   - [Now Algorithms Are Deciding Whom to Hire, Based on Voice](https://www.npr.org/sections/alltechconsidered/2015/03/23/394827451/now-algorithms-are-deciding-whom-to-hire-based-on-voice)
@@ -87,23 +94,26 @@ This assignment appeared as part of [_ACM SIGCSE'S Assignments that Blend Ethics
 
 --------------------
 
-## **[Nested Loops & 2D Lists]** Developers as Media Manipulators
-![averaging faces](img/faces.png)
+## <a name="manipulators">**[Nested Loops & 2D Lists]** Developers as Image Manipulators</a>
+![averaging faces](img/faces.png "3 images created by averaging the images of other faces")
 _How does representation in a dataset impact an algorithm's outcome? Is it possible to create a representation that treats all people fairly? What are the possible implications of facial recognition software when it is used on historically marginalized groups?_
 
 - **Scenario:** This activity starts as a classic media manipulation lab (changing RGB values in pixels). In the last portion of the lab, students are given a series of face images and write code to generate the _average_ face of those images. In the following lecture, students reflect on what happens when we analyze the demographics of the data underlying our face-averaging algorithm. We use it as an introductory analogy to the shortcomings of training data in machine learning, and an entry point to talk about face recognition.
 - **Practice:** 2D python `list`, nested `for` loops
-- **Material:**
-- **Writeup:** [follow-up reflection](https://twitter.com/evanmpeck/status/1307043732676644864)
-- **Supplementary Material:**
-
-
-
+- **Material:** [Google Doc Assn (2021)](https://drive.google.com/drive/folders/19-2_YE2NiZQ7FvyOxKgKQ4PPHqkU1Imn?usp=sharing)
+- **Author:** [Evan Peck (Bucknell University)](http://www.eg.bucknell.edu/~emp017/)
+- **Supplementary Reading:** I use some of the following material in a subsequent lecture where we reflect on the lab. [Click this link to get a sense of that material](https://twitter.com/evanmpeck/status/1307043732676644864)
+  - [Gender Shades - by Joy Buolamwini](https://www.youtube.com/watch?v=rWMLcNaWfe0)
+  - [ACM US Technology Policy Committee Urges Suspension of Private and Governmental Use of Facial Recognition Technologies](https://www.acm.org/binaries/content/assets/public-policy/ustpc-facial-recognition-tech-statement.pdf)
+  - [Facial Recognition Is Accurate, if You’re a White Guy](https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html)
+  - [Teachable Machine](https://teachablemachine.withgoogle.com/train)
+  - [An Ethics of Artificial Intelligence Curriculum for Middle School Students](https://docs.google.com/document/d/1e9wx9oBg7CR0s5O7YnYHVmX7H7pnITfoDxNdrSGkp60)
+  - [Face Averager by Lisa DeBruine and Ben Jones](http://faceresearch.org/demos/average)
 
 
 ------------------------
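The face-averaging step above reduces to averaging pixel values across images with nested loops. A minimal grayscale sketch (the lab works with RGB images; `average_image` and the toy 2x2 data below are invented here for illustration):

```python
def average_image(images):
    """Average several same-sized grayscale images (2D lists of 0-255 ints)."""
    height, width = len(images[0]), len(images[0][0])
    avg = [[0] * width for _ in range(height)]
    for row in range(height):          # nested loops visit every pixel
        for col in range(width):
            total = 0
            for img in images:         # sum this pixel across all images
                total += img[row][col]
            avg[row][col] = total // len(images)
    return avg

faces = [[[100, 200], [50, 0]],
         [[200, 100], [150, 0]]]
print(average_image(faces))  # [[150, 150], [100, 0]]
```

The code is demographics-blind: the "average face" it produces is entirely determined by who is (and is not) in `images`, which is the lesson the follow-up lecture draws out.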
-## **[Intro OOP]** Developers as Prioritizers
-![rescue](modules/ethicalengine1/img/people.jpg)
+## <a name="prioritizers">**[Intro OOP]** Developers as Prioritizers</a>
+![rescue](modules/ethicalengine1/img/people.jpg "photo of a busy urban street filled with people. Many are blurred to show movement and activity")
 _What is 'moral' behavior in the context of a computer? How do we write code that is forced to assign value to people? What are the implications of our representation decisions?_
 - **Scenario:** Program a disaster-relief robot to prioritize which distressed people to save. This reframing of the Trolley Problem nudges students to reflect on issues of representation in their code (what are the problems with male/female representation? Should we even represent weight?), and consider how individual decisions could amplify systemic biases if used at scale.
 - **Practice:** conditionals, use of APIs and objects, dictionaries (in optional last part)
@@ -112,61 +122,18 @@ _What is 'moral' behavior in the context of a computer? How do we write code tha
 - **Write ups:** _Note:_ these reflections are based on an earlier version of the assignment, but should still communicate the philosophy.
   - [The Ethical Engine: Integrating Ethical Design into Intro Computer Science (Evan Peck)](https://medium.com/bucknell-hci/the-ethical-engine-integratingethical-design-into-intro-to-computer-science-4f9874e756af)
   - [Write Up the Ethical Engine Lab (Justin Li)](https://howtostartacsdept.wordpress.com/2018/01/13/step-86-write-up-the-ethical-engine-lab/)
+- **Supplementary Reading:**
+  - [When binary code won’t accommodate non-binary people](https://slate.com/technology/2019/10/gender-binary-nonbinary-code-databases-values.html)
+  - [Can you make AI fairer than a judge? Play our AI courtroom game](https://www.technologyreview.com/s/613508/ai-fairer-than-judge-criminal-risk-assessment-algorithm/)
+  - [Machine Bias: there’s software used across the country to predict future criminals. And it’s biased against blacks.](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing)
+  - [Can an algorithm tell when kids are in danger?](https://www.nytimes.com/2018/01/02/magazine/can-an-algorithm-tell-when-kids-are-in-danger.html)
 
 While not peer-reviewed, people have pointed to my reflection on Medium when looking to **cite this work**:
 > Evan Peck. 2017. The Ethical Engine: Integrating Ethical Design into Intro Computer Science. https://medium.com/bucknell-hci/the-ethical-engine-integratingethical-design-into-intro-to-computer-science-4f9874e756af
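The representation-and-prioritization tension in this activity can be sketched in a few lines of OOP. Everything below (the `Person` attributes, the weights, the `rescue_priority` policy) is invented for illustration and is not the lab's actual API:

```python
class Person:
    """A deliberately minimal representation of a person.
    Every attribute included (or omitted) is a value judgment."""
    def __init__(self, age, profession):
        self.age = age
        self.profession = profession

def rescue_priority(person):
    # A toy policy: each branch encodes a moral claim about whose life
    # matters more. Students must defend (or refuse to write) each line.
    priority = 1.0
    if person.age < 12:
        priority += 1.0   # prioritize children?
    if person.profession == "doctor":
        priority += 0.5   # prioritize "useful" professions?
    return priority

print(rescue_priority(Person(8, "student")))  # 2.0
print(rescue_priority(Person(40, "doctor")))  # 1.5
```

Even this tiny sketch shows how quietly the stakes scale: a policy written once in `rescue_priority` would be applied identically to every person the robot ever encounters.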
 
-<!-- ## [Hiring Algorithms: Developers as Decision-Makers](modules/hiring)
-
-![ethical hiring](modules/hiring/img/hiring.jpg)
-
-_What does it mean to design a fair algorithm? What is the human cost of efficiency? What systemic advantages/disadvantages are your algorithms likely to amplify?_
-- *Scenario:* Develop an algorithm that filters job applications based on GPA
-- [Material](modules/hiring)
-- *Practice:* loops, conditionals, python lists
-- *Writeup:* [Ethical Design in CS 1: Building Hiring Algorithms in 1 Hour (Evan Peck)](https://medium.com/bucknell-hci/ethical-design-in-cs-1-building-hiring-algorithms-in-1-hour-41d8c913859f)
-- *Author:* [Evan Peck (Bucknell University)](http://www.eg.bucknell.edu/~emp017/) -->
-
---------------------
-<!-- ## [Input Validation: Developers as Gatekeepers](modules/input)
-![university](modules/input/img/university.jpg)
-
-- *Scenario:* Collect and validate personal information of people visiting a university
-- [Material](modules/input)
-- *Practice:* conditionals, functions, data types
-- *Author:* [Justin Li (Occidental College)](https://justinnhli.com/), Adapted by [Evan Peck (Bucknell University)](http://www.eg.bucknell.edu/~emp017/) -->
-
---------------------
-## [Ethical Engine 1: Developers as Definers of Identity](modules/ethicalengine1)
-![rescue](modules/ethicalengine1/img/people.jpg)
-
-_How can we adequately represent people in code? What characteristics of people should we **NOT** include in code? What are the implications of our representation decisions?_
-
-- *Scenario:* In code, represent a person so that autonomous cars can make life-critical decisions
-- [Material](modules/ethicalengine1)
-- *Practice:* OOP design, data types
-- *Author:* [Evan Peck (Bucknell University)](http://www.eg.bucknell.edu/~emp017/)
-
---------------------
-## [Ethical Engine: Developers as Moral Arbiters](modules/ethicalengine2)
-![rescue](modules/ethicalengine2/img/rescue.jpg)
-
-_What is 'moral' behavior in the context of a computer? How do we write code that is forced to assign value to people?_
-- *Scenario:* Program a disaster-relief robot to prioritize which distressed people to saves
-- [Material](modules/ethicalengine2)
-- *Practice:* conditionals, use of APIs and objects, dictionaries (in optional last part)
-- *Write ups:*
-  - [The Ethical Engine: Integrating Ethical Design into Intro Computer Science (Evan Peck)](https://medium.com/bucknell-hci/ethical-design-in-cs-1-building-hiring-algorithms-in-1-hour-41d8c913859f)
-  - [Write Up the Ethical Engine Lab (Justin Li)](https://howtostartacsdept.wordpress.com/2018/01/13/step-86-write-up-the-ethical-engine-lab/)
-- *Author:* [Evan Peck (Bucknell University)](http://www.eg.bucknell.edu/~emp017/), parts of activity by [Vinesh Kannan (Mimir HQ)](https://github.com/vingkan)
-
----------------------
-
 
 
 
-<!-- ## Other Resources
--->
 ---------------------
 
 ## License
