_site/feed.xml (1 addition, 1 deletion)
@@ -1,4 +1,4 @@
-<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.7.3">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2021-06-17T11:01:11-04:00</updated><id>http://localhost:4000/</id><title type="html">Ethical CS</title><subtitle></subtitle><entry><title type="html">Welcome to Jekyll!</title><link href="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html" rel="alternate" type="text/html" title="Welcome to Jekyll!" /><published>2017-08-04T10:04:48-04:00</published><updated>2017-08-04T10:04:48-04:00</updated><id>http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll</id><content type="html" xml:base="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html"><p>You’ll find this post in your <code class="highlighter-rouge">_posts</code> directory. Go ahead and edit it and re-build the site to see your changes. You can rebuild the site in many different ways, but the most common way is to run <code class="highlighter-rouge">jekyll serve</code>, which launches a web server and auto-regenerates your site when a file is updated.</p>
+<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.7.3">Jekyll</generator><link href="http://localhost:4000/feed.xml" rel="self" type="application/atom+xml" /><link href="http://localhost:4000/" rel="alternate" type="text/html" /><updated>2021-06-30T12:47:00-04:00</updated><id>http://localhost:4000/</id><title type="html">Ethical CS</title><subtitle></subtitle><entry><title type="html">Welcome to Jekyll!</title><link href="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html" rel="alternate" type="text/html" title="Welcome to Jekyll!" /><published>2017-08-04T10:04:48-04:00</published><updated>2017-08-04T10:04:48-04:00</updated><id>http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll</id><content type="html" xml:base="http://localhost:4000/jekyll/update/2017/08/04/welcome-to-jekyll.html"><p>You’ll find this post in your <code class="highlighter-rouge">_posts</code> directory. Go ahead and edit it and re-build the site to see your changes. You can rebuild the site in many different ways, but the most common way is to run <code class="highlighter-rouge">jekyll serve</code>, which launches a web server and auto-regenerates your site when a file is updated.</p>
<p>To add new posts, simply add a file in the <code class="highlighter-rouge">_posts</code> directory that follows the convention <code class="highlighter-rouge">YYYY-MM-DD-name-of-post.ext</code> and includes the necessary front matter. Take a look at the source for this post to get an idea about how it works.</p>
index-new.md (38 additions, 71 deletions)
@@ -5,34 +5,39 @@
layout: home
exclude: true
---
-
# Ethical Reflection Modules for CS 1
-[Evan M. Peck](http://www.eg.bucknell.edu/~emp017/), Associate Prof. of Computer Science, Bucknell University
-[email me](mailto:evan.peck@bucknell.edu)\|[find me on Twitter](https://twitter.com/evanmpeck)\|[visit my website](http://www.eg.bucknell.edu/~emp017/)

+[image]
+<sup>Image by [Balu Ertl](https://commons.wikimedia.org/w/index.php?curid=38531293)</sup>
+
+| Activity Quick Link | Programming Topic |
+| ----------- | ----------- |
+|[Developers as Decision-Makers](#decision-makers)| Conditionals |
+|[Developers as Gatekeepers](#gatekeepers)| Functions & Data types |
+|[Developers as Future Makers](#future-makers)| For Loops & Lists |
+|[Developers as Image Manipulators](#manipulators)| Nested Loops & 2D Lists |
+|[Developers as Prioritizers](#prioritizers)| OOP / APIs |
+
In Fall 2019, I redesigned our CS 1 course to integrate practice-based (coding!) reflection directly with technical concepts. This is a space to share those activities. Their goal is to:
1. Introduce **a deeper level of reflection in CS 1 courses**. I want students to see that their actions either directly or indirectly impact people, communities, and cultures, and that this impact is often not felt equally by different groups of people (along lines of gender, race, class, geography, etc.).
2. **Develop reflection habits _alongside_ coding habits** - all modules involve programming! I believe that habits are formed early in CS and must be tightly coupled with technical concepts in order for them to stick.
3. **Pair directly with _existing_ CS 1 curriculum** - CS 1 is already a busy course. You don't need to set aside a month of new material. I believe that reflection and responsible computing pair directly with technical concepts already taught (conditionals, for loops, etc.)
What these activities are **not**:
-- They are **not** a replacement for teaching students issues of cultural competency and identity. While computer scientists can (and should) point to those issues in class, we are _not_ the experts. Students should be taking courses that directly speak to the structures of power that they will be introducing systems to (including gender / race / ethnicity / class / geography / etc.)
-- They do **not** teach students what the _correct_ design is. They prompt students to reflect on the human consequences of their decisions. Sometimes, students answer _I'm not sure I can design this well enough to prevent harm_. That's a great answer too. Choosing to _not_ build something is okay.
+- They are **not** a replacement for teaching students issues of [cultural competency](https://dl.acm.org/doi/abs/10.1145/3328778.3366792) and identity. While computer scientists can (and should) point to those issues in class, most of us are _not_ the experts. Students should be taking courses that directly speak to the structures of power that they will be introducing systems into (including gender / race / ethnicity / class / geography / etc.)
+- They do **not** teach students what the _correct_ design is. They prompt students to reflect on the human consequences of their decisions. Sometimes, students answer _I'm not sure I can design this well enough to prevent harm_. That's a great answer too. Choosing _not_ to build something is okay.
_Note: If you are looking for the old homepage of this site, [click this link](archive/old-index.html)_

-## Activity List
-
--[[Conditionals] Developers as Decision-Makers](#decision-makers)
-- characteristics
-- test
------------
# Programming + Reflection Activities

## <a name="decision-makers">**[Conditionals]** Developers as Decision-Makers</a>
-[image]
+[image]
_What are the consequences when we turn people into numeric scores for algorithms? Who benefits and who is disadvantaged by our decisions?_
-**Scenario:** Develop a scoring algorithm to determine which classmates are prioritized for housing on campus. Students use a human-centered design process to reflect on the ways in which different scoring algorithms can advantage or harm different groups of people.
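To make the flavor of the activity concrete, here is a minimal sketch of the kind of scorer students end up writing. The field names (`class_year`, `distance_km`, `on_financial_aid`) and point values are hypothetical, invented for illustration, not taken from the assignment itself:

```python
# Hypothetical housing-priority scorer. Every field and weight below is an
# invented example; the assignment has students pick their own criteria and
# then interrogate who each choice advantages or harms.
def housing_priority(student: dict) -> int:
    score = 0
    if student["class_year"] >= 4:        # favors seniors
        score += 3
    if student["distance_km"] > 50:       # favors students who live far away
        score += 2
    if student["on_financial_aid"]:       # favors students with financial need
        score += 2
    return score

applicants = [
    {"name": "Ana", "class_year": 2, "distance_km": 12,  "on_financial_aid": True},
    {"name": "Ben", "class_year": 4, "distance_km": 300, "on_financial_aid": False},
]
# Highest score is offered housing first: each `if` is a value judgment.
for s in sorted(applicants, key=housing_priority, reverse=True):
    print(s["name"], housing_priority(s))
```

The point of the exercise is that each conditional quietly decides who gets housed; changing one threshold reorders real people.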
@@ -51,8 +56,8 @@ This assignment appeared as part of [_ACM SIGCSE'S Nifty Assignments_](https://d
--------------------

-## **[Functions & Data-types]** Developers as Gatekeepers
-[image]
+## <a name="gatekeepers">**[Functions & Datatypes]** Developers as Gatekeepers</a>
+[image]
_What assumptions do we make about the people using our technology? What are the consequences of those assumptions, and who might we exclude? How do we capture diversity through design?_
-**Scenario:** Collect and validate personal information of people visiting a university. Through designing form input and validation, students uncover assumptions they have made about the diversity of different aspects of identity, including name, address, and gender.
-**Practice:** data types, string and integer operations, python functions, conditionals (`if/elif/else`)
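As one hedged sketch of where the validation exercise leads (the function and its rules are my invention, not the assignment's handout), each "reasonable" check below encodes an assumption that rejects a real person:

```python
# Hypothetical visitor-form validator. Each rule looks sensible in isolation,
# but each encodes an assumption about names, addresses, or gender that
# excludes real people. Rules are invented for illustration only.
def validate_visitor(name: str, postal_code: str, gender: str) -> list:
    problems = []
    if len(name.split()) < 2:                  # assumes everyone has 2+ names
        problems.append("enter a first and last name")
    if not (postal_code.isdigit() and len(postal_code) == 5):
        problems.append("postal code must be 5 digits")  # assumes US ZIP codes
    if gender not in ("M", "F"):               # assumes binary gender
        problems.append("gender must be M or F")
    return problems

# A mononymous, nonbinary visitor with a UK postcode fails every check:
print(validate_visitor("Sukhbir", "SW1A 1AA", "X"))
```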
@@ -64,11 +69,12 @@ _What assumptions do we make about the people using our technology? What are the
-[Falsehoods Programmers Believe about Geography](https://wiesmann.codiferes.net/wordpress/?p=15187)
-[Facebook suspends Native Americans over 'real name' policy](https://www.theguardian.com/technology/2015/feb/16/facebook-real-name-policy-suspends-native-americans)
-[Airport body scan machines flag transgender passengers as threats](http://time.com/4044914/transgender-tsa-body-scan/)
+-["Why are they all obsessed with gender?" - (Non)Binary Navigations Through Technological Infrastructures - by Katta Spiel](https://www.youtube.com/watch?v=ISTWLqChfkg)
--------------------

-## **[For Loops & Lists]** Developers as Future Makers
-[image]
+## <a name="future-makers">**[For Loops & Lists]** Developers as Future Makers</a>
+[image]
_What does it mean to design a fair algorithm? What is the human cost of efficiency? What systemic advantages/disadvantages are your algorithms likely to amplify?_
-**Scenario:** Develop an algorithm that filters job applications based on student grades. Students reflect on specific cases in which a human would very likely make a different decision than the algorithm. What was the cost of automation?
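A minimal sketch of the kind of filter students write, assuming hypothetical applicant data and an invented cutoff (neither comes from the assignment): loop over a list of applicants and keep anyone at or above a GPA threshold.

```python
# Hypothetical GPA filter. Data and cutoff are invented; the reflection asks
# what context the cutoff throws away (upward trends, courseloads, jobs...).
applicants = [
    {"name": "Ria",  "gpa": 3.9},
    {"name": "Sam",  "gpa": 2.9},   # worked 30 hrs/week; grades trending upward
    {"name": "Drew", "gpa": 3.1},
]

def filter_by_gpa(pool, cutoff):
    keep = []
    for applicant in pool:          # the module's for-loop-over-a-list pattern
        if applicant["gpa"] >= cutoff:
            keep.append(applicant)
    return keep

print([a["name"] for a in filter_by_gpa(applicants, 3.0)])
# ['Ria', 'Drew'] -- Sam is screened out before any human sees the file
```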
@@ -77,6 +83,7 @@ _What does it mean to design a fair algorithm? What is the human cost of efficie
-**Writeup:** [Ethical Design in CS 1: Building Hiring Algorithms in 1 Hour (Evan Peck)](https://medium.com/bucknell-hci/ethical-design-in-cs-1-building-hiring-algorithms-in-1-hour-41d8c913859f)
-**Supplementary Reading:**
-[Amazon scraps secret AI recruiting tool that showed bias against women](https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G)
+-[Fired by Bot at Amazon: 'It’s You Against the Machine'](https://www.bloomberg.com/news/features/2021-06-28/fired-by-bot-amazon-turns-to-machine-managers-and-workers-are-losing-out)
-[Hiring Algorithms are Not Neutral (Gideon Mann and Cathy O'Neil)](https://hbr.org/2016/12/hiring-algorithms-are-not-neutral)
-[Can an Algorithm Hire Better Than a Human?](https://www.nytimes.com/2015/06/26/upshot/can-an-algorithm-hire-better-than-a-human.html)
-[Now Algorithms Are Deciding Whom to Hire, Based on Voice](https://www.npr.org/sections/alltechconsidered/2015/03/23/394827451/now-algorithms-are-deciding-whom-to-hire-based-on-voice)
@@ -87,23 +94,26 @@ This assignment appeared as part of [_ACM SIGCSE'S Assignments that Blend Ethics
--------------------

-## **[Nested Loops & 2D Lists]** Developers as Media Manipulators
-[image]
+## <a name="manipulators">**[Nested Loops & 2D Lists]** Developers as Image Manipulators</a>
+[image]
_How does representation in a dataset impact an algorithm's outcome? Is it possible to create a representation that treats all people fairly? What are the possible implications of facial recognition software when it is used on historically marginalized groups?_
-**Scenario:** This activity starts as a classic media manipulation lab (changing RGB values in pixels). In the last portion of the lab, students are given a series of face images and write code to generate the _average_ face of those images. In the following lecture, students reflect on what happens when we analyze the demographics of the data underlying our face-averaging algorithm. We use it as an introductory analogy to the shortcomings of training data in machine learning, and an entry point to talk about face recognition. _(A minimal sketch of the averaging step follows the reading list below.)_
-**Supplementary Reading:** I use some of the following material in a subsequent lecture where we reflect on the lab. [Click this link to get a sense of that material](https://twitter.com/evanmpeck/status/1307043732676644864)
+-[Gender Shades - by Joy Buolamwini](https://www.youtube.com/watch?v=rWMLcNaWfe0)
+-[ACM US Technology Policy Committee Urges Suspension of Private and Governmental Use of Facial Recognition Technologies](https://www.acm.org/binaries/content/assets/public-policy/ustpc-facial-recognition-tech-statement.pdf)
+-[Facial Recognition Is Accurate, if You’re a White Guy](https://www.nytimes.com/2018/02/09/technology/facial-recognition-race-artificial-intelligence.html)
-[An Ethics of Artificial Intelligence Curriculum for Middle School Students](https://docs.google.com/document/d/1e9wx9oBg7CR0s5O7YnYHVmX7H7pnITfoDxNdrSGkp60)
+-[Face Averager by Lisa DeBruine and Ben Jones](http://faceresearch.org/demos/average)
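As promised above, a minimal sketch of the averaging step, assuming each image is a 2D list of `(R, G, B)` tuples and all images share the same dimensions (the actual lab uses an image library; this is illustration only):

```python
# Minimal face-averaging sketch over 2D pixel grids. Assumes equal-sized
# images represented as 2D lists of (R, G, B) tuples; pure Python, no library.
def average_faces(images: list) -> list:
    height, width = len(images[0]), len(images[0][0])
    avg = [[(0, 0, 0)] * width for _ in range(height)]
    for row in range(height):                 # nested loops over the 2D grid
        for col in range(width):
            r = sum(img[row][col][0] for img in images) // len(images)
            g = sum(img[row][col][1] for img in images) // len(images)
            b = sum(img[row][col][2] for img in images) // len(images)
            avg[row][col] = (r, g, b)
    return avg

# Two tiny 1x2 "images": the average inherits whatever the dataset contains.
print(average_faces([[[(255, 0, 0), (0, 0, 0)]], [[(0, 0, 255), (0, 0, 0)]]]))
```

Whatever demographics dominate the input list dominate the average; that is the bridge to training-data bias.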
------------------------
-## **[Intro OOP]** Developers as Prioritizers
-[image]
+## <a name="prioritizers">**[Intro OOP]** Developers as Prioritizers</a>
+[image]
_What is 'moral' behavior in the context of a computer? How do we write code that is forced to assign value to people? What are the implications of our representation decisions?_
-**Scenario:** Program a disaster-relief robot to prioritize which distressed people to save. This reframing of the Trolley Problem nudges students to reflect on issues of representation in their code (what are the problems with male/female representation? Should we even represent weight?), and to consider how individual decisions could amplify systemic biases if the system were used at scale.
-**Practice:** conditionals, use of APIs and objects, dictionaries (in optional last part)
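One minimal sketch of the design question at the core of the activity; the `Person` class, its attributes, and the scoring rule are all hypothetical stand-ins, not the assignment's actual API:

```python
# Hypothetical rescue prioritizer. Attribute names and the scoring rule are
# invented: the activity asks whether rules like these should exist at all.
class Person:
    def __init__(self, age: int, occupation: str):
        self.age = age
        self.occupation = occupation

def rescue_priority(p: Person) -> int:
    score = 100 - p.age              # is valuing the young defensible?
    if p.occupation == "doctor":
        score += 20                  # does social "usefulness" belong here?
    return score

stranded = [Person(70, "retired"), Person(35, "doctor"), Person(8, "student")]
for p in sorted(stranded, key=rescue_priority, reverse=True):
    print(p.age, p.occupation)
```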
@@ -112,61 +122,18 @@ _What is 'moral' behavior in the context of a computer? How do we write code tha
-**Writeups:** _Note:_ these reflections are based on an earlier version of the assignment, but should still communicate the philosophy.
-[Can you make AI fairer than a judge? Play our AI courtroom game](https://www.technologyreview.com/s/613508/ai-fairer-than-judge-criminal-risk-assessment-algorithm/)
+-[Machine Bias: there’s software used across the country to predict future criminals. And it’s biased against blacks.](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing)
+-[Can an algorithm tell when kids are in danger?](https://www.nytimes.com/2018/01/02/magazine/can-an-algorithm-tell-when-kids-are-in-danger.html)
While not peer-reviewed, people have pointed to my reflection on Medium when looking to **cite this work**:
> Evan Peck. 2017. The Ethical Engine: Integrating Ethical Design into Intro Computer Science. https://medium.com/bucknell-hci/the-ethical-engine-integratingethical-design-into-intro-to-computer-science-4f9874e756af

-<!-- ## [Hiring Algorithms: Developers as Decision-Makers](modules/hiring)
-
-[image]
-
-_What does it mean to design a fair algorithm? What is the human cost of efficiency? What systemic advantages/disadvantages are your algorithms likely to amplify?_
-- *Scenario:* Develop an algorithm that filters job applications based on GPA
-- [Material](modules/hiring)
-- *Practice:* loops, conditionals, python lists
-- *Writeup:* [Ethical Design in CS 1: Building Hiring Algorithms in 1 Hour (Evan Peck)](https://medium.com/bucknell-hci/ethical-design-in-cs-1-building-hiring-algorithms-in-1-hour-41d8c913859f)
-<!-- ## [Input Validation: Developers as Gatekeepers](modules/input)
-[image]
-
-- *Scenario:* Collect and validate personal information of people visiting a university
-- [Material](modules/input)
-- *Practice:* conditionals, functions, data types
-- *Author:* [Justin Li (Occidental College)](https://justinnhli.com/), Adapted by [Evan Peck (Bucknell University)](http://www.eg.bucknell.edu/~emp017/) -->
-
-
--------------------
-## [Ethical Engine 1: Developers as Definers of Identity](modules/ethicalengine1)
-[image]
-
-_How can we adequately represent people in code? What characteristics of people should we **NOT** include in code? What are the implications of our representation decisions?_
-
-
-*Scenario:* In code, represent a person so that autonomous cars can make life-critical decisions
-[Write Up the Ethical Engine Lab (Justin Li)](https://howtostartacsdept.wordpress.com/2018/01/13/step-86-write-up-the-ethical-engine-lab/)
-*Author:* [Evan Peck (Bucknell University)](http://www.eg.bucknell.edu/~emp017/), parts of activity by [Vinesh Kannan (Mimir HQ)](https://github.com/vingkan)