Summary provided by Monika Barget (History), based on the sources cited.

</details>
<details>

<summary>

### (Algorithmic) Bias

</summary>

In ‘What is (a) Digital Society?’ we introduced the notion of ‘bias’ in discussions of the common idea that technology is neutral. One of the key themes of that course was precisely this notion of neutrality. In several modules, we explored how digital technologies are made to serve the interests of particular groups. This theme also comes up in other courses, including ‘Digitalisation and politics’ and ‘Surveillance Society’. Bias can arise from the absence of data (e.g. if only male bodies are used to collect data about the effectiveness of medicines) or from the classifications made by data scientists (e.g. classifications that racialize groups when decisions are made about jobs, mortgages, etc.).

- Reference (in UM Library): O’Neil, C. (2016). *Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy*. Crown Publishing.

You read various chapters of this book for ‘What is (a) Digital Society?’.

Pay attention to the different meanings of bias you have encountered. Here we are talking about prejudices against individuals or groups that lead to unfair outcomes. But you have also learned about statistical bias (in ‘Quantitative Data Analysis’ and ‘Working with Big Data’). For statisticians, bias is a technical term, often referring to a lack of representativeness in the data.

Summary provided by Sally Wyatt, based on the sources cited.

</details>
<details>

<summary>

### Critical Making (see *design thinking* in methods)

</summary>