A brief introduction to the domain that is variously described as the ethics of machine learning, data science ethics, AI ethics and the ethics of big data. (Delivered as a guest lecture for COMPSCI 361 at the University of Auckland on May 29, 2019)
Outline:
1. What is big data ethics / data science ethics / AI ethics?
2. O’Neil’s analysis of Weapons of Math Destruction (WMDs)
3. Case study: COMPAS predictions of risk of recidivism
4. How do we incorporate ethics into systems development?
Some ethical questions in AI and data science:
- Who owns data that is generated by you and/or about you online?
- Why are transparent and explainable systems important?
- When should you get consent to run experiments on users?
- Under what conditions does your right to privacy come into play?
- How do we judge whether an algorithm is fair?
Outline:
1. What is big data ethics / data science ethics / AI ethics?
2. O’Neil’s analysis of Weapons of Math Destruction (WMDs)
3. Case study: COMPAS predictions of risk of recidivism
4. How do we incorporate ethics into systems development?
The Truth About Algorithms | Cathy O'Neil
Audio extracted from a free talk given by O'Neil at the RSA in London, 2017.
https://www.youtube.com/watch?v=heQzqX35c9A
Opacity: “Even if the participant is aware of being modelled, or what the model is used for, is the model opaque, or even invisible?”

Scale: Does the model have the capacity to grow exponentially? Can it scale?

Damage: “Does the model work against the subject’s interest? In short, is it unfair? Does it damage or destroy lives?”
Outline:
1. What is big data ethics / data science ethics / AI ethics?
2. O’Neil’s analysis of Weapons of Math Destruction (WMDs)
3. Case study: COMPAS predictions of risk of recidivism
4. How do we incorporate ethics into systems development?
Opacity: The COMPAS algorithm is proprietary to Northpointe (now Equivant).

Scale: In Broward County, everyone who is arrested must be assessed using COMPAS.

Damage: ProPublica's analysis indicates that African-Americans are more likely than Caucasians to receive a false positive for medium/high risk of recidivism. This negatively impacts bail decisions for them.

Verdict: Based on the above characteristics, COMPAS arguably is a WMD.
Diagnosing the disparity in false positive rates:

Images: Nature, "Bias detectives: the researchers striving to make algorithms fair", 20 June 2018
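The disparity above can be made concrete with a little code. This is a minimal sketch of how a per-group false positive rate is computed from a confusion matrix; the records below are made-up illustrative data, not the real COMPAS dataset, and the group labels "A" and "B" are placeholders.

```python
# Sketch: measuring a false-positive-rate disparity between two groups.
# The data here is invented for illustration; it is NOT real COMPAS data.

def false_positive_rate(y_true, y_pred):
    """FPR = FP / (FP + TN): the share of people who did NOT reoffend
    but were nevertheless flagged as medium/high risk."""
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return fp / (fp + tn)

# Each record: (group, actually_reoffended, flagged_medium_or_high_risk)
records = [
    ("A", 0, 1), ("A", 0, 1), ("A", 0, 0), ("A", 1, 1), ("A", 0, 1),
    ("B", 0, 0), ("B", 0, 1), ("B", 0, 0), ("B", 1, 1), ("B", 0, 0),
]

for group in ("A", "B"):
    y_true = [t for g, t, p in records if g == group]
    y_pred = [p for g, t, p in records if g == group]
    print(group, false_positive_rate(y_true, y_pred))
```

With this toy data the model flags non-reoffenders in group A far more often than in group B, which is exactly the kind of gap the ProPublica analysis reported for COMPAS scores.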
Outline:
1. What is big data ethics / data science ethics / AI ethics?
2. O’Neil’s analysis of Weapons of Math Destruction (WMDs)
3. Case study: COMPAS predictions of risk of recidivism
4. How do we incorporate ethics into systems development?