Computing Bias
- Popcorn Hack 1:
- Popcorn Hack 2:
- Homework Hack
Popcorn Hack 1:
Think of a real-world example where a computer system has shown bias. It could be something you’ve read about, experienced, or imagined.
One real-world example is facial recognition software that performs poorly on people with darker skin tones. This is an example of Pre-existing Social Bias, as the training data often lacks diversity. One way to reduce this bias is to ensure the training dataset is balanced and includes a wide variety of skin tones and demographics.
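A minimal sketch of what that balancing could look like in code, assuming a hypothetical face-dataset metadata table (the `image_path` and `skin_tone` column names are made up): upsample each demographic group to the size of the largest group so every group contributes equally to training.

```python
import pandas as pd

def balance_by_group(df: pd.DataFrame, group_col: str, seed: int = 42) -> pd.DataFrame:
    """Upsample every group to the size of the largest group."""
    target = df[group_col].value_counts().max()
    parts = [
        group.sample(n=target, replace=True, random_state=seed)
        for _, group in df.groupby(group_col)
    ]
    return pd.concat(parts).reset_index(drop=True)

# Hypothetical metadata for a face-recognition training set
faces = pd.DataFrame({
    "image_path": ["a.jpg", "b.jpg", "c.jpg", "d.jpg", "e.jpg"],
    "skin_tone":  ["light", "light", "light", "dark", "dark"],
})

print(balance_by_group(faces, "skin_tone")["skin_tone"].value_counts())
# Each skin-tone group now appears 3 times instead of 3 vs. 2.
```

Upsampling is only one option; collecting genuinely new, diverse images is better when possible, since duplicating rows cannot add information the dataset never had.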
Popcorn Hack 2:
In the financial industry, an AI system used to approve loan applications unintentionally favors male applicants over female applicants because it was trained on past loan approval data, which reflected gender biases. This is an example of Pre-existing Social Bias.
Give two ways to mitigate this bias and make the system more fair.
- Balance the training data by including more diverse examples, so the historical gender bias baked into past approval records is removed.
- Put automated fairness checks in place that audit the system's decisions for gender bias before they take effect (see the sketch below).
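The second idea can be made concrete with a simple demographic-parity test. The sketch below assumes a hypothetical batch of model decisions with `gender` and `approved` columns and a made-up 10% tolerance; it measures the gap in approval rates between groups and flags the model if the gap is too large.

```python
import pandas as pd

def approval_rate_gap(decisions: pd.DataFrame,
                      group_col: str = "gender",
                      outcome_col: str = "approved") -> float:
    """Largest difference in approval rate between any two groups
    (a simple demographic-parity measure)."""
    rates = decisions.groupby(group_col)[outcome_col].mean()
    return float(rates.max() - rates.min())

# Hypothetical batch of loan decisions produced by the model
decisions = pd.DataFrame({
    "gender":   ["M", "M", "M", "F", "F", "F"],
    "approved": [1,   1,   0,   1,   0,   0],
})

gap = approval_rate_gap(decisions)
print(f"approval-rate gap: {gap:.2f}")
if gap > 0.10:  # hypothetical tolerance
    print("Fairness check failed: audit the model before deployment.")
```

Demographic parity is the bluntest fairness metric; a real audit would also compare error rates between groups, but even this check would catch the loan system described above.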
Homework Hack
Question: Think of a system or tool that you use every day—this could be a website, app, or device. Can you identify any bias that might exist in the way the system works?
Task:
- Describe the system you’re thinking of.
- Identify the bias in that system and explain why it might be biased. (Is it Pre-existing Social Bias, Technical Bias, or Emergent Social Bias?)
- Propose one way to reduce or fix the bias in that system.
One system I use a lot is Google Translate, a platform for translating text between languages. While English mostly avoids this, many other languages use gendered pronouns and word forms for different terms.
Because of bias, Google Translate often defaults to male pronouns for terms such as "doctor" and "engineer," and to female pronouns for words like "teacher" and "nurse."
This is Pre-existing Social Bias because it reflects the stereotypes found in the data the model was trained on.
One way to reduce it is to audit the training data first, and then actively check the output so that gender-neutral source sentences get gender-neutral pronouns (or both gendered forms) rather than a stereotyped default; a rough sketch of that output check follows.
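This sketch is only illustrative: the pronoun list is tiny, and a real fix would plug into the translation pipeline itself. The idea is to flag any gendered pronoun that the translation introduces even though the source sentence never specified a gender (Turkish "o," for example, can mean "he," "she," or "it").

```python
# Gendered English pronouns to watch for (deliberately small list)
GENDERED_PRONOUNS = {"he", "she", "him", "her", "his", "hers"}

def flags_gendered_default(source: str, translation: str) -> bool:
    """True if the translation introduces a gendered pronoun
    that the source sentence did not contain."""
    src_words = {w.strip(".,!?").lower() for w in source.split()}
    out_words = {w.strip(".,!?").lower() for w in translation.split()}
    introduced = (out_words & GENDERED_PRONOUNS) - src_words
    return bool(introduced)

# Hypothetical example: a gender-neutral Turkish sentence gets a male default.
source = "O bir doktor."        # gender-neutral: "He/She is a doctor."
translation = "He is a doctor."

if flags_gendered_default(source, translation):
    print("Warning: gendered pronoun introduced; offer both forms instead.")
```

When the check fires, the system could show both translations ("He is a doctor." / "She is a doctor.") instead of silently picking the stereotyped one, which is close to what Google Translate now does for some language pairs.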