Monday, September 21, 2020

Coded Bias Reflection

    Before watching Coded Bias, I knew very little about artificial intelligence and all that it can do. I was also extremely ignorant of the fact that coding and AI can be biased. I never thought facial recognition would be such a big problem. Everyone has their own biases that they have to learn to get over, but I didn't realize a personal bias could make such an impact on technology. Joy Buolamwini noticed that facial recognition software had a very difficult time detecting her face. Why? Because she is not white. The software could not recognize her until she put on a white mask, whose face and features it was finally able to detect. This happens when coders have a bias, whether or not they know it, against African Americans or other dark-skinned people. They don't see a problem because it does not affect their personal lives; in fact, they might not even know it is causing problems for other people.

This film showed how important it is to know your own biases. There were clips of police officers using facial recognition, but it was stated time and time again that these systems have misidentified people when it comes to crime. How is this helpful? It only makes it harder for the police to find the actual criminal, and it is harmful to the person who is wrongfully arrested. Honestly, watching this film has made me more aware of how big an impact biases can have, and of how we as citizens should pay more attention to what is good and bad about AI and facial recognition.
