Coded Bias - 2020 - 6/10
MIT researcher Joy Buolamwini, playing with new facial recognition software, discovers built-in prejudices.
The software is very good at identifying white males, less accurate with white females, less still with males of color, and poor with females of color.
In other words, facial recognition algorithms developed by white males seem best suited to white males.
Those errors of mistaken identity have serious implications when facial recognition is used by law enforcement.
Parallel narratives follow how the technology is being implemented in China and, in the West, by Big Tech (Amazon).
The narrative wanders off-topic into identity, privacy, and data mining, areas not necessarily tied to facial recognition.
Meaning: the biggest personal snitch is the one in your pocket … your “smart” phone.
And, in my opinion, fewer and fewer people feel angst over personal privacy. Younger souls have little inkling of what it means to live free range.