To fix algorithmic bias, we first need to fix ourselves
In 2014, two 18-year-old girls found an unlocked kid’s bicycle and a Razor scooter in a residential area of Fort Lauderdale, Florida. Sade Jones and Brisha Borden decided to ride them a couple of blocks, laughing at how big they felt on the tiny frames. Hours later, they were thrown in jail and charged with burglary and petty theft. According to ProPublica, they were not locked up because they had “borrowed” their rides — they were locked up because COMPAS, a proprietary AI system designed by the company Northpointe to predict recidivism, rated Borden at high risk and Jones at medium risk of reoffending within the next two years.