'Racist' technology is a bug — not a crime
We are told of late that we must entertain the question of whether technology can be racist. Take Google Photos, which in 2015 algorithmically labeled black people as gorillas. Or Microsoft's Twitter bot Tay, which earlier this year, designed to emulate human conversation by trawling tweets, absorbed racist nonsense along with everything else and began spouting some of its own.