When a machine moderates content, it evaluates text and images as data, using an algorithm that has been trained on existing data sets. The process for selecting that training data has come under fire, as it has been shown to embed racial, gender, and other biases.
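To make the "trained on existing data sets" point concrete: a minimal sketch of how such a classifier learns from labeled examples, here a toy naive Bayes model written from scratch. Everything below is hypothetical for illustration, not Meta's actual system; the key takeaway is that the model can only reflect whatever patterns, and biases, are present in its training examples.

```python
from collections import Counter
import math

def train(examples):
    """Learn word counts per label from (text, label) pairs."""
    word_counts = {}          # label -> Counter of words
    label_counts = Counter()  # label -> number of examples
    for text, label in examples:
        label_counts[label] += 1
        word_counts.setdefault(label, Counter()).update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Pick the label with the highest naive Bayes score."""
    vocab = set().union(*word_counts.values())
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label, counts in word_counts.items():
        # log prior plus log likelihood, with add-one smoothing
        score = math.log(label_counts[label] / total)
        denom = sum(counts.values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((counts[word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical training set: the model's decisions are entirely
# determined by whoever chose and labeled these examples.
examples = [
    ("you are wonderful", "allow"),
    ("have a great day", "allow"),
    ("you are an idiot", "remove"),
    ("get lost idiot", "remove"),
]
wc, lc = train(examples)
print(classify("what an idiot", wc, lc))  # → remove
```

If the labeled examples systematically over-flag content by or about one group, the trained model will reproduce that skew at scale, which is the concern the article raises.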
An important point, buried deep in the article:
She said she has observed differential treatment across the social media platform, depending on the race of the subject in the image in question or who posted it.
“We’ve seen this time and again, Meta taking down content by and about people of color,” she said. “While similar content by and about white people remains up.”