When a machine moderates content, it evaluates text and images as data, using an algorithm trained on existing data sets. The process for selecting that training data has come under fire because such data sets have been shown to encode racial, gender, and other biases.
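To make the training-data dependence concrete, here is a minimal, hypothetical sketch of a moderation classifier: a toy Naive Bayes model written from scratch in Python. The tiny training set, the labels, and all function names are invented for illustration; real systems are trained on millions of human-labeled items, and any bias in those labels flows directly into the decisions.

```python
from collections import Counter
import math

def train(examples):
    """Count word frequencies per label from labeled training data."""
    counts = {"ok": Counter(), "flag": Counter()}
    totals = Counter()
    for text, label in examples:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def score(text, counts, totals, label):
    """Log-probability of the text under one label (add-one smoothing)."""
    vocab = len(set(counts["ok"]) | set(counts["flag"]))
    total = 0.0
    for word in text.lower().split():
        total += math.log((counts[label][word] + 1) / (totals[label] + vocab))
    return total

def moderate(text, counts, totals):
    """Flag the text if the 'flag' label scores higher than 'ok'."""
    if score(text, counts, totals, "flag") > score(text, counts, totals, "ok"):
        return "flag"
    return "ok"

# Hypothetical labeled examples; the model knows nothing beyond these.
training = [
    ("have a great day", "ok"),
    ("thanks for sharing", "ok"),
    ("you are an idiot", "flag"),
    ("shut up idiot", "flag"),
]
counts, totals = train(training)
print(moderate("you idiot", counts, totals))  # -> flag
```

Nothing in the code itself decides what counts as offensive; every decision is a statistical echo of the labeled examples. If the training set over-represents one dialect or community among the "flag" examples, the model will disproportionately flag speech from that community, which is exactly the bias concern raised above.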