The pressure to review thousands of horrific videos each day – beheadings, child abuse, torture – takes a devastating toll on our mental health
What could be a solution to that? I wouldn’t want to be exposed to that type of content even if I were paid to do so and had access to mental health support to deal with the aftermath every time.
On paper, this is one of the obvious uses for AI image recognition: it could drastically reduce the amount of content that needs human review.
In reality, YouTube’s partially automated system (to my knowledge the most robust one around) regularly flags highly stylized videogame violence as if it were real gore. It can also be defeated by some very dumb tricks, like simply placing the violence more than 30 seconds into the video (which has concerning implications for its ability to filter real gore at all).
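To make the "reduce human review" idea concrete, here's a minimal sketch of how threshold-based triage is typically set up: a model scores content, the confident extremes are handled automatically, and only the ambiguous middle band reaches a person. Everything here (`classify_frame`, the thresholds, the scores) is hypothetical for illustration, not YouTube's actual pipeline.

```python
# Minimal sketch of confidence-threshold triage. All names and numbers
# are hypothetical; this is not any real platform's moderation system.

def classify_frame(frame) -> float:
    """Stand-in for a real model; returns P(real violence) in [0, 1]."""
    return frame["score"]  # pretend score for demonstration

AUTO_REMOVE = 0.95  # confident enough to act without a human
AUTO_PASS = 0.05    # confident enough to ignore

def triage(video_frames) -> str:
    # Score the whole video, not just the opening seconds -- sampling
    # only early frames is exactly what the 30-second trick exploits.
    score = max(classify_frame(f) for f in video_frames)
    if score >= AUTO_REMOVE:
        return "remove"
    if score <= AUTO_PASS:
        return "pass"
    return "human_review"  # only the ambiguous middle reaches a person

video = [{"score": 0.02}, {"score": 0.6}]
print(triage(video))  # -> "human_review"
```

The point of the two thresholds is that humans only see the cases the model is genuinely unsure about; how much review that actually saves depends entirely on how often the model is confidently wrong, which is the problem described above.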