• arc@lemm.ee · 4 months ago

    I’ve disabled personalised ads on YouTube and I see this sort of shit all the time. I’ve given up reporting them because 90% of the time the report is rejected. I don’t even understand the rationale for rejecting it, because it’s as obvious a scam as a scam can be - AI impersonation, fake endorsement, illegal advertising category. It’s a scam, YouTube.

    I don’t get why these ads appear in the first place. YouTube has transcription and voice/music recognition capabilities. How hard would it be to flag a suspicious ad and require a human to review it? Or to search for duplicates under other burner accounts and zap them at the same time? Or to have some kind of randomized audit based on trust, where new accounts get reviewed more frequently by experienced reviewers?
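
    Even a back-of-the-envelope triage like the sketch below would catch the obvious ones. This is just to illustrate the idea, not YouTube's actual pipeline; the phrase list, thresholds, and field names are all made up.

    ```python
    # Hypothetical ad-triage sketch: score an ad from its transcript,
    # give less benefit of the doubt to new or previously-flagged accounts,
    # and randomly audit low-trust accounts more often.
    import random
    from dataclasses import dataclass

    SCAM_PHRASES = [
        "guaranteed returns", "double your money",
        "elon musk giveaway", "send bitcoin",
    ]

    @dataclass
    class Ad:
        advertiser_id: str
        transcript: str          # from speech-to-text
        account_age_days: int
        prior_violations: int

    def suspicion_score(ad: Ad) -> float:
        text = ad.transcript.lower()
        hits = sum(phrase in text for phrase in SCAM_PHRASES)
        # Older accounts with no violations earn up to 1.0 point of trust.
        trust = min(ad.account_age_days / 365, 1.0) - 0.2 * ad.prior_violations
        return hits - max(trust, 0.0)

    def needs_human_review(ad: Ad) -> bool:
        if suspicion_score(ad) >= 1.0:
            return True                      # flagged outright
        audit_rate = 0.5 if ad.account_age_days < 30 else 0.05
        return random.random() < audit_rate  # randomized trust-based audit

    if __name__ == "__main__":
        ad = Ad("burner-123",
                "Elon Musk giveaway! Guaranteed returns, send Bitcoin now.",
                account_age_days=3, prior_violations=0)
        print(needs_human_review(ad))  # True: three phrase hits, brand-new account
    ```

    Dedup across burner accounts would be the same idea applied to the audio/video fingerprint instead of the transcript: hash the creative, and when one copy gets pulled, pull everything with the same hash.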

    • CileTheSane@lemmy.ca · 4 months ago

      How hard would it be to flag a suspicious ad and require a human to review it?

      Hard? No. But then humans would have to be paid, which would slow down the growth of the dragon hoard.

      Better to have a computer analyze the ad that another computer thinks looks real.

      • arc@lemm.ee · 4 months ago

        They already have to have a human respond to each and every complaint about the ad. It seems more sensible to automate flagging of suspicious ads before the complaints happen.