‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • andrew_bidlaw@sh.itjust.works
    10 months ago

    It was inevitable. And it says more about those who use them than about the technology itself.

    I wonder how we’d adapt to these tools being that available, especially for blackmail, revenge-porn posting, voyeuristic harassment, stalking, etc. Maybe nude photos and videos will stop being seen as a trusted source of information; they won’t hold any unique value worth hunting for, or worth worrying about.

    Our perception of human bodies was long distorted by movies, porn, Photoshop, and the subsequent ‘filter apps’, but we still kind of trusted there was something real before the effects were applied. But what comes next if everything could be imaginary? Would we stop caring about it in the future? Or would we grow up with a stunted imagination, since the stimuli that develop it in our early years would be long gone?

    There are some useless dogmas around our bodies that could be lifted in the process, or a more relaxed trend in clothing choices could get its start. Who knows?

    I see the bad sides right now, how it can be abused, but if these generative models are here to stay, what are the long-term consequences for us?

    • LufyCZ@lemmy.world
      10 months ago

      I think that eventually it might be a good thing, especially in the context of revenge porn, blackmail, etc. Real videos won’t carry any weight, since they might as well be fake, and as society gets accustomed to that, we’ll see those types of things disappear completely.