TikTok has to face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated For You Page (FYP). The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”

TikTok’s algorithmic recommendations on the FYP constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.

  • some_guy@lemmy.sdf.org · 2 months ago

    I’m gonna take the side that TikTok is potentially liable on the algorithm argument, but these parents also failed their children. Teaching your kids not to replicate unsafe internet content should be as fundamental as teaching them to look both ways before crossing the road.

    • Kissaki@beehaw.org · 2 months ago

      As a society, we’re responsible for all our children. The point of child protection laws, and population protection in general, is to support and protect them, because oftentimes parents are incapable of doing so, or the social dynamics involved are ones most parents can’t really understand, follow, or teach about.

      Yes, parents should teach and protect their children. But we should also create an environment where that is possible, and where children of less fortunate and of less able parents are not victims of their environment.

      I don’t think demanding and requiring big social platforms to moderate and regulate at least to the degree where children are not regularly exposed to life-threatening trends is a bad idea.

      That stuff can still be elsewhere if you want it. But social platforms have a social dynamic, more so than an informative one.

  • t3rmit3@beehaw.org · 2 months ago

    I am generally very skeptical of lawsuits making social media and other Internet companies liable for their users’ content, because that’s usually a route to censor whatever the government deems “harmful”, but I think this case actually makes perfect sense by attacking the algorithmic “curation” that they do. Imo social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.

    • Chahk@beehaw.org · 2 months ago

      social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.

      But then how would they make money if they can’t keep users doomscrolling forever to keep serving them ads? Won’t someone think of the shareholders?!

  • stardust@lemmy.ca · 2 months ago

    I remember reading that China’s version of TikTok promotes more content like science to kids. Everyone else gets degeneracy: stealing Kias, licking grocery store items, and now blackout challenges.

    It would be interesting if it were public how the algorithm is tuned for China versus the rest of the world. Makes me wonder whether it’s intentional, pushing certain trends on international versions of TikTok instead of filtering them out, to try to make society a worse place.

    Stuff like Facebook and Twitter is insane too, so it’s all self-sabotage at this point, but TikTok seems to have become the trendsetter.

    • LukeZaz@beehaw.org · 2 months ago

      Makes me wonder whether it’s intentional, pushing certain trends on international versions of TikTok instead of filtering them out, to try to make society a worse place.

      Good lord, this is a massive reach. A much simpler explanation is that algorithmic garbage is profitable, and China’s government does not care about negative ramifications that occur outside China itself, so it does not regulate them.

      China’s run by a terrible government, not an MCU villain.

      • stardust@lemmy.ca · 2 months ago

        Uhhh… I don’t think you got my point. It’s why I also included Facebook and Twitter at the end, as examples of domestic companies also willingly allowing harmful societal trends.

        Money being the reason doesn’t absolve them or provide a convenient out that lets companies do whatever they want without consequence or criticism. I put them all in the camp of willingly selling a worse society for profit, and whether a country sees that as a win for itself or not doesn’t change that.

        • Yoruio@lemmy.ca · 2 months ago

          This is just how capitalism works: you have to appeal to your audience more than your competition, and guess which kind of content teenagers want to watch more. Hell, even adults want fun content over educational content.

          They’re not willingly selling a worse society for profit; that’s just the only way to stay competitive.

          Any platform that pushed educational content in North America would just not get any customers and would go bankrupt.

          edit: there are plenty of educational video platforms out there, like Khan Academy. Try to get your kids to scroll through that during their free time instead; I bet they won’t.

          • stardust@lemmy.ca · 2 months ago

            I know how capitalism works… I was just sharing my thoughts on the situation: a company knowingly tuning its algorithm in a positive direction for one demographic and a negative one for another shows a clear awareness of impact. Not sure why you’re so worked up about TikTok getting criticized too. Whatever.

            • Yoruio@lemmy.ca · 2 months ago

              In the US, publicly traded companies have a legal obligation to make as much money for their shareholders as legally possible (see Ford getting sued by shareholders after giving workers raises). It would be borderline illegal for a company to adjust its algorithm in a way that makes it less competitive.

              This needs to be regulated by the government, not by the companies themselves. That way the companies would all be forced to change their algorithms at the same time, without impacting their competitiveness.

              So the government going after TikTok is a good first step, IF it does the same thing to Facebook / Instagram / YouTube / Snapchat. But I’m betting it won’t, because those companies spend an absurd amount of money on lobbying.

              • t3rmit3@beehaw.org · 2 months ago

                This is a false narrative that stock traders push. Fiduciary duty is just one of several duties executives have, and it does not outweigh the duty to the company’s health or to its employees. Obviously shareholders will argue otherwise or even sue to get their way, because they only care about their own interests, but they won’t prevail in most cases if there was a legitimate business interest and justification for the actions.

  • coyotino [he/him]@beehaw.org · 2 months ago

    Ah shit, me and my friends used to do this, pre-social media. I remember one time at middle school recess, going out to the farthest corner of the playground with my friends, and we took turns holding our breath while someone else squeezed our chest. I remember blacking out, hearing the Pokémon theme in pitch darkness, and then waking up on the ground.

    I don’t think we did it more than once (at least I didn’t). But of course, the crucial difference was that I was with my dumbass friends, so at least there was someone to run for help if someone didn’t wake up.