How will Lemmy handle illegal content like drug dealing, child porn, snuff movies, etc.? On Reddit, the corporation is accountable, so they make an effort to ensure none of this exists on their platform, and they would face legal trouble if they failed to act.

But since Lemmy is decentralised, this isn’t really possible. Sure, the main instances can defederate from bad instances, but those instances could still operate and be accessible on the web, especially if they’re hosted in countries outside the Western sphere of influence. Multiple bad instances could federate with each other and duplicate the illegal content pretty easily, making it difficult for the authorities to keep it shut down. Has this been thought about already?

  • Rottcodd@kbin.social · ↑55 · 1 year ago (edited)

    “Lemmy” can’t handle anything. That’s by design.

    “Lemmy” is really just a piece of software that people can use to run forums that will federate with other forums and so forth and so on. There is no central “Lemmy” authority that could do anything, and that’s by design, and a lot of the point. It means that there can never be a Lemmy spez or Musk or Zuckerberg, fucking things up for everyone.

    The highest authorities are the individual instance owners, so it will fall on them to deal with illegal content as they see fit. Presumably they’ll generally work to keep it off of their own instances through active moderation, and they’ll block other instances that they have reason to believe do not maintain acceptable standards.

    And like it or not, some share of responsibility will fall on individual users to manage their own activities in order to avoid problematic instances.

    The trade-off for having no central authority that can fuck things up for everyone is that there’s no big mommy/daddy to watch over you and protect you. The fediverse is better suited for people who are okay with that.

  • daniskarma@lemmy.world · ↑40 · 1 year ago

It’s like when those sites are hosted on an Apache server.

Lemmy is just the underlying software that runs an instance.

Authorities would track the illegal content the same way they do on any other website, so I wouldn’t worry too much about it. Also, decentralized illegal content has existed for as long as P2P protocols have. I don’t see anything new with Lemmy.

    • I Cast Fist@programming.dev · ↑1 · 1 year ago

It just makes it easier to find. Instance owners/admins are legally bound by the host country’s laws, so the smarter ones know which countries they should host their stuff in.

  • kuneho@lemmy.world · ↑32 · 1 year ago

My first guess would be that filtering out illegal content is something the operators/admins of a given instance have to take care of, according to the law of the country the server is hosted in.

  • Th4tGuyII@kbin.social · ↑30 ↓1 · 1 year ago

Each Fediverse instance is just a server running software (e.g. Lemmy, Kbin) that uses the ActivityPub protocol to communicate with other instances.

There is no central authority; that’s kinda the whole point of being federated and decentralised. Each instance is its own website, its own island. It’d be like asking how “email” or “HTTPS” would handle illegal content. They don’t; it’s up to the hosts themselves to do so.
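At the protocol level, “federation” is just instances POSTing signed JSON activities to one another’s inboxes. A minimal sketch of what one such activity looks like (the hosts, IDs, and content below are made up; real deliveries are also HTTP-signed):

```python
import json

# A minimal ActivityPub "Create" activity, roughly what one instance
# would POST to another instance's inbox. All hosts/IDs are hypothetical.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "id": "https://example-instance.social/activities/123",
    "actor": "https://example-instance.social/u/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Note",
        "id": "https://example-instance.social/post/456",
        "content": "Hello from a federated instance",
    },
}

# There is no central relay: each receiving instance independently
# decides whether to accept, moderate, or drop the activity.
payload = json.dumps(activity)
```

Every instance that receives this payload applies its own rules to it, which is exactly why moderation is per-instance rather than network-wide.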

On a fundamental level, each instance is just a website, so an illegal instance would be tracked down and prosecuted the same way as any other website/forum doing illegal stuff.

    Best you can do is encourage your instance’s admin to defederate from those illegal instances if they haven’t already. Let the authorities handle the rest.
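Whether your instance has already defederated from a given server is public information. Lemmy exposes a federated-instances endpoint (`/api/v3/federated_instances`); the exact response shape varies between Lemmy versions, so the payload below is an assumption for illustration, not the definitive schema:

```python
import json

# Hypothetical response from GET https://<instance>/api/v3/federated_instances.
# Treat the field names/shape as an assumption; check your server's version.
sample = json.loads("""
{
  "federated_instances": {
    "linked":  [{"domain": "lemmy.world"}, {"domain": "kbin.social"}],
    "blocked": [{"domain": "bad-instance.example"}]
  }
}
""")

def is_defederated(payload: dict, domain: str) -> bool:
    """Return True if `domain` appears in the instance's block list."""
    blocked = payload["federated_instances"]["blocked"]
    return any(entry["domain"] == domain for entry in blocked)

print(is_defederated(sample, "bad-instance.example"))  # True
print(is_defederated(sample, "lemmy.world"))           # False
```

A live check would fetch that URL from your home instance and run the same lookup, so users can verify their admins’ blocklist before petitioning for changes.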

  • RightHandOfIkaros@lemmy.world · ↑25 · 1 year ago (edited)

    The responsibility would fall on the instance admin(s). Moderators could also share the responsibility, as well as any users involved in the illegal activity.

  • bdonvr@thelemmy.club · ↑16 · 1 year ago

    Images aren’t federated. You’re viewing them from their host server.

So they’re not really duplicated anywhere.
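You can see this in the post objects themselves: the image URL points back at the originating host, so remote instances link to the media rather than re-hosting it. A sketch with a made-up post object (field names and hosts are illustrative assumptions):

```python
from urllib.parse import urlparse

# A simplified post object as another instance might receive it.
# Hosts are hypothetical; the point is that "url" stays on the origin.
post = {
    "name": "Example post",
    "ap_id": "https://origin-instance.example/post/789",
    "url": "https://origin-instance.example/pictrs/image/abc123.jpg",
}

origin_host = urlparse(post["ap_id"]).hostname
image_host = urlparse(post["url"]).hostname

# The image is served by the instance it was uploaded to, not by
# every instance that federates the post.
print(image_host == origin_host)  # True
```

If the origin instance takes the image down (or is taken down), the links on every federated copy of the post go dead with it.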

  • ashethursday@lemmy.world · ↑12 · 1 year ago

I’d say moderation is up to each instance, but that leaves the question of whether anyone is legally responsible for ensuring no instance has illegal content. I’d guess not. Since Lemmy is open source, the perpetrators would be solely responsible for illegal content on their instance, as they used the open-source platform for nefarious purposes… I’m thinking individual instances have a legal obligation to keep that content blocked, but I’m not sure. I think this is a really good question.

  • cloaker@kbin.social · ↑6 · 1 year ago

I’m sure it would be easy for the authorities to track, no harder than tracing a large zip that gets downloaded and re-uploaded elsewhere. Authorities can recognise the unique characteristics of files and find them across the web.
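Those “unique characteristics” are typically content hashes: the same bytes produce the same digest no matter where a file is re-uploaded or what it is renamed to. A minimal sketch using SHA-256 (production systems also use perceptual hashes that survive re-encoding, which plain cryptographic hashing does not):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Cryptographic fingerprint of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

original = b"...file contents..."
reuploaded_elsewhere = b"...file contents..."   # same bytes, new location
modified = b"...file contents!..."              # any change breaks the match

print(fingerprint(original) == fingerprint(reuploaded_elsewhere))  # True
print(fingerprint(original) == fingerprint(modified))              # False
```

This is why re-hosting the same files on a different instance doesn’t hide them: the fingerprints match wherever the bytes turn up.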

As for actually stopping this: right now a lot of instances federate with many others, and I’d be worried about bad actors federating with big instances and then posting illegal content, meaning the big instances pull it onto their servers by accident. More moderation tools need to be built into the fediverse.

  • cfx_4188@discuss.tchncs.de · ↑1 ↓2 · 1 year ago

    The creators of Lemmy keep telling us about the “federation” of the social network. A federation is something that has a federal center that exercises a governing function.

Just because we haven’t all been informed of the existence of such a center doesn’t mean that it doesn’t exist.

    • ravheim@kbin.social · ↑3 ↓1 · 1 year ago

      There was a discussion a few weeks ago where the NSFW instance made it a rule that illegal material would not be removed.
      There was then a clarification that if the material was obviously illegal it would be removed but if it was questionable it would be allowed to stay.
This leaves everyone who allows NSFW material to show up on their feed open to seeing illegal content.
      OP’s question is a valid concern.