Hey folks!

I made a short post last night explaining why image uploads had been disabled. It was the middle of the night for me, so I didn’t have time to go into much detail, but I’m writing this more detailed post now to explain where we stand and where we plan to go.

What’s the problem?

As shared by the lemmy.world team, over the past few days, some people have been spamming one of their communities with CSAM images. Lemmy has been attacked in various ways before, but this is clearly on a whole new level of depravity, as it’s first and foremost an attack on actual victims of child abuse, in addition to being an attack on the users and admins on Lemmy.

What’s the solution?

I am putting together a plan, both for the short term and for the longer term, to combat and prevent such content from ever reaching lemm.ee servers.

For the immediate future, I am taking the following steps:

1) Image uploads are completely disabled for all users

This is a drastic measure, and I am aware that it’s the opposite of what many of our users have been hoping for, but at the moment, we simply don’t have the necessary tools to safely handle uploaded images.

2) All images that have federated in from other instances will be deleted from our servers, without exception

At this point, we have millions of such images, and I am planning to purge all of them indiscriminately. Posts from other instances will not be broken after the deletion; the images will simply be loaded directly from their home instances.
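To give a concrete idea of how such a purge could be scripted, here is a minimal sketch against pict-rs (the image server Lemmy uses). It assumes pict-rs’s internal purge endpoint and API-key header, and that we have some list of the aliases of cached federated images; treat all of those details as assumptions, not a finished tool:

```python
import requests

PICTRS_URL = "http://127.0.0.1:8080"  # internal pict-rs address (assumption)
API_KEY = "secret"                    # value of PICTRS__SERVER__API_KEY (assumption)

def purge_alias(alias: str) -> bool:
    """Ask pict-rs to delete one stored image (and its variants) by alias."""
    resp = requests.post(
        f"{PICTRS_URL}/internal/purge",
        params={"alias": alias},
        headers={"X-Api-Token": API_KEY},
        timeout=30,
    )
    return resp.ok

# Populate this from however the instance tracks federated image aliases;
# that part is deliberately left abstract here.
cached_aliases: list[str] = []

for alias in cached_aliases:
    if not purge_alias(alias):
        print(f"failed to purge {alias}")
```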

3) I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

Lemmy has always loaded some images directly from other servers, while saving others locally and serving them itself. I am eliminating the second option for the time being, forcing all images uploaded on external instances to always be loaded from those servers. This will somewhat increase the number of servers users fetch images from when opening lemm.ee, which certainly has downsides, but I believe it is preferable to opening up our servers to potentially illegal content.
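In pseudocode terms (the real patch is to the Rust backend; this is just the decision being changed, with all names made up for illustration):

```python
def should_store_locally(image_instance: str, local_instance: str = "lemm.ee") -> bool:
    """Decide whether an image ever touches our disk.

    Before the patch: some federated images were downloaded into local
    storage and served from lemm.ee directly.
    After the patch: only images uploaded by our own users are stored;
    everything else is always hot-linked from its home instance.
    """
    return image_instance == local_instance
```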

For the longer term, I have some further ideas:

4) Invite-based registrations

I believe that one of the best ways to effectively combat spam and malicious users is to implement an invite system on Lemmy. I have wanted to work on such a system ever since I first set up this instance, but real life and other things have been getting in the way, so I haven’t had a chance. With the current situation, however, I believe this feature is more important than ever, and I’m very hopeful I will be able to make time to work on it very soon.

My idea would be to grant our users a few invites each, which would replenish every month if used. From that point on, an invite will be required to sign up on lemm.ee. The system will keep track of the invite hierarchy, and in extreme cases (such as spambot sign-ups), inviters may be held responsible for rule-breaking users they have invited.
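As a rough sketch of the data model I have in mind (nothing here is implemented yet, and every name and number is a placeholder):

```python
from dataclasses import dataclass
from datetime import datetime

MONTHLY_INVITES = 3  # "a few" invites per user per month (placeholder number)

@dataclass
class Invite:
    code: str
    inviter_id: int          # who issued the invite
    invitee_id: int | None   # filled in on signup; this forms the hierarchy
    created_at: datetime
    used_at: datetime | None = None

def invites_remaining(issued_this_month: int) -> int:
    """Invites replenish monthly: users can always issue up to the cap."""
    return max(0, MONTHLY_INVITES - issued_this_month)

def chain_of_responsibility(invites: list[Invite], user_id: int) -> list[int]:
    """Walk up the invite hierarchy from a user to the original inviter,
    e.g. to find who vouched for a spambot."""
    by_invitee = {i.invitee_id: i.inviter_id for i in invites if i.invitee_id}
    chain = []
    while user_id in by_invitee:
        user_id = by_invitee[user_id]
        chain.append(user_id)
    return chain
```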

While this will certainly create a barrier to entry for signing up on lemm.ee, we are already one of the biggest instances, and I think that at this point, such a barrier will do more good than harm.

5) Account requirements for specific activities

This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (account age, number of comments, etc.). These activities might include things like image uploads, community creation, and perhaps even private messages.

This could, in theory, deter people from creating new accounts just to break rules (or laws).
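A sketch of what such a gate might look like (the thresholds and activity names are placeholders, not decided policy):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Placeholder requirements per gated activity:
# (minimum account age, minimum number of comments)
REQUIREMENTS = {
    "upload_image":         (timedelta(days=7),  10),
    "create_community":     (timedelta(days=30), 50),
    "send_private_message": (timedelta(days=1),  0),
}

@dataclass
class Account:
    created_at: datetime
    comment_count: int

def may_perform(account: Account, activity: str) -> bool:
    """Allow an activity only once the account meets its requirements."""
    min_age, min_comments = REQUIREMENTS[activity]
    old_enough = datetime.utcnow() - account.created_at >= min_age
    return old_enough and account.comment_count >= min_comments
```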

6) Automated ML-based NSFW scanning for all uploaded images

I think it makes sense to automatically scan all images before we save them on our servers, and to reject any upload that is flagged as NSFW. While machine learning is not 100% accurate and will produce false positives, I believe this is a trade-off that we simply need to accept at this point. Not only will this help against any potential CSAM, it will also help us better enforce our “no pornography” rule.
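As one possible shape for this, here is a rough sketch using an off-the-shelf open classifier (the specific model name is just an example of the kind of NSFW detector available on Hugging Face, and the threshold and label names are assumptions):

```python
from transformers import pipeline
from PIL import Image

# Example of an openly available NSFW classifier; any comparable model
# would work here (treat the exact name as an assumption).
classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

REJECT_THRESHOLD = 0.5  # placeholder; would need tuning against real false-positive rates

def accept_upload(path: str) -> bool:
    """Reject the upload if the classifier flags the image as NSFW."""
    scores = {r["label"]: r["score"] for r in classifier(Image.open(path))}
    # This particular model labels images "nsfw" / "normal" (assumption).
    return scores.get("nsfw", 0.0) < REJECT_THRESHOLD
```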

This would potentially also allow us to resume caching images from other instances, which will improve both performance and privacy on lemm.ee.


With all of the above in place, I believe we will be able to re-enable image uploads with a much higher degree of safety. Of course, most of these ideas come with some significant downsides, but please keep in mind that users posting CSAM present an existential threat to Lemmy (in addition to just being absolutely morally disgusting and actively harmful to the victims of the abuse). If the choice is between having a Lemmy instance with some restrictions, or not having a Lemmy instance at all, then I think the restrictions are the better option.

I would also appreciate your patience in this matter: all of the long-term plans require additional development, and while this is currently a high-priority issue for all Lemmy admins, we are all still volunteers and do not have the freedom to dedicate huge amounts of time to working on new features.


As always, your feedback and thoughts are appreciated, so please feel free to leave a comment if you disagree with any of the plans or if you have any suggestions on how to improve them.

  • bdesk@kbin.social · 1 year ago

    You forgot getting the authorities involved when somebody does upload CSAM

    • nxfsi@lemmy.world · 1 year ago

      It’s a known tactic by trolls to upload cheese pizza and then notify the media/the authorities themselves because context doesn’t matter when it comes to CSAM

    • sunaurus@lemm.ee (OP) · 1 year ago

      The Lemmy.world team is already getting some authorities involved for this particular case. I am definitely in favor of notifying law enforcement or relevant organizations, and if anybody tries to use lemm.ee to spread such things, I will definitely be involving my local authorities as well.

    • TWeaK@lemm.ee · 1 year ago

      getting the authorities involved

      How do you imagine that playing out? This isn’t some paedophile ring trading openly; this is people using CSAM as an attack vector. Getting over-enthusiastic police involved is exactly their goal, and it will likely do very little to help the victims depicted in the CSAM itself.

      Yes, authorities should be notified and the material provided to the relevant agencies for examination. However that isn’t truly the focus of what’s happening here. There is no immediate threat to children with this attack.

      • WalkableProgrammer@lemmy.world · 1 year ago

        How do you imagine that playing out?

        FBI: Whoa that illegal

        Admin: Ya

        FBI: We’re going to look for this guy

        Admin: alright

        END ACT 1

        • TWeaK@lemm.ee · 1 year ago

          This isn’t something the FBI have much involvement with. The FBI deal with matters across states.

          This isn’t America, where you have a bunch of separate states unified under one American government. People haven’t been posting porn to lemm.ee. People have been posting porn to other instances, which has seeped through to lemm.ee.

          Getting the Estonian law enforcement involved is like trying to get the Californian government involved in dealing with a problem from Texas. Estonian law enforcement have no jurisdiction over lemmy.world or any other instance, and giving them an opportunity is only going to lead to locking down lawful association and communication in favour of some vague “think of the children” rhetoric. And, like I say, it won’t do anything to curtail the production of CSAM as the purpose of this attack has little to do with the promotion of CSAM.

          Frankly, it could easily be more like:

          lemm.ee: We’ve got a problem with illegal content

          Estonian law enforcement: Woah that’s illegal.

          Estonian law enforcement: You’ve admitted to hosting illegal content. We’re going to confiscate all your stuff.

          lemm.ee is shut down pending investigation.

          Meanwhile, if lemm.ee continues its current course of action, yet someone notifies law enforcement:

          Estonian law enforcement: Woah, we’ve got a report of something dodgy, that’s illegal.

          lemm.ee: People tried to post illegal content elsewhere that could have come to our site, we blocked and deleted it to the best of our ability.

          Estonian law enforcement: Fair enough, we’ll see what we can figure out.

          It really matters how and when the problem is presented to law enforcement. If you report yourself, they’re much more likely to take action against you than if someone else reports you. It doesn’t do you any favours to present your transgressions to them, not unless you’re absolutely certain you’re squeaky clean.

          At this stage and in these circumstances, corrective action is more important than reporting.

          • lagomorphlecture@lemm.ee · 1 year ago

            You’re assuming that no American user saw any of the content. I think the FBI could absolutely get involved if the content was seen by anyone in the US, let alone by people in more than 1 state. I’m not going to pretend to be an expert on child abuse or cyber crimes but the FBI devotes massive resources to investigation of crimes against children and could potentially at least help other agencies investigate where this attack originated from. And if the FBI were able to determine that the attack originated from the US, I assure you the DOJ is far less kind to people who possess, commit or distribute that type of horrible child abuse than they are to rich old white men who commit a coup. You’re kind of acting like this is just another DDOS attack rather than the deliberate distribution of horrific images of child abuse to a platform that in no way encourages distribution of child abuse material.

            Anywhooooo the problem was much worse on lemmy.world since they were the main target of the attack. Does anyone know if they reported it?

            • barsoap@lemm.ee · 1 year ago (edited)

              Local authorities will be the contact point of the admins (or the authorities of wherever the servers are hosted). They’ll investigate what they can and then ring up euro/inter/whatever-pol as necessary to have other forces handle stuff in their respective jurisdictions. Cross-border law enforcement isn’t exactly uncharted waters; they’ve been doing it for quite a while.

              As to the current case, the ball is clearly in the court of the lemmy.world admins and their local authorities (Germany? Hetzner, I think, like so many) as they’re the ones with the IP logs. Even if the FBI gets a tip-off because an American saw something, they’re not exactly in a position to do anything but go via Interpol and ask the BKA if they’d like to share those IP logs.