@ DEG Mods
2024-11-14 19:15:23

As some of you might be aware, we can't take down mod posts/pages or ban creators' accounts on DEG Mods, as we've built the site on top of the Nostr protocol. With that said, a lot of people are concerned about seeing malicious posts on the site, like spam, scams, posts with viruses, and illegal content, and have other worries, like not wanting to see specific kinds of content. People want to know how we'd handle all of these issues while we claim censorship-resistance.
This post is to help showcase and explain how we'll be handling content moderation on the site, while maintaining the site's censorship-resistant and permissionless nature.
Reporting and blocking
The first thing that can be done, in regards to moderation, is for users to report mod posts (or other types of posts). A report results in us having a look at the reported post and determining whether or not to block it.
Blocking a post means it will be hidden from initial view on the website, yet it would still be accessible to anyone with the link, and on other sites.
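To make the blocking step concrete, here's a minimal sketch in TypeScript of what a site-level blocklist amounts to. The names and data shapes are illustrative assumptions, not DEG Mods' actual code; the key point is that a "block" is only a display-side filter on the default feed, since the underlying Nostr events can't be deleted from relays.

```typescript
// Minimal, hypothetical sketch of a site-level blocklist (not DEG Mods' actual code).
// Blocked events still exist on Nostr relays; blocking only hides them from the default feed.

interface NostrEvent {
  id: string;      // event id (hex)
  pubkey: string;  // author's public key (hex)
  kind: number;    // event kind
  content: string;
  tags: string[][];
}

// Event ids that site operators chose to hide after reviewing user reports.
const siteBlockedEventIds = new Set<string>();

// The default feed simply filters blocked posts out; nothing is deleted anywhere.
function visibleInDefaultFeed(events: NostrEvent[]): NostrEvent[] {
  return events.filter((event) => !siteBlockedEventIds.has(event.id));
}

// A direct link bypasses the feed filter, so a blocked post stays reachable by its id.
function fetchByDirectLink(events: NostrEvent[], eventId: string): NostrEvent | undefined {
  return events.find((event) => event.id === eventId);
}
```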
Illegal content
As for illegal content, we'd essentially do the same thing as mentioned above; from that point on, the relevant authorities would need to handle it as they try to find and prosecute the perpetrator.
Web of Trust (WoT)
There will be (if there isn't already) a near-automated moderation system applied almost site-wide, which would handle most of the legwork. This system is called Web of Trust: it works by showing content published by users we have followed, and their tree of follows (up to a depth that we'd determine).
In essence, the natural behavior of users on the site, from mod creators to mod enjoyers (publishing mods, commenting, reacting, publishing short posts, writing update posts or news/announcements, following who they like, not following who they don't, and blocking those who publish malicious posts), acts as a moderation layer for themselves and everybody else. Of course, this means that not everyone will necessarily be visible on the site when they first start, which is why everyone is encouraged to interact with each other, follow who you like, block bad-faith actors, and so on. Everything is interconnected to create an efficient, effective, and safe viewing/usage experience on the site.
As we mentioned initially, we'll also uphold our claim of censorship-resistance by providing a toggle that removes the site's WoT filter (including the user's own WoT) for anyone who wants to see almost everything. This gives everyone a method and a chance to be seen, regardless of this moderation layer.
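As a rough sketch of how a Web of Trust filter like this could be computed (the depth value, function names, and data shapes here are assumptions for illustration; on Nostr, follow lists are typically published as kind 3 events):

```typescript
// Hypothetical sketch of a Web of Trust (WoT) filter.
// `followGraph` maps a pubkey to the set of pubkeys that account follows
// (on Nostr, this would typically be derived from kind 3 follow lists).
type FollowGraph = Map<string, Set<string>>;

// Breadth-first walk from a seed account out to `maxDepth` hops of follows.
function buildWotSet(followGraph: FollowGraph, seed: string, maxDepth: number): Set<string> {
  const trusted = new Set<string>([seed]);
  let frontier = [seed];
  for (let depth = 0; depth < maxDepth; depth++) {
    const next: string[] = [];
    for (const pubkey of frontier) {
      for (const followed of followGraph.get(pubkey) ?? []) {
        if (!trusted.has(followed)) {
          trusted.add(followed);
          next.push(followed);
        }
      }
    }
    frontier = next;
  }
  return trusted;
}

// Only surface posts from trusted pubkeys, unless the WoT toggle is switched off.
function filterByWot<T extends { pubkey: string }>(
  posts: T[],
  trusted: Set<string>,
  wotEnabled: boolean,
): T[] {
  return wotEnabled ? posts.filter((post) => trusted.has(post.pubkey)) : posts;
}
```

The `wotEnabled` flag stands in for the toggle described above: switching it off bypasses the filter and shows almost everything.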
Individual moderation tools
Each individual user on the site has the ability to block a mod post (or any other type of post), as well as whole accounts and all of their posts.
As we mentioned briefly above, each user will also have their own WoT preference, letting them see posts from the people they trust, as well as their follows, and so on.
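As a small illustrative sketch (field and function names are hypothetical, not the site's actual implementation), these per-user preferences can be applied as one more filter on top of the site-wide one:

```typescript
// Hypothetical per-user moderation preferences, applied on top of the site-wide filter.
interface UserModerationPrefs {
  blockedEventIds: Set<string>; // individual posts this user has blocked
  blockedPubkeys: Set<string>;  // whole accounts this user has blocked
  useOwnWot: boolean;           // whether to also filter by the user's own trust graph
}

interface Post {
  id: string;
  pubkey: string;
}

// Drop anything the user blocked, and (optionally) anything outside their own WoT.
function applyUserModeration(posts: Post[], prefs: UserModerationPrefs, ownWot: Set<string>): Post[] {
  return posts.filter(
    (post) =>
      !prefs.blockedEventIds.has(post.id) &&
      !prefs.blockedPubkeys.has(post.pubkey) &&
      (!prefs.useOwnWot || ownWot.has(post.pubkey)),
  );
}
```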
Various systems and designs
We'll be introducing various systems/designs to help with general moderation and exploration of the site from the user's perspective. As an example, if a mod post doesn't include a malware scan report for the mod files (and we'll see if we can automate this), then there'd be a visible note around it indicating that no scan has been done and advising users to be careful of malware.
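As a rough idea of what that note could look like in code (the field name is a hypothetical assumption; the site may track scan reports differently), the UI would simply check whether a mod post carries a scan-report reference and show a warning when it doesn't:

```typescript
// Hypothetical: warn users when a mod post has no malware scan report attached.
interface ModPost {
  id: string;
  title: string;
  scanReportUrl?: string; // assumed optional field pointing to a scan result
}

// Returns a caution note for the UI, or null when a scan report exists.
function scanNotice(mod: ModPost): string | null {
  if (mod.scanReportUrl) return null;
  return "No malware scan report is attached to this mod. Be careful of malware.";
}
```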
Questions and answers
Here are a few questions and answers related to this topic. We'll update this post as new info comes along.
Do you allow X or Y type of content?
As long as it's a mod (there are special cases where non-mods wouldn't be subject to moderation), it's legal, it isn't directly harmful (malware, scams), and it isn't spam, then you should be fine.
Does fiction equal fiction?
Yes.