Discord is updating its Community Guidelines with a clause prohibiting the sharing of information it believes is “false or misleading” and “could cause physical or social harm” if acted upon. The rule could apply to a wide range of information, but Covid-19 rhetoric is the main example given. The chat service does not want to be a source of “anti-vaccination content” or advice not accepted by the medical community, such as the use of unproven home remedies.
In short, Discord will not allow individuals to “post, promote, or organize communities around false or misleading health information that may cause harm,” wrote Alex Anderson, Discord’s senior platform policy expert, in a blog post explaining the update.
Discord defines false or misleading health information as any health information that “directly and explicitly contradicts the latest consensus in the medical community,” and it provides a surprising amount of detail about what that means.
Below is the list of topics about which Discord warns against making “false or misleading” claims:
- the safety, side effects, or effectiveness of vaccines;
- the composition, development, or approval of vaccines;
- alternative, unapproved treatments for diseases (including claims promoting harmful forms of self-medication, and claims rejecting vaccines in favor of alternatives);
- the presence or prevalence of disease;
- the spread or symptoms of disease;
- health guidance, advisories, or mandates (including false claims about preventive measures and actions that may hinder the resolution of a public health emergency);
- the availability of or eligibility for health services; and,
- content that suggests a health-related conspiracy by malicious forces (including statements that could lead to social unrest or facilitate the destruction of critical infrastructure).
On its own, the list could be interpreted as a blanket ban on expressions of distrust in any local health mandate, or even on discussion of “alternatives” to conventional medicine. However, Anderson said Discord considers context, such as intent, and won’t take action unless it believes a message “may cause some form of harm.”
“This policy is not designed to punish polarized or controversial views,” he wrote. “We allow the sharing of personal health experiences; opinions and comments (as long as those opinions are based on fact and do not cause harm); well-meaning discussions about medical science and research; content designed to condemn or debunk health misinformation; and satire and humor that is clearly intended to poke fun at false or misleading health claims.”
People with polarized or controversial views may disagree with the claim that they are not being targeted, although it is worth noting that Discord users, who mostly stick to smaller groups, may not notice any change, no matter what they say on the platform.
When I talked to Discord about privacy in 2019, it told me that it doesn’t actively monitor any given server’s text and voice chats — with over 150 million monthly active users, how could it? Instead, moderators primarily respond to user reports, which most likely come from large public servers. I think it’s unlikely that Discord scans the chat logs of every 20-person server for narratives about Covid-19 vaccines and microchips.
However, there is some precedent for active moderation on Discord. In 2018, after several published reports noted that the relative privacy Discord offers had turned it into a hideout for white supremacists, the company publicly struggled to rid itself of hate group servers. Following that example, Discord may find and shut down servers that publicly advertise themselves as anti-vax hubs, if such servers exist. (If I had to guess, I’d say they do.)
Discord’s new policy takes effect March 28. The new guidelines also ban “malicious parody” while stating that “sarcasm and parody are fine,” and Discord allows itself to consider reports of “relevant off-platform behavior” when taking action against users, such as “membership or association with hate groups, illegal activity, and hateful, sexual, or other types of violence.”
Discord also said it would crack down on “false, malicious or spam” reports. “If we find that you made a malicious report, we may take action against your account,” the company said.
As someone who doesn’t use Discord as a soapbox for vaccine-related commentary, the news is mostly a reminder that even on so-called private servers, conversations that take place on the platform aren’t completely private. This is a moderated social network, so if someone submits a report, a Discord mod may look at your chats and issue a warning, suspension, or ban. For those who want Discord-like functionality without joining a social network, paid private VoIP servers from companies like TeamSpeak are still available. (Personally, I’m not too concerned about the encryption of my D&D group’s meme posts.)