
Facebook 'Supreme Court' Orders Social Network To Restore 4 Posts In 1st Rulings

McBell

Unbound
It overruled Facebook's removal of a post from a user in France criticizing the government for withholding an alleged "cure" for COVID-19. Facebook had removed the post because it said it could lead to imminent harm, but the board said the user's comments were directed at opposing government policy. "Facebook had not demonstrated the post would rise to the level of imminent harm, as required by its own rule in the Community Standards," the board said. It also recommended that the company create a new policy specifically about health misinformation, "consolidating and clarifying the existing rules in one place."

It overruled a case in which Facebook took down a post from a user in North America allegedly quoting Joseph Goebbels, a Nazi official on Facebook's list of "dangerous individuals." The board found it "was a criticism, not a celebration of the attitude exemplified by the alleged Goebbels quote," board co-chair Michael McConnell, director of Stanford Law School's Constitutional Law Center, told reporters.

In a case dealing with nudity, the board overturned the removal of an Instagram post promoting breast cancer awareness in Brazil that showed women's nipples. The board pointed out that Facebook's nudity rules include an exception for posts about breast cancer. Facebook restored the post back in December, after the board announced it would be reviewing the case.

"I think this is a really good example of how the mere prospect of a board review has already begun to alter how Facebook acts," McConnell said.

The board also sounded the alarm that the Instagram post was initially removed by automated systems. "The incorrect removal of this post indicates the lack of proper human oversight which raises human rights concerns," it said in its decision. It recommended that Facebook tell users when their posts have been taken down by automated systems and make sure they can appeal those decisions to a person.

The final two cases dealt with hate speech. In the first, the board overruled the removal of a post from a Facebook user in Myanmar that Facebook said violated its rules against hate speech for disparaging Muslims as psychologically inferior. While the board found the post "pejorative," taking into account the full context, it did not "advocate hatred or intentionally incite any form of imminent harm," the board said in its decision.

The board upheld Facebook's removal of another post for breaking hate-speech rules, however. It said the company was right to remove a post that used a slur against Azerbaijanis. "The context in which the term was used makes clear it was meant to dehumanize its target," the board said in its decision.

Brickjectivity

Veteran Member
Staff member
Premium Member
It won't be enough, and more people than I have begun to argue that the public must abandon this kind of centralized social network. I can't say I know exactly what to do, but it's already proven dangerous. It's already been used to manipulate. It's already sold user data that didn't belong to it.

Perhaps it should be illegal to run this kind of false internet page with a centralized database paid for through advertising alone. Perhaps a fee ought to be required, such as at least $1 per year, charged to users.
Perhaps countries should make it illegal to require real names on free social networks.
Perhaps the problem is that all of the security and data are owned by Facebook, and that data should remain in the hands of its users instead. That is technically feasible. There's no reason it has to be kept by Facebook, or that users can't secure their own data.
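To illustrate that last point, here is a toy sketch (in Python's standard library) of one way users could secure their own data: the user keeps a secret key on their own device, and the platform stores only the post text plus an integrity tag it cannot forge. All of the names here are hypothetical, for illustration only; this is not a description of any mechanism Facebook actually uses.

```python
import hashlib
import hmac
import secrets

# The secret key never leaves the user's device; the platform only
# ever sees posts and their tags, so it cannot forge or silently
# alter a post without the tampering being detectable.
user_key = secrets.token_bytes(32)

def sign_post(key: bytes, text: str) -> str:
    """Compute an HMAC-SHA256 tag locally before uploading a post."""
    return hmac.new(key, text.encode(), hashlib.sha256).hexdigest()

def verify_post(key: bytes, text: str, tag: str) -> bool:
    """Check that stored text still matches the tag the user created."""
    return hmac.compare_digest(sign_post(key, text), tag)

post = "Hello from my own data store"
tag = sign_post(user_key, post)

assert verify_post(user_key, post, tag)            # untampered post checks out
assert not verify_post(user_key, post + "!", tag)  # any edit is detected
```

This only covers integrity, not confidentiality or portability, but it shows the basic shift: the cryptographic material, and therefore the trust, lives with the user rather than the platform.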