YouTube Censoring

Rival

Diex Aie
Staff member
Premium Member
YouTube has an absurd number of words creators can't use in their videos, apparently because the ad companies don't like them. I don't understand this though. How is one meant to make a documentary video about certain topics without using words like 'suicide' or 'rape'? If you're making a documentary about, say, Jimmy Savile or Jim Jones, it's going to be a huge mess.

What's the issue here?
 

Orbit

I'm a planet
Rival said:
YouTube has an absurd number of words creators can't use in their videos, apparently because the ad companies don't like them. I don't understand this though. How is one meant to make a documentary video about certain topics without using words like 'suicide' or 'rape'? If you're making a documentary about, say, Jimmy Savile or Jim Jones, it's going to be a huge mess.

What's the issue here?
I'm not sure where you're running into that, because there are a ton of documentaries about Jim Jones on there, and of course they use the words "mass suicide". Maybe it's a country thing? I'm in the U.S. and I uploaded a video about femicide in Mexico for my students, and got a message saying my video was restricted to viewers outside of Mexico (a viewer from Mexico can't watch it). Dunno.
 

SalixIncendium

अहं ब्रह्मास्मि
Staff member
Premium Member
Orbit said:
I'm not sure where you're running into that, because there are a ton of documentaries about Jim Jones on there, and of course they use the words "mass suicide". Maybe it's a country thing? I'm in the U.S. and I uploaded a video about femicide in Mexico for my students, and got a message saying my video was restricted to viewers outside of Mexico (a viewer from Mexico can't watch it). Dunno.
This is a very good point. I think the UK may have stricter content-censorship laws than the US.
 

Rival

Diex Aie
Staff member
Premium Member
Orbit said:
I'm not sure where you're running into that, because there are a ton of documentaries about Jim Jones on there, and of course they use the words "mass suicide". Maybe it's a country thing? I'm in the U.S. and I uploaded a video about femicide in Mexico for my students, and got a message saying my video was restricted to viewers outside of Mexico (a viewer from Mexico can't watch it). Dunno.
I believe it happens at the beginning of videos, which is what the algorithm looks at: if the no-no words are said within the first 10 minutes or so, the algorithm won't pick the video up for monetisation. Some creators use these words after that mark; some, not wanting to offend the ad companies etc., censor them throughout.
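To make the mechanism being described a bit more concrete: this is only a guess at how such a filter might work (YouTube's actual system isn't public), and the word list, the 10-minute window, and the transcript format below are all assumptions, not anything documented.

    # Illustrative sketch only -- not YouTube's actual classifier.
    # Transcript is assumed to be (timestamp_in_seconds, text) pairs;
    # the flagged-word list and 10-minute window come from the guess above.
    FLAGGED_TERMS = {"suicide", "rape", "murder"}
    SCAN_WINDOW_SECONDS = 10 * 60


    def limits_ads(transcript):
        """Return True if a flagged term is spoken inside the scan window."""
        for timestamp, text in transcript:
            if timestamp > SCAN_WINDOW_SECONDS:
                break  # later mentions are ignored in this sketch
            words = {w.strip(".,!?\"'").lower() for w in text.split()}
            if words & FLAGGED_TERMS:
                return True
        return False


    # A mention at 4:30 would trip the check; the same line at 12:00 would not.
    print(limits_ads([(270, "The cult ended in a mass suicide.")]))  # True
    print(limits_ads([(720, "The cult ended in a mass suicide.")]))  # False

On that reading, bleeping the word only in the opening minutes would be enough to keep a video eligible for ads, which would explain why some channels censor only at the start and others do it throughout.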
 

Orbit

I'm a planet
Rival said:
I believe it happens at the beginning of videos, which is what the algorithm looks at: if the no-no words are said within the first 10 minutes or so, the algorithm won't pick the video up for monetisation. Some creators use these words after that mark; some, not wanting to offend the ad companies etc., censor them throughout.
I just went to youtube and searched the word "suicide" and a ton of videos came up. What happens when you do that?
 

Rival

Diex Aie
Staff member
Premium Member
Orbit said:
I just went to youtube and searched the word "suicide" and a ton of videos came up. What happens when you do that?
You can search for them; that's not the issue. It's that when the people in the videos say the word, they blank it out. The creators themselves are doing this. I'm surprised you've not come across it. If you go to shows such as LoreLodge, Stephanie Harlowe, Simon Whistler, etc., you'll come across this. It happens so often it's bothering me. It's apparently about the ads, and these ads are for US companies, not British ones, so it must be an American thing. I'm just trying to figure out why ad companies don't like it.

https://www.reddit.com/r/IntellectualDarkWeb/comments/140oto4
 

Dao Hao Now

Active Member
Same as @Orbit……
I’m in the U.S.; when I searched those terms on YouTube I got a multitude of hits.
When searching “causes of suicide” it did prioritize a link to a suicide crisis hotline and a warning that “the following results may be about suicide or self-harm”, but once acknowledged, it put up a multitude of hits.

It appears it may be local filtering in your area.
 

Orbit

I'm a planet
Rival said:
You can search for them; that's not the issue. It's that when the people in the videos say the word, they blank it out. The creators themselves are doing this. I'm surprised you've not come across it. If you go to shows such as LoreLodge, Stephanie Harlowe, Simon Whistler, etc., you'll come across this. It happens so often it's bothering me. It's apparently about the ads, and these ads are for US companies, not British ones, so it must be an American thing. I'm just trying to figure out why ad companies don't like it.
I guess I just haven't run across it, then. It sounds very strange. Here is what I found in their terms of service:

"On April 18, 2023, we updated our Eating disorders policy to better protect the community from sensitive content that may pose a risk to some audiences. We may remove imitable content, age-restrict content, or show a crisis resource panel on videos about eating disorders or self-harm topics.

At YouTube, we take the health and well-being of all our creators and viewers seriously. Awareness and understanding of mental health is important and we support creators sharing their stories, such as posting content discussing their experiences with depression, self-harm, eating disorders, or other mental health issues.


However, we do not allow content on YouTube that promotes suicide, self-harm, or eating disorders, that is intended to shock or disgust, or that poses a considerable risk to viewers.

Don't post the following content:

  • Content promoting or glorifying suicide, self-harm, or eating disorders
  • Instructions on how to die by suicide, engage in self-harm, or engage in eating disorders (including how to conceal them)
  • Content related to suicide, self-harm, or eating disorders that is targeted at minors
  • Graphic images of self-harm
  • Visuals of bodies of suicide victims unless blurred or covered so they are fully obscured
  • Videos showing the lead-up to a suicide, or suicide attempts and suicide rescue footage without sufficient context
  • Content showing participation in or instructions for suicide and self-harm challenges (e.g. Blue Whale or Momo challenges)
  • Suicide notes or letters without sufficient context
  • Content that features weight-based bullying in the context of eating disorders
In some cases we may restrict, rather than remove, suicide, self-harm, or eating disorder content if it meets one or more of the following criteria (for example, by placing an age-restriction, a warning, or a Crisis Resource Panel on the video). Please note this is not a complete list:

  • Content that is meant to be educational, documentary, scientific, or artistic
  • Content that is of public interest
  • Graphic content that is sufficiently blurred
  • Dramatizations or scripted content, which includes but is not limited to animations, video games, music videos, and clips from movies and shows
  • Detailed discussion of suicide or self-harm methods, locations and hotspots
  • Graphic descriptions of self-harm or suicide
  • Eating disorder recovery content that includes details which may be triggering to at-risk viewers"
 

Rival

Diex Aie
Staff member
Premium Member
Orbit said:
I guess I just haven't run across it, then. It sounds very strange. Here is what I found in their terms of service: ...
Yes, and in the main, they end up demonetised. 99.9% of these videos are educational documentaries, not glorifying anything. But the algorithm doesn't get this.
 

Rival

Diex Aie
Staff member
Premium Member
Dao Hao Now said:
Same as @Orbit……
I’m in the U.S.; when I searched those terms on YouTube I got a multitude of hits.
When searching “causes of suicide” it did prioritize a link to a suicide crisis hotline and a warning that “the following results may be about suicide or self-harm”, but once acknowledged, it put up a multitude of hits.

It appears it may be local filtering in your area.
It's not the search terms; it's the actual content in the videos that's being self-censored.
 

Heyo

Veteran Member
Rival said:
YouTube has an absurd number of words creators can't use in their videos, apparently because the ad companies don't like them. I don't understand this though. How is one meant to make a documentary video about certain topics without using words like 'suicide' or 'rape'? If you're making a documentary about, say, Jimmy Savile or Jim Jones, it's going to be a huge mess.

What's the issue here?
First, it isn't censorship. Content creators are still free to use these words.
Advertisers are free to choose which kind of content they want to promote by giving them money.

Why do advertisers not like some content?
Because viewers of certain content are most likely not in their target audience. You simply don't advertise all-electric cars on a right-wing podcast; that would be ill-spent money.

On YouTube you have aggregated content and aggregated ads, i.e. advertisers don't choose specific channels but buy time on random videos. YouTube simply anticipates what advertisers don't want to be associated with and excludes some content from YouTube ads.
 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
Rival said:
YouTube has an absurd number of words creators can't use in their videos, apparently because the ad companies don't like them. I don't understand this though. How is one meant to make a documentary video about certain topics without using words like 'suicide' or 'rape'? If you're making a documentary about, say, Jimmy Savile or Jim Jones, it's going to be a huge mess.

What's the issue here?
Advertisers are paranoid crazy gits.
 

Rival

Diex Aie
Staff member
Premium Member
Heyo said:
First, it isn't censorship. Content creators are still free to use these words.
Advertisers are free to choose which kind of content they want to promote by giving them money.

Why do advertisers not like some content?
Because viewers of certain content are most likely not in their target audience. You simply don't advertise all-electric cars on a right-wing podcast; that would be ill-spent money.

On YouTube you have aggregated content and aggregated ads, i.e. advertisers don't choose specific channels but buy time on random videos. YouTube simply anticipates what advertisers don't want to be associated with and excludes some content from YouTube ads.
Well, this is a bit of a gross overgeneralisation.

They aren't free to use those words if YT is their main source of income, so they are being censored unless they want to put hours of work into a video for no payoff. Nobody is going to do that.

Second, they are severely misunderstanding these groups. I'm right wing and don't mind electric anything. Most Europeans on the right don't care; what we care about is the cost and the effect on the economy, not the vehicles themselves. So they're demonetising channels because their algorithm doesn't understand nuance. This is YT's fault.

But censoring words like 'suicide', 'murder', 'abuse', and 'rape' is just absurd. We are not 4; there is a dedicated kids' section on YT.
 