YouTube Addresses Some Policy Misconceptions About Content

You’ve probably heard a few rumors about YouTube’s content policies. Perhaps someone’s told you that YouTube is only allowing videos of laughing babies and cute kittens to be uploaded on Fridays. Maybe you’ve read about a new rule demanding that all your comments on Justin Bieber’s videos declare undying love for the Canadian pop singer. Perhaps you’ve heard a rumor that YouTube pre-screens all videos before they can be viewed by the public.

While I can neither confirm nor deny the first two rumors (the Bieber myth appears to be totally legit, though), that last one is most certainly wrong. YouTube has dispelled this myth, and several others, in a new blog post.

With around 48 hours of video being uploaded to YouTube every minute (that’s almost eight years’ worth of content being added to the video-sharing service every day, fact fans), it’s impossible for the YouTube staff to review all of that content before it is made publicly available. That’s why the service relies on the community to flag material that may violate the Community Guidelines.

Here are some other policy myths which YouTube has cleared up:

If you repeatedly flag a video, YouTube will remove it.

Not true. Once a video has been flagged, YouTube reviews it; if it doesn’t violate the Community Guidelines, it stays up, no matter how many times it gets flagged. So if there’s a video you don’t like that has already gone through the flag-and-review process and been cleared, there’s not much else you can do about it, unless it violates your privacy (more on that in a bit).

If I flag a video, the person who uploaded it will know who I am.

Wrong. YouTube doesn’t reveal who you are when you flag content for a guidelines violation.

If I mistakenly flag content that I think violates the Community Guidelines, YouTube will delete my account.

Not necessarily. YouTube won’t delete your account for flagging content inaccurately, as long as you flag in good faith (though I’m not sure whether flagging a bunch of Bieber and Lady Gaga videos counts as flagging in “good faith”).

If my video’s removed for a guidelines violation, there’s nothing more I can do about it.

Nuh uh. There’s actually a video appeals process you can use to contest the removal of one of your videos if you disagree with the moderators’ decision.

There is no way to remove an embarrassing or otherwise sensitive video of me that someone else has posted.

This isn’t the case. While the video might not violate the Community Guidelines, you can submit a privacy removal request if a video contains your image or other personal information that was shared without your consent.

YouTube censors art.

This is a sticky situation, but YouTube says it supports free expression and wants to be a place where artists can freely showcase their work, even in videos that show a little nudity. YouTube does not normally allow nudity of a sexual nature, but if you make the artistic, scientific or educational context for the nudity clear in the title, description and tags, the video may remain on the service with an age restriction in place.

Adding unrelated tags to my video’s title and description is totally fine.

Not really. The tags feature is supposed to help users find relevant videos, so if you add a tag like “Lady Gaga” to a 5-second video of your toddler taking their first steps, your video might be removed due to misleading metadata. Keep your tags relevant to your video.
