Last month, Allison Joel got an email from YouTube saying the self-made video of her hoop dancing had been pulled from the mega video-sharing website. She had violated copyright law and YouTube’s terms of service by using “Derezzed” by Daft Punk as her background music. Someone had flagged the 23-year-old Albany woman’s clip, resulting in its removal less than a day after it was posted.
That’s not unusual. YouTube, a site where 35 hours of video are uploaded every minute, has long relied on its users to flag inappropriate content and has regularly pulled videos as a result.
Among the site’s enforcement priorities: youth violence. YouTube will confirm that one of the largest categories of pulled videos involves violent behavior, specifically uploads featuring children or teens. A video from last year of two girls fighting on the grounds of Shenendehowa High School, which was pulled from the site earlier this month, is just one of many examples of a video violating YouTube’s strict youth violence rules.
The 15-second clip had thousands of views. It was brought to the attention of YouTube, which pulled the clip and posted a message stating “its content violated YouTube’s Terms of Service.”
That doesn’t mean the fight can’t be found elsewhere on the Internet, but you won’t be seeing it on YouTube (the site has technology that blocks flagged content from being re-posted).
After a video is flagged, a YouTube employee views the segment to determine whether it violates the site’s terms of service. Every flagged video is reviewed, and then a decision is made.
Videos are removed daily, although YouTube keeps its operations information close, refusing to divulge how many are flagged (only saying that it’s “thousands” a day), the number removed or even the size of the review staff. The company won’t even speak in terms of percentages, or permit its spokeswoman to be quoted by name. The one thing YouTube will confirm: The review team is on hand 24 hours a day, every day of the year, and always has something to do.
Colonie’s Ed Fricken is one of the 490 million unique monthly YouTube users, according to Google Adplanner, and he’s also one of the many trying to keep the community honest, safe and operating within terms.
While Fricken, 24, does post videos, he’s on the opposite side of the process from Joel: he’s been the one flagging what he deems inappropriate content. He’s called out about a dozen uploads in the past six months or so, mostly entries that are “hateful and spiteful — usually against gays or anything racist,” he says.
He also targets comments. Many videos on the site have an option where viewers can offer feedback and talk about what they watched. Fricken will flag those, too, when they appear inappropriate.
The regular user is quick to add that he tries to be fair and objective in his flagging.
“I would never flag a video for posting their opinions about homosexuality, because everyone is entitled to their opinion,” he says. “But when people start getting nasty and saying things like ‘faggot’ or when people are extreme about it like the Westboro Baptist Church — there’s no need for that.”
Fricken has never gone back to see if any of the videos he’s flagged are removed (finding them is too cumbersome, he says), but he has noticed one or two of the comments he marked as offensive being taken down.
In the end, though, he says he understands YouTube has left it up to the community to police itself, and that the site probably has its own priorities. And those priorities may not line up with those of the people watching, filming — or caught on film.
Sometimes people featured in a video want it pulled because they’re embarrassed, or wish they hadn’t behaved in such a way (think “kegs and eggs”). Thing is, if they are in public and the video does not violate the site’s terms (youth violence, music copyright, hate speech), people don’t really have much leverage, says Tom Carr, a partner at Tully Rinckey, an Albany law firm.
“Once you are in a public place, you no longer have an expectation of privacy,” says Carr. “So anything you are doing is fair game for people to photograph or videotape.”
That doesn’t mean people who have no success with flagging won’t try to sue — because, as Carr puts it, “with the great American way, anyone can sue anybody for anything” — but they probably aren’t going to win.
Sometimes just the threat is enough to get someone to act. Say Joe is in a video shot by Jane. Joe is acting in a way that, once sober, he realizes is embarrassing and perhaps damaging to his reputation. Thing is, Joe was down on Pearl Street and Jane just happened to capture his antics on her phone. While legally Joe has no real grounds to sue, just the suggestion that he will — if Jane doesn’t pull the video — could be enough, says Carr.
People may not have the money to fight the accusation in court, says Carr, so they’ll pull the video just to make the whole thing go away.