
YouTube executives ignored warnings, letting toxic videos run rampant

The Star Online

3rd April, 2019 10:34:57


A year ago, Susan Wojcicki was on stage to defend YouTube. Her company, hammered for months for fuelling falsehoods online, was reeling from another flare-up involving a conspiracy theory video about the Parkland, Florida high school shooting that suggested the victims were “crisis actors”.

 

Wojcicki, YouTube’s chief executive officer, is a reluctant public ambassador, but she was in Austin at the South by Southwest conference to unveil a solution that she hoped would help quell conspiracy theories: a tiny text box from websites like Wikipedia that would sit below videos that questioned well-established facts like the moon landing and link viewers to the truth.

 

 Wojcicki’s media behemoth, bent on overtaking television, is estimated to rake in sales of more than US$16bil (RM65.33bil) a year. But on that day, Wojcicki compared her video site to a different kind of institution.

 

“We’re really more like a library,” she said, staking out a familiar position as a defender of free speech. “There have always been controversies, if you look back at libraries.”

 

Since Wojcicki took the stage, prominent conspiracy theories on the platform – including one on child vaccinations; another tying Hillary Clinton to a Satanic cult – have drawn the ire of lawmakers eager to regulate technology companies. And YouTube is, a year later, even more associated with the darker parts of the Web.

 

Wojcicki would “never put her fingers on the scale”, said one person who worked for her. “Her view was, ‘My job is to run the company, not deal with this’.” This person, like others who spoke to Bloomberg News, asked not to be identified for fear of retaliation.

 

A YouTube spokeswoman contested the notion that Wojcicki is inattentive to these issues and that the company prioritises engagement above all else. Instead, the spokeswoman said the company has spent the last two years focused squarely on finding solutions for its content problems.

 

Since 2017, YouTube has recommended clips based on a metric called “responsibility”, which includes input from satisfaction surveys it shows after videos. YouTube declined to describe the metric more fully, but said it receives “millions” of survey responses each week.
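
YouTube has not said how that metric is computed. As a rough illustration only – the weights, signal names and scoring logic below are assumptions for this sketch, not YouTube’s formula – a recommendation score that blends engagement with survey-derived satisfaction might look something like this:

```python
from dataclasses import dataclass


@dataclass
class VideoSignals:
    """Hypothetical per-video signals a ranker might consider."""
    predicted_watch_minutes: float  # engagement proxy
    predicted_satisfaction: float   # 0-1, e.g. estimated from post-viewing surveys
    survey_response_count: int      # how many survey answers back that estimate


def responsibility_score(v: VideoSignals,
                         engagement_weight: float = 0.6,
                         satisfaction_weight: float = 0.4) -> float:
    """Blend raw engagement with survey-based satisfaction.

    A purely engagement-driven ranker would use watch time alone; a
    'responsibility'-style adjustment also folds in how satisfied viewers
    report being after watching.
    """
    # Shrink the satisfaction estimate toward neutral (0.5) when few survey
    # responses exist, so sparse data does not dominate the score.
    confidence = min(1.0, v.survey_response_count / 100.0)
    satisfaction = 0.5 + confidence * (v.predicted_satisfaction - 0.5)
    # Normalise watch time into a 0-1 range (capped at 30 minutes here).
    engagement = min(v.predicted_watch_minutes, 30.0) / 30.0
    return engagement_weight * engagement + satisfaction_weight * satisfaction


# Example: a clip that keeps people watching but leaves them unsatisfied
# scores lower than watch time alone would suggest.
clickbait = VideoSignals(predicted_watch_minutes=25.0,
                         predicted_satisfaction=0.2,
                         survey_response_count=400)
print(round(responsibility_score(clickbait), 3))
```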

 

“Our primary focus has been tackling some of the platform’s toughest content challenges,” a spokeswoman said in an emailed statement. “We’ve taken a number of significant steps, including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies – we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority.”

 

In response to criticism about prioritising growth over safety, Facebook Inc has proposed a dramatic shift in its core product. YouTube, by contrast, has still struggled to explain any new corporate vision to the public and investors – and sometimes, to its own staff. Five senior personnel who left YouTube and Google in the last two years privately cited the platform’s inability to tame extreme, disturbing videos as the reason for their departure. Within Google, YouTube’s inability to fix its problems has remained a major gripe.

 

YouTube’s inertia was illuminated again several weeks ago, when a deadly measles outbreak drew public attention to vaccination conspiracies on social media. New data from Moonshot CVE, a London-based firm that studies extremism, found that fewer than twenty YouTube channels that have spread these lies reached over 170 million viewers, many of whom were then recommended other videos laden with conspiracy theories.

 

Micah Schaffer, an early YouTube employee who helped write the site’s first content policies, said the company he knew would have treated such videos very differently. “We would have severely restricted them or banned them entirely,” Schaffer said. “YouTube should never have allowed dangerous conspiracy theories to become such a dominant part of the platform’s culture.”

 

Somewhere along the last decade, he added, YouTube prioritised chasing profits over the safety of its users. “We may have been haemorrhaging money,” he said. “But at least dogs riding skateboards never killed anyone.”

 

People inside YouTube knew about this dynamic. Over the years, there were many tortured debates about what to do with troublesome videos – those that don’t violate its content policies and so remain on the site. Some software engineers have nicknamed the problem “bad virality”.

 

Yonatan Zunger, a privacy engineer at Google, recalled a suggestion he made to YouTube staff before he left the company in 2016. At the time, videos either complied with the rules and stayed up or violated them and were taken down; he proposed a third tier: videos that were allowed to stay on YouTube but, because they were “close to the line” of the takedown policy, would be removed from recommendations. “Bad actors quickly get very good at understanding where the bright lines are and skating as close to those lines as possible,” Zunger said.
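
Zunger’s idea, as described, amounts to separating “is this allowed on the site” from “should the algorithm amplify it”. The sketch below only illustrates that three-tier split – the tier names, thresholds and violation score are invented for the example, not YouTube’s or Zunger’s actual design:

```python
from enum import Enum


class Tier(Enum):
    REMOVE = "violates policy: take the video down"
    BORDERLINE = "allowed to stay, but excluded from recommendations"
    RECOMMENDABLE = "allowed and eligible for recommendations"


def classify(policy_violation_score: float,
             removal_threshold: float = 0.9,
             borderline_threshold: float = 0.7) -> Tier:
    """Map a hypothetical policy-violation score (0-1) to a tier.

    The middle band captures videos 'close to the line': they remain on
    the site but are not amplified by the recommendation system.
    """
    if policy_violation_score >= removal_threshold:
        return Tier.REMOVE
    if policy_violation_score >= borderline_threshold:
        return Tier.BORDERLINE
    return Tier.RECOMMENDABLE


def recommendable(candidates: list[tuple[str, float]]) -> list[str]:
    """Filter (video_id, violation_score) pairs down to recommendable videos."""
    return [video_id for video_id, score in candidates
            if classify(score) is Tier.RECOMMENDABLE]


# A borderline video stays on the site but never enters recommendations.
print(classify(0.75))                                          # Tier.BORDERLINE
print(recommendable([("a", 0.2), ("b", 0.75), ("c", 0.95)]))   # ['a']
```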

 

His proposal, which went to the head of YouTube policy, was turned down. “I can say with a lot of confidence that they were deeply wrong,” he said. 

 

Rather than revamp its recommendation engine, YouTube doubled down. The neural network, described in a 2016 Google research paper, went into effect in YouTube recommendations starting in 2015. By the measures available, it has achieved its goal of keeping people on YouTube.

 

A YouTube spokeswoman said that, starting in late 2016, the company added a measure of “social responsibility” to its recommendation algorithm. The inputs to that measure include how many times people share a video and click its “like” and “dislike” buttons. But YouTube declined to share any more detail on the metric or its impacts.

 

Three days after Donald Trump was elected, Wojcicki convened her entire staff for their weekly meeting. One employee fretted aloud about the site’s most-watched election-related videos.

 

They were dominated by publishers like Breitbart News and Infowars, which were known for their outrage and provocation. Breitbart had a popular section called “black crime”.

 

The episode, according to a person in attendance, prompted widespread conversation but no immediate policy edicts. A spokeswoman declined to comment on the particular case, but said that “generally extreme content does not perform well on the platform”.

 

At that time, YouTube’s management was focused on a very different crisis. Its “creators”, the droves who upload videos to the site, were upset. Some grumbled about pay; others openly threatened to defect to rival sites.

 

Wojcicki and her lieutenants drew up a plan. YouTube called it Project Bean or, at times, “Boil The Ocean”, to indicate the enormity of the task. (Sometimes they called it BTO3 – a third dramatic overhaul for YouTube, after initiatives to boost mobile viewing and subscriptions.) The plan was to rewrite YouTube’s entire business model, according to three former senior staffers who worked on it.

 

It centred on a way to pay creators that wasn’t based on the ads their videos hosted. Instead, YouTube would pay based on engagement – how many viewers watched a video and how long they watched.
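
The article does not say how such payments would have been calculated. One simple scheme – a sketch only, with invented numbers, not the Project Bean design – would split a fixed revenue pool in proportion to each creator’s watch time:

```python
def engagement_payouts(watch_minutes_by_creator: dict[str, float],
                       revenue_pool: float) -> dict[str, float]:
    """Split a fixed revenue pool in proportion to each creator's watch time.

    Under a scheme like this, every extra minute watched translates directly
    into money for the uploader, regardless of which ads ran on the video.
    """
    total_minutes = sum(watch_minutes_by_creator.values())
    if total_minutes == 0:
        return {creator: 0.0 for creator in watch_minutes_by_creator}
    return {creator: revenue_pool * minutes / total_minutes
            for creator, minutes in watch_minutes_by_creator.items()}


# Example with made-up figures: a US$1,000 pool split across three channels.
print(engagement_payouts({"gaming": 600_000, "news": 300_000, "diy": 100_000},
                         revenue_pool=1_000.0))
```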

 

At the end of the year, fewer than twenty people were on the staff for “trust and safety”, the unit overseeing content policies, according to a former staffer. The team had to “fight tooth and nail” for more resources from the tech giant, this person said. A YouTube spokeswoman said that the division has grown “significantly” since but declined to share exact numbers.

 

In February of 2018, the video calling the Parkland shooting victims “crisis actors” went viral on YouTube’s trending page. Soon after, policy staff suggested limiting recommendations on the page to vetted news sources. YouTube management rejected the proposal, according to a person with knowledge of the event. The person didn’t know the reasoning behind the rejection, but noted that YouTube was then intent on accelerating viewing time for news-related videos.

 

However, YouTube did soon address its issues around news-related content. Last July, YouTube announced it would add links to Google News results inside of YouTube search, and began to feature “authoritative” sources, from established media outlets, in its news sections.

 

YouTube also gave US$25mil (RM102.08mil) in grants to news organisations making videos. In the last quarter of 2018, YouTube said it removed over 8.8 million videos for violating its guidelines. Those measures are meant to help bury troubling videos on its site, and the company now points to the efforts as a sign of its attention to its content problems.

 

Yet, in the past, YouTube actively dissuaded staff from being proactive. Lawyers verbally advised employees not assigned to handle moderation to avoid searching on their own for questionable videos, like viral lies about Supreme Court Justice Ruth Bader Ginsburg, according to one former executive upset by the practice. The person said the directive was never put in writing, but the message was clear: if YouTube knew these videos existed, its legal grounding grew thinner. Federal law shields YouTube, and other tech giants, from liability for the content on their sites, yet the companies risk losing the protections of this law if they take too active an editorial role.

 

Some employees still sought out these videos anyway. One telling moment happened around early 2018, according to two people familiar with it. An employee decided to create a new YouTube “vertical”, a category that the company uses to group its mountain of video footage. This person gathered together videos under an imagined vertical for the “alt-right”, the political ensemble loosely tied to Trump. Based on engagement, the hypothetical alt-right category sat alongside music, sports and gaming as one of the most popular categories at YouTube – an attempt to show how critical these videos were to YouTube’s business. A person familiar with the executive team said they do not recall seeing this experiment.

 

Still, as the company’s algorithms have continued to cause headaches, the knives have come out.

 

Some former staff fault Wojcicki, who inherited a business oriented toward netting more views and failed to shift its direction meaningfully. Others blame Robert Kyncl, YouTube’s business chief, who oversees creator relations and content moderation decisions. While Wojcicki and Neal Mohan, YouTube’s product head, have given several public addresses on content-related issues, Kyncl has been less vocal on the matter. Even so, the executive has made other public moves that are viewed by some inside Google as self-promotional. Last August, a week after a damning report on the prevalence of extremist videos on YouTube, he modelled a suit in an ad by luxury brand Brioni. That ad, released amid YouTube’s troubles, raised concerns about Kyncl’s priorities among several employees at Google, according to one person there. Representatives for the company and Kyncl declined to comment.

 

The company has been applying the fix Wojcicki proposed a year ago. YouTube said the information panels from Wikipedia and other sources, which Wojcicki debuted in Austin, are now shown “tens of millions of times a week”.

 

But YouTube appears to be applying the fix only sporadically. One of iHealthTube.com’s most popular videos isn’t about vaccines. It’s a seven-minute clip titled “Every cancer can be cured in weeks”. While YouTube said it no longer recommends the video to viewers, no Wikipedia panel appears on the page. The video has been viewed over 7 million times.

