
Facebook's former director of monetization says Facebook intentionally made its product as addictive as cigarettes — and now he fears it could cause 'civil war' (FB)

Facebook CEO Mark Zuckerberg. AP Photo/Andrew Harnik, File


Facebook's former director of monetization, Tim Kendall, says he had a role in making Facebook as addictive as cigarettes — and worries that Facebook could be just as damaging to its users.

In testimony before the House Consumer Protection and Commerce Subcommittee on Thursday, Kendall accused Facebook of building algorithms that have facilitated the spread of misinformation, encouraged divisive rhetoric, and laid the groundwork for a "mental health crisis."

"We took a page from Big Tobacco's playbook, working to make our offering addictive at the outset," Kendall said in prepared remarks submitted to lawmakers ahead of Thursday's hearing. "The social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding — at worst, I fear we are pushing ourselves to the brink of a civil war."

Kendall, who is now CEO of the time management app Moment, joined Facebook as its first director of monetization in 2006 and remained in the role until 2010. Kendall said he initially thought the job would involve balancing Facebook's interest in revenue with the wellbeing of its users, but that the company prioritized profits above all else.

"We sought to mine as much attention as humanly possible and turn into historically unprecedented profits," Kendall said.

Facebook's algorithm rewards shocking content and divisive rhetoric because the extreme emotional responses they provoke hold users' attention and generate more ad revenue, Kendall told lawmakers.

"These algorithms have brought out the worst in us. They've literally rewired our brains so that we're detached from reality and immersed in tribalism," he said.

Facebook did not immediately respond to Business Insider's request for comment on Kendall's testimony.

Kendall isn't the first former Facebook employee to raise concerns about the platform's capacity to sow division. A Facebook engineer quit in protest last month, accusing the company of "profiting off hate." More recently, a fired Facebook data scientist reportedly wrote a whistleblower memo accusing the company of failing to direct enough resources to fighting misinformation.

Facebook has also faced activist campaigns urging it to more robustly crack down on misinformation and hate speech. More than 1,000 companies joined an advertiser boycott of the platform this summer led by civil rights activists, and this month, influencers staged a day of protest over hate speech on Facebook and Instagram.

At Thursday's subcommittee hearing, lawmakers said the spread of misinformation on Facebook could be cause for future government regulation of social media platforms.

"Driven by profit and power and in the face of obvious harm, these mega-companies successfully have convinced governments all over the world to essentially leave them alone ... big tech has helped divide our nations and has stoked genocide in others," said Rep. Jan Schakowsky, an Illinois Democrat who chairs the House Consumer Protection and Commerce Subcommittee. 

Meanwhile, Republicans on the subcommittee focused primarily on claims of anti-conservative bias, a frequent talking point of President Donald Trump. They characterized social media platforms' occasional fact-checking of Trump's posts that violate their misinformation policies as censorship. While Trump has frequently railed against these fact-checks, Republicans have provided minimal evidence of broader censorship of conservative ideas.

"Free speech is increasingly under attack," said Rep. Cathy Rodgers of Washington, the ranking Republican on the subcommittee. "I am extremely concerned when platforms apply inconsistent content moderation policies for their own purposes ... there's no clearer example of a platform using its power for political purposes than Twitter, singling out President Trump."

Republicans and Democrats alike said they supported reforming Section 230, the law that shields social media platforms from legal liability for the content of users' posts. Attorney General William Barr announced Wednesday that the Department of Justice had urged Congress to amend the law, but did not immediately elaborate on how it should be changed.
