Offensive material 'the cocaine which attracts users' to Facebook, TD tells social media giant

The company is outlining a number of actions that have been taken to the Oireachtas Communications Committee.

Update 3.28pm: Facebook has been accused of betraying its own standards.

The company is defending how it moderates offensive material on its platform to the Oireachtas Communications Committee.

It is addressing issues highlighted in a Channel 4 documentary that was broadcast in July and actions that have been implemented in response to the programme.

Facebook is denying that it turns a blind eye to disturbing material, claiming such content would damage its advertising revenue and that it has clear community standards.

However, Deputy Timmy Dooley says a 2016 memo by the company's vice-president, entitled "The Ugly", sets out why anything that achieves growth for Facebook is de facto good.

Mr Dooley said: "Material - even if it is ugly and it's disturbing, and it's bullying and it's death and destruction and bullying of children - it's de facto good because, effectively, it's the cocaine which attracts users to this material.

"And you, in one of these paragraphs, suggest that that material would be off-putting to people and that would be damaging to you. I put it to you that it is quite the opposite."

A senior staff member of Facebook has apologised to the committee after disturbing content was allowed to remain on the social media giant's platform.

Niamh Sweeney, head of public policy of Facebook Ireland, has told the committee that Facebook has removed the content including a video of a toddler being assaulted by an adult.

Two senior Facebook staff members are appearing before the committee to answer questions about the content moderation policy of violent and harmful content.

Deputy Timmy Dooley.

A documentary on Channel 4's Dispatches programme, Inside Facebook, used hidden camera footage to show how content moderation practices are taught and applied within the company's operations in Dublin.

It emerged in the undercover investigation that Facebook moderators were instructed not to remove extreme, abusive or graphic content from the platform even when it violated the company's guidelines.

This included violent videos involving assaults on children, racially charged hate speech and images of self-harm among under-age users.

Ms Sweeney said she and her colleagues were "upset" by what came to light in the programme.

She said: "Dispatches identified some areas where we have failed, and Siobhan (Cummiskey, head of content policy of Europe, the Middle East and Africa) and I are here to reiterate our apology for those failings.

"We should not be in this position and we want to reassure you that whenever failings are brought to our attention, we are committed to taking them seriously, addressing them in as swift and comprehensive a manner as possible."

Committee chair Hildegarde Naughton said it was essential for lawmakers to question Facebook following the July 17 broadcast of the documentary.

While nudity is almost always removed, violent videos involving assaults on children, racially charged hate speech and images of self-harm among under-age users remained on Facebook despite being reported by users and reviewed by the moderators.

An undercover reporter worked at CPL Resources in Dublin, Facebook's largest centre for Ireland and UK content.

For six weeks the reporter attended training sessions and filmed conversations in the offices.

One video showed a man punching or stamping on a screaming toddler.

The moderators marked it as disturbing, allowed it to remain online and used it as an example of acceptable content.

"If you start censoring too much then people stop using the platform. It's all about money at the end of the day," one moderator was filmed saying.

Ms Sweeney also admitted that disturbing content of violent assaults and racially-charged hate speech that was allowed to remain on its platform was a betrayal of Facebook's own standards.

She said that the social media giant was not aware that a video of a young toddler being assaulted by an adult was being used to show the type of content that was allowed to remain on its site.

Ms Sweeney went on to tell the committee that a claim made in the Channel 4 programme, which suggested that the social media giant turns a blind eye to disturbing content, was "categorically untrue".

"We are in the process of an internal investigation to understand why some actions taken by CPL were not reflective of our policies and the underlying values on which they are based," she said.

"Creating a safe environment where people from all over the world can share content is core to our business model."

Update 1.20pm: Facebook answering questions on how they moderate offensive material online

Facebook is answering questions about how they moderate offensive material on their platform.

The company is outlining a number of actions that have been taken to the Oireachtas Communications Committee.

It is after a Channel 4 documentary raised important questions about the social media giant's policies and processes.

Last month Channel 4 went undercover with workers who decide what can and can not be posted on Facebook.

The Dispatches crew used hidden camera footage to show how Facebook deals with abusive content, racist images and messages.

Today the social media giant is appearing before the communications committee to respond to issues around how they moderate 'violent and harmful' content.

In its opening statement to the committee, the company described action being taken in three key areas: restricting access for under-13s, tackling non-sexual child abuse, and increasing training and quality control for content moderators.

Niamh Sweeney, Head of Public Policy, said user safety is their top priority.

She said: "One of the claims made in the programme - that it is in our interest to turn a blind eye to controversial or disturbing content on our platform - is categorically untrue.

"Creating a safe environment where people all over the world can share and connect is core to our business model. If our services are not safe, people won't share with each other and over time will stop using them."

Facebook confirmed they are investing heavily in new technology to help them deal with problematic content and have doubled the number of people working on their safety and security teams to 20,000.

They have also started a consultation process with external child safety organisations and law enforcement including An Garda Síochána to review their policy regarding leaving videos online.

Update 11.35am: Facebook bosses due before Communications Committee to answer questions over handling of violent content

Representatives from Facebook are due before the Oireachtas Communications Committee shortly, to explain why the company failed to take down offensive material from its platform.

It follows the airing of a Channel 4 documentary, which showed secret footage of moderators being instructed not to remove abusive or graphic content from the website.

Committee Member Timmy Dooley says Facebook has a responsibility to maintain standards.

"Facebook, Google and these digital platforms are extremely powerful, have a phenomenal reach and by and large they have been beneficial to society, but that is not to suggest that there shouldn't be some level of regulation," he said.

Earlier: Facebook bosses to be questioned by Communications Committee over handling of violent content

The Oireachtas Communications Committee is set to discuss Facebook's handling of violent and harmful content today.

It will grill the social media site's executives after revelations from the Channel 4 documentary 'Inside Facebook'.

It used hidden camera footage to show how content moderation practices are taught and applied within the company's operations in Dublin.

Communications Committee Chairperson Deputy Hildegarde Naughton says members are worried about how Facebook deals with harmful content.

She said:

There is a lot of concern among members of our committee in relation to the procedures in place and particularly, how moderators within Facebook were trained in relation to leaving this very harmful content online in order to generate revenue for Facebook.

She also says all social media needs to be better regulated: "I think we need to look at social media right across the board, not just Facebook. It is the one area that has no regulation. They have been let self-regulate for many years, it is obviously not working. It does have an impact on our democracy and on the workings of different countries, freedom of speech."

Committee Member Timmy Dooley says the TV programme clearly illustrated unacceptable behaviour on Facebook's behalf.

He said:

What we saw was material being used to train the moderators. So you had somebody from CPL, who was working on behalf of Facebook, saying of this video of an adult beating a child senselessly: 'this kind of material is OK, so long as there isn't a tagline on it that says this is fun, you can do this.' Well, I don't think that's acceptable.

- Digital Desk
