UK MPs said Facebook needs regulation as Zuckerberg ‘fails’


According to MPs, Facebook needs stricter regulation, with tough and urgent action necessary to end the spread of disinformation.

A Commons committee has concluded that the platform’s founder, Mark Zuckerberg, failed to show “leadership or personal responsibility” over fake news. Untrue stories from foreign powers were putting the UK’s democracy at risk, according to MPs.

Facebook has responded to the digital select committee’s report and said it would be open to “meaningful regulation”.

MPs said that what was needed to deal with the rapid spread of fake news online and the misuse of personal data was a “radical shift in the balance of power between social media platforms and the people”.

The 18-month inquiry into fake news was conducted by the Digital, Culture, Media and Sport Committee. A large part of the evidence focused on the business practices of Facebook before and after the Cambridge Analytica scandal.

In the Cambridge Analytica scandal, a political advertising firm had access to millions of users’ data, some of which was allegedly used to psychologically profile US voters. The data was collected via a personality quiz.

The report concluded: “Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day.”

It added: “The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.”

The report called for:

  • a compulsory code of ethics for tech companies, overseen by an independent regulator
  • the regulator to be given powers to launch legal action if companies breach the code
  • social media companies to be forced to take down known sources of harmful content, including proven sources of disinformation
  • the government to reform current electoral laws and rules on overseas involvement in UK elections
  • tech companies operating in the UK to be taxed to help fund the work of the Information Commissioner’s Office and any new regulator set up to oversee them

In response, Facebook said: “We share the committee’s concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence.

“We are open to meaningful regulation and support the committee’s recommendation for electoral law reform. But we’re not waiting. We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years. No other channel for political advertising is as transparent and offers the tools that we do.”

MPs said they had found it difficult to deal with Facebook during the inquiry, and chair Damian Collins had strong words for the social media platform and Mr Zuckerberg.

Damian Collins said: “We believe that in its evidence to the committee, Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.

“These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the ‘move fast and break things’ culture seems to be that it is better to apologise than ask permission.”

Mr Collins added: “Even if Mark Zuckerberg doesn’t believe he is accountable to the UK Parliament, he is to billions of Facebook users across the world.

“Evidence uncovered by my committee shows he still has questions to answer yet he’s continued to duck them, refusing to respond to our invitations directly or sending representatives who don’t have the right information.”

The report has recommended that clear legal liabilities should be established for tech companies to act against harmful or illegal content on their sites.

Facebook has repeatedly said it is committed to fighting fake news and works with more than 30 fact-checking companies around the world.
