Facebook and Google should audit algorithms that boost fake news, say UK Lords


A UK report makes 45 recommendations for dealing with misinformation.

John Lamb / Getty

Governments around the world have been left reeling by the coronavirus pandemic, not only from the devastating effects of the virus itself but from the misinformation that has accompanied it. How best to deal with the spread of false information is the subject of global debate, and specifically just how much responsibility should fall on the technology platforms that host it.

In the UK, the House of Lords Democracy and Digital Technologies Committee published a report on Monday containing 45 recommendations for the UK government to take action against the “epidemic of misinformation” and disinformation which, unless the threat is taken seriously, will weaken democracy and render it an “irrelevance.”

During the outbreak, the dangers of misinformation and disinformation have taken on a new urgency as conspiracy theories thrive on online platforms. The worst of these have directly endangered people’s health by falsely endorsing dangerous treatments or by discouraging people from taking precautions against the virus. Across Europe, telecommunications infrastructure has even come under attack after COVID-19 was incorrectly linked to 5G.

The report investigates the ways false information has spread during the outbreak, and warns that misinformation is a crisis “with roots that are likely to spread more deeply, and last longer, than COVID-19.”

In a statement, committee chairman David Puttnam said: “We are going through a period in which trust is collapsing. People no longer believe they can trust the information they receive or believe what they have been told. That is absolutely corrosive for democracy.”

Among the recommendations are major calls to hold the large platforms, especially Google and Facebook, responsible for their “black box” algorithms, which control the content shown to users. The report says these companies have hidden behind the decisions made in shaping and training those algorithms.



The report states that companies should be obliged to conduct audits of their algorithms so that steps can be taken to protect against harms such as discrimination. It also calls for greater transparency from digital platforms about content decisions, so that people have a clearer understanding of the rules governing online debate.

Facebook and Google did not immediately respond to requests for comment.

Regulation: Online Harms Bill

One of the report’s primary recommendations is for the UK government to publish its draft Online Harms Bill immediately. The bill would regulate digital platforms such as Google and Facebook, holding them responsible for harmful content and punishing them when they fail to meet their obligations.

Progress on the bill, which began with a white paper published in May 2019, has been slow: the government’s initial response was published in February this year, and the full response, which should have appeared over the summer, has been delayed until the end of the year.

The government was unable to confirm to the committee whether it would bring a draft bill before Parliament by the end of 2021. As a result, the report states, the bill may not come into force until the end of 2023 or 2024. During a briefing ahead of the report’s publication, Lord Puttnam described the delay as “unforgivable.”

“The challenges are moving faster than the government, and that gap is getting bigger,” he said. “Far from catching up, we’re actually slipping back.”

The report details the ways in which Ofcom, the designated online harms regulator, should be able to hold companies accountable under the law. It should have the power to fine digital companies up to 4 percent of their global turnover, or to compel ISPs to block serial offenders, it says.

Online platforms are “not inherently ungovernable,” it says, urging the government “not to flinch in the face of the inevitable and powerful lobbying of Big Tech.”

The report looks in particular at the recent case in which Twitter chose to hide some of President Donald Trump’s tweets that violated its policies, and it criticizes Facebook’s decision not to follow suit. Lord Puttnam said Twitter CEO Jack Dorsey had left Facebook badly wrong-footed.

The story isn’t over yet, he said, but he was optimistic that Twitter’s decision to take action against the president when he violated the platform’s rules could have an impact.

“There is an understanding that these big companies look at each other, and when one makes a shift in a sensible direction, the others feel a lot of pressure to make a similar shift,” he said.

There have been many efforts across Europe to put pressure on Big Tech, not only to crack down on fake news but also to pay more tax and to change its practices through antitrust rulings and privacy regulation. How successful these efforts have been so far is a matter of debate, but Lord Puttnam and the other committee members are hopeful that positive change will eventually come to the tech industry.

If the government, which now has two months to respond to the report, accepts the committee’s recommendations, they believe there is a chance that technology can support democracy rather than undermine it, and can help restore public trust.
