Massive public pressure has often been necessary to make Facebook change its mind. In the debate over hate speech, Federal Minister of Justice Maas had to get involved – and enforcement of the community standards is still rather sloppy. Likewise, it took a fierce public debate to convince Facebook that historical documents such as the photo of nine-year-old Kim Phuc during the Vietnam War should not simply be classified as “nudity” and deleted.
Facebook is now facing a similar debate over fake news. Critics are convinced that, by spreading falsehoods, the network contributed to Donald Trump becoming the new US president. BuzzFeed has tallied that the 20 most successful fake news stories about the presidential election generated more user interactions on Facebook than the 20 most successful (and serious) articles from major outlets such as the New York Times or the Washington Post.
It takes a certain chutzpah to reject the accusation wholesale, as Facebook CEO Mark Zuckerberg did last week at the Techonomy conference: “Personally, I think the idea that fake news on Facebook – which is only a very small share of the content – influenced the election in any way is a pretty crazy idea.” Evidently not all employees see it that way: some have set up a task force to discuss their own company’s responsibility.
Evidently, Zuckerberg has shifted his position within a few days: early this morning, in a post of his own on Facebook, he announced concrete steps being worked on. “We have to be careful not to discourage people from sharing their opinions,” Zuckerberg writes. He also wants to prevent the accidental deletion of content that is not fake news – and then lists what is being worked on.
Facebook wants to build better automated systems to recognize fake news: “This means better systems that detect what content people will report, even before they do it themselves.” This first point is probably also the most debatable. Advertisers and a large share of users seem satisfied, for example, with the news feed algorithm that determines which content we see on the homepage. And yet we are still debating the social and political implications of that technology. Would even more algorithms really help solve problems caused by algorithms?
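Zuckerberg gives no details about how such detection would work. As an illustration only, a system of this kind could start as a simple text scorer trained on posts that users previously reported – everything below, including the signal words and weights, is invented for the sketch, not Facebook’s actual approach:

```javascript
// Illustrative sketch only: a naive scorer that estimates how likely a post
// is to be reported, based on signal words seen in previously reported posts.
// The word list and weights are invented; a real system would be trained on
// large volumes of labeled report data, not a hand-written table.
const FLAG_SIGNALS = new Map([
  ["shocking", 0.3],
  ["hoax", 0.4],
  ["cure", 0.25],
  ["secret", 0.2],
  ["miracle", 0.35],
]);

function flagScore(text) {
  const words = text.toLowerCase().split(/\W+/);
  let score = 0;
  for (const w of words) {
    score += FLAG_SIGNALS.get(w) || 0;
  }
  return Math.min(score, 1); // clamp to [0, 1]
}

function likelyToBeFlagged(text, threshold = 0.5) {
  return flagScore(text) >= threshold;
}
```

Even this toy version shows where the debate starts: the thresholds and signals are design decisions with social consequences, made invisibly inside the system.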
According to Zuckerberg, reporting functions will also become more user-friendly. And Facebook will enlist the help both of journalists, to learn from their craft of source verification, and of external fact-checking organizations. Internationally, these include the Snopes team; in the German-speaking world, the makers of Mimikama have been doing this work for a long time.
One idea under consideration at Facebook is to mark dubious posts with warnings. A team of students has shown in recent days that such a system can be implemented as a browser add-on within a few hours. Facebook also wants to use its power in the advertising market to deprive the makers of fake news of their business model.
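The article does not describe how the students’ add-on works. As a rough sketch under assumptions, such a warning system could check the domains of linked articles against a curated list of known hoax sites and label matching posts – the domain list and the DOM selector below are invented for illustration:

```javascript
// Illustrative sketch of a warning add-on: flags links whose domain appears
// on a curated hoax list. The HOAX_DOMAINS entries and the "[data-post] a"
// selector are invented; a real extension would maintain the list externally
// and match Facebook's actual page markup.
const HOAX_DOMAINS = new Set(["example-hoax-news.com", "totally-real-news.net"]);

function isDubiousLink(url) {
  try {
    const host = new URL(url).hostname.replace(/^www\./, "");
    return HOAX_DOMAINS.has(host);
  } catch (e) {
    return false; // not a parseable URL
  }
}

// Content-script part: only runs in a browser context.
if (typeof document !== "undefined") {
  for (const link of document.querySelectorAll("[data-post] a")) {
    if (isDubiousLink(link.href)) {
      const warning = document.createElement("span");
      warning.textContent = " ⚠ disputed source";
      link.after(warning);
    }
  }
}
```

That a hackathon team can sketch this in hours underlines the article’s point: the hard part is not the code but curating the list and deciding what counts as “dubious.”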
However long it has taken, it is important that Facebook is now debating its own responsibility. The network has become too important for society – and, left unchecked, too dangerous for democracy – to keep arguing that it is merely an infrastructure provider with nothing to do with what happens on it.