Hardly a day goes by without the national or international press publishing negative stories about social networks. Facebook once again fails to keep the data of its more than two billion users under lock and key, or Mark Zuckerberg stands up yet another parliament that has asked him to appear in person. YouTube, acting to protect children, disables the comment function on many channels and deletes millions of old user comments. And Twitter is yet again misused to spread hate speech and fake news.
The strategies of digital media businesses have so far been designed to keep users active on their own platforms for as long as possible, but these strategies have run their course. Clickbait and emotional content – true or not – scarcely generate any informational added value, but they do help meet corporate objectives. Early on, it was charming pictures of cats or babies that went viral. Today, all too often, it is polarising opinion that raises the emotional stakes on social media. It is clear that some fundamental change must come. Beyond amending business strategies, corporate governance must also come under scrutiny. Interestingly enough, democratic power-sharing could point the way to novel reform paths here.
Current business strategies will have to change because negative reactions are already visible on three levels. First, users of social networks are rightly becoming much more sceptical about the platforms themselves and, in particular, the use of their data. Until recently it was hard to imagine the damage that tracking one's own behaviour on the net could cause. Supposedly, it was only ever about tailoring relevant advertising to individual users. But since the Cambridge Analytica scandal, at the very latest, it has become clear how accumulated data can be misused for malicious purposes. When psychometric profiles are built up in an attempt to manipulate users, the fun is well and truly over.
This growing scepticism among users is now combined with a political backlash. The polarisation of political discourse enabled by social networks is eroding social cohesion and is strongly suspected of having decisively influenced decisions such as the Brexit vote in Great Britain. Polarisation makes it ever harder to forge the political compromise on which many consensus-seeking systems of government are based. Long-term fragmentation may be one consequence, and political circles are working frantically on how to regulate against this danger.
While the first two reactions involve users and politicians, the third form of backlash puts pressure on the social networks themselves. Dubious content and dwindling user trust are prompting quite a number of advertisers to rethink their activities on the platforms concerned. YouTube's modification of the comment function, for instance, came as a direct reaction to big advertising partners such as Disney and Nestlé halting their advertising. And many advertising companies also view Facebook critically.
The chain of events is clear: if social media are increasingly shunned by the population, no business will want to be associated with them in the long term – especially when there is no guarantee about which content its own advert will be placed next to. With reduced advertising income, social media companies suffer from a weakened commercial base, and that can only be bad news in the medium to long term. A vicious circle emerges. It is therefore in the elementary interest of the digital companies concerned to do something about the current situation.
Some courses of action are already under way. Twitter, for example, is considering what other incentives it could place on its own platform if likes and follower numbers are downgraded. Facebook even wants to reorient the entire business towards personal messaging and groups. And companies like Apple and Google are also setting ever-greater store by protecting the private sphere. The iOS and Android operating systems now display the time spent using particular apps, partly with the goal of persuading users to curtail their activities on social networks. These first attempts to change old strategies are necessary. But they fall short because they ignore an important constitutional conflict rooted in the companies' private sector status.
Social media are, in the end, private sector companies with commensurate business models that perform a public function. It is no accident that social networks are often labelled the "marketplaces of the 21st century". The companies themselves promote this role by, for example, championing broad freedom of speech on their platforms while strictly avoiding taking sides themselves. But freedom of speech in fact relates to the public realm, not the private one. Anybody can create their own website and voice their opinion there. However, nobody is obliged to make their personal website available to anyone else as a platform. The same holds true for private sector companies.
If a private actor tries to replicate the public sphere, it allows problematic but legal content and thereby becomes associated with such content. This association arises above all because the private actor sets the house rules, and those rules are not congruent with the legal framework. On many social networks, for example, nudity or sexual imagery is not allowed although it is legal. If such legal content is penalised, then why not hate speech and rabble-rousing too? This is a big part of the reason why content policies never seem to fit and are always under attack.
What is interesting here is that, towards the end of last year, Mark Zuckerberg put forward an astute idea when he announced the creation of a "Facebook Supreme Court" where complaints against Facebook's content-related decisions could be lodged. His idea rested on three grounds. First, an external institution would take the final decision out of the company's hands. Second, this would create accountability and transparency. And, third, an independent body would ensure that decisions were not driven by commercial interests. Analytically, this is an attempt to resolve the aforementioned conflict between public function and private sector organisation by outsourcing certain functions.
This is the right approach because it starts with the corporate governance structure. But, in the end, it falls short: the problems of social networks are not just a question of implementing rules but of legitimating the rules themselves. If a kind of judiciary is outsourced in the form of a "Facebook Supreme Court", one might go further and consider adopting other elements of the division of powers found in democratic systems.
Uppermost in my thinking here is a possible legislative element that would involve users directly in drawing up the platform's rules. Up to now, standards have been determined by the executive – that is, the company itself. Sometimes there are external consultations to give the rules a broader basis, but the final decision falls to the company. Since there is simply no objective boundary between content that is legitimate and content that crosses an acceptable line, the rules need another basis for their legitimacy. In other words: since output legitimisation is impossible, input legitimisation should be strengthened, because what counts in the end is what users consider legitimate.
Of course, it is not easy to create such an element of user democracy. But, as Mark Zuckerberg rightly declared, there are no definitive solutions to the problems described above, only better and worse ways of dealing with them. If rules founded on a broader basis of legitimacy could also be better enforced – for example through greater use of artificial intelligence, more staff and bodies like an outside "Supreme Court" – this would be an important step forward. In interplay with sensible political regulation, the quality of the digital sphere could be significantly improved from within the companies themselves.
It is no accident that the democratically legitimised division of powers has prevailed as the organising principle of the public space. Here, too, there is often no objective right or wrong, only legitimate preferences. In a period in which private sector companies exercise public functions, transferring more elements of democratic statehood into their corporate governance could be a suitable instrument for tackling their problems of legitimacy. What is new about this approach is the attempt to improve the situation not for the user but with the user.
This article was first published in 2019 by the LSE Business Review