Journalists and researchers from around the globe came together to develop recommendations for stakeholders on how to improve online public dialogue.
DW Akademie’s Reclaiming Social Media initiative is a research and advocacy project that examines how media outlets and journalists in the Global South tackle challenges – brought about by social media – to constructive dialogue around public issues. The project has a twofold goal: to inspire more journalists to experiment, innovate and create new solutions, and to develop specific recommendations for stakeholders on how to improve online public dialogue.
In June 2023, DW Akademie brought project researchers and journalists from Africa, Asia, Europe, Latin America and Middle East/North Africa together with DW Akademie research and advocacy experts for a one-day workshop in Bonn to conclude the project’s research phase. It was a day of intense reflection and inspiring discussions, where the 17 participants developed suggested recommendations.
The ideas and opinions presented here are those of the participants only, and not necessarily those of the organizations they work for or represent. DW Akademie sees these proposed recommendations as inspiration for the project’s advocacy phase and as a starting point and basis for future consultations with various stakeholders.
The influence of social media algorithms on content visibility and user engagement has raised concerns within the digital rights community about the algorithms’ impact on public discourse and the prioritization of private interests over the public good.
These algorithms often favor emotional and harmful content, amplifying its reach to keep user engagement high and generate revenue. Many platforms have been criticized for putting business interests above the public interest in terms of information, attention and dialogue. In Myanmar, for example, Facebook’s algorithms have been accused of promoting content inciting violence and discrimination against the Rohingya people, exacerbating existing discrimination and increasing the risk of mass violence.
Paid content and paying users can also influence algorithmic visibility, potentially overshadowing issues that matter for online discussions about public affairs. Astroturfing is one example: artificial grassroots campaigns that appear genuine but are orchestrated and funded by organizations with specific agendas, hijacking public debates.
Algorithms determine the content that users see on social media platforms. These algorithms are also employed to moderate user comments and reactions, and automatically filter out certain types of content. There have been allegations of biased decisions made by algorithms that have led to the wrongful suspension of African Twitter users' accounts.
Human-determined data used to train algorithms may contain inherent biases. Disinformation campaigns in Africa have been identified as originating from foreign interests, diverting attention from local community topics.
One significant obstacle in addressing these issues is the platforms’ lack of transparency regarding their algorithms. Most platforms treat them as proprietary, making it difficult for users and independent researchers to understand how content is (or is not) prioritized and circulated.
This lack of transparency hinders the development of tools and initiatives that promote constructive dialogue on social media. Social media platforms have also increasingly limited the public’s access to their data and social graphs, thus prohibiting their use for research purposes.
Rather than becoming more open, these platforms have become more restrictive. Meta, for example, has greatly limited access to CrowdTangle, a tool that academics, watchdog organizations and journalists had relied on to expose disinformation.
In addition, X (formerly Twitter) has discontinued free access to its API. As a result, many initiatives aiming to enhance social media platforms encounter challenges in designing solutions because they depend on platform data (see case study example: Uli by Tattle).
Social media platforms, private actors and governments have a shared responsibility to establish and maintain a digital infrastructure that serves the public interest and upholds human rights (see the Guiding Principles on Business and Human Rights by the OHCHR). To bring about meaningful change, we recommend focusing on the following areas:
Social media platforms and private actors:
Open APIs (application programming interfaces) and open dashboards: Social media platforms and private actors need to prioritize transparency by providing open APIs for researchers and media professionals, enabling independent analysis and accountability regarding content visibility, algorithmic and human decision-making, and potential biases. Platforms should also provide open dashboards that are accessible to all and present data in a user-friendly way, so that citizens can access and understand the data that concerns them while their privacy remains protected.
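To make this recommendation more concrete, the sketch below shows the kind of independent analysis that such an open API could enable. The endpoint, parameters and response fields are hypothetical placeholders, since no platform currently offers an open, privacy-preserving interface of this kind; the example only illustrates the type of auditing that transparency would make possible.

```python
"""Minimal sketch of how a researcher might audit content visibility
through a hypothetical open platform API. All names below (base URL,
endpoint, parameters, response fields) are assumptions for illustration,
not an existing platform API."""
import requests

BASE_URL = "https://api.example-platform.org/v1"  # hypothetical open API


def fetch_visibility_stats(topic: str, country: str, days: int = 30) -> dict:
    """Request aggregate, privacy-preserving reach statistics for a topic."""
    response = requests.get(
        f"{BASE_URL}/visibility",
        params={"topic": topic, "country": country, "window_days": days},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


def amplification_ratio(stats: dict) -> float:
    """Compare algorithmically recommended reach with follower-only reach.

    A ratio well above 1 suggests the topic is being boosted by ranking,
    which researchers could then examine for bias or harm.
    """
    organic = stats.get("follower_reach", 0)
    recommended = stats.get("recommended_reach", 0)
    return recommended / organic if organic else float("inf")


if __name__ == "__main__":
    stats = fetch_visibility_stats(topic="election", country="KE")
    print(f"Amplification ratio: {amplification_ratio(stats):.2f}")
```

With open dashboards built on the same data, non-technical users could inspect these aggregate figures directly, without writing code.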
Transparent conditions and rules of engagement with external partners: Social media platforms must establish transparency regarding the conditions and rules of engagement with external partners. This fosters a better understanding of the platform's policies, content moderation practices and partnership dynamics; these are prerequisites for informed participation and constructive collaboration with the public, a platform’s advisory board, civil society organizations, media organizations and similar groups.
Data on government & foreign influence: Social media platforms need to regularly analyze and make available data on governmental and foreign influence in the information space of every country where they are active. By sharing insights on disinformation campaigns, foreign interference or other forms of manipulation, platforms can contribute to a better-informed public and support efforts to address and mitigate the impact of these activities.
Governments:
Transparency of algorithms and data: Governments should enact legislation that compels social media platforms to disclose their algorithms and data, while at the same time implementing robust measures to protect the privacy and information of activists and vulnerable groups. This approach ensures that users, researchers and regulatory bodies have a clear understanding of how content is managed on these platforms; it also minimizes the risk of algorithmic biases and safeguards the data of activists and members of potentially vulnerable communities.
Note from the authors:
Given that these recommendations were developed by just 17 participants within a short time frame, some aspects require further development. For example, we see a need to further explore the repercussions of disclosing data on governmental and foreign influence. Meta, for example, shares in its Transparency Center the number and kind of requests made per country, but this information is insufficient for the public to understand disinformation campaigns or other kinds of governmental or foreign interference. At the same time, we realize it is a challenge for platforms to publish exactly the information that would enable a better understanding. In upcoming consultations, we aim to discuss these recommendations with a broader range of stakeholders to refine and enhance them.
Social media discussions around news have become increasingly heated, polarized and, at times, violent. However, because their priority is to secure their organization’s viability, many media outlets struggle to allot enough time and resources to moderating user-generated content. As a result, they regulate content only partially, if at all. Media outlets’ use of social media therefore tends to be passive (or linear), focused primarily on distribution rather than on fostering and maintaining constructive dialogue. This insufficient and misdirected public dialogue poses a societal problem and hinders social inclusion, cohesion, problem-solving, empathy and understanding, as well as the democratic principle of open discourse.
Journalists from marginalized groups and of different genders frequently face online violence, with significant consequences for individuals as well as for society. Attacks on social media have prompted many journalists to quit, limit their online presence or refrain from discussing controversial topics.
In Latin America, for example, 68% of journalists interviewed said that after being attacked, they had posted less frequently, left social media or avoided certain topics (source: UNESCO). However, it is crucial that marginalized groups and journalists become part of the public dialogue; this ensures that diverse perspectives are represented, voices are heard, and systemic inequalities and injustices are effectively addressed.
Some governments, especially in the Global South, have made efforts to regulate social media spaces and have introduced cyber laws that – while often claiming the contrary – aim to silence critical voices online. In 2022 alone, authorities around the world shut down the internet 187 times across 35 countries to suppress citizens' voices (source: Access Now). These tendencies restrict freedom of expression and freedom of the media. When societal voices cannot be heard and journalists can no longer report freely on issues, this challenges not only individuals' freedom to form opinions but democracy as a whole.
Social media platforms, private actors and governments all play a crucial role in shaping the digital landscape, and each has a responsibility to ensure the protection of freedom of expression and human rights.
However, for meaningful change to come about, it’s essential to find a balance between protecting freedom of expression and regulating harmful content and behavior. Finding the right approach requires thoughtful consideration of the specific context and potential risks involved. Moderation approaches need to conform to human rights standards and the rule of law, as also outlined by the UN rapporteurs. For this, we recommend focusing on the following areas:
Social media platforms and private actors:
Establish local oversight boards: To ensure a more nuanced understanding of cultural contexts and sensitivities, social media platforms and private actors such as Facebook should establish local oversight boards. These boards should function similarly to the existing Oversight Board but with more powers and responsibilities, so that they can effectively address and moderate content based on local perspectives and concerns. The boards’ composition is important: to ensure credibility, impartiality and well-rounded decision-making, they should consist of independent members drawn from multiple stakeholder groups, including representatives from diverse backgrounds such as civil society, academia and human rights organizations.
Invest in local moderation teams that include local language experts: A priority for social media platforms and private actors should be to include local language experts with cultural expertise in their moderation teams. These experts can better understand and interpret content in regional languages, accurately assess its context, and make informed, unbiased decisions about whether content should be permitted or removed. This is vital for preventing misinterpretations and misjudgments. To do this effectively, social media platforms must staff these teams with enough members and allocate sufficient resources to support their operations.
Transparently disclose information about content moderation teams: For accountability, it is important that social media platforms and private actors are transparent about their moderation teams. This includes providing information about the backgrounds of moderators as well as the tasks and responsibilities assigned to them. Transparency builds trust, allows users to better understand the decision-making processes behind content moderation, and leads to greater accountability.
Focus on behavior rather than content: Rather than relying solely on automated content filters and decisions about individual pieces of content, social media platforms and private actors should shift their focus to patterns of conduct, such as harassment, hate speech or coordinated misinformation campaigns. By addressing harmful behaviors as part of an ongoing form of governance, platforms can create a safer environment while minimizing the potential biases and limitations of automated content moderation (see suggestion by Douek). This allows for more nuanced and adaptable regulation that addresses the root causes of misconduct on social media platforms, while respecting freedom of expression and diversity of opinion.
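As a simplified illustration of what a behavior-focused approach could mean in practice, the sketch below flags clusters of near-identical posts spread across many accounts within a short time window. It is a toy heuristic under assumed inputs, not any platform’s actual moderation system; real systems would combine many more signals with human review.

```python
"""Toy illustration of behavior-based (rather than content-based) flagging:
group posts by near-identical text within a time window and flag clusters
spread across many accounts as potentially coordinated. All thresholds and
data structures are assumptions for illustration only."""
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class Post:
    account_id: str
    text: str
    timestamp: int  # seconds since epoch


def normalize(text: str) -> str:
    """Crude normalization so trivially varied copies collapse together."""
    return " ".join(text.lower().split())


def flag_coordinated_clusters(posts: list[Post],
                              window_seconds: int = 3600,
                              min_accounts: int = 20) -> list[list[Post]]:
    """Return clusters of identical posts from many accounts posted close together."""
    buckets: dict[tuple[str, int], list[Post]] = defaultdict(list)
    for post in posts:
        # Bucket by normalized text and coarse time window.
        key = (normalize(post.text), post.timestamp // window_seconds)
        buckets[key].append(post)

    flagged = []
    for cluster in buckets.values():
        # Many distinct accounts posting the same text in the same window
        # is a behavioral signal, regardless of what the text says.
        if len({p.account_id for p in cluster}) >= min_accounts:
            flagged.append(cluster)
    return flagged
```

The point of the example is that the trigger is a pattern of conduct (many accounts, same message, short window), not a judgment about any single post, which is the shift in focus this recommendation describes.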
Governments:
Establish independent governance mechanisms: Governments should prioritize the establishment of independent control mechanisms, such as local and international oversight boards and moderation teams, instead of attempting to directly influence online dialogue. These independent structures should regulate and moderate social media content effectively while ensuring transparency, fairness and adherence to the guidelines and principles set by the oversight boards.
Note from the authors:
Given that these recommendations were developed by just 17 participants within a short time frame, some aspects require further development.
The following aspects need to be closely reviewed and discussed in upcoming consultations: Which entity/entities should moderate and regulate content? Who should advise on regulations? Which bodies should review and decide on submissions made to the platforms? And (how) can a database of moderation decisions such as the DSA Transparency Database, which collects decisions by social media platforms, be a useful tool for enhancing constructive content moderation and curation?
Internet access remains unevenly distributed worldwide. Some 57% of the global population (about 4.4 billion people) has internet access, meaning that nearly half of the global population does not. The level of connectivity varies significantly among the countries researched by this project. For instance, in the Central African Republic, Eritrea and Comoros, only 10% of the population has access to the internet. By contrast, Argentina boasts a connectivity rate of 91% and Ukraine one of 86%.
However, it is crucial to recognize that disparities exist – even in highly connected countries – based on factors such as gender, education, and rural versus urban settings. For instance, in Bangladesh, 95% of girls face obstacles such as not owning a mobile phone and having to borrow one without knowing whether the connection will work. Acquiring the essential skills to navigate the internet is also difficult. All of these hindrances prevent full participation in the digital world.
In many countries, the cost and reliability of internet services pose further challenges. Government taxes, such as Uganda's 12% tax on mobile data bundles introduced in 2021, are viewed as limiting freedom of expression.
Zero-rating policies enable users to access certain services and applications without this affecting their data allowance. While these policies appear to offer users affordable access, they raise concerns about net neutrality and limit citizens' ability to fact-check information outside of social media. This particularly affects lower-income groups who rely on social media for news, and who lack access to broader internet search capabilities.
In addition, most internet services and content are not available in local languages, which further limits inclusivity, constrains representation and impedes online participation.
Meaningful change in this area requires various stakeholders to focus on collaborative efforts and proactive measures. The following recommendations highlight key actions that can contribute to a more inclusive and connected digital landscape:
Social media platforms and private actors:
Represent minorities and underrepresented groups: Social media platforms and private actors should prioritize the inclusion of representatives from minorities and underrepresented groups in decision-making processes. This helps ensure that diverse perspectives are taken into account and that the concerns of marginalized communities are addressed.
Collaborate with local representatives, authorities and media organizations: Social media platforms and private actors should collaborate with local representatives, authorities and media organizations to bridge the gap between online and offline communities. This collaboration helps bring people affected by the digital divide (indirectly) into online discussions; local media and representatives could also summarize the debates for the public. Both approaches ensure that the voices of individuals are heard and considered.
Governments:
Ensure affordable and accessible internet access: Governments should make it a priority to ensure that their populations have affordable and easy access to the internet, and should prevent discrimination against groups such as low-income and rural populations in order to protect their right to freedom of expression. Governments should refrain from implementing regressive policies, such as taxes on internet usage, and from enacting laws that restrict people’s access to information and freedom of expression.
Fund alternative and decentralized models of social media: Governments should support innovation and the development of alternative and decentralized communication platforms. Initiatives like these should provide various options for access and participation in the digital world, including for those who live in underserved areas, and consider the varying connectivity situations of different communities.
Note from the authors:
Since these recommendations were discussed and developed by just 17 actors within a short time frame, some aspects may need further development to effectively close the digital divide. We look forward to additional input in upcoming consultations.
Marginalized groups and people of different gender identities, such as members of the LGBTQIA+ community, are particularly vulnerable to online attacks. While several regional bodies have adopted conventions on cybersecurity and personal data protection (e.g. the AU's Convention on Cyber Security and Personal Data Protection), enforcement has often targeted opposition politicians, journalists and human rights defenders. This raises concerns about the impartiality and fairness of current oversight mechanisms.
To effectively address these problems, better accountability measures need to be put in place to combat online attacks, protect user privacy and uphold the principles of equality and human rights. Strengthening oversight and accountability is crucial for ensuring a safer, more inclusive and more equitable digital environment for all users. It is essential to balance the protection of freedom of expression with the regulation of harmful content and behavior. To bring about meaningful change, we recommend focusing on the following areas:
Social media platforms and private actors:
Establish, fund and support local oversight boards with multidisciplinary expertise.
While this point was mentioned earlier on, it is equally relevant here. Please refer to the “Content moderation & curation” section for more details.
Create social media councils: Encourage and support the establishment of social media councils comprising representatives from civil society, academia and other relevant stakeholders. These councils can serve as a platform for dialogue and collaboration between the platforms, oversight boards and broader society.
Governments:
Government-mandated revenue allocation: To finance the operations of the oversight boards and local moderation teams, governments should consider establishing regulations that require social media platforms to allocate a specific percentage of their revenue to a dedicated fund.
Needs assessment by local oversight board: Before proposing any agenda or legislation, it is crucial to conduct local needs and risk assessments. These should be conducted by the local oversight boards to ensure that regulatory measures are tailored to address the specific challenges and priorities of the local community.
Legally binding and independent regulation: Regulation mandating revenue allocation should have a clear legal basis and include provisions expressly outlining the responsibilities, powers and independence of the oversight board.
Note from the authors:
Given that these recommendations were developed by just 17 participants within a short time frame, some aspects require further development.
Matters to address in upcoming consultations include whether social media councils and local oversight boards are both needed, and how they can work together most effectively. In addition, all suggested recommendations regarding oversight and accountability should be carefully adjusted to the regional context.
These recommendations were developed by journalists and researchers from five world regions who took part in a DW Akademie workshop together with DW Akademie research and advocacy experts. No legal or regulation experts were involved in this process. However, we look forward to receiving their input in upcoming consultations.
We also hope these recommendations will make a clear contribution to the complex debate on social media regulation and freedom of speech.
We see them as a basis for discussions in regional consultations where stakeholders from diverse backgrounds can come together to address the specific challenges and needs of their regions.
Through these consultations, stakeholders can jointly develop advocacy plans that encompass region-specific concerns, allowing for targeted and effective advocacy efforts. It is important to note that these recommendations cannot be used as they are for advocacy activities but need to be carefully tailored and adjusted to the regional context.
These recommendations should also be discussed in international consultations with organizations working on the right to freedom of expression and the right to access information. These international consultations, in turn, should incorporate the feedback from regional consultations.