By Daryna Sterina
Twitter, Facebook, or YouTube: for many people, these platforms are an important gateway to news and information. And as internet access continues to expand rapidly in countries around the world, these platforms, and the internet more broadly, have become vital tools in both the production and dissemination of independent news and reporting.
Yet, all too often internet governance policies and the policies of specific platforms are not designed with media development in mind. As a result, internet governance policy fails to take into account the needs and considerations of media outlets, many of which are increasingly reliant on digital platforms to engage their audiences. The consequences have been amplified this year, as the economic impacts of the COVID-19 pandemic imperil the ability of news outlets to stay afloat at the very moment that citizens increasingly turn to the internet for up-to-date news and information.
This is especially true in developing countries, where the pandemic has ushered in a moment of financial peril for many news outlets. In response, many independent news outlets in the Global South have taken steps to increase their digital presence and disseminate their content online. And while they have successfully increased their readership in the face of growing demand for high-quality information, they have not been able to translate that larger readership into higher revenues. For these outlets, making online publishing profitable is a matter of survival. As such, the need for an internet policy framework that supports a sustainable online media environment is more pressing than ever.
On November 6, the Dynamic Coalition on the Sustainability of Journalism and News Media (DC-Sustainability)—one of the 23 officially recognized working groups at the UN’s Internet Governance Forum (IGF)—convened virtually to discuss the intersection of news media sustainability and internet governance. The establishment of the Dynamic Coalition by the Global Forum for Media Development (GFMD), the Center for International Media Assistance (CIMA), the Committee to Protect Journalists (CPJ), Deutsche Welle Akademie, WAN-IFRA, and others was premised on the need for media development stakeholders to engage meaningfully in internet governance processes and debates.
The Dynamic Coalition’s online session featured the launch of its inaugural annual report—a compilation of research and analysis that articulates the challenges news media face vis-à-vis internet governance and offers policy solutions. Moderators Daniel O’Maley (CIMA) and Courtney C. Radsch (CPJ) opened the floor for media experts and researchers who contributed to the report to discuss the problems of today’s internet.
Algorithmic Removal of Content
The event kicked off with a case study from Tanja Maksic of BIRN Serbia, an independent investigative journalism news site based in Belgrade, exploring the algorithmic removal of political content and information in Serbia. When BIRN Serbia had its content removed—two YouTube videos meant to accompany an investigative article—the news outlet was unable to convince YouTube to restore it until one of its larger international partners, Reporters Without Borders, got involved. Given these challenges, Maksic stressed the need for more partnerships that bring media outlets that publish non-English content or operate in smaller media markets together with policymakers and the tech companies that build and run these algorithms. Since the policy frameworks for addressing issues like content removal are not yet fully developed, Maksic advocated for including these types of media organizations in discussions of potential solutions.
The problem of algorithmic removal of media content is not unique to Serbia. Fiona Nzingo from RNW Media’s Love Matters Global Network explained how Facebook’s removal of advertisements about sexual and reproductive health and rights affects young people in countries where access to such information is denied in most other offline contexts. Love Matters has teams working in India, Mexico, Kenya, Nigeria, the Democratic Republic of Congo, Egypt, and China. Facebook’s miscategorization of educational information on sexual health as pornographic content has reduced access to sexual education. The network tried to combat the algorithmic removal of its ads by using synonyms, avoiding certain images, and writing in local slang. Nonetheless, user access to Love Matters’s content continued to decrease. Love Matters’s experiences highlight the ways in which content moderation can restrict access to information on culturally sensitive topics, even when it serves an important educational purpose.
A Call for More Tech Company Transparency
The need for greater transparency goes beyond the already critical issue of content removal. According to Ellery Biddle, editorial director of Ranking Digital Rights, the threat that big tech platforms pose to democracy and human rights centers on their business model. Biddle noted that content-shaping algorithms determine what content users get to see while allowing companies to increase profits by collecting personal data for targeted ads. These algorithms allow companies to build extensive digital profiles of users, which anyone who buys ads on the platform can use to target users with misleading content.
To combat this, Biddle made specific US policy recommendations that would regulate this business model and require more transparency in online advertising, especially political advertising. Biddle also called for companies to put in place rules that prevent the manipulation of these targeted advertising systems, and to institute transparency practices that allow users to see who is influencing the content they see on these platforms, and why it is aimed at them.
Some efforts to create a more transparent online media environment have had unintended effects. For instance, research by Radsch (CPJ) found that social media platforms’ inconsistent use of labels to denote state-funded or state-controlled media potentially harms media sustainability. Google’s labeling of state-funded and publicly funded media is inconsistent and often based on information pulled from Wikipedia. And while other platforms, like Facebook, obtain information through a review of an outlet’s company documents, including budgets and editorial standards, their determinations can still be subjective and inconsistent.
One example of the challenges emerging from labeling is Facebook applying its “state-controlled” label, meant to identify outlets that are partly or totally under a local or foreign government’s editorial control, to the outlets belonging to the media company Maffick. Maffick, which runs the Facebook Pages for online media outlets including In the NOW (which has nearly five million followers), Waste-Ed, and Soapbox, was labeled “Russian state-controlled media” in the wake of reports that the company had received significant funding from a subsidiary of RT, a Russian state-funded media organization. Maffick responded by filing a lawsuit claiming that the label was inaccurate and defamatory. This incident demonstrates the complicated politics of labeling media outlets. This issue is also compounded by the fact that the impact these policies will have on media remains unclear, since, as Radsch pointed out, the tech platforms do not collect data on how labeled media outlets are affected or whether the labels actually succeed in educating audiences.
Supporting Financial Sustainability for Online Media
These issues highlight the immense power tech companies and social media platforms hold over the digital media sphere. Regulating these companies will be crucial to supporting the financial viability of online media. Michael J. Oghia from GFMD pointed out that finding and supporting sustainable funding models needs to be the priority for media’s continued survival.
Similarly, Olaf Steenfadt from Reporters Without Borders spoke about how the EU Digital Services Act (DSA) presents a window of opportunity. The DSA is a legislative package aimed at modernizing the EU’s e-Commerce Directive. The DSA would pave the way for transnational regulation of tech companies, which, according to Steenfadt, is crucial to supporting digital media sustainability. Steenfadt noted that this move, which breaks from the EU’s tradition of non-interference in national media policies, potentially signals the development of a European cross-border approach to fostering a financially sustainable and pluralistic media environment. He stressed that the implementation of legal obligations for tech platforms—ones that prioritize users’ safety and support the development of businesses that foster digital innovation—is an important solution to the current issues in internet governance.
Conclusion
All speakers agreed that efforts to improve the digital media ecosystem must take into account the needs and considerations of developing countries and countries undergoing democratic reforms. In particular, Nzingo from the Love Matters Global Network highlighted that the problems of censorship and algorithmic content removal are shaped by Facebook’s interpretation of cultural norms and its understanding of different regions, which often leads to the removal of information that is important for marginalized groups. Likewise, Maksic from BIRN Serbia pointed out that large tech companies are not sufficiently attentive to the needs and contexts of smaller emerging markets.
The main takeaway of this session of the DC-Sustainability is simple: global problems require global solutions. The issues highlighted in the report all speak to the clear need for cross-border policy to build a stronger digital media environment. To that end, as social media platforms play an increasingly significant role in audiences’ news consumption habits, there is also a need to regulate the ways in which the tech companies operating these platforms handle news content. It is crucial, however, that such policies and regulations also take into account the voices of media markets in the developing world, which are often underrepresented in global debates. Gatherings like the DC-Sustainability session are just a first step in flagging issues and identifying opportunities. It is abundantly clear that there is more work to be done in bringing together a wide array of stakeholders, particularly those from the Global South, to discuss, and eventually implement, international policy solutions in support of a vibrant online media ecosystem.
Daryna Sterina is a recent graduate of the master’s program in International Conflict Analysis at the University of Kent and currently works as the advocacy and engagement intern at the Global Forum for Media Development (GFMD).