A new book, Information Ecosystems and Troubled Democracy: The State of Knowledge on News Media, AI, and Data Governance, published by Nordicom at the University of Gothenburg, explores how information ecosystems are shaped by conditions, resources, and power relations in specific countries and regions – and why understanding these dynamics is essential to building policies and practices that support equity and justice in the digital age. Kristin Clay, manuscript editor at Nordicom, spoke with co-author Gyan Prakash Tripathi about the motivations for taking a global approach and highlighting the differences and inequities between different parts of the world when it comes to equitable data governance and the knowledge production needed to support it.
Kristin Clay (KC): We were so pleased at Nordicom when we were approached to publish this book. Building on a previously published report and your group's extensive efforts to spread this research and promote fruitful discussions, we were eager to help bring it to a wider audience.
The motivation of this research synthesis was to take a global approach; however, it is acknowledged that even here, the corpus of sources favours the Global North. What actions should be taken to improve production of and access to knowledge and information in the Global Majority world?
Gyan Prakash Tripathi (GPT): Thank you, Kristin. We are also very glad that Nordicom agreed to publish the book and make it Open Access to facilitate wider dissemination, beyond just well-resourced academic networks and universities.
In this book, we demonstrate how extractive research governance moves knowledge out of communities, while benefits rarely return. We show that the bottleneck is less “lack of research” and more “lack of visibility”. We realised this in the early stages of our research and took corrective measures by launching a second call for research focusing specifically on Global Majority researchers and publications. Based on our experience leading up to this book and my perspectives from India, I would recommend targeted emphasis on the following three things:
First, discovery: Publicly funded critical studies must be released with persistent identifiers and machine-readable metadata. They will also benefit from being aggregated in local repositories and federated into a national discovery layer that connects to global indices.
Second, translation as infrastructure: We must treat human (and automated) translation as a public service to ensure that work in under-resourced languages travels across regions and is more accessible to researchers globally.
Third, increased incentives to generate local research: Academic promotions, funding awards, and grants should credit datasets, software, and policy briefs, not just journal articles and academic literature.
These steps can help in rebalancing epistemic authority toward Global Majority authors, journals, and publics.
KC: The book highlights that governance decisions are often established by Global North governments, and rule-setting by (mostly United States–based) Big Tech can be considered “digital imperialism”. What (perhaps unintended) consequences might this have for regions in the Global Majority world that have different realities than Global North countries?
GPT: The most direct but silent harm is compliance overfit. When frameworks tuned for developed, English-dominant contexts become global templates, local regulators attempt to mirror them, importing obligations that assume capacities they often do not have.
However, this makes things relatively easier for platforms – they ship one-size governance globally, imposing identical models ill-suited to informal economies, deploying safety systems that often fail in low-resource languages, and systematically depressing the visibility of local media in favour of sources ranking higher in their taxonomies. These design choices end up locking users as well as regulators into unfamiliar ecosystems and toolchains, gradually increasing switching costs.
We show that these outcomes are structural features of the current model, not accidental byproducts or temporary anomalies. This also means that they are tough to remedy. One of the few ways to avoid them could be to mandate contextual-adaptation impact assessments and empower a designated local authority to pause or prohibit rollouts of platform policies or features that create exclusion or security risks at the community or country level. This would correct one-size-fits-all governance and restore local agency.
KC: In the preface to the book, Robin Mansell poses the question: “What can be done to foster online information and communication spaces that respect human rights, do no harm, and underpin justice for all?” In the context of the Global Majority World, what are your thoughts about this?
GPT: This is indeed one of the key questions we are left with after the first research cycle of the Observatory.
Through this book, we argue that rights online must be measurable and not aspirational. In the context of the Global Majority World, I would treat high-reach services as critical information infrastructure requiring a rights-based “service-level objective” for core protections. This could include safeguards through encryption availability, timely appeals, low false-positive rates across key local languages, and potentially a crisis-mode cap on amplification. These targets must be independently audited, with deterrent penalties for non-compliance.
KC: We increasingly rely on Big Tech companies and their services in everyday life, but the benefits come with risks and harms for the excluded and disadvantaged. What can be done to combat these risks?
GPT: A very clear lesson from this global meta-research is the lack of resistance to data creation. We seem to have internalised the data-led economy and the narrative of datafication as inevitable progress, rather than as a political choice that can be contested, constrained, or refused.
We show that risk often concentrates at the intersection of affordability, language, and precarity. Therefore, the first step should be to raise the safety baseline before scale – mandating “safety floors” validated first in low-resource languages, and requiring graceful degradation so that features default to privacy-preserving, low-bandwidth modes rather than excluding users en masse.
There must be consent ceilings for essential services, with consent collected in an informed manner. One route through existing mechanisms is via consumer authorities, ensuring that the “consumers” of these technology corporations and their services are protected. There is an urgent need to divert public resources towards public-interest services instead of engagement-maximising feeds. We also show that, used correctly, procurement can be a powerful lever. By procuring public-interest technologies and investing in them, governments can lead the way towards the adoption of alternative digital infrastructure, mitigating current risks.
KC: The book emphasises the distorted picture of information ecosystems fostered by the predominance of research from and about the Global North, and the urgency of working towards epistemic justice–oriented research. What are some first steps we can take as we seek a more equitable future?
GPT: As a first step, we emphasise strengthening shared infrastructure that broadens access rather than creating new gatekeepers to restrict it. It is pertinent to build a federated commons that connects regional repositories through a shared metadata backbone and aligns them with researcher-identity systems, so that outputs from Global Majority institutions surface in global discovery tools. Epistemic justice begins not just with recognising scholarship from Global Majority countries and researchers, but also with valuing the people who make knowledge possible.
It is crucial to build community benefit agreements into every protocol, including access to results in local languages and resources for community-run repositories that are now being established as a resistance move against large corporations that only engage in extractive practices. However, this is only a stepping stone: Real change requires formal recognition and adequate resourcing of research labour that shifts the agenda and reduces dependence on the Global North.
It is also important to realise that these are embedded flaws in our current systems and the way they are designed. It will require, as Camille Grenier, ED of the Forum, puts it, “more research, more data, committed policymakers, and a global movement to push those who can act to defend democracy and the information and communication space”.
KC: Thank you very much for going into depth about some of these crucial aspects of the book – and the state of global research and data governance. And we’re happy to say the book is now available completely Open Access and will be freely downloadable in PDF and accessible ePub formats, in addition to being available as a print version to buy at a non-profit cost-covering price.
––––––––
For further reading, visit NordMedia Network for an interview with lead author Robin Mansell and Nordicom’s communication officer, Mia Jonsson Lindell.