The economic structure of algorithmic colonialism is analyzed most precisely through two related frameworks: data colonialism (Couldry and Mejias) and surveillance capitalism (Shoshana Zuboff).
Couldry and Mejias define data colonialism as dependent on the extraction and recombination of what they call 'terra nullius data': data treated as if it were unowned and available for appropriation, like the colonial doctrine of terra nullius that treated Indigenous land as empty and available. The key drivers of data colonialism, as they identify them, are private companies 'heavily dependent upon the accumulation of individual data, such as social media companies like Facebook, mobile telecommunications companies like AT&T and advertising technology specialists such as Google.' These companies extract behavioral data from billions of users globally, convert it into behavioral predictions and profiles, and sell those predictions to advertisers and other buyers, without any meaningful compensation to the data producers.
Zuboff's concept of surveillance capitalism describes a 'new logic of data accumulation' in which 'behavioral surplus,' the data collected beyond what is needed to improve a service, 'is monetized to predict and shape human behavior.' The asymmetry is structural: users provide behavioral data in exchange for free services, but the actual economic value created by that data flows to shareholders of corporations headquartered primarily in California.
For the Global South, the asymmetry is sharpened by the global digital infrastructure hierarchy. As Kwet and others have argued, the data extractivism of the internet regime is 'spreading like a new colonialism across the countries of the Global South,' creating 'heightened dependence' on Western technological infrastructures. Another line of analysis adds the China dimension: while the dominant framework focuses on US corporations, Chinese technology companies (Huawei, TikTok, Alibaba) are building parallel infrastructures that reproduce similar extraction dynamics with different political interests attached.
Scholars have added the concept of postcolonial differentials in algorithmic bias: the argument that algorithmic systems that are already biased against marginalized groups within Western societies produce even more severe and less well-documented harms when deployed in postcolonial contexts, where the affected populations have less political power to document, challenge, or seek redress for those harms.
Birhane and others make a related point about governance: 'As AI systems, largely created in the global north, transcend national borders and are integrated into every aspect of global society, the question of who gets to shape these systems becomes increasingly critical.' The governance structures for global AI, including standards bodies, regulatory frameworks, and the corporations that set default system parameters, are dominated by actors from a small number of wealthy countries, while the populations most affected by AI deployment often have the least input into its design and governance.
Another strand of analysis adds the drone example from the chapter 'Data colonialism, surveillance capitalism and drones': military drone technology developed for surveillance and targeting in the 'war on terror' is being repurposed for civilian surveillance and predictive policing in both Western and Global South contexts, with algorithmic systems trained on data from colonial military operations applied directly to domestic populations. The colonial origins of the surveillance apparatus are not metaphorical: the actual systems, data, and institutional knowledge flow from military uses in postcolonial contexts to civilian uses globally.
Quick reflection
Couldry and Mejias argue that data is being extracted from populations globally under a logic they call 'terra nullius data': treated as unowned and available for appropriation, just as colonial powers treated Indigenous land. Think carefully about the data you produce daily through your use of digital services. Do you understand what is collected, how it is used, and who profits from it? And if you do not, is that ignorance an individual failure of attention, a structural feature of platforms designed to obscure the terms of the exchange, or something closer to the colonial dynamic Couldry and Mejias describe, in which the extraction is normalized and the asymmetry of power makes meaningful consent impossible?