Bridging digital divides: We are proud partners of the CDH Social Data School 2021

By Hugo Leal

In the data-driven age, we believe it is our duty to help bridge some of the digital divides that plague our societies. The Minderoo Centre for Technology and Democracy is proud to partner on the CDH Social Data School 2021.

We are pleased to announce that applications are open to the CDH Social Data School 2021, taking place entirely online from 16-29 June.

Originally conceived by Cambridge Digital Humanities, this year’s event is organised in association with us, the Minderoo Centre for Technology and Democracy. Together we have co-designed, and will deliver, a new version of an already outstanding initiative.

Two of our goals at the Minderoo Centre for Technology and Democracy are to enhance public understanding of digital technologies and build journalistic capacity to interrogate big data and Big Tech.

These goals align neatly with the objectives of a Data School, born out of the need to democratise the exploration of digital methods and to push back against abusive practices of data appropriation and exploitation by internet giants.

I was part of the team that originally put together the Data School pilot back in 2019, and it was clear to us then that academia was, once again, falling short of its mission and failing the public it should serve.

In the data-driven age, we believe it is our duty to help bridge some of the digital divides that plague our societies.

The yawning skills gap between those who can and those who cannot understand key aspects of digital data manipulation and analysis is one of the digital divides that must urgently be closed.

For that purpose, the CDH Social Data School utilises in-house expertise in digital methods and provides hands-on training and knowledge exchange across sectors, professions and disciplines.

Who can apply?

We invite, in particular, people and organisations whose role is to form and inform the public, such as journalists, watchdogs and NGOs, academics, and civil servants, to join us.

The CDH Social Data School also strives to address, even if modestly, other digital divides that fall along the traditional class, gender and racial fault-lines.

Although the School is open to all, the selection procedure prioritises individuals from organisations whose access to digital methods training is limited or non-existent due to insufficient human or financial resources, especially those located in the Global South.

Furthermore, we particularly welcome applications from women and from Black and minority ethnic candidates, as these groups have historically been under-represented in the technology and data science sectors.

While this will do little to redress centuries of colonial, affluent, white and male-fuelled inequalities, we at the Minderoo Centre for Technology and Democracy believe that academic institutions, widely perceived as bastions of elitism, have a special responsibility to adopt inclusive practices and adapt our events to pressing public needs.

If academics want to remain relevant and have proper impact beyond obtuse journal impact factors, we must remove the barriers standing in the way of cross-sectoral and interdisciplinary dialogue.

The CDH Social Data School is a venue for that dialogue and an avenue to foster the development of better technical, legal and ethical practices in digital methods research. Both our delivery format and our programme intend to facilitate a conversation among professions, disciplines and methods.

Whenever someone asks me to describe the CDH Social Data School format, the term “data stroll” comes to mind.

It bears some resemblance to what our colleague Tommaso Venturini calls a “data sprint”, an intensive, code- and data-driven gathering of people with different skill sets focused on a specific research question, but it has a more critical and even contemplative nature.

For starters, the pace is slower, as we intend to reflect critically upon problems arising from data rather than solve them in a week.

This peripatetic wandering gives the Data School its “strolling” colours. It caters more to “adventurous beginners” willing to get their hands dirty than to data whizzes obsessed with data cleaning.

It is less about having experts looking into a pool of data than about inviting participants to share their knowledge and experiences within the context of a guided immersion into digital methods.

For these reasons, having people from diverse backgrounds is not just a matter of desegregating or decolonising curricula but also an opportunity to confront and learn from different regional, disciplinary, cultural and gender-informed perspectives on the widespread practices of data surveillance.

In the context of the Social Data School, democratising access to digital methods is also a call to reclaim our data and demonstrate that data appropriated for private profit can be reappropriated for the common good.

This year’s programme is very rich and ambitious, covering topics ranging from data protection and surveillance to machine learning.

We will also try to make (some) sense of the online disinformation nonsense.

If you are a journalist with the interest but not the tools to investigate the spread of misinformation, an NGO worker who wants to monitor online abuses, a watchdog trying to assess the impact of machine learning, a civil servant working to improve the health of our online spaces, or an academic willing but hesitant to experiment with digital methods, the Social Data School was designed for you.

Apply now

Review: What Tech Calls Reading

A Review of the FSG x Logic Series

by Alina Utrata


Publisher Farrar, Straus and Giroux (FSG) and the tech magazine Logic teamed up to produce four books that capture “technology in all its contradictions and innovation, across borders and socioeconomic divisions, from history through the future, beyond platitudes and PR hype, and past doom and gloom.” In that, the FSG x Logic series succeeded beyond its wildest imagination. These books are some of the most well-researched, thought-provoking and—dare I say it—innovative takes on how technology is shaping our world. 

Here’s my review of three of the four—Blockchain Chicken Farm, Subprime Attention Crisis and What Tech Calls Thinking—but I highly recommend you read them all. (They average 200 pages each, so you could probably get through the whole series in the time it takes to finish Shoshana Zuboff’s The Age of Surveillance Capitalism.)


Blockchain Chicken Farm: And Other Stories of Tech in China’s Countryside

Xiaowei Wang

“Famine has its own vocabulary,” Xiaowei Wang writes, “a hungry language that haunts and lingers. My ninety-year-old great-uncle understands famine’s words well.” Wang writes as beautifully as they think, effortlessly weaving between ruminations on Chinese history, personal and family anecdotes, modern political and economic theory and first-hand research into the technological revolution sweeping rural China. Contradiction is a watchword in this book, as is contrast—they describe the differences between rural and urban life, between East and West, between family and the globe, between history, the present and the potential future. And yet, it all seems familiar. Wang invites us to think slowly about an industry that wants us to think fast—about whether any of this is actually about technology, or whether it is about capitalism, about globalization, about our politics and our communities—or, perhaps, about what it means to live a good life.

On blockchain chicken farms:

“The GoGoChicken project is a partnership between the village government and Lianmo Technology, a company that applies blockchain to physical objects, with a focus on provenance use cases—that is, tracking where something originates from. When falsified records and sprawling supply chains lead to issues of contamination and food safety, blockchain seems like a clear, logical solution. . . These chickens are delivered to consumers’ doors, butchered and vacuum sealed, with the ankle bracelet still attached, so customers can scan the QR code before preparing the chicken . . .”

On a Blockchain Chicken Farm in the Middle of Nowhere, pg 40

“A system of record keeping used to be textual, readable, and understandable to everyone. The technical component behind it was as simple as paper and pencil. That system was prone to falsification, but it was widely legible. Under governance by blockchain, records are tamperproof, but the technical systems are legible only to a select few. . . blockchain has yet to answer the question: If it takes power away from a central authority, can it truly put power back in the hands of the people, and not just a select group of people? Will it serve as an infrastructure that amplifies trust, rather than increasing both mistrust and a singular reliance on technical infrastructure? Will it provide ways to materially organize and enrich a community, rather than further accelerating financial systems that serve a select few?”

On a Blockchain Chicken Farm in the Middle of Nowhere, pg 48

On AI pig farming:

“In these large-scale farms, pigs are stamped with a unique identity mark on their bodies, similar to a QR code. That data is fed into a model made by Alibaba, and the model has the information it needs to monitor the pigs in real time, using video, temperature, and sound sensors. It’s through these channels that the model detects any sudden signs of fever or disease, or if pigs are crushing one another in their pens. If something does happen, the system recognizes the unique identifier on the pig’s body and gives an alert.”

When AI Farms Pigs, pg 63

“Like so many AI projects, ET Agricultural Brain naively assumes that the work of a farmer is to simply produce food for people in cities, and to make the food cheap and available. In this closed system, feeding humans is no different from feeding swaths of pigs on large farms. The project neglects the real work of smallholder farmers throughout the world. For thousands of years, the work of these farmers has been stewarding and maintaining the earth, rather than optimizing agricultural production. They use practices that yield nutrient-dense food, laying a foundation for healthy soils and rich ecology in an uncertain future. Their work is born out of commitment and responsibility: to their communities, to local ecology, to the land. Unlike machines, these farmers accept the responsibility of their actions with the land. They commit to the path of uncertainty.”

When AI Farms Pigs, pg 72

“After all, life is defined not by uncertainty itself but by a commitment to living despite it. In a time of economic and technological anxiety, the questions we ask cannot center on the inevitability of a closed system built by AI, and how to simply make those closed systems more rational or “fair.” What we face are the more difficult questions about the meaning of work, and the ways we commit, communicate, and exist in relation to each other. Answering these questions means looking beyond the rhetoric sold to us by tech companies. What we stand to gain is nothing short of true pleasure, a recognition that we are not isolated individuals, floating in a closed world.”

When AI Farms Pigs, pg 72

Subprime Attention Crisis: Advertising and the Time Bomb at the Heart of the Internet

Tim Hwang

In Subprime Attention Crisis, Tim Hwang argues that the terrifying thing about digital platforms is not how effective they are at manipulating behavior—it’s that they might not be very effective at all. Hwang documents, with precise and technical detail, how digital advertising markets work and how tech giants may be deliberately attempting to inflate their value, even as the actual effectiveness of online ads declines. If you think you’ve seen this film before, Hwang draws parallels to the subprime mortgages and financial systems that triggered the 2008 financial crash. He makes a compelling case that, sooner or later, the digital advertising bubble may burst—and the business model of the internet will explode overnight (not to mention all the things tech money subsidizes, from philanthropy to navigation maps to test and trace). Are Google and Facebook too big to fail? 

On potential systems breakdown:

“Whether underwriting a massive effort to scan the world’s books or enabling the purchase of leading robotics companies, Google’s revenue from programmatic advertising has, in effect, reshaped other industries. Major scientific breakthroughs, like recent advances in artificial intelligence and machine learning, have largely been made possible by a handful of corporations, many of which derive the vast majority of their wealth from online programmatic advertising. The fact that these invisible, silent programmatic marketplaces are critical to the continued functioning of the internet—and the solvency of so much more—begs a somewhat morbid thought experiment: What would a crisis in this elaborately designed system look like?”

The Plumbing, pg 25

“Intense dysfunction in the online advertising markets would threaten to create a structural breakdown of the classic bargain at the core of the information economy: services can be provided for free online to consumers, insofar as they are subsidized by the revenue generated from advertising. Companies would be forced to shift their business models in the face of a large and growing revenue gap, necessitating the rollout of models that require the consumer to pay directly for services. Paywalls, paid tiers of content, and subscription models would become more commonplace. Within the various properties owned by the dominant online platforms, services subsidized by advertising that are otherwise unprofitable might be shut down. How much would you be willing to pay for these services? What would you shell out for, and what would you leave behind? The ripple effects of a crisis in online advertising would fundamentally change how we consume and navigate the web.”

The Plumbing, pg 27

On fraud in digital advertising:

“One striking illustration is the subject of an ongoing lawsuit around claims that Facebook made in 2015 promoting the attractiveness of video advertising on its platform. At the time, the company was touting online video—and the advertising that could be sold alongside it—as the future of the platform, noting that it was “increasingly seeing a shift towards visual content on Facebook.” . . . But it turned out that Facebook overstated the level of attention being directed to its platform on the order of 60 to 80 percent. By undercounting the viewers of videos on Facebook, the platform overstated the average time users spent watching videos. . . . These inconsistencies have led some to claim that Facebook deliberately misled the advertising industry, a claim that Facebook has denied. Plaintiffs in a lawsuit against Facebook say that, in some cases, the company inflated its numbers by as much as 900 percent. Whatever the reasons for these errors in measurement, the “pivot to video” is a sharp illustration of how the modern advertising marketplace can leave buyers and sellers beholden to dominant platform decisions about what data to make available.”

Opacity, pg 70

On specific types of ad fraud:

“Click fraud is a widespread practice that uses automated scripts or armies of paid humans in “click farms” to deliver click-throughs on an ad. The result is that the advertising captures no real attention for the marketer. It is shown either to a human who was hired to click on the ad or to no one at all. The scale of this problem is enormous. A study conducted by Adobe in 2018 concluded that about 28 percent of website traffic showed “non-human signals,” indicating that it originated in automated scripts or in click farms. One study predicted that the advertising industry would lose $19 billion to click fraud in 2018—a loss of about $51 million per day. Some place this loss even higher. One estimate claims that $1 of every $3 spent on digital advertising is lost to click fraud.”

Subprime Attention, pg 85

What Tech Calls Thinking: An Inquiry into the Intellectual Bedrock of Silicon Valley

Adrian Daub

What Tech Calls Thinking is “about the history of ideas in a place that likes to pretend its ideas don’t have any history.” Daub has good reason to know this, as a professor of comparative literature at Stanford University (I never took a class with him, a fact I regretted more and more as the book went on). His turns of phrase do have the lyricism one associates with a literature seminar—e.g. “old motifs playing dress-up in a hoodie”—as he explores the ideas that run amok in Silicon Valley. He exposes delightful contradictions: thought leaders who engage only superficially with thoughts. CEOs who reject the university (drop out!), then build corporate campuses that look just like the university. As Daub explains the ideas of thinkers such as Abraham Maslow, René Girard, Ayn Rand, Jürgen Habermas, Karl Marx, Marshall McLuhan and Samuel Beckett, you get the sense, as Daub says, that these ideas “aren’t dangerous ideas in themselves. Their danger lies in the fact that they will probably lead to bad thinking.” The book is a compelling rejection of the pseudo-philosophy that has underpinned much of the Valley’s techno-determinism. “Quite frequently,” Daub explains, “these technologies are truly novel—but the companies that pioneer them use that novelty to suggest that traditional categories of understanding don’t do them justice, when in fact standard analytic tools largely apply just fine.” Daub’s analysis demonstrates the point well.

On tech drop outs:

“You draw a regular salary and know what you’re doing with your life earlier than your peers, but you subsist on Snickers and Soylent far longer. You are prematurely self-directed and at the same time infantilized in ways that resemble college life for much longer than almost anyone in your age cohort. . . .  Dropping out is still understood as a rejection of a certain elite. But it is an anti-elitism whose very point is to usher you as quickly as possible into another elite—the elite of those who are sufficiently tuned in, the elite of those who get it, the ones who see through the world that the squares are happy to inhabit . . .  All of this seems to define the way tech practices dropping out of college: It’s a gesture of risk-taking that’s actually largely drained of risk. It’s a gesture of rejection that seems stuck on the very thing it’s supposedly rejecting.”

Dropping Out, pg 37

On platforms versus content creation:

“The idea that content is in a strange way secondary, even though the platforms Silicon Valley keeps inventing depend on it, is deeply ingrained. . . . To create content is to be distracted. To create the “platform” is to focus on the true structure of reality. Shaping media is better than shaping the content of such media. It is the person who makes the “platform” who becomes a billionaire. The person who provides the content—be it reviews on Yelp, self-published books on Amazon, your own car and waking hours through Uber—is a rube distracted by a glittering but pointless object.”

Content, pg 47

On gendered labor:

“Cartoonists, sex workers, mommy bloggers, book reviewers: there’s a pretty clear gender dimension to this division of labor. The programmers at Yelp are predominantly men. Its reviewers are mostly female . . . The problem isn’t that the act of providing content is ignored or uncompensated but rather that it isn’t recognized as labor. It is praised as essential, applauded as a form of civic engagement. Remunerated it is not. . . . And deciding what is and isn’t work has a long and ignominious history in the United States. They are “passionate,” “supportive” volunteers who want to help other people. These excuses are scripts, in other words, developed around domestic, especially female, labor. To explain why being a mom isn’t “real” work. To explain why women aren’t worth hiring, or promoting, or paying, or paying as much.”

Content, pg 51

On gendered data:

“There is the idea that running a company resembles being a sexual predator. But there is also the idea that data—resistant, squirrelly, but ultimately compliant—is a feminine resource to be seized, to be made to yield by a masculine force. . . .To grab data, to dispose of it, to make oneself its “boss”—the constant onslaught of highly publicized data breaches may well be a downstream effect of this kind of thinking. There isn’t very much of a care ethic when it comes to our data on the internet or in the cloud. Companies accumulate data and then withdraw from it, acting as though they have no responsibility for it—until the moment an evil hacker threatens said data. Which sounds, in other words, not too different from the heavily gendered imagery relied on by Snowflake. There is no sense of stewardship or responsibility for the data that you have “grabbed,” and the platform stays at a cool remove from the creaturely things that folks get up to when they go online and, wittingly or unwittingly, generate data.”

Content, pg 55

On disruption:

“There is an odd tension in the concept of “disruption,” and you can sense it here: disruption acts as though it thoroughly disrespects whatever existed previously, but in truth it often seeks to simply rearrange whatever exists. It is possessed of a deep fealty to whatever is already given. It seeks to make it more efficient, more exciting, more something, but it never wants to dispense altogether with what’s out there. This is why its gestures are always radical but its effects never really upset the apple cart: Uber claims to have “revolutionized” the experience of hailing a cab, but really that experience has stayed largely the same. What it managed to get rid of were steady jobs, unions, and anyone other than Uber’s making money on the whole enterprise.”

Desire, pg 104