A brand-new endeavour based at the Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), University of Cambridge. Under the leadership of CRASSH Director, Professor Steven Connor, and Advisory Board Chair, Professor John Naughton, the Centre brings decades of experience in leading cutting-edge interdisciplinary research and maximising public engagement. The Centre will begin its work programme by developing projects under four themes typically neglected in discussions of technology futures: public discourse, work, environment, and trust. It will place a spotlight on tech narratives about AI, machine learning, and the deterministic thinking that accompanies them. It will refocus the conversation on the future of work to consider the meaning work brings to people’s lives, and the significance for democracy when it is taken away. It will reveal the hidden environmental impacts of digital technology, from the minerals and metals extracted to make hardware to the emissions generated by data centres. Finally, it will graft these questions onto the landscape of trust and trustworthiness: in public institutions, in companies, and in each other.
Emeritus Professor of Public Understanding of Technology, John Naughton, Chair of the Advisory Board for the Cambridge node, believes the network has the appropriate strategic experience to work collectively on serious regulation of the business model of technology companies. “We know enough about the business model and political tactics of companies like Facebook and Google now. We know what to target, and how, and we’ll be operating with intellectual firepower around the world to go after it,” Professor Naughton said.
The Minderoo Tech and Policy Lab serves both as the network’s coordinating headquarters and as a regional hub for incubating projects across the southern hemisphere. Its location in Perth, Western Australia, places the Lab in the same time zone as 60 per cent of the world’s population, in a neighbourhood that includes many of the world’s net importers of technology. Under the direction of Professor Julia Powles, the Tech & Policy Lab will help lead the research and impact agenda for the tech impact network, sequencing policy moves and targeted interventions on tech law and governance. Relative to other sectors, industries, and utilities, tech operates in a legal vacuum, with a stark absence of any obligation to respond to national frameworks of rules and regulation. The Lab aims to dramatically change this status quo, with a relentless focus on defending rights and protecting against harms.
Incoming Director of the Lab and principal architect of the Tech Impact Network, Professor Powles noted the urgency of innovating with law and policy to reset power. “From our vantage point in Western Australia, in the neighbourhood of many of the world’s net tech importers, we have a significant creative agenda for reining in the unchecked power of the existing tech monopolies and designing with intention the pro-public technologies we deserve in their place.”
The Minderoo Initiative on Technology and Power will critically investigate the social impact of digital technologies on communities and the broader public good. It will create new paradigms for the public to understand the harms of tech platforms, predictive technologies, advertising-driven algorithmic content, and the work of digital labourers. The Minderoo Initiative’s work program draws heavily from the intellectual contributions of its Co-Directors, Professors Safiya Umoja Noble and Sarah T. Roberts. Both scholars are at the forefront of uncovering and exposing the darkest corners of tech’s power and politics. Professor Roberts’ work reveals the low-paid and precarious labour holding up much of the tech ecosystem, and Professor Noble’s work shows the structural racism baked into the infrastructure of commercial search and advertising-driven algorithmic content.
Co-Director of the UCLA Initiative and author of the critically acclaimed book, Algorithms of Oppression, Professor Noble said, “We are at a critical juncture where abolitionist and restorative interventions must be considered in the face of mounting social harms from internet-based technologies. This new Minderoo Initiative situates us as part of a global network of scholars, journalists, makers, and artists who care deeply about these issues, and we hope this gift is the first of many transformational investments that will help us make a long-term impact for change and repair.”
Minderoo Professor Meredith Whittaker examines the social implications of artificial intelligence for institutions, culture, and rights. During 13 years working at Google, Professor Whittaker witnessed first-hand the unprecedented incursions the company was making into markets, domains, and public institutions, and built worker power to counter the company’s unethical decisions. Now a full-time researcher and activist supported by Minderoo Foundation, she dedicates her time to scaling, convening, and organising power across all the workers’ communities that tech touches. Professor Whittaker also leads the AI Now Institute, of which she was a co-founder. She is supported by Minderoo High-Impact Distinguished Fellow Frank Pasquale, author of the forthcoming New Laws of Robotics.
Minderoo Foundation is also providing foundational support to NYU-based ENRICH (Equity in Indigenous Research and Innovation Coordinating Hub), which houses a constellation of Indigenous approaches to data ethics, collective privacy, data governance, digital infrastructure, and responsive policy. Minderoo Foundation’s support enables two Global Chairs per year to spend time collaborating with the NYU Hub as well as in community, implementing advanced data practices on the ground. The first ENRICH Global Chair, Professor Maggie Walter from the University of Tasmania, will commence in late 2020.
Minderoo Foundation is co-designing and supporting a bespoke Challenge Fund at the University of Oxford, to be administered by the Oxford Humanities Division. Providing seed funding to student, faculty, and community projects of varying lengths and budgets, the Fund will catalyse a dramatic expansion of the notion of artificial intelligence ethics, to fully embrace the seismic challenges that contemporary digital technologies pose to labour, institutions, and public scrutiny. It will do so by stimulating new investigative research and public education and engagement projects drawn from many different fields: history, art, performance, linguistics, governance, community projects, and more.