Neeta attends her brother’s online classes through his phone. She goes to a government school while he goes to a private school in India. They and their friends google assignments for clarity. She complains that her teachers publicly berate her for asking questions, but “Google will never insult me like this.”1 Even parents are on board with online learning. As one remarks, “YouTube is better than school. The clever person will learn from YouTube. They will take admission to a government school and then learn from YouTube.”2 Teachers also throw their hats into the ring. Kamlesh, a government-school teacher, has set up his own Instagram account, where he experiments with teaching methods through reels while earning a little cash on the side.
This is a snapshot from a digital learning project that FemLab undertook beginning in 2022.3 We focused on resource-constrained communities in India, which are the most vulnerable to losing access to education. We found that youth have high aspirations—they want to become doctors, engineers, entrepreneurs. They want a fighting chance in a world structured by injustice, but changing an educational system, a social norm, and a patriarchal culture takes time.
Using digital tools is a quick and relatively safe way to make a dent, carve a niche, and etch out a space. But while YouTube saves the day, schools are here to stay. School is synonymous with education, especially for parents with limited economic means, who view this formal pathway as the single most powerful way for their children to escape poverty. Both learning spaces are fraught with harms and challenges but signal progress for the next generation. Digital technology, much like school, is synonymous with hope.
When we turn to the West, the education system, much like other social systems, appears deeply fragile. New AI technologies like ChatGPT could disrupt traditional ways of learning, teaching, and assessment. Tech monopolies pose an existential threat to the social systems people have come to cherish. Resentment leads to resignation as users inhabit digital spaces they loathe, consume content they condemn, and engage with tools they view as toxic to their lives. This book makes the case that this pessimism bubble, while persistent, represents a small fraction of the world’s perspective. Even as despair drives design, policy, and public sentiment in the West, the rest of the world remains contagiously hopeful.
With close to 90 percent of young people worldwide residing in the Global South, hope is the global norm. These youth yearn to disrupt and dismantle systems that continue to fail them. They are forward-looking and optimistic. Digital platforms, despite the risks and harms, are often more pliable than existing institutions and cultural norms. Afghan girls and women, deprived by the Taliban of access to education, parks, markets, and even salons, cope with this medieval treatment by finding sanity online. They watch YouTube tutorials and Instagram fashion reels and share jokes with strangers from the isolation of their homes.
Millions of young resellers in Bangladesh use Facebook to do business, a lifeline especially after the COVID-19 pandemic shut down their markets. Some imams in Saudi Arabia with more liberal values experiment on X, making piety accessible to their young followers. As nuance paves the way for negotiation, Global South users have little choice but to hack creativity, circumvent state control, and build proxies for self-expression. The new algorithmic cultures promise new ways to make a living, escape loneliness, and collectively champion a cause.
The binary of good and evil is fitting for children’s stories but not for reality. As we grow up in digital worlds that are intrinsically cross-cultural, we need to ask questions that mirror that ethos. Tech bros need to resist claiming that new tech will replace entire institutions and cultures; civic actors need to shelve interrogations that serve as a litmus test for which side people are on; media pundits need to resist queries of causality, such as, Will this new tech be a kill switch for democracy/romance/creativity/community/civility? Instead, they should ask questions about why people and institutions do what they do and how we can steer them to optimize technologies for human and planetary well-being.
In the fast-paced world of digital innovation, insights on AI futures may seem slow to arrive and quick to become outdated. Trends rise and fall and threaten to take these visions with them into the shadows of oblivion. In the graveyard of apps, we find punditry buried. This angst, however, applies to those who insist on believing that novelty in tech translates to novelty in social life. Human nature, on the contrary, is predictable in its response to new technologies. Society tends to react to new technologies with utopian and dystopian futuristic visions, as well as moral panics. That leads us to approach designs and policies reactively. The fact is that all innovations are intrinsically human, complex, and contradictory, and often become mundane over time. Novelty becomes fringe. New tools and techniques, if they are lucky enough to survive, become embedded in the fabric of our everyday lives, as ideas, cultures, and perhaps even as institutions, making way for the next wave of novelty. This should serve as a comfort to readers who seek to move from pessimism to possibility.
Mainstream media headlines on the rise of AI declare the end is near. An article in The Guardian states, “The Future of AI Will Fill You with Unholy Terror.”4 The Nation declares, “The Future of AI Is War,” while the BBC warns us it “Could Lead to Extinction.”5 Even OpenAI CEO Sam Altman echoed this fear in his testimony to the United States Congress, in which he pushed for state regulation to ensure that the benefits outweigh the harms to society.6 If readers want to ground themselves to cope with this existential threat, there are few spaces of solace. Every act you do and every word you say is up for automation and erasure. In this climate, this book, true to its optimistic commitment, promises a future with AI that is rooted in the manual, the material, and the multicultural.
As the world grapples with the loneliness epidemic and people yearn for real connection, it is worth imbibing Indigenous and feminist wisdom that values the manual as vital to being humane, beyond the digital, the market, and the state. Over the internet, a hug can only go so far. Human touch builds intimacy. Face-to-face interactions nurture empathy, and physical presence stirs hope and belonging; we are social creatures, after all. There is something to be said for the joy of shared experience.
People who fix a car, mend a fence, build a shed, knit a sweater, or cook a meal increasingly evoke respect. The AI age has arrived just as societies are starting to revalue repair culture, organic living, going local, and being offline. However, existing systems are designed to valorize the head over the hand. Our institutions are geared toward deskilling youth in their everyday lives while upskilling them for some corporate niche. The manual offers an opportunity to slow down time, to savor the present, and to immerse ourselves in the pleasure of collaboration. It offers intrinsic value and personal satisfaction in being autonomous, yet useful to others. The manual has a profound role to play in satisfying this fundamental human tactile need.
Even when handiwork is valued, double standards apply. Those who are good with their hands in the Global North are celebrated for their authenticity, individuality, and creativity. Those in the Global South, however, are viewed as mass producers, cheapening the allure of the manual. Moreover, what counts as automation can be a market fabrication. The AI value chain for a business—all the phases in its business cycle, from the creation of a digital product or service to its delivery to market and everything in between—needs reassessment. The fact is that most of this AI cycle is fueled by the sweat of people in the Global South: the labor of data feeders, labelers, content moderators, coders, auditors, data stewards, and those working in renewables and grid maintenance, all striving to keep the AI wheels turning.
Western corporations and policymakers need to acknowledge the full life cycle of the future of AI, much of it hidden and invisible to consumers and citizens. Culture will never be completely computable, nor should it be. Intermediaries of the physical and emotive kind are here to stay. AI laborers should be integrated into the corporate and public imagination, decision-making, and value systems. We can learn from feminist struggles around the world that continue to resist patriarchal systems that have rendered women’s networks of care invisible and disparaged them, even while depending on them as essential safety nets. Good leadership in the AI era comes from those who are driven by compassion and camaraderie, recognizing and instituting feminist values when it comes to AI innovations.
The manual today gets a bad rap. It needs a renaissance, especially in the AI age. Its Latin root is the term manualis, meaning “of or belonging to the hand.”7 In antiquity it was associated with craftsmanship, artistry, and human skill. There was a certain honesty that came with handiwork. Its status changed during the Industrial Revolution as it transformed into physical acts of monotony and repetition primed to be replaced by the machine. This historical shift marked a growing devaluation of the manual, equating it with the menial. Elites used the new techniques of the day to build arbitrary and artificial differentials of value between the mental and the physical, the intellectual and the manual, to systematize inequality.
Fast-forward to the present, when advanced AI tools have created an existential crisis of what it means to be human. As this book illustrates, this panic builds on the continuing legacies of patriarchy, paternalism, and narratives of Western progress that negate the culture of care that people around the world crave. While this mindset has moved into the digital era, we have much to learn from long-dismissed Indigenous cultures that have developed ways to imbue the manual with spiritual significance, dignity, and mindfulness.
AI is the air we breathe, the water we drink, the land we stand on. It manifests in underwater cables, data centers, and solar panels. The hunger of the cloud is fed by our planet. Once corporate leaders and policymakers make thinking of the virtual as physical their default mode, and institutionalize it in everyday decision-making, it will become easier to translate sustainable values into design. For this approach to succeed, it will be imperative to honestly assess the material costs associated with automating human needs and channel creative energies to address such challenges. As corporate leaders weigh the actual price of AI-driven automation through this ecological approach, they may be able to break out of the age-old logic: “if we build it, they will come.” Strategic friction is needed to ensure that AI futures are approached in ways that nurture our digital, social, and environmental health.
As of 2023, new facilities operated by Africa Data Centres are set to occupy 125 acres of land in Ghana.8 This is the tip of the iceberg. From Nairobi to Cape Town, Africa is witnessing a boom in data centers in the name of data localization, digital sovereignty, and local leadership. There is a corresponding increase in water consumption to cool these data centers. As global digital communication becomes more audiovisual and the demand for unstructured data increases for advanced computing, water and data become two sides of the same coin. The United Nations projects that by 2025, 50 percent of the world’s population will live in water-stressed areas, making the environmental cost of data growth a priority lest the war on data become a war on water.9
Indigenous communities have long understood the delicate balance between humans and the natural world. Their intimate relationships with the environment have given rise to profound respect for all living beings based on the recognition that every action has material consequences. Applying this wisdom to the realm of AI may generate innovative ways to assess and balance our need to be digitally connected against our need for a dignified living with adequate resources. Indigenous cultures can teach decision makers the critical value of the interconnectedness of all life forms, including humans and the land. This holistic approach offers a stark contrast to the modern era’s anthropocentric view, which tends to overlook the intricate web of relationships that sustains life. By understanding that AI’s material impact extends beyond human society, we can foresee its implications for ecosystems, biodiversity, and even climate change, encouraging more responsible development and deployment.
Additionally, Indigenous cultures carry deep understandings of sustainability and long-term thinking. Their traditions often incorporate practices that ensure resources are used wisely, preserving the earth’s vitality for future generations. This ancient knowledge challenges our fast-paced, consumer-driven approach to AI development, where the race for technological supremacy sometimes overshadows the importance of ethical considerations and environmental stewardship. Moreover, Indigenous cultures have demonstrated remarkable resilience in the face of adversity, preserving their traditions and knowledge over countless generations. As leaders navigate the uncharted waters of AI, Indigenous people’s stories of survival and adaptation should serve as inspiration to foster hope. These narratives stand as helpful reminders that humanity possesses the capability to find solutions, adapt, and learn from the challenges that AI may bring, ultimately leading to a more optimistic future.
The rise of the next billion users from the Global South promises to radically diversify our digital cultures. While tech corporations struggle to move beyond the commodification mindset, and the state and aid agencies work to shift away from their need for totalizing control, users are at work shaping the nature of global datasets and algorithmic cultures. Content creators from rural areas in Zambia and Brazil churn out multimedia content daily, playing with stereotypical scripts of poverty porn, victimhood, and racialized and sexualized selves, often through satire, humor, or just mundane snippets of everyday dignified living. AI is a numbers game and the numbers work in their favor.
As this book has revealed, these youth, or what I call the marginalized majority, are not waiting for policies, designs, or strategies to include them. They are forging ahead by gaming the metrics, finding their niche, and hunting for cross-cultural emotive resonance that can make their content go viral. However, the responsibility to diversify datasets should not rest on their shoulders. This trend offers an opportunity for corporations, entrepreneurs, and designers to embrace a multicultural and multilingual approach to debiasing and building new datasets. Diversity by default can be the aspirational rule of the day.
There is much work to do to curate multicultural datasets. When we go online to search for images of people, things, and cultures outside of the WEIRD (Western, educated, industrialized, rich, and democratic) context, we often face data absences or stereotypical images. For instance, most Indigenous communities in Latin America are not mapped, captured, or visualized. An image search for “women at work in India” turns up strikingly few women doing household work. This situation is further exacerbated by the limited types of visuals in stock images and under Creative Commons licenses. The same case can be made for audio diversity. It is understood today that the quality of training datasets influences what we experience online; what we hear, see, and feel is shaped by the biases in these systems. These limited options influence how the West imagines and approaches the rest of the world. Because algorithmic models are only as good as the quality of their training data, these myopic framings will amplify stereotypical views of the majority world.
Generative AI tools have the potential to create new semisynthetic datasets in partnership with Global South creators and communities to tackle this persistent sensory and cultural bias. Addressing data scarcity and distortion has positive implications for educators, activists, entrepreneurs, and policymakers driven to build understanding and empathy for diverse cultural heritages and contexts without being trapped in the given boxes of cultural and creative representation. Questions abound regarding the cultural identity, digital creative ownership and fair value, scalability, and quality of this creative content, and they demand an interdisciplinary social and computational undertaking.
A multicultural dataset will enable AI algorithms to learn from a wide range of voices, providing a more comprehensive understanding of the human condition. With diverse datasets, algorithms will be less likely to perpetuate stereotypes or discriminatory patterns, as they will have been exposed to a variety of perspectives and cultural nuances. As a result, AI systems will make more informed and empathetic decisions, leading to greater fairness and social justice in areas such as hiring, loan approvals, and criminal justice. A multicultural approach to building datasets serves as a powerful catalyst in our journey toward a more compassionate society.
Decolonizing is already a buzzword. The term’s popular usage in media today denotes a widespread awareness of how social injustice is inscribed in our AI worlds, and there is momentum to make AI more inclusive. The challenge remains how to translate this drive into design interventions.
Reflexive fatigue is the first obstacle. Many institutions have embraced moral agendas, but few have inscribed them in their organizational workings so that people have the bandwidth to pause the business-as-usual approach and make meaningful changes to their practices. Organizations treat the decolonizing agenda as an add-on. Doing things differently may result in emotional exhaustion and burnout, which may translate to resentment toward the very groups they set out to serve. There is a reason why cooperative design efforts often result only in product testing experiments or market sentiment pilots. Tight, short-term deadlines, restrictive outcomes as measures of success, limited incentives, and the lack of a common vocabulary to understand grievances and aspirations hold people back from plunging into this rich reality.
As organizations grow, they can’t help but get trapped in their cultural myopia. Relationship building in public and private enterprises is typically instrumental. The term “networking” smells like work. To do things differently and more equitably, corporate and policy leaders should have safe spaces to build ties with civic agencies, artists, and community members where their organizations can organically learn from one another and chalk out common ground. This takes time, energy, and long-term commitment. Every organization has internal disruptors who are able and willing to shift their work cultures from within. To identify these disruptors and to build and deepen these relationships, we need time and space and, most importantly, an open agenda.
Innovation comes with failure. Bad designs, policies, and code are made all the time. There is no such thing as a finished product. When products are released into the wild, they take on lives of their own. Yet the responsibility for the impact these products have on society—especially on those most vulnerable—still lies with the organization. Accountability should not arise only when mistakes are made. Auditors are there to do more than ensure standards are met—they serve as educators, mediators, and catalysts for change by aggregating insights of the most affected groups and sharing these understandings in ways that can be incorporated into further product development.
We can get lofty about decolonial design, but at the end of the day, systemic change happens incrementally, consciously, and thoughtfully, by embracing the “boringness” of institutional reform.
I was in New Orleans to deliver the closing keynote at an international conference on Human-Computer Interaction in 2022. It was my first postpandemic, in-person appearance at a large international event, as it was for many of its more than two thousand attendees. When I talk in front of a big gathering, I usually love to get the pulse of the room. I ask the audience to make some noise if my statements and questions resonate with them. At the beginning of this keynote, I asked how many in the crowd worked in UX. Some people clapped. I followed up by asking who worked in or with people in the Global South. A few people made some noise. I then asked if anyone felt like an imposter in their area of expertise. The room exploded.
When even tech people don’t feel they belong, we have a cultural crisis at hand. Utility companies and automobile industries now compete with the data industry. Super apps push gaming, banking, and romance industries into an arranged marriage. Banks compete with WhatsApp and TikTok to issue digital currencies. AI has seeped into the everyday operations of systems, shifting culture from the inside out. Academics have become influencers. Influencers have become politicians. Politicians have become entertainers online. The world feels upside down. Virtual hate leads to bodily harm, even death. Land gets occupied by the cloud. Imposter syndrome doesn’t have to be a malady; it can be an asset, as it allows us to stay on the cusp of change, ready to take on the future. No discipline, expertise, or industry practice should be sacrosanct. Embracing the imposter in ourselves shows that we do not need to belong to some sector or field to feel part of a whole.
These AI trends don’t have to be disorienting. If leaders truly imbibe the lesson of interconnectedness, they will start by looking at any product or service, from toothpaste to health insurance, as a cultural artifact shaped by larger social and ecological forces. They will trace the value chain of manual labor that goes into their business, assess its material impact to gauge its true value, and examine how different cultures perceive their products and services.
In 2022 I was headhunted to serve as a director of AI impact for a Western multinational furniture company. It puzzled me at first, as I did not see the link between my CV and selling chairs. After my conversation with their team, it became clear. Their business model was pivoting to the next billion users, who are their future customers. AI is not the only black box that needs unpacking. The Global South is just as alien and incomprehensible to many businesses. What makes people do what they do is at the heart of any operation. Furniture, for example, is an aspiration that comes alive in Pinterest posts and Netflix shows. Homes are social spaces of memory and intimacy, and a chair tells the story of pregnancy and loss. Our situated contexts, our relations, our everyday institutions, and our cultures shape how and what we value.
At the end of the day, if organizations want to get on board and build hope through their AI-enabled designs and policies, they need to buy into the fact that culture, local and global, truly matters.