“Look at them! I think that’s really what the essence of dancing is about—not showing off but really bringing happiness and joy to the dancers themselves in a simple way,” commented someone on a viral video of a couple dancing in rural China.1 The couple in question are corn farmers Peng and Fan from the countryside of Ruian, Wenzhou, who took to dancing when faced with personal adversities of financial stress, health issues, and depression. They invented the “rural shuffle dance,” which was watched and praised by millions in China and abroad. Their positivity struck an emotional chord among people, tapping into a contemporary yearning for purity, simplicity, resilience, and joy.
More than a hundred years ago French sociologist Émile Durkheim coined the term “collective effervescence” to describe shared human sentiment.2 He postulated that collective emotions built through everyday acts of sharing stories, gestures, movements, and other humble forms of expression can lead to social solidarity, bonding, and a sense of belonging. Today, memes, hashtag cultures, WhatsApp groups, YouTube channels, and Facebook pages have become cultural and cathartic repositories that can bring people together on matters from the mundane to the profound. Zoom, in a way, got us through the COVID-19 pandemic. While we seek each other online for solace, strength, and support, social surveillance using digital tools comes with a dark side. Collective effervescence is fodder for the capitalist machine as our social lives become data to be traced, tracked, and turned into value for states and brands. As social psychologist Shoshana Zuboff argues, we have become “means to others’ ends.”3
Is there a way to steer social surveillance toward tools to build social health and collective well-being? To rebuild social trust, society needs a surveillance system of care—one that moves away from watching each other as a form of policing to watching over one another as a form of recognition and compassion. The culture of fear needs replacing with a culture of collective empathy. Memes can boost mental health. Open-source data and crowdsourcing can build maps of neglected areas and generate empathy through visibility. Refugees and their families can track each other for safety and security. Female workers can organize themselves digitally and protect themselves by rating unruly customers. Sentiment analysis of global emotions can help improve public policy to increase global happiness. Digital tools can serve us by feeding the collective effervescence we desperately need in our societies today.
To build a safe public space, there must be eyes on the street.4 In the 1960s US urban reformer Jane Jacobs introduced this idea as a useful rubric for city planning. She argued that neighborhoods become safer when buildings face the street: people who live and work in these public spaces can look out for the safety of residents and strangers alike, although there is a thin line between watching each other and watching over each other. To keep care from becoming a means of control, power needs to be distributed and decentralized, yet connected to those who can institute change. This network formation demands a democratic crafting of the social order. Jacobs remarked, “The trust of a city street is formed over time from many, many little public sidewalk contacts. . . . Most of it is ostensibly trivial but the sum is not trivial at all.”5 This principle applies to our digital spaces as well.
“Our main demand is not to get killed,” explains Camila, a student in Mexico City, on why she joined a Facebook and WhatsApp women’s group to track each other’s real-time locations.6 Femicides, the gender-based killings of women by men, rose by 135 percent in Mexico in 2021, accounting for a quarter of the 3,750 women killed that year. While more than 90 percent of people in Mexico have Facebook, the majority find the interface boring. Yet for the purposes of safety and solidarity in the face of growing gender-based violence, women appreciate the platform’s simplicity and functionality. They share their stories, alert others to their plight, and reach out to one another for help through this channel.
During the COVID-19 lockdown, mothers in China looked out for one another and boosted each other through humor. The #workfromhomewithchildcare meme in China received millions of posts, likes, and shares.7 Mothers used emojis, animations, and word art to mock expectations to balance work and childcare when schools shut down. Profiles on Chinese TikTok like that of Chubby Toot, the username of a twenty-five-year-old Chinese mother, attracted two million followers as she shared her struggles when working at home with her two-year-old son, whom she refers to as her “little rascal.”
Solidarity surges have spread worldwide. Media researcher Maud Ceuterick calls humor “a great feminist weapon.”8 The Atlantic points out that “humor is everywhere because fear is too.”9 Youth solidarity has also increased worldwide, especially in the Global South, where young people face crushing workloads, social pressures, and family responsibilities.10 In recent years the “lying flat movement” of young Asians refusing to work long hours has grown. Many youths view their existing workloads as untenable and even undesirable. In July 2020, rural Chinese vlogger Li Ziqi set the record for the most subscribers of a Chinese-language channel on YouTube with more than sixteen million subscribers.11 Videos of her pickling vegetables in the countryside, making a dress out of dried grapes, horseback riding in the forest, and building a thatched roof have inspired youths from Bangladesh to Portugal.
We tend to underestimate the value of ordinary acts from strangers, whether sharing a joke, saying a kind word, or simply being decent with one another. Canadian journalist Malcolm Gladwell argues that “kindness [from strangers] is a temporary suspension of indifference.”12 These acts can serve as a social glue, holding humanity together.
Pushing against antisocial behaviors takes collective effort, with help from influencers, users, civic groups, regulators, and algorithmic innovations. The Goodness Bot (@goodnessbot) is an automated X account that, when tagged in a reply to a bullying tweet, generates a version of the offending tweet with the negative messages replaced by kind and humorous words.
US activist Monica Lewinsky, in partnership with the advertising firm BBDO, spearheaded this initiative as part of a larger anti-cyberbullying campaign. Lewinsky taps into her past of being publicly bullied and how strangers’ kind words got her through:
There was a time where getting the mail was the highlight of my day . . . so I know the power and the impact of hearing from a stranger. . . . That’s what this bot can do. You don’t have to know the person. It’s just passing along the good deed in a way and trying to help someone that you see who may be in pain.13
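The bot’s core move, mirroring a hostile message with its negativity swapped for kindness, can be sketched in a few lines. This is an illustrative toy rather than the campaign’s actual implementation; the replacement dictionary below is invented for illustration.

```python
import string

# A minimal sketch of the Goodness Bot idea (not the campaign's actual
# code): mirror a hostile message but swap its negative words for kind
# ones. The replacement dictionary is invented for illustration.
KIND_SWAPS = {
    "ugly": "beautiful",
    "stupid": "clever",
    "worthless": "valuable",
    "hate": "appreciate",
}

def kindify(tweet: str) -> str:
    """Rebuild the tweet word by word, replacing negativity with kindness."""
    out = []
    for word in tweet.split():
        core = word.strip(string.punctuation)  # preserve surrounding punctuation
        swap = KIND_SWAPS.get(core.lower())
        out.append(word.replace(core, swap) if swap else word)
    return " ".join(out)
```

A real system would need far more than a lookup table, such as detecting abusive intent in context, but the sketch captures the gesture: the reply keeps the shape of the original tweet while inverting its tone.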
TikTok’s algorithms have come under scrutiny, accused of privileging content that is harmful to minors, such as body shaming. The scale of outreach is staggering: in 2021, posts with a body-shaming hashtag on TikTok drew 1.2 billion views.14 TikTok tries to reduce these harms by maintaining blocked lists of keywords and profiles, a measure with limited effect, as content creators and users deliberately misspell keywords to evade the filters.
To tackle this crisis, body-positive influencers have come together as a counterforce. Organizations like Within Health, a digital health startup, suggest a combined social and technical strategy to fight eating disorders.15 Crowdsourcing keywords and associated songs, images, and themes from users helps keep the blocked list dynamic and responsive to ongoing changes in how people communicate body shaming online. TikTok itself can prime mindsets by disabling harmful content through its autocomplete, ranking, and search functions. The company can give users opt-out choices over what they consume. It can deploy trigger-warning labels, embed pop-up tutorials that raise awareness when users search for negative content, and collaborate with other platforms to standardize these measures.
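A crowdsourced, dynamic blocklist of this kind can be sketched minimally. The fixed table of character substitutions below is an assumption for illustration; it shows how normalization can catch deliberate misspellings that slip past exact keyword matching.

```python
import re

# Character swaps commonly used to evade keyword filters. This mapping is
# illustrative; a production list would itself be crowdsourced and updated.
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                               "5": "s", "7": "t", "@": "a", "$": "s"})

def normalize(term: str) -> str:
    """Lowercase, undo common character swaps, and strip separators."""
    term = term.lower().translate(SUBSTITUTIONS)
    return re.sub(r"[^a-z]", "", term)

class DynamicBlocklist:
    """A blocklist that matches on normalized forms, so deliberate
    misspellings of a blocked term are still caught."""

    def __init__(self, seed_terms):
        self._blocked = {normalize(t) for t in seed_terms}

    def report(self, term: str) -> None:
        # Crowdsourced reports keep the list responsive to new slang.
        self._blocked.add(normalize(term))

    def is_blocked(self, term: str) -> bool:
        return normalize(term) in self._blocked
```

The design point is the `report` path: instead of a static list maintained by the platform, users continuously feed in the evasions they encounter, keeping the filter in step with how the language actually mutates.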
Another initiative comes from Line, one of the most popular encrypted messaging apps in Asia. Line launched a program to tackle misinformation in Taiwan in partnership with local fact-checking organizations like Taiwan Fact Check Center and Cofacts.16 The app allows users to forward messages they want verified and receive an answer in real time about whether the content is true. This seemingly humble intervention allows the company and other stakeholders to tackle harmful content without breaking end-to-end encryption. The novelty of these digital tactics is less in their technological innovation than in social relationships and partnerships to build “natural proprietors.” Such tactics, while trivial by themselves, can cumulatively nudge users toward a cultural shift toward collective care online.
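The forwarding-and-lookup flow can be caricatured in a few lines. The claims database and exact matching below are invented for illustration; real systems like Cofacts rely on large crowdsourced databases and fuzzier matching. Note that nothing here touches the encrypted channel: the lookup runs only on messages a user chooses to forward.

```python
# Hypothetical sketch of a fact-check reply bot: a user forwards a message,
# the bot looks it up against a database of already-checked claims, and
# replies with a verdict. Database contents are invented for illustration.
CHECKED_CLAIMS = {
    "drinking hot water cures covid": (
        "false", "No clinical evidence supports this claim."),
}

def check_message(text: str) -> tuple[str, str]:
    """Return a (verdict, explanation) pair for a forwarded message."""
    # Naive normalization: lowercase and collapse whitespace.
    key = " ".join(text.lower().split())
    verdict = CHECKED_CLAIMS.get(key)
    if verdict is None:
        return ("unverified", "No fact-check found yet; claim queued for review.")
    return verdict
```

The "queued for review" branch is where the human partnership enters: unmatched claims flow to fact-checking organizations, whose answers grow the database for the next user.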
Exercising collective care does come with challenges, especially when caught in the age-old struggle between freedom and control. X indiscriminately purges bots to build trust. The quality of fact-checking is contingent on what kinds of training human and AI moderators get. Blocking keywords can inadvertently censor legitimate content, including political critique of the state, in the name of social harmony. The freedom-control debate gained renewed vigor during the COVID-19 lockdowns as states used AI-enabled cameras, digital apps, heat sensors, mobile tracking, and social media content to enforce public health measures.
Internet governance scholar Laura DeNardis argues that the internet is no longer just a digital public sphere within which we connect and communicate. It is an infrastructural system that has “no off switch.”17 This system is both virtual and physical, connecting vehicles, wearable devices, home appliances, drones, medical equipment, currency, and every other tool that mediates our social life. Ubiquitous social and data surveillance, however, is far from a reality for the majority of the world. While there may not be an “off switch,” there are loose connections, poor bandwidth, VPNs, offline sharing of profiles and devices, hacking, weak digital literacies, and other factors. Being “on” 24/7 is a myth for many.
The surveillance of the world is fragmented, unrepresentative, and faulty. While the omnipresence of surveillance tools sounds ominous, those at the margins have long been invisible, silenced, unmapped, and undocumented. In some cases, digital surveillance provides security that enables freedom. Senior humanitarian advisor Ivan Gayton of Humanitarian OpenStreetMap underlines, “There are about two billion people in the world who don’t appear on a proper map.”18 A third of humanity will live in slums or informal settlements in the future.19 The growth of slums accounts for 90 percent of urbanization in the twenty-first century, yet these slums are largely unmapped. Smart cities of the future will resemble Kinshasa and Mumbai, not London and New York.
Having a digital identity matters. More than a billion people with no formal identity want to opt in to, not out of, this state tracking system. Digital identification would help them open bank accounts, secure funding, apply for jobs, exercise their legal rights, make formal claims, and avoid undue detention and deportation.20 For instance, while India has made strides in economic growth in the last few decades, 90 percent of its workers remained informally employed as of 2022.21 Many of these vulnerable workers are young and female and belong to migrant communities, scheduled castes, scheduled tribes, and other “backward” classes. As surveillance scholars David Murakami Wood and Rodrigo Firmino argue, “The fear of anonymity and being ‘lost’ is far stronger than any concern about surveillance or control.”22
Tracking people, supplies, infrastructural projects, and access to welfare schemes can lead to more accountability and fairness in the distribution of scarce resources. This applies particularly to the shadow economy in the Global South, where precarity is the norm and informality is the way of life.
At the onset of the pandemic in January 2020, I founded FemLab, a feminist futures of work initiative, with feminist media scholar Usha Raman. We received funding from the International Development Research Centre (IDRC), a Canadian grant agency. This project was part of its larger funding initiative on digital innovations and the future of work in the Global South. IDRC focuses on how automation and digitization impact the economy in the Global South, with approximately sixty million workers engaged in sectors transformed by digital interventions.
FemLab focused on the changing work conditions of female gig workers in India and Bangladesh during the pandemic, in the ride-hailing, home-based salon services, sanitation, artisanal, and construction sectors. We built a team across four cities in India to examine how these workers were organizing themselves using digital tools in ways that could boost their status, voice, opportunities, working conditions, and livelihoods. We wanted to unpack the surveillance of care apparatus in the world of work at the bottom of the value chain.
The project came at a time when remote work became the norm in many sectors during COVID-19 lockdowns. Workplace surveillance technologies boomed around the world, including in India. While venture capital funding was down for most sectors, remote-monitoring tech received more than $394 million, a 62 percent increase from 2021.23 Companies wanted to comply with health measures while ensuring workers’ productivity. Informatics scholar Jessica Vitak has described a cultural shift toward employers believing that a “treasure trove” of data on their workers could make them more productive.24
The Centre for Internet and Society (CIS) reported the exponential growth of monitoring technologies such as facial-recognition software, human-efficiency trackers, keystroke loggers, and social media screeners being used on Indian employees.25 CIS found this software running in the background, collecting data from employees’ WhatsApp, email, and social media accounts, even outside their work hours.
While there are data protection policies in place to curb companies’ indiscriminate collection of their customers’ data, there are few comparable measures to protect employees. The choice can be stark for workers. Employees consent to this intrusive surveillance because they fear losing their job. This power asymmetry is far worse in the informal sector, which constitutes the bulk of India’s economy and is increasingly becoming digitized.
In 2020 sanitation workers in Chandigarh in India protested having to wear GPS-enabled watches that tracked their movements. The municipal corporation (MC) officer justified the use of these wearable devices by saying, “We often get complaints that MC sweepers, gardeners, sanitation inspectors and enforcement staff are absent from field duty. This new system will rein this in.”26 However, many workers perceived this as “bonded labor,” given that the majority of the five million sanitation workers come from the marginalized Dalit caste.27
The reliability of these tools also came into question. Women sweepers wearing bangles complained that their watch would turn on when a bangle accidentally touched it. Some complained that the watches showed the wrong location, causing a loss of wages. Their salaries could decrease if the wearable’s batteries died while they were at work, failing to capture their time.
Scarce redressal mechanisms and the heavy demands on workers’ time to resolve glitches result in more precarity and exploitation. As the private sector experiments with new ways to track and automate its workers, the public sector experiments with new ways to deliver essential safety nets, albeit with mixed results.
In 2021, the Indian government launched the e-Shram portal, a national database of unorganized workers.28 The portal registers unorganized workers and connects them with social security benefits if they meet with an accident, become disabled or unemployed, or die at the work site. This is an unprecedented effort to formalize the massive informal economy.
Sociologist Shweta Mahendra Chandrashekhar worked for FemLab with a focus on the construction business in India. She interviewed union workers and leaders, municipal officers, and contractors to understand worker benefits, including how digital registrations improved access to welfare benefits. Her father owns a tunnel construction company in Pune, and her childhood was shaped by her visits to the construction sites and her interactions with the site engineers, migrant workers, contractors, and government officials. The COVID-19 lockdown hit the construction sector particularly hard since a majority of their workers are migrants. Pune saw about 75 percent of workers leave for their villages when the lockdown was announced overnight in March 2020.29
A majority of migrants failed to access welfare benefits, despite the government making funds available during this humanitarian crisis. Union leaders provided reasons for these failures: Migrants couldn’t directly register online, so contractors needed to register workers on their behalf. Sometimes the contractors themselves were unregistered and were subcontracted by other private contractors. Middlemen have few incentives to register migrants, as laws such as the Interstate Migrant Workers Act demand that they provide migrant workers and their families with health care, housing, education for children, semiannual train fare to their villages, and other benefits. The president of the Construction Workers Union, while hopeful about digitizing the registration system, argued that we need to simultaneously build awareness of these schemes and provide free digital literacy support to enable online registration. He remarked that otherwise, the digital tracking process could create another layer of middlemen and more opportunities for exploitation.30
The government’s efforts to standardize registration and simplify migrants’ access to benefits across states are fraught with challenges. These goals require consolidating migrants’ registered accounts, including their pension and other accrued benefits. The idea of a singular professional identity doesn’t match the reality of migrants who are seasonal farmers and work simultaneously in construction, ride-hailing, and domestic services to make ends meet.
Women face similar hurdles in being absorbed into the state welfare system. Their situation is compounded by a patriarchal culture in which being too public, too outspoken, and too present can have adverse consequences. Monitoring apps can amplify the social pressures women face and further deter their participation in the workforce. According to the World Bank, the percentage of women working in India dropped from 26 to 19 percent between 2010 and 2020.31 In fact, this decline has become a global trend, especially among female gig workers. Women are increasingly abandoning digital tools and the internet itself as fear of a digital presence overtakes them.32 Locked profiles, while an option, keep them from capitalizing on the attention economy. Tracking devices deter movement. Social expectations relegate them to care work, which remains undervalued and underpaid. In such an ecosystem of withdrawal, what will it take for women at the bottom of the supply chain to go online and use networked tools to their advantage?
FemLab’s media researcher, Pallavi Bansal, and I designed a project to gain insights on why it was so difficult to find female drivers for ride-hailing companies despite the high demand. We found double standards for women working with strangers in public spaces. Sushil Shroff, director of Taxshe, an exclusive all-women driver-on-demand cab service running in Bengaluru, explains that cultural norms result in female drivers being judged harshly and can deter them from choosing this as a profession. Shroff explains that there is a tendency to judge women drivers as being “cheap and available,” because “anyone can sit in her car and talk anything to her, and she’ll have to take it.”33
Taxshe had to rebrand women drivers as “alternate moms” to attract women to this profession, get their families to agree to them working in the ride-hailing sector, and shift customers’ mindset. The company leveraged the cliché trope of caring women that feminists typically want to break out of. They labeled female drivers as “Roos”—referring to kangaroos—who will protect young ones while moving. Shroff explains that they had to try to convince clients to respect drivers by equating them to mothers: “We told each of our clients, ‘like you, she’s a mother. Yeah, she’s handling her kids, and she’s driving your kids. . . . It’s a parent who’s driving. It’s not just a woman.’ . . . So that gets the respect.”34
Nonprofits like the Azad Foundation and the Neeva Foundation explain that it is essential to invest in female drivers to prevent them from dropping out. Women may have never driven a car or even opened a car door for a customer. They may not be used to talking with strangers. Filling out forms, getting a license, dressing professionally, handling finances, and dealing with the mobile app takes time and, more importantly, confidence. One driver, a widow with two daughters to support, shares her story:
I was very afraid of starting new things . . . my confidence level was not that good. First, they give you the confidence over there. Like “You can do, you can do,” that one is the first thing. . . . I was so afraid of everything. . . . I hesitated to speak, I hesitated to approach people, like that. That is one thing I am not now.35
While these organizations build women’s confidence and capacity as drivers, some fixes are more related to urban planning, social norms, digital affordances, awareness, and media literacy.
Women drivers at Ola and Uber express frustration about the lack of access to clean and safe toilets. Ola Mobility Institute, a think tank focusing on mobility innovation in India, explains that a lack of awareness of in-app options also contributes to this plight. Its surveys of women drivers reveal that few know how to navigate the app to find the nearest restroom.36 The app could also benefit from a rating system on the cleanliness and safety of restrooms for women. The SOS feature on the app, while present, is rarely connected to a live operator and often yields no response. Women drivers want to be able to rate their customers, too, and have their company aggregate these ratings. They want the company to act on the data insights and block customers who have broken the rules of engagement with persistent rudeness, harassment, and overall bad behavior.
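Mechanically, the two-way rating the drivers ask for amounts to simple aggregation plus a policy threshold. A minimal sketch follows; the thresholds and names are invented for illustration and do not reflect any platform’s actual policy.

```python
from collections import defaultdict

# Illustrative policy parameters (assumptions, not any platform's rules).
BLOCK_THRESHOLD = 2.0   # mean star rating below this triggers a block
MIN_RATINGS = 3         # require several independent reports before acting

class CustomerRatings:
    """Aggregate driver-submitted ratings of customers and flag
    repeat offenders for blocking."""

    def __init__(self):
        self._ratings = defaultdict(list)

    def rate(self, customer_id: str, stars: int) -> None:
        if not 1 <= stars <= 5:
            raise ValueError("stars must be between 1 and 5")
        self._ratings[customer_id].append(stars)

    def is_blocked(self, customer_id: str) -> bool:
        scores = self._ratings[customer_id]
        if len(scores) < MIN_RATINGS:
            return False  # never act on a single report
        return sum(scores) / len(scores) < BLOCK_THRESHOLD
```

The `MIN_RATINGS` floor matters as much as the threshold itself: requiring several independent reports protects customers from one spurious rating while still letting a pattern of harassment accumulate into action.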
Women workers or “partners” with home-services gig platforms like Urban Company have shared similar concerns, and a desire for ratings to be a two-way process, with our FemLab researcher Sai Amulya Komarraju.37 Moreover, while they recognize the risks of everyday digital surveillance, they also see in it an opportunity to prove their worth and distinguish themselves. They feel pride in completing the ten-day training course, given that some fail in the onboarding process. They learn how to sanitize their products, deliver their services, present themselves, greet customers, operate the app, and check their commissions and ratings. The women learn the vocabulary of this digitized sector: power bank, app, leads, credits, ratings, reviews, automated, tracking, app store. These terms become part of their standard operating procedures. The partners feel professional and respected, and the data proves their standing to the company and customers alike. A service provider confidently explains the rating system:
Now you [the customer] will be questioned, how was the pro who was sent to you, did she behave well, did she sanitize, wear PPE kit, gloves, mask. If you see me doing all of this then you will tick yes, but if I have never used any of these things, you have never seen me use these things you say no.38
Many feel pride in being able to manage expectations, comply with hygiene standards, and maintain a high average rating in the system. Monitoring and rating systems can be repurposed as tools for transparency, company negotiations, promotions, and social recognition. The professionalization can have a spillover effect on the standing of these women workers in their community:
The moment you say beautician, to be honest, the mindset is they look at you differently, not with good perspective or good intention. Of course, this does not mean that everybody is like that. But UC [Urban Company] is a brand, so in the starting, during the training itself all of this is clarified. If you are working in UC, then people should automatically differentiate between you and whatever preconceived notions they have about beauticians, they must perceive us as thorough professionals. There is clarity about this now for sure.39
Building surveillance of care requires negotiating deep-seated cultural systems and infusing power into given labels such as “partners” by appropriating tools at one’s disposal. However, this is done through soft power, subtly increasing agency from within, and starting with a core power unit—the joint family.
Meesho, the reselling app, learned this cultural norm quickly as it scaled its business model. Engaging women resellers on its platform is a family effort. Achyutha Sharma, the company’s user research manager, explains how over time these negotiated compromises lead to subtle but important shifts in women’s status, freedom, and confidence in and outside the home. Sharma points out that at the start a woman must negotiate “social permissions” with her husband or family decision makers to get a mobile device and internet access for work. The family typically agrees on the condition that she also fulfill her household duties. Over time, as profits come in, the woman’s status in the house and her bargaining power begin to change, leading to a higher degree of “digital confidence.”40
Family surveillance can ease over time to a healthy inattention that allows women to become more visible and vocal online. These tactics, while commendable, underline the weight of the patriarchal apparatus women need to push against to have a fighting chance in an increasingly automated workforce. The future of work will continue to be upended by disruptive technologies. Workers at the bottom of the value chain bear the heaviest burden; this is especially true of women in the informal sector. Media debates position labor futures as a contest between tech and humanity. Tech, however, is human. The digital is physical, built on the continuing sweat of workers’ experience. Algorithms learn from people as much as people learn from them.
Building institutions, infrastructures, and initiatives of care such as clean and safe toilets for women, two-way rating systems, confidence training, upskilling, and universal basic incomes can sand down the hard edges of patriarchal norms. These changes start from a position of compassion. Installing the surveillance of care demands a shift from the rational to the emotional. Feeling can become fuel for action, unless it is commodified against those who need change the most. Machinery and empathy do not have to stand against each other in this pursuit of social change.
Machine learning is built to identify and structure opinions, emotions, and beliefs from collected text, expressions, and other biometric cues gathered through wearable devices. Businesses valorize emotional intelligence (EI) for helping build relationships, manage crises, communicate effectively, and inspire teams. It provides a competitive edge. A 2019 survey of managers in the United States showed that 71 percent of employers favor the so-called emotional quotient (EQ) over the intelligence quotient (IQ) in employees.41
Empathy and social skills trump logic and problem-solving as management gurus tout EI as the key to good leadership. Impression management is all the rage. Recruiters are increasingly giving weight to EI when hiring.42 Insurance plans, advertising campaigns, product designs, banking protocols, health care tools, education content, and agricultural innovations are increasingly tailored to their diverse customers by mining their sentiments. This outlook was far from the norm in the past: emotionality was perceived poorly in society and was associated with women and the Global South. To be emotional was to be weak.
Feminists have long battled the undermining of emotion and its link with the feminine to keep women away from political life. A persistent trope of women revealing their emotions easily, like a “personal polygraph machine,” has worked against them as leaders are expected to be self-contained and self-controlled.43 This degradation of emotion applies to nations as well. The international peace organization Service Civil International, in their “Picturing the Global South” toolkit, highlights how countries in the Global South are typically perceived as “emotional” by those in the North.44
In the colonial days, imperial settlers framed Southern cultures as “emotionally pathological.”45 They viewed the habits of their new subjects as the antithesis of the Enlightenment values of reason, scientific progress, and technological order—the makings of masculine sensibility. Their worldview provided a “rational” justification and moral responsibility for settlers to “civilize” the Natives. Anthropologist Ann Laura Stoler explains how colonialism was driven by an “emotional economy,” where rulers strove to predict and prescribe what sentiments to perpetuate and what emotional contagions to contain.46
Until the late nineteenth century, emotion connoted more a collective than an individual state of affairs. It alluded to groups driven by feeling to action; as gender studies scholar Jacqueline Holler explains, people were “powerless to resist.”47 Inspired by the church and politically vested interests, emotional standards were established to manage and control people’s sentiments through rituals, institutions, and cultural practices. Settlers validated certain types of emotion, steered by beliefs about patriotism, racial superiority, paternalism, and honor.
Today, as emotional intelligence meets artificial intelligence, how do we negotiate the potential of collective care and the fear of collective control? As Stoler argues, “Sentiment is the ground against which the figure of reason is measured and drawn.”48
Cultural anthropologist William M. Reddy recognizes emotion as the driving force of the Enlightenment, the “age of reason,” and has retitled the era the “age of sentiment.”49 He may have inadvertently captured our contemporary time. The battle over the future of our sentiments as an instrument of care versus control has begun.
Meta has been inundated with controversies over its experiments using sentiment analysis. Studies conducted as early as 2014 by its Core Data Science Team and Cornell University’s Departments of Communication and Information Science found that “emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness.”50 In June 2022, Microsoft announced a new policy to phase out public access to several of its “emotion recognition” tools, such as those in its Azure Face service, after criticism of their flawed results.51 Experts argue that without accounting for context, facial expressions cannot be equated with internal feelings.
Spotify appears to have “solved” some of the context issues with its 2021 patent on monitoring users’ speech to infer their tastes.52 The AI captures intonation, stress, rhythms, and other elements of speech to assess emotional states. Machine learning contextually situates these insights with identity-based metadata (such as gender, age, and other such variables) and environmental metadata of users’ locations and physical surroundings to get a more accurate analysis of the emotions driving music choices.
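The general technique behind such a system can be sketched simply: heterogeneous signals, speech-derived cues plus identity and environmental metadata, are flattened into one numeric feature vector for a downstream model. The sketch below is a minimal illustration under hypothetical field names (`pitch_variance`, `age_bucket`, and so on); it shows the feature-fusion idea in general, not Spotify’s actual patented implementation.

```python
from dataclasses import dataclass

# Hypothetical listener context: speech-derived emotional cues combined
# with identity-based and environmental metadata, as the passage describes.
@dataclass
class ListenerContext:
    pitch_variance: float  # intonation proxy extracted from the speech signal
    speech_rate: float     # stress/rhythm proxy (e.g., syllables per second)
    age_bucket: int        # identity-based metadata
    is_outdoors: bool      # environmental metadata from location sensing

def feature_vector(ctx: ListenerContext) -> list[float]:
    """Flatten the heterogeneous context into one numeric feature vector
    that a downstream recommender or classifier could consume."""
    return [ctx.pitch_variance, ctx.speech_rate,
            float(ctx.age_bucket), float(ctx.is_outdoors)]

vec = feature_vector(ListenerContext(0.42, 3.1, 2, True))
# → [0.42, 3.1, 2.0, 1.0]
```

In a real pipeline, a trained model would map such vectors to inferred emotional states; the point here is only how disparate signals are merged into a single representation.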
During the COVID-19 pandemic, Zoom became a living room for billions of people and a global experiment in emotional analysis. Zoom analyzes users’ facial expressions, vocal patterns, body language, and gestures to deduce collective human behavior. In May 2022, Fight for the Future and twenty-seven other human rights organizations wrote an open letter to Zoom, calling on the company to halt its AI-driven emotion-tracking software.53
While the efficacy of sentiment analysis continues to improve, the intent to weaponize collective emotions remains a serious concern. How can regulators break away from these binary choices and steer tech toward a care-based approach? Can tech companies reverse engineer these tools to benefit society?
Tech companies must be pushed into action. Meta can build on such analytics to reduce sexual harassment, foster mental health, and build community. Spotify can repurpose itself as a music therapy portal and provide comfort to millions with better targeted playlists. Microsoft’s Seeing AI, a tool that helps blind people record their experiences and events, can be enhanced with sentiment analysis to deepen users’ recorded memories.54 Zoom can continue to build on its living-room feeling by designing for contagious joy, shared grief, and solidarity, as we witness online weddings, funerals, birthdays, graduations, and other events that mark our lives.
Take medicine, supposedly the most rational of fields. Clinical practitioners are trained to view emotion as obstructive to care. “Hysterical” patients need containment. There is little space for depression, anger, or faith in decision-making. In recent years medical experts have started to question “objective” care as they find evidence that some providers offer discriminatory treatment based on race, class, gender, age, and ethnicity. Health sociologist Amanda Gengler pushes us to elevate emotions in health care to make better decisions about care.55 Emotions, when imbued with symbolic value and mobilized in decision-making, could make decisions easier to arrive at, increase confidence in the decisions made, and stave off interpersonal conflict.56
Understandably, however, attending to emotion is difficult and exhausting. Nurses and doctors continue to suffer burnout. Our global health care system is under tremendous, unsustainable strain. Sentiment analysis can play a part in reducing the burden of care among providers. These tools can assess patients’ states of mind and generate the messages most likely to encourage responsible behavior.57 AI analytics of social media content can capture anger, angst, and fear around topics like vaccines and help craft more nuanced strategies for outreach and awareness. Digital surveillance systems and prediction models can analyze public emotion in real time alongside disease progression, in relation to demographic and geographic contexts. Sentiment-mining techniques could become one of the most critical tools for public health officials in designing evidence-informed policy.
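At its simplest, the sentiment mining described above scores a piece of text against a lexicon of emotion-laden words. The sketch below is a toy version of that idea: the mini-lexicon and sample posts are hypothetical, and production systems rely on large validated lexicons or trained classifiers, but the scoring logic is the same in spirit.

```python
# Toy lexicon-based sentiment scoring for short public-health posts.
# The mini-lexicon and the sample posts below are hypothetical.
LEXICON = {
    "afraid": -2.0, "angry": -3.0, "worried": -2.0, "scared": -2.0,
    "safe": 2.0, "relieved": 2.0, "grateful": 3.0, "hopeful": 2.0,
}

def sentiment_score(text: str) -> float:
    """Average the lexicon scores of recognized words; 0.0 if none match."""
    words = [w.strip(".,!?'").lower() for w in text.split()]
    hits = [LEXICON[w] for w in words if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

posts = [
    "I'm scared and worried about the side effects",
    "Got my shot today, feeling relieved and grateful!",
]
scores = [sentiment_score(p) for p in posts]  # negative, then positive
```

Aggregated over millions of posts, and segmented by time, place, and demographics, even scores this crude begin to trace the contours of public emotion that health officials could act on.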
While health systems are having their “emotional” moment and tech companies are pushed to recalibrate for compassion, self-care apps are reaching out directly to users. Personalized care is built on mass emotion and speaks especially to users who have long felt forgotten.
In one sense, apps are bandages over a global gap in accessible, formal mental health programs. At the same time, they are immediate market responses to people’s needs, and they could stimulate conversation and policy on better institutional mental health access for all.
In the last decade, mobile health (mHealth) apps have skyrocketed in user uptake, partly due to billions of Global South users coming online. The mHealth market is booming globally, valued at $38.2 billion in 2021 and expected to expand at a compound annual growth rate of 11.8 percent from 2022 to 2030.58 These apps span the full spectrum of self-training, symptom management, information seeking, risk assessment, and home monitoring. While the tools drop the human intermediary, sentiment analytics–enabled chatbots can introduce a humanizing dimension and improve the digital bedside manner.
Global South millennials are leaders in app-ifying their emotions. China, Turkey, and Argentina were three of the top five countries generating self-care app revenue in 2022. If today’s self-care market is any guide, the emotional economy is not just lucrative but also empowering. Emotion is now at the center of a growing genre of self-care apps that are especially popular among millennials. Moodpath helps users track their moods for daily self-reflection. I Am offers daily affirmations to perk up youth during their day. Virtual Hope Box provides games, coping cards, feel-good visuals, and soothing music to help users deal with stress. Calm helps you meditate. Meditation apps alone were projected to earn $3.71 billion in 2022.59
Bots have become increasingly anthropomorphic as emotion enters algorithmic solutions. AI companies have added feminine features to build comfort and trust among their users.60 People are increasingly assigning qualities like warmth, friendliness, empathy, and communal feelings to AI tools. Consumer ethicist Sylvie Borau and her team point to an ethical quandary that AI designers and policymakers face. They find that gendering tech helps improve consumer uptake of apps. However, they argue that if we destigmatize emotion and elevate it as a critical asset, we could transform the gendering of design to be empowering, positing that feminizing AI could “inject a unique human essence into such lifeless, inanimate tools.”61 In turn, it could lead to some form of customized care where AI lends a helping hand.
Mental health is a pervasive concern as our lives become increasingly digitized and automated and we grow alienated from one another. Social well-being is of paramount concern as the fabric of emotion tears under divisiveness of all stripes. Emotion runs high.
Gallup, a US data analytics firm, conducts an annual poll on global emotions to reveal the emotional state of people in more than one hundred countries. The 2022 report showed that people have become significantly unhappier over the years and that one-fifth of all adults do not have a single person they can count on for help.62 War, poverty, loss of work, environmental devastation, political instability, and loneliness have made us more miserable. Riots, strikes, and demonstrations increased 244 percent from 2011 to 2019. Negative emotions make people crueler. Depression can lead to death. Loneliness can make us bitter. The report concluded with a push to rethink our detachment from one another, as a “world filled with negative emotions make[s] people behave differently.”63 Negative emotions beget negative decisions and destructive actions, which spread faster than a virus, especially when amplified by social media.
Moreover, while people, especially in the Global South, struggle to get jobs, fulfill their aspirations, or find more humane work environments, politicians in the West are increasingly expressing antigrowth sentiments. The Economist marked the end of 2022 with a reflective piece on how the West has fallen out of love with growth and has distanced itself from free-market ideals.64 This polarizing worldview is further widening the North-South divide. The antigrowth sentiment driven by an aging Western population can stifle the collective aspirations of young populations in the Global South for a better life, more innovation, and serious reforms. What if tech companies could channel this angst into productive care by ensuring our digital tools work for humane growth?
Emotion needs rebranding. Let us infuse it with the meaning of joint action. Collective emotion should be neither romanticized nor denigrated. It is a tool much like any other that, when used to our advantage, can inspire decency, kindness, and fairness. Like all tools, it requires a village of stakeholders to become a success. Algorithms, when closely audited, monitored, and accounted for, can be a friend to whom we unburden ourselves of the weight of social expectations, unbearable routines, and crippling loneliness. Claims made by apps, tools, and devices, however, need humbling.
An emotive nudge can produce a ripple effect to meet a need. The trivial adds up and can create a cultural shift. The marginalized recognize virtue in marginal change, coming from the inside, slowly, stealthily, but with cumulative impact. Women working within conservative family settings and informal workers in precarious jobs can draw strength from the collective anonymity of emotive analytics to signal change.
The surveillance machine, if designed as Jane Jacobs envisioned, can help us take care of one another. Users can do their bit by refusing to stay silent in the face of online toxicity. Designers must figure out how to repurpose virality, attention, personalization, and other data tools at their disposal to nurture our mental, physical, and social health. Politicians need not throw the baby out with the bathwater; they can use policy to build competitive standards of care and compassion. To code with care is to design for context. The growing global climate crisis, loss of biodiversity, and environmental degradation demand that we extend our social care apparatus to planetary well-being.