5

Technology

As we discussed in the previous chapters, the global food system is in need of serious changes. However, the recipes for solutions that have been emerging diverge greatly, pointing to very different concerns and interpretations of the current situation. Are we all going to starve, as Malthus prophesied in the eighteenth century? Or will we always find ways to feed humankind, regardless of its expansion, thanks to scientific advancements, technologies, and resourcefulness? What transformations should take place for us to be resilient, reacting and adapting to the consequences of climate change? Will they be based on intensive industrial agriculture, on the rediscovery and adaptation of our ancestors’ ways, on the application of evolving scientific research on ecology and sustainability, or on solutions that draw from all these approaches? No response is objective and neutral: each is rather the expression of ideologies and political negotiations that are solidly rooted in our present and in our evaluation of the societies we live in.1

This chapter will focus on how the growing role of technology in the global food system both eases and amplifies concerns about what, why, and how we eat now and what the future holds. In particular, we’ll explore the ownership of intellectual property, the role and scope of innovation in food production, and the relationships of consumers with technological change in their daily lives. To examine these issues, we first need to look at how the contributions—or the threats—of technology and science to the food system are understood and dealt with in practice. The evaluation of the risks and benefits of technology depends on who is doing the assessment, their position in the food system, and what kind of negative or positive impact they could experience.

Utopias and Dystopias

In political, media, and civil society debates about the introduction of technology, visions for the future emerge on a continuum between utopia and dystopia, with humankind’s ingenuity and creativity always producing new, positive solutions on one end and nightmarish scenarios, in which science and technology become tools of oppression and destruction or rebel against their own creators, on the other end. Of course, certain utopian approaches disavow technology completely and do not even consider its potential. We can recognize such positions in the Luddites in England, who in the early nineteenth century saw the destruction of machines as the only viable opposition to exploitation, and in the 1960s and 1970s back-to-the-land movement in the United States, in which emphasis was given to physical work, self-reliance, and manual production of goods to escape the strictures of capitalist society. The utopian points of view that embrace technology instead express the stance that the food system can only gain from the introduction of innovations ranging from laboratory experimentations on GMOs to replacement meats. This kind of future has constituted a boundless frontier for the imagination in popular culture: Will we feed ourselves with pills that contain all the nutrients we need? Will we be able to construct food out of thin air, assembling atoms floating in the environment, like in The Jetsons? Will small, dehydrated lumps turn into full-on roast chicken meals, like in Luc Besson’s 1997 film, The Fifth Element? Or will we instead be forced to consume deceased humans in the form of protein bars, like in Richard Fleischer’s 1973 film, Soylent Green?


The culinary arts have been at the forefront of the explorations of what and how we eat, and above all how to cook, generating stimulating ideas for chefs, researchers, food scientists, and food enthusiasts. New techniques such as foam making and sous vide cooking have spread quickly, while research into the chemical and physical characteristics of ingredients has spurred the use of liquid nitrogen, alginates, and collagens in the kitchen.2 Molecular gastronomy (which studies the physical and chemical changes food undergoes when cooked) and neurogastronomy (focusing on how the brain processes flavors, smells, and textures) are among the fields of research that have emerged from innovative interactions among culinary arts, sciences, and technology.3 Such approaches, however, do not claim to offer contributions to the larger issues haunting the food system.

Reliance on and even delight in technology can provoke visceral reactions that take the shape of dystopian visions, as justified fears exist that mass manufacturing, mechanization, and the intensive use of food-science research will lead to a total dehumanization of the food system and increased risks for human health and the environment. Technology is identified as a cause for the exclusion of whole segments of the human population from crucial decisions about what we grow and consume. Within this camp, forms of opposition to the current food system are emerging that support human-centered production models. In the culinary world, chefs are embracing farm-to-table dining; they highlight the provenance of their ingredients—emphasizing local and organic ones in particular—and stress the skill in their craft. The nutritional features, the origin, and the cultural value of what we eat are increasingly relevant for growing segments of consumers in the Global North. Such approaches, while placing health and sustainability at their core, can veer toward nostalgia for traditional and artisanal foods and express indifference—if not open mistrust—toward innovation. Embraced by numerous proponents of the food movement that aims to change the global food system for the better, these attitudes have at times been dubbed culinary Luddism because they may fail to acknowledge the contribution of modernization to food security in the forms of abundant, durable, accessible, and affordable products.4 The interventions they favor have been criticized as elitist and ineffective in tackling the enormous scale of the food system and its problems.

The utopian perspective considers technological globalization inevitable at worst and an advantage at best, criticizing opposing perspectives as pastoral nostalgia. The dystopian outlook, meanwhile, decries excessive reliance on technology as a scourge unleashed all around the world by political and technocratic choices that can be opposed, or at least mitigated, by embracing the local dimension as the gold standard. However, the dichotomy between tradition and modernity, between artisanal skills and technology, does not hold. Ancient agricultural methods can support contemporary ecological approaches to food production, while newfangled tools in food science may support dying artisanal traditions, improving food quality, safety, and durability. Change has been integral to the food system since its inception at the beginning of the agricultural revolution, thousands of years ago. It doesn’t pay to demonize technology, lest we forget that even a spoon is a piece of technology—and an evolving one at that. The interaction of science and creativity can generate new opportunities in all aspects of the food system. Technology has the potential to improve agricultural productivity, resilience to climate change, and the environmental impact of crop cultivation, livestock farming, and fisheries. Innovative instruments can be applied to develop vertical farming, sustainable fisheries, and humane animal pens.

What really counts is who controls the research agenda, whose interests and priorities the research reflects, and who owns the intellectual property tied to the discoveries that derive from it. We always need to consider how technology spreads, who has the financial power to invest in it and implement it, and whether it favors or hinders the democratic participation of all stakeholders involved.


Who Owns Technology? The GMO Case

Assessments of the social impact of new technologies in terms of who has access to them and who gains from them are crucial, but they have not always been at the forefront of rural development agendas. Echoing mounting fears that there would not be enough food for a growing human population, the 1960s and 1970s saw intense top-down efforts to introduce technologies and management methods in agriculture as part of the green revolution. The term was coined in 1968 by William Gaud, director of the US Agency for International Development, to denote a set of measures aimed at increasing agricultural output by introducing high-yield crop varieties, often grown in monoculture with the support of fertilizers, pesticides, irrigation, and mechanization. The Rockefeller Foundation and the Ford Foundation, international organizations like the FAO, and the United States, which provided economic aid and could use its political weight, promoted these interventions among governments in the developing world. The green revolution saw its first concrete application with the introduction of wheat varieties in Mexico right after World War II, which turned the country into a wheat exporter within twenty years. This new agricultural approach soon was applied in many Southeast Asian countries and India, which also implemented ambitious measures such as the systematic exploitation of groundwater resources, the adoption of new land-management techniques, and electrification.5

Indonesia planted new rice varieties, the selection of which had started in laboratories in the Philippines in the early 1960s. The government made major investments in creating the scientific infrastructure necessary to adapt these varieties to local environments. It also launched the extension services and farming cooperatives necessary to spread the new technologies among farmers, while building a rural banking system and a bureaucracy able to manage the effort. Rice production tripled over about thirty years, allowing Indonesia to achieve self-sufficiency and to start exporting. However, most efforts focused on land that was already the most productive, like the island of Java. Over time, the focus on intensifying the production of a few varieties had a negative impact on biodiversity and increased the vulnerability of harvests: a single infestation could wipe out whole crops, as happened in the mid-1970s with the outbreak of the brown planthopper. The new technologies also brought an increase in the use of chemical inputs that was not strictly necessary but was encouraged by local policies; when subsidies for pesticides were stopped in the early 1990s, farmers immediately started applying less of them, without major consequences. The innovation also involved profound cultural changes; segments of the population in Bali, for example, resisted the new methods because they impinged on subak, the traditional management of irrigated paddy fields.

Attempts to introduce the green revolution in Africa were less successful than those in Asia, due to factors such as corruption and inefficiency in local and national governments, lack of infrastructure, and specific environmental issues like water scarcity and soil fragility. The green revolution, hailed by many as a success, particularly in terms of yields, often proved to be unsustainable in the long term from an environmental point of view, causing a loss of biodiversity, water overuse, soil impoverishment, pollution related to herbicides and fertilizers, and increased use of fossil fuels to operate machinery.6 Moreover, the direct involvement of transnational agribusiness corporations and their research and development departments guaranteed dominant positions for the holders of intellectual property tied to seeds and other technological inputs.

The most glaring examples of the social and political relevance of ownership of technology are found in cases tied to genetically modified (GM) crops, in which genes are transferred among varieties of the same species and across species without the sexual crossing that characterizes traditional methods of hybridization and selection. So far, public debates have mostly focused on the risks the introduction of GM organisms may generate for human health and the environment. However, more studies and clinical trials are required, and research has not yet reached conclusive results.7 Less attention has been paid to who retains the legal rights to the use of the technology, what model of agriculture it fosters, how it is introduced, and what kind of crops or animals are targeted for research and development.

The historical turning point in the establishment of the legal framework for the ownership and commercialization of genetic material was the US Supreme Court’s 1980 decision in Diamond v. Chakrabarty, which determined that anything made by man, including living organisms, can be patented. The decision opened the floodgates for biotechnology, marking a shift in funding from public institutions, which in the past had been the main actors in research on plants and animals, to private corporations. Biotech focused on profitable endeavors such as genetic modifications aimed at increasing yields and enhancing the herbicide and pest resistance of commercial crops—especially canola, corn, soybeans, and cotton—that are often cultivated in high-input monocultures and can take advantage of economies of scale. Research dynamics and the adoption of GM crops are different in countries like Brazil, Argentina, China, and India, where governments have intervened with policies that prioritize national development.8 Few efforts have instead targeted crops that are relevant to farmers in developing countries: drought- and pest-resistant high-yield cassava, millet, sorghum, yams, and bananas could contribute positively to food security.

Divergent approaches to risk assessment exist in terms of human health and environmental consequences. The EU embraces a precautionary principle that bars the introduction of GM crops unless tests and trials demonstrate that they do not constitute a threat to humans and the environment. The burden of proof rests on the proponents of the new genetic material. Mistrust toward GMOs is so deeply rooted among European citizens that even after approval at the EU level, whole countries refuse to plant GM crops—or to import products containing GM ingredients, unless they’re clearly labeled. Beyond the environmental risks, opponents of GMOs also argue that corporate control over the direction and focus of research has a global impact on biosafety regulations, agricultural policies, development strategies, and global trade. Concerns about these issues led to the Cartagena Protocol on Biosafety to the Convention on Biological Diversity, an international agreement that entered into force in 2003 and aims to protect biodiversity against the risks of exposure to GMOs. However, trade controversies can arise between countries that opt for preventive procedures and those that embrace and implement the production of GM crops, as many governments, including the United States, have opted for a more reactive—rather than precautionary—approach to risk: according to this point of view, new genetic material can be introduced into the environment if it meets acceptable standards of safety, often set by the biotech firms themselves. Interventions take place in case of irrefutable evidence of damage or if problems occur, and the burden of proof shifts to the public.

Regardless of what anyone thinks of the dangers of GM crops, their diffusion may be problematic for farmers who want to stick to non-GM varieties. Pollen does not stop at the edges of fields, and cross-pollination with crops on nearby properties can take place easily. Because GMOs are covered by intellectual property laws, cross-pollination can be considered an infringement. Biotech corporations frame such occurrences in terms of seed piracy, in the sense that seeds containing their proprietary, patented genetic material turn up in areas in which they have not been approved and where farmers have not paid the license fees that cover their use. In 1998, Monsanto brought Percy Schmeiser, a Canadian farmer, to court for having illegally saved seeds for the following agricultural cycle. The farmer argued that he was being prosecuted because a Monsanto variety had ended up in his fields accidentally due to cross-pollination, but the courts sided with the biotechnology giant in 2004.9 Licenses are legal tools meant to discourage farmers from saving seeds from one harvest and using them to grow crops for the following one. Since the early 1990s, technologies have existed for so-called terminator genes—technically known as genetic use-restriction technology (GURT)—which would make seeds sterile and stop the diffusion of second-generation seeds. However, such technologies are not commercially available, and a moratorium on their use was discussed in 2006 at the United Nations Convention on Biological Diversity in Curitiba. The very existence of a market for stealth seeds (GM seeds that farmers save, exchange, crossbreed, and sell, regardless of biosafety concerns) points to the fact that terminator genes have not been introduced into living plants.10

Biotech companies hope that new methods such as CRISPR-Cas9, a high-precision genome-editing technique, can change public perceptions of genetic alterations. Earlier techniques used bacteria to insert genes into DNA sequences without much precision, requiring long and expensive trials; CRISPR-Cas9, by contrast, doesn’t necessarily introduce foreign DNA but instead deletes or alters traits already present in the genetic material of the organism. The process is also much cheaper, which over time could make it available to actors other than large biotech corporations. However, opponents of genetic modification in food point out that a different technology doesn’t change the risks for human consumption and for the environment.

Glimpsing the Future of Food Production

Although less intense, similar debates about intellectual property, ownership, accessibility, and governance extend to precision farming, climate-smart agriculture, and e-agriculture. These three descriptors all refer, more or less interchangeably, to the introduction into the rural world of information and communication technologies (ICT), as well as the Internet of Things (IoT), the exchange of data among tools, machinery, sensors, software, and mobile applications. The use of cellular platforms in rural areas is one of the most promising and viable innovations. Most farmers now have access to simple handheld devices with voice and text capabilities, a much cheaper option than building telephone landlines in remote locations. Building on the diffusion of cell phones among farmers, easy-to-use mobile applications allow them to have a better sense of the current market prices for their crops and the costs of inputs while accessing financing opportunities, insurance tools, and real-time information about weather events. Farmers are introduced to knowledge-intensive practices, including data point analysis and alternative modes of receiving information, seeing, counting, and deciding, which complicate expectations about what local and traditional mean.

Other technologies are more capital-intensive. Besides geolocation, GPS-based systems help generate images of agricultural land, which, together with other parcel-identification tools, can be used to determine subsidies for farmers and provide other services (as is already happening in the EU). Drones provide visual information about the state of the fields, the presence of pests and vermin, and the effects of the weather. Software that manages irrigation by regulating valves and pumps contributes to saving water through more efficient distribution. Sensors located in fields send data about soil moisture, temperatures, sun exposure, and crop health to mobile applications that are easily accessed remotely. Sensory technology is also applied in fish and livestock farming to track the behavior and the movements of the animals, monitor their health, and receive updates about pregnancies and births (in the case of livestock). Sensors and tools based on GPS technology have been employed widely in the fishing industry to follow the movements of fish, identify their feeding patterns, and monitor changes in currents and temperatures due to weather events and climate change.
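To give a concrete sense of the logic behind such irrigation software, here is a minimal sketch in Python; the sensor model, threshold value, and function names are hypothetical illustrations of the general principle, not features of any commercial product.

```python
from dataclasses import dataclass

# Assumed threshold: irrigate a zone when soil moisture falls below
# this fraction of saturation. Real systems tune this per crop and soil.
MOISTURE_THRESHOLD = 0.30

@dataclass
class FieldSensor:
    zone: str
    moisture: float  # latest reading, 0.0 (dry) to 1.0 (saturated)

def irrigation_plan(sensors: list[FieldSensor]) -> dict[str, bool]:
    """Decide, zone by zone, whether to open the irrigation valve.
    Watering only the zones that report dryness is where the water
    savings described above come from."""
    return {s.zone: s.moisture < MOISTURE_THRESHOLD for s in sensors}

readings = [FieldSensor("north", 0.22), FieldSensor("south", 0.41)]
print(irrigation_plan(readings))  # {'north': True, 'south': False}
```

A real controller would receive readings over a wireless network and actuate physical valves and pumps, but the underlying decision logic can be as simple as this threshold test.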

These innovations complicate the fantasy, especially widespread among affluent consumers in postindustrial societies, of traditionally grown crops and traditionally raised animals, generally perceived as safer, healthier, more authentic, and more meaningful. Although innovations can help farmers reduce the use of pesticides and other inputs, the use of cutting-edge technology takes away from the sense of connecting with real people—farmers, shepherds, fishermen, artisans—who intervene in person in food production, getting their hands dirty. Besides challenging consumers’ perceptions, the diffusion of high-tech solutions for farming raises serious political issues because it could further concentrate control of the global food system in the hands of a few highly capitalized firms and financial investors. As ICT and IoT generate an unprecedented quantity of data that provides invaluable information for farmers, the industry as a whole, and researchers, concerns arise about how this data is protected and who has access to it. Can the communities involved have a say in how this information is used? Will data be public and free, accessible for a fee, or even sold as an asset? Will its analysis and use be restricted to the tech companies that generate and manage it? As the vulnerability of computer networks becomes painfully clear, could hackers gain access to data about food production, with great risks in terms of food security and food safety? Could information be weaponized to stir financial panic on the commodity markets, to cause dysfunction or paralysis in food-distribution networks, or to make intellectual property available to anybody?

At the same time, the use of data could bring positive change. Great excitement has accompanied the development of blockchain technology—best known for its use in virtual currencies—and its possible applications in supply networks. Using encryption to keep information secure, blockchain constitutes a dispersed database of transactions (known as a digital ledger) that all participants in a network have access to. To be verified and recorded, every transaction must be approved by the networked computers; no single participant has control over the network or can modify transactions independently. Start-ups are applying the new technology to ensure traceability in supply networks through data, confirmations, and certificates, which should make pinpointing critical data easier in the case of a food-safety emergency. Each actor could add information about costs and payments that would keep the whole supply network completely transparent. The risk of fake information would still exist but would be reduced by the integration of geographic information system (GIS) technologies, satellite photography, and peer-to-peer controls. Qualitative data could be included and made available to all users, allowing consumers to verify who grew their food and where those farmers are located.
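As a rough illustration of how such a digital ledger works, the following Python sketch (a simplified example of the general technique, not any start-up’s actual system) chains supply-network records together by hashing, so that altering an earlier record invalidates every block that follows it:

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class SupplyLedger:
    """A toy append-only ledger: each block stores one supply-network
    event plus the hash of the block before it."""

    def __init__(self):
        genesis = {"index": 0, "timestamp": 0.0,
                   "event": "genesis", "prev_hash": "0" * 64}
        self.chain = [genesis]

    def add_event(self, event: dict) -> None:
        prev = self.chain[-1]
        self.chain.append({"index": prev["index"] + 1,
                           "timestamp": time.time(),
                           "event": event,
                           "prev_hash": block_hash(prev)})

    def verify(self) -> bool:
        """Recompute every link; tampering with an earlier block
        breaks the hashes of all blocks after it."""
        return all(self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = SupplyLedger()
ledger.add_event({"actor": "farm", "action": "harvest", "lot": "A-17"})
ledger.add_event({"actor": "shipper", "action": "pickup", "lot": "A-17"})
assert ledger.verify()
```

What this single-machine sketch leaves out is precisely the distributed element described above: in a real blockchain, copies of the chain are held by all participants, and a new block is accepted only after approval by the networked computers.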

At a smaller scale, tensions between the productive potential of new technologies and concerns about access, use, and cost also surround the development of hydroponics, a method for growing plants without soil that employs solutions of nutrients and water in combination with fluorescent or LED lamps. Allowing for food production in closed and limited spaces, hydroponics has been hailed as a new frontier for urban agriculture and a tool to increase the resilience of urban supply networks in case of disasters (so long as the installations become autonomous in terms of energy thanks to solar panels, wind turbines, or other technologies). Large hydroponic facilities have been installed inside abandoned industrial buildings, as in the case of AeroFarms in Newark, New Jersey. The company has patented a system in which plants grow on cloth, fed by a fine mist and kept growing thanks to LED lighting in a controlled and contained environment; in this way, the company reduces the amount of land, water, and pesticides used while repurposing existing structures.11 In Singapore, where real estate is a luxury, urban farmers have embraced hydroponics and similar technologies to provide fresh salads and herbs while also integrating composting and fish ponds into aquaponics projects.12 In such systems, hydroponics is connected to aquaculture, using the waste from fish farming to provide the nutrients necessary for plant growth. There are doubts among consumers about the nutritional content and flavor of plants grown indoors without soil and exposure to natural light, and the impact of hydroponics and aquaponics will greatly depend on the ownership of the technology, its price and accessibility, and its management. Will such production be controlled by large companies that have the financial means to secure the necessary investments, or could ownership be distributed among citizens?

Consumers and Technology

Innovation also influences aspects of the food system that are much closer to consumers but much less visible. Let’s consider just one aspect of the food system: distribution. Computer networks make it possible for food to get to us smoothly: deliveries are organized, stockrooms are kept full, and we can even shop online. Many of the infrastructures that support the processing, warehousing, delivery, and retail of food are so embedded in supply networks that they may be hard to notice. The invention of refrigerated train cars; the freezing technology that connects industries to domestic freezers through specialized trucks, warehouses, and dedicated structures in retail; and the introduction of forklifts, pallets, and containers into food transportation are just a few of the many innovations that have shaped global distribution in past decades.

Technologies function at different scales, from massive machinery all the way to the most intimate dimensions of our bodies. Wearable appliances as small as watches can now easily monitor our movements, our blood pressure, and the intensity of our exercise. We can store and carry easy-to-administer medications in case we eat something that provokes an allergic reaction or, worse, an anaphylactic shock. We can ingest cameras to check how our stomachs and intestines digest the food we eat.13 Nanotechnologies constitute a constantly expanding frontier, creating opportunities to track our physiology at an unprecedented level of detail. Research is looking closely at our intestinal microbiota—that is, the “ecological community of commensal, symbiotic, and pathogenic microorganisms” that share our body space—which is now indicated as both the possible cause of and the solution to many health problems.14 At a macro level, innovation takes inspiration from nature, creating processes that mimic its metabolic activities in all their ecological complexity, from microorganisms in the soil to the use of cover crops that disrupt the growth of pathogens after the harvest. Fungi and algae are being studied as tools to produce biofuels, compost waste, and even provide biodegradable building materials. Meat protein is being grown in labs, and tests are being conducted to produce it commercially, while plant-based burgers designed to look, smell, and taste much like beef are already sold in restaurants.15

Kitchen appliances are food-related technologies that constantly interact with humans without much tension, especially when they offer convenience and efficiency. Most consumers seem to have gotten over any mistrust toward the introduction of frozen food in the 1950s and microwaves in the 1980s, which required profound cultural changes with regard to ideas about food quality, freshness, and safety, as well as the agency of the cook. Today, innovations such as sous vide machines, convection ovens, induction burners, and digital refrigeration monitoring can be found in both domestic and professional kitchens. When applied to food manufacturing, restaurants, and large institutions, such innovations can offer more efficient use of ingredients, inputs, and energy. Other, apparently more fanciful appliances are met with reactions that range from curiosity to amusement. Fridges can keep tabs on what food is going bad inside or what products are running out, connecting with online grocery shops that arrange deliveries of what’s needed. Such an application of IoT may come across as far from essential, but it works well for individuals who forget about food, such as patients suffering from Alzheimer’s disease or dementia, and helps them eat properly, improving their health and relieving family and friends of some worries. Domestic appliances can provide solutions that improve quality of life for people with disabilities. Kitchenware and silverware are being designed to allow blind individuals to eat more easily, and 3-D printers might let friends and family share recipes and actual food at a distance by using the same files to print edible matter.
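A minimal sketch of the logic such a connected fridge might run appears below; the items, dates, and reorder rule are hypothetical, and an actual appliance would place orders through a grocery service rather than print a list.

```python
import datetime

# Hypothetical inventory tracked by the fridge: item -> (quantity, expiration date)
inventory = {
    "milk": (1, datetime.date(2024, 5, 3)),
    "eggs": (0, None),
    "spinach": (1, datetime.date(2024, 5, 1)),
}

def reorder_list(today: datetime.date) -> list[str]:
    """Flag items that are out of stock or past their expiration date,
    producing the list an online grocery service could fulfill."""
    to_order = []
    for item, (quantity, expires) in inventory.items():
        if quantity == 0 or (expires is not None and expires <= today):
            to_order.append(item)
    return to_order

print(reorder_list(datetime.date(2024, 5, 2)))  # ['eggs', 'spinach']
```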

Food design, a new field of research and practice within the discipline of design, is developing as an answer to these opportunities and challenges through interaction with chefs, producers, and food-studies scholars. The field is growing fast in Western Europe and South America and less quickly in North America, Asia (with the exception of East Asia), and Africa. As defined by Food Design North America (of which I am among the founders), food design “includes any action that can improve our relationship with food individually or collectively. These actions can relate to the design of food products, materials, practices, environments, systems, processes and experiences.”16 The association clarifies that the working definition needs to be considered a point of departure, not a conclusion, because the goal is to open up dialogues rather than to offer schematic or reductionist demarcations. In recent years, design has turned its attention to all aspects of the food we produce and eat, from tableware to restaurant design, from experience to networks. This interest is part of the evolution of design itself, which has expanded its horizons from objects and spaces to include knowledge-intensive forms of processes, services, and systems. Designers who opt for human-centered innovation participate in the development of projects that recognize the priorities, values, and needs of all the actors involved, especially those whose voices are least heard. These projects tend to consider complex contexts and situations to test prototypes that can then be improved through feedback from users and local communities. This reflects a change in the involvement of stakeholders, who move from being mere recipients of the professional interventions of designers to becoming codesigners and participants.

Such approaches could guide new technologies to harmonize the need for greater food availability with efforts to ensure long-term sustainability and to reflect the preferences of consumers. Will new technologies usher in greater democratization of the food system, or will they intensify the inequalities between the haves and the have-nots? Is technology the only way to improve efficiency and yields in food production? More importantly, is producing more the single most urgent priority? These questions are crucial at a time when, despite the growing availability, accessibility, and affordability of food, many individuals, families, and whole communities struggle to get proper nourishment, as we will discuss in the next chapter.