Page numbers in italic indicate a figure and page numbers in bold indicate a table on the corresponding page.
- A
- Abstract features, categorization based on, 82–83
- Admissions. See College admissions scenarios
- Agency. See Moral agency
- Agrawal, Ajay, 109
- AI. See Artificial intelligence (AI)
- Airline reservation algorithm, 64
- Alexa, privacy concerns with, 88
- Algorithmic aversion, 5–6, 8, 37
- Algorithmic bias, 8
- college admissions scenarios for, 70–71, 73, 75, 79, 83–84
- demographic correlates with, 69, 73
- equity versus predictive accuracy in, 83–85
- examples of, 64–65
- explicit versus abstract features in, 82–83
- harm-wrongness ratings of, 131
- human resource screenings scenarios for, 70, 74, 78
- implicit association tests of, 19–20
- Likert-type scale for, 68–69
- origins of, 64–65
- policing scenarios for, 72, 77, 81, 84–85
- salary increase scenarios for, 71–72, 76, 80
- Amazon Mechanical Turk, 5n, 28, 166
- Ambush scenario
- moral dimensions of, 186, 186
- moral space representation of, 128, 129, 131–132
- American Airlines reservation algorithm, 64
- Amusement park scenario
- moral dimensions of, 193, 193
- moral space representation of, 128, 129
- Anthem scenarios. See National symbol scenarios
- Artificial intelligence (AI)
- advances in, 14
- interactions with, 2–3
- liability for, 156–159
- moral agency of, 7, 16–18, 37
- moral rules versus social networks in, 150–156
- moral status of, 7, 16–18
- participant attitudes toward, 169–170, 171
- regulation of, 152
- responsibility for actions of, 156–159
- strong, 15–16
- weak, 15–16
- Asimov, Isaac, “three laws of robotics,” 150–153
- Authority
- definition of, 21
- measurement of, 28–29, 28
- perceived levels of (see Moral scenarios)
- Automatic associations
- cultural differences in, 22
- implicit association tests of, 19–20
- Automatic teller machines (ATMs), introduction of, 107–108
- Autonomous vehicle scenarios
- moral dimensions of, 23–24, 50–55, 53–54
- moral space representation of, 127, 128, 129, 130, 131
- Autonomous weapon systems, 14
- B
- Bank robbery scenario
- moral dimensions of, 185, 185
- moral space representation of, 128, 129
- Bias. See also Algorithmic bias
- intergroup, 124, 125n
- selection, 167
- Bimodal judgment, 9, 146, 157
- Blame, attribution of. See also Responsibility
- in peanut butter allergy scenario, 23
- in trolley problem, 174–176, 175
- Blasphemous comedian scenario
- moral dimensions of, 46–47, 48
- moral space representation of, 128, 129
- Botnik, 40–41
- Brand names scenario
- moral dimensions of, 192, 192
- moral space representation of, 128, 129
- Brynjolfsson, E., 119
- Bureaucracies, judgment of, 159–162
- C
- Camera bots, 88
- Chatbots, 3
- Citizen scoring scenario, 100–103, 101
- Civil engineer scenarios
- moral dimensions of, 201–202, 201, 202
- moral space representation of, 128, 129
- Civil Rights Act (1964), 84
- Cleaning robots, 17
- Clopening, 106
- Codelco, 50
- College admissions scenarios
- efficiency versus equity in, 83–84
- moral dimensions of, 70–71, 73, 75, 79
- moral space representation of, 128, 129, 131
- Comedian sketch scenario. See Blasphemous comedian scenario
- Competence, person perception and, 82
- Computer vision systems
- moral dimensions of, 91–96, 94–95
- privacy concerns with, 88
- Condorcet, Nicolas de, 121
- Congruent trials, 19–20
- Control conditions, 4
- Conversational robots. See Chatbots
- Counseling scenario
- moral dimensions of, 190, 190
- moral space representation of, 128, 129
- Creative artificial intelligence (AI)
- capabilities of, 40–41
- creative industry scenarios, 46–49, 48–49
- limitations of, 41n
- marketing scenarios, 42–44, 44–45
- moral space representation of, 128, 129
- Culture, impact on moral judgment, 22
- D
- Data privacy. See Privacy and surveillance concerns
- Deep fake videos, 40
- Demographic parity, 65–66
- Demographics, 9
- algorithmic bias and, 69, 73
- effects on moral judgment, 141–146, 142–143, 145
- of study participants, 167–169, 168
- Desecration of national symbols. See National symbol scenarios
- Detroit (game), 153–154
- Differential privacy, 90–91
- Digital records, privacy concerns with, 88
- Discriminatory situations. See also Algorithmic bias
- college admissions, 70–71, 73, 75, 79, 83–84
- demographic correlates with, 69, 73
- fairness in, 64–68
- human resource screenings, 70, 74, 78
- Likert-type scale for, 68–69
- moral space representation of, 128, 129
- policing, 72, 77, 81
- salary increases, 71–72, 76, 80
- Displacement. See Labor displacement
- Domingos, Pedro, 17
- Drones, military use of, 14
- Durkheim, Emile, 160
- E
- Education levels
- effects on moral judgment, 9, 141–146, 142–143, 145
- of study participants, 167–169, 168
- Emotions, in moral reasoning, 19–22
- Enlightenment, harm basis of morality in, 18–19
- Equality in false acceptances, 66
- Equality of false rejections, 65–66
- Ethical Algorithm, The (Kearns and Roth), 90
- Ethical concerns, 10, 16–18. See also Algorithmic bias; Labor displacement; Privacy and surveillance concerns
- autonomous vehicles, 23–24, 50–55, 53–54
- lewd or disrespectful behavior, 40–49, 41n, 44–45, 48–49
- moral agency, 7, 16–18, 37
- moral status, 7
- national symbol desecration, 56–61, 58–59
- need for research into, 3–6
- normative versus positive approaches to, 5–6
- risk-taking behavior, 35–38, 38–39, 178–179, 178–179, 180–181, 180–181
- strong versus weak AI in, 15–16
- Ethnicity
- correlation with moral judgments, 141–146, 142–143, 145
- of study participants, 167–169, 168
- European Commission, 152
- Explicit features, categorization based on, 82–83
- Explosive Ordnance Disposal (EOD) robots, 17
- F
- Facial recognition systems
- algorithmic bias in, 67
- moral dimensions of, 91–96, 94–95
- Facial recognition technology, 14
- Fairness, 21. See also Algorithmic bias
- equality of false rejections, 65–66
- measurement of, 28–29, 28
- perceived levels of (see Moral scenarios)
- statistical parity, 65–66
- Fake news, 40
- False acceptances, equality in, 66
- False rejections, equality of, 65–66
- Federated Learning, 91
- First Amendment rights, 56
- “Flag-burning” amendment, 56
- Flag desecration scenarios
- moral dimensions of, 56–57, 58–59
- moral space representation of, 128, 129
- Foreign contractors, displacement attributed to, 113–118, 117
- Foreign subsidiaries, displacement attributed to, 113–118, 117
- Foreign temporary workers, displacement attributed to, 109–118, 112–113, 117
- Forest fire scenario
- moral dimensions of, 178–179, 178–179
- moral space representation of, 128, 129, 130
- Frankenstein (Shelley), 2, 162
- Functions, moral. See Moral functions
- G
- Gaby mine, 50
- GANs. See Generative Adversarial Networks (GANs)
- Gas tax scenario
- moral dimensions of, 197, 197
- moral space representation of, 128, 129
- Gender
- correlation with moral judgments, 9, 141–146, 142–143, 145
- of study participants, 167–169, 168
- Generative Adversarial Networks (GANs), 40
- Generative AIs, 40–41
- Gödel, Kurt Friedrich, 153
- Good prediction scenario. See Procurement management scenarios
- Google, 152
- Graetz, G., 108
- Graveyard excavation scenario
- moral dimensions of, 26–31, 29
- moral space representation of, 127, 128, 129
- Griggs v. Duke Power Company (1971), 84
- H
- Haidt, Jonathan, 125
- Harm, 21. See also Intentionality
- measurement of, 28–29, 28
- moral functions of, 133–141, 137, 138, 139, 140
- moral space representation of, 125–132, 126, 128, 129
- patterns across scenarios, 127–132
- perceived levels of (see Moral scenarios)
- Harm basis of morality, 18–19
- Harm-intention plane, 127, 128, 129, 130
- Harm-wrongness plane, 129, 131–132
- Harris, Seth, 119
- Heroic assumptions, 133
- Heuristics, biases as, 82
- Hikers scenario
- moral dimensions of, 191, 191
- moral space representation of, 128, 129
- Hiring decisions. See Human resource screenings scenarios
- HIS Group hotel chain, hacking of, 88
- Human behavior, judgment of. See Moral judgments
- Human learning, limitations of, 161–162
- Human resource screenings scenarios
- moral dimensions of, 70, 74, 78
- moral space representation of, 128, 129, 131
- Hume, David, 19n
- Hurricane scenario
- moral dimensions of, 180–181, 180–181
- moral space representation of, 128, 129, 130
- I
- I, Robot (Asimov), 150–152
- Implicit association tests, 19–20
- Implicit.harvard.edu website, 20
- Incongruent trials, 19–20
- Independent worker classification, 119
- Industrial Revolution, 107
- In-group favoritism. See Intergroup bias
- Intentionality, 10, 159–161
- demographic correlates with, 141–146, 142–143, 145
- human behavior judged by, 10, 23–26, 25, 124, 139, 146–147, 157, 159–162
- moral functions of, 133–141, 137, 138, 139, 140
- moral space representation of, 125–132, 126, 128, 129
- patterns across scenarios, 127–132
- perceived levels of (see Moral scenarios)
- wrongness and, 23–26, 25
- Interest rate scenario
- moral dimensions of, 198, 198
- moral space representation of, 128, 129, 131
- Intergroup bias, 124, 125n
- International Federation of Robotics (IFR), 108
- Intuition
- cultural differences in, 22
- implicit association tests of, 19–20
- K
- K-anonymity, 89–90
- Kearns, Michael, 90
- Kleinberg, Jon, 83
- Knowledge diffusion, labor mobility as channel of, 119
- Komatsu, 50
- Kronos, 106
- Krueger, Alan, 119
- L
- Labor displacement
- factors affecting, 109
- fear of, 107–108
- inequality and, 14
- interaction between technology and labor in, 107–109
- mitigation of consequences of, 118–121
- moral dimensions of, 9, 109–118, 112–113, 117, 156
- precarization of work and, 106–107
- Labor mobility, barriers to, 119
- Learning, human versus machine, 161–162
- Legal implications, 10, 156–159
- Lewd behavior
- creative marketing scenarios, 42–44, 44–45
- moral space representation of, 128, 129
- playwright scenario, 47, 49
- Liability for machine actions, 156–159
- Licensing, state, 119
- Life-or-death decisions, attitudes toward AI in
- forest fire scenario, 178–179, 178–179
- hurricane scenario, 180–181, 180–181
- moral space representation of, 128, 129
- tsunami scenario, 35–38, 38–39
- Likability
- demographic correlates with, 141–146, 142–143, 145
- perceived levels of (see Moral scenarios)
- Likert-type scales
- for algorithmic bias scenarios, 68–69
- demographic correlates with, 141–146, 142–143, 145
- for graveyard excavator scenario, 26–27
- for labor displacement scenarios, 111
- for privacy scenarios, 94
- Literai, 40
- Looms, Luddite opposition to, 4, 107
- Loyalty
- definition of, 21
- measurement of, 28–29, 28
- perceived levels of (see Moral scenarios)
- Luddites, opposition to steam-powered looms, 4, 107
- Ludwig, Jens, 83
- M
- Machine actions, human judgment of. See Moral judgments
- Machine learning, 161–162
- Malle, B. F., trolley problem of, 4–5, 4n
- blame attributions for, 174–176, 175
- replication of, 172–173
- wrongness attributions for, 173–174, 174
- Marketing scenarios
- moral dimensions of, 42–44, 44–45
- moral space representation of, 128, 129
- Marx, Karl, 160
- Mathematics, incompleteness of, 153
- McAfee, A., 119
- Mechanical Turk (MTurk), 5n, 28, 166
- Methodology. See Study methodology
- Michaels, G., 108
- Microsoft Tay bot, 3
- Military draft, moral dimensions of, 21–22
- MIT Committee on the Use of Humans as Experimental Subjects, 166
- Mobile phone traces, 89
- Moral agency, 7, 16–18, 37
- Moral dimensions
- components of, 20–22
- impact on moral judgment, 22–26, 25
- intentionality and, 23–26
- measurement of, 28–29
- perceived levels of (see Moral scenarios)
- Moral foundation theory, 125
- Moral functions. See also Moral space
- definition of, 124–125
- modeling of, 133–141, 134, 135, 137, 138, 139, 140
- Morality. See also Moral judgments; Moral scenarios
- cultural differences in, 22
- definition of, 18
- dimensions of, 20–29, 25
- emotions and automatic associations in, 19–22
- Enlightenment theory of, 18–19
- harm basis of, 18–19
- implicit association tests of, 19–20
- moral foundation theory of, 125
- Moral judgments, 18–19. See also Moral scenarios
- algorithmic aversion and, 5–6, 8, 37
- bimodal, 9, 146, 157
- demographic correlates of, 9, 141–146, 142–143, 145
- ethical and legal implications of, 10, 156–159
- intentionality in, 25
- intentions versus outcomes in, 10, 23–26, 25, 124, 139, 146–147, 157, 159–162 (see also Intentionality)
- mathematical representation of, 24–26, 25
- moral functions of, 124–125, 133–141, 134, 135, 137, 138, 139, 140
- moral space representation of, 125–132, 126, 128, 129
- normative approach to, 5
- positive approach to, 5–6
- rejection/acceptance of technology and, 6
- responsibility for machine actions in, 156–159
- statistical models of, 9–10
- trolley problem example, 4–5, 4n, 172–173
- unimodal, 9, 146, 157
- Moral rules, contradictions with social relationships, 150–156
- Moral scenarios
- ambush, 128, 129, 131–132, 186, 186
- amusement park, 128, 129, 193, 193
- autonomous vehicles, 23–24, 50–55, 53–54, 127, 128, 129, 130–131
- bank robbery, 128, 129, 185, 185
- blasphemous comedian, 46–47, 48, 128, 129
- brand names, 128, 129, 192, 192
- citizen scoring, 100–103, 101
- civil engineer, 128, 129, 201–202, 201, 202
- college admissions, 70–71, 73, 75, 79, 83–84, 128, 129, 131
- computer vision systems, 91–96, 94–95
- counseling, 128, 129, 190, 190
- creative marketing, 42–44, 44–45, 128, 129
- definition of, 26
- dilemmas versus, 26
- forest fire, 128, 129, 130, 178–179, 178–179
- gas tax, 128, 129, 197, 197
- graveyard excavation, 26–31, 28, 29, 30, 127, 128, 129
- hikers, 128, 129, 191, 191
- human resource screenings, 70, 74, 78, 128, 129, 131
- hurricane, 128, 129, 130, 180–181, 180–181
- interest rate, 128, 129, 131, 198, 198
- jewelry robbery, 128, 129, 184, 184
- labor displacement, 9, 109–118, 112–113, 117, 156
- lewd playwright, 47, 49, 128, 129
- mathematical representation of, 24–26, 25
- moral agency in, 7, 16–18, 37
- national symbol desecration, 56–61, 58–59, 127, 128, 129
- nursing assistant, 128, 129, 194, 194
- patterns across, 127–132
- peanut butter allergy, 23–26, 25
- personal assistant, 128, 129, 200, 200
- pharmacy robbery, 128, 129, 183, 183
- physical therapist, 128, 129, 195, 195
- plagiarizing songwriter, 46–47, 48, 128, 129
- policing, 72, 77, 81, 84–85, 128, 129
- procurement management, 128, 129, 187–189, 187, 188, 189
- recommender systems, 96–100, 98–99
- salary increases, 71–72, 76, 80, 128, 129
- shoplifting security, 128, 129, 203, 203
- supermarket robbery, 128, 129, 182, 182
- suspected terrorist, 128, 129, 131–132, 199, 199
- tsunami, 35–38, 38–39, 128, 129, 130–131
- Twitter, 128, 129, 196, 196
- Moral space
- definition of, 125–126
- example of, 126–127, 126
- harm-intention plane in, 127, 128, 129, 130
- harm-wrongness plane in, 129, 131–132
- wrongness-intention plane in, 129, 130–131
- Moral status, 7, 16–18
- Moral surface, 133–141
- Moral wrongness. See Wrongness, perceived levels of
- More, Thomas, 121
- Mullainathan, Sendhil, 83
- N
- National symbol scenarios
- moral dimensions of, 56–61, 58–59
- moral space representation of, 127, 128, 129
- Navarro, Jannette, 106
- Negligence, 158
- Normative approaches, 5–6
- Nursing assistant scenario, 194, 194
- O
- OECD. See Organisation for Economic Co-operation and Development (OECD)
- Offshoring, displacement attributed to, 113–118, 117
- Online dating systems, 96–100, 99
- Organisation for Economic Co-operation and Development (OECD), 152
- Outcomes, moral judgment based on, 10, 23–26, 25, 124, 139, 146–147, 157, 159–162
- Outsourcing, displacement attributed to, 113–118, 117
- P
- Parity, statistical, 65–66
- Participants
- attitudes toward AI, 169–170, 171
- demographic characteristics of, 167–169, 168
- recruitment of, 28, 166–167
- PATE, 91
- Peanut butter allergy scenario, 23–26, 25
- Personal assistant scenario
- moral dimensions of, 200, 200
- moral space representation of, 128, 129
- Pharmacy robbery scenario
- moral dimensions of, 183, 183
- moral space representation of, 128, 129
- Physical therapist scenario
- moral dimensions of, 195, 195
- moral space representation of, 128, 129
- Plagiarizing songwriter scenario
- moral dimensions of, 46–47, 48
- moral space representation of, 128, 129
- Policing scenarios
- moral dimensions of, 72, 77, 81
- moral space representation of, 128, 129
- outcome-based approach to, 84–85
- Political orientation of study participants, 168–169
- Positive approaches, 5–6
- Precarization of work, 106–107
- Prediction costs, 109
- Prediction Machines (Agrawal), 109
- Predictive purchasing systems, 96–100, 98
- Prejudice, 82
- Pretrial risk assessment tools, algorithmic bias in, 67–68
- Principal components, 82–83
- Printing, introduction of, 4, 107
- Privacy and surveillance concerns, 14, 156
- citizen scoring scenario, 100–103, 101
- computer vision systems scenarios, 91–96, 94–95
- Likert-type scale for, 94
- privacy-preserving algorithms, 91
- proposed solutions to, 89–91
- recommender system scenarios, 96–100, 98–99
- risks of AI to, 64–65
- Procurement management scenarios
- moral dimensions of, 187–189, 187, 188, 189
- moral space representation of, 128, 129
- Product liability, 158–159
- Promotion decisions. See Salary increase scenarios
- Purity
- definition of, 21
- measurement of, 28–29, 28
- perceived levels of (see Moral scenarios)
- P-values, 30–31
- ambush scenario, 186
- amusement park scenario, 193
- autonomous vehicle scenarios, 53–54
- bank robbery scenario, 185
- blasphemous comedian scenario, 48
- brand names scenario, 192
- citizen scoring scenario, 101
- civil engineer scenarios, 201, 202
- college admissions scenarios, 75, 79
- computer vision systems scenarios, 94–95
- counseling scenario, 190
- creative marketing, 44–45
- forest fire scenario, 178–179
- gas tax scenario, 197
- graveyard excavation scenario, 30
- hikers scenario, 191
- human resource screenings scenarios, 74, 78
- hurricane scenario, 180–181
- interest rate scenario, 198
- jewelry robbery scenario, 184
- labor displacement scenarios, 112–113, 117
- lewd playwright scenario, 49
- national symbol desecration, 58–59
- nursing assistant scenario, 194
- personal assistant scenario, 200
- pharmacy robbery scenario, 183
- physical therapist scenario, 195
- plagiarizing songwriter scenario, 48
- policing scenarios, 77, 81
- procurement management scenarios, 187, 188, 189
- recommender systems scenarios, 98–99
- salary increase scenarios, 76, 80
- shoplifting security scenario, 203
- supermarket robbery scenario, 182
- suspected terrorist scenario, 199
- tsunami scenario, 38–39
- Twitter scenario, 196
- R
- Rambachan, Ashesh, 83
- Randomized response, 90–91
- RAPPOR, 91
- Rationality, morality and, 18–19
- Raw material shortage/excess. See Procurement management scenarios
- Real-world studies, 7
- Recidivism, 67
- Recklessness, 158
- Recommender systems scenarios, 96–100, 98–99
- Regulation, labor, 118–119
- Reidentification risks, 89
- Religious orientation
- effects on moral judgment, 141–146, 142–143, 145
- of study participants, 167–169, 168
- Replace different/same assessments
- demographic correlates with, 141–146, 142–143, 145
- of scenarios (see Moral scenarios)
- Research methodology. See Study methodology
- Responsibility
- liability and, 156–159
- perceived levels of (see Moral scenarios)
- trolley problem example, 174–176, 175
- Rio Tinto, 50
- Risk-taking, attitudes toward AI in
- forest fire scenario, 178–179, 178–179
- hurricane scenario, 180–181, 180–181
- moral space representation of, 130–131
- tsunami scenario, 35–38, 38–39
- Robbery scenarios
- moral dimensions of, 182–185, 182, 183, 184
- moral space representation of, 128, 129
- Robots
- camera bots, 88
- chatbots, 3
- Explosive Ordnance Disposal, 17
- health-care, 14
- moral agency of, 7, 16–18, 37
- moral status of, 7, 16–18
- tax on use of, 119
- Roth, Aaron, 90
- S
- Salary increase scenarios
- moral dimensions of, 71–72, 76, 80
- moral space representation of, 128, 129
- Scenarios. See Moral scenarios
- Selection bias, 167
- Self-driving cars. See Autonomous vehicle scenarios
- Sex robots, 17
- Shelley, Mary, 2, 162
- Shelley AI, 40–41
- Shoplifting security scenario
- moral dimensions of, 203, 203
- moral space representation of, 128, 129
- Similar situation scale
- demographic correlates with, 141–146, 142–143, 145
- perceived levels of (see Moral scenarios)
- Simulated studies, 7
- Siri, privacy concerns with, 88
- Social networks, contradictions arising from, 150–156
- Split Learning, 91
- Starbucks, 106
- State licensing, 119
- Statistical models, 9–10
- Statistical parity, 65–66
- Status, moral, 7, 16–18
- Stereotyping, 82
- Strong artificial intelligence, 15–16
- Study methodology, 3–4, 8, 26–31. See also Trolley problem
- Likert-type scales, 26–27, 68–69, 94, 111
- participant attitudes, 169–170, 171
- participant demographics, 167–169, 168
- participant recruitment, 28, 166–167
- scenario structure, 26
- study design, 166
- word association exercise, 28–29, 28
- Supermarket robbery scenario
- moral dimensions of, 182, 182
- moral space representation of, 128, 129
- Surveillance. See Privacy and surveillance concerns
- Suspected terrorist scenario
- moral dimensions of, 199, 199
- moral space representation of, 128, 129, 131–132
- Sweeney, Latanya, 89
- T
- Tax, robot, 119
- Tay chatbot (Microsoft), 3
- Technological displacement. See Labor displacement
- Temporary foreign workers, displacement attributed to, 109–118, 112–113, 117
- Texas v. Johnson (1989), 56
- “Three laws of robotics” (Asimov), 152
- Travel company scenario, 96–100, 99
- Trolley problem
- blame attributions for, 4–5, 174–176, 175
- description of, 4n, 172–173
- moral status in, 17
- wrongness attributions for, 173–174, 174
- Tsunami scenario
- moral dimensions of, 35–38, 38–39
- moral space representation of, 128, 129, 130–131
- Tweeting bots, 40
- Twitter scenario
- moral dimensions of, 196, 196
- moral space representation of, 128, 129
- W
- Warmth, person perception and, 82
- Weak artificial intelligence (AI), 15–16
- Weber, Max, 159, 160
- Weld, William, 89
- Word association exercise, 28–29, 28
- Word embedding, 67
- Wrongness, perceived levels of. See also Intentionality; Moral scenarios
- blame attribution, 23, 174–176, 175
- demographic correlates with, 141–146, 142–143, 145
- moral functions of, 133–141, 137, 138, 139, 140
- moral space representation of, 125–132, 126, 128, 129
- patterns across scenarios, 127–132
- in trolley problem, 173–174, 174
- Wrongness-intention plane, 129, 130–131
- Y
- Yang, Andrew, 121
- Younger workers, displacement attributed to, 113–118, 117