Appendix: The Research behind Behind Their Screens

The insights and arguments in this book emerged primarily from a large, mixed-methods study of tweens’ and teens’ perspectives on digital life that we carried out between 2017 and 2021. We launched this study, the Digital Dilemmas Project, with the aim of understanding young people’s perceptions of the upsides and challenges of the digital landscape at that time. We also wanted to understand the ways adults were supporting them, and implications for new or revised supports.

Our broad questions were: How are young people navigating thorny digital issues and dilemmas that surface in networked life? How are adults supporting them? What more could adults do? Our inquiry included digital issues and dilemmas that spanned four spheres: personal well-being, close relationships and intimacy, peers and community, and the broader civic or public sphere.

When we set out to do this research, we envisioned our findings being mainly directed toward new or improved educational materials. Our home base is at Project Zero, a research center at the Harvard Graduate School of Education. Our longstanding partnership with Common Sense Media was key. Carrie first collaborated with Common Sense Media in 2006 to support development of their research-based Digital Literacy and Citizenship curriculum. Over fifteen years later, Common Sense Media is a leader in providing digital educational resources to schools in the United States and, increasingly, around the world. In 2017, we partnered with Common Sense Media once again—this time, to bring fresh insights from empirical research to support updates to their curriculum.

As we collected our data, we saw immediate implications for the content of curricular lessons and for pedagogical approaches, and we quickly leveraged these insights to develop and pilot test new materials for schools. But we also came to see that our data had relevance beyond schools: the more we shared selected insights from our surveys, the more we realized that what we were learning mattered to a broader audience, including parents, mental health professionals, youth development organizations, and technology companies. This book is one outcome of a quest to bring this research to a larger public stage.

Throughout the book, we’ve referred to our surveys, interviews, classroom observations, and youth advisory council sessions. Further details follow about these studies and our research participants. Because we have worked on these topics for over a decade, we’ve also had a steady stream of opportunities to have less formal discussions with teens, parents, and professionals who work with youth. These discussions often surface stories that corroborate themes in our more formal studies. They also showcase tensions through real-world dilemmas (Sarah’s story, which we shared in chapter 7, is an example).

Our research with youth is the focus of this book. We first briefly describe the educator study because it was the direct precursor to the focal youth study.

Phase 1: Educator Study

Educator Survey (Phase 1a)

In the fall of 2017, we conducted online surveys of educators, recruiting participants via Common Sense Media’s email listserv of educators interested in digital learning and citizenship. We received responses from more than 1,200 educators, although response numbers varied by question. Of those who reported race/ethnicity, gender, and age, respondents predominantly identified as White (80 percent), female (86 percent), and between the ages of 26 and 55 (79 percent). Open-ended descriptions of their school contexts and learner populations indicated that teachers worked in a wide range of settings and with learners of diverse backgrounds, abilities, and identities. Of respondents who shared geographic information, the majority (n = 715) reported living and teaching in the United States (across different regions), while eighty-one respondents were based in other countries, including Australia, Canada, Jordan, Kuwait, Malaysia, Mexico, Peru, Senegal, and Vietnam.

Our survey sought to understand educators’ perspectives on the most salient digital topics to teach students, those they had discussed with students over the past several years, their teaching approaches and takeaway messages in relation to challenging digital topics, and memorable digital dilemmas faced in their schools.

Educators’ survey responses gave us important insight into adults’ views on the current digital landscape. We had descriptive data on the digital topics these educators most often taught (#1 digital footprints, #2 how posts might make others feel, #3 talking with strangers), as well as the topics of greatest concern to educators (#1 digital drama and cyberbullying, #2 screen time, #3 digital footprints, tied with pressures to stay connected) and their stances on digital dilemmas. We conducted systematic coding and analysis of educators’ strategies and takeaway messages for teaching challenging digital topics. These data pointed to a tendency toward protectionist messages about digital footprints, privacy risks, and sexting that can amplify young people’s anxieties about growing up digital.

We also asked several questions that we included in our subsequent surveys with students, including a version of our “worries” question (though in the educator survey, we used the term “concerns”; i.e., What concerns you most about today’s digital world?). The response options were the same as those offered to students, as was the follow-up prompt: Why is [response] your biggest concern? (See “Survey Details” under the “Phase 2: Youth Surveys” section, following, for the full list.)

Motivated by our particular interest in dilemmas that surface in networked life, our surveys of both educators and students included a series of related questions. Respondents rated a series of short-form dilemma statements (example: “It’s fair for college admissions to consider applicants’ social media posts”; again, see “Survey Details” for the full list) on a seven-point Likert scale from “Strongly Agree” to “Strongly Disagree.” We also included three dilemma vignettes on timely topics (doxing, or public shaming on social media; digital footprints and college admissions; and parental responses to sexting) with “Agree” or “Disagree” responses and opportunities to elaborate.

Educator Interviews (Phase 1b)

We went on to recruit and interview a purposive sample of twenty-five educators from those who responded to our educator survey. These educators were invited for interviews based on survey responses that described compelling, novel approaches to teaching students about digital life. The interviews focused on the details of their pedagogical approaches and highlighted a number of effective ways of moving beyond simplistic messages and leaning into complexity. We describe key insights from these interviews in our previously published report, Teaching Digital Citizens in Today’s World: Research and Insights Behind the Common Sense K–12 Digital Citizenship Curriculum.

Design-Based Field Research (Phase 1c)

Drawing on insights from the educator surveys and interviews and on emerging findings from our youth surveys (Phase 2, described below), we developed pilot classroom materials in collaboration with Common Sense Media. Our design-based research process involved field testing dilemma-based activities with educators at sixteen schools across eight states (California, Texas, Arizona, Virginia, Pennsylvania, Maine, Massachusetts, and New York) and Washington, DC. The resources we co-created are available for free as part of Common Sense Media’s Digital Citizenship curriculum.

Phase 2: Youth Surveys

Across a ten-month period between June 2018 and March 2019, we conducted online surveys of students at fifteen middle and high schools across the United States (three of these schools subsequently participated in the aforementioned pilot study of new classroom approaches to teaching digital citizenship). These schools include traditional public schools (n = 7), public charter schools (n = 7), and one private school, and they were dispersed across ten U.S. states in the Mid-Atlantic, Northeast, Southeast, Southwest, Midwest, and West.

At each school site, we worked directly with educators who reviewed study information with school leadership and secured requisite school permissions in addition to approval from our university’s institutional review board (IRB). The study used an opt-out parental consent process outlined in a letter sent home by educators to parents/guardians. During class time allotted to the survey (approximately fifteen minutes), assenting students completed the Qualtrics-based survey anonymously via school-provided devices. Students could skip any questions or discontinue the survey at any time. Per our IRB protocol, we do not have information on students who chose not to participate or whose parents opted them out.

Participants

Youth optionally reported gender identity, age, grade, and race and ethnicity. Of those who reported gender, approximately 48 percent identified as female, 44 percent as male, and 2 percent as nonbinary; 6 percent preferred not to specify. (A notable limitation of the question format: because the response options invited self-identification as male, female, or nonbinary but did not distinguish transgender youth who identify within the gender binary, we do not know the percentage of gender minority youth in our sample.)

Participants’ reported ages ranged from nine to nineteen, though most were between twelve and eighteen. Youth could “select all that apply” for race and ethnicity; of those who responded, 52 percent identified as White, 19 percent as Hispanic/Latinx, 17 percent as Black or African American, 8 percent as Asian or Asian American, 5 percent as Native American/American Indian/Alaska Native, 2 percent as Middle Eastern or North African, 1 percent as Pacific Islander/Native Hawaiian, and 9 percent as Other.

When we refer to differences by gender and by grade level (i.e., middle versus high school), we are limited by the information that respondents provided.

Student surveys: Self-reported demographic information

| Gender (select one) | Count | % |
| --- | --- | --- |
| Male | 1425 | 44% |
| Female | 1560 | 48% |
| Nonbinary | 71 | 2% |
| Prefer not to answer | 181 | 6% |
| Question total | 3237 | 100% |

| Age (in years) | |
| --- | --- |
| Mean (Std. Dev.) | 13.4 (2.0) |
| Mode | 13 |
| Range | 9–19 |

| Race, Ethnicity (select all that apply) | Count | % |
| --- | --- | --- |
| White | 1678 | 52% |
| Hispanic/Latinx | 614 | 19% |
| Black or African American | 556 | 17% |
| Asian or Asian American | 258 | 8% |
| Native American/American Indian/Alaska Native | 153 | 5% |
| Pacific Islander/Native Hawaiian | 44 | 1% |
| Middle Eastern or North African | 77 | 2% |
| Other (please specify) | 297 | 9% |
| Prefer not to answer | 213 | 7% |
| Question total | 3197 | |

Survey Details

The surveys sought to reveal young people’s perspectives on different facets of digital life. Questions covered the social media sites/apps students use across a typical week and their general perspectives about their digital lives—including the upsides (example: What are some of the best parts about growing up with technology like cellphones and social media?) and the challenges (What are some of the most tricky or challenging parts about growing up with technology like cellphones and social media?).

Across the book, we feature youth perspectives shared in response to an especially revealing question: What worries you most about today’s digital world? Participants were offered ten options (in randomized order) that were (1) selected based on a literature review of salient digital topics and on prior fieldwork, and (2) pretested with youth. These options were: Being asked for inappropriate pictures, Comparing to others on social media, Connecting with strangers, Digital drama and cyberbullying, Digital footprints or online posts lasting forever, Pressure to always stay connected, Risks to private information, Seeing inappropriate content, Too much screen time, and Other (please specify). All participants were prompted for an open-ended explanation: Why is [response] your biggest worry? At the time, this list of options felt like a comprehensive representation of relevant topics. If we were conducting the study in 2022, we would certainly include additional topics that we know are salient to youth today (e.g., posting about civic issues on social media, echo chambers and filter bubbles).

This was just one of the shortcomings of our survey approach. We describe in the introduction other relevant limitations of the worries question:

Giving people set options to choose from can narrow the realm of what they consider. Permitting selection of just one worry (“what worries you most . . .”) means that responses reflect only each respondent’s single top concern, so tallies for a given topic capture only those most concerned about it. These are important qualifications. They mean, for example, that it wouldn’t be appropriate to make claims like “X percent of teens are worried about this topic,” nor would it be right to say that a certain percentage of teens are not worried about a topic just because it wasn’t their top concern.

Our subsequent qualitative data collection, particularly the teen advisory council focus groups and interviews (see “Phase 3: Teen Advisory Council”), gave us insight into teens’ worries (including, and well beyond, those on our initial list) and into other salient topics discussed across the book.

As noted in the “Educator Survey (Phase 1a)” section, our youth surveys also included a series of questions to capture participants’ attitudes about quandaries or dilemmas that surface routinely in networked life. Participants responded to a series of normative statements on a seven-point Likert scale from “Strongly Agree” (7) to “Strongly Disagree” (1), with “Undecided” as a response option between “Somewhat Agree” and “Somewhat Disagree” (a minimal sketch of how such responses can be scored numerically follows the list). The statements included:

  • It’s reasonable for people to face consequences later in life for social media posts they shared when they were in middle or high school.
  • It’s fair for college admissions to consider applicants’ social media posts.
  • If someone texts you, you should respond as quickly as possible.
  • Parents should monitor their teens’ text messages.
  • Parents should monitor their teens’ social media accounts.
  • It’s okay for people to share violent videos online to call attention to what’s going on in the world.
  • If someone sends a naked picture to someone else, it’s their own fault if the picture ends up getting shared with other people.
  • If someone makes an offensive comment on social media, people have the right to call them out—even if it hurts their reputation.
  • Being a good friend means being available whenever your friend needs you.
  • It’s okay to take a break from social media for a few days, even though you’ll miss some of your friends’ posts.
  • If a friend asks for honest opinions on an anonymous app, you should respond honestly even if it might hurt their feelings.
  • What people do or say in public is fair for others to record and post on social media.
  • If you suspect someone of doing something behind your back, it’s justified to look through that person’s private messages.
  • Schools should monitor what students are doing on social media.
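For readers curious how such responses translate into numbers, here is a minimal scoring sketch in Python; the two intermediate labels (“Agree” and “Disagree”) and the pandas-based tallying are our assumptions for illustration, not the project’s actual analysis code.

```python
import pandas as pd

# Seven-point scale from "Strongly Disagree" (1) to "Strongly Agree" (7);
# "Undecided" (4) sits between "Somewhat Disagree" and "Somewhat Agree".
# The plain "Disagree" and "Agree" labels are assumed for the unnamed points.
LIKERT_SCALE = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Somewhat Disagree": 3,
    "Undecided": 4,
    "Somewhat Agree": 5,
    "Agree": 6,
    "Strongly Agree": 7,
}

# Hypothetical responses to one dilemma statement; None marks a skipped item.
responses = pd.Series(
    ["Agree", "Undecided", "Strongly Disagree", None, "Somewhat Agree"]
)

scores = responses.map(LIKERT_SCALE)        # skipped items become NaN
print(scores.mean())                        # mean over answered items only
print(scores.value_counts(normalize=True))  # proportions among respondents
```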

Participants were also presented with three longer form digital dilemma vignettes on the following topics: the appropriateness of doxing individuals who participate in a hateful protest; the fairness of a college’s decision to revoke students’ offers of admission over offensive online speech; and the appropriateness of a parent’s actions upon discovering evidence of teen sexting. Respondents were offered “Agree” and “Disagree” response options and an opportunity to explain their choice in an open-ended format.

These surveys yielded responses from more than 3,600 youths. Because some students left questions blank or did not complete the survey in its entirety, the total number of respondents varies across questions (for example: 3,630 youths responded to the initial worries question and 3,529 provided elaboration; 3,697 provided open-ended responses about the best parts of growing up with today’s technologies; Likert responses to the normative statements ranged from 2,909 to 3,453). In instances where we report percentages in relation to certain findings, they are calculated based on the number of responses provided for a given question.
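To make that denominator convention concrete, here is a minimal sketch (in Python with pandas; the data and column name are entirely hypothetical) of computing percentages over only the responses provided for a given question:

```python
import pandas as pd

# Hypothetical extract: one row per respondent; None marks a skipped
# question or an incomplete survey.
df = pd.DataFrame({
    "top_worry": [
        "Too much screen time", None,
        "Risks to private information", "Too much screen time",
    ],
})

# The denominator is the number of answers to this particular question,
# not the total number of survey takers.
answered = df["top_worry"].dropna()
percentages = answered.value_counts(normalize=True).mul(100).round(1)
print(f"n = {len(answered)}")
print(percentages)
```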

Analysis

We generated descriptive statistics of students’ responses to questions about social media app usage, worries about today’s technologies, responses to short-form dilemma statements, agree/disagree responses to longer form dilemma vignettes, and reported demographic characteristics.

We conducted in-depth coding and qualitative analysis of students’ responses to selected questions. For example, for the “What worries you most about today’s digital world? Why is [response] your biggest worry?” questions, our research team analyzed the open-ended responses (short, textual data) through a multistep process, identifying themes through inductive and then deductive approaches (including open, axial, and selective coding drawing on Strauss and Corbin1). Key to our process was the inclusion of a teen researcher as a full team member; when discrepancies and edge cases arose in our coding, we deferred to her interpretations. Our coding and analysis yielded an array of insights about teens’ worries about screen time, relational pressures to stay connected, digital drama, sexting, privacy, and digital footprint risks. Quotes illustrating key themes are woven throughout the thematic chapters. In our concluding chapter, we articulate a broader throughline finding, the struggle for digital agency, which emerged as a clear cross-cutting theme across youths’ specific concerns. After coding, we completed an additional analytic step that proved key to our process: vetting our interpretations with our teen advisory council, described next.
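Before turning to that step, here is a purely illustrative picture of the tallying that follows human coding; the theme labels below are hypothetical stand-ins, and the coding itself was done by researchers, not software:

```python
from collections import Counter

# Hypothetical output of human coding: each open-ended response has been
# assigned one or more themes from the team's codebook.
coded_responses = [
    {"screen time", "digital agency"},
    {"pressure to stay connected"},
    {"digital drama", "digital agency"},
    {"privacy", "digital footprints"},
]

# Tally theme frequencies; themes can co-occur within a response, so
# totals can exceed the number of responses.
theme_counts = Counter(theme for codes in coded_responses for theme in codes)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```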

Phase 3: Teen Advisory Council

From November 2020 to April 2021, we convened a council of twenty-two U.S.-based teens as advisors for our research on teens and digital life. We recruited teen participants through word of mouth, tapping our networks and colleagues at schools and youth-serving organizations to establish a diverse group. Our advisory council sessions unfolded during the COVID-19 pandemic, which meant they also provided an opportunity to learn how lockdowns, remote schooling, and social distancing affected teens’ digital well-being; these sessions are the source of the COVID-19-related examples shared in earlier chapters. When we began our recruitment process, we wondered how pandemic circumstances might affect participation. We kept our recruitment survey open for approximately two weeks, received responses from eighty-six interested participants, invited twenty-eight, and ultimately had participation from twenty-two teens.

Our youth advisors were teens between the ages of thirteen and eighteen. The group was roughly two-thirds female; diverse with respect to race, ethnicity, and sexual orientation; living in varied family structures and communities in different regions of the United States; and attending different kinds of schools.

We convened our youth advisors over Zoom, meeting in small groups of approximately three to five members to facilitate rapport and deep discussion. An added benefit of meeting in small groups was that we held multiple discussions on particular topics and then compared similarities and differences across these groups. In total, we held twenty-seven small group discussions with our teen advisory council and conducted twenty-eight individual interviews with council members. Crucially, we (Carrie and Emily) jointly led all youth council sessions with our teen research managers Sol Lange and Chloe Brenner, who codesigned the facilitation guides and asked valuable follow-up questions. Their closer proximity in age to adolescents and their personal experiences prompted them to raise topics such as performative commenting (which we describe in chapter 3) and particular facets of social comparison experiences on social media (like those we describe in chapter 2) that weren’t on our adult radar.

Each session was designed around one or two focal topics, but in a semi-structured fashion to allow discussions to follow themes most salient to participants. A key aim of the teen advisory council was to invite teens to be co-interpreters of our survey data. Accordingly, sessions often included sharing thematic quotations from our survey, and our working interpretations of what our coding seemed to reveal about teens’ experiences and struggles with digital habits, footprints, civic issues, and more. We worked with our teen advisors to consider alternative interpretations of data, and to explore questions about what might be missing and how experiences with a particular issue vary for different teens.

We audio- and video-recorded all sessions, created verbatim transcriptions, and examined the transcripts for (a) key illustrative cases related to topical themes from our survey data analysis and (b) counter-cases that suggested alternatives or different perspectives. In several cases, we brought key data points from our survey and working arguments about those data to our advisors for their reactions and pushback.

For example, in chapter 5, we present a list of “9 Reasons Why” teens send nudes. We created an initial version of this list based on the insights from our survey data and prior research, and then workshopped the list with our teen advisors. They critiqued, annotated, and revised the “reasons list” through open discussion and Zoom features that provided different feedback and editing modalities (polls, whiteboard, and chat). The final “9 Reasons Why” version we share in chapter 5 reflects their additions, tweaks, and deletions. Among other things, they suggested adding a direct acknowledgment of the idea that sexting could be “pleasurable, exciting, and fun.” They also advocated removing the reason “it seems like no big deal because ‘everybody does it.’” In this instance, teens clarified that knowing their peers sext is not itself a primary reason to send a nude; rather, peer norms “ease” or normalize sexting decisions made principally for other reasons (e.g., to impress a crush). We followed their guidance and removed the item from the list. Similarly, the opening vignette we share at the beginning of chapter 5 (about a boy showing his friends a nude allegedly from a classmate) is an adapted version of a story shared with us a few years prior. We had our advisory council discuss the vignette in detail so we could understand whether and how it remained relevant, and whether teens felt there were additional aspects of such situations that we should surface for readers. Their reactions helped us understand how sharing of others’ nudes without consent plays out in different ways for older versus younger teens.

For each topical chapter, we used similar approaches: vetting our interpretations and arguments with teen advisors and reworking the chapter text to attend to counter-cases or variations.

Relevant Prior Research

Between roughly 2008 and 2018, our research team also conducted 378 interviews with young people in the course of various studies we led about social media and digital life. In numerous places in the book, we draw on insights from these interviews and use corresponding footnotes to signal connections to the relevant studies. Details on the methodological approaches relevant to those interviews and their analysis appear in published articles, including: