Chapter 6
What Personalized Learning Looks Like at the Systems Level

Student voice, like teacher voice, should be integral—not an afterthought, or even a means of validation. There is a tangible attitude in education that children are to do what the adults say, since the adults know best. They can complain and protest, but their voices should be dismissed because, well, it’s petulant and uneducated. This philosophy is synonymous to a physician ignoring his patient’s report of pain or symptoms because he clearly knows more about the human body. Just as a physician must, our educational legislators should know more than children about the specifics of engineering an education system. But this is no reason to dismiss students’ input and ideas.

—Ethan Young, high school senior in Tennessee

We now address the ripple effect that many readers have anticipated from the first few pages of the book: how we organize school needs to become personalized as well. A personalized learning system transforms schooling by providing voice and choice on what, where, and how students learn in relation to disciplinary and cross-disciplinary outcomes aligned with standards. In Why School? Will Richardson asserts, “In this new story, real learning happens anytime, anywhere with anyone we like—not just with a teacher and some same-age peers, in a classroom, from September to June. More important, it happens around the things learners choose to learn, not what someone else tells us to learn.”

Therefore, in this personalized learning system, voice and choice extend beyond individual classrooms to the structures that organize schooling itself.

In this chapter, we focus on elements that are generally not within the locus of teacher control but that are absolutely vital to making personalized learning a reality: assessment of learning, time, and advancement. All three of these elements are connected to local policy as well as to state, ministry, and/or national requirements.

As we explore each element, consider the “Yes, buts,” both those that come up for you as a reader and those that you can imagine other members of the school community might say.

Element 10: Demonstration of Learning

A personalized learning system captures student learning through the use of multifaceted assessments (see Table 6.1). This robust and rigorous system creates the opportunity for students to develop pride in their work, agonize over details, and work through challenges. Ron Berger eloquently described this difference in student pride and performance when students have a voice and choice built into assessment of learning: “Rather than seeing school as something done to them, students are given the responsibility to carry out original academic projects, save work in portfolios, display their work, and reflect publicly on their work and their learning” (53).

Table 6.1 Personalized Learning Evolution: Demonstration of Learning

Demonstration of Learning: What constitutes evidence of learning?
  • Minimal Student Input: Teacher and district assessments specify the way(s) in which disciplinary and cross-disciplinary outcomes will be demonstrated.
  • Some Student Input: Student chooses among a set of options to determine how disciplinary and cross-disciplinary outcomes will be demonstrated.
  • Student Driven: Student proposes or shapes way(s) that both disciplinary and cross-disciplinary outcomes will be demonstrated and will provide evidence of learning (e.g., personalized portfolio).

Demonstrations of learning value the solution, interpretation, creation, or conclusion and the explanation or justification that led to that result. In a 1989 Phi Delta Kappan article, Grant Wiggins poses a powerful question that is as timely now as it was then: What is a true test?

We have lost sight of the fact that a true test of intellectual ability requires the performance of exemplary tasks that replicate the challenges and standards of performance that typically face writers, businesspeople, scientists, community leaders, designers, and historians. A genuine test of intellectual achievement doesn’t merely check standardized work in a mechanical way. It reveals achievement on the essentials, even if they are not easily quantified.

In a more recent blog post, “Authenticity in Assessment,” Wiggins argues that this is a curriculum design challenge into which local educators can make significant inroads if they focus on four components: structure and logistics, design features, grading and scoring, and fairness (see table 6.2).

Table 6.2 Authentic Assessment Tasks

Component Authentic Tasks
Structure and Logistics
  • Are more appropriately public; involve an audience, panel, etc.
  • Do not rely on unrealistic and arbitrary time constraints
  • Offer known, not secret, questions or tasks
  • Are not one-shot—more like portfolios or a season of games
  • Involve some collaboration with others
  • Recur—and are worth retaking
  • Make feedback to students so central that school structures and policies are modified to support it
Design Features
  • Are “essential”—not contrived or arbitrary just to shake out a grade
  • Are enabling, pointing the student toward more sophisticated and important use of skills and knowledge
  • Are contextualized and complex, not atomized into isolated objectives
  • Involve the students’ own research
  • Assess student habits and repertoires, not mere recall or plug-in
  • Are representative challenges of a field or subject
  • Are engaging and educational
  • Involve somewhat ambiguous (ill-structured) tasks or problems
Grading and Scoring
  • Involve criteria that assess essentials, not merely what is easily scored
  • Are not graded on a curve, but in reference to legitimate performance standards or benchmarks
  • Involve transparent, demystified expectations
  • Make self-assessment part of the assessment
  • Use a multifaceted analytic trait scoring system instead of one holistic or aggregate grade
  • Reflect coherent and stable school standards
Fairness
  • Identify perhaps hidden strengths (not just reveal deficits)
  • Strike a balance between honoring achievement and remaining mindful of fortunate prior experience or training (which can make the assessment invalid)
  • Minimize needless, unfair, and demoralizing comparisons of students to one another
  • Allow appropriate room for student styles and interests (some element of choice)
  • Can be attempted by all students via available scaffolding or prompting as needed (with such prompting reflected in the ultimate scoring)
  • Have perceived value to the students being assessed

Reimagining what a true test looks like opens the door for students to take a lead role in the learning experience. Students can use clear criteria (aligned with standards and disciplinary and cross-disciplinary outcomes) to investigate, create, and analyze problems of interest that are at the heart of various disciplines.

To illustrate this point, in the summer of 2014, curricular designers and administrators from Learn4Life Charter Schools in California worked with Allison to articulate artifacts that provide evidence of student learning outside a given content area but are nonetheless accomplishments worthy of documentation (see table 6.3). Working from the Personalized Learning Evolution chart, they categorized student artifacts as requiring either minimal student input (an authentic task with clear parameters) or some student input/student driven (an authentic task that requires significant development from the student to establish parameters and define what success looks like).

Table 6.3 Artifacts of Student Learning

Minimal Student Input
  • Record of achievement (GPA and credit completion)
  • Letters of recommendation and supervisory reports based on internships, workplace experiences, and service learning projects
  • Updated resume (for example, awards, scholarships, work experience, participation in team sports or cocurricular clubs)
  • Personal narrative (entering Learn4Life and exiting)
  • Significant research tasks that generate findings, patterns, predictions/generalizations for a given course, competition, or local/global project that is already under way
  • Evidence of identifying a problem and creating a viable solution for a given course, competition, or local/global project that is already under way
  • Evidence of dynamic communication for a given purpose (for example, email threads, social media, discussion boards, blog posts, online course conversations) for a given course or local/global project that is already under way
  • Formal presentations for a given purpose and audience (for example, Key Club, YouTube, Skype, church, Toastmasters, student council, student-led conference, graduation)
  • Development of technical skills as demonstrated by certification exams (for example, Microsoft, CISCO, food handling)
  • Service learning project (demonstration of cross-disciplinary outcomes and ideas for next steps or next projects)

Some Student Input/Student Driven
  • Demonstration of acquired creative and/or technical skills based on independent tasks (for example, coding, graphic design, game design, original writings, video creation, art pieces)
  • Tangible timelines (calendars, project management, and so on) associated with projects (academic and extracurricular) that focus on quality outcome while still managing the realities of life
  • Goal tracker for a significant project that shows goal setting, action taken, resetting, recalibrating
  • Trajectory (short-term/long-term goals and accomplishments based on course selection, internships, postsecondary plan, and so on)
  • Significant research tasks that generate findings, patterns, predictions/generalizations
  • Evidence of identifying a problem and creating a viable solution
  • Evidence of dynamic communication for a given purpose (for example, email threads, social media, discussion boards, blog posts, online course conversations)
  • Accomplishments/recognition in cocurricular endeavors (for example, enterprise program, community service, entrepreneurship, organized groups)

Developed by staff at Learn4Life Charter Schools

We advocate for the inclusion of three major assessment types to measure and motivate student performance as well as hold schools accountable for the growth of every child: (1) summative assessments; (2) digital portfolios, gateways, or exhibitions; and (3) large-scale assessments.

In a personalized learning system, summative assessments are primarily rich performance-based tasks for which students have to apply their learning to novel situations to demonstrate the strategy, skill, and perseverance needed to be successful. Summative assessments provide information to the student, her family, and staff about the student’s mastery of given competencies. Generally these assessments occur toward the end of a unit, semester, or year. Grading is not something “done” to the student but rather with the student, where she has clarity on how the work will be judged and uses that clarity to continue to inform the development of the task. The teacher provides feedback to the student to inform revision of a given task, validate competency levels, and clarify readiness for taking a state or national assessment. For example, Albany Senior High School in New Zealand is featured on the Ministry of Education’s website for the design of its Impact Project, which is organized around four guiding principles.

For more details on the project structure, rubric, and rationale, visit http://elearning.tki.org.nz/Teaching/Pedagogy/Personalised-learning#stories and http://wikieducator.org/Albany_Senior_High_School/Impact_Projects. In addition, table AC.2 in appendix C lists the reflective questions centered on each of the four principles.

Digital portfolios, gateways, and exhibitions are formal collections and presentations of what students have learned throughout one or multiple years of schooling. In a personalized learning system, the student owns the work: a demonstration of mastery of competencies through a student-created, student-led conversation about content and skill development over time. Students can showcase a variety of authentic tasks to demonstrate mastery; audiences for these presentations typically consist of parents, staff, and practicing experts.

For example, in his book Personalized Learning: Student-Designed Pathways to High School Graduation, John Clarke describes the power of students creating and organizing learning for a public audience. He highlights how preparing a range of artifacts (for example, films, photos, essays, diagrams) causes students to reflect on the meaning of what they have done over time and the accomplishment and growth they have experienced. Then, in an hourlong presentation to peers, advisors, mentors, and parents, students use artifacts and highlights from their e-portfolio to discuss personal and academic growth. To prepare for this formal presentation, each student leads a fifteen-minute conference to summarize current performance on a range of competencies and delineate next steps.

Another powerful example of authentic assessment is the creation of digital portfolios at High Tech High. School leaders provide a rationale for why a digital portfolio is necessary: to capture learning that is much richer than a “collection of letter grades.” Staff at High Tech High expect that “each student will create and maintain a portfolio of work to communicate who they are, what they have learned and what they have accomplished. They will be able to demonstrate their true skills and abilities.” The digital portfolio includes a cover page, a personal statement, work samples, a resume, and contact information. For more information and to see several examples of e-portfolios, visit http://www.whatkidscando.org/archives/portfoliosmallschools/HTH/portfolios.html.

Large-scale assessments provide information to students, their families, school and district staff, and the state/ministry/national government about student performance and schoolwide challenges. Typically these assessments are administered for every student in that grade or course level at a predetermined time of the year. Ideally, in a personalized learning system, a student takes a large-scale assessment when he is ready, not on a time schedule. These criterion-referenced large-scale tests are likely to become more innovative over time, using advances in assessment technology to address different contexts of the learning. For example, the external assessment for the International Baccalaureate Diploma Program (http://www.ibo.org/diploma/assessment/methods/) comprises essays; structured problems; short-response, data-response, text-response, case study, and multiple-choice questions; a theory of knowledge essay (a reflection on how we know what we know); an extended essay (self-directed piece of research culminating in a four-thousand-word paper); and world literature assignments.

Another example comes from the New Hampshire State Department of Education, which has partnered with the Center for Collaborative Education to develop a performance assessment system to measure student mastery of college- and career-ready competencies that balances local control with statewide accountability and comparability.
On the New Hampshire Department of Education website (http://www.education.nh.gov/assessment-systems/), the following description outlines the parameters of the statewide performance-based system: “This system, the Performance Assessment for Competency Education (PACE) option, will include: common performance tasks that have high technical quality, locally designed performance tasks with guidelines for ensuring high technical quality, regional scoring sessions, and local district peer review audits to ensure sound accountability systems and high inter-rater reliability, a web-based bank of local and common performance tasks, and a regional support network for districts and schools.”

Although a sizeable number of schools have been dabbling in the development of robust performance tasks, digital portfolios, exhibitions, student-led conferences, service learning projects, and expanded forms of independent study, they all face the challenge of documenting those pieces in a traditional reporting system. Personalized learning can be a messy process, which is one of the reasons that traditional education has held it at bay for so long. It’s difficult to offer a multitude of learning opportunities for individual students based on their needs and interests. It’s even more difficult to capture student performance on these personalized journeys and to organize results in a way that both students and educators can make sense of, let alone draw insights from.

In a personalized learning environment, a great deal of learning, especially in the area of cross-disciplinary outcomes, is demonstrated within the learning environment and alongside learning tasks. As mentioned, evidence of learning and performance in personalized learning environments can be messy. Evidence of performance may come from different sources, such as artifacts of student learning, metacognitive journals, reflections, and input from various audiences. Evidence may also come from different places—personalized modules, adaptive technology, online sources outside the school setting, and the like. To top it all off, criteria for success are not always standardized, especially in the area of cross-disciplinary outcomes, which tend to be derived from an organization’s vision and mission. Not only does the articulation of cross-disciplinary outcomes vary from place to place, but students may demonstrate achievement of the same outcomes in different ways. Lining up everyone for the end-of-unit test is not likely to be the norm in personalized learning environments, especially where cross-disciplinary outcomes are involved.

In a personalized learning environment, a rich and varied set of student performances provides valid indicators of learning, and these must be captured and organized to be both recognized and useful. Current grade books and report cards are not equipped to deal with this rich messiness. We need a platform that can capture, organize, and present such evidence alongside both disciplinary and cross-disciplinary outcomes.

Here, again, is where contemporary technology can help. There are an increasing number of learning management systems (LMSs) that will track student performance on learning modules. These systems can capture all of the actions and results of students using the system, from the amount of time spent on a particular learning module to the number of times a student attempted a problem to the aggregate score accumulated by a student across many learning modules within a particular area of study. For example, Khan Academy (www.khanacademy.org) enables students to access their learning modules and their performance on related assessments. These results are then presented back to the student in useful infographics that allow him to gain an overview of his learning over time and make observations about his progress.

Project Foundry (www.projectfoundry.org) is a system designed to support personalized learning. This platform has been around since the mid-1990s and supports learning in project-based learning environments. The system enables students to develop personalized projects and align them to existing disciplinary standards. Assessment of student performance is entered into the system, which will generate standards-based report cards from these projects and allow students to “drill down” into their performance on various standards.

Summit Public Schools (www.summitps.org) has chosen to develop its own Personalized Learning Plan platform to support its vision of learning in the twenty-first century. The platform offers students learning modules from a menu supported by a third-party delivery platform. Students can access their assessment results, set goals, and manage their learning within an environment that has removed many of the obstacles to personalized learning. A mixture of projects, face-to-face time, and digital delivery and tracking tools help Summit’s students pursue learning anywhere, anytime, and through a variety of modes.
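The kind of per-module tracking described above can be sketched as a minimal data model. The Python below uses hypothetical names (it is not the API of Khan Academy, Project Foundry, or any real LMS): each attempt updates time spent, attempt count, and best score, and a summary aggregates across modules the way a dashboard infographic might.

```python
from dataclasses import dataclass, field

@dataclass
class ModuleActivity:
    """One student's tracked activity on a single learning module."""
    module_id: str
    minutes_spent: float = 0.0
    attempts: int = 0
    best_score: float = 0.0  # proportion correct, 0.0 to 1.0

@dataclass
class StudentRecord:
    """Accumulates activity across modules within an area of study."""
    student_id: str
    activities: dict = field(default_factory=dict)  # module_id -> ModuleActivity

    def log_attempt(self, module_id, minutes, score):
        act = self.activities.setdefault(module_id, ModuleActivity(module_id))
        act.minutes_spent += minutes
        act.attempts += 1
        act.best_score = max(act.best_score, score)

    def summary(self):
        """Aggregate view of the kind an LMS dashboard might present."""
        total_minutes = sum(a.minutes_spent for a in self.activities.values())
        total_attempts = sum(a.attempts for a in self.activities.values())
        mastered = [a.module_id for a in self.activities.values()
                    if a.best_score >= 0.8]  # hypothetical mastery threshold
        return {"minutes": total_minutes, "attempts": total_attempts,
                "mastered": mastered}

record = StudentRecord("s-001")
record.log_attempt("fractions-1", minutes=12, score=0.6)
record.log_attempt("fractions-1", minutes=8, score=0.9)   # retake improves best score
record.log_attempt("fractions-2", minutes=15, score=0.7)
print(record.summary())
# {'minutes': 35.0, 'attempts': 3, 'mastered': ['fractions-1']}
```

Note that the retake raises the best score rather than averaging it away, which mirrors the idea that authentic tasks recur and are worth retaking.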

The challenge for the modern LMS is to provide for existing needs while at the same time innovating to meet emerging needs, all in a comprehensive package. This requires an openness and agility with which more traditional and monolithic software companies have difficulty. Developers are often more adept at creating tools to support linear processes than tools to support the relationships and messiness involved in personalized learning. But they are getting closer. For example, Schoology (https://www.schoology.com/home.php) integrates many tools to support the processes involved with “being a teacher” or “being a student.” It is much more open in its architecture, allowing innovative connections with other systems and apps. However, like most comprehensive LMSs, it organizes the learning environment using a set of structured rules and relationships. These are often one-way in nature. They help us achieve existing tasks, but do not have the flexibility to accomplish new ones. The challenge is to break from these established patterns.

Although these systems are valuable components of a personalized learning environment, their focus is primarily on the attainment of disciplinary outcomes through standardized methods and assessments (Summit has gone further, weaving these tools into a reengineered learning environment). Even so, they miss one of the main goals of our vision for personalized learning: cross-disciplinary outcomes. Although there are some assessments out there for critical thinking and creativity, these tend to be isolated and separated from an environment within which students can authentically demonstrate them. Also, we have yet to see an LMS or other system that adequately captures performance on cross-disciplinary outcomes alongside performance on disciplinary outcomes. For a couple of years now, Greg has been working with a few “lighthouse” schools around the world and an educational technology company, EduTect (http://edutectinc.com), to design such a dynamic platform. The platform is called LearningBoard, and we believe it has great potential to support personalized learning from both the student and school perspectives. The LearningBoard may seem like a leap from most forms of assessment and reporting, but this platform helps students, staff, and families see the richness and diversity of student performance in a personalized learning environment.

The LearningBoard combines elements to both capture student learning and provide tools (such as project planners, goal-setting and tracking tools, and student-led conference planners) to support learning in a personalized environment. Its performance elements span both disciplinary and cross-disciplinary outcomes.

In this way, the LearningBoard moves beyond a content delivery system or a data warehouse for students’ marks on assessments. It becomes a dynamic tool that encapsulates the organization’s vision of successful learning in a personalized environment, where performance on both disciplinary and cross-disciplinary outcomes engages students in their own development and growth.

Within this platform, there are three types of evidence of student learning (see table 6.4).

Table 6.4 Types of Evidence in the LearningBoard Platform

  • Artifact. Contributed by the student. Source: any tangible product of learning. Criteria: demonstration of disciplinary outcomes and cross-disciplinary outcome performance indicators.
  • Reflection. Contributed by the student. Source: any learning experience. Criteria: reflection on how cross-disciplinary outcomes, performance indicators, and mindsets are demonstrated in the work, and on areas of possible improvement.
  • Assessment. Contributed by the teacher. Source: any culminating or summative assessment. Criteria: demonstration of disciplinary outcomes and cross-disciplinary outcome performance indicators.

When evidence of student learning is captured and interpreted with explicit alignment to clear disciplinary and cross-disciplinary outcomes, we can develop an extremely rich picture of student learning that continues to grow, in contrast to a set of separate assignments that are averaged together to produce a score. Because each LearningBoard belongs to a unique individual, the picture that emerges is equally unique, and the entire platform is a demonstration of personalized learning. Schools can use this to help navigate the messiness of personalized learning and maintain accountability. Key stakeholders (staff, students, family members) now have the data to support ongoing improvement efforts and a culture of attention to the individual within the whole.
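To make the contrast concrete, here is a minimal sketch in Python of how the three evidence types in table 6.4 might be modeled. The names and structure are hypothetical (this is not EduTect's actual LearningBoard implementation): evidence is accumulated per outcome so the picture grows over time, instead of separate assignments being averaged into one score.

```python
from dataclasses import dataclass, field
from collections import defaultdict

# Evidence types from table 6.4: artifacts and reflections come from the
# student; assessments come from the teacher.
EVIDENCE_TYPES = {"artifact", "reflection", "assessment"}

@dataclass
class Evidence:
    kind: str         # one of EVIDENCE_TYPES
    contributor: str  # "student" or "teacher"
    description: str
    outcomes: list    # disciplinary and cross-disciplinary outcomes demonstrated

@dataclass
class LearningPicture:
    """A growing per-student picture of learning: evidence accumulates
    per outcome rather than being averaged into a single score."""
    student_id: str
    evidence: list = field(default_factory=list)

    def add(self, item: Evidence):
        assert item.kind in EVIDENCE_TYPES
        self.evidence.append(item)

    def by_outcome(self):
        picture = defaultdict(list)
        for item in self.evidence:
            for outcome in item.outcomes:
                picture[outcome].append(item.description)
        return dict(picture)

board = LearningPicture("s-001")
board.add(Evidence("artifact", "student", "documentary video",
                   ["communication", "history"]))
board.add(Evidence("reflection", "student", "journal on collaboration",
                   ["collaboration"]))
board.add(Evidence("assessment", "teacher", "unit performance task",
                   ["history"]))
print(board.by_outcome())
# {'communication': ['documentary video'],
#  'history': ['documentary video', 'unit performance task'],
#  'collaboration': ['journal on collaboration']}
```

Because each outcome simply gathers more evidence as it arrives, the same platform can hold a unique picture for every student, which is the "rich picture that continues to grow" described above.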

Element 11: Time

A personalized learning system uses time as a flexible resource depending on the nature of the challenge and the skill level of the students (table 6.5). Blended learning approaches that support student mastery of the competencies make learning more meaningful and practical for students, staff, families, and community members. If we are moving toward learning that can occur anywhere, anytime, then what we do in the brick-and-mortar, online, or community learning space to further the competencies is more important than the set number of minutes assigned to a particular class.

Table 6.5 Personalized Learning Evolution: Time

Time: When can/does learning occur?
  • Minimal Student Input: Schooling is defined by “seat time”—prescribed number of school days (e.g., 180 days, Carnegie units).
  • Some Student Input: Schooling is a more variable blend of time-based and outcome-based measures.
  • Student Driven: Schooling can take place 24/7, 365 days a year and be determined by outcome-based measures.

With greater time flexibility, every student can receive customized supports and accelerated opportunities both in and outside school to ensure that all students stay on track to graduate ready for college and career. The goal is to provide learning opportunities and meet the educational needs and interests of all children. This opens the door for learning that extends beyond the traditional school day and calendar.

Using time as a resource rather than as a fixed structure will require leadership at the local, state, ministry, and national levels to change policies and practices that continue to define learning as “seat time” moments rather than as demonstrated learning accomplishments.

Element 12: Advancement

Robust and varied advancement strategies should be implemented to inform students, families, school and district staff, and state officials about individual and group progress in relation to the competencies (see table 6.6). Such strategies should feature an evidence-based collection of student tasks and tests designed around clearly defined competencies. Advancement can be used as a meaningful feedback loop through which students, staff, and family members check progress toward competency targets. This would likely make formal reporting cycles unnecessary, preventing an artificial rush to get marks in by a certain date. Communication with the family still remains essential and is perhaps more robust, through student-led conferences, formal demonstrations (for example, portfolios or exhibitions), and opportunities for family members to log on to a system (for example, LearningBoards) and engage with the teacher.

Table 6.6 Personalized Learning Evolution: Advancement

Advancement: How does a student progress through the system?
  • Minimal Student Input: Student is advanced based on age, irrespective of achievement.
  • Some Student Input: Promotion or retention at the end of the year is based on achievement in the course or grade level.
  • Student Driven: Advancement is based on demonstrated competency whenever that is achieved.

At the state or ministry level, policymakers and education officials are beginning to open up what school can become. For example, on its website, the Maine Department of Education (http://www.maine.gov/doe/plan/education_evolving/cpa3.html) advocates for a learner-centered model based on demonstration of mastery rather than movement based on age: “The system of schools we have today is one in which time is the constant and learning is the variable. Teachers and students are given a fixed period of time in which to cover a fixed curriculum. The result is a model that falls short of meeting the needs of all students. In a learner-centered, proficiency-based system, students advance upon demonstration of mastery, rather than remain locked in an age-based cohort that progresses through a fixed curriculum at a fixed pace, regardless of learning achievement.”

Again, we continue to reinforce that standards are not the barrier to personalized learning; the barrier lies in the design of local curriculum and related instruction with an emphasis on standardized assessment and teacher control.

Conclusion and Reflective Questions

This chapter has described elements of a very different type of schooling: one defined by rich and varied artifacts that demonstrate growth over time rather than by a collection of scores, by mastery rather than by seat time, and by learning that happens 24/7 rather than at set times of the day and year. The system shifts described in relation to assessment of learning, time, and advancement call into question the assumptions we have about how students demonstrate mastery or competency on disciplinary and cross-disciplinary outcomes. As teachers’ and students’ roles evolve, so, too, should those of leaders and policymakers.

Consider these questions:

  1. In your estimation, how far are we from the tipping point where personalized learning is more widespread? What might that tipping point look like? What are you doing in preparation for this?
  2. In what ways are there design innovations occurring in your classroom, school, or program? (These can be officially sanctioned pilots, quiet campaigns, or covert actions.)
  3. What is still standing in the way of the evolution of schooling in your community?
  4. What was your reaction when you read the various policy examples in the chapter?

Works Cited

  1. Berger, R. An Ethic of Excellence: Building a Culture of Craftsmanship with Students. Portsmouth, NH: Heinemann, 2003. Print.
  2. Clarke, J. Personalized Learning: Student-Designed Pathways to High School Graduation. Thousand Oaks, CA: Corwin, 2013. Print.
  3. Richardson, W. Why School? How Education Must Change When Learning and Information Are Everywhere. New York: TED Conference, 2012. Web.
  4. Wiggins, G. “A True Test: Toward More Authentic and Equitable Assessment.” Phi Delta Kappan 70.9 (1989): 703–713. Print.
  5. Wiggins, G. “Authenticity in Assessment, (Re)defined and Explained.” Granted, and… Grant Wiggins, 26 Jan. 2014. Web. <http://grantwiggins.wordpress.com/2014/01/26/authenticity-in-assessment-re-defined-and-explained/>.
  6. Wiggins, G. “Engagement and Personalization: Feedback Part 2.” Granted, and … Grant Wiggins, 19 April 2014. Web. 20 May 2014. <http://grantwiggins.wordpress.com/2014/04/19/engagement-and-personalization-feedback-part-2/>.