Metacognition, the Representativeness Heuristic, and the Elusive Transfer of Learning

by Dr. Lauren Scharff, U. S. Air Force Academy*

When we instructors think about student learning, we often default to immediate learning in our courses. However, when we take a moment to reflect on our big picture learning goals, we typically realize that we want much more than that. We want our students to engage in transfer of learning, and our hopes can be grand indeed…

  • We want our students to show long-term retention of our material so that they can use it in later courses, sometimes even beyond those in our disciplines.
  • We want our students to use what they’ve learned in our course as they go through life, helping them both in their profession and in their personal lives.

These grander learning goals often involve ways of thinking that we endeavor to develop, such as critical thinking and information literacy. And, for those of us who believe in the broad value of metacognition, we want our students to develop metacognition skills. But, as some of us have argued elsewhere (Scharff, Draeger, Verpoorten, Devlin, Dvorakova, Lodge & Smith, 2017), metacognition might be key for the transfer of learning and not just a skill we want our students to learn and then use in our course.

Metacognition involves engaging in intentional awareness of a process and using that awareness to guide subsequent behavioral choices (self-regulation). In our 2017 paper, we argued that students don’t engage in transfer of learning because they aren’t aware of the similarities of context or process that would indicate that some sort of learning transfer would be useful or appropriate. What we didn’t explore in that paper is why that first step might be so difficult.

If we look to research in cognitive psychology, we can find a possible answer to that question – the representativeness heuristic. Heuristics are mental short-cuts based on assumptions built from prior experience. There are several different heuristics (e.g. representativeness heuristic, availability heuristic, anchoring heuristic). They allow us to more quickly and efficiently respond to the world around us. Most of the time they serve us well, but sometimes they don’t.

The representativeness heuristic occurs when we attend to obvious characteristics of some type of group (objects, people, contexts) and then use those characteristics to categorize new instances as part of that group. If obvious characteristics aren’t shared, then the new instances are categorized separately.

For example, if a child is out in the countryside for the first time, she might see a four-legged animal in the field. She might be familiar with dogs from her home. When she sees the four-legged creature in the field, she might immediately categorize the new creature as a dog based on that shared characteristic. Her parents will correct her and say, “No. Those are cows. They say moo moo. They live in fields.” The young girl next sees a horse in a field. She might proudly say, “Look, another cow!” Her patient parents will now have to add characteristics that will help her differentiate between cows and horses, and so on. At some level, however, the young girl must also learn meta-characteristics that connect all these animals as mammals: warm-blooded, furred, live-born, etc. Some of these characteristics may be less obvious from a glance across a field.

Now – how might this natural, human way-of-thinking impact transfer of learning in academics?

  • To start, what are the characteristics of academic situations that support the use of the representativeness heuristic in ways that decrease the likelihood of transfer of learning?
  • In response, how might metacognition help us encourage transfer of learning?

There are many aspects of the academic environment that might answer the first question – anything that leads us to perceive differences rather than connections. For example, math is seen as a completely different domain than literature, chemistry, or political science. The content and the terminology used by each discipline are different. The classrooms are typically in different buildings and may look very different (chemistry labs versus lecture halls or small group active learning classrooms), none of which look or feel like the physical environments in “real life” beyond academics. Thus, it’s not surprising that students do not transfer learning across classes, much less beyond classes.

In response to the second question, I believe that metacognition can help increase the transfer of learning because both mental processes rely on awareness/attention as a first step. Representativeness categorization depends on which characteristics are attended to. Without conscious effort, the attended characteristics are likely to be those that are most superficially obvious, which in academics tend to highlight differences rather than connections.

But, with some guidance and encouragement, other less obvious characteristics can become more salient. If these additional characteristics cross course/disciplinary/academic boundaries, then opportunities for transfer will enter awareness. The use of this awareness to guide behavior, transfer of learning in this case, is the second step in metacognition.

Therefore, there are multiple opportunities for instructors to promote learning transfer, but we might have to become more metacognitive about the process in order to do so. First, we must develop awareness of connections that will promote transfer, rather than remaining within the comfort zone of our disciplinary expertise. Then we must use that awareness and self-regulate our interactions with students to make those connections salient to students. We can further increase the likelihood of transfer behaviors by communicating their value.

We typically can’t do much about the different physical classroom environments that reinforce the distinctions between our courses and nonacademic environments. Thus, we need to look for and explicitly communicate other types of connections. We can share examples to bridge terminology differences and draw parallels across disciplinary processes.

For example, we can point out that creating hypotheses in the sciences is much like creating arguments in the humanities. These disciplinary terms sound like very different words, but both involve a similar process of thinking. Or we can point out that MLA and APA writing formats are different in the details, but both incorporate respect for citing others’ work and give guidance for content organization that makes sense for the different disciplines. These meta-characteristics unite the two formatting approaches (as well as others that students might later encounter) with a common set of higher-level goals. Without such framing, students are less likely to appreciate the need for formatting and may interpret the different styles as arbitrary busywork that doesn’t deserve much thought.

We can also explicitly share what we know about learning in general, which likewise crosses disciplinary boundaries. A human brain is involved regardless of whether it’s learning in the social sciences, the humanities, the STEM areas, or the non-academic professional world. In fact, Scharff et al. (2017) found significant positive correlations between thinking about learning transfer, thinking about learning processes, and the likelihood of using metacognitive awareness to guide practice.

Cognitive psychologists know that we can reduce errors that occur from relying on heuristics if we turn conscious attention to the processes involved and disengage from the automatic behaviors in which we tend to engage. Similarly, as part of a metacognitive endeavor, we can help our students become aware of connections rather than differences across learning domains, and encourage behaviors that promote transfer of learning.

Scharff, L., Draeger, J., Verpoorten, D., Devlin, M., Dvorakova, L., Lodge, J., & Smith, S. (2017). Exploring metacognition as support for learning transfer. Teaching and Learning Inquiry, 5(1). DOI: http://dx.doi.org/10.20343/5.1.6. A summary of this work can also be found at https://www.improvewithmetacognition.com/researching-metacognition/

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


How can I help students become more expert learners, so they engage in active learning?

by Stephanie Chasteen, University of Colorado Boulder

This chapter focuses on helping students engage productively in active learning classrooms by teaching students to reflect on their learning and to develop productive mindsets toward learning. It is part of a series on helping students engage productively in active learning classrooms and includes a list of tangible teaching and student metacognition strategies to use when working with students.


Embedding Metacognition into New Faculty Orientation

By Lauren Scharff, Ph.D., U. S. Air Force Academy *

When and how might faculty become aware of metacognition in general, how student metacognition might enhance student learning, and how personal metacognition might enhance their own teaching? Ideally, faculty would have learned about metacognition as students and thereafter consciously engaged in metacognitive practices as learners and developing professionals. Based on conversations with many faculty members, however, this is not the case. It certainly wasn’t the case for me. I don’t remember even hearing the term metacognition until after many years of working as a professor. Even now, most new faculty seem to have only a vague familiarity with the term “metacognition” itself, and few claim to have spent much time considering how reflection and self-regulation, key components of metacognition, should be part of their own practice or part of the skill set they plan to help develop in their students.

While this reality is not ideal (at least for those of us true believers in the power of metacognition), realization of this lack of understanding about metacognition provides opportunities for faculty development. And why not start right at the beginning when faculty attend new faculty orientation?

New Faculty Orientation

At my institution this summer, we did just that. Our Director of Faculty Development, Dr. Marc Napolitano, worked the topic into his morning session on student learning. We designed a follow-on, small-group discussion session that encouraged faculty to actively engage in reading, personal application, and discussion of metacognition.

The reading we chose was one of my favorite metacognition articles, Promoting Student Metacognition, by Dr. Kimberly Tanner (2012). The session was only 40 minutes, so we had them read just a few pages of the article for the exercise, including her Table 1, which provides a series of questions students can ask themselves when planning, monitoring, and evaluating their learning for a class session, while completing homework, and while preparing for an exam. We had the new faculty jot down some reflections based on their responses to several guided prompts. Then we had time for discussion. I facilitated one of the small groups and was thus able to hear some of their responses first-hand.

Example questions:

  • What type of student were you as an undergraduate? Did you ever change your approach to learning as you went through school?
  • You obviously achieved success as an undergraduate, but do you think that you could have been more successful if you had better understood the science of learning and had teachers incorporate it into their courses?
  • If you had to share a definition of metacognition [from the reading] with students – and explain to them why it is an essential practice in learning – which definition would you use and how would you frame it with students?
  • If you wished to incorporate metacognition into your class, what approach(es) currently seems most practical for you? Why?
  • Which 3-4 of the questions in Table 1 seem like they would be most helpful to use in your class? Why do these questions stand out, and how might they shape your class?

The discussion following the reading and reflection time was very rich. Only one member of my group of eight reported a good prior understanding of metacognition and how it could be incorporated into course design (she had just finished a PhD in physics education). Two others reported having vague prior familiarity with the term. However, after participating in these two faculty development sessions, all of them agreed that learning about the science of learning would have been valuable as a student regardless of level (K-12 through graduate school).

The faculty in my group represent a wide variety of disciplines, so the ways of incorporating metacognition and the questions from the table in the reading that most appealed to them varied. However, that is one of the wonderful things about designing courses or teaching practices to support student metacognition – there are many ways to do so. Thus, it’s not a problem to fit them to your way of teaching and your desired course outcomes.

We also spent a little time discussing metacognitive instruction: being aware of their choices as instructors and their students’ engagement and success, and using that awareness to guide their subsequent choices as instructors to support their students’ learning. They quickly understood the parallels with student metacognitive learning (students being aware of their choices and whether or not those choices are leading to success, and using that awareness to guide subsequent choices related to their learning). Our small groups will continue to meet throughout the coming year as a continuation of our new faculty development process. I look forward to continuing our conversations and further supporting them in becoming metacognitive instructors and promoting their students’ development as metacognitive learners.

————

Tanner, K. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11, 113–120.

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Academic Advising Tools through a Metacognitive Lens

Heather Mitchell, Associate Professor of Psychology, Webster University, hmitchell33@webster.edu; Kim Kleinman, Director of Undergraduate Advising, Webster University, kleinman@webster.edu; & Ronald Daniel, Director, Webster University London, ronalddaniel93@webster.edu

Background / Motivation

Metacognitive practices in advising have been documented (e.g., Freeman, 2008; Sullivan-Vance, 2008), and the literature is full of suggestions on how to support purposeful student engagement and learning outcomes through advising (e.g., Campbell & Nutt, 2008; Willcoxson & Wynder, 2010). For example, academic advising is a type of student learning, and we know metacognitive approaches are a best practice for improving such learning. By infusing academic advising with such metacognitive tools, we enhance intentional student engagement through the process of advising.

Why do I have to take this class? How is this requirement going to benefit me? When will I ever use this information again? Questions like these led us to develop the advising syllabus and curriculum planner as tools to use in academic advising. Purposeful advising is a critical component of higher education as we prepare students to be responsible, global citizens in the 21st century. Additionally, metacognition can be an extremely useful tool in an effort to promote student achievement.

Nuts and Bolts / Method

This report provides a brief overview of two advising practices (i.e., advising syllabi and curriculum planners) we use to help deliver successful, engaging experiences for students. The advising syllabus and planner are both metacognitive in nature and thus can help each student and advisor remain intentional and reflective about the student’s college career. Webster University’s advising center and individual faculty in the college refine the advising syllabus as needed for use with students. Additionally, curriculum planners, or the “Planner,” originally developed at Virginia Tech, provide a helpful organizational tool for students to use while laying out their academic path in higher education. Students at Webster University’s Geneva, Switzerland campus and students at Webster’s St. Louis campus have benefitted from these tools.

Advising Syllabus. Such a syllabus includes an advising mission statement or philosophy of advising and allows advisors to outline expectations and responsibilities for both students and advisors. See Appendix A: Undergraduate Advising Syllabus as an example. Learning outcomes and a timeline/calendar of advising events are key components of such a syllabus, as is a list of resources an advisee may find useful. Advising is an essential component of an educational mission, and an advising syllabus helps underscore the importance of advising, similar to the way course syllabi are a regular part of every student’s classroom education. Individual advisors, whether professional advisors from the University’s Advising Center or individual faculty advisors, personalize the specific criteria, descriptions, learning outcomes, and responsibilities for their advisees.

The Planner. Both paper and online versions of the Planner have been created. Computer science students at Virginia Tech developed the online version of the Planner as a way of “saving” the first draft of a holistic academic plan, including curricular and extra-curricular components. Both the individual student and their advisor must provide the specific knowledge and details concerning career information, interests, and plans such as graduate school, technology competencies, and language competencies. Students now commonly use paper versions of the Planner, which require crucial information such as the student’s course and activity interests as well as details about the availability of those courses and/or activities. See Appendix B: Planner Template.

Specifically, the Planner provides students with an opportunity to “map out” their remaining time, requirements, and other activities so they can make the most of their college experience. Students are asked to review the requirements for obtaining their specific degree, and they are provided with various resources (through web links and/or Advising Worksheets appropriate to their major) to use in creating their Planner. To complete the Planner, students must include the courses taken as well as those they plan to take in their academic career. In addition to coursework, students should include any other experiences relevant to their own professional development (e.g., volunteer opportunities, research involvement, study abroad, or internships). Use of the Planner is certainly varied (similar to use of the Advising Syllabus). Advisors encourage students to plan, monitor, and evaluate their academic and co-curricular progress with these planners. Students also should note on the Planner when they might begin to search for and apply to jobs, graduate programs, etc.

Outcomes / Lessons Learned

Formal investigations of these tools have not been conducted; however, anecdotal evidence suggests students and faculty have found these tools beneficial. For example, when reflecting on the Planner, one student commented in their course evaluation that [the Planner] “is a great opportunity to identify goals and get a really actionable plan in place to achieve it.” Students appear to benefit from these tools most when they are provided ample time to understand, create, and appropriately adjust the specific mechanisms of both tools. These tools are ideal when engaged as iterative experiences. Specifically, advisors first provide appropriate scaffolding to advisees by introducing these tools. Additionally, the metacognitive nature of both tools allows students to move beyond knowing and understanding their academic requirements to analyzing, evaluating, and creating their plan to meet such requirements. In other words, the tools reflect a shift from the bottom, or lower-level, skills of Bloom’s taxonomy to the top, or higher-level, skills. The level of specific metacognitive guidance each advisor provides advisees is also completely variable, as neither of these tools is mandatory. Both tools simply provide a metacognitive lens through which advisors and students can view the advising process.

References

Campbell, S.M., & Nutt C.L. (2008). Academic advising in the new global century: Supporting student engagement and learning outcomes achievement. Peer Review, 10, 4-7.

Freeman, L.C. (2008). Establishing effective advising practices to influence student learning and success. Peer Review,10, 12-14.

Sullivan-Vance, K. (2008). Reenvisioning and revitalizing academic advising at Western Oregon University. Peer Review,10, 15-17.

Willcoxson, L., & Wynder, M. (2010). The relationship between choice of major and career, experience of university and attrition. Australian Journal of Education, 54, 175-189.


Facilitating Student Success through Awareness of One’s Own Study Habits

Randi Shedlosky-Shoemaker, York College of Pennsylvania, rshedlos@ycp.edu

Background

As an academic advisor for new college students, I often see them struggle to understand the differing demands of high school versus college, particularly in terms of self-regulation. Professors generally expect their students to complete work outside of class, including assigned and self-initiated activities (e.g., reviewing material). Given students’ experience with homework during their K-12 years, professors and advisors may be tempted to presume college students are well practiced at completing such work and already know how to regulate related behaviors. Although research suggests that self-monitoring and knowledge of how students learn improve as students age (Brown & Smiley, 1977; Pressley, Levin, & Ghatala, 1984), college students can still struggle with this metacognitive skill (Pressley & Ghatala, 1990). This impaired metacognition may be due in part to a lack of transition training.

In high school, it was likely easier for students to determine what work they had to do because someone told them what to do. College presents a different environment; perhaps for the first time, students become largely—if not solely—responsible for regulating their studying. Although students may still have assigned homework, they also have to decide what course materials to read, what to study and how, and what and when to review. This leap from being highly guided by others to being responsible for regulating their own studying can present a challenge to college students, particularly when no intermediate steps help scaffold students’ learning of self-regulation and metacognitive skills. To assist students in understanding their own study habits, I set out to examine what role weekly study reports could play in students’ overall academic performance.

Method

I recruited undergraduate students enrolled at a mid-sized private four-year college to participate in a semester-long study on study habits. Interested students could complete up to 12 weekly study reports (adapted from Bembenutty & White, 2013; see Appendix). Through the online report, students recorded what assigned and self-initiated work they completed during the past week. Students also completed a survey at the beginning and end of the semester, measuring their feelings of motivation related to their courses (items adapted from the Intrinsic Motivation Inventory, including the choice, tension, effort, enjoyment, and value subscales; Ryan, 1982) and other academic factors (e.g., high school GPA, cumulative college GPA, credits enrolled during the current semester). In the final survey, students assessed their experience using the study reports. Students who completed both surveys and submitted at least eight weekly reports were entered into a raffle to win a gift card to the college bookstore.

Outcomes/Lessons Learned

Of the 77 students I observed during the semester, most (n = 64, 83%) submitted at least one of the 12 reports; 36 students (47%) completed at least eight reports and 10 students (13%) submitted all 12 reports. Cumulative GPA prior to that semester was unrelated to the number of reports submitted, r(72) = 0.15, p = .21, as was academic standing based on earned credits, r(77) = -.12, p = .29. Motivation measures were unrelated to the number of reports submitted, except for choice: students who felt they had more choice in selecting their classes also submitted more reports, r(75) = .26, p = .02.

To examine the relation to academic performance that semester, I conducted a multiple regression analysis, including previous cumulative GPA (i.e., prior to the start of the study), number of credits earned (i.e., academic standing), perceived choice in taking their courses, and number of study reports completed as potential predictors. Only two of the variables predicted semester GPA: previous cumulative GPA (B = 0.73, t = 9.54, p < .001) and number of reports submitted (B = 0.19, t = 2.33, p = .02). Because previous academic success was accounted for and did not predict how many study reports students completed, it seems unlikely that the positive relation between the number of reports completed and semester GPA could merely be attributed to a “good student” effect (i.e., good students being more inclined both to complete the reports and to engage in behaviors that improved their academic performance). Finally, none of the measures of motivation predicted whether students completed the weekly reports.
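For readers curious how such an analysis is carried out, the multiple regression described here can be sketched briefly in Python. The data below are synthetic placeholders generated only to mimic the reported pattern; the sample size, variable ranges, and coefficients are illustrative assumptions, not the study’s actual data.

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares: returns [intercept, b1, b2, ...]."""
    X1 = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

# Synthetic stand-ins for the four predictors used in the study.
rng = np.random.default_rng(0)
n = 77
prev_gpa = rng.uniform(2.0, 4.0, n)              # previous cumulative GPA
credits = rng.integers(0, 90, n).astype(float)   # credits earned
choice = rng.uniform(1.0, 7.0, n)                # perceived choice
reports = rng.integers(0, 13, n).astype(float)   # study reports completed

# Simulate semester GPA driven mainly by previous GPA and reports,
# echoing the two significant predictors reported above.
sem_gpa = 0.5 + 0.73 * prev_gpa + 0.02 * reports + rng.normal(0, 0.2, n)

beta = fit_ols(np.column_stack([prev_gpa, credits, choice, reports]), sem_gpa)
# beta[1] recovers a coefficient near 0.73 for previous GPA.
```

In practice one would use a statistics package that also reports t values and p values (e.g., statsmodels in Python or lm() in R); the point here is only the structure of the model, with semester GPA regressed on all four predictors at once.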

Among the 41 students who completed the final survey, most indicated that the reports were helpful (n = 34, 83%) and felt they gained useful insight by completing them (n = 30, 73%). Among the open-ended remarks, students noted that the reports helped them realize how much work they were (or were not) doing and how much time tasks/classes required. Several students remarked that they had learned about their own study habits, including common distractors they struggled with, inadequate strategies they used, and their own failures in time management. In light of students’ remarks, the reports appeared to provide an opportunity for students to regularly and explicitly think about their own study habits. In doing so, students may develop improved metacognition related to their own learning.

If the reports were employed more intentionally as a learning tool, instructors and advisors could go beyond helping students develop a heightened awareness of their study behaviors. Advisors could have new students, or students who are struggling academically, maintain a journal of their studying during a set period of time and then provide individualized suggestions to the student in an informal one-on-one conversation. Expecting students to have an actual record means that the advisor and student are not relying on a student’s autobiographical introspection about their behaviors to understand current problems or develop future studying plans. In a class, particularly for courses that address effective learning strategies as a student learning outcome (e.g., first-year seminar courses, major orientation courses), instructors could create more structured engagement with the tool by having students not only record behaviors but also identify patterns of behavior over time and make evidence-based plans for future studying behaviors. Such a strategy offers an opportunity to incentivize completing the reports (e.g., course credit) and could be a valuable stepping stone for students as they transition from highly-guided learner to self-directed learner.

References

Bembenutty, H., & White, M. C. (2013). Academic performance and satisfaction with homework completion among college students. Learning and Individual Differences, 24, 83-88. doi: 10.1016/j.lindif.2012.10.013

Brown, A. L., & Smiley, S. S. (1977). Rating the importance of structural units of prose passages: A problem of metacognitive development. Child Development, 48, 1-8. doi: 10.2307/1128873

Pressley, M., & Ghatala, E. S. (1990). Self-regulated learning: Monitoring learning from text. Educational Psychologist, 25, 19-33. doi: 10.1207/s15326985ep2501_3

Pressley, M., Levin, J. R., & Ghatala, E. S. (1984). Memory strategy monitoring in adults and children. Journal of Verbal Learning and Verbal Behavior, 23, 270-288. doi: 10.1016/S0022-5371(84)90181-6

Ryan, R. M. (1982). Control and information in the intrapersonal sphere: An extension of cognitive evaluation theory. Journal of Personality and Social Psychology, 43, 450-461. doi: 10.1037/0022-3514.43.3.450


Improving Metacognition with Pre- and Post-Exam Reflection Exercises (Academic Advising)

Kyle E. Conlon (conlonke@sfasu.edu) and Lauren E. Brewer (brewerle@sfasu.edu), Stephen F. Austin State University

Background/Motivation: We teach psychology at a large southern university of approximately 13,000 students. Many of our students, especially freshmen and first-generation students, possess ineffective study strategies that, understandably, lead to considerable frustration. They attend class, take careful notes, ask questions—do all the things we encourage them to do—and yet still underperform on their exams, leading them to ask, “What am I doing wrong?” When we ask students about their study strategies, we tend to find that they (1) rely on poor strategies (e.g., highlighting) and (2) lack insight into why their strategies aren’t working. Hence, we were motivated to create short pre- and post-exam reflection exercises to help students gain metacognitive awareness of their own study strategies.

Nuts & Bolts/Method: The pre-exam reflection exercise was designed for students to reflect on their exam preparation strategies and to identify obstacles to their studying (Appendix Table 1). The post-exam reflection exercise was designed for students to reflect on their exam performance and to determine whether it was necessary to change their study strategies for the next exam (Appendix Table 2). Fifty students (38 women, Mage = 21.10) across three psychology classes consented to participate. Each student completed four exams, yielding 200 discrete observations in which a student could have completed no reflections (n = 154), pre-exam reflections only (n = 18), post-exam reflections only (n = 8), or both pre- and post-exam reflections (n = 20). For this study, we compared exam grades for students who completed both pre- and post-exam reflections to exam grades for students who completed neither pre- nor post-exam reflections. Participation was voluntary and students were informed that they could withdraw from the study at any time without penalty. In exchange for their participation, participants were entered into a raffle for one of three $100 gift cards. These gift cards were distributed at the end of the semester after final grades were submitted.

Outcomes/Lessons Learned: The exam scores of students who completed both pre- and post-exam reflections (Mgrade = 86.60, SD = 11.01) were significantly higher than the exam scores of students who did not complete the reflections (Mgrade = 76.97, SD = 12.31), t(172) = 3.57, p < .001. Additionally, for each student, the number of exam reflections completed was positively correlated with exam average (r = .42, p = .03) and with final course grade (r = .50, p = .01).
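The group comparison reported here is a standard independent-samples t test. A minimal sketch of that computation follows; the two groups below are simulated from the reported means, standard deviations, and group sizes (an illustrative assumption, not the study’s raw data), which also shows where the 172 degrees of freedom come from (20 + 154 - 2).

```python
import numpy as np

def independent_t(a, b):
    """Pooled-variance (Student's) independent-samples t statistic."""
    na, nb = len(a), len(b)
    va, vb = np.var(a, ddof=1), np.var(b, ddof=1)          # sample variances
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (np.mean(a) - np.mean(b)) / np.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # t statistic and degrees of freedom

# Groups simulated from the reported summary statistics.
rng = np.random.default_rng(1)
both = rng.normal(86.60, 11.01, 20)      # completed both reflections
neither = rng.normal(76.97, 12.31, 154)  # completed neither reflection
t, df = independent_t(both, neither)     # df = 20 + 154 - 2 = 172
```

With real data, scipy.stats.ttest_ind performs the same computation and also returns the p value.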

Our goal was to create brief reflection exercises to help our students gain insight into the effectiveness of their study strategies. More recently, we’ve begun to share these exercises with our academic advisees, some of whom consider dropping or avoiding classes due to poor performance. Although our specific guidance depends on the advisee, we generally encourage them to apply the exercises to the exams in the course or courses in which they’re struggling. We also try to review their responses with them to foster their metacognitive awareness (e.g., “I see you’re highlighting your notes and rereading the text; why do you think these strategies aren’t working?,” “So you felt prepared for this exam but underperformed; why do you think this happened?”). These exercises, which could be used by any academic advisor, jumpstart a discussion with advisees about how to study, which often gives them a renewed sense of hope and perspective for overcoming obstacles in their courses. In some cases, we’ll share specific articles from the metacognition literature (e.g., Putnam, Sungkhasettee, & Roediger, 2016) that dovetail with the use of these exercises. We typically meet with advisees once a semester for course selection, but we both have an open-door mentoring policy and encourage (and sometimes require) follow-up meetings with advisees, particularly those who are struggling and would benefit most from these exercises. Our experience suggests that advisees (1) generally possess poor insight into their studying, (2) express surprise that their strategies aren’t as effective as they believe (or as research shows), and (3) through these exercises are forced to think through their study habits in a way they might not otherwise. We’re hopeful that improving advisees’ metacognition extends beyond the classroom to help improve their grades, motivate them beyond initial struggles, and prevent dropout.

References:

Gurung, R. A. R. (2005). How do students really study (and does it matter)? Teaching of Psychology, 32, 239–241.

Henderson, V., & Dweck, C. S. (1990). Motivation and achievement. In S. S. Feldman & G. R. Elliott (Eds.), At the threshold: The developing adolescent (pp. 308–329). Cambridge, MA: Harvard University Press.

Putnam, A. L., Sungkhasettee, V. W., & Roediger, H. L. (2016). Optimizing learning in college: Tips from cognitive psychology. Perspectives on Psychological Science, 11, 652-660.

Trockel, M. T., Barnes, M. D., & Egget, D. L. (2000). Health-related variables and academic performance among first-year college students: Implications for sleep and other behaviors. Journal of American College Health, 49, 125–131. doi: 10.1080/07448480009596294


Thinking like a Sociologist, but how? Using Reflective Worksheets to Enhance Metacognition in a Classroom with Diverse Learners

Mabel Ho, Department of Sociology, University of British Columbia
Katherine Lyon, Department of Sociology and Vantage One, UBC
Jennifer Lightfoot, Academic English Program, Vantage One, UBC
Amber Shaw, Academic English Program, Vantage One, UBC

Background and Motivation for Using Reflective Worksheets in Introductory Sociology

Research shows that for first year students in particular, lectures interspersed with active learning opportunities are more effective than either pedagogical approach on their own (Harrington & Zakrajsek, 2017). In-class reflection opportunities are a form of active learning shown to enhance cognitive engagement (Mayer, 2009), critical thinking skills (Colley et al., 2012), and immediate and long-term recall of concepts (Davis & Hult, 1997) while reducing information overload which can limit learning (Kaczmarzyk et al., 2013). Further, reflection conducted in class has been shown to be more effective than outside of class (Embo et al., 2014). Providing students with in-class activities which explicitly teach metacognitive strategies has been shown to increase motivation, autonomy, responsibility and ownership of learning (Machaal, 2015) and improve academic performance (Aghaie & Zhang, 2012; Tanner, 2012).

We created and implemented reflective worksheets (See Appendix) in multiple sections of a first-year sociology course at a large research university with a high proportion of international English as an Additional Language (EAL) students. While all first-year students must learn to navigate both the academic and discipline-specific language expectations of university, many international students face additional barriers: they must meet these new expectations in an additional language, and they may bring different cultural assumptions, such as being unfamiliar with the active learning and thought processes privileged in a Western academic institution. With both domestic and international students in mind, our aims with these reflective worksheets are to:

  • facilitate and enhance students’ abilities to notice and monitor their own disciplinary awareness and knowledge while promoting disciplinary comprehension and practices.
  • connect course material to personal experiences (micro) and social trends (macro).

Nuts and Bolts: Method

We structured individual writing reflection opportunities every 10-15 minutes in each lecture in the small (25 students), medium (50 students) and large (100 students) classes. Each lesson was one hour and students completed the worksheets during class time in five-minute segments. The worksheets had different question prompts designed to help students:

  • identify affective and cognitive pre-conceptions about topics
  • paraphrase or explain concepts
  • construct examples of concepts just learned
  • contrast terms
  • describe benefits and limitations of social processes
  • relate a concept to their own lives and/or cultural contexts
  • discover connections between new material and prior knowledge (Muncy, 2014)
  • summarize key lecture points (Davis & Hult, 1997)
  • reflect on their own process of learning (see Appendix for further examples)

The question prompts are indicative of how to think about a topic, rather than what to think. These reflective worksheets are a way to teach students to think like disciplinary specialists in sociology, which aligns with the course learning outcomes. Completed worksheets were graded by Teaching Assistants (TAs), who used the rubric below (see Table 1) to assess students’ application and critical thinking skills. By framing the worksheets as participation marks, students were motivated to complete the assigned work while learning how to approach sociology as a discipline. As suggested in “promoting conceptual change” (Tanner, 2012), some of the worksheets required students to recognize their preconceived notions and monitor their own learning and re-learning. For example, in one of the worksheets, students recorded their preconceptions about a social issue (e.g., marijuana usage) at the beginning of the lecture and returned to the same question at the end of class. Through this process, students gain a physical record of the evolution of their beliefs, whether that means recognizing and adjusting preconceived notions or deepening justifications for existing beliefs.

Table 1: Sample Assessment Rubric

3: Entry is thoughtful, thorough and specific. Author draws on relevant course material where appropriate. Author demonstrates original thinking. Entries correspond to questions asked.
2: Entry is relevant but may be vague or generic. Author could improve the response by making it more specific, thoughtful or complete.
1: Entry is unclear, irrelevant, incomplete or demonstrates a lack of understanding of core concepts.

Outcomes: Lessons Learned

We found the reflective worksheets were effective because they gave students time to think about what they were learning and, over time, increased their awareness of the disciplinary construction of knowledge. For us as instructors, the worksheets were a useful tool for monitoring students’ learning and their ‘take away’ messages from the lectures. We also used the worksheets as a starting point in the next lecture to clarify any misunderstandings.

Overall, we found that while the reflective worksheets seemed to be appreciated by all the students, EAL students benefitted in a number of specific ways. First, the guided questions gave students additional time to think about the topic at hand and to prepare before classroom discussion. Rather than being cold-called, students could use this reflective time to gather their thoughts and engage actively with what they had just learned. Second, students were able to explore the structure of academic discourse within the discipline of sociology. As students learn through different disciplinary lenses, these worksheets reveal how a sociologist approaches a topic. In our case, international EAL students are also taking courses such as psychology, academic writing, and political science; each of these disciplines engages with a topic using a different lens and language, and the worksheets made the sociological approach explicit. Last, the worksheets allowed students to reflect on both the content and the way language is used within sociology. For example, the worksheets gave students time to brainstorm about what questions are explored from a disciplinary perspective and what counts as evidence. Furthermore, when given time to reflect on the strength of disciplinary evidence, students can then determine which language features may be most appropriate for presenting that evidence, such as whether hedges (may indicate, possibly suggests) or boosters (definitely proves) would be more fitting. When working with international EAL students, it is especially important to make such language features explicit so students can, in turn, take ownership of them in their own language use. Looking forward, these worksheets can help guide both EAL and non-EAL students’ awareness of how knowledge is constructed in the discipline and how language can be used to demonstrate their disciplinary understanding.

References

Aghaie, R., & Zhang, L. J. (2012). Effects of explicit instruction in cognitive and metacognitive reading strategies on Iranian EFL students’ reading performance and strategy transfer. Instructional Science, 40(6), 1063-1081.

Colley, B. M., Bilics, A. R., & Lerch, C. M. (2012). Reflection: A key component to thinking critically. The Canadian Journal for the Scholarship of Teaching and Learning, 3(1). http://dx.doi.org/10.5206/cjsotl-rcacea.2012.1.2

Davis, M., & Hult, R. E. (1997). Effects of writing summaries as a generative learning activity during note taking. Teaching of Psychology, 24(1), 47-50.

Embo, M. P. C., Driessen, E., Valcke, M., & Van Der Vleuten, C. P. (2014). Scaffolding reflective learning in clinical practice: A comparison of two types of reflective activities. Medical Teacher, 36(7), 602-607.

Harrington, C., & Zakrajsek, T. (2017). Dynamic Lecturing: Research-based Strategies to Enhance Lecture Effectiveness. Stylus Publishing, LLC.

Kaczmarzyk, M., Francikowski, J., Łozowski, B., Rozpędek, M., Sawczyn, T., & Sułowicz, S. (2013). The bit value of working memory. Psychology & Neuroscience, 6(3), 345-349.

Machaal, B. (2015). Could explicit training in metacognition improve learners’ autonomy and responsibility? Arab World English Journal, 6(1), 267.

Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York, NY: Cambridge University Press.

Muncy, J. A. (2014). Blogging for reflection: The use of online journals to engage students in reflective learning. Marketing Education Review, 24(2), 101-114. doi:10.2753/MER1052-8008240202

Tanner, K. D. (2012). Promoting student metacognition. CBE-Life Sciences Education, 11(2), 113-120.


Prompted Written Reflection as a Tool for Metacognition: Applying Theory in Feedback

Dr. Phani Radhakrishnan & Emma Kerr
Management Department, University of Toronto
Contact: phani.radhakrishnan@utoronto.ca & emma.kerr@mail.utoronto.ca

Background/Motivation

Understanding how to seek feedback is a core topic in the curriculum of leadership courses. However, not all feedback-seeking activities are effective. The purpose of this activity is to help students apply empirical research about feedback to their experiences in receiving and interpreting feedback. We hoped the activity would enable them not only to understand how to seek and interpret feedback but also to learn how to apply it to their own learning, and thus encourage metacognitive thinking.

Method

This activity took place approximately midway through the semester in a third-year mandatory course for students in the business administration program. There were approximately 40 students in each class. Students were introduced to the rationale for the activity by reading DeNisi and Kluger’s (2000) review article about the relation between feedback and performance. Then they answered questions requiring them to explain the key concepts in the reading and apply the theory to a real-life example (see Appendix A). Next, students listened to a short lecture that explained the theory behind the factors that increase and decrease the effectiveness of feedback. The lecture focused on the finding that the way people seek feedback can have an impact on their subsequent performance. If people seek task-focused feedback, for example by asking for the correct answer or for ways to improve their answer that focus on how to learn the task, their subsequent performance on the task will improve. But if they seek self-focused feedback, for example by asking how they did relative to others or requesting the class average, their subsequent performance on the task will not improve.

Then, students reflected on how they could apply this knowledge to their own lives. They wrote a response to the following question: “Consider a situation where you got feedback that did not help you improve your subsequent performance. Explain why the feedback was ineffective in terms of task, task learning, or self-focus. What could you have done to increase your focus on task or task learning and decrease attention to self?” Then we gave students examples of learning goals (see Appendix B) and asked them to write learning goals with reference to the example they had just written about. Finally, students who volunteered read their written reflections aloud to the rest of the class. Students were graded on participation for the homework activity as well as for the in-class discussion. In addition, students were asked a similar question on the final exam about feedback they received in this course. Specifically, they had to reflect on the feedback they received from an assignment in the course in which they were evaluated on argumentation, definitional, data-analytical, and descriptive skills, and to set a learning goal for improving these skills. We hoped that these multiple writing prompts would encourage students to apply research about feedback to their own experiences with receiving and seeking feedback. Such written reflections should help students learn how they may help or hurt themselves through the kind of information they focus on when asking for feedback from instructors (e.g., asking for the class average vs. asking for how to improve, or asking for the correct answer).

Our activity was guided by research suggesting that a written prompt encouraging students to think critically about their own experiences can promote metacognition. For example, Ratto-Parks (2015) asked first-year college students to think about a rhetorical story assignment they had completed in a course and reflect on what they did well and where they could improve. She found that student reflections improved metacognition and strengthened writing quality. Just as Ratto-Parks’ activity encouraged students to reflect on, and thus improve, their writing skills, we hoped that our questions guided students toward the kinds of information they should focus on while seeking feedback, encouraged them to think metacognitively about their experiences with feedback, and led them to reflect on how to use feedback-seeking opportunities to improve themselves. Similarly, other studies have found that written reflections prompting critical reflection can elicit metacognitive processes (Erskine, 2009; Harten, 2014; Lew & Schmidt, 2011).

Further, feedback itself has also been shown to improve metacognition (Callender, 2016). In our activity, we evaluated students on multiple skills in their course-related writing assignment (e.g., argumentation, definitional skills, etc.) and then asked students to reflect on what that feedback means. By asking students to relate information learned in the course to their past feedback-seeking experiences, and by providing opportunities to apply that knowledge while they are getting feedback in the course, we think we are helping students improve their metacognitive skills, since they are using both written reflections and feedback as tools to develop those skills. Taken together, the in-class writing exercise, an explanation of the theory behind feedback, an opportunity to get feedback, and answering a question on the final exam about that feedback should all improve metacognitive skills, as the research cited above predicts.

Outcomes

Preliminary analysis shows that highly engaged students (i.e., those who read the article, answered the homework questions, wrote a reflection, and participated in class discussion) tended to achieve higher marks on the related final exam question. Overall, students showed an improved understanding of effective feedback following the in-class activity. We are motivated to continue to systematically analyze student responses to the initial in-class reflection questions and to the final exam questions. We hope to detect metacognitive thinking by using Ratto-Parks’ (2015) Index of Metacognitive Knowledge in Critical Reflective Writing, which shows promise in translating metacognitive language into identifiable traits that can be used to assess students’ reflections. This analysis could be challenging because our activity consists of only one in-class reflection question based on prior feedback-seeking experiences and one final exam question based on a feedback-seeking experience in the course itself, whereas most studies include multiple written reflections. To detect improvement in metacognition, we may need to encourage students to repeatedly answer questions about what they are learning in multiple feedback contexts. This parallels our prior research (Radhakrishnan, Arrow, & Sniezek, 1996), which shows that asking students to repeatedly evaluate their performance over multiple tests, after receiving feedback on each test, improves the accuracy of their evaluations. Improving students’ understanding of what they are learning, that is, their metacognitive skills, may follow a similar mechanism. Multiple written reflections about how to interpret feedback, completed while getting feedback on multiple tasks, can help students gain an improved understanding not only of the theory of feedback but also of themselves.

We expect that both the improved understanding of effective feedback processes as well as the opportunity to practice metacognition will help students to interpret and give feedback more effectively both within and outside of the course. Since our students are in the management discipline, seeking feedback effectively is a skill that is essential to their professional development as leaders. In addition, we predict that the improved experience with metacognitive processes will aid them in thinking critically and interpreting feedback in their other courses as well.

References

DeNisi, A. S., & Kluger, A. N. (2000). Feedback effectiveness: Can 360-degree appraisals be improved? Academy of Management Perspectives, 14(1), 129-139. doi:10.5465/ame.2000.2909845

Erskine, D. L. (2009). Effect of prompted reflection and metacognitive skill instruction on university freshmen’s use of metacognition (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses database.

Harten, M. D. (2014). An evaluation of the effectiveness of written reflection to improve high school students’ metacognitive knowledge and strategies (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses database.

Lew, D. N., & Schmidt, H. G. (2011). Writing to learn: can reflection journals be used to promote self-reflection and learning? Higher Education Research & Development, 30(4), 519-532. doi:10.1080/07294360.2010.512627

Radhakrishnan, P., Arrow, H., & Sniezek, J. A. (1996). Hoping, performing, learning, and predicting: Changes in the accuracy of self-evaluations of performance. Human Performance, 9(1), 23-49. doi:10.1207/s15327043hup0901_2

Ratto Parks, A. E. (2015). The power of critical reflection: Exploring the impact of rhetorical stories on metacognition in first year composition stories (Doctoral dissertation).

*

Appendix A—Homework Question

The way in which feedback is given can draw one’s attention to oneself, and this attention to self leads to negative effects on subsequent performance (after the feedback). However, the article also discusses conditions under which feedback focused on the self may not necessarily lead to negative effects. Explain how this occurs.

Use the concepts of ought vs. ideal self and promotion vs. prevention focus. Illustrate the theory by applying it to a concrete real-life situation or example.

*

Appendix B—Examples of Learning Goals

(The following is displayed on a slide in lecture to aid students in developing their own learning goals)

For a professor…

  • Finding specific ways to explain complex material memorably
  • Explaining concepts by giving examples and counter-examples
  • Explaining theories by giving concrete examples of how they work
  • Showing the relevance of the subject matter to the students’ lives outside the classroom

For a golfer…

  • Mastering the proper grip of the club
  • Mastering proper placement of the feet
  • Learning when to use what club
  • Understanding the distribution of weight from one foot to the other when swinging the club

Practicing Metacognitive Awareness with Guided Lecture Notes

Dr. Terrell Hooper
Assistant Professor of Music
American University of Sharjah
Email | thooper@aus.edu

Background/Motivation:

I teach an Elements of Music course for music minors and a general population of engineering, business, and architecture students needing to earn a general arts credit. I have experienced many challenges in teaching such a course in the Middle East, where students have never been exposed to the elements of western music or its history. The course surveys the entire gamut of western music and history while simultaneously building a foundational understanding of music literacy. Given the vast scope of the course, students are expected to have strong independent study skills. While study habits are primarily individual and differ with each student, I found students were not prepared or equipped with the basic study skills required to succeed in the course. The most basic skill lacking was the ability to take notes or organize the material being discussed in class. In addition, student feedback on end-of-course evaluations revealed that the information and material discussed in class was so unfamiliar and vast that students did not know how to organize or digest it. From these data, I inferred that students needed a note-taking model, an opportunity to take notes of their own volition, and a moment to reflect on their note-taking abilities. By implementing these objectives, I wanted to observe whether they would encourage students to think in a metacognitive manner and perhaps awaken them to the importance of metacognitive practices in their own study habits.

Method:

Due to the pedagogical “bumps” that I experienced in my first semester of teaching Elements of Music, I decided to create a sequential curriculum (see Figure 1) that would provide students with guided lecture notes[1]. The purpose of these notes was two-fold: 1) to help students structure information being discussed during class, and 2) to help students remember, reflect on, and re-organize course content during independent study. The sequential curriculum was aligned with the syllabus; students were not taught note-taking skills but were merely provided with guided lecture notes that I prepared prior to each class meeting.

Three sets of guided lecture notes were given over a three-week period (see Appendix A). Each week the guided lecture notes incorporated a progressive guide for helping students become more metacognitively aware of proper note-taking habits during in-class lectures. The first set was designed to orient students to the process of taking notes in an outline format and contained fill-in-the-blank areas distributed throughout the outline. Subsequent guided lecture notes included reflective questions at the end of the lecture: Lecture 2 contained recall questions and Lecture 3 contained essay questions concerning content discussed during the lecture. All three sets of guided lecture notes were collected after each class; data were recorded on how many students completed the entire handout, and each handout was rated on its overall completion (i.e., the number of blanks left unfilled). Lecture 4 (the Classical lecture) did not use guided lecture notes, and no instruction or requirements for note taking were given to students, because I wanted to observe how many students saw the need to take notes of their own volition.

Following the review session (see Figure 1), a midterm exam was administered. The midterm exam consisted of multiple-choice, fill-in-the-blank, and true-or-false questions copied verbatim from the previous semester’s exam, so that data could be compared with how students not exposed to guided lecture notes scored on the same questions. After the midterm exam, students were given a survey via Google Forms with questions about the usefulness of the guided lecture notes. Finally, I gave a ten-minute lecture that presented the data gathered in the questionnaire, statistics on how many students completed each handout during each lecture, and a comparison of exam scores between students who used guided lecture notes and the Fall-semester students who did not.

Figure 1. Sequential Curriculum for Guided Lecture Notes

Data Outcomes:

All students enrolled in Elements of Music for the Spring semester participated in the study (n = 29); however, due to class absences, Lecture 1 had 27 participants, Lecture 2 had 28 participants, and Lecture 3 had 25 participants. A set of 30 questions derived directly from the lecture notes was used on the midterm exam for students in the Spring semester (n = 29) and the final exam for students from the Fall semester (n = 28). Each of the 30 exam questions was scored as correct or incorrect on both the Fall and Spring student exams, and the total number of incorrect answers was calculated for each student. An independent t-test revealed no significant difference between groups, t(53) = 1.02, p = .31; Mean (SD) Fall semester = 4.9 (3.4), Mean (SD) Spring semester = 4.0 (2.9).
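The scoring-and-comparison procedure described above (mark each of the 30 shared items correct or incorrect, total the incorrect answers per student, then compare the two semesters with an independent t-test) can be sketched as follows. The answer data here are simulated around the reported group means, not the study's actual responses:

```python
import math
import random
from statistics import mean, stdev

random.seed(42)
N_ITEMS = 30  # shared questions scored on both semesters' exams

def simulate_incorrect_totals(n_students, p_wrong):
    """Per-student totals of incorrect answers across the 30 shared items."""
    return [sum(random.random() < p_wrong for _ in range(N_ITEMS))
            for _ in range(n_students)]

# Simulated groups sized like the study (Fall n = 28, Spring n = 29),
# with per-item error rates chosen to match the reported means (4.9 and 4.0)
fall = simulate_incorrect_totals(28, 4.9 / 30)
spring = simulate_incorrect_totals(29, 4.0 / 30)

# Pooled-variance independent-samples t-test on the incorrect-answer totals
na, nb = len(fall), len(spring)
pooled_var = ((na - 1) * stdev(fall) ** 2
              + (nb - 1) * stdev(spring) ** 2) / (na + nb - 2)
t = (mean(fall) - mean(spring)) / math.sqrt(pooled_var * (1 / na + 1 / nb))
df = na + nb - 2
```

With all 57 students included, the degrees of freedom would be 55; the article reports t(53), suggesting a couple of cases were excluded from the item-level comparison.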

On a more positive note regarding the incorporation of the guided lecture notes, students who completed the questionnaire (n = 23) gave strongly positive ratings for the notes. They rated their overall satisfaction on a 3-point Likert scale ranging from unsatisfactory through satisfactory to extremely satisfactory. Results indicated that 69.6% (n = 16) of students surveyed were extremely satisfied with using guided lecture notes, and 30.4% (n = 7) chose the middle option, indicating they were neither unsatisfied nor extremely satisfied. Open-ended student feedback on using teacher-guided lecture notes is presented in Table 1.

Table 1. Student Feedback Using Guided Lecture Notes

Pros:
  • “Provides important details and helps us focus on what is more important”
  • “Guides me through the chapters while studying from the book”
  • “They were a very good guide when it came to studying for midterms as they summarized the main concepts”
  • “These outlines make it easier to understand and absorb the material faster”

Cons:
  • “More detailed questions”
  • “Sometimes the questions are vague and need clarification”
  • “Include a list of keywords”
  • “The information was a lot and we didn’t have enough time to complete it during class while the professor was explaining it. Sometimes I felt I couldn’t keep up the pace while listening to the lecture and writing thus I left many blanks to fill in later which made me unsure of my answers.”

Observations:

The primary purpose of this research study was to 1) help students structure information being discussed during class and 2) help students remember, reflect on, and re-organize course content during independent study. The study illuminated the fact that when students organize and reflect on their own learning, and collaborate with their teachers in doing so, the pedagogical process improves. Although the data do not necessarily confirm that guided lecture notes improve test scores, it would be remiss not to acknowledge that students do enjoy being provided with a structure for organizing the information presented during lectures. In addition, no negative feedback concerning the amount of material or the organizational components of the course was received on end-of-course student evaluations.

The intent of having students take personal initiative with note taking in Lecture 4 (see Figure 1), and of giving an informative ten-minute lecture on the possible gains of using such an organizational scheme when listening to class lectures, was to help students think more about their own study skills. However, generally speaking, I did not observe most students beginning to practice metacognition regarding their own study habits. I actually observed students wanting or expecting the guided lecture notes for every class; the end-of-course student evaluations even noted that students wanted guided lecture notes for each class lecture. So even though students reflected positively on the usefulness of the guided lecture notes, I observed a disconnect in motivating them to take personal initiative in their own study habits. Future research should investigate the link between in-class lectures and how students become more self-directed in their own independent study habits.

[1] Guided notes are defined as “teacher-prepared handouts that ‘guide’ a student through a lecture with standard cues and prepared space in which to write the key facts, concepts, and/or relationships” (Heward, 1994, p. 304).


Supporting Student Self-Assessment with Knowledge Surveys

by Dr. Lauren Scharff, U. S. Air Force Academy*

In my earlier post this year, “Know Cubed” – How do students know if they know what they need to know?, I introduced three challenges for accurate student self-assessment. I also introduced the idea of incorporating knowledge surveys as a tool to support student self-assessment (an aspect of metacognitive learning) and promote metacognitive instruction. This post shares my first foray into the use of knowledge surveys.

What exactly are knowledge surveys? They are collections of questions that support student self-assessment of their course material understanding and related skills. Students complete the questions either at the beginning of the semester or prior to each unit of the course (pre), and then also immediately prior to exams (post-unit instruction). When answering the questions, students rate themselves on their ability to answer the question (similar to a confidence rating) rather than fully answering the question. The type of learning expectation is highlighted by including the Bloom’s level at the end of each question. Completion of knowledge surveys develops metacognitive awareness of learning and can help guide more efficient studying.

Example knowledge survey questions

My motivation to include knowledge surveys in my course was a result of a presentation by Dr. Karl Wirth, who was invited to be the keynote speaker at the annual SoTL Forum we hold at my institution, the United States Air Force Academy. He shared compelling data and anecdotes about his incorporation of knowledge surveys into his geosciences course. His talk inspired several of us to try out knowledge surveys in our courses this spring.

So, after a semester, what do I think about knowledge surveys? How did my students respond?

In a nutshell, I am convinced that knowledge surveys enhanced student learning and promoted student metacognition about their learning. Their use provided additional opportunities to discuss the science of learning and helped focus learning efforts. But, there were also some important lessons learned that I will use to modify how I incorporate knowledge surveys in the future.

Evidence that knowledge surveys were beneficial:

My personal observations included the following, with increasing levels of each as the semester went on and students learned how to learn using the knowledge survey questions:

  • Students directly told me how much they liked and appreciated the knowledge survey questions. There is a lot of unfamiliar and challenging content in this upper-level course, so the knowledge survey questions served as an effective road map to help guide student learning efforts.
  • Students asked questions in class directly related to the knowledge survey questions (as well as other questions). Because I was clear about what I wanted them to learn, they were able to judge if they had solid understanding of those concepts and ask questions while we were discussing the topics.
  • Students came to office hours to ask questions, and were able to more clearly articulate what they did and did not understand prior to the exams when asking for further clarifications.
  • Students realized that they needed to study differently for the questions at different Bloom’s levels of learning. “Explain” questions required more than basic memorization of the terms related to those questions. I took class time to suggest and reinforce the use of more effective learning strategies and several students reported increasing success and the use of those strategies for other courses (yay!).
  • Overall, students became more accurate in assessing their understanding of the material prior to the exam. More specifically, when I compared the knowledge survey reports with actual exam performance, students progressively became more accurate across the semester. I think some of this increase in accuracy was due to the changes stated in points above.

Student feedback included the following:

  • End-of-semester feedback from students indicated that the vast majority of them thought the knowledge surveys supported their learning, with half of them giving the highest rating of “definitely supports learning, keep as is.”
  • Optional reflection feedback suggested that students developed learning skills related to the knowledge surveys and perceived value in using them. The following quote was typical of many students:

At first, I was not sure how the knowledge surveys were going to help me. The first time I went through them I did not know many of the questions and I assumed they were things I was already supposed to know. However, after we went over their purpose in class my view of them changed. As I read through the readings, I focused on the portions that answered the knowledge survey questions. If I could not find an answer or felt like I did not accurately answer the question, I bolded that question and brought it up in class. Before the GR, I go back through a blank knowledge survey and try to answer each question by myself. I then use this to compare to the actual answers to see what I actually need to study. Before the first GR I did not do this. However, for the second GR I did and I did much better.

Other Observations and Lessons Learned:

Although I am generally pleased with my first foray into incorporating knowledge surveys, I did learn some lessons and I will make some modifications next time.

  • The biggest lesson is that I need to take even more time to explain knowledge surveys, how students should use them to guide their learning, and how I use them as an instructor to tailor my teaching.

What did I do this past semester? I explained knowledge surveys on the syllabus and verbally at the beginning of the semester. I gave periodic general reminders and included a slide in each lesson’s PPT that listed the relevant knowledge survey questions. I gave points for completion of the knowledge surveys to increase the perception of their value. I also included instructions about how to use them at the start of each knowledge survey:

Knowledge survey instructions

Despite all these efforts, feedback and performance indicated that many students really didn’t understand the purpose of knowledge surveys or take them seriously until after the first exam (and some even later than that). What will I do in the future? In addition to the above, I will make more explicit connections during the lesson and as students engage in learning activities and demonstrations. I will ask students to share how they would explain certain concepts using the results of their activities and the other data that were presented during the lesson. The latter will provide explicit examples of what would (or would not) be considered a complete answer for the “explain” questions in contrast to the “remember” questions.

  • Students’ biggest suggestion for modifying the knowledge surveys pertained to the “pre” knowledge surveys given at the start of each unit. Students reported they didn’t know most of the answers and felt like completing the pre knowledge surveys was less useful. As an instructor, those “pre” responses helped me get a pulse on their level of prior knowledge and tailor my lessons accordingly. Thus, I need to better communicate my use of those “pre” results, because no one likes to take time to do what they perceive is “busy work.”
  • I also learned that students created a shared GoogleDoc where they would insert answers to the knowledge survey questions. I am all for students helping each other learn, and I encourage them to quiz each other so they can talk out the answers rather than simply re-reading their notes. However, it became apparent when students came in for office hours that the shared “answers” to the questions were not always correct and were sometimes incomplete. This was especially true for the higher-level questions. I personally was not a member of the shared document, so I did not check their answers in that document. In the future, I will encourage students, earlier and more explicitly, to be aware of the type of learning being targeted and the type of responses needed for each level, and to critically evaluate the answers being entered into such a shared document.

In sum, as an avid supporter of metacognitive learning and metacognitive instruction, I believe that knowledge surveys are a great tool for supporting both student and faculty awareness of learning, the first step in metacognition. We then should use that awareness to make necessary adjustments to our efforts – the other half of a continuous cycle that leads to increased student success.

———————————————–

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


How to Get the Most Out of Studying

Dr. Stephen Chew has put together a highly lauded series of short videos that share with students some powerful principles of effective learning, including metacognition. His goal was to create a resource that students can view whenever and as often as they want.

The videos include:

  • Video 1: Beliefs That Make You Fail…Or Succeed
  • Video 2: What Students Should Understand About How People Learn
  • Video 3: Cognitive Principles for Optimizing Learning
  • Video 4: Putting the Principles for Optimizing Learning into Practice
  • Video 5: I Blew the Exam, Now What?

Links to the videos can be found here:

https://www.samford.edu/departments/academic-success-center/how-to-study

Dr. Chew also provides an overview handout that summarizes the purposes of the videos, gives guidance on how to use them, and outlines the main points within the videos:

https://www.samford.edu/departments/files/Academic_Success_Center/How-to-Study-Teaching_Resources.pdf


Developing Metacognition with Student Learning Portfolios

In this IDEA paper #44, The Learning Portfolio: A Powerful Idea for Significant Learning, Dr. John Zubizarreta shares models and guidance for incorporating learning portfolios. He also makes powerful arguments regarding the ability of portfolios to engage students in meaningful reflection about their learning, which in turn supports metacognitive development and life-long learning.

 


“Know Cubed” – How do students know if they know what they need to know?

by Dr. Lauren Scharff, U. S. Air Force Academy*



This simple but somewhat tongue-twisting question takes us to several challenging aspects of teaching and learning that link to both student and instructor metacognition:

  1. How do students self-assess their understanding and abilities prior to assessments?
  2. Are students able to accurately know what they are expected to be able to demonstrate for an assessment?
  3. What can we as instructors reasonably do to be transparent regarding our learning expectations and to support student development of accurate self-assessment?

Generally speaking, humans ARE good at self-assessment, as long as the self-assessment activity/tool is well-aligned with the actual assessment activity/tool (e.g. see Nuhfer, 2015). However, there are many possible reasons why students may not accurately self-assess, and several of those are directly under our control as instructors.

Thus, I believe we should engage in metacognitive instruction by developing our awareness of common reasons that students may not accurately self-assess, what we might be doing that inadvertently leads to those pitfalls, and some means by which we can support more accurate student self-assessment. We should then intentionally use that awareness to adjust what we do. This combination of awareness and self-regulation provides the foundation for metacognitive instruction.

Based on my observations and discussions with colleagues across the years, here are three common reasons students might not accurately self-assess along with some strategies instructors might take to support better student self-assessment:

  1. Lesson-to-Exam Misalignment – For example, classroom instruction and activities sometimes focus on basic concepts and definitions, while exams ask for evaluation and synthesis. Students may self-assess their competency based on what was presented in the lesson, but then feel surprised and perform poorly on the exam when they are asked to go beyond the lower level. Even if instructors “warn” students that they will need to engage in higher-level processing on the exams, if students haven’t been given the opportunity to experience what that means and practice it, those students may not accurately self-assess their preparedness for the exam. Instructors should analyze the levels and types of learning materials they present in class and require of students during formative learning activities (in-class activities, homework, quizzes). Then, they should align their exams to have similar levels of expectation. If they desire higher-level learning to be demonstrated on exams, they should redesign their learning activities to allow scaffolding, practice, and feedback with those higher-level expectations.
  2. Confusing Questions – Students often claim that questions on exams are confusing, even if they don’t seem to be confusing from the instructor’s perspective. Thus, students might actually be accurate in their self-assessment of their understanding of a topic, but then fail to demonstrate it because they were confused by the question or simply misread it. Test anxiety can add additional cognitive load and make it more likely for students to misread questions. Thus, instructors should review their questions to find ways to more clearly indicate what they expect in a response. For example, if there are two parts to the question, rather than writing one long question, break it into part (a) and part (b). This formatting clearly communicates that a good response should have two parts. It often can be difficult for the person writing the question to assess the clarity of their question because they know what they mean, so it seems obvious. (Instructors can also fall into this trap when reviewing test-bank questions in which the correct answer is clearly indicated. Once the answer is known, it seems obvious.) Being aware of these pitfalls and taking the time to critically analyze one’s test questions is a good way to engage in metacognitive instruction. Having a colleague from a different area of expertise read through the questions before finalizing them can also help catch some instances where clarity could be improved.
  3. Smooth Presentations – Instructors are experts, and they generally like to be perceived as such. Thus, it is far more common for instructors to present problem work-outs or other complex material in ways that make it look smooth and easy. That seems good, right? Actually, smooth presentations can mislead students into thinking that the material is easy and not prompt them to ask questions. Following a smooth presentation, students might then self-assess as understanding the material when really they would not be able to work out a problem on their own. Explicit step-by-step examples in textbooks also sometimes fool students into thinking they know how to work out problems if the assigned homework can be completed by following the examples. Instructors should consider verbalizing points of possible confusion that they know often catch students, or sharing their own struggles as they learned the material in the past. As they work out problems in front of the class, they could ask what worked, what didn’t, and what changes could be made in the problem-solving approach (or writing approach, or presentation of an argument, etc.). They should also emphasize to students that they will be better able to self-assess their preparation for an exam if they work out problems without the examples in front of them.

The above challenges for accurate student self-assessment and instructor strategies to address them are just a start to help us become metacognitive instructors and help students become more metacognitive learners. In my next post I will share with you my recent exploration into the use of Knowledge Surveys. This tool directly helps students develop more accurate self-assessment. Further, with direction and encouragement from the instructor, knowledge surveys can help students become metacognitive learners by using their awareness of their learning to guide their use of learning strategies.

There are many routes to becoming a metacognitive instructor, although all require intentionality in developing awareness of factors impacting student learning and using that awareness to self-regulate instructional efforts. It is a process with many options and possible strategies, where even small efforts can lead to big pay-offs in student learning and development.

———–

Nuhfer, E. (January 2015). Self-assessment and the Affective Quality of Metacognition: Part 2 of 2. Blog post on Improve with Metacognition, retrieved from https://www.improvewithmetacognition.com/self-assessment-and-the-affective-quality-of-metacognition-part-2-of-2/

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Developmental Framework for Teaching Expertise

A group of faculty at the University of Calgary share a framework for growth of teaching expertise that demonstrates that “teaching expertise involves multiple facets, habits of mind (or ways of knowing and being), and possible developmental activities.” They share this framework with the hope that others will share, adapt and use it in their own local contexts. The full paper is also available. Note that they also refer to it as a “framework for self-reflection” for faculty, which means it can be used to support metacognitive instruction.

 

Developing a Learning Culture: A Framework for the Growth of Teaching Expertise

 


It shouldn’t be Top Secret – Bloom’s Taxonomy

By Lauren Scharff, Ph.D.,  U. S. Air Force Academy *

Across the past year or so I have been reminded several times of the following fact: Most students are not aware of Bloom’s Taxonomy, and even if they are aware, they have no clue how or why their awareness of it might benefit them and their learning. Most instructors have heard of at least one version of Bloom’s Taxonomy, and some keep it in mind when designing learning activities and assessments.  But, rarely do instructors even mention it to their students.

Why don’t instructors share Bloom’s Taxonomy with their students? Is it a top secret, for instructors only? No! In fact, awareness and use of Bloom’s taxonomy can support metacognitive learning, so students should be let in on the “secret.”

What were the key experiences that led me to this strong stance? Let me share….

In May of 2016, I was fortunate to attend a keynote by Dr. Saundra McGuire at High Point University. In her keynote address and in her book, Teach Students How to Learn (2015), McGuire shared stories of interactions with students as they became aware of Bloom’s Taxonomy and applied it to their learning. She also shared data showing how this, coupled with a variety of other metacognitive strategies, led to large increases in student academic success. Her work served as the first “ah ha” moment for me, and I realized that I needed to start more explicitly discussing Bloom’s Taxonomy with my students.

An additional way to highlight Bloom’s Taxonomy and support student metacognitive learning was shared this past October (2017) when Dr. Karl Wirth led a workshop as part of our 9th Annual Scholarship of Teaching and Learning (SoTL) Forum at the U. S. Air Force Academy. In his workshop he shared examples of knowledge surveys along with data supporting their use as a powerful learning tool. Knowledge surveys are collections of questions that support student self-assessment of their knowledge, understanding, and skills. When answering the questions, students rate themselves on their ability to answer the question (similar to a confidence rating) rather than fully answering the question. Research shows that most students are able to accurately self-assess (confidence ratings correlate strongly with actual performance; Nuhfer, Fleisher, Cogan, Wirth, & Gaze, 2017). However, most students do not take the time to carefully self-assess their knowledge and abilities without formal guidance and encouragement to do so. In order to be effective, knowledge surveys need to ask targeted, granular questions rather than global questions. Importantly, knowledge survey questions can span the full range of Bloom’s Taxonomy, and Dr. Wirth incorporates best practices by taking the time to explain Bloom’s Taxonomy to his students and explicitly share how his knowledge survey questions target different levels.

Sharing Bloom’s Taxonomy in our classes is a great first step, but ultimately, we hope that students use the taxonomy on their own, applying it to assignments across all their courses. However, just telling them about the taxonomy or explaining how aspects of our course tap into different levels of the taxonomy may not be enough to support their use of the taxonomy beyond our classrooms. In response to this need, and as part of an ongoing Scholarship of Teaching and Learning (SoTL) project at my institution, one of my student co-investigators (Leslie Perez, graduated May 2017) created a workshop handout that walks students through a series of questions that help them apply Bloom’s as a guide for their learning and academic efforts. This handout was also printed in a larger, poster format and is now displayed in the student dorms and the library. Students use the handout by starting in the middle and asking themselves questions about their assignments. Based on their answers, they walk through a path that helps them determine what level of Bloom’s Taxonomy they likely need to target for that assignment. It should help them become more explicitly aware of the learning expectations for their various assignments and support their informed selection of learning strategies, i.e. help them engage in metacognitive learning.

Figure 1. Snapshot of the handout we use to guide students in applying Bloom’s Taxonomy to their learning.  (full-sized version here)

As someone who is a strong proponent of metacognitive learning, I have become increasingly convinced that instructors should more often and more explicitly share this taxonomy, and perhaps even more importantly, share how it can be applied by students to raise their awareness of learning expectations for different assignments and guide their choice of learning strategies. I hope this post motivates instructors to share Bloom’s Taxonomy (and other science of learning information) with their students. Feel welcome to use the handout we created.

————

McGuire, S. (2015). Teach Students How to Learn. Stylus Publishing, LLC, Sterling, VA.

Nuhfer, E., Fleisher, S., Cogan, C., Wirth, K., & Gaze, E. (2017). How random noise and a graphical convention subverted behavioral scientists’ explanations of self-assessment data: Numeracy underlies better alternatives. Numeracy, 10(1), Article 4. DOI: http://dx.doi.org/10.5038/1936-4660.10.1.4

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Mind Mapping: A Technique for Metacognition

by Charlie Sweet, Hal Blythe, Rusty Carpenter, Eastern Kentucky University

Background

The Provost at Eastern Kentucky University invited Saundra McGuire to speak on metacognition as part of our University’s Provost’s Professional Development Speaker Series. Our unit was tasked with designing related programming both before and after McGuire’s visit. Our aim was to provide a series of effective workshops that prepared the ground for our university’s Quality Enhancement Plan 2.0 on metacognition as a cross-disciplinary tool for cultivating reading skills. The following mind mapping exercise, from one of four workshops, was taught to over 50 faculty from across campus and the academic ranks. Feedback rated it highly and suggested it is appropriate for any level of any discipline with any class size.

Scientific Rationale

The Mind Map, a term invented by Tony Buzan in The Mind Map Book (1993), “is a powerful graphic technique which provides a universal key to unlocking the potential of the brain” (9). For that reason, Buzan’s subtitle is How to Use Radiant Thinking to Maximize Your Brain’s Untapped Potential. A mind map provides a way for organizing ideas either as they emerge or after the fact. Perhaps the mind map’s greatest strength lies in its appeal to the visual sense.

We chose to share mind mapping with our faculty because according to Brain Rules (2008), rule number ten is “Vision trumps all other senses” (221). For proof, the author, John Medina, cites a key fact: “If information is presented orally, people remember about 10%, tested 72 hours after exposure. That figure goes up to 65% if you add a picture” (234). Because of its visual nature, mind mapping provides a valuable metacognitive tool.

How Mind Mapping Supports Metacognition

Silver (2013) focuses on reflection in general and in particular “the moment of meta in metacognition—that is the moment of standing above or apart from oneself, so to speak, in order to turn one’s attention back upon one’s own mental work” (1). Mind mapping allows thinkers a visual-verbal way to delineate that moment of reflection and in capturing that moment to preserve its structure. Because analysis is one of Bloom’s higher-order learning skills, mind mapping leads to deep thinking, which makes self-regulation easier.

Method

Essentially, a mind map begins with what Gerry Nosich in Learning to Think Things Through (2009) calls a fundamental and powerful concept, “one that can be used to explain or think out a huge body of questions, problems, information, and situations” (105). To create a mind map, place the fundamental and powerful concept (FPC) you wish to explore in the center of a piece of paper and circle it. If at all possible, do something with color or the actual lettering in order to make the FPC even more visual. For instance, if you were to map the major strategies involved in metacognition, metacognition is the FPC, and you might choose to write it as such:

M E T A
Cognition

Increasing the visual effect of the FPC are lines that run to additional circled concepts that support the FPC. These Sputnik-like appendages are what Buzan calls basic ordering ideas, “key concepts within which a host of other concepts can be organized” (p. 84). For example, if you were working with our metacognition example, your lines might radiate out to a host of also-circled metacognitive strategies, such as retrieving, reflection, exam wrappers, growth mindset, and the EIAG process of Event selection-Identification of what happened-Analysis-Generalization of how the present forms future practice (for a fuller explanation see our It Works for Me, Metacognitively, pp. 33-34). And if you wanted to go one step further, you might radiate lines from, for instance, retrieving, to actual retrieving strategies (e.g., flashcards, interleaving, self-quizzing).

Uses for Mind Maps

Mind mapping has many uses for both students and faculty:

  • Notetaking: mind mapping provides an alternative form of notetaking whether for students or professors participating in committee meetings. It can be done before a class session by the professor, during the session by the student, or afterwards as a way of checking whether the fundamental and powerful concept(s) was taught or understood.
  • Studying: instead of rereading notes taken, a method destined for failure, try reorganizing them into a mind map or two. Mind mapping not only offers the visual alternative here, but provides retrieval practice, another metacognitive technique.
  • Assessing: instead of giving a traditional quiz at the start of class or five-minute paper at the end, ask students to produce a mind map of concept X covered in class. This alternative experiment will demonstrate to students a different approach and place another tool in their metacognitive toolbox.
  • Prioritizing: when items are placed in a mind map, something has to occupy center stage. Lesser items are contained in the radii.

Outcomes

Mind maps are easy, deceptively simple, fun, and produce a deep learning experience. Don’t believe it? Stop reading now, take out a piece of paper, and mind map what you just read. We’re willing to bet that if you do so, the result will provide a reflection moment.

References

Buzan, T. (1993). The mind map book: How to use radiant thinking to maximize your brain’s untapped potential. New York: Plume Penguin.

McGuire, S. Y., & McGuire, S. (2015). Teach students how to learn: Strategies you can incorporate into any course to improve student metacognition, study skills, and motivation. Sterling, VA: Stylus.

Medina, J. (2008). Brain rules. Seattle: Pear Press.

Nosich, G. M. (2009). Learning to think things through. Upper Saddle River, NJ: Pearson.

Silver, N. (2013). Reflective pedagogies and the metacognitive turn in college teaching. In M. Kaplan, N. Silver, D. Lavaque-Manty, & D. Meizlish (Eds.), Using reflection and metacognition to improve student learning (pp. 1-17). Sterling, VA: Stylus.

Sweet, C., Blythe, H., & Carpenter, R. (2016). It Works for Me, Metacognitively. Stillwater, OK: New Forums.

Appendix: How to Use Word to Create a Mind Map

  1. Click Insert.
  2. Click Shapes and select Circle.
  3. Click on desired position, and the circle will appear.
  4. Click on Draw Textbox.
  5. Type desired words in textbox (you may have to enlarge the textbox to accommodate words).
  6. Drag textbox into center of circle.
  7. Repeat as desired.
  8. To connect circles, click Insert Shapes and then Select Line.
  9. Drag Line between circles.
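The Word steps above can also be scripted. As a plain-Python alternative (a hypothetical sketch, not part of the authors’ workshop), a mind map can be held as a nested structure with the fundamental and powerful concept (FPC) at the root and Buzan’s basic ordering ideas as branches, then printed as an indented outline. The concept names come from the article’s metacognition example:

```python
# Hypothetical sketch: represent a mind map as a dict of branches and
# print it as an indented tree. Parentheses stand in for the circles.
def print_mind_map(node, branches, depth=0):
    print("  " * depth + "(" + node + ")")
    for child in branches.get(node, []):
        print_mind_map(child, branches, depth + 1)

# Branches from the article's metacognition example: the FPC radiates
# to strategies, and "retrieving" radiates further to concrete tactics.
branches = {
    "METACOGNITION": ["retrieving", "reflection", "exam wrappers",
                      "growth mindset", "EIAG process"],
    "retrieving": ["flashcards", "interleaving", "self-quizzing"],
}

print_mind_map("METACOGNITION", branches)
```

The nested-dict form also makes it easy to add another radius later (e.g., tactics under "reflection") without redrawing anything.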

A ‘New Ear’ for Student Writers: Building Awareness of Audience

by Michael Young, Robert Morris University


Motivation and Background:

A fundamental hurdle for most inexperienced writers is gaining a sense of their audience, and how a different consciousness may interpret the words, the organization, and the presentation that they (the writers) use to share ideas. It is different than knowing rules, techniques, or traditions of writing. It requires more than knowledge of the topic about which they are writing. Writers must be aware of their own individual thinking, their own choices, their motivations, and how these could be interpreted or misinterpreted by other people’s ways of thinking. This need for awareness of their own thoughts that could then support their writing efforts, i.e. metacognitive writing, led me to develop a new pedagogical process for the writing classroom that uses active presentations by others to convey audience interpretation.

I used this process for three years in creative writing courses, partially because students were already pursuing genres that often are interpreted orally, but I believe it could be applicable to any writing course, especially one with the following characteristics: 1) upper division/at least sophomore level, so the students are already somewhat experienced collegiate writers, and 2) small class size, ideally 20 or fewer students. No special materials, other than imagination and the means to convey ideas, are needed for the in-class exercises.

Nuts and Bolts:

This pedagogical process has several steps. To first prepare the students and get them thinking about how an audience might interpret their work, the students are given an initial survey on their then-current process of writing and concept of their potential audience. Consistently, three out of five agreed that they had a “mental picture” of their reader, but it was often no further developed than their college peers or even themselves. Most could not describe their readers any further, and some said they had not considered a concept of a readership. Perhaps they had only ever written with the teacher, and thus a grade, in mind.

The second step involves having canonical examples of their genre, fiction or poetry, interpreted by others. During this step those others give a presentation / reading of the work in a manner that conveys their interpretation of the writing. Those others can be classmates or a more external audience. For example, the first two years I used this process, the others were members of the Forensics Team from the University of Nebraska-Lincoln, then led by Professor Ann Burnett.

A third step, which has evolved over the years, was to have others present the students’ own writing back to them. This third step was implemented as a cycle. The students wrote their piece (either individually or as a group) and then gave it to others (classmates or external individuals) for interpretation with no additional input from the writers. The presenters would convey their interpretation, which then could be used by the writers to guide their revisions based on a better understanding of possible audience interpretation. If revisions were made, then the cycle of interpretation could be repeated.

Outcomes:

When this was done at the University of Nebraska-Lincoln, in a project funded by a grant from the university’s Teaching Council, 80% of the collaborative groups elected to revise their texts after hearing them interpreted. They noted that the experience of hearing their stories told by someone else, someone who was sharing their own understandings and insights into the words, heightened their awareness of qualities like the “flow and rhythm” of words or of “trying to make a picture in my head”, and led to an overall greater attention to what their drafts were able to communicate. For example, the potential hollowness of easy clichés might not have occurred to the writers, and the absence of descriptions they had had in mind but had not articulated was now more evident. Further, the majority of the class reported being much more aware of their own thinking (an aspect of metacognition) and the thinking of others.

By hearing, and sometimes seeing through the use of movements, how another person re-created the writer’s intentions, each writer had the opportunity to perceive how their audience understood what had been written down – in a way, to hear their own thinking – and to question themselves. Was that what they had wanted someone else to feel or think, or had their expression fallen short of their conception? In other words, the process allowed them to “hear it (their work) with a ‘new ear’”, and some of them realized they “should have found another way to get that (sic) message across.” That “new ear”, hopefully, was them listening more carefully to and questioning their own thoughts, i.e. being metacognitive about their own writing.


Participatory Pedagogy: Inviting Student Metacognition

by Nicola Simmons, Brock University, nsimmons@brocku.ca

Background

I teach higher and adult education, including adult developmental psychology, and like to invite my students to be aware of their cognitive processes. I see this as central to being an adult learner. One strategy I have developed is engaging students in creating course outcomes and content. I hope to help students become more aware of, more involved in, and better assessors of their own learning; in short, to examine their learning through a metacognitive lens.

This example is from a Masters of Education class, Exploring Approaches to Professional Development. The class is typically quite small (up to 20 students) but I have used it in groups of 50 students at the undergraduate level as well. 

The Approach 

The course follows Siemens' (2008) participatory pedagogy (see syllabus excerpt) to invite students to co-construct the course process, including choosing course readings and creating grading rubrics.

As Biggs and Tang (2011) note, student course co-ownership helps engage students in deep learning; it also builds their awareness of their learning processes. The first assignment, for example, asked them to:

Articulate your intended learning during this course, including a focus for personal and professional development. What will your development focus be? What will you do to realize your plan?

This engages students metacognitively as they take responsibility for their learning path and prepares them for the final assignment, a reflective ‘portfolio,’ in which they synthesize their learning over the term:

Create a creative and critical summary of your changing perspectives and reflections throughout the course, integrating readings (both assigned and others). Discuss your key learning, referring to course and outside experiences. Exemplary projects demonstrate critical analysis, synthesis, and self-evaluation. Can be any format (paper, song, performance, art; format negotiable). Addresses:

  • What theories help you?
  • What have you learned?
  • How can you use that?
  • How have you changed?
  • How do you know?

Each of these prompts invites consideration of the learning and development process and supports students in acquiring habits of mind that will allow them to approach future courses with a metacognitive lens. This has also led to their growth as scholars: One year, many of the students engaged in a self-study that included conducting a literature review and creating questions to guide our reflections. The result of that work was several conference presentations and a peer-reviewed paper (Simmons, Barnard, & Fennema, 2011) that outlined the transformative learning resulting from the student co-constructed course.

What was fascinating to me was the way the course process built students' metacognition not only about their learning but also about their teaching. One wrote:

I told my colleagues the story of this course and they were moved to consider new ways of doing culminating projects. Why isn’t there more choice? Why do we tell students what they must produce to demonstrate their own learning? Why don’t we add the additional layer of asking students to find the best way to demonstrate their learning?

Outcomes and Lessons Learned

Developing metacognition is not a pain-free process! One student described a transformation during the process from fear to increased confidence:

Activities were out of my comfort zone and there were times that I struggled with the unknown … I was able to see the value once I moved beyond the frozen fear of uncertainty to ask myself “What did I want to gain from this course? How did I learn when pushed out of my comfort zone?” I had to be transformed into a student who was open to this new concept and new territory for learning…[where] mistakes … would not be judged but instead used as stepping stones toward learning.

Instructors should be mindful of the importance of support throughout the process. Just as the students are invited to be metacognitive about their processes, it helps if the instructor is transparently metacognitive about the overall course path. For me that looked like saying things like “this may be new for you, but I’d like you to consider trying it” and reassuring them that discomfort was a sign they were onto something good!

The course format continues to unsettle students but also transform them into metacognitive learners, and I finish with one student’s illustrative words:

I remember thinking at the time that the final project was the most difficult task that I had encountered … I really had to ponder … how my journey through the course could be effectively captured and conveyed … It continues to personify my journey through work/life, the choices we make when we meet resistance or the paths we take … how we travel the road is for our choosing.

References

Biggs, J. B., & Tang, C. (2011). Teaching for quality learning at the university: What the student does. Maidenhead, UK: Society for Research into Higher Education & Open University Press.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906-911.

Siemens, G. (2008). New structures and spaces of learning: The systemic impact of connective knowledge, connectivism, and networked learning. Paper Presented for Universidade do Minho, Encontro sobre Web 2.0, Braga, Portugal, October 10. Available online at http://elearnspace.org/Articles/systemic_impact.htm

Simmons, N., Barnard, M., & Fennema, W. (2011). Participatory pedagogy: A compass for transformative learning? Collected Essays on Learning and Teaching, 4.


Make It Stick in Cognitive Psychology

by Jennifer A. McCabe, Goucher College, jennifer.mccabe@goucher.edu


Motivation and Background: I am a cognitive psychologist with a research program focused on metacognition and applied memory in education. Three years ago, I decided to structure my Cognitive Psychology course around the principles described in the book Make It Stick: The Science of Successful Learning by Brown, Roediger, and McDaniel (2014). The book discusses many memory-improvement principles, including retrieval practice for new learning, spacing out retrieval practice, interleaving the study of different problem types, elaboration, and reflection. Other topics include the fluency illusion, getting past learning styles, and developing a growth mindset. Adopting this book as required reading, and structuring the course to reflect these principles, dovetailed with my increasing commitment to prompt and support students' metacognitive growth. I hoped that this would both enhance student learning on objective tests (in a notoriously challenging course) and explicitly support a course learning outcome: improve your metacognitive skills (knowing what you know, learning how to learn).

Context in which the activity or process has been used: This has been included in three sections of Cognitive Psychology, a 200-level course offered at Goucher College, a small liberal arts institution in Baltimore, Maryland. The class size is 25-30 students, and I have been teaching this course for 13 years.

Description of activity or process methods: The activity is described in my Cognitive Psychology syllabus (available through Project Syllabus: http://teachpsych.org/Resources/Documents/otrp/syllabi/JM16cognitive.pdf). On the first day of class, I describe the "Make It Stick" Reflection Papers. For each class period in which a chapter is assigned, students prepare and bring to class a 1-page, single-spaced reflection. Content and style are open, but students must demonstrate deep and careful thinking about the topic, with explicit connections to life experiences, habits and plans/intentions, and course material. They can also include questions and/or other personal reactions to the chapter. I note that this assignment requires elaboration and reflection, two effective learning strategies discussed in the book. Students submit 8 reflection papers during the semester (one per chapter), each worth up to 5 points. Out of a 500-point class, this assignment is worth up to 40 points (8%).

The first reflection paper is due early in the semester, typically the second week, then the subsequent seven chapters/papers are due approximately once per week. We take time in class on those days to engage in small- and large-group discussion. Most of these discussions are framed in terms of metacognition, particularly in light of research suggesting that college students do not always understand how learning works, and cannot always predict which memory strategies lead to the best retention (e.g., McCabe, 2011). I encourage them to consider their lives as learners, and how they can use information from the book to adjust their strategies.

We also talk about how this course is structured to reflect “best practice” learning strategies. For example, students take a self-graded “retrieval practice” quiz at the start of most class periods, because research shows that frequent, effortful, low-stakes, cumulative, spaced (distributed) retrieval practice: (1) produces the most durable learning; and (2) improves metacognitive accuracy of what you know. I strive to be transparent in the purpose for all course elements. In a sense, then, I see Make It Stick as a framework for the entire course – core content and topics for discussion, rationale for course design, and hopefully motivation for students to engage and feel empowered in their own learning.

Outcomes and Lessons Learned:

Since implementing this assignment, I believe that students’ knowledge about effective learning strategies has improved. They seem to enjoy the book as a required course component – on an anonymous questionnaire, 88% agreed that Make It Stick should be included in future classes. When asked whether this course had supported the learning outcome of improving metacognitive skills, 100% agreed or strongly agreed (71% strongly agreed). And when asked about one way this course has changed the way they think or behave in the world, 78% included a statement relating to metacognition. Some examples include:

“I now analyze the way I am absorbing and encoding information. I have never thought about the way I learn but now I am so grateful to accept the study strategies that work and throw away the ones that don’t.”

“It has helped me to develop a better understanding of effective study/learning strategies. Improved my metacognitive skills!”

“When I study and am overconfident in my skills, I think about metacognitive skills and test myself. This class helped me study better.”

Of course, the major challenge with teaching students metacognition is that acquiring knowledge about how learning works is only half the battle. I still struggle with motivating students to actually implement these strategies. Many of them are desirable difficulties (Bjork, 1994): they feel effortful and error-prone (and even frustrating) in the short term, and it is precisely this initial challenge that produces benefits later. I encourage students to use the strategies regularly so that they become habits of mind, but I'm not convinced they consistently do so after one semester of exposure to this material. Yet the fact that they make statements such as the ones above gives me hope that they are integrating the Make It Stick ideas about metacognition into their lives.

Though this assignment has been part of a highly relevant course, Cognitive Psychology, the book Make It Stick (or selected chapters) could enhance a number of courses in and outside of psychology – as well as first-year seminars and similar courses that focus on student skill development, with the goal of teaching them how to be better learners.

References

Bjork, R. A. (1994). Memory and metamemory considerations in the training of human beings. In J. Metcalfe & A. Shimamura (Eds.), Metacognition: Knowing about knowing (pp. 185–205). Cambridge, MA: MIT Press.

Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make it stick: The science of successful learning. Cambridge, MA: Belknap Press of Harvard University Press.

McCabe, J. (2011). Metacognitive awareness of learning strategies in undergraduates. Memory & Cognition, 39, 462–476. doi:10.3758/s13421-010-0035-2


Teaching Transformation Through Becoming a Student of Learning

by Patrick Cunningham, Rose-Hulman Institute of Technology,
Holly Matusovich & Sarah Williams, Virginia Tech


Motivations and context:

I teach a variety of Mechanical Engineering courses at a small private undergraduate institution with approximately 2,000 students. The courses I teach focus on the application of scientific theory and math to solve engineering problems. Since I started teaching, I have been interested in how to help students learn more deeply in my courses. This interest eventually led me to a sabbatical in the Department of Engineering Education at Virginia Tech, where I established a research partnership with Dr. Holly Matusovich, and later Ms. Sarah Williams, studying student metacognitive development. We have been interested in how to help students become more sophisticated, lifelong learners and how to aid instructors in supporting this development. This collaboration initiated a research-to-practice cycle, in which my interest in enhancing student learning led to research on student metacognitive development, and the research results, in turn, have influenced my teaching practice.

Description of the process:

The research-to-practice cycle has transformed my teaching by helping me become a student of learning. For me the process has involved formal educational research, but it does not have to. My implementation of the cycle follows:

  1. Identify what teaching and learning issue you care about and develop partnerships.
  2. Plan the study.
  3. Implement the study and analyze the data.
  4. Interpret the results and use them to direct modifications to your teaching.
  5. Repeat steps 1-4.

I am interested in enhancing student learning and that led to collaborative metacognition research with Dr. Matusovich. Other possible partnerships may be with colleagues, your teaching and learning center, disciplinary education researchers (e.g., engineering or physics education), or even education researchers at your own institution (e.g., educational or cognitive psychology).

We planned the research through the preparation of a successfully funded NSF grant proposal. The process included establishing research questions, specifying study phases, determining what data to collect and how, and planning for data analysis. Even if you are not engaging in formal research, the quality and success of your study will depend on a well-laid-out plan. As a mechanical engineering professor, I found my collaborators to be indispensable partners for this.

Early in our research, we gathered baseline data through student interviews on how students approach learning in engineering science courses and how they define learning. We found that students predominantly rely on working and reviewing example problems as a means of learning. This approach falls into the category of rehearsal strategies, in which students seek to memorize steps and match patterns rather than develop a richer conceptual understanding. While it is important to know facts, results from learning science show that rehearsal strategies are insufficient for developing the conceptual frameworks necessary for transferring concepts to new situations and for explaining one's understanding effectively to others – key aspects of engineering work. To construct such rich conceptual frameworks, students also need to engage in elaborative and organizational learning strategies, yet students reported underutilizing these strategies. Students' overreliance on example problems does not align with being able to apply course concepts to real-world problems.

In reviewing the data, I also realized that I might be part of the problem. My teaching and assessments had been primarily organized around working problems with little variation. The research helped me change. I decided to scaffold students’ use of a broader range of monitoring, elaborative, and organizational strategies by changing my approach to teaching. I realized that I could empower my students by helping them learn about and refine their learning skills – even as I teach the content of the course.

I made significant changes to my course. I changed the grade category for "homework" to "development activities," which includes the regular homework plus new homework learning-check quizzes and video quizzes. These quizzes provide low-stakes opportunities for formative feedback on students' conceptual understanding. I also changed my classroom activities, engaging students in evaluating and explaining given solutions containing errors, practicing recall, interrogating examples with "what if" questions and answering them, and creating problems for specific concepts. As the next project step, we are collecting data on these implementations so the research-to-practice cycle can begin again.

Outcomes:

My students performed at least as well on traditional problem-solving exams as students in other sections of the same course. Importantly, they reported feeling more responsible for their learning and exerting more effort in their learning than in other engineering science courses. For me, this has been a more fulfilling teaching experience. Not only have students asked better questions about course content, but I have also had more conversations with students about how they can learn more effectively and efficiently. It has added rigor and a clarity of purpose to my teaching that reaches beyond course content.

Lessons learned:

I learned to articulate the differences between my course and other courses and to get buy-in from students for what I was trying to do. Student resistance to change can be hard on a teacher, but it is worth working through to improve teaching and learning experiences. Collaborative partnerships help!

Acknowledgement:

The metacognition research was supported by the National Science Foundation under Grant Nos. 1433757, 1433645, & 1150384. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.