Learning. Design. Analytics. Post 2: Utilizing Instructional Design Methodologies and Learning Analytics to Encourage Metacognition

By Zhuqing Ding, MA

What’s the interaction like in higher education between faculty and instructional designers? While faculty often have full autonomy in their course design and teaching methodologies (Martin, 2009), instructional designers play the role of a change agent. When instructional designers propose improvements in instructional strategies and recommend significant changes to existing courses, faculty may resist adopting their recommendations. With this in mind, the online course design team at the Center for New Designs in Learning and Scholarship (CNDLS), Georgetown University, has used an adaptive approach to the course design process since 2018. During the pandemic, we were able to pivot and expand this approach to support faculty across the university, not only those who had been part of online programs.

So, what makes our process adaptive? The traditional ADDIE (Analyze, Design, Develop, Implement, and Evaluate) instructional design model focuses on a linear process of content development. By contrast, an adaptive approach has distinct iterative phases in which we use learning analytics as evidence to initiate faculty members’ metacognition, thereby inspiring changes to future iterations of the course.

Here is how our course design team implemented the adaptive approach to transform an existing classroom-based law course into a fully online one. Before moving online, this tax law course was offered once a week as a single 2.5-hour lecture, accompanied by reading assignments and graded by one final exam. During the first phase of our adaptive process, our team tackled the following questions:

  • How can we transform the passive learning experiences in the classroom (receiving information) into active learning (interacting with the course materials)?
  • How can we strike a balance between good practices for online course design and the traditional methods familiar to/preferred by law school professors?
  • What are some of the strategies we will use to encourage the faculty members to evaluate their own teaching strategies to become more mindful and intentional about their own teaching?

While lecturing remains the most widely used instructional method, especially in higher education, it is better suited to a traditional face-to-face classroom than to an online environment (McKeachie, 1990). In an online course, passively receiving a large amount of information by watching a series of 2.5-hour lectures can be challenging for students. Following an adaptive process allowed us to guide faculty toward an awareness that designing an online course is more than recording lectures and that students need to interact with the lectures in order to recall and retrieve what they are learning. The following image shows the metacognitive activities that we incorporated into the adaptive design approach. During the first phase of the course, our course design team introduced activities that support reflective practice, enabling faculty members to become more self-aware about their own teaching and learning. Then, in the second phase, we introduced activities that encourage a deepening of the reflective practice to spark creativity.

Flow diagram showing Phase 1: Course Translation and Phase 2: Course Adaptation

 

Interactive and short lectures

In order to collect data that could help us understand the students’ learning experience, we proposed that the faculty create short, interactive lectures to replace the 2.5-hour lecture in each module. To demonstrate to faculty the types of interactive elements that can be inserted into lectures, we introduced a storyboarding method. In the script, the faculty broke their long lectures into subtopics. Our instructional designers highlighted the keywords and areas that could be illustrated with graphics or animations. Then, the faculty confirmed the highlights and the graphics, adding or deleting as needed.

The resulting short subtopic videos were presented as playlists in the course. Within each video, the professor is shown lecturing on the right side of the screen, while relevant animated keywords, charts, and graphics appear on the left. Parts of the charts were highlighted as the professor talked through those elements. These interactive elements are designed to help students make a connection to the professor while also focusing on important keywords and short summaries of the lecture topics.

The analytics report provided by the video hosting platform, available after the course launched, supported the faculty member’s meta-thinking. Based on the total views, total minutes of content delivered, unique viewers, and percentage of completion for each video, the course team was able to understand important aspects of the students’ learning experience: which videos had the highest views, which videos students were not able to finish watching, and which videos students rewatched repeatedly. Such evidence helped the faculty identify knowledge areas that students were not able to understand right away and recognize times when students’ participation dropped.
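To make this kind of review concrete, here is a minimal sketch (in Python with pandas) of how a per-video analytics export might be summarized to surface those patterns. The column names and numbers are hypothetical placeholders, not the actual platform’s export format or our course data.

```python
# Sketch: summarize a hypothetical per-video analytics export.
import pandas as pd

videos = pd.DataFrame({
    "video": ["1.1 Overview", "1.2 Filing Status", "2.1 Deductions"],
    "total_views": [120, 95, 210],
    "unique_viewers": [60, 58, 62],
    "avg_pct_completed": [92, 48, 88],
})

# Videos most students did not finish: candidates for shorter segments
# or a clarifying announcement.
low_completion = videos[videos["avg_pct_completed"] < 60]

# Videos rewatched heavily (views well above unique viewers): likely
# knowledge areas students could not understand right away.
videos["views_per_viewer"] = videos["total_views"] / videos["unique_viewers"]
rewatched = videos.sort_values("views_per_viewer", ascending=False).head(3)

print(low_completion[["video", "avg_pct_completed"]])
print(rewatched[["video", "views_per_viewer"]])
```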

In the second iteration/phase of the course, the faculty member took several actions to improve not only the course design but also his teaching presence in this online course. First, during the low-participation periods identified in the first iteration’s analytics report, reminders were sent to students to encourage them to keep up the pace. Additionally, more office hours, both one-on-one and group sessions, were scheduled, allowing students to clear up questions with the professor if they got stuck. The faculty member addressed common questions during the recorded office hour sessions and made these recordings available to students. Overall, student-faculty interaction improved because, through metacognition, the faculty member became more aware of the importance of building interactive touch points to keep online students on track and of the importance of teaching presence in online courses.

Weekly Activities

The law school has a long-standing tradition of using final exams as the only assessment in a given course. In face-to-face classes, interactions such as small talk among peers before and after the class and question-and-answer sessions after each lecture help students confirm whether they are on track. For online students, such checkpoints are missing and, therefore, it is necessary to periodically build them in so students can make sure they are following along.

In the first iteration of the course, we introduced weekly, ungraded quizzes, allowing students to practice, experiment, and reflect on their learning. Since this particular course is related to tax law, the quiz questions — most requiring calculation in Excel — were extracted from previous exams. Correct answers and short explanations were provided for each question at the end of the quiz. Students were allowed to take the quizzes multiple times. Such low-stakes activities provided the space for students to explore and discover the answers during their learning.

After the first iteration of the course, the professor was able to review the quiz analytics report provided by the quiz tool in Canvas, which allowed for metacognition about the activity design. There was a correlation between students’ performance on the weekly quizzes and the final exam. Students who did not participate in the practice quizzes at all achieved lower scores than students who did. Students who completed practice quizzes were also more active in the online office hours and found more opportunities to engage with the professor throughout the semester. Based on this finding, the professor realized the importance of motivating students to practice on a weekly basis so they could assess their understanding and ask questions before falling too far behind.
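As a rough illustration of the kind of relationship the professor observed, the sketch below relates practice-quiz participation to final exam scores using a gradebook-style export. The column names and values are invented for illustration; they are not the Canvas report format or actual course data.

```python
# Sketch: relate weekly practice-quiz activity to final exam performance.
import pandas as pd

grades = pd.DataFrame({
    "student": ["A", "B", "C", "D", "E", "F"],
    "quizzes_attempted": [8, 0, 5, 7, 1, 6],   # out of 8 weekly practice quizzes
    "final_exam": [88, 61, 79, 91, 64, 83],
})

# Pearson correlation between participation and final exam scores.
r = grades["quizzes_attempted"].corr(grades["final_exam"])
print(f"Correlation (participation vs. final exam): {r:.2f}")

# Compare students who skipped the practice quizzes with those who used them.
skipped = grades[grades["quizzes_attempted"] == 0]["final_exam"].mean()
practiced = grades[grades["quizzes_attempted"] > 0]["final_exam"].mean()
print(f"Mean final exam, no practice: {skipped:.1f}; with practice: {practiced:.1f}")
```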

By the second iteration of the course, a few actions were taken to improve students’ engagement in weekly quizzes. The professor improved the design of the quiz questions by adding downloadable Excel spreadsheets with formulas to the provided explanations of the quiz answers, allowing students to tinker with formulas and reflect on their own calculations. He also offered additional office hours following the quizzes in each module to make sure students had the opportunity to ask questions.

The professor moved from reluctance to include weekly quizzes at all, since they did not exist in the face-to-face class, to encouraging active learning by improving the quiz question design and proactively providing space for students to reach out to him with questions. Quiz analytics served as evidence that drove the faculty member to metacognition and improved the way he teaches online.

Summary

Metacognition is implicitly part of faculty development programs across disciplines. While our course design team worked with law faculty, however, the adaptive approach we followed led them to new reflections about their teaching practice. It was challenging to find the balance between traditional ways of teaching in the law school and an interactive online course that would allow students to succeed in a virtual environment. The adaptive approach allows instructional designers and faculty to reach agreement through iterative efforts. We used the analytics provided by the media-hosting tool and the quiz tools as evidence to encourage meta-thinking about the faculty’s teaching practice. This led to minor changes to the course with significant impacts. With such evidence, faculty are more open to adapting their long-standing teaching practices and embracing new ways of designing online courses. Sometimes these metacognitive approaches to teaching online also inspire faculty to rethink their teaching practices in traditional classrooms, such as by providing more measurable learning goals or diversifying assessment methods.

References

McKeachie, W. J. (1990). Research on college teaching: The historical background. Journal of Educational Psychology, 82(2), 189-200.

Martin, R. E. (2009). The revenue-to-cost spiral in higher education. Raleigh, NC: John William Pope Center for Higher Education Policy. Retrieved from http://files.eric.ed.gov/fulltext/ED535460.pdf


Learning. Design. Analytics. Post 1: A Faculty Development Approach To Support Metacognitive Awareness During Course Adaptation

By Yianna Vovides, PhD (Series Editor), Georgetown University

I once worked with a faculty member who was skeptical about teaching an online course, let alone spending time working with a designer on it. So, after my first meeting with him, realizing his hesitations, I created a prototype based on his course syllabus to show him what was possible. I remember him saying, “I couldn’t see how to teach my course online, I am not a techie, but maybe I can if you help me.”

He now saw me as his coach and partner, helping him plan how to engage students, helping him put in place assignments that he could manage within the course management system, helping him during his teaching. All along, during the four months we spent on his course, I would ask him about his teaching philosophy and his approach to teaching in his discipline. About a month before the course was ready to launch, I asked him if he could write a few paragraphs to explain to his students what he was sharing with me about his choices in the readings, his expectations in relation to how students approached a text and what he looked for in their assignments. He did.

We ended up recording these (audio only) and adding them to his week-by-week course structure. I then asked him if he was up for doing some more recording that focused on the selection of texts in his courses. I asked him to share his study of the authors themselves. He did. I then created an e-book that students would use to explore a bit more about the authors from their instructor’s perspective.

When the course opened, I spent an hour on the phone walking him through how to respond to student posts in the discussion board. He said, “Thank you, I think I can do this!” And he did. During the first run of his course, I sent him weekly emails to check in and point out the student monitoring/analytic features for making sure his students were keeping up.

What does metacognition have to do with it?

Because the process of online course development takes time, the relationship between the designer and the faculty tends to last beyond that one course experience. It is usually after the first course design and the first time faculty teach their course that they realize how much they learned about teaching and learning. They then go on to adapt their other courses. They are more metacognitively aware. They are aware of their own approaches to teaching and learning, aware of what it takes to design and teach a course in another mode, and aware that good design and teaching involve planning, monitoring, reflecting, evaluating, and adapting existing practices. This is how I define the process of course adaptation that we will explore further in this post.

Let us dig a bit deeper into course adaptation.

In this post, I describe the adaptive approach we have implemented as part of our online programs efforts at the Center for New Designs in Learning and Scholarship (CNDLS), Georgetown University. The approach connects instructional design practices with a faculty development focus that encourages metacognition (planning, monitoring, evaluating). I started with the following overarching question: How should instructional designers guide faculty to rethink their approach to course design to follow an adaptive faculty development process? I then identified the following sub-questions, which formed the basis of the approach and helped operationalize the process:

  • What techniques can instructional designers follow to engage faculty in design thinking? 
  • What techniques can instructional designers follow to engage faculty in meta reflections about their teaching methods? 
  • How can instructional designers use learning analytics to help faculty continue engaging in meta reflections during their online teaching? 

The questions I listed above offered the CNDLS online programs team a way to problematize our approach to design. I realized that we needed to make visible the levels (macro, meso, micro) that we address during the design process and enable faculty to navigate these successfully. We implemented a model that enables conversations about design and development at all levels by following a before, during, after approach (see Figure 1).

circle schematic with three equal components: before, during, after
Figure 1. Macro – Before, during, after model

The guiding questions start at the course level and move to sessions and sequences of engagement, exploring the teaching and learning experience across time. These questions include but are not limited to the following:

  • Tell us about your course. What do you love about this course? What do you think the students love about this course?
  • What are the things that you think about when you prepare to teach this course?
  • How do you engage your students before the semester starts?
  • What do you do during that first class session?
  • What do you expect students to do during the first class session?
  • What do you do after the class session?
  • What do you want students to do after the class session?

These questions help faculty reflect on their approach to teaching and learning. By asking these questions up front and throughout the course adaptation process, we embed metacognitive instruction within the course design model itself. In addition, throughout the design process we include check-in sessions that allow both the designer and faculty to pause and ask:

  • Is our design plan still valid?
  • Is our choice of technology going to support students in their learning process?
  • Do we need to do anything differently?

Over the four to six months of engaging with an individual faculty member on the course design and development process, these check-in questions connect the conversations across time and merge them into a spiral design model. Figure 2 visualizes the spiral model that supports the faculty development approach that instructional designers take. Once faculty members experience this model, they continue to follow this design approach as they envision their other courses. In addition, they tend to revisit their approach to teaching, shifting from an instructor-centered to a student-centered approach.

Schematic of a spiral illustrating loops of before, during, after
Figure 2. Spiral – Before, during, after Model

Because the model is based on time, it easily communicates across the various disciplines. What do I mean by that? Because the conversations that surround this model are related to teaching practices, it is also a way to account for contact time (faculty-student interaction) and learning time (student effort). We refer to the combination of contact time and learning time as instructional time in conversations with faculty. In remote teaching and learning, instructional time is an entry point to envisioning how learning can happen in different ways.

The rest of the mini-series on Learning. Design. Analytics. includes examples using this approach that highlight strategies used to activate metacognitive awareness during the course design and re-design process through the designer-faculty interaction. In addition, the series highlights how technology interventions and learning analytics are integrated as part of the process.

Some background about instructional design and online education to frame the approach

Adapting traditional classroom-based courses to online may sound simple given that online education has been around for more than two decades. In fact, instructional design, a field of study that is over 80 years old, offers theories, models, and processes that guide designers to make this adaptation from traditional classroom-based teaching to online. This is a technical challenge – solutions are available and are knowable. However, in higher education, the instructional design process, when framed to support faculty development, introduces complexity. The challenge is no longer technical because the focus of the challenge is no longer about the course adaptation from traditional to online but the people involved in making the adaptation happen (faculty, designers, media specialists, students, and other members of the team that supports this process). It is a process of transformation.

Let us pull this apart a bit more. Higher education as an institution has been described as lacking innovation and flexibility for promoting impactful teaching and learning (Rooney et al., 2006). That was in 2006. Between 2006 and 2016, online education grew and thrived, with over 6 million students (approximately 30% of all higher education students) enrolled in at least one distance education course in the United States (Allen & Seaman, 2017). Then COVID-19 happened. Remote teaching and learning is happening across the globe and is now the new normal. Given the speed of the changes, some schools have been able to pivot and put in place the needed support for their instructors, while others are struggling to determine what that support needs to be and how to operationalize it.

Many factors besides resources contribute to these decisions, such as institutional, departmental, and individual cultural norms. For example, the institutional culture may be known by those who are in it, but much of it is hidden from those new to it, which may lead to actions that are often driven by assumptions rather than visible evidence (Halupa, 2019). Many academic departments tend to value individual contributions and can propagate a competitive rather than a collaborative environment, which may lead to a less cohesive online curriculum. Individual faculty members are experts in their discipline but not necessarily in the discipline of teaching and learning. Therefore, within this complex network of needs, faculty development efforts in higher education try to balance group and individual engagements to provide opportunities for faculty to get the support they need in their teaching.

Recognizing that there are different instructional development needs necessitates that we offer different entry points and pathways in our faculty development programming. Within the online course design efforts, we work with faculty to help them see their teaching challenge from a design thinking perspective that begins with an exploration of what individual learners will experience. By doing so, we are no longer facing a technical challenge but rather an adaptive one because we are now focusing on individual learner needs. To tackle this adaptive challenge that is implicitly dynamic because of the focus on humans, we argue that the approach requires that planning, monitoring, and evaluation become an integral part of the process at both the cognitive and metacognitive levels. 

References

Allen, I. E., & Seaman, J. (2017). Digital Compass Learning: Distance education enrollment report 2017. Babson Survey Research Group.

Halupa, C. (2019). Differentiation of Roles: Instructional Designers and Faculty in the Creation of Online Courses. International Journal of Higher Education, 8(1), 55-68.

Rooney, P., Hussar, W., Planty, M., Choy, S., Hampden-Thompson, G., Provasnik, S., & Fox, M. A. (2006). The Condition of Education, 2006. NCES 2006-071. National Center for Education Statistics.


Am I responsible for engaging my students in learning how to learn?

by Patrick Cunningham, Rose-Hulman Institute of Technology

I’m a mechanical engineering professor and since my first teaching experience in graduate school I’ve wanted my students to walk away from my classes with deep learning. Practically, I want my students to remember and appropriately apply key concepts in new and different situations, specifically while working on real engineering problems.

In my early years of teaching, I thought if I just used the right techniques, exceptional materials, the right assignments, or the right motivational contexts, then I would get students to deeper learning. However, I still found a disconnect between my pedagogy and student learning. Good pedagogy is important, but it isn’t enough.

On sabbatical 4 years ago, I sat in on a graduate-level cognitive processes course that helped explain this disconnect. It helped me realize student learning is principally determined by the student. What the student does with the information determines the quality of their learning. How they use it. How they apply it. How they practice it. How engaged they are with it. I can provide a context conducive to deeper learning, but I cannot build the foundational and rich knowledge frameworks within the students’ minds. Only the students can do this. In other words, while we, as educators, are important in the learning process, we are not the primary determinants of learning, students are. Students are responsible for their learning, but they don’t universally realize it.

So, how do we help students realize their responsibility for learning? It requires presenting explicit instruction on how learning really works, providing practice with effective approaches to learning, and giving constructive feedback on the learning process (Kaplan et al., 2013). When left unchecked, flawed conceptions of the learning process are at best allowed to persist and at worst reinforced. Even when we do not explicitly speak to the learning process with our students, we say something about it. For example, when our primary mode of instruction is walking students through example problems, we may reinforce the belief that learning is about memorizing the process rather than connecting concepts to different contexts and knowing when to apply one concept versus another. Sometimes we do speak to students about the learning process, but we offer vague and unhelpful advice, such as “work more problems” or “study harder.” Such advice doesn’t point students to specific strategies instrumental in building more interconnected knowledge frameworks (e.g., elaborative and organizational strategies) (Dembo & Seli, 2013) and can reinforce surface-level memorization and pattern-matching approaches.

Because our teaching doesn’t guarantee student learning, because we desire our students develop deep and meaningful learning, and since we always say something about the learning process (intentionally or not), we, as educators, are responsible for engaging our students in developing as learners. We should be explicitly engaging our students in learning about and regulating their learning processes, i.e., developing their metacognitive skills.

As I advocate for our responsibility to aid students in learning how to learn, some common reactions include:

  1. Don’t people figure out how to learn naturally?
  2. Shouldn’t students already do this on their own?
  3. I don’t know metacognition and the science of learning like I know my specialty area.

Don’t we figure out how to learn naturally? Yes, learning is a natural process, but, no, we do not naturally develop deep and efficient approaches to learning, any more than we naturally develop the skill of a concert musician or any other highly refined practice. Shouldn’t students already do this on their own? Ideally, yes, but the reality is that most students’ prior learning experiences have led to ingrained surface learning habits.

Prior learning experiences, along with contextual factors such as the guidance of parents and teachers, condition how we go about learning. In general, students think they are good at learning and don’t see a need to change their approaches. They continue to get good grades using memorization and pattern matching, often cramming for exams, while lacking long-term memory of concepts and the ability to transfer these concepts to real applications. As long as our courses allow students to get good grades (their measure of “success”) with surface learning habits, such views will persist. Deep learning includes memorizing, i.e., knowing, things, but such durable and transferable learning requires much more than just memorization. It takes effortful intellectual engagement with concepts, exploring connections and sorting out relationships between concepts, and accurate self-assessment. Such approaches can be learned, and a few students do learn them. More can if we explicitly guide them. Our students are not lazy; rather, they are misguided by prior experiences. Let’s guide them!

I don’t know metacognition and the science of learning like I know my specialty area. Yes, it is important to be knowledgeable and proficient with what we teach. While we have done much with the content in our specialties, we have limited training, if any, on metacognition (the knowledge and regulation of our thinking/learning processes) and the science of learning. However, as educators trying to improve our craft, shouldn’t we also be students of learning? This can start small and continue as a career-long pursuit. We can always improve! You also likely know more than you think you do. Your self-selection into advanced studies and a college teaching career is not an accident. As part of this select group of academics, you are likely already metacognitively skilled, even if you don’t realize it. Start small, with one thing. Learn about it and practice or recognize it in your own life. For example, peruse a copy of Linda Nilson’s Creating Self-Regulated Learners or James Lang’s Small Teaching, or attend a teaching workshop that sparks your interest. Then, confidently share it with your students and engage them in it as you teach your content. Your authentic experience with it demonstrates its relevance and importance. Once you have become comfortable with this, add another element. Over time, you will build practical expertise about the learning process. Along the way you will likely learn about yourself and make sense of your past (and present) learning experiences. I did!

Need help? Look for my next post, “Where should I start with metacognition?”

Acknowledgements

This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757, 1433645, & 1150384. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation. I also extend my gratitude to my collaborating researchers, Dr. Holly Matusovich and Ms. Sarah Williams, for their support and critical feedback.

References

Dembo, M. & Seli, H. (2013). Motivation and Learning Strategies for College Success: A Focus on Self-Regulated Learning (4th ed.). New York, NY: Routledge.

Kaplan, M., Silver, N., Lavaque-Manty, D., Meizlish, D. (Eds.). (2013). Using Reflection and Metacognition to Improve Student Learning. Sterling, VA: Stylus.


In Remembrance of Dr. Gregg Schraw and Dr. Marty Carr

By Hillary Steiner, Ph.D., Kennesaw State University and Aaron S. Richmond, Ph. D., Metropolitan State University of Denver

In this first blog post of 2018 we remember two educational psychologists with interests in metacognition who recently passed away. Aaron Richmond and Hillary Steiner describe how their personal and professional interactions with these scholars influenced their own work.

From Aaron: In my career as an educational psychologist, I was more than lucky to work with Gregg—I was honored. On September 16th, 2016, Gregg passed away with his sweet wife Lori by his side. Gregg was a prolific researcher in metacognition and in other educational research fields. He published over 90 journal articles, 15 books, and 45 book chapters. He sat on several editorial boards, including the Journal of Educational Psychology, Metacognition and Learning, and Educational Psychology Review. He was an active member of Division C (Learning and Instruction) in the American Educational Research Association and was active in several other regional academic conferences, such as the Northern Rocky Mountain Educational Research Association (NRMERA).

photo of Dr. Gregg Schraw

Yes, Gregg was a prolific scholar; however, his greatest gift was to his students and colleagues. One of my dear friends and fellow metacognitive researcher Rayne Sperling at Pennsylvania State University wrote, “Gregg’s confidence in me and steady, supportive guidance provided the self-efficacy boost I needed in order to believe in myself as a scholar with something to say. As co-chair of my dissertation, mentor throughout my career, and dear friend always, Gregg was a strong, positive force in my life. Now, my own doc students tell me I am a wonderful, supportive mentor, and I always tell them, ‘I am just doing what I was taught; mentoring as I was mentored.’ Gregg taught me this too. His mentoring continues with the students he mentored (and there are a lot of us) who now have students of our own.” (McCrudden, 2016, p. 681)

I had followed Gregg’s career and seen him at conferences, in awe, of course, with a star-struck gaze; to me, Gregg was a research icon, a mega-god of whom I was not worthy. However, when I first met Gregg at NRMERA in the fall of 2003, I was a dewy-eyed graduate student who had plucked up the courage to introduce myself to discuss metacognitive research. I quickly realized that yes, he was a research god, but more importantly he was a kind, generous, supportive, and inclusive person. He listened to my good ideas and listened to my half-cocked ideas that needed serious fine-tuning. After that fateful day in Jackson Hole, Wyoming, I knew that I had gained a mentor. Gregg supported me through my career in both research and teaching. We published together, and he was one of my advocates. He advanced my career, as he did for so many others. He doled out sound and sincere professional advice willingly. For example, Gregg, Fred Kuch, and I were working on some metacognition research together, and my students were working quite hard and doing a great job on the project. Mind you, I am at a large state university with no graduate students, so these were undergraduate students. Gregg was so impressed with one of my students (because of the mentorship he and his students had provided me, which I had passed on to my students) that he offered to write her a letter of recommendation for graduate school. I found this simple but powerful gesture to be astonishing and yet typical of Gregg’s passion for advancing high-quality scholars in the field of metacognition and educational psychology. It was just one example of how Gregg went out of his way to help people and support their goals and pursuits.

In the end, Gregg didn’t have to be my mentor, but he was, for me as for so many others. Gregg, I am indebted to you, and you will truly be missed. The field of metacognition lost a great scholar, mentor, and friend.

From Hillary:

On July 30, 2017, the field of metacognition lost another great. Dr. Martha “Marty” Carr, Professor of Educational Psychology at the University of Georgia, passed away at the young age of 59. A prolific researcher who mentored countless students to become scholars in their own right, Marty combined her interests in metacognition, motivation, giftedness, and mathematics achievement to impact the field of educational psychology in a unique way, asking big questions about how children’s metacognitive strategies influence the gender differences that emerge in mathematics achievement, and how metacognition differs in gifted children.

photo of Dr. Marty Carr

Marty began her career in developmental psychology at the University of Notre Dame under the tutelage of John Borkowski, followed by a postdoctoral stint at the Max Planck Institute for Psychological Research in Germany, where she quickly made important contributions related to the influence of motivation and metacognition on children’s learning strategy development. After joining the faculty of the University of Georgia in 1989, where she remained for her entire career, she began to cultivate additional interests in giftedness and mathematics strategy development. These varied interests dovetailed throughout the years, as she wrote about metacognition in gifted children, motivational and self-regulatory components of underachievement, and metacognitive influences on gender differences in math. Marty’s work was known for its methodological rigor, its unique application of developmental models and methods to learning processes, and its applicability to the classroom. She was recognized in particular for groundbreaking work on the predictors and influential factors of gender differences in mathematics. Her contributions led to national recognition and leadership, including presidency of the American Psychological Association’s Educational Psychology Division (Division 15), presidency of the Women in Mathematics Education division of the National Council of Teachers of Mathematics, and numerous awards, including the American MENSA Education and Research Foundation Award for Excellence.

As my dissertation advisor in the early 2000s, Marty was the first person to make me feel like a scholar. She recognized my interests in giftedness and cognitive development and provided the perfect combination of support and encouragement that helped me craft a line of research that continues to this day. And I am not alone. At her memorial service, several students commented on how much her mentorship had meant to them. According to student Kellie Templeman, “her skill in striking the balance between technical knowledge, compassionate guidance, and tireless work ethic was what separated her from any other professor I have worked with.” She promoted metacognition in her own students by asking them to reflect constantly on the “why” questions of their individual projects and to remain goal-driven. As another former student noted, Marty pushed us to “keep going, get busy, and keep writing,” learning from our mistakes as we went. Yet, as a devoted mother who had many outside interests, including marathon running and working with animals (especially cats and horses), Marty was also an excellent model of work-life balance.

When I attended the American Educational Research Association conference as a graduate student, Marty introduced me to Gregg Schraw, who was to be my assigned mentor for the week. I was starry-eyed at meeting such a great figure in my field, but later realized that others were equally starry-eyed to meet Marty. Marty and Gregg were truly giants in educational psychology whose contributions have transformed the way we think about metacognition. May we continue to honor their memory in our own work.

References

McCrudden, M. T. (2016). Remembering Gregg Schraw. Educational Psychology Review, 28(4), 673-690.

 


Using Metacognition to select and apply appropriate teaching strategies

by John Draeger (SUNY Buffalo State) & Lauren Scharff (U. S. Air Force Academy)

Metacognition was a recurring theme at the recent Speaking SoTL (Scholarship of Teaching and Learning) conference at High Point University. Invited speaker Saundra McGuire, for one, argued that metacognition is the key to teaching students how to learn. Stacy Lipowski, for another, argued for the importance of metacognitive self-monitoring through the regular testing of students. We argued for the importance of metacognitive instruction (i.e., the use of reflective awareness and self-regulation to make intentional and timely adjustments to teaching a specific individual or group of students) as a tool for selecting and implementing teaching strategies. This post shares a synopsis of our presentation from the conference.

We started with the assumption that many instructors would like to make use of evidence-based strategies to improve student learning, but they are often faced with the challenge of how to decide among the many available options. We suggested that metacognitive instruction provides a solution. Building blocks for metacognitive instruction include 1) consideration of student characteristics, context, and learning goals, 2) consideration of instructional strategies and how those align with the student characteristics, context, and learning goals, and 3) ongoing feedback, adjustment and refinement as the course progresses (Scharff & Draeger, 2015).

Suppose, for example, that you’re teaching a lower-level core course in your discipline with approximately 35 students where the course goals include the 1) acquisition of broad content and 2) application of this content to new contexts (e.g., current events, personal situations, other course content areas). Students enrolled in the course typically have a variety of backgrounds and ability levels. Moreover, they don’t always see the relevance of the course and they are not always motivated to complete assignments. As many of us know, these core courses are both a staple of undergraduate education and a challenge to teach.

Scholarly teachers (Richlin, 2001) consult the literature to find tools for addressing the challenges just described. Because of the recent growth of SoTL work, they will find many instructional options to choose from. Let’s consider four of them. First, Just-in-Time Teaching strategies ask students to engage course material prior to class and relay those responses to their instructor (e.g., select problem sets or focused writing). Instructors then use student responses to tailor the lesson for the day (Novak, Patterson, & Gavrin, 1999; Simkins & Maier, 2004; Scharff, Rolf, Novotny, & Lee, 2013). In courses where Just-in-Time Teaching strategies are used, students are more likely to read before class and take ownership of their own learning. Second, Team-Based Learning (TBL) strategies also engage students in some pre-class preparation, and then during class, students engage in active learning through a specific sequence of individual work, group work, and immediate feedback to close the learning loop (Michaelsen & Sweet, 2011). TBL has been shown to shift course goals from knowing to applying and to create a more balanced responsibility for learning between faculty and students (with students taking on more responsibility). Third, concept maps provide visual representations of important, often hierarchical, connections between important concepts. They can help students visualize connections between important course concepts (Davies, 2011), but they require some prior understanding of the concepts being mapped. Fourth, mind mapping also leads to visual representations of related concepts, but the process is more free-form and creative, and often requires less prior knowledge. It encourages exploration of relationships and is more similar to brainstorming.

Any of these four tools might be a good instructional choice for the course described above. But how is an instructor supposed to choose?

Drawing inspiration from Tanner (2012), who shared questions to prompt metacognitive learning strategies for students, we recommend that instructors ask themselves a series of questions aligned with each of our proposed building blocks to prompt their own metacognitive awareness and self-regulation (Scharff & Draeger, 2015). For example, instructors should consider the type of learning (both content and skills) they hope their students will achieve for a given course, as well as their own level of preparedness and the time and resources available for incorporating that particular type of teaching strategy.

In the course described above, any of the four instructional strategies might help with the broad acquisition of content, and depending upon how they are implemented, some of them might promote student application of the material to new contexts. For example, while concept maps can facilitate meaningful learning, their often hierarchical structure may not allow for the flexibility associated with making connections to personal contexts and current events. In contrast, the flexibility of mind mapping might serve well to promote the generation of examples for application, but it would be less ideal for supporting content acquisition. Team-Based Learning can promote active learning and facilitate the application of knowledge to personal contexts and current events, but it requires the instructor to have high familiarity with the course and the ability to be very flexible during class as students are given greater responsibility (which may be problematic with lower-level students who are not motivated to be in the course). Just-in-Time Teaching can promote both content acquisition and application if both are addressed in the pre-class questions. During class, the instructor should show some flexibility by tailoring the lesson to best reach students based on their responses to the pre-class questions, but overall, the lesson is much more traditional in its organization and expectations for student engagement than with TBL. Under these circumstances, it might be that Just-in-Time strategies offer the best prospect for teaching broad content to students with varying backgrounds and ability levels.

While the mindful choice of instructional strategies is important, we believe that instructors should also remain mindful in-the-moment as they implement strategies. Questions they might ask themselves include:

  • What are you doing to “check in” with your learners to ensure progress towards daily and weekly course objectives?
  • What are signs of success (or not) of the use of the strategy?
  • How can you adjust the technique to better meet your students’ needs?
  • Are your students motivated and confident, or are they bored or overwhelmed and frustrated? Are your students being given enough time to practice new skills?
  • If learning is not where it needs to be or student affect is not supportive of learning, what are alternate strategies?
  • Are you prepared to shift to them? If not, then why not?

These prompts can help instructors adjust and refine their implementation of the chosen instructional strategy in a timely manner.

If, for example, Just-in-Time assignments reveal that students are understanding core concepts but having difficulty applying them, then the instructor could tweak the assignments by more explicitly requiring application examples, which could then be discussed in class. Alternatively, the instructor might keep the Just-in-Time questions focused on content, but start to use mind mapping during class in order to promote a variety of examples of application. In either case, it is essential that instructors explicitly and intentionally consider whether the instructional choice is working as part of an ongoing cycle of awareness and self-regulation. Moreover, we believe that as instructors cultivate their ability to engage in metacognitive instruction, they will be better prepared to make in-the-moment adjustments during their lessons because they will be more “tuned in” to the needs of individual learners and more aware of available teaching strategies.

While not a magic bullet, we believe that metacognitive instruction can help instructors decide which instructional strategy best fits a particular pedagogical situation and it can help instructors adjust and refine those techniques as the need arises.

References

Davies, M. (2011). Concept mapping, mind mapping and argument mapping: What are the differences and do they matter? Higher Education, 62(3), 279-301.

Michaelsen, L. K., & Sweet, M. (2011). Team-based learning. New Directions for Teaching and Learning, 2011(128), 41-51.

Novak, G., Patterson, E., Gavrin, A., & Christian, W. (1999). Just-in-time teaching: Blending active learning with web technology. Upper Saddle River, NJ: Prentice Hall.

Richlin, L. (2001). Scholarly teaching and the scholarship of teaching. New Directions for Teaching and Learning, 2001(86), 57-68.

Scharff, L., & Draeger, J. (2015). Thinking about metacognitive instruction. National Teaching and Learning Forum, 24(5), 4-6.

Scharff, L., Rolf, J., Novotny, S., & Lee, R. (2011). Factors impacting completion of pre-class assignments (JiTT) in physics, math, and behavioral sciences. In C. Rust (Ed.), Improving Student Learning: Global Theories and Local Practices: Institutional, Disciplinary and Cultural Variations. Oxford Brookes University, UK.

Simkins, S., & Maier, M. (2009). Just-in-time teaching: Across the disciplines, across the academy. Stylus Publishing, LLC.

Tanner, K. D. (2012). Promoting student metacognition. CBE-Life Sciences Education, 11(2), 113-120.


Lean Forward, but Do It Metacognitively!

by Lauren Scharff, Ph.D. (U. S. Air Force Academy)

As the Director for the Scholarship of Teaching and Learning (SoTL) at my institution, a large part of my job description involves helping faculty intentionally explore new approaches and how they impact student learning. In other words – I work with forward-leaning faculty who are ready to try new things. So, I think a lot about how, when, and why faculty members adopt new pedagogies, tools, and activities, and about when, for whom, and in what contexts these new approaches enhance learning. This work dovetails nicely with the development and goals of metacognitive instruction.

As a reminder, if you’re relatively new to our site, one of the premises we’ve previously shared here (e.g., Scharff, March 2015) and elsewhere (Scharff & Draeger, NTLF, 2015) is that metacognitive instruction involves the intentional and ongoing interaction between awareness and self-regulation, specifically with respect to the pedagogical choices instructors make as they design their lessons and then as they carry them out.

I was happy to see these connections reinforced last month at our 7th Annual SoTL Forum. Dr. Bridget Arend was invited to give a morning workshop and the keynote address. Along with James R. Davis, she is co-author of Facilitating Seven Ways of Learning: A Resource for More Purposeful, Effective and Enjoyable College Teaching. In her workshop Bridget dug into how to facilitate critical thinking, promote problem-solving, and support the building of skills (3 of the 7 ways of learning), while in her keynote she focused more strongly on the concept of matching student learning goals with the most effective teaching methods. She went beyond the usual discussion of tips and techniques to explore the underlying purpose, rationale, and best use of these [pedagogical] methods.

Dr. Bridget Arend giving the keynote address at the 7th Annual SoTL Forum at the U. S. Air Force Academy

Book cover: Facilitating Seven Ways of Learning
Books such as these can help support metacognitive instruction.

While Bridget did not explicitly use the term “metacognitive instruction,” it struck me that her message of purposeful choice of methods directly supported key aspects of metacognitive instruction, especially those related to awareness of our pedagogical decisions. We (instructors) should not incorporate pedagogies (or new tools or activities) just because they are the ones typically used by our colleagues, or because they are what was “done to us as students and it worked for us,” or because they are the “new, latest-greatest thing” we’ve heard about. Rather, we should carefully review our learning goals and consider how each possible approach might support those goals for our students and our context.

We should also be mindful of other factors that might influence our adoption of new approaches. For example, administrators or institutions often reward faculty who are leading the adoption of new technologies. Sometimes the message seems to be “the more new technologies incorporated, the better” or “out with the old and in with the new,” so that a program or institution can market itself as the most cutting-edge in education. However, while many of us appreciate being rewarded or showcased for new efforts, we also need to pause to consider whether or not we’re really supporting student learning as well as we could with these practices.

Questions we should ask ourselves before implementation include: How will our new pedagogical approach or new app really align with the learning goals we have for our students? Will all of our choices complement each other, or might they work at cross-purposes? Realistically, there are a limited number of learning outcomes we can successfully accomplish within a lesson or even across a semester.

As we implement these new approaches and tools, we should ask additional questions. How are they actually impacting aspects of student engagement, attitudes towards learning, and ultimately, the learning itself? How might they be adjusted (either “in the moment” or in future lessons) as we use them in order to better support our learning goals for our students in our context? No group of students is the same, and the context also shifts over time. What worked well in the past might need adjusting or more radically changing in the future.

In sum, we know that no single approach is going to work for all learning goals or all students across all situations. But if we build our awareness of possibilities using resources such as Facilitating Seven Ways of Learning (and many other published papers and texts) to help guide our pedagogical choices; if we carefully attend to how our approaches affect students and student learning; and if we modify our approach based on those observations (perhaps using systematic data if we’re conducting a SoTL research project), then we WILL be more likely to enhance student learning (and our own development as metacognitive instructors).

Thus, lean forward as instructors, but do it metacognitively!

————————-

Davis, J. R., & Arend, B. (2013). Facilitating Seven Ways of Learning: A Resource for More Purposeful, Effective and Enjoyable College Teaching. Sterling, VA: Stylus Publishing.

Scharff, L. & Draeger, J. (September, 2015). Thinking about metacognitive instruction. The National Teaching and Learning Forum, 24(5), p. 4-6. http://onlinelibrary.wiley.com/doi/10.1002/ntlf.2015.24.issue-5/issuetoc


Teaching a new course requires metacognition

by John Draeger, SUNY Buffalo State

One of the joys of being an academic philosopher is the freedom to explore new ideas. For example, the recent retirement of a colleague left a gap in my department’s usual offerings. I agreed to take over a course on the philosophy of love and sex. While I have written scholarly articles on related topics, I confess that teaching this new material had me feeling the sort of constructive discomfort that I seek to foster in my students (Draeger 2014). As a result, I experienced a heightened sense of awareness concerning what I was doing and why. In particular, I came to believe that teaching a new course requires metacognition.

As I sat down to construct the course, I was guided by the thought that philosophy can help students learn to have careful conversations about ideas that matter. With respect to this new course, I wanted students to learn to ask tough questions. Can we really promise to love someone forever? Can sex ever be meaningless? Is becoming emotionally attached to someone other than your partner worse than sleeping around? Is it possible to love more than one person at the same time or does romantic love require some form of exclusivity? Such questions prompt students to consider whether commonly held beliefs are actually justified. If these views withstand scrutiny, then students have the conceptual resources to offer a proper defense. If not, then students can begin searching for ideas worth having. Such questions can also open up a larger conversation about related concepts (e.g., trust, intimacy, respect, jealousy, loyalty).  Because much of the course material was new to me, I had not always thought through the various permutations and implications of each philosophical position. I often found myself learning “on the fly” along with my students as I reflected on my own assumptions and preconceived ideas in “real time” while the discussion unfolded in front of me.

In an earlier post (Draeger 2015), I argued that “critical thinking involves an awareness of mode of thinking within a domain (e.g., question assumptions about gender, determine the appropriateness of a statistical method), while metacognition involves an awareness of the efficacy of particular strategies for completing that task.” As I reflect on my philosophy of love and sex course, I realize that my heightened awareness contained elements of both critical thinking and metacognition. Because the material was largely new to me, I was more aware of my own critical thinking processes as I engaged in them and more “tuned into” what my students were grappling with (e.g., assumptions about love and sex, related concepts, implications of the view we are considering). I also found myself metacognitively evaluating whether my students were critically engaged and whether my choices were moving the conversation in philosophically fruitful directions. I like to think that this sort of monitoring happens in all of my classes, but I was acutely aware of its importance given that the material was unfamiliar and my discussion prompts were untested. Moreover, I like to think that I never resort to autopilot and that I am always keenly aware of fluid learning environments. However, because the material was so fresh, I could not help but engage in self-regulation. I did not have a reliable stock of examples and responses at my fingertips. Even more than usual, I found myself making intentional changes to my approach based on “in-the-moment” feedback from students (Scharff 2015).

Teaching a new course always rejuvenates me because it reminds me how much I love to learn. As the teacher, however, I was responsible for more than my own learning. Effective teaching requires thinking about the critical thinking processes of all the learners in the room, including my own. It also requires monitoring a fluid learning environment and making intentional changes (often in-the-moment changes) if students are to have careful conversations about ideas that matter (e.g., love, sex). While teaching with metacognition is generally a good idea, this semester taught me that teaching a new course requires metacognition.


References

Draeger, John (2015). “Two forms of ‘thinking about thinking’: metacognition and critical thinking.” Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking

Draeger, John (2014). “Cultivating a habit of constructive discomfort.” Retrieved from https://www.improvewithmetacognition.com/cultivating-a-habit-of-constructive-discomfort

Scharff, Lauren (2015). “What do we mean by ‘metacognitive instruction’?” Retrieved from https://www.improvewithmetacognition.com/what-do-we-mean-by-metacognitive-instruction/


Metacognitive Judgments of Knowing

Roman Taraban, Ph.D., Dmitrii Paniukov, John Schumacher, and Michelle Kiser, Texas Tech University

“The more you know, the more you know you don’t know.” Aristotle

Students often make judgments of learning (JOLs) when studying. Essentially, they make a judgment about future performance (e.g., a test) based on a self-assessment of their knowledge of studied items. Therefore, JOLs are considered metacognitive judgments. They are judgments about what the person knows, often related to some future purpose. Students’ accuracy in making these metacognitive judgments is academically important. If students make accurate JOLs, they will apply just the right amount of time to mastering academic materials. If students do not devote enough time to study, they will underperform on course assessments. If students spend more time than necessary, they are being inefficient.

As instructors, it would be helpful to know how accurate students are in making these decisions. There are several ways to measure the accuracy of JOLs. Here we will focus on one of these measures, termed calibration. Calibration is the difference between a student’s JOL about some future assessment and their actual performance on that assessment. In the study we describe here, college students made JOLs (“On a scale of 0 to 100, what percent of the material do you think you can recall?”) after they read a brief expository text. Actual recall was measured in idea units (IUs), the chunks of meaningful information in the text (Roediger & Karpicke, 2006). Calibration is here defined as JOL – recalled IUs, or simply, predicted recall minus actual recall. A positive result means you are overconfident to some degree; a negative result means you are underconfident to some degree; a result of zero means you are perfectly calibrated in your judgment.
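To make the arithmetic concrete, here is a minimal sketch (in Python, with invented numbers rather than data from the study) of how a single participant’s calibration could be computed under the conventions described above: the JOL is given on a 0–100 scale, recall is scored in idea units out of 30, and both are converted to proportions before subtracting.

```python
def calibration(jol_percent: float, recalled_ius: int, total_ius: int = 30) -> float:
    """Calibration = predicted recall minus actual recall, both as proportions.

    Positive  -> overconfident
    Negative  -> underconfident
    Zero      -> perfectly calibrated
    """
    predicted = jol_percent / 100.0      # JOL given on a 0-100 scale
    actual = recalled_ius / total_ius    # proportion of idea units recalled
    return predicted - actual

# Hypothetical participant: predicts 70% recall but recalls 12 of 30 idea units.
score = calibration(jol_percent=70, recalled_ius=12)
print(f"calibration = {score:+.2f}")     # +0.30 -> overconfident by 30 percentage points
```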

The suggestion from Aristotle (see quote above) is that gains in how much we know lead us to underestimate how much we know, that is, we will be underconfident. Conversely, when we know little, we may overestimate how much we know, that is, we will be overconfident. Studies using JOLs have found that children are overconfident (predicted recall minus actual recall is positive) (Lipko, Dunlosky, & Merriman, 2009; Was, 2015). Children think they know more than they know, even after several learning trials with the material. Studies with adults have found an underconfidence with practice (UWP) effect (Koriat et al., 2002), that is, the more individuals learn, the more they underestimate their knowledge. The UWP effect is consistent with Aristotle’s suggestion. The question we ask here is: which is it? If you lack knowledge, do your metacognitive judgments reflect overconfidence or underconfidence, and vice versa? Practically, as instructors, if students are poorly calibrated, what can we do to improve their calibration, that is, to recalibrate this metacognitive judgment?

We addressed this question with two groups of undergraduate students, as follows. Forty-three developmental-reading participants were recruited from developmental integrated reading and writing courses offered by the university, including Basic Literacy (n = 3), Developmental Literacy II (n = 29), and Developmental Literacy for Second Language Learners (n = 11). Fifty-two non-developmental participants were recruited from the Psychology Department subject pool. The non-developmental and developmental readers were comparable in mean age (18.3 and 19.8 years, respectively) and the number of completed college credits (11.8 and 16.7, respectively), and each sample represented roughly fifteen academic majors. All participants received course credit. The students were asked to read one of two expository passages and to recall as much as they could immediately. The two texts used for the study were each about 250 words in length and had an average Flesch-Kincaid readability of grade level 8.2. The passages contained 30 idea units each.

To answer our question, we first calculated calibration (predicted recall – actual recall) for each participant. Then we divided the total sample of 95 participants into quartiles, based on the number of idea units each participant recalled. The mean proportion of correctly recalled idea units, out of 30 possible, and standard deviation (in parentheses) in each quartile for the total sample were as follows:

Q1: .13 (.07); Q2: .33 (.05); Q3: .51 (.06); Q4: .73 (.09).

Using quartile as the independent variable and calibration as the dependent variable, we found that participants were overconfident (predicted recall > actual recall) in all four quartiles. However, there was also a significant decline in overconfidence from Quartile 1 to Quartile 4 as follows: Q1: .51; Q2: .39; Q3: .29; Q4: .08. Very clearly, the participants in the highest quartile were nearly perfectly calibrated, that is, they were over-predicting their actual performance by only about 8%, compared to the lowest quartile, who were over-predicting by about 51%. This monotonic trend of reducing overconfidence and improving calibration was also true when we analyzed the two samples separately:

NON-DEVELOPMENTAL: Q1: .46; Q2: .39; Q3: .16; Q4: .10;

DEVELOPMENTAL: Q1: .57; Q2: .43; Q3: .39; Q4: .13.
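For readers who want to see the quartile breakdown in computational form, the following sketch groups participants into quartiles by the number of idea units recalled and averages calibration within each quartile. It assumes NumPy, a simple percentile-based quartile split, and invented placeholder data; it is an illustration of the analysis described above, not the study’s actual code or data.

```python
import numpy as np

def mean_calibration_by_quartile(jols, recalled, total_ius=30):
    """Split participants into quartiles by recall, then average calibration
    (predicted minus actual recall) within each quartile."""
    jols = np.asarray(jols, dtype=float)
    recalled = np.asarray(recalled, dtype=float)
    calib = jols / 100.0 - recalled / total_ius           # per-participant calibration
    # Quartile boundaries based on number of idea units recalled
    q1, q2, q3 = np.percentile(recalled, [25, 50, 75])
    bins = np.digitize(recalled, [q1, q2, q3])             # 0..3 correspond to Q1..Q4
    return [calib[bins == q].mean() for q in range(4)]

# Invented example data: 8 participants' JOLs (0-100) and idea units recalled (of 30).
jols     = [80, 65, 70, 55, 60, 75, 50, 85]
recalled = [ 4,  8, 12, 15, 18, 22, 25, 28]
print(mean_calibration_by_quartile(jols, recalled))        # mean calibration, Q1 through Q4
```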

The findings here suggest that Aristotle may have been wrong when he stated that “The more you know, the more you know you don’t know.” Our findings would suggest that the more you know, the more you know you know. That is, calibration gets better the more you know. What is striking here is the vulnerability of weaker learners to overconfidence. It is the learners who have not encoded much information from reading who have an inflated notion of how much they can recall. This is not unlike the children in the Lipko et al. (2009) research mentioned earlier. It is also clear in our analyses that typical college students as well as developmental college students are susceptible to overestimating how much they know.

It is not clear from this study what variables underlie low recall performance. Low background knowledge, limited vocabulary, and difficulty with syntax could all contribute to poor encoding of the information in the text and low subsequent recall. Nonetheless, our data do indicate that care should be taken in assisting students who fall into the lower performance quartiles to make better calibrated metacognitive judgments. One way to do this might be by asking students to explicitly make judgments about future performance and then encouraging them to reflect on the accuracy of those judgments after they complete the target task (e.g., a class test). Koriat et al. (1980) asked participants to give reasons for and against choosing responses to questions before the participants predicted the probability that they had chosen the correct answer. Prompting students to consider the amount and strength of the evidence for their responses reduced overconfidence. Metacognitive exercises like these may lead to better calibration.

References

Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6(2), 107-118.

Koriat, A., Sheffer, L., & Ma’ayan, H. (2002). Comparing objective and subjective learning curves: Judgments of learning exhibit increased underconfidence with practice. Journal of Experimental Psychology: General, 131, 147–162.

Lipko, A. R., Dunlosky, J., & Merriman, W. E. (2009). Persistent overconfidence despite practice: The role of task experience in preschoolers’ recall predictions. Journal of Experimental Child Psychology, 102(2), 152-166.

Roediger, H., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255.

Was, C. (2015). Some developmental trends in metacognition. Retrieved from https://www.improvewithmetacognition.com/some-developmental-trends-in-metacognition/

 


Pausing Mid-Stride: Mining Metacognitive Interruptions In the Classroom

By Amy Ratto Parks, Ph.D., University of Montana

Metacognitive interventions are often the subject of research in educational psychology because researchers are curious about how these planned, curricular changes might impact the development of metacognitive skills over time. However, as a researcher in the fields of metacognition and rhetoric and composition, I am sometimes struck by the fact that the planned nature of empirical research makes it difficult for us to take advantage of important kairic moments in learning.

The rhetorical term kairic, taken from the Greek concept of kairos, generally represents a fortuitous window in time in which to take action toward a purpose. In terms of learning, kairic moments are those perfect little slivers in which we might suddenly gain insight into our own or our students’ learning. In the classroom, I like to think of these kairic moments as metacognitive interruptions rather than interventions because they aren’t planned ahead of time. Instead, the “interruptions” arise out of the authentic context of learning. Metacognitive interruptions are kairic moments in which we, as teachers, might be able to briefly access a point at which students’ metacognitive strategies have either served them well or failed them.

A few days ago I experienced a very typical teaching moment that turned out to be an excellent example of a fruitful metacognitive interruption: I asked the students to take out their homework and the moment I began asking discussion questions rooted in the assignment, I sensed that something was off. I saw them looking at each other’s papers and whispering across the tables, so I asked what was going on. One brave student said, “I think a bunch of us did the homework wrong.”

They were supposed to have completed a short analysis of a peer-reviewed article titled, “The Daily Show Effect: Candidate Evaluations, Efficacy, and American Youth” (Baumgartner & Morris, 2014). I got out the assignment sheet and asked the brave student, Rasa*, to read it aloud. She said, “For Tuesday, September 15. Read The Daily Show Effect: Candidate Evaluations…. oh wait. I see what happened. I read the other Jon Stewart piece in the book.” Another student jumped in and said, “I just analyzed the whole show” and a third said, “I analyzed Jon Stewart.”

In that moment, I experienced two conflicting internal reactions. The teacher in me was annoyed. How could this simple set of directions have caused confusion? And how far was this confusion going to set us back? If only half of the class had done the work, the rest of my class plan was unlikely to go well. However, the researcher in me was fascinated. How, indeed, had this simple set of instructions caused confusion? All of these students had completed a homework assignment, so they weren’t just trying to “get out of work.” Plus, they also seemed earnestly unsure about what had gone wrong.

The researcher in me won out. I decided to let the class plan go and I began to dig into the situation. By a show of hands I saw that 12 of the 22 students had done the correct assignment and 10 had completed some customized, new version of the homework. I asked them all to pause for a moment and engage in a metacognitive activity: they were to think back to the moment they read the assignment and ask themselves, where did I get mixed up?

Rasa said that she just remembered me saying something about The Daily Show in class, and when she looked in the table of contents, she saw a different article, “Political Satire and Postmodern Irony in the Age of Stephen Colbert and Jon Stewart” (Colletta, 2014), and read it instead. Other students said that they must not have read closely enough, but then another student said something interesting. She said, “I did read the correct essay, but it sounded like it was going to be too hard to analyze and I figured that you hadn’t meant for this to be so hard, so I just analyzed the show.” Other students nodded in agreement. I asked the group to raise their hands if they had read the correct essay. Many hands went up. Then I asked if they thought that the analysis they chose to do was easier than the one I assigned. All of them raised their hands.

Again, I was fascinated. In this very short conversation I had just watched rich, theoretical research play out before me. First, here was an example of the direct effect of power browsing (Kandra, Harden, & Babbra, 2012) mistakenly employed in the academic classroom. Power browsing is a relatively recently coined term that describes “skimming and scanning through text, looking for key words, and jumping from source to source” (Kandra et al., 2012). Power browsing can be a powerful overviewing strategy (Afflerbach & Cho, 2010) in an online reading environment where a wide variety of stimuli compete for the reader’s attention. Research shows that strong readers of non-electronic texts also employ pre-reading or skimming strategies (Dunlosky & Metcalfe, 2009); however, when readers mistakenly power browse in academic settings, it may result “in missed opportunities or incomplete knowledge” (Kandra et al., 2012, par. 18). About metacognition and reading strategies, Afflerbach and Cho (2010) write, “the good strategy user is always aware of the context of reading” (p. 206); clearly, some of my students had forgotten their reading context. Some of the students knew immediately that they hadn’t thoroughly read the assignment. As soon as I described the term “power browse” their faces lit up. “Yes!” said Rasa, “that’s exactly what I did!” Here was metacognition in action.

Second, as students described the reasoning behind choosing to read the assigned essay, but analyze something unassigned, I heard them offering a practical example of Flower and Hayes’ (1981/2011) discussion of goal-setting in the writing process. Flower and Hayes (1981/2011) said that writing includes “not only the rhetorical situation and audience which prompts one to write, it also includes the writer’s own goals in writing” (p. 259). They went on to say that although some writers are able to “juggle all of these demands,” others “frequently reduce this large set of restraints to a radically simplified problem” (p. 259). Flower and Hayes allow that this can sometimes cause problems, but they emphasize that “people only solve the problems they set for themselves” (p. 259).

Although I had previously seen many instances of students “simplifying” larger writing assignments in my classroom, I had never before had a chance to talk with students about what had happened in the moment when they realized something hadn’t worked. But here, they had just openly explained to me that the assignment had seemed too difficult, so they had recalibrated, or “simplified” it into something they thought they could do well and/or accomplish during their given timeframe.

This metacognitive interruption provided an opportunity to “catch” students in the moment when their learning strategies had gone awry, but my alertness to the kairic moment only came as a result of my own metacognitive skills: when it became clear that the students had not completed the work correctly, I paused before reacting and that pause allowed me to be alert to a possible metacognitive learning opportunity. When I began to reflect on this class period, I realized that my own alertness came as a result of my belief in the importance of teachers being metacognitive professionals so that we can interject learning into the moment of processing.

There is yet one more reason to mine these metacognitive interruptions: they provide authentic opportunities to teach students about metacognition and learning. The scene I described here could have had a very different outcome. It can be easy to see student behavior in a negative light. When students misunderstand something we thought we’d made clear, we sometimes make judgments about them being “lazy” or “careless” or “belligerent.” In this scenario it seems like it would have been justifiable to have gotten frustrated and lectured the students about slowing down, paying attention to details, and doing their homework correctly.

Instead, I was able to model the kind of cognitive work I would actually want to teach them: we slowed down and studied the mistake in a way that led the class to a conversation about how our minds work when we learn. Rather than including a seemingly-unrelated lecture on “metacognition in learning” I had a chance to teach them in response to a real moment of misplaced metacognitive strategy. Our 15-minute metacognitive interruption did not turn out to be a “delay” in the class plan, but an opening into a kind of learning that might sometimes just have to happen when the moment presents itself.

References

Baumgartner, J., & Morris, J. (2014). The Daily Show effect: Candidate evaluations, efficacy, and American youth. In C. Cucinella (Ed.), Funny. Southlake, TX: Fountainhead Press. (Reprinted from American Politics Journal, 34(3), (2006), pp. 341-67).

Colletta, L. (2014). Political satire and postmodern irony in the age of Stephen Colbert and Jon Stewart. In C. Cucinella (Ed.), Funny. Southlake, TX: Fountainhead Press. (Reprinted from The Journal of Popular Culture, 42(5), (2009), pp. 856-74).

Dunlosky, J., & Metcalfe, J. (2009). Metacognition. Thousand Oaks, CA: Sage.

Flower, L., & Hayes, J. (2011). A cognitive process theory of writing. In V. Villanueva & K. Arola (Eds.), Cross-talk in comp theory: A reader, (3rd ed.), (pp. 253-277). Urbana, IL: NCTE. (Reprinted from College Composition and Communication, 32(4), (Dec., 1981), pp. 365-387).

Kandra, K. L., Harden, M., & Babbra, A. (2012). Power browsing: Empirical evidence at the college level. National Social Science Journal, 2, article 4. Retrieved from http://www.nssa.us/tech_journal/volume_2-2/vol2-2_article4.htm

Waters, H. S., & Schneider, W., (Eds.). (2010). Metacognition, strategy use, and instruction. New York, NY: The Guilford Press.

* Names have been changed to protect the students’ privacy.


5 Things Every Student Should Know Before Starting College

This article presents Geddes’ five tips for students entering college. Once you read the subtitles, I’m sure you will be intrigued to read this brief article.

Five Tips

  1. Your Professors Hate Your Favorite High School Teachers!
  2. Understand the 80/20 Rule / 20/80 Rule Shift
  3. Read Material Before Class
  4. Know the Difference Between Memorizing and Learning
  5. Be Confident. You are not broken

Geddes, L. (2015). 5 Things Every Student Should Know Before Starting College. The Learnwell Projects. Retrieved from http://www.thelearnwellprojects.com/thewell/5-things-every-student-should-know-before-starting-college/


 


Making sense of how I learn: Metacognitive capital and the first year university student

In this article, Larmar and Lodge focus on how important it is to encourage metacognitive processing as a means of increasing student retention, enhancing university engagement, and supporting lifelong learning.

Larmar, S., & Lodge, J. (2014). Making sense of how I learn: Metacognitive capital and the first year university student. The International Journal of the First Year in Higher Education, 5(1), 93-105. doi:10.5204/intjfyhe.v5i1.193

Lodge and Larmar article


Meta-Studying: Teaching Metacognitive Strategies to Enhance Student Success

“Elizabeth Yost Hammer, PhD, of Xavier University of Louisiana, discusses why psychology teachers are uniquely positioned not only to teach the content of psychology but also to teach students how to learn. Hammer presents some strategies to teach metacognitive skills in the classroom to enhance learning and improve study skills and encourages teachers to present students with information about Carol Dweck’s model of the ‘Fixed Intelligence Mindset.’”

Dr. Elizabeth Yost Hammer’s Presentation (45 Minutes)


Dr. Derek Cabrera – How Thinking Works

“Dr. Derek Cabrera is an internationally recognized expert in metacognition (thinking about thinking), epistemology (the study of knowledge), human and organizational learning, and education. He completed his PhD and post-doctoral studies at Cornell University and served as faculty at Cornell and researcher at the Santa Fe Institute. He leads the Cabrera Research Lab, is the author of five books, numerous journal articles, and a US patent. Derek discovered DSRP Theory and in this talk he explains its benefits and the imperative for making it part of every student’s life.”

DSRP consists of four interrelated structures (or patterns); each structure has two opposing elements. The structures and their elements are:

  • Making Distinctions – which consists of an identity and an other
  • Organizing Systems – which consists of part and whole
  • Recognizing Relationships – which consists of action and reaction
  • Taking Perspectives – which consists of point and view

https://youtu.be/dUqRTWCdXt4  (15 minutes)


Metacognition in Psychomotor Development and Positive Error Cultures

Ed Nuhfer, Retired Professor of Geology and Director of Faculty Development and Director of Educational Assessment, enuhfer@earthlink.net, 208-241-5029

All of us experience the “tip of the tongue” phenomenon. This state occurs when we truly do know something, such as the name of a person, but we cannot remember the person’s name at a given moment. The feeling that we do know is a form of metacognitive awareness that confirms the existence of a real neural network appropriate to the challenge. It is also an accurate knowing that carries confidence that we can indeed retrieve the name given the right memory trigger.

In “thinking about thinking” some awareness of the connection between our psychomotor domain and our efforts to learn can be useful. The next time you encounter a tip-of-the-tongue moment, try clenching your left hand. Ruth Propper and colleagues confirmed that left hand clenching activates the right hemisphere of the brain and can enhance recall. When learning names, clenching of the right hand activates the left hemisphere and can enhance encoding (http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0062474). Not all connections between the psychomotor domain and intellectual development are this direct, but it is very useful to connect efforts to develop intellectually with established ways that promote psychomotor development.

Young people are active, so many things that excite them to initiate their learning have a heavy emphasis on psychomotor development. Examples are surfing, snowboarding, dance, tennis, martial arts, yoga, or a team sport. We can also include the hand-eye coordination and learning patterns involved in many addictive video games as heavy on kinesthetic learning, even though these do not offer health benefits of endurance, strength, flexibility, balance, etc. It is rare that anyone who commits to learning any of these fails to achieve measurably increased proficiency.

K-12 teacher Larry Ferlazzo uses the act of missing a wastebasket with a paper wad to help students understand how to value error and use it to inform strategies for intellectual development (http://larryferlazzo.edublogs.org/2011/10/31/an-effective-five-minute-lesson-on-metacognition). His students begin to recognize how the transfer of practices that they already accept as valid from their experiences may likely improve their mastery in less familiar challenges during intellectual development.

College teachers also know that the most powerful paths to high-level thinking engage the psychomotor domain. Visualization that involves explaining to oneself by diagram and developing images of the knowledge engages psychomotor skills. Likewise, writing engages the psychomotor domain in developing text, tracking and explaining reasoning, and revising the work (Nuhfer, 2009, 2010a, b).

Students already “get” that many trips down the ski trail are needed to master snowboarding; they may not “get” that writing many evaluative argument papers is necessary to master critical thinking. In the former, they learn from their most serious error and focus on correcting it first. They correctly surmise that the focused effort to correct one troublesome issue will be beneficial. In efforts to develop intellectually, students deprived of metacognitive training may not be able to recognize or prioritize their most serious errors. This state deprives them of awareness needed to do better on subsequent challenges.

It is important for educators to recognize how particular cultures engage with error. Author and psychologist Gerd Gigerenzer, Director of the Max Planck Institute for Human Development and the Harding Center for Risk Literacy, contrasts positive and negative error cultures (2014). A positive error culture promotes recognition and understanding of error. Its members discuss error openly, and sharing of experienced error is valued as a way to learn. This culture nurtures a growth mindset in which participants speak metacognitively to themselves in terms of: “Not yet… change this… better next time.” Gigerenzer cites aviation as a positive error culture of learning that has managed to reduce plane crashes to one in ten million flights. Interestingly, the cultures of surfing, snowboarding, dance, tennis, martial arts, and yoga all promote development through positive error cultures. Positive error cultures make development through practice productive and emotionally safe.

Gigerenzer cites the American system of medical practice as one example of a negative error culture, wherein systems of reporting, discussing and learning from serious errors are nearly nonexistent. Contrast aviation safety with the World Health Organization report that technologically advanced hospitals harm about 10% of their patients. James (2013) deduced that hospital error likely causes over 400,000 deaths annually (http://journals.lww.com/journalpatientsafety/Fulltext/2013/09000/A_New,_Evidence_based_Estimate_of_Patient_Harms.2.aspx). Negative error cultures make it unsafe to discuss or to admit to error and therefore, they are ineffective learning organizations. In negative error cultures, error discovery results in punishment. Negative error cultures nurture fear and humiliation and thereby make learning unsafe. Error there delivers the metacognitive declaration, “I failed.”

We should consider in what ways our actions in higher education support positive or negative error cultures and what kinds of metacognitive conversations we nurture in the participants (colleagues, students) of those cultures. We can often improve intellectual development by understanding how positive error cultures promote psychomotor development.

 

References

Gigerenzer, G. (2014). Risk Savvy: How to Make Good Decisions. New York, NY: Penguin.

Nuhfer, E.B. (2009) “A Fractal Thinker Designs Deep Learning Exercises: Learning through Languaging. Educating in Fractal Patterns XXVIII, Part 2.” The National Teaching & Learning Forum, Vol. 19, No. 1, pp. 8-11.

Nuhfer, E.B. (2010a) “A Fractal Thinker Designs Deep Learning Exercises: Acts of Writing as “Gully Washers”- Educating in Fractal Patterns XXVIII, Part 3.” The National Teaching & Learning Forum, Vol. 19, No. 3, pp. 8-11.

Nuhfer, E.B. (2010b) “A Fractal Thinker Designs Deep Learning Exercises: Metacognitive Reflection with a Rubric Wrap Up – Educating in Fractal Patterns XXVIII, Part 4.” The National Teaching & Learning Forum, Vol. 19, No. 4, pp. 8-11.


Metacognitive Development in Professional Educators

Stewart, Cooper, and Moulding investigate the development of adult metacognition, specifically comparing pre-service teachers and practicing teachers. Using the Metacognitive Awareness Inventory, they found that metacognition improves significantly with age and with years of teaching experience, but does not differ by gender or by level taught (Pre-K through post-secondary).

Stewart, P. W., Cooper, S. S., & Moulding, L. R. (2007). Metacognitive development in professional educators. The Researcher, 21(1), 32-40.


Breaking the Content Mold: The Challenge of Shaping Student Metacognitive Development

by Dr. Lauren Scharff, U. S. Air Force Academy

We all know that it’s difficult to break long-term patterns of behavior, even when we’re genuinely motivated and well intentioned. It becomes significantly more difficult when we are trying to shift behavioral patterns of groups. This is true across a spectrum of situations and behaviors, but in this post I will focus on teachers and students shifting from a focus on content and basic skills to a focus on higher-level thinking and metacognitive skills.

These musings on “breaking the content mold” have become much more salient as I look forward to a new semester and I exchange ideas with colleagues about how we will approach our upcoming classes. I refer to the “content mold” as a way of illustrating how we, both students and teachers, have been shaped, or molded, by many years of prior experiences and expectations. Due to this shaping, the natural default for both groups is to teach or learn in ways that we have been exposed to in the past, especially if those approaches have seemed successful in the past. For many of us, this default is a focus on content and on disciplinary skills closely linked with the content. With conscious effort we can break out of that molded pattern of behavior to encourage interdisciplinary thinking and higher-level thinking skills that transfer beyond our course. However, when things get tough (e.g., when there are time constraints, high cognitive load situations, or pressures to achieve success as portrayed by exam scores), we tend to revert back to the more familiar patterns of behaviors, which for many of us means a focus on content and basic skills, rather than the use of higher-level thinking or metacognitive strategies.

Similarly, in an earlier post on this site, Ed Nuhfer points out that, “When students learn in most courses, they engage in a three-component effort toward achieving an education: (1) gaining content knowledge, (2) developing skills (which are usually specific to a discipline), and (3) gaining deeper understanding of the kinds of thinking or reasoning required for mastery of the challenges at hand. The American higher educational system generally does best at helping students achieve the first two. Many students have yet to even realize how these components differ, and few ever receive any instruction on mastering Component 3.”

One of the biggest challenges to breaking this molded pattern is that it will be far more likely to be successful if both the teacher and the student are genuinely engaged in the effort. No matter how much effort is put forth by an instructor, if value is not perceived by the student, then little change will occur. Similarly, even if a student has learned the value of higher-level thinking and metacognitive approaches, if a teacher doesn’t seem to value those efforts, then a student will astutely focus on what does seem to be valued by the teacher. A further challenge is that, over the course of a semester, the effort and motivation from both groups might wax and wane in a non-synchronous manner. As I explore these challenges, I will use myself and my less-than-successful efforts last semester as an example.

I taught an upper-level majors course in vision science, and because I have taught this course many times, I knew going in that the material is often unexpectedly challenging to students and most of them find the chapter readings to be difficult. (They contain a lot of brain biology and neural communication topics, and my students are not biology majors). Thus, I decided to build in a low-threat (with a small number of points), intentional, metacognitive reflection assignment for each lesson that had a reading. Students would indicate their level of reading completion (six levels encompassing a thorough reading with annotations, skimming, not at all) and their level of understanding of the material before class. If they had problems with any of the materials, they were supposed to indicate what steps they would take to develop understanding. They would record these and turn them in at mid-semester and at the end of the semester. I had hoped that this regular reflection would prompt their awareness of their reading behaviors and their level of learning from the reading, initiate proactive behaviors if they had poor understanding, and build habits by being completed regularly. I also took time at the start of the semester to explicitly explain why I was incorporating this regular reflection assignment.

Unfortunately, except for a couple of students, I would rate this assignment as a failure. I don’t believe it did any harm, but I also don’t believe that students used it as intended. Rather, I think most of them quickly and superficially answered the questions just so they could turn in their logs at the two required times. This type of reflection is not something that they have been asked to explicitly do in the majority (all?) of their prior courses, and they already had other strategies that seemed to work for their success in other classes. For example, more than halfway through the semester one student informed me that it was simply easier and faster to come to the teacher’s office and get reading guide answers (or homework problem solutions in other courses), rather than deeply read and try to figure it out on his own. Thus, if he didn’t understand as he skimmed, he didn’t worry about it. This approach wasn’t working well in my course, but up to that point he’d been very successful, so he persisted in using it (although I stopped answering his questions in my office until he could demonstrate that he’d at least tried to figure them out).

In hindsight, I believe that my actions (or lack of them) also fed into the failure. I assumed that students would bring their questions to class if they had them, given their increased awareness of those questions and the prompt about what they would do to increase their understanding. Thus, if there were no questions (typically the case), I used the class time to connect the readings with related application examples and demonstrations rather than reiterating what was in the readings. The students seemed engaged in class and showed no indication of specific problems with the readings. Their personal application reflection writing assignments (separate from the reading logs) were fantastic. However, their poor exam performance suggested that they weren’t deeply understanding the content, and I instinctively shifted back to my prior content-focused approaches. I also did not take time in class to directly ask them about their understanding of the readings, what parts they found most challenging, and why.

Thus, although I know I wanted to support the development of student metacognitive skills, and my students also seemed accepting of that goal when I introduced it to them at the beginning of the semester, both groups of us quickly reverted to old content-focused habits that had been “successful” in the past. I am not the first to note the challenges of developing metacognitive skills. For example, Case and Gunstone (2002) state the following, “Many … authors have emphasized that metacognitive development is not easy to foster (e.g., Gunstone & Mitchell, 1998; White, 1998). Projects to enhance metacognition need to be long-term, and require a considerable energy input from both teachers and students.”

So, what will I do in the future? My plans are to more regularly and explicitly engage in discussion of the reading reflection prompts (and other metacognitive prompts) during class. By giving class time to such discussion and bringing the metacognitive processes into the open (rather than keeping them private due to completion outside of class), I hope to indicate the value of the processes and more directly support student exploration of new ways of thinking about learning. Importantly, I hope that this more public sharing will also keep me from falling back to a simple content focus when student performance isn’t what I’d like it to be. Ultimately, metacognitive development should enhance student learning, although it is likely to take longer to play out into changed learning behaviors. I need to avoid the “quick fix” of focusing on content. Thus, I plan to shape a new mold for myself and openly display it to my students. We’ll all be more likely to succeed if we are “all in” together.

——–

Nuhfer, E. (15 July 2014). Metacognition for Guiding Students to Awareness of Higher-level Thinking (Part 1). Improve with Metacognition. https://www.improvewithmetacognition.com/metacognition-for-guiding-students-to-awareness-of-higher-level-thinking-part-1/

Case, J., & Gunstone, R. (2002). Metacognitive development as a shift in approach to learning: An in-depth study. Studies in Higher Education, 27(4), 459-470. DOI: 10.1080/0307507022000011561

 

 


Metacognition: What Makes Humans Unique

by Arthur L. Costa, Professor Emeritus, California State University, Sacramento, and Bena Kallick, Educational Consultant, Westport, CT

————–

 

“I cannot always control what goes on outside. But I can always control what goes on inside.” – Wayne Dyer

————–

Try to solve this problem in your head:

How much is one half of two plus two?

Did you hear yourself talking to yourself? Did you find yourself having to decide if you should take one half of the first two (which would give the answer, three) or if you should sum the twos first (which would give the answer, two)?

If you caught yourself having an “inner” dialogue inside your brain, and if you had to stop to evaluate your own decision making/problem-solving processes, you were experiencing metacognition.

The human species is known as Homo sapiens sapiens, which basically means “a being that knows their knowing” (or maybe it is “knows they are knowing”). What distinguishes humans from other forms of life is our capacity for metacognition—the ability to be a spectator of our own thoughts while we engage in them.

Occurring in the neocortex and therefore thought by some neurologists to be uniquely human, metacognition is our ability to know what we know and what we don’t know. It is our ability to plan a strategy for producing whatever information is needed, to be conscious of our own steps and strategies during the act of problem solving, and to reflect on and evaluate the productiveness of our own thinking. While “inner language,” thought to be a prerequisite, begins in most children around age five, metacognition is a key attribute of formal thought, flowering at about age eleven.

Interestingly, not all humans achieve the level of formal operations (Chiappetta, 1976). And as the Russian psychologist Alexander Luria found, not all adults metacogitate.

Some adults follow instructions or perform tasks without wondering why they are doing what they are doing. They seldom question themselves about their own learning strategies or evaluate the efficiency of their own performance. They have virtually no idea of what they should do when they confront a problem and are often unable to explain their strategies of decision making. There is much evidence, however, to demonstrate that those who perform well on complex cognitive tasks, who are flexible and persevere in problem solving, who consciously apply their intellectual skills, are those who possess well-developed metacognitive abilities. They are those who “manage” their intellectual resources well: 1) their basic perceptual-motor skills; 2) their language, beliefs, knowledge of content, and memory processes; 3) their purposeful and voluntary strategies intended to achieve a desired outcome; and 4) their self-knowledge about their own learning styles and how to allocate resources accordingly.

When confronted with a problem to solve, we develop a plan of action, we maintain that plan in mind over a period of time, and then we reflect on and evaluate the plan upon its completion. Planning a strategy before embarking on a course of action helps us keep track of the steps in the sequence of planned behavior at the conscious awareness level for the duration of the activity. It facilitates making temporal and comparative judgments; assessing the readiness for more or different activities; and monitoring our interpretations, perceptions, decisions, and behaviors. Rigney (1980) identified the following self-monitoring skills as necessary for successful performance on intellectual tasks:

  • Keeping one’s place in a long sequence of operations;
  • Knowing that a subgoal has been obtained; and
  • Detecting errors and recovering from those errors either by making a quick fix or by retreating to the last known correct operation.

Such monitoring involves both “looking ahead” and “looking back.” Looking ahead includes:

  • Learning the structure of a sequence of operations;
  • Identifying areas where errors are likely;
  • Choosing a strategy that will reduce the possibility of error and will provide easy recovery; and
  • Identifying the kinds of feedback that will be available at various points, and evaluating the usefulness of that feedback.

Looking back includes:

  • Detecting errors previously made;
  • Keeping a history of what has been done to the present and thereby what should come next; and
  • Assessing the reasonableness of the present immediate outcome of task performance.

A simple example of this might be drawn from reading. While reading a passage have you ever had your mind “wander” from the pages? You “see” the words but no meaning is being produced. Suddenly you realize that you are not concentrating and that you’ve lost contact with the meaning of the text. You “recover” by returning to the passage to find your place, matching it with the last thought you can remember, and, once having found it, reading on with connectedness.

Effective thinkers plan for, reflect on, and evaluate the quality of their own thinking skills and strategies. Metacognition means becoming increasingly aware of one’s actions and the effects of those actions on others and on the environment; forming internal questions in the search for information and meaning; developing mental maps or plans of action; mentally rehearsing before a performance; monitoring plans as they are employed (being conscious of the need for midcourse correction if the plan is not meeting expectations); reflecting on the completed plan for self-evaluation; and editing mental pictures for improved performance.

This inner awareness and the strategy of recovery are components of metacognition. Indicators that we are becoming more aware of our own thinking include:

  • Are you able to describe what goes on in your head when you are thinking?
  • When asked, can you list the steps and tell where you are in the sequence of a problem-solving strategy?
  • Can you trace the pathways and dead ends you took on the road to a problem solution?
  • Can you describe what data are lacking and your plans for producing those data?

When students are metacognitive, we should see them persevering more when the solution to a problem is not immediately apparent. This means that they have systematic methods of analyzing a problem, knowing ways to begin, knowing what steps must be performed and when they are accurate or are in error. We should see students taking more pride in their efforts, becoming self-correcting, striving for craftsmanship and accuracy in their products, and becoming more autonomous in their problem-solving abilities.

Metacognition is an attribute of the “educated intellect.” Learning to think about their thinking can be a powerful tool for helping students shape, improve, internalize, and habituate their thinking.

REFERENCES

Chiappetta, E. L. (1976). A review of Piagetian studies relevant to science instruction at the secondary and college level. Science Education, 60, 253-261.

Costa, A. and Kallick B.(2008). Learning and Leading with Habits of Mind: 16 Characteristics for Success. Alexandria, VA: ASCD

Rigney, J. W. (1980). Cognitive learning strategies and qualities in information processing. In R. Snow, P. Federico & W. Montague (Eds.), Aptitudes, Learning, and Instruction, Volume 1. Hillsdale, NJ: Erlbaum.

 


How Do You Increase Your Student’s Metacognition?

Aaron S. Richmond

Metropolitan State University of Denver

 

How many times has a student come to you and said “I just don’t understand why I did so bad on the test,” or “I knew the correct answer but I thought the question was tricky,” or “I’ve read the chapter 5 times and I still don’t understand what you are talking about in class”? What did you say or do for these students? Did it prompt you to wonder what you can do to improve your students’ metacognition? I know many of us at Improve with Metacognition (IwM) started pursuing research on metacognition because of these very experiences. As such, I have compiled a summary of some of the awesome resources IwM bloggers have posted (see below). These instructional strategies can be generally categorized as either self-contained lessons (lessons that can teach some aspect of metacognition in one or two class sessions) or metacognitive instructional strategies that require an entire semester to teach.

Self-Contained Instructional Strategies

In Stephen Chew’s blog post, Metacognition and Scaffolding Student Learning, he suggests that one way to improve metacognitive awareness is through well-designed review sessions (Chew, 2015). First, Chew suggests that students benefit metacognitively when they actively participate in study review sessions and when that participation is incentivized. Second, he suggests that students should self-test before the review so that it is truly a review. Third, he recommends having students predict their exam scores based on their review performance and then reflect on those predictions after the exam.

Ed Nuhfer (2015) describes a way to increase metacognition through role-play. Ed suggests that we can use Edward De Bono’s Six Thinking Hats method to train our students to increase their metacognitive literacy. In essence, using this method we can train our students to think in a factual way (white hat), to be positive and advocate for specific positions (yellow hat), to be cautious (black hat), to recognize all facets of our emotions (red hat), to be provocative (green hat), and to be reflective and introspective (blue hat). We can do this through several exercises in which students take turns wearing the different hats.

In David Westmoreland’s (2014) blog post, he discusses a classroom exercise to improve metacognition. David created a “metacognitive lab that attempts to answer the question How do you know?” In the lab, he presents small groups of students with a handful of “truth” statements (e.g., Eggs are fragile.). Students must then take a statement and justify (on the board) how it is true. The class then eliminates any justifications it knows not to be true. Finally, the students discuss with one another the process and why those justifications were eliminated.

Course Long Instructional Strategies

Chris Was (2014) investigated whether “variable weight-variable difficulty tests” would improve students’ calibration (i.e., knowing when you know something and knowing when you don’t). Chris has his students take several quizzes. In each quiz, students can weight each question for varying amounts of points (e.g., question 1 is easy so I will give it 5 points whereas question 4 is hard so I will only give it 2 points). Then students answer whether they believe they got the question correct or not. After each quiz is graded, a teaching assistant goes over the quiz and discusses with the students why they weighted the question the way they did and why they thought they would or would not get the question correct. Was found that this activity caused his students to become better at knowing when they knew or did not know something.
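To illustrate the general mechanics of a variable-weight quiz with item-by-item confidence judgments, here is a brief Python sketch. It is a simplified illustration under assumed conventions (student-assigned point weights and yes/no correctness predictions), not Was’s actual scoring scheme or data.

```python
from dataclasses import dataclass

@dataclass
class Question:
    weight: int              # points the student chose to stake on this question
    predicted_correct: bool  # student's judgment: "I think I got this one right"
    actually_correct: bool   # graded outcome

def score_quiz(questions):
    """Return (earned points, possible points, knowledge-monitoring accuracy)."""
    earned = sum(q.weight for q in questions if q.actually_correct)
    possible = sum(q.weight for q in questions)
    matches = sum(q.predicted_correct == q.actually_correct for q in questions)
    monitoring_accuracy = matches / len(questions)
    return earned, possible, monitoring_accuracy

# Hypothetical quiz: the student stakes more points on questions she feels sure about.
quiz = [
    Question(weight=5, predicted_correct=True,  actually_correct=True),
    Question(weight=2, predicted_correct=False, actually_correct=False),
    Question(weight=4, predicted_correct=True,  actually_correct=False),
    Question(weight=3, predicted_correct=True,  actually_correct=True),
]
earned, possible, acc = score_quiz(quiz)
print(f"{earned}/{possible} points; monitoring accuracy = {acc:.2f}")  # 8/14 points; 0.75
```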

Similarly, Schumacher and Taraban (2015) discussed the use of the testing effect as a method to improve metacognition. They suggest that the evidence for testing as an instructional method is mixed: when students were repeatedly tested and exposed to questions on multiple exams, only low-achieving students benefited metacognitively.

John Draeger (2015) uses just-in-time teaching in an attempt to improve metacognition. John asks students metacognitive prompting questions (e.g., What is the most challenging part of the reading?), and they submit their answers before coming to class. Although he has not measured the efficacy of this method, students have responded positively to the process.

Parting Questions to Further this Important Conversation

There are many other instructional methods used to increase student metacognition described throughout IwM that are both self-contained and semester long. Please check them out!

But even considering all of what has been presented in this blog and available on IwM, I couldn’t help but leave you with some unanswered questions that I myself have:

  1. What other instructional strategies have you used to increase student metacognition?
  2. If you were to choose between a self-contained or semester-long method, which one would you choose and why? Meaning, what factors would help you determine which method to use? Instructional goals? How closely it relates to course content? Time commitment? Level of student metacognitive knowledge? Level of course?
  3. Once you have chosen a self-contained or semester long method, how should implementation methods differ? That is, what are the best practices used when implementing a self-contained vs. semester long technique?
  4. Finally, instructional strategies for improving metacognition in higher education are often pulled from studies and experiments conducted in K-12 education. Are there any such studies you can think of that would be suitable for testing in higher education? If so, how and why?

References

Beziat, T. (2015). Goal monitoring in the classroom. Retrieved from https://www.improvewithmetacognition.com/goal-monitoring-in-the-classroom/

Chew, S. (2015). Metacognition and scaffolding student learning. Retrieved from https://www.improvewithmetacognition.com/metacognition-and-scaffolding-student-learning/

Draeger, J. (2015). Using Just-in-Time assignments to promote metacognition. Retrieved from https://www.improvewithmetacognition.com/using-just-in-time-assignments-to-promote-metacognition/

Nilson, L. B. (2015). Metacognition and specifications grading: The odd couple? Retrieved from https://www.improvewithmetacognition.com/metacognition-and-specifications-grading-the-odd-couple/

Nuhfer, E. (2015). Developing metacognitive literacy through role play: Edward De Bono’s six thinking hats. Retrieved from https://www.improvewithmetacognition.com/developing-metacognitive-literacy-through-role-play-edward-de-bonos-six-thinking-hats/

Schumacher, J., & Taraban, R. (2015). To test or not to test: That is the metacognitive question. Retrieved from https://www.improvewithmetacognition.com/to-test-or-not-to-test-that-is-the-metacognitive-question/

Was, C. (2014). Testing improves knowledge monitoring. Retrieved from https://www.improvewithmetacognition.com/testing-improves-knowledge-monitoring/

Westmoreland, D. (2014). Science and social controversy—A classroom exercise in metacognition. Retrieved from https://www.improvewithmetacognition.com/science-and-social-controversy-a-classroom-exercise-in-metacognition/

 


Teacher-led Self-analysis of Teaching

Clinical Supervision is a model of supervisor (or peer) review that stresses the benefits of a teacher-led self-analysis of teaching in the post-conference versus a conference dominated by the judgments of the supervisor.  Through self-reflection, teachers are challenged to use metacognitive processes to determine the effects of their teaching decisions and actions on student learning.  The Clinical Supervision model is equally applicable to all levels of schooling and all disciplines. This video walks you through the process.


Reciprocal Peer Coaching for Self-Reflection, Anyone?

By Cynthia Desrochers, California State University Northridge

I once joked with my then university president that I’d seen more faculty teach in their classrooms than she had. She nodded in agreement. I should have added that I’d seen more than the AVP for Faculty Affairs, all personnel committees, deans, or chairpersons. For some reason, university teaching is done behind closed doors, no peering in on peers unless for personnel reviews. We attempted to change that at CSU Northridge when I directed their faculty development center from 1996-2005. Our Faculty Reciprocal Peer Coaching program typically drew a dozen or more cross-college dyads over the dozen semesters it was in existence. The program’s main goal was teacher self-reflection.

I believe I first saw the term peer coaching when reading a short publication by Joyce and Showers (1983). What struck me was their assertion that to have any new complex teaching innovation become part of one’s teaching repertoire required four steps: 1) understanding the theory/knowledge base undergirding the innovation, 2) observing an expert who is modeling how to do the innovation, 3) practicing the innovation in a controlled setting with coaching (e.g., micro-teaching in a workshop) and 4) practicing the innovation in one’s own classroom with coaching. They maintained that without all four steps, the innovation taught in a workshop would likely not be implemented in the classroom. Having spent much of my life teaching workshops about using teaching innovations, these steps became my guide, and I still use them today. In addition, after many years of coaching student teachers at UCLA’s Lab School, I realized that they were more likely to apply teaching alternatives that they identified and reflected upon in the post-conference than ones that I singled out. That is, they learned more from using metacognitive practices than from my direct instruction, so I began formulating some of the thoughts summarized below.

Fast forward many years to this past year, where I co-facilitated a yearlong eight-member Faculty Learning Community (FLC) focused on implementing the following Five Gears for Activating Learning: Motivating Learning, Organizing Knowledge, Connecting Prior Knowledge, Practicing with Feedback, and Developing Mastery [see previous blog]. With this FLC, we resurrected peer coaching on a voluntary basis in order to promote conscious use of the Five Gears in teaching. All eight FLC members not only volunteered to pair up for reciprocal coaching of one another, but they were eager to do so.

I was asked by one faculty member why it is called coaching, since an athletic coach often tells players what to do rather than helping them self-reflect. I responded that it’s because Joyce and Showers’ study looked at the research on training athletes and what that required for skill transfer. They showed the need for many practice sessions combined with coaching in order to achieve mastery of any new complex move, be it on the playing field or in the classroom. However, this point of confusion was noted, so now I refer to the process as Reciprocal Peer Coaching for Self-Reflection. This reflective type of peer coaching applies to cross-college faculty dyads who are seeking to more readily apply a new teaching innovation.

Reciprocal Peer Coaching for Self-Reflection applies all or some of the five phases of the Clinical Supervision model described by Goldhammer (1969), which include: pre-observation conference, observation and data collection, data analysis and strategy, post-observation conference, and post-conference analysis. However, it is in the post-conference phase where much of the teacher self-reflection occurs and where the coach can benefit from an understanding of post-conference messages.

Prior to turning our FLC members loose to peer coach, we held a practicum on how to do it. True to my statement above, I applied Joyce and Showers’ first three steps in our practicum (i.e., I explained the theory behind peer coaching, modeled peer coaching, and then provided micro-practice using a videotaped lesson taught by one of our FLC members). But in the micro-practice, right out of the gate, faculty coaches began telling the teacher how she had used the Five Gears rather than prompting her to reflect upon her own use first. Although I gently provided feedback in an attempt to redirect the post-conferences from telling to asking, it was a reminder of how firmly ingrained this default position has become with faculty, where the person observing a lesson takes charge and provides all the answers when conducting the post-conference. The reasons for this may include 1) prior practice as supervisors, who are typically charged with this role, 2) the need to show their analytic prowess, or 3) the desire to give the teacher a break from doing all the talking. Whatever the reason, we want the teacher doing the reflective analysis of her own teaching and growing those dendrites as a result.

After this experience with our FLC, I crafted the conference-message matrix below and included conversation-starter prompts. Granted, I may have oversimplified the process, but it illustrates key elements for promoting Reciprocal Peer Coaching for Self-Reflection. Note that the matrix is arranged into four types of conference messages: successful and unsuccessful teaching-learning situations where the teacher identifies the topic of conversation after being prompted by the coach (messages #1 and #3), and successful and unsuccessful teaching-learning situations where the coach identifies the topic of conversation after being prompted by the teacher (messages #2 and #4). The goal of Reciprocal Peer Coaching for Self-Reflection is best achieved when most of the post-conference consists of teacher self-reflection; hence, messages #1 and #3 should dominate the total post-conference conversation. Although the order of messages #1 through #4 is a judgment call, starting with message #1 permits the teacher to take the lead in identifying and reflecting upon her conscious use of the Gears and their outcomes, using her metacognition, rather than listening passively to the coach. An exception to beginning with message #1 arises when the teacher is too timid to sing her own praises; in that case, the coach may begin with message #2 once this reluctance becomes apparent. Note further that this model puts the teacher squarely in the driver’s seat throughout the entire post-conference; this is particularly important when it comes to message #4, which is often a sensitive discussion of unsuccessful teaching practices. If the teacher doesn’t want another’s critique at this time, she is told not to initiate message #4, and the coach is cautioned to abide by this decision.

Conference-message matrix for Reciprocal Peer Coaching for Self-Reflection

The numbered points under each of the four types of messages are useful components for discussion during each message in order to further cement an understanding of which Gear is being used and its value for promoting student learning: 1) Identifying the teaching action from the specific, objective data collected by the coach (e.g., written, video, or audio) helps to isolate the cause-effect teaching episode under discussion and its effect on student learning. 2) Naming the Gear (or naming any term associated with the innovation being practiced) increases our in-common teaching vocabulary, which is considered useful for any profession. 3) Discussing the generalization about how the Gear helps students learn reiterates its purpose, fostering motivation to use it appropriately. And 4) crafting alternative teaching-learning practices together for next time expands the teacher’s repertoire.

The FLC faculty reported that their classroom Reciprocal Peer Coaching for Self-Reflection sessions were a success. Specifically, they indicated that they used the Five Gears more consciously after discussing them during the post-conference; that the Five Gears were beginning to become part of their teaching vocabulary; and that they were using the Five Gears more automatically during instruction. Moreover, message #2 offered a unique benefit: having one’s coach identify the teacher’s unconscious use of the Five Gears increased the teacher’s awareness of herself as a learner of an innovation, all of which serves to increase metacognition.

When reflecting upon how we might assist faculty in implementing the most promising research-based teaching-learning innovations, I see a system where every few years we allot reassigned time for faculty to engage in Reciprocal Peer Coaching for Self-Reflection.

References

Goldhammer, R. (1969). Clinical supervision. New York: Holt, Rinehart and Winston.

Joyce, B., & Showers, B. (1983). Power in staff development through research on training. Alexandria, VA: Association for Supervision and Curriculum Development.