Metacognitive Self-Assessment, Competence and Privilege

by Steven Fleisher, Ph.D., California State University Channel Islands

Recently I had students in several of my classes take the Science Literacy Concept Inventory (SLCI), including its self-assessment component (Nuhfer et al., 2017). Science literacy addresses one’s understanding of science as a way of knowing about the physical world. This science literacy instrument also includes self-assessment measures that run parallel with the actual competency measures. Self-assessment skills are among the most important of the metacognitive competencies. Since metacognition involves “thinking about thinking,” the question soon becomes, “but thinking about what?”

Dunlosky and Metcalfe (2009) framed the processes of metacognition across metacognitive knowledge, monitoring, and control. Metacognitive knowledge involves understanding how learning works and how to improve it. Monitoring involves self-assessment of one’s understanding, and control then involves any needed self-regulation. Self-assessment sits at the heart of metacognitive processes since it sets up and facilitates an internal conversation in the learner, for example, “Am I understanding this material at the level of competency needed for my upcoming challenge?” This type of monitoring then positions the learner for any needed control or self-regulation, for instance, “Do I need to change my focus, or maybe my learning strategy?” Further, self-assessment is affective in nature and is central to how learning works. From a biological perspective, learning involves the building and stabilizing of cognitive as well as affective neural networks. In other words, we not only learn about “stuff,” but if we engage our metacognition (specifically self-assessment in this instance), we are enhancing our learning to include knowing about “self” in relation to knowing about the material.

This Improve with Metacognition posting provides information that was shared with my students to help them see the value of self-assessing and to help them understand its relationship with their developing competencies and issues of privilege. Privilege here is defined by factors that influence (advantage or disadvantage) aggregate measures of competence and self-assessment accuracy (Watson et al., 2019). These factors were: (a) whether students were first-generation college students, (b) whether they were non-native English-language students, and (c) whether they had an interest in science.

The figures and tables below result from an analysis of approximately 170 students from my classes. The narrative addresses the relevance of each of the images.

Figure 1 shows the correlation between students’ actual SLCI scores and their self-assessment scores using Knowledge Survey items for each of the SLCI items (KSSLCI). This figure was used to show students that their self-assessments were indeed related to their developing competencies. In Figure 2, students could see how their results on the individual SLCI and KSSLCI items were tracking even more closely than in Figure 1, indicating a fairly strong relationship between their self-assessment scores and actual scores.

scatterplot graph of knowledge survey compared to SLCI scores
Figure 1. Correlation with best-fit line between actual competence measures via a Science Literacy Concept Inventory or SLCI (abscissa) and self-assessed ratings of competence (ordinate) via a knowledge survey of the inventory (KSSLCI) wherein students rate their competence to answer each of the 25 items on the inventory prior to taking the actual test.
scatter plot of SLCI scores and knowledge survey scores by question
Figure 2. Correlation with best-fit line between the group of all my students’ mean competence measures on each item of the Science Literacy Concept Inventory (abscissa) and their self-assessed ratings of competence on each item of the knowledge survey of the inventory (KSSLCI).

Figure 3 demonstrates the differences in science literacy scores and self-assessment scores among groups defined by the number of science courses taken. Students could readily see the relationship between the number of science courses taken and improvement in science literacy. More importantly in this context, students could see that these groups had a reasonably accurate sense of whether or not they knew the information, as indicated by the close overlap of each pair of green and red diamonds. Students learn that larger numbers of participants can provide more confidence about where the true mean actually lies. I can also illustrate the meaning of variation within and between groups. In answering questions about how we know that more data would clarify relationships, I bring up an equivalent figure from our national database that shows the locations of the means within 99.9% confidence and the tight relationship between groups’ self-assessed competence and their demonstrated competence.
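The point about larger samples narrowing the confidence bounds can be illustrated with a quick sketch. The scores below are invented, and the interval uses a simple normal approximation rather than the analysis behind the actual figure:

```python
# Why more participants tighten the diamonds in a categorical plot:
# the 95% CI half-width shrinks roughly as 1/sqrt(n).
from statistics import stdev
from math import sqrt

def ci95_half_width(scores):
    # Normal-approximation half-width of a 95% CI for the mean.
    return 1.96 * stdev(scores) / sqrt(len(scores))

small = [70, 55, 80, 62, 73]   # hypothetical class of n = 5
large = small * 20             # same spread of scores, n = 100

print(f"n=5:   +/- {ci95_half_width(small):.1f} ppts")
print(f"n=100: +/- {ci95_half_width(large):.1f} ppts")
```

With the same spread of scores, the bounds on the mean are several times tighter at n = 100 than at n = 5, which is why the national database locates the true means so precisely.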

categorical plot by number of college science courses completed
Figure 3. Categorical plot of my students in five class sections grouped by their self-identified categories of how many college-level science courses that they have actually completed. Revealed here are the groups’ mean SLCI scores and their mean self-assessed ratings. Height of the green (SLCI scores) and red (KSSLCI self-assessments) diamonds reveals with 95% confidence that the actual mean lies within these vertical bounds.

Regarding Figure 4, it is always fun to show students that there’s no significant difference between males and females in science literacy competency. This information comes from the SLCI national database and is based on over 24,000 participants.

categorical plot by binary gender
Figure 4. Categorical plot from our large national database by self-identified binary gender categories shows no significant difference by gender in competence of understanding science as a way of knowing.

It is then interesting to show students that, in their smaller sample (Figure 5), there is a difference between the science literacy scores of males and females. The perplexed looks on their faces are then addressed by the additional demographic data in Table 1 below.

categorical plot by binary gender for individual class
Figure 5. Categorical plot of just my students by binary gender reveals a marginal difference between females and males, rather than the gender-neutral result shown in Fig. 4.

In Table 1, students could see that the higher science literacy scores for males in their group were not due to gender but rather to the significantly higher proportion of females for whom English is a non-native language. In other words, the women in their group were certainly not less intelligent; they simply had substantial additional challenges on their plates.

Table 1: percentages of male and female students as first generation, English and non-native speaker, and with respect to self-report interest to major in science

Students then become interested in discovering that the women demonstrated greater self-assessment accuracy than did the men, who tended to overestimate (Figure 6). I like to add here, “that’s why guys don’t ask for directions.” I can get away with saying that since I’m a guy. But more seriously, I point out that rather than simply saying women need to improve in their science learning, we might also want to help men improve in their self-assessment accuracy.   

categorical plot by gender including self-assessment data
Figure 6. The categorical plot of SLCI scores (green diamonds) shown in Fig. 5 now adds the self-assessment data (red diamonds) of females and males. The tendency of females to self-assess more accurately, seen here in our class sample, also appears in our national data. Even small samples taken from our classrooms can yield surprising information.

In Figure 7, students could see there was a strong difference in science literacy scores between Caucasians and Hispanics in my classes. The information in Table 2 below was then essential for them to see. Explaining this ethnicity difference offers a wonderful discussion opportunity for students to understand not only the data but also what it reveals about what is going on with others inside their classrooms.

Figure 7. The categorical plot of SLCI scores by the two dominant ethnicities in my classroom. My campus is a Hispanic Serving Institution (HSI). The differences shown are statistically significant.

Table 2 showed that the higher science literacy scores in this sample were not simply due to ethnicity but were impacted by significant between-group differences in the numbers of first-generation students and students for whom English is a non-native language. These students are not less capable; in this context, they simply have not had the benefit of a history of educational discourse in their homes and are navigating the challenges of English language learning.

Table 2: percentage of white and hispanic students who report to be first generation students, English as non-native speakers, and interested in majoring in science.

When shown Figure 8, which includes self-assessment scores as well as SLCI scores, students were interested to see that both groups demonstrated fairly accurate self-assessment skills, but that Hispanics had even greater self-assessment accuracy than their Caucasian colleagues. Watson et al. (2019) noted that strong self-assessment accuracy for minority groups may come about from an understandable need for caution.

categorical plot by ethnicity and including self-assessment
Figure 8. The categorical plot of SLCI scores and self-assessed competence ratings for the two dominant ethnicities in my classroom. Groups’ collective feelings of competence, on average, are close to their actual competence. Explaining these results offered a wonderful discussion opportunity for students.

Figure 9 shows students that self-assessment is real. In seeing that most of their peers fall within an adequate range of self-assessment accuracy (between +/- 20 percentage points), students begin to see the value of putting effort into developing their own self-assessment skills. In general, results from this group of my students are similar to those we get from our larger national database (See our earlier blog post, Paired Self-Assessment—Competence Measures of Academic Ranks Offer a Unique Assessment of Education.)
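The accuracy measure behind distributions like Figure 9 is simple arithmetic: the signed difference, in percentage points, between self-assessed and demonstrated competence. The sketch below uses invented scores, not the actual class data:

```python
# Self-assessment accuracy: self-assessed minus actual competence, in
# percentage points (ppts). Scores below are hypothetical.
self_assessed = [72, 60, 55, 90, 40, 68, 81, 50]
actual        = [65, 70, 50, 62, 45, 66, 77, 58]

# Positive values = overestimation; negative = underestimation.
accuracy = [sa - ac for sa, ac in zip(self_assessed, actual)]

# "Good to adequate" self-assessment falls within +/- 20 ppts.
within_band = [a for a in accuracy if abs(a) <= 20]
print(accuracy)
print(f"{len(within_band)}/{len(accuracy)} students within +/- 20 ppts")
```

In this toy sample, one student overestimates by 28 ppts and falls outside the band; the rest land within it, which is the pattern most classes show.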

distribution of self-assessment accuracy for individual course
Figure 9. The distribution of self-assessment accuracy of my students in percentage points (ppts) as measured by individuals’ differences between their self-assessed competence by knowledge survey and their actual competence on the Concept Inventory.

Figure 10 below gave me the opportunity to show students the relationship between their predicted item-by-item self-assessment scores (Figure 9) and their postdicted global self-assessment scores. Most of the scores fall between +/- 20 percentage points, indicating good to adequate self-assessment. In other words, once students know what a challenge involves, they are pretty good at self-assessing their competency.

distribution of self-assessment accuracy for individual course after taking SLCI
Figure 10. The distribution of self-assessment accuracy of my students in percentage points (ppts) as measured by individuals’ differences between their postdicted ratings of competence after taking the SLCI and their actual scores of competence on the Inventory. In general, my students’ results are similar in self-assessment measured in both ways.

In order to help students further develop their self-assessment skills and awareness, I encourage them to write down how they feel they did on tests and papers before turning them in (postdicted global self-assessment). Then they can compare their predictions with their actual results in order to fine-tune their internal self-assessment radars. I find that an excellent class discussion question is “Can students self-assess their competence?” Afterward, reviewing the above graphics and results becomes especially relevant. We also review self-assessment as a core metacognitive skill that ties to an understanding of learning and how to improve it, the development of self-efficacy, and how to monitor their developing competencies and control their cognitive strategies.

References

Dunlosky, J. & Metcalfe, J. (2009). Metacognition. Sage Publications Inc., Thousand Oaks, CA.

Nuhfer, E., Fleisher, S., Cogan, C., Wirth, K., & Gaze, E. (2017). How Random Noise and a Graphical Convention Subverted Behavioral Scientists’ Explanations of Self-Assessment Data: Numeracy Underlies Better Alternatives. Numeracy, Vol 10, Issue 1, Article 4. DOI: http://dx.doi.org/10.5038/1936-4660.10.1.4

Watson, R., Nuhfer, E., Nicholas Moon, K., Fleisher, S., Walter, P., Wirth, K., Cogan, C., Wangeline, A., & Gaze, E. (2019). Paired Measures of Competence and Confidence Illuminate Impacts of Privilege on College Students. Numeracy, Vol 12, Issue 2, Article 2. DOI: https://doi.org/10.5038/1936-4660.12.2.2


The Evolution of Metacognition in Biological Sciences

By Lindsay Doukopoulos, Assistant Director of the Biggio Center for the Enhancement of Teaching and Learning at Auburn University, and blog mini-series editor.

Much of the literature on metacognition focuses on strategies that faculty can use to improve metacognitive skills in their students and the benefits of such skills. Our mini-series tackles a different kind of problem: how can a department redesign its curriculum to improve metacognition for all students and how will it know if improvement has actually occurred?  We believe our efforts can inform others across a variety of disciplines.

Our answer to this question takes the form of a case study in five parts about our collaborative and ongoing efforts to redesign the Department of Biological Sciences’ undergraduate curriculum and program assessment with a goal of improving metacognition for its students and demonstrating that improvement with data. We use a narrative structure to present the key inflection points in this process as well as lessons learned and best practices from our diverse perspectives.

Our collaborators include: Associate Dean for Academic Affairs for the College of Sciences and Mathematics, Bob Boyd (also a Biological Sciences professor and formerly the department’s Undergraduate Program Officer); Associate Director of Academic Assessment, Katie Boyd; Associate Director of the Office of University Writing, Chris Basgier; Chair of the Department of Biological Sciences, Scott Santos; and Assistant Director of the Biggio Center for the Enhancement of Teaching and Learning, Lindsay Doukopoulos.  

This timeline provides an overview of our efforts while our individual posts go into more detail about specific strategies and outcomes:  

Ideation: 

June 2016: Department leaders attend PULSE Institute and decide to make metacognition a student learning outcome (SLO) for all undergraduate programs in the Department of Biological Sciences (hereafter, Biology) 

May 2017: Program assessment reports at this time include only two student learning outcomes (metacognition not one of them) for each of the three undergraduate programs in Biology 

August 2017: Faculty retreat led by NSF Vision & Change experts introducing metacognitive teaching strategies  

Commitment: 

October 2017: Learning Improvement Initiative launched by Biggio Center and Office of Academic Assessment: Biology proposes to improve SLO 6 – Metacognition  

Spring 2018: Biology’s curriculum committee develops a plan for improvement and creates an ideal (“aspirational”) curriculum map to share at the 2018 fall faculty retreat 

Lindsay Doukopoulos leading faculty development on metacognition at the 2018 Biology Faculty Retreat

Conflict: 

August 2018: Faculty retreat, aka “Metacognition Massacre” – widespread faculty rejection of the metacognition SLO on the curriculum map 

A New Approach: 

Fall 2018: A three-part workshop series created by Office of University Writing (OUW) and the Biggio Center leads faculty to redefine the metacognition SLO and introduces strategies to support faculty teaching  

Turning Point:  

December 2018: Outcomes of the workshop series, including the new definition of SLO 6, are presented at a faculty meeting and the faculty vote to approve the new definition  

Assessing Metacognition:  

January – April 2019: Office of Academic Assessment and the Biggio Center lead Biology’s curriculum committee in creating a metacognitive questionnaire for graduating students and a rubric to assess the level of metacognition evidenced in the responses 

Improving Metacognition: 

Summer 2019: Biology invests in comprehensive strategy to promote metacognition across the curriculum using ePortfolios and several faculty participate in an intensive course redesign program 

What now?  

Fall 2019: OUW and Biggio provide ongoing support of teaching interventions to improve metacognition; Office of Academic Assessment provides ongoing support of the assessment of this work 

What’s next? 

Spring 2020: Gather baseline data on graduates’ metacognitive capabilities 

Goals: Based on our efforts and an ongoing collection of data, we expect to see increases in students’ metacognitive abilities over time 


How do you know you know what you know?

by Patrick Cunningham, Ph.D., Rose-Hulman Institute of Technology

Metacognition involves monitoring and controlling one’s learning and learning processes, which are vital for skillful learning. In line with this, Tobias and Everson (2009) detail the central role of accurate monitoring in learning effectively and efficiently. Metacognitive monitoring is foundational for metacognitive control through planning for learning, selecting appropriate strategies, and evaluating learning accurately (Tobias & Everson, 2009).

Hierarchy of Metacognitive Control, with Monitoring Knowledge at the bottom, followed by Selecting Strategies, then Evaluating Learning, with Planning at the top

Figure 1 – Hierarchy of metacognitive regulatory processes. Adapted from Tobias and Everson (2009).

Unfortunately, students can be poor judges of their own learning, or fail to engage in judging their learning at all, and therefore often fail to recognize their need for further engagement with material or take inappropriate actions based on inaccurate judgements of learning (Ehrlinger & Shain, 2014; Winne & Nesbit, 2009). If a student inaccurately assesses their level of understanding, they may erroneously spend time with material that is already well known, or they may employ ineffective strategies, such as a rehearsal strategy (e.g., flash cards) to build rote memory when they really need an elaborative strategy (e.g., explaining the application of concepts to a new situation) to build richer integration with their current knowledge. This poor judgement extends to students’ perceptions of the effectiveness of their learning processes, as noted in the May 14th post by Sabrina Badali, Investigating Students’ Beliefs about Effective Study Strategies. There Badali found that students were more confident in using massed practice over interleaved practice even though they performed worse with massed practice.

Fortunately, we can help our students to develop more accurate self-monitoring skills. The title question is one of my go-to responses to student claims of knowing in the face of poor performance on an assignment or exam. I introduced it in my April 4th blog post, Where Should I Start with Metacognition? It gently but directly asks for evidence of knowing. In our work on an NSF grant to develop transferable tools for engaging students in their metacognitive development, my colleagues and I found that students struggle to cite concrete and demonstrable (i.e., objective) evidence for their learning (Cunningham, Matusovich, Hunter, Blackowski, & Bhaduri, 2017). It is important to gently persist. If a student says they “reviewed their notes” or “worked many practice problems,” you can follow up with, “What do you mean by review your notes?” or “Under what conditions were you working the practice problems?” The goal is to learn more about the student’s approach without making assumptions, and to help the student discover any mismatches.

We can also spark monitoring with pedagogies that help students accurately uncover present levels of understanding (Ehrlinger & Shain, 2014). Linda Nilson (2013) provides several good suggestions in her book Creating Self-Regulated Learners. Retrieval practice takes little time and is quite versatile. Over a few minutes a student recalls all that they can about a topic or concept, followed by a short period of review of notes or a section of a book. The whole process can be done individually, or as individual recall followed by pair or group review. Things that are well known appear on the recall list with elaborating detail. Less well-known material is present, but in sparse form. Omissions indicate significant gaps in knowledge. The practice is effortful, and students may need encouragement to persist with it.

I have used retrieval practice at the beginning of classes before continuing on with a topic from the previous day. It can also be employed as an end-of-class summary activity. I think the value added is worth the effort. Because of its benefits and compactness, I also encourage students to use retrieval practice as a priming activity before regular homework or study sessions. Using it in class can also lower students’ barriers to using it on their own, because it makes it more familiar and it communicates the value I place on it.

Nilson (2013) also offers “Quick-thinks” and Think Aloud problem-solving. “Quick-thinks” are short lesson breaks and can include “correct the error” in a short piece of work, “compare and contrast”, “reorder the steps”, or other activities. A student can monitor their understanding by comparing their answer to the instructor’s answer or class responses. Think Aloud problem-solving is a pair activity in which one student talks through their problem-solving process while the other student listens and provides support when needed, for example by prompting the next step or asking a guiding question. Students take turns with the roles. A student’s fluency in solving the problem or providing support indicates deeper learning of the material. If the problem-solving or the support is halting and sparse, then those concepts are less well known by the student. As my students often study in groups outside of class, I recommend that they have the person struggling with a problem or concept talk through their thinking out loud while the rest of the group provides encouragement and support.

Related to Think Alouds, Chiu and Chi (2014) recommend Explaining to Learn. A fluid explanation with rich descriptions is consistent with deeper understanding. A halting explanation without much detail uncovers a lack of understanding. I have used this in various ways. In one form, I have one half of the class work one problem and the other half work a different problem or a variant of the first. Then I have them form pairs from different groups and explain their solutions to one another. Both students are familiar with the problems, but they have a more detailed experience with one. I also often use this as I help students in class or in my office. I ask them to talk me through their thinking up to the point where they are stuck, and I take the role of the supporter.

The strategies above provide enhancements to student learning in their own right, but they also provide opportunities for metacognitive monitoring – checking their understanding against a standard or seeking objective evidence to gauge their level of understanding. To support these metacognitive outcomes I make sure to explicitly draw students’ attention to the monitoring outcomes when I use pedagogies to support monitoring. I am also transparent about this purpose and encourage students to seek better evidence on their own, so they can truly know what they know.

As you consider adding activities to your course that support accurate self-assessment and monitoring, please see the references for further details. You may also want to check out Dr. Lauren Scharff’s post “Know Cubed” – How do students know if they know what they need to know? In this post Dr. Scharff examines common causes of inaccurate self-assessment and how we might be contributing to it. She also offers strategies we can adopt to support more accurate student self-assessment. Let’s help our student generate credible evidence for knowing the material, so they can make better choices for their learning!

References

Chiu, J. L. & Chi, M. T. H. (2014). Supporting Self-Explanation in the Classroom. In V. A. Benassi, C. E. Overson, & C. M. Hakala (Eds.), Applying science of learning in education: Infusing psychological science into the curriculum. Retrieved from the Society for the Teaching of Psychology web site: http://teachpsych.org/ebooks/asle2014/index.php

Cunningham, P., & Matusovich, H. M., & Hunter, D. N., & Blackowski, S. A., & Bhaduri, S. (2017), Beginning to Understand Student Indicators of Metacognition.  Paper presented at 2017 ASEE Annual Conference & Exposition, Columbus, Ohio. https://peer.asee.org/27820

Ehrlinger, J. & Shain, E. A.  (2014). How Accuracy in Students’ Self Perceptions Relates to Success in Learning. In V. A. Benassi, C. E. Overson, & C. M. Hakala (Eds.). Applying science of learning in education: Infusing psychological science into the curriculum. Retrieved from the Society for the Teaching of Psychology web site: http://teachpsych.org/ebooks/asle2014/index.php

Nilson, L. B. (2013). Creating Self-Regulated Learners: Strategies to Strengthen Students’ Self-Awareness and Learning Skills. Stylus Publishing: Sterling, VA.

Tobias, S. & Everson, H. (2009). The Importance of Knowing What You Know: A Knowledge Monitoring Framework for Studying Metacognition in Education. In Hacker, D., Dunlosky, J., & Graesser, A. (Eds.) Handbook of Metacognition in Education. New York, NY: Routledge, pp. 107-127.

Winne, P. & Nesbit, J. (2009). Supporting Self-Regulated Learning with Cognitive Tools. In Hacker, D., Dunlosky, J., & Graesser, A. (Eds.) Handbook of Metacognition in Education. New York, NY: Routledge, pp. 259-277.



Changing Campus Culture with the Ace-Your-Course Challenge

In the final post of the guest blog series on “Working with Faculty to Promote Metacognition,” Dr. Eric Kaldor discusses lessons learned from the implementation of a campus-wide metacognition program inspired by Saundra McGuire’s work. The associated research project was awarded the Robert J. Menges Award for Outstanding Research in Educational Development by the Professional and Organizational Development (POD) Network.

by Eric Kaldor, Ph.D.; Associate Director, Sheridan Center for Teaching & Learning, Brown University

For many faculty members, the “fact” that some students are just not capable of college-level learning remains part of the taken-for-granted assumptions embedded in the culture of disciplines and campuses. Despite significant efforts to share insights from the scholarship on metacognition and growth mindsets (e.g. Doyle & Zakrajsek, 2013; Dweck, 2016; McGuire, 2015; Nilson, 2013), campus cultures are slow to change, and fixed mindsets continue to dominate many institutions. This post describes efforts to change the culture at the University of Rhode Island, the communication strategy we used, and some lessons learned.

With approximately 14,000 undergraduate students and 1,000 full- and part-time faculty, the University of Rhode Island is a challenging setting to advance culture change. Our story began with a conversation with Melvin Wade, former Director of the Multicultural Student Services Center (MSSC). I was working in the Office for the Advancement of Teaching & Learning (ATL) and planning for Saundra McGuire to visit our campus. I was particularly concerned about filling our 1,000-person auditorium with students for her “Metacognition is Key” workshop. When I asked for his advice, Melvin insisted we must ensure her visit had a lasting impact on our campus. Toward this end, we assembled a group of professional staff and graduate students from ATL, the MSSC, the Academic Enhancement Center, First-Year Programs, and Professional Advising. Over a series of conversations, this informal group conceived of something we came to call the Ace-Your-Course (AYC) Challenge. We assumed we would only run the AYC Challenge once as a companion to Dr. McGuire’s workshop. Instead, a snowstorm gave the Challenge a much longer life.

Building on the McGuire Model

We designed the AYC Challenge to extend students’ metacognitive experience and reflections beyond Dr. McGuire’s workshop. We developed the AYC Challenge as four weekly self-assessment surveys (for detailed description see Kaldor & Swanson, 2019) to create additional metacognitive experiences (Flavell, 1979) by encouraging students to:

  1. Test learning strategies relevant to them individually.
  2. Engage in key practices for metacognitive reflection: observation, description, evaluation, and action planning.
  3. Feel part of a larger community working to grow as learners.

When a snowstorm postponed Dr. McGuire’s visit to the next semester, our multi-unit team led her workshop twice using slides and talking points from her book (McGuire, 2015) and invited students to participate in the AYC Challenge. Of the 240 students attending a workshop, 50 completed all four weeks of the challenge. After we shared the positive results from our pilot with faculty members, many encouraged their students to attend Dr. McGuire’s rescheduled workshop in September 2017. Some went further and agreed to share grade data as part of an IRB-approved study to examine how participation affected grades. We specifically identified a set of gateway science courses from Chemistry, Biology, and Nutrition and Food Sciences that have large enrollments of first-year students.

Over 1,000 students attended Dr. McGuire’s workshop with some in remote viewing locations, and 202 of those completed the second AYC Challenge. The self-reported results for this larger group were strikingly similar to those from students in the pilot AYC Challenge when we led the workshops. Holly Swanson and I analyzed final grades for 979 students in the eight gateway science sections (347 attended the workshop and of those 55 completed the challenge) using OLS regression with controls for several predictors of academic performance including high school GPA and exam 1 z-score. Compared with their peers who did not attend the workshop or participate in the challenge, attending the workshop and completing the AYC Challenge was associated with a final course grade half of a letter grade higher (Kaldor & Swanson, 2018).
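The shape of an analysis like the one above can be sketched with ordinary least squares on invented data. Everything below is hypothetical (simulated students, a built-in half-letter-grade effect, and plain least squares rather than the study’s full model with its complete set of controls):

```python
# Hypothetical illustration of regressing final grade on AYC Challenge
# completion while controlling for prior-performance measures.
# All data here are simulated; this is not the study's dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 200
hs_gpa = rng.normal(3.2, 0.4, n)      # control: high school GPA
exam1_z = rng.normal(0.0, 1.0, n)     # control: exam 1 z-score
completed = rng.integers(0, 2, n)     # 1 = attended workshop + finished AYC

# Simulate grades with a true completion effect of 0.5 grade points.
grade = (1.0 + 0.5 * hs_gpa + 0.4 * exam1_z
         + 0.5 * completed + rng.normal(0, 0.3, n))

# OLS via least squares: intercept + controls + completion indicator.
X = np.column_stack([np.ones(n), hs_gpa, exam1_z, completed])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)
print(f"estimated completion effect: {beta[3]:.2f} grade points")
```

Because the controls absorb prior-performance differences, the coefficient on the completion indicator recovers something close to the simulated half-grade effect; the published analysis does the analogous estimation on real grade data.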

Inclusive and Extensive Communication

Much of our success originated from a spiral of communication that grew outwards from a core group of professional staff and graduate students who became involved in planning for Dr. McGuire’s originally scheduled visit. Our colleagues working in various student support services helped develop a plan to reach students and motivate them to attend the workshop and participate in the challenge. These colleagues advised us on when to hold the workshop, how to market our efforts, and what kinds of messages would appeal to students.

One critical piece of advice was that students were more likely to attend if instructors offered extra credit. In the faculty development office, we knew that instructors of large-enrollment courses would only offer extra credit if it did not add significant work. Using Google Forms, a mail-merge add-on, card-swipe readers, and course rosters, we developed a system for students to pre-register, receive reminder emails, and swipe their ID cards after the workshop. With this system in place, instructors for over 30 courses received a list of student attendees within a week of the workshop.

To nudge students who attended the workshop to start the Challenge and complete all four weeks, we used two techniques. First, students were told that completing all four weeks would make them eligible for a drawing for ten $100 gift cards to the campus bookstore. Second, we launched the Challenge at the end of the workshop by having students select one or more strategies to try on a Google Form.

Four of the ten winners of a raffle for students who completed the Ace Your Course Challenge.

The next spiral outwards involved engaging more faculty in a conversation on the powerful ways they could help their students learn. Prior initiatives that had promoted Dweck’s (2016) insights on growth mindsets had primed many faculty and staff for these conversations. Specifically, they wanted to know what else they could do beyond promoting a growth mindset, and a metacognitive approach to learning strategies offered them concrete answers.

In addition, faculty members who had moved away from fixed mindsets about who could succeed in their courses shared their insights on how to approach their still-skeptical peers. We developed a strategy of presenting quantitative data alongside student voices to describe the student experience (examples are available here: https://web.uri.edu/atl/ace-your-course-challenge/). Initially, our quantitative data were limited to student self-reports. With the benefit of a snowstorm, we had the chance to organize an IRB-approved research project to answer important questions that skeptics raised.

As we shared this data on campus, we were asked to try different permutations of the Metacognition Workshop plus AYC Challenge in two different settings – a support program for conditionally readmitted students and two gateway chemistry courses. In addition, we were asked to offer workshops for professional staff and faculty so they could include McGuire’s approach in their programs and courses.

One of the most successful workshops, “Teach Your Students How to Learn in 50 Minutes,” provided an annotated version of Dr. McGuire’s slides with breakout discussions about the key messages that motivate students. It led many instructors to experiment with incorporating elements of her metacognitive approach to learning strategies into their courses.

Some Lessons Learned and Suspected

Each AYC Challenge has generated new data and insights into the potential for URI students to make significant gains in their metacognition. This new data has generated new conversations, which have led to variations on the McGuire workshop and/or the AYC Challenge. This has been a fruitful if unintended process.

Our skeptical internal voice continues to ask how we could nudge more students into participating. We noticed lower participation rates for students from historically marginalized groups in our gateway science course study. This led us to experiment with embedding the workshop plus challenge into courses, but our early experience raised many concerns around overloading instructors and maintaining fidelity with the core AYC challenge experience.

In a promising next iteration, my URI colleague Michelle Fontes-Barros has suggested a partnership with student organizations and clubs, particularly STEM affinity groups for students from historically marginalized groups. Convinced of the value, a student group might sponsor a workshop in a regular meeting space. Student leaders might promote peer commitments to complete the AYC Challenge. Past AYC Challenge participants might help present the workshop and send messages during the Challenge to encourage persistence. This next iteration has the potential to be much more student-centered, but it will be important to critically evaluate the student experience and share results with the wider university community to energize the campus conversation on metacognitive development.

References

Doyle, T., & Zakrajsek, T. (2013). The new science of learning: How to learn in harmony with your brain. Sterling, VA: Stylus Publishing.

Dweck, C. S. (2016). Mindset: The new psychology of success (Updated Edition). New York, NY: Ballantine Books.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive developmental inquiry. American Psychologist, 34(10), 906–911.

Kaldor, E., & Swanson, H. (2018, November). A campus-wide strategy to develop metacognition in gateway science courses. Paper presented at the POD Network Conference, Portland, Oregon.

Kaldor, E., & Swanson, H. (2019). How can you elevate metacognition on your campus? Try the Ace-Your-Course Challenge. The National Teaching & Learning Forum, 28(2), 5–7.

McGuire, S. Y. (2015). Teach students how to learn: Strategies you can incorporate into any course to improve student metacognition, study skills, and motivation. Sterling, VA: Stylus Publishing.

Nilson, L. B. (2013). Creating self-regulated learners: Strategies to strengthen students’ self-awareness and learning skills. Sterling, VA: Stylus Publishing.


Who is Qualified to Teach Metacognition?

In the second post of the “Working with Faculty to Promote Metacognition” guest series, Dr. Nirmal Trivedi discusses several ways he helps a diverse set of instructors with varying metacognition experience integrate the topic into their first-year seminar courses. For his work with first-year seminars, Dr. Trivedi received the 2018 Excellence in Teaching First-Year Seminars Award from the National Resource Center for The First-Year Experience and Students in Transition.

by Nirmal Trivedi, Ph.D.
Director, First-Year Seminars and Assistant Professor of English
Department of First-Year and Transition Studies
Kennesaw State University 

Who is qualified to teach metacognition? If we agree that teaching the concept often results in improved academic performance, shouldn’t all faculty members be trained on how, when, and why metacognition should be embedded into their courses, regardless of content area?

At Kennesaw State University, we’ve had a unique opportunity to redesign our First-Year Seminar course to include a heavy focus on metacognition. This 3-credit academic seminar, which is largely uniform in content and required of most first-time students, serves approximately 3,500 students each fall semester, with between 65 and 80 part-time and full-time faculty teaching the course. The vast majority of these instructors do not have a background in the psychology of human learning, and many have either never taught college students or have taught them only beyond the first year.

Thankfully, student testimonials reflecting positive experiences with our seminar’s focus on metacognition have intrigued those who are new to metacognition and convinced skeptical faculty of the value of teaching the concept and its practice. Recent popularization of the concept by Professor Emerita of Chemical Education Dr. Saundra McGuire has helped to demystify the term for students and educators alike. Her two books, Teach Students How to Learn and Teach Yourself How to Learn, written for faculty and students, respectively, provide guidance for the uninitiated. McGuire effectively shows why metacognition is essential for all educators to know and teach. How one builds a local cohort of metacognition experts without disciplinary expertise in educational psychology, however, can be elusive.

As someone new to teaching metacognition—a kind of “metacognition convert” myself—I can relate to faculty’s need for a clear rationale for changing teaching methods. In this post, I outline five steps that we have used at KSU to help faculty incorporate metacognition into their own teaching. In our First-Year Seminar, we train faculty on how and when to take each step through an initial training session at the beginning of the semester and by providing template assignments with embedded reflection questions that call for metacognitive thinking. This approach has helped us build a growing cohort of local metacognition advocates.

Step 1: Make Student Learning Transparent to Faculty

Most faculty agree that they want to see more engagement with class material from their students. Most want to see their students read carefully, practice their writing skills, and self-evaluate how and why certain learning strategies work or don’t work. Perhaps most of all, faculty want more time to discuss ideas and less time guessing how much or how little students have learned by the end of each class session.

In our program, we train faculty to make learning transparent by asking students to write two “takeaways” at the end of a class session. The prompt is: “What are two points you will remember from today’s lecture, and what is one question you still have about the topic?” The exercise reveals what students actually remember from a class session. In our faculty training, we incorporate the takeaways after each component of the training to showcase how the exercise works and to help the presenters clarify their own message. Seeing how this simple metacognitive exercise assists their own learning by making it transparent, faculty begin to ask how much of what they themselves teach is actually absorbed by their students.

The takeaways also serve as a useful starting point for the next session’s lecture or discussion. Faculty have found that their class preparation is significantly simplified because they are better informed about what their students do and don’t understand. It’s important that these takeaways remain anonymous to allow for authentic student responses.

Step 2: Relate to the Student Experience

We train faculty to respond to the students’ desire to learn deeply by focusing learning outcomes not only on teaching content, but on how to learn the content. For example, we prepare faculty to show their students how to read the course textbook through strategic skimming, annotating, and self-testing. In one component of the training, each faculty member is taught a short lesson about how to read actively (as if the faculty were the students) followed by a series of student testimonials reflecting on the lesson’s impact.

From the student perspective, it matters less that the faculty member is a metacognition expert than whether he or she truly cares about their learning. We know how underprepared many students are for college academics, but we often neglect to recognize that many students crave to learn and just need appropriate challenge and support. As one student put it in a takeaway after an active reading lesson, “Why don’t they teach this earlier in the school system?”

Step 3: Conduct a Learning Demonstration

Faculty can be students too. In addition to demonstrating how to teach active reading, we introduce metacognition to faculty through common learning experiments such as the “Count the Vowels” activity or a “Levels of Processing” activity. The goal of these exercises is to emphasize how durable memory depends on how deeply our brains process information.

Faculty tend to value the impact of these exercises as they themselves are in the process of mastering the new content.

Step 4: Demonstrate Value of Process Alongside Content

Of course, it is essential that faculty understand that conceptual knowledge about metacognition is distinct from practicing metacognitive techniques (Pintrich, 2002). The distinction makes way for a productive discussion about the amount of content required for each course.

For some time, teaching scholars have been urging faculty to consider how much specific course content is necessary. The Association of American Colleges & Universities, for example, conceptualized its “Essential Learning Outcomes” (ELOs) as balancing content knowledge with skill-based actions. In a course like a First-Year Seminar that makes metacognition and its practice explicit, achieving an ELO like “developing skills for lifelong learning” becomes quite feasible.

Each of our template assignments and rubrics includes prompts for students to explain how they learned the material as well as what they have learned.

Step 5: Make the Case for Equity in Learning Skills

Ultimately, teaching students how we learn can bridge the gap between those who have had opportunities to explicitly practice these metacognition techniques in secondary school and those who have never encountered them before. The responsibility of educators is all the more important since even non-experts in learning theory can learn and disseminate the techniques with minimal training. Elaborative interrogation, self-explanation, and practice testing, for example, are among the most impactful learning strategies and least complicated to teach (Dunlosky, Rawson, Marsh, Nathan, & Willingham, 2013).

In sum, faculty may at first feel like they have to learn a new field to teach metacognition. We tell them that they may already be teaching students these skills, and that once they see the scholarly basis for these techniques, they can teach them with confidence. In a post-semester reflection on the impact of incorporating metacognition and its practice for the first time with an assignment, one faculty member made a comment that was echoed by several instructors—that “it made more of a difference to my students than any other assignment I’ve ever taught.”

References

Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving Students’ Learning with Effective Learning Techniques: Promising Directions From Cognitive and Educational Psychology. Psychological Science in the Public Interest: A Journal of the American Psychological Society, 14(1), 4–58. https://doi.org/10.1177/1529100612453266

Pintrich, P. R. (2002). The Role of Metacognitive Knowledge in Learning, Teaching, and Assessing. Theory Into Practice, 41(4), 219–225. https://doi.org/10.1207/s15430421tip4104_3


Promoting Metacognition Across the Institution through our Partnerships with Faculty: The Educational Developer’s Role

by Hillary H. Steiner, Ph.D., Kennesaw State University

Dr. Hillary Steiner is the Interim Associate Director for the Scholarship of Teaching and Learning (SoTL) for the Center for Excellence in Teaching and Learning at Kennesaw State University in Georgia, USA. She is our Summer 2019 guest editor for a blog post series that shares case studies across three institutions. These case studies demonstrate that educational developers can be agents of change within their institutions with respect to supporting the development of metacognition.


How can we ensure students know about metacognition? By promoting it to the faculty who teach them. My students often joke that “metacognition” is my favorite word because they hear it so often in the classroom. The faculty on my campus might be starting to think the same thing, as I integrate the concept into so much of what I do. In my dual role as an educational developer and a faculty member with research interests in the application of educational psychology to higher education, I consider myself a metacognition advocate.

My advocacy for metacognition branches out through all levels of my institution—featuring prominently in the activities of our Center for Excellence in Teaching and Learning and trickling into my own classroom through major assignments as well as everyday conversations with students. It is central to my own Scholarship of Teaching and Learning, as well as the SoTL of many of the faculty members with whom I work. In order for metacognition to take hold in an institution’s culture, we must ensure students and faculty know about its power.

Group picture at new faculty orientation at Kennesaw State University

Our university is one of many that have invited metacognition advocate Dr. Saundra McGuire (McGuire, 2015; 2018) to speak to students and faculty, and the buzz created around her visit has generated considerable interest in the concept. Many other colleges and universities have experienced a similar groundswell of support for the idea’s application to the classroom. Those of us who work in educational development roles can capitalize on this attention to metacognition by helping faculty who are unfamiliar with the concept realize the importance of a reflective, goal-directed approach to one’s own learning and performance. Ultimately, this advocacy can change the culture of an institution by transforming, in small ways, the way instructors teach and students learn. In this guest blog series on “Working with Faculty to Promote Metacognition,” three authors offer their thoughts on promoting metacognition at the student, faculty, and institutional levels through their partnerships with instructors.

First, Dr. Nirmal Trivedi, Director of First-Year Seminars at Kennesaw State University (KSU), writes about the ways in which he helps faculty—many of whom are part-time instructors from outside academia and initially unfamiliar with metacognition—infuse metacognitive practices into their courses, with a goal of changing students’ approaches to studying. First-year seminars at KSU have been transformed to include metacognition as a key focus, which has helped many students successfully navigate the college transition. This transformation earned the program the 2018-19 Momentum Year Award from the University System of Georgia, given to the program that best encourages student achievement in the first year of college.

Second, Valencia Gabay, educational consultant and doctoral student at Indiana Wesleyan University, writes about establishing communities of practice with faculty at a fully online institution to promote metacognition through the instructors’ own reflections on teaching. By focusing on ways the instructors themselves can be metacognitive and using a model from organizational development (Algozzini, Gabay, Voyles, Bessolo, & Batchelor, 2017), she modeled reflective practice in a way that is transferable from instructor to student.

Finally, Dr. Eric Kaldor, Associate Director for the Sheridan Center for Teaching and Learning at Brown University and formerly at the University of Rhode Island (URI), describes the way in which URI took advantage of the buzz surrounding Dr. McGuire’s visit to his campus, creating an institution-wide program that changed the culture of the university at large. Particularly important to this effort was the support and communication provided by various campus partners that made it easier for faculty to understand and implement changes in their curricula.

Readers of this blog series will find useful suggestions to help them ensure that the word gets out about metacognition on their campuses. Educational developers can be agents of change within our institutions because of our relationships with many of the institution’s stakeholders. Through our partnerships with faculty, we have an indirect, but still palpable, influence on student learning (Condon, Iverson, Manduca, Rutz, & Willett, 2016). And as metacognitive practitioners ourselves, we can practice what we preach, engaging in reflective and purposeful analysis of our own messages to the academy about how people learn.

References

Algozzini, L., Gabay,V., Voyles, S., Bessolo, K., & Batchelor, G. (2017). Group coaching and mentoring: A framework for fostering organizational change. Campbell, CA: FastPencil, Inc. 

Condon, W., Iverson, E. R., Manduca, C. A., Rutz, C., & Willett, G. (2016). Faculty development and student learning: Assessing the connections. Bloomington, IN: Indiana University Press.

McGuire, S.Y. (2015). Teach students how to learn: Strategies you can incorporate into any course to improve student metacognition, study skills, and motivation. Sterling, VA: Stylus.

McGuire, S.Y. (2018). Teach yourself how to learn: Strategies you can use to ace any course at any level. Sterling, VA: Stylus.


Setting Common Metacognition Expectations for Learning with Your Students

by Patrick Cunningham, Ph.D., Rose-Hulman Institute of Technology

We know that students’ prior subject knowledge impacts their learning in our courses. Many instructors even give prior knowledge assessments at the start of a term and use the results to tailor their instruction. But have you ever considered the impact of students’ prior knowledge and experiences with learning on their approaches to learning in your course? It is important for us to recognize that our students are individuals with different expectations and learning preferences. Encouraging our students’ metacognitive awareness and growth can empower them to target their own learning needs and establish common aims for learning.

image of target with four colored arrows pointed at the center

Among other things, our students often come to us having experienced academic success using memorization and pattern-matching approaches to material, i.e., rehearsal strategies. Because they have practiced these approaches over time and have gotten good grades in prior courses or academic levels, these strategies are firmly fixed in their learning repertoire and are their go-to strategies. Further, when they get stressed academically, they spend more time employing these strategies: they want more examples, they re-read and highlight notes, they “go over” solutions to old exams, they memorize equations for special cases, and more. And many of us did too, when we were in their shoes.

However, rehearsal strategies only result in shorter-term memory of concepts and surface-level understanding. In order to build more durable memory of concepts and deeper understanding, more effortful strategies are needed. Recognizing this and doing something about it is metacognitive activity – knowing about how we process information and making intentional choices to regulate our learning and learning approaches. One way to engage students in building such metacognitive self-awareness and set common expectations for learning in your course starts with a simple question,

“What does it mean to learn something?”

I often ask this at the start of a course. In an earlier post, Helping Students Feel Responsible for Their Learning, I introduced students’ common responses. Learning something, they say, means being able to apply it or explain it. With some further prompting we get to applying concepts to real situations and explaining material to a range of people, from family members to bosses to cross-functional design teams. These are great operational definitions of learning, and I affirm my students for coming up with them.

Then I go a step further, explaining how transferring to new applications and explaining to a wide range of audiences requires a richly interconnected knowledge framework. For our knowledge to be useful and available, it must be integrated with what we already know.

So, I tell my students, in this class we will be engaging in activities to connect and organize our knowledge. I also try to prepare my students for doing this, acknowledging it will likely be different than what they are used to. In my engineering courses students love to see and work more and more example problems – i.e., rehearsal. Examples are good to a point, particularly as you engage a new topic, but we should be moving beyond just working and referencing examples as we progress in our learning. Engaging in this discussion about learning helps make my intentions clear.

I let my students know that as we engage with the material differently it will feel effortful, even hard at times. For example, I ask my students to come up with and explore variations on an example after we have solved it. A good extension is to have pairs working different variations explain their work to each other. Other times I provide a solution with errors and ask students to find them and take turns explaining their thinking to a neighbor. In this effortful processing, they are building connections. My aim is to grow my students’ metacognitive knowledge by expanding their repertoire of learning strategies and lowering the ‘activation energy’ to using these strategies on their own. It is difficult to try something new when there is so much history behind our habitual approaches.

Another reason I like this opening discussion is that it welcomes opportunities for metacognitive dialogue and ongoing conversations about metacognition. I have been known to stop class for a “meta-moment,” where we take time to become collectively more self-aware, recognizing growth or monitoring our level of understanding. The discussion about what it means to learn something also sets a new foundation and changes conversations about exam, quiz, and homework preparation and performance. You might ask, “How did you know you knew the material?” Instead of suggesting “working harder” or “studying more,” we can talk meaningfully about the context and the choices made, and how effective or ineffective they were.

Such metacognitive self-examination can be challenging for students and even a little uncomfortable, especially if they exhibit more of a fixed mindset toward learning. It may challenge their sense of self, their identity. It is vital to recognize this. Some students may exhibit resistance to the conversation or to the active and constructive pedagogies you employ. Such resistance is challenging, and we must be careful with our responses. Depersonalizing the conversation by focusing on the context and choices can make it feel less threatening. For example, if a student only studied the night or two before an exam, instead of thinking they are lazy or don’t care about learning, we can acknowledge the challenge of managing competing priorities and ask them what they could choose to do differently next time. We need to be careful not to assume too much, e.g., a student is lazy. Questions can help us understand our students better and promote student self-awareness. For more on this approach to addressing student resistance see my post on Addressing Student Resistance to Engaging in their Metacognitive Development.

Students’ prior learning experiences impact how they approach learning in specific courses. Engaging students early in a metacognitive discussion can help develop a common set of expectations for learning in your course, clarifying your intentions. It also can open doors for metacognitive dialogue with our students; one-on-one, in groups, or as a class. It welcomes metacognition as a relevant topic into the course. However, as we engage in these discussions, we must be sensitive to our students, respectfully and gently nudging their metacognitive growth. Remember, this is hard work and it was (and often still is) hard for us too!

Acknowledgements This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757 & 1433645. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.


Utilizing Student-Coded Exams for Responsive Teaching and Learning

by Dana Melone, Cedar Rapids Kennedy High School

Welcome to the start of a semester for most teachers.  My name is Dana Melone and I teach AP Psychology and AP Research at Cedar Rapids Kennedy High School.  Most educators will give some sort of multiple-choice test during the semester, and as educators we want our students to use their exams as a learning tool, not just as a summative experience.  Unfortunately, many students just pop a graded exam into their folder and move on.  Today I would like to give you some strategies you can use as a teacher to get students to learn from their mistakes as well as their correct answers.  

pencil laying across a multiple-choice test question

These strategies also give teachers the opportunity to look at their own teaching and find commonalities in the mistakes their students are making. If your students are all making similar mistakes, you can reteach that topic in a new way. If mistakes are spread out, it may indicate that your students need to work on study skills. Your students can use these strategies to examine their own thinking and learning (become more metacognitive) and become advocates for themselves. You and your students can thus use metacognitive processes to become better teachers and learners.

Let’s start with the exam itself. Students often get their exam back and struggle to remember what their thinking was when they took it. If you are giving a paper exam, students can use a coded system as they take their test to record their thinking for later. For example, if a student feels they knew the answer to a question and is confident in their choice, they can put a checkmark next to that question. If they were able to narrow it down but were not entirely sure they made the right choice, they can put a dash next to the question. If they had no idea, they can use an x. This allows students to remember their thinking as they look back at their exam. Students can find out whether they are consistently missing questions of a similar style or topic that they thought they already knew, and they can use these self-coded exams as a study tool as finals approach. Students should also take note of whether their confidence matched their performance: questions they felt confident about but got wrong deserve further exploration.

Student-coded exams also allow teachers to look at patterns for their own use and modify their teaching appropriately, i.e., be metacognitive in their teaching. For example, teachers can change their focus if a large number of students indicated that they did not know similar concepts or struggled with application questions. Or, if students indicate that they narrowed answers down to the best two choices but chose poorly, teachers can share strategies for dealing with that issue. Why do this? The hope is that students will become more aware of what is working and what isn’t, and that by becoming more aware, they will make adjustments. By regularly practicing these metacognitive skills, we hope that students will learn to adjust on their own.
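
The tally a teacher or student might make from a self-coded exam can be sketched in a few lines: each question pairs a confidence mark with whether the answer was right, and the cell to watch is “confident but wrong.” The data below is invented for illustration.

```python
# Cross-tabulate confidence codes (checkmark / dash / x, here spelled out)
# with correctness. Questions marked "sure" but answered wrong are the ones
# that most deserve a second look. All responses below are made up.
from collections import Counter

# (confidence code, was the answer correct?) for each question on the exam
responses = [
    ("sure", True), ("sure", True), ("sure", False),   # confident but wrong!
    ("narrowed", True), ("narrowed", False),
    ("no idea", False), ("no idea", True),
]

tally = Counter(responses)
confident_misses = tally[("sure", False)]
print(confident_misses)  # 1
```

The same tally, aggregated across a class, is what lets a teacher see whether misses cluster on a topic (reteach it) or on a confidence pattern (teach test-taking strategy).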

Once students get their exam back a next step for many teachers is to have students complete exam corrections.  I have seen many formats of exam corrections.  The methods that really get students thinking about the content and their own testing strategy produce metacognitive awareness.  Here are some methods that you could use individually or combine:

  1. Have students write why they think they got the question wrong.  Was it an error in reading the question?  Did they not know the content?  Did they narrow it down to two but chose incorrectly?
  2. Have students explain why the answer they chose is incorrect or why the correct answer is correct.
  3. Have students rewrite the question to make their wrong answer right.
  4. Have students write a memory aid to help them remember that concept in the future.
  5. Have students write out what they found tricky about that concept.
  6. Have students write out how that concept relates to them or another concept in the course.
  7. Have students categorize the concepts they missed by learning target or standard and draw a conclusion about that target or standard as a whole.  Many classrooms are moving to standards-based learning, or a select few overarching concepts students must master to be proficient in the course.  If you can organize your exam to show students the patterns in their performance on these standards, it can help them make good study decisions and help you make good teaching decisions.

How can we as educators know if students have gotten the most out of this process?  Try including questions on the most commonly missed topics on future exams at no cost to the students, meaning you do not penalize their score.  Make these questions formative, to see if students are making progress.  Do you have great ideas for test corrections that produce metacognition?  Let us know.


How can I help students become more expert learners, so they engage in active learning?

by Stephanie Chasteen, University of Colorado Boulder

This chapter focuses on helping students engage productively in active learning classrooms by teaching students to reflect on their learning and develop productive mindsets toward learning. It is part of a series on helping students engage productively in active learning classrooms. It includes a list of tangible teaching and student metacognition strategies to use when working with students.


Addressing Student Resistance to Engaging in their Metacognitive Development

by Patrick Cunningham, Ph.D., Rose-Hulman Institute of Technology

You may be familiar with the quip,

“You can lead a horse to water, but you can’t make it drink.”

Perhaps you can’t, but, as my grandfather argued, “you can put salt in its oats!” We can advise students on the importance of setting specific learning goals and accurately monitoring both their level of understanding and their learning processes. And I believe we should teach them how to be more metacognitive, but we can’t make them do any of it. Nor do I think we should. Students should own their learning; they should experience agency and efficacy in it. But I can put “salt in their oats!” In this post I want to explore our role, as educators, in encouraging and providing opportunities for students to grow their metacognitive awareness and skills (i.e., our role as purveyors of “learning salt”).

I recently found the book Why Students Resist Learning (Tolman & Kremling, 2017). While written about resistance to learning in general, it is relevant to student resistance to engaging in their metacognitive development. Student resistance is complex with multiple interacting components. In my reading so far I have been challenged by two overarching themes. First, student resistance isn’t just about students. It’s about us, the educators, too. Our interactions with students can exacerbate or ameliorate student resistance. Second, student resistance is a symptom of deeper issues, not a student characteristic itself. For example, a student may be trying to preserve their sense of self and fear admitting a learning deficiency or a student may have had prior experiences that affirm surface approaches to learning and therefore they resist the idea that they need strategies to develop deeper learning.

We, as educators, need to recognize and deal with our role in student resistance to metacognitive development. Our interactions with our students are largely influenced by our beliefs and attitudes about our students. My colleagues and I have sought to address this in the B-ACE framework for giving formative feedback in support of metacognitive development. The ‘B’ represents an attitude of Believing the best about students. When we prepare to give feedback, we are responding to what they have written or said, which may or may not be accurate or complete. Believing the best acknowledges that we have incomplete information and need to reserve judgement. This attitude embodies sincere curiosity and seeks understanding. The remaining letters represent actionable elements of feedback, Affirm-Challenge-Encourage. Implementing our belief in the best about our students, we should seek to authentically affirm positive behaviors and growth, however small. Then explore and seek to understand the broader contexts and details of their statements by asking questions. In this way, you can provide gentle challenge to think more deeply or to discover incongruities between learning goals and behaviors. Finally, close by encouraging them. Let your students know you believe in their abilities to become more skillful learners, with effort and perseverance. If you say it, make sure you mean it. You can also point them to potential strategies to consider. Let’s see how we can implement the B-ACE framework as “learning salt”.

In my teaching, I provide a variety of opportunities for my students to engage in their metacognitive development. At some point I ask something like, “What have you been doing differently since we last talked? How is it helping you be a more skilled and efficient learner?” One common type of response I get from engineering students is exemplified by:

“I am continuing to work practice problems to get ready for exams. I try to work through as many as I can. It works best for me.”

Okay. No change. I’m disappointed. First, I need to make sure I don’t assume they are just memorizing and pattern matching, i.e., relying on surface learning approaches. Or, if they are memorizing and pattern matching, I need to believe it is in honest effort to learn. Further, change is hard and they may be trusting what is familiar and comfortable, even if it isn’t the most effective and efficient. Now I need to ACE the rest of the feedback.

[Affirm] Good! You are taking intentional steps to prepare for your exams. [Challenge] How do you know it works best? What other strategies have you tried? [Encourage] Keep being intentional about your learning. You may want to try recall-and-review, explaining-to-learn, or creating your own problems to measurably test your understanding.

There will be a difference between written feedback and oral feedback, but notice that both include an opening for further interaction and prompt metacognitive reflection. In a face-to-face dialogue, there might be other questions depending on the responses, such as, “How are you working the problems? What will happen if the problem is asked in a way that is different from your practice?” In written feedback, I may want to focus on one question instead of a list, so as not to overwhelm the student with challenge. Notice that these questions are seeking additional information and pointing the student to make connections. Still, the student may or may not take my suggestions to try something different. However, I argue this type of response is “saltier” than just settling for their response or telling them directly that their approach isn’t as effective, and it may lead to further dialogue later on.

In a recent post, Aaron Richmond questions if well-intentioned metacognitive instruction can, in specific cases, be unethical (Richmond, 2018). John Draeger provides counterpoint in his response, but acknowledges the need to recognize and address possible adverse reactions to metacognitive instruction (Draeger, 2018). The B-ACE feedback framework both encourages student metacognition and is an expression of Ethical Teaching, summarized by Richmond (Richmond, 2018). It acknowledges students’ autonomy in their learning, seeks to avoid harm and promote their well-being, and strives to be unbiased and authentic. Further, it can address adverse reactions, by helping students to discover the deeper issues of their reaction.

In caring for our students, we want to see them grow. They aren’t always ready. Prochaska, Norcross, and DiClemente (1994) delineate six stages of change, starting with a lack of awareness of, and willingness to, change. Change takes time and effort. Even so, let’s commit to making interactions with our students “salty”! Let’s gently, quietly, and persistently encourage them in their metacognitive development.

References

Prochaska, J., Norcross, J., & DiClemente, C. (1994). Changing for Good. New York: Harper Collins.

Tolman, A. & Kremling, J. (Eds.). (2017). Why Students Resist Learning: A Practical Model for Understanding and Helping Students. Sterling, VA: Stylus.

Acknowledgements

This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757 & 1433645. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.


Metacognition is essential to ethical teaching

by John Draeger, SUNY Buffalo State

In his most recent post, Aaron Richmond considers the possibility that promoting metacognition might be unethical (Richmond, 2018). According to Richmond, ethical teaching requires promoting student autonomy by providing students with choices between learning strategies and promoting student welfare by safeguarding against harm. Richmond believes that activities promoting student metacognition may pose a potential threat to both student welfare and student autonomy. Thus, Richmond cautiously concludes that promoting student metacognition can be unethical.

Richmond illustrates his worry by considering the use of a metacognitive strategy that he has shared on our site (Richmond, 2017), namely Immediate Feedback Assessment Techniques (IF-AT). He worries that IF-AT can cause students undue anxiety, especially if they aren’t given the option of alternative assignments. In his view, the presence of anxiety threatens welfare and the lack of options threatens autonomy. To avoid these pitfalls, Richmond recommends that instructors tell their students why and how particular teaching strategies will be used. He also recommends that instructors be on the lookout for the possibility that a particular strategy could cause unintended anxiety. And he advises that instructors warn students beforehand about the possibility of difficulty and be prepared to debrief students afterwards if difficulties occur. These safeguards are important because they protect student welfare and autonomy. I agree, though I argue below that metacognition is key to getting there. Richmond ends by posing three questions for us to think about. He asks, “Do you believe that some IwM practices have the potential to be unethical? If so, how do you ameliorate this issue? How do I become both an ethical and metacognitive teacher?” (Richmond, 2018). I will take each question in turn.

  1. Do I believe that some metacognitive practices have the potential to be unethical?

In short, no. It is possible that a metacognitive assessment, such as IF-AT, could inadvertently cause serious harm to a particular student. For example, a student facing serious psychological distress outside the classroom might find an assignment, any assignment, more than she can take. But the fact that a learning strategy could inadvertently harm a particular student does not show a strategy to be unethical. By analogy, there are many medical procedures that have been studied, approved, and shown to be effective. It is always possible that one of those procedures could inadvertently cause a particular patient serious harm. Doctors ought to be aware of the possibility and monitor the situation. They should be ready with remedies. But the fact that someone could be inadvertently harmed neither shows that doctors are unethical nor that the procedure should be discontinued. Likewise, if a learning strategy has been tested and shown to be effective, then it seems reasonable to try. Instructors should be aware of the possibility that some students might have an adverse reaction. But the fact that a particular student can be inadvertently harmed neither shows that instructors are unethical nor that use of the learning technique should be discontinued.

It is also possible that a well-intentioned instructor could try a teaching innovation (e.g., IF-AT) in hopes that student learning will improve only to find that it doesn’t meet that objective. There are plenty of reasons to be concerned about ineffective instruction, but being unethical is far more than being ineffective, suboptimal, or even a cause for concern. On the analogy with medicine, a particular medical procedure may not help a particular patient or even a group of patients, but it is hard to see how doctors can be unethical for trying something that they believe could work. In both cases, we hope that teachers and doctors will become aware of the problems and look to make meaningful adjustments (i.e. become more metacognitive about their practices). In contrast, it is possible that instructors could be intentionally undermining student learning efforts. Such instruction could be unethical. But I doubt this applies to instructors taking the time to design activities that promote student metacognition in hopes of enhancing student learning.

Richmond’s concern about instruction implementing metacognitive learning strategies centers on whether they harm student welfare and undermine student autonomy. Returning to his illustration, Richmond worries that students may feel coerced into doing IF-AT (thus undermining choice) and the uniqueness of the activity may cause undue anxiety (thus undermining welfare). I don’t doubt that there are plenty of assignments and activities that students don’t want to do and that these may stress them out. At some level, however, students have voluntarily opted into an educational system that will make demands on their time and energy, require hard work and dedication, and push their boundaries in order to facilitate their growth. Instructors should be mindful not to make unreasonable demands, but it is unclear how providing students with immediate feedback on their performance (IF-AT) constitutes coercion or any other unethical behavior. Moreover, I have argued that instructors should promote constructive discomfort in an attempt to nudge students towards learning growth (Draeger, 2014). More specifically with regard to IF-AT, a student might feel anxiety upon learning that they don’t know as much as they thought they knew, but I suspect that these negative feelings will be offset by the positive feelings associated with improved performance.

In short, well-meaning learning strategies, including metacognitive ones, can be ineffective and in some cases can even inadvertently cause serious harm to specific students. But I see no reason to think this shows that instruction promoting student metacognition can be unethical.

  2. If so, how do you ameliorate this issue?

Though I don’t think that incorporating metacognition into one’s course is unethical, I do believe that it is the key to ameliorating the sorts of concerns Richmond is worried about. For example, Richmond hopes to raise awareness about the possible unintended consequences of well-meaning pedagogical best practices. He rightly points out that we should not assume that good intentions will carry the day. He argues for the importance of procedural safeguards when implementing assignments, such as being explicit about the purpose of an assignment, pre-warning students about pitfalls, and debriefing students afterwards. These safeguards could help promote student welfare. He argues for the value of giving students the choice between a variety of assignments. Offering multiple entry points into content could both improve student learning and increase student autonomy. This is good advice because it is a hallmark of good teaching.

I would venture to say that Richmond’s advice is a hallmark of good teaching because it is an example of metacognitive teaching. For example, if instructors should be mindful of student anxiety and discomfort, and use that awareness to guide their pedagogical choices, then promoting metacognition is how we get there. In this case, a metacognitive instructor would become aware of a student need (e.g., reduction of anxiety) and self-regulate by making the necessary adjustments (e.g., offering alternative assignments in order to reduce that anxiety). In my view, therefore, metacognition itself is the way to ameliorate Richmond’s concerns.

  3. How do I become both an ethical and metacognitive teacher?

Metacognition is not a magic wand that guarantees student success. Metacognitive instruction does, however, ask instructors to become increasingly aware of what works (and what doesn’t work) with an eye towards making adjustments that are likely to improve student learning. Metacognitive instructors can monitor roadblocks to learning and help students find ways to overcome them. It is possible that an assignment, such as IF-AT, might not help a particular group of students get where they need to go. If so, then a metacognitive instructor will monitor student progress, recognize that it is not working, and intentionally make a change. The instructor might decide that the assignment should be discontinued. In this case, however, the assignment would be discontinued because it was ineffective and not because it was unethical. In my view, it is metacognitive instruction that identifies the problem and proposes a solution.

In short, if the goal is to promote awareness of student learning needs and to promote the importance of making meaningful adjustments so that student needs are met, then it seems that metacognition is the key to both student welfare and student autonomy. And if, as Richmond argues, being ethical requires promoting welfare and autonomy, then metacognition is essential to ethical teaching.

References

Draeger, J. (2014). “Cultivating the habit of constructive discomfort.” Retrieved from https://www.improvewithmetacognition.com/cultivating-a-habit-of-constructive-discomfort/

Richmond, A. (2018). “Can metacognitive instruction be unethical?” Retrieved from https://www.improvewithmetacognition.com/can-metacognitive-instruction-be-unethical/

Richmond, A. (2017). “Scratch and win or scratch and lose? Immediate Feedback Assessment Technique.” Retrieved from https://www.improvewithmetacognition.com/scratch-win-scratch-lose-immediate-feedback-assessment-technique/


Embedding Metacognition into New Faculty Orientation

By Lauren Scharff, Ph.D., U. S. Air Force Academy *

When and how might faculty become aware of metacognition in general, how student metacognition might enhance student learning, and how personal metacognition might enhance their own teaching? Ideally, faculty learned about metacognition as students and thereafter consciously engaged in metacognitive practices as learners and developing professionals. Based on conversations with many faculty members, however, this is not the case. It certainly wasn’t the case for me. I don’t remember even hearing the term metacognition until after many years of working as a professor. Even now most new faculty seem to only have a vague familiarity with the term “metacognition” itself, and few claim to have spent much time considering how reflection and self-regulation, key components of metacognition, should be part of their own practice or part of the skill set they plan to help develop in their students.

While this reality is not ideal (at least for those of us true believers in the power of metacognition), realization of this lack of understanding about metacognition provides opportunities for faculty development. And why not start right at the beginning when faculty attend new faculty orientation?

New Faculty Orientation

At my institution this summer, we did just that. Our Director of Faculty Development, Dr. Marc Napolitano, worked the topic into his morning session on student learning. We designed a follow-on, small-group discussion session that encouraged faculty to actively engage in reading, personal application, and discussion of metacognition.

The reading we chose was one of my favorite metacognition articles, Promoting Student Metacognition, by Dr. Kimberly Tanner (2012). The session was only 40 minutes, so we only had them read a few pages of the article for the exercise, including her Table 1, which provides a series of questions students can ask themselves when planning, monitoring, and evaluating their learning for a class session, while completing homework, and while preparing for an exam. We had the new faculty jot down some reflections in response to several guided prompts. Then we had time for discussion. I facilitated one of the small groups and was thus able to hear some of their responses first-hand.

Example questions:

  • What type of student were you as an undergraduate? Did you ever change your approach to learning as you went through school?
  • You obviously achieved success as an undergraduate, but do you think that you could have been more successful if you had better understood the science of learning and had teachers incorporate it into their courses?
  • If you had to share a definition of metacognition [from the reading] with students – and explain to them why it is an essential practice in learning – which definition would you use and how would you frame it with students?
  • If you wished to incorporate metacognition into your class, what approach(es) currently seems most practical for you? Why?
  • Which 3-4 of the questions in Table 1 seem like they would be most helpful to use in your class? Why do these questions stand out, and how might they shape your class?

The discussion following the reading and reflection time was very rich. Only one member of my group of eight reported a good prior understanding of metacognition and how it could be incorporated into course design (she had just finished a PhD in physics education). Two others reported having vague prior familiarity with the term. However, after participating in these two faculty development sessions, all of them agreed that learning about the science of learning would have been valuable as a student regardless of level (K-12 through graduate school).

The faculty in my group represent a wide variety of disciplines, so the ways of incorporating metacognition and the questions from the table in the reading that most appealed to them varied. However, that is one of the wonderful things about designing courses or teaching practices to support student metacognition – there are many ways to do so. Thus, it’s not a problem to fit them to your way of teaching and your desired course outcomes.

We also spent a little time discussing metacognitive instruction: being aware of their choices as instructors and their students’ engagement and success, and using that awareness to guide their subsequent choices as instructors to support their students’ learning. They quickly understood the parallels with student metacognitive learning (students being aware of their choices and whether or not those choices are leading to success, and using that awareness to guide subsequent choices related to their learning). Our small groups will continue to meet throughout the coming year as a continuation of our new faculty development process. I look forward to continuing our conversations and further supporting them in becoming metacognitive instructors and promoting their students’ development as metacognitive learners.

————

Tanner, K. (2012). Promoting student metacognition. CBE—Life Sciences Education, 11, 113–120.

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Helping Students Feel Responsible for Their Learning

by Patrick Cunningham, Ph.D., Rose-Hulman Institute of Technology

“Dr. C, you really expect your students to do a lot!” I quickly replied, “Yes!” We then engaged in a discussion of things only students can do for their learning. How can we help more of our students recognize their responsibility for their learning? Three strategies I employ include explicit and direct instruction, questioning for self-discovery, and in-class opportunities to practice new learning strategies. Each of these strategies directs students’ focus to things under their control.

Helping our students recognize and embrace their responsibility for their learning requires metacognitive activity. Specifically, it requires building metacognitive knowledge of persons and strategies and engaging in metacognitive regulation through planning for and monitoring learning experiences. Direct instruction and in-class learning strategy practice can expand metacognitive knowledge. Questioning for self-discovery can facilitate students’ metacognitive monitoring and planning for subsequent learning experiences.

For explicit and direct instruction, I start a discussion within the first two days of class by asking, “What does it mean to learn something?” Most responses include applying and explaining concepts. Good answers, but I press for more depth. In turn I respond, “Apply to what? Explain to whom?” Learning something, they say, means being able to apply concepts to real circumstances. My engineering students also come up with a variety of people or groups of people to explain things to: their grandmother, family members, a cross-functional design team, a boss, peer engineers, marketing/sales professionals, or even customers. These answers are good operational definitions of learning. Next, I talk to my students about the knowledge frameworks that underlie these abilities.

Illustration of Knowledge Frameworks

In order to apply concepts to real and diverse circumstances and to explain concepts effectively to a range of audiences we must have many routes to and between the elements of our knowledge and a logical structure of the information. That is, our knowledge frameworks must be well populated, richly interconnected, and meaningfully organized (Ambrose et al., 2010). However, as novices in an area, we start with sparsely populated and isolated knowledge frameworks. I then share with students that they are the only ones who can construct their knowledge frameworks. The population and interconnection of elements depends on what they individually do with the material, in class and out of class. As the instructor, I can create opportunities and experiences for them, but I cannot build their knowledge frameworks for them. Students are responsible for the construction work.

For self-discovery I use guiding questions to help students articulate learning goals, combat the Illusion of Comprehension, and make cause-and-effect linkages between their learning behaviors and outcomes. I may ask, “What goals do you have for your homework/study sessions?” Students often focus on getting assignments done or being “ready” for exams, but these are not directly learning goals. It is helpful here to ask what they want or need to be able to do with the information. This elicits responses such as: “Apply ____ to ____. Create a ____ using ____. Explain ____.” Now we can ask students to put the pieces together. How does just “getting the homework done” help you know if you can apply/create/explain? We are seeking to help students surface incongruities in their own behavior, and these incongruities are easier to face when you discover them yourself rather than being told they are there.

A specific incongruity that many students struggle with is the Illusion of Comprehension (Svinicki, 2004), which occurs when students confuse familiarity with understanding. It often manifests itself after exams as, “I knew the material, I just couldn’t show you on the exam.” My favorite question for this is, “How did you know you knew the material?” Common responses include looking over notes or old homework, working practice exams, reworking examples and homework problems. But what does it mean to “look over” prior work? How did you work the practice exam? How did you elaborate around the concepts so that you weren’t just reacting to cues in the examples and homework problems? What if the context of the problem changes? It is usually around this point that students begin to realize the mismatch between their perceptions of deep understanding and the reality of their surface learning.

Assignment or exam wrappers are also good tools to help students work out cause-and-effect linkages between what they do to learn material and how they perform. In general, these “wrappers” ask students to reflect on what they did to prepare for the assignment or exam, process instructor feedback or errors, and adjust future study plans.

It is important, once we encourage students to recognize these incongruities, that we also help direct students back to what they can do to make things better. I direct conversations with my students to a variety of learning strategies they can employ, slanted towards elaborative and organizational strategies. We talk about such things as making up problems or questions on their own, explaining solutions to friends, annotating their notes to summarize key points, or doing recall-and-review (retrieval practice).

However, I find that telling them about such strategies often isn’t enough. We trust what is familiar and comfortable – even ineffective and inefficient learning strategies that we have practiced over years of prior educational experiences and for which we have been rewarded. So I incorporate these unfamiliar but effective and efficient strategies into my teaching. I want my students to know how to do them and realize that they can use them in their outside-of-class study time as well.

One way I engage students with new strategies is through constructive review prior to exams. We start with a recall and review exercise. I have students recall as many topics as they can in as much detail as they can for a few minutes – without looking anything up. Then I have students open their notes to add to and refine their lists. After collectively capturing the key elements, I move to having pairs of students work on constructing potential questions or problems for each topic. I also create a discussion forum for students to share their problems and solutions – separately. As they practice with each others’ problems, they can also post responses and any necessary corrections.

In concert, direct instruction, questioning for self-discovery, and in-class opportunities to practice new learning strategies can develop our students’ sense of responsibility for their learning. It even can empower them by giving them the tools to direct their future learning experiences. In the end, whether they recognize it or not, students are responsible for their learning. Let’s help them embrace this responsibility and thrive in their learning!

References

Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How Learning Works: Seven Research-Based Principles for Smart Teaching. San Francisco, CA: Jossey-Bass.

Svinicki, M. (2004). Learning and Motivation in the Postsecondary Classroom. San Francisco, CA: John Wiley & Sons.

Acknowledgements

This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757 & 1433645. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.

Next Blog-post:

Overcoming student resistance to engaging in their metacognitive development.


Supporting Student Self-Assessment with Knowledge Surveys

by Dr. Lauren Scharff, U. S. Air Force Academy*

In my earlier post this year, “Know Cubed” – How do students know if they know what they need to know?, I introduced three challenges for accurate student self-assessment. I also introduced the idea of incorporating knowledge surveys as a tool to support student self-assessment (an aspect of metacognitive learning) and promote metacognitive instruction. This post shares my first foray into the use of knowledge surveys.

What exactly are knowledge surveys? They are collections of questions that support student self-assessment of their understanding of course material and related skills. Students complete the questions either at the beginning of the semester or prior to each unit of the course (pre), and then again immediately prior to exams (post-unit instruction). When answering the questions, students rate their ability to answer each question (similar to a confidence rating) rather than fully answering it. The type of learning expectation is highlighted by including the Bloom’s taxonomy level at the end of each question. Completion of knowledge surveys develops metacognitive awareness of learning and can help guide more efficient studying.

Example knowledge survey questions

My motivation to include knowledge surveys in my course was a result of a presentation by Dr. Karl Wirth, who was invited to be the keynote speaker at the annual SoTL Forum we hold at my institution, the United States Air Force Academy. He shared compelling data and anecdotes about his incorporation of knowledge surveys into his geosciences course. His talk inspired several of us to try out knowledge surveys in our courses this spring.

So, after a semester, what do I think about knowledge surveys? How did my students respond?

In a nutshell, I am convinced that knowledge surveys enhanced student learning and promoted student metacognition about their learning. Their use provided additional opportunities to discuss the science of learning and helped focus learning efforts. But, there were also some important lessons learned that I will use to modify how I incorporate knowledge surveys in the future.

Evidence that knowledge surveys were beneficial:

My personal observations included the following, each of which increased as the semester went on and students learned how to use the knowledge survey questions to guide their learning:

  • Students directly told me how much they liked and appreciated the knowledge survey questions. There is a lot of unfamiliar and challenging content in this upper-level course, so the knowledge survey questions served as an effective road map to help guide student learning efforts.
  • Students asked questions in class directly related to the knowledge survey questions (as well as other questions). Because I was clear about what I wanted them to learn, they were able to judge if they had solid understanding of those concepts and ask questions while we were discussing the topics.
  • Students came to office hours to ask questions, and were able to more clearly articulate what they did and did not understand prior to the exams when asking for further clarifications.
  • Students realized that they needed to study differently for the questions at different Bloom’s levels of learning. “Explain” questions required more than basic memorization of the terms related to those questions. I took class time to suggest and reinforce the use of more effective learning strategies and several students reported increasing success and the use of those strategies for other courses (yay!).
  • Overall, students became more accurate in assessing their understanding of the material prior to the exam. More specifically, when I compared the knowledge survey reports with actual exam performance, students progressively became more accurate across the semester. I think some of this increase in accuracy was due to the changes stated in points above.
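The accuracy comparison in the last point can be made concrete. As a minimal sketch (the data, the 0–100 scales, and the function names are hypothetical illustrations, not taken from the course described), one simple measure is the average gap between each student's knowledge-survey self-rating and their actual exam score:

```python
# Minimal sketch of comparing knowledge-survey self-ratings with exam scores.
# All data and scales (0-100) here are hypothetical, not from the course described.

def self_assessment_gaps(self_ratings, exam_scores):
    """Signed gap per student: positive = overconfident, negative = underconfident."""
    return [rating - score for rating, score in zip(self_ratings, exam_scores)]

def mean_absolute_gap(self_ratings, exam_scores):
    """Average miscalibration magnitude; smaller means more accurate self-assessment."""
    gaps = self_assessment_gaps(self_ratings, exam_scores)
    return sum(abs(g) for g in gaps) / len(gaps)

# Hypothetical class data for two exams in the same semester
exam1 = mean_absolute_gap([90, 85, 70, 95], [70, 60, 75, 80])  # early semester
exam2 = mean_absolute_gap([78, 68, 74, 86], [74, 65, 76, 82])  # later semester
# Self-assessment accuracy is improving across the semester if exam2 < exam1
```

A shrinking mean absolute gap across exams is one concrete way to see the kind of semester-long improvement in self-assessment accuracy described above.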

Student feedback included the following:

  • End-of-semester feedback from students indicated that the vast majority of them thought the knowledge surveys supported their learning, with half giving the highest rating of “definitely supports learning, keep as is.”
  • Optional reflection feedback suggested development of learning skills related to the use of the knowledge surveys and perceived value for their use. The following quote was typical of many students:

At first, I was not sure how the knowledge surveys were going to help me. The first time I went through them I did not know many of the questions and I assumed they were things I was already supposed to know. However, after we went over their purpose in class my view of them changed. As I read through the readings, I focused on the portions that answered the knowledge survey questions. If I could not find an answer or felt like I did not accurately answer the question, I bolded that question and brought it up in class. Before the GR, I go back through a blank knowledge survey and try to answer each question by myself. I then use this to compare to the actual answers to see what I actually need to study. Before the first GR I did not do this. However, for the second GR I did and I did much better.

Other Observations and Lessons learned:

Although I am generally pleased with my first foray into incorporating knowledge surveys, I did learn some lessons and I will make some modifications next time.

  • The biggest lesson is that I need to take even more time to explain knowledge surveys, how students should use them to guide their learning, and how I use them as an instructor to tailor my teaching.

What did I do this past semester? I explained knowledge surveys on the syllabus and verbally at the beginning of the semester. I gave periodic general reminders and included a slide in each lesson’s PPT that listed the relevant knowledge survey questions. I gave points for completion of the knowledge surveys to increase the perception of their value. I also included instructions about how to use them at the start of each knowledge survey:

Knowledge survey instructions

Despite all these efforts, feedback and performance indicated that many students really didn’t understand the purpose of knowledge surveys or take them seriously until after the first exam (and some even later than that). What will I do in the future? In addition to the above, I will make more explicit connections during the lesson and as students engage in learning activities and demonstrations. I will ask students to share how they would explain certain concepts using the results of their activities and the other data that were presented during the lesson. The latter will provide explicit examples of what would (or would not) be considered a complete answer for the “explain” questions in contrast to the “remember” questions.

  • The biggest student feedback suggestion for modifying the knowledge surveys pertained to the “pre” knowledge surveys given at the start of each unit. Students reported they didn’t know most of the answers and felt that completing the pre knowledge surveys was less useful. As an instructor, those “pre” responses helped me get a pulse on their level of prior knowledge and use that to tailor my lessons. Thus, I need to better communicate my use of those “pre” results, because no one likes to take time to do what they perceive is “busy work.”
  • I also learned that students created a shared Google Doc where they would insert answers to the knowledge survey questions. I am all for students helping each other learn, and I encourage them to quiz each other so they can talk out the answers rather than simply re-reading their notes. However, it became apparent when students came in for office hours that the shared “answers” to the questions were not always correct and were sometimes incomplete. This was especially true for the higher-level questions. I personally was not a member of the shared document, so I did not check their answers in it. In the future, I will encourage students, earlier and more explicitly, to be aware of the type of learning being targeted and the type of responses needed for each level, and to critically evaluate the answers being entered into such a shared document.

In sum, as an avid supporter of metacognitive learning and metacognitive instruction, I believe that knowledge surveys are a great tool for supporting both student and faculty awareness of learning, the first step in metacognition. We should then use that awareness to make necessary adjustments to our efforts – the other half of a continuous cycle that leads to increased student success.

———————————————–

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


How to Get the Most Out of Studying

Dr. Stephen Chew has put together a highly lauded series of short videos that share with students some powerful principles of effective learning, including metacognition. His goal was to create a resource that students can view whenever and as often as they want.

They include

  • Video 1: Beliefs That Make You Fail…Or Succeed
  • Video 2: What Students Should Understand About How People Learn
  • Video 3: Cognitive Principles for Optimizing Learning
  • Video 4: Putting the Principles for Optimizing Learning into Practice
  • Video 5: I Blew the Exam, Now What?

Links to the videos can be found here:

https://www.samford.edu/departments/academic-success-center/how-to-study

Dr. Chew also provides an overview handout that summarizes the purposes of the videos, gives guidance on how to use them, and outlines the main points within the videos:

https://www.samford.edu/departments/files/Academic_Success_Center/How-to-Study-Teaching_Resources.pdf


Where Should I Start With Metacognition?

by Patrick Cunningham, Rose-Hulman Institute of Technology

Have you ever had a student say something like this to you? “I know the material, I just couldn’t show you on the exam.” How do you respond?

I have heard such comments from students, and I think they exemplify two significant deficiencies.

First, students are over-reliant on rehearsal learning strategies. Rehearsal is drill-and-practice or repetitive practice aimed at memorization and pattern matching. Such practices lead to surface learning and shallow processing. Students know facts and can reproduce solutions to familiar problems, but struggle when the problem looks different. Further, when faced with real-world situations they are often not even able to identify the need for the material let alone apply it. Only knowing material by rote is insufficient for fluency with it. For example, I can memorize German vocabulary and grammar rules, but engaging someone from Germany in a real conversation requires much more than just knowing words and grammar.

Second, students are inaccurate in their self-assessments of their learning, which can lead to false confidence and poor learning choices (Ehrlinger & Shain 2014). Related to this, I have developed a response to our hypothetical student. I ask, “How do you know you know the material?” In reply, students commonly point to looking over notes, looking over homework, reworking examples or homework problems, or working old exams – rehearsal strategies. I often follow up by asking how they assessed their ability to apply the material in new situations. This often brings a mixture of surprise and confusion. I then try to help them discover that while they are familiar with the concepts, they are not fluent with them. Students commonly confuse familiarity with understanding. Marilla Svinicki (2004) calls this the Illusion of Comprehension, and others have called it the illusion of fluency. Continuing the language example, I could more accurately test my knowledge of German by attempting and practicing conversations in German rather than just doing flashcards on vocabulary and grammar rules. Unless we employ concrete, demonstrable, and objective measures of our understanding, we are prone to inaccurate self-assessment and overconfidence. And, yes, we and our students are susceptible to these maladies. We can learn about and improve ourselves as we help our students.

Addressing these two deficiencies can be a good place to start with metacognition. Metacognition is the knowledge and regulation of our thinking processes. Our knowledge of strategies for building deeper understanding and our awareness of being susceptible to the illusion of comprehension are components of metacognitive knowledge. Our ability to regulate our thinking (learning) and apply appropriate learning strategies is critically dependent on accurate self-assessment of our level of understanding and our learning processes, specifically, in metacognitive monitoring and evaluation. So how can we support our students’ metacognitive development in these areas?

To help our students know about and use a broader range of learning strategies, we can introduce them to new strategies and give them opportunities to practice them. To learn more deeply, we need to help students move beyond rehearsal strategies. Deeper learning requires expanding and connecting the things we know, and is facilitated by elaborative and organizational learning strategies. Elaboration strategies aid the integration of knowledge into our knowledge frameworks by adding detail, summarizing, and creating examples and analogies. Organizational strategies impose structure on material and help us describe relationships among its elements (Dembo & Seli 2013).

We can help our students elaborate their knowledge by asking them to: 1) explain their solutions or mistakes they find in a provided solution; 2) generate and solve “what-if” scenarios based on example problems (such as, “what if it changed from rolling without slipping to rolling with slipping”); and 3) create and solve problems involving specific course concepts. We can help our students discover the structure of material by asking them to: 1) create concept maps or mind maps (though you may first need to help them learn what these are and practice creating them); 2) annotate their notes from a prior day or earlier in the period; and 3) reorganize and summarize their notes. Using these strategies in class builds students’ familiarity with them and improves the likelihood of students employing them on their own. Such strategies help students achieve deeper learning, knowing material better and making it more accessible and useable in different situations (i.e., more transferable). For example, a student who achieved deeper learning in a system dynamics course will be more likely to recognize the applicability of a specific dynamic model to understand and design a viscosity experiment in an experiment design class.

To help our students engage in more accurate self-assessment we can aid their discovery of being susceptible to inaccurate self-perceptions and give them opportunities to practice strategies that provide concrete, demonstrable, and objective measures of learning. We can be creative in helping students recognize their propensity for inaccuracy. I use a story about an awkward conversation I had about the location of a youth hostel while travelling in Germany as an undergraduate student. I spent several minutes with my pocket dictionary figuring out how to ask the question, “Wissen Sie wo die Jugendherberge ist?” When the kind stranger responded, I discovered I was nowhere near fluent in German. It takes more than vocabulary and grammar to be conversant in the German language!

We can help our students practice more accurate self-assessment by asking them to: 1) engage in brief recall and review sessions (checking the completeness and correctness of their recalled lists); 2) self-test without supports (tracking the time elapsed and the correctness of the solution); 3) explain solutions (noticing the coherence, correctness, and fluency of their responses); and 4) create and solve problems based on specific concepts (again, noting the correctness of their solution and the time elapsed). Each of these strategies creates observable and objective measures (examples noted in parentheses) capable of indicating level of understanding. When I have students do brief (1-2 minute) recall exercises in class, I have them note omissions and incorrect statements as they review their notes and compare with peers. These indicate concepts they do not know as well.

Our students are over-reliant on rehearsal learning strategies and struggle to accurately assess their learning. We can help our students transform their learning by engaging them with a broader suite of learning strategies and concrete and objective measures of learning. By starting here, we are helping our students develop transferable metacognitive skills and knowledge, capable of improving their learning now, in our class, and throughout their lives.

References

Ehrlinger, J., & Shain, E. A. (2014). How Accuracy in Students’ Self Perceptions Relates to Success in Learning. In V. A. Benassi, C. E. Overson, & C. M. Hakala (Eds.). Applying science of learning in education: Infusing psychological science into the curriculum. Retrieved from the Society for the Teaching of Psychology web site: http://teachpsych.org/ebooks/asle2014/index.php

Svinicki, M. (2004). Learning and motivation in the postsecondary classroom. San Francisco, CA: John Wiley & Sons.

Dembo, M. & Seli, H. (2013). Motivation and learning strategies for college success: A focus on self-regulated learning (4th ed.). New York, NY: Routledge.


Am I responsible for engaging my students in learning how to learn?

by Patrick Cunningham, Rose-Hulman Institute of Technology

I’m a mechanical engineering professor and since my first teaching experience in graduate school I’ve wanted my students to walk away from my classes with deep learning. Practically, I want my students to remember and appropriately apply key concepts in new and different situations, specifically while working on real engineering problems.

In my early years of teaching, I thought if I just used the right techniques, exceptional materials, the right assignments, or the right motivational contexts, then I would get students to deeper learning. However, I still found a disconnect between my pedagogy and student learning. Good pedagogy is important, but it isn’t enough.

On sabbatical 4 years ago, I sat in on a graduate-level cognitive processes course that helped explain this disconnect. It helped me realize student learning is principally determined by the student. What the student does with the information determines the quality of their learning. How they use it. How they apply it. How they practice it. How engaged they are with it. I can provide a context conducive to deeper learning, but I cannot build the foundational and rich knowledge frameworks within the students’ minds. Only the students can do this. In other words, while we, as educators, are important in the learning process, we are not the primary determinants of learning, students are. Students are responsible for their learning, but they don’t universally realize it.

So, how do we help students realize their responsibility for learning? It requires presenting explicit instruction on how learning really works, providing practice with effective approaches to learning, and giving constructive feedback on the learning process (Kaplan et al., 2013). When left unchecked, flawed conceptions of the learning process are at best allowed to persist and at worst reinforced. Even when we do not explicitly speak to the learning process with our students, we say something about it. For example, when our primary mode of instruction is walking students through example problems, we may reinforce the belief that learning is about memorizing the process rather than connecting concepts to different contexts and knowing when to apply one concept versus another. Sometimes we do speak to students about the learning process, but we offer vague and unhelpful advice, such as “work more problems” or “study harder.” Such advice doesn’t point students to specific strategies instrumental in building more interconnected knowledge frameworks (e.g., elaborative and organizational strategies) (Dembo & Seli 2013) and can reinforce surface-level memorization and pattern matching approaches.

Because our teaching doesn’t guarantee student learning, because we desire our students develop deep and meaningful learning, and since we always say something about the learning process (intentionally or not), we, as educators, are responsible for engaging our students in developing as learners. We should be explicitly engaging our students in learning about and regulating their learning processes, i.e., developing their metacognitive skills.

As I advocate for our responsibility to aid students in learning how to learn, some common reactions include:

  1. Don’t people figure out how to learn naturally?
  2. Shouldn’t students already do this on their own?
  3. I don’t know metacognition and the science of learning like I know my specialty area.

Don’t we figure out how to learn naturally? Yes, learning is a natural process, but, no, we do not naturally develop deep and efficient approaches to learning – any more than we naturally develop the skill of a concert musician or any other highly refined practice. Shouldn’t students already do this on their own? Ideally, yes, but the reality is that most students’ prior learning experiences have led to ingrained surface-learning habits.

Prior learning experiences condition how we go about learning, along with contextual factors, such as the guidance of parents and teachers. In general, students think they are good at learning and don’t see a need to change their approaches. They continue to get good grades using memorization and pattern matching – often cramming for exams – while lacking long-term memory of concepts and the ability to transfer these concepts to real applications. As long as our courses allow students to get good grades (their measures of “success”) with surface learning habits, such views will persist. Deep learning includes memorizing, i.e., knowing, things, but such durable and transferable learning requires much more than just memorization. It takes effortful intellectual engagement with concepts, exploring connections and sorting out relationships between concepts, and accurate self-assessment. Such approaches can be learned, and a few students do. More can if we explicitly guide them. Our students are not lazy, rather they are misguided by prior experiences. Let’s guide them!

I don’t know metacognition and the science of learning like I know my specialty area. Yes, it is important to be knowledgeable and proficient with what we teach. While we have done much with the content in our specialties, we have limited training, if any, on metacognition (the knowledge and regulation of our thinking/learning processes) and the science of learning. However, as educators trying to improve our craft, shouldn’t we also be students of learning? This can start small and continue as a career-long pursuit. We can always improve! You also likely know more than you think you do. Your self-selection into advanced studies and a college teaching career is not an accident. As part of a select group of academics, you are likely already metacognitively skilled, even if you don’t realize it. Start small, with one thing. Learn about it and practice or recognize it in your own life. For example, peruse a copy of Linda Nilson’s Creating Self-Regulated Learners or James Lang’s Small Teaching, or attend a teaching workshop that sparks your interest. Then, confidently share it with your students and engage them in it as you teach your content. Your authentic experience with it demonstrates its relevance and importance. Once you have become comfortable with this, add another element. Over time, you will build practical expertise about the learning process. Along the way you will likely learn about yourself and make sense of your past (and present) learning experiences. I did!

Need help? Look for my next post, “Where should I start with metacognition?”

Acknowledgements

This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757, 1433645, & 1150384. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation. I also extend my gratitude to my collaborating researchers, Dr. Holly Matusovich and Ms. Sarah Williams, for their support and critical feedback.

References

Dembo, M. & Seli, H. (2013). Motivation and Learning Strategies for College Success: A Focus on Self-Regulated Learning (4th ed.). New York, NY: Routledge.

Kaplan, M., Silver, N., LaVaque-Manty, D., & Meizlish, D. (Eds.). (2013). Using Reflection and Metacognition to Improve Student Learning. Sterling, VA: Stylus.


It shouldn’t be Top Secret – Bloom’s Taxonomy

By Lauren Scharff, Ph.D.,  U. S. Air Force Academy *

Across the past year or so I have been reminded several times of the following fact: Most students are not aware of Bloom’s Taxonomy, and even if they are aware, they have no clue how or why their awareness of it might benefit them and their learning. Most instructors have heard of at least one version of Bloom’s Taxonomy, and some keep it in mind when designing learning activities and assessments.  But, rarely do instructors even mention it to their students.

Why don’t instructors share Bloom’s Taxonomy with their students? Is it a top secret, for instructors only? No! In fact, awareness and use of Bloom’s taxonomy can support metacognitive learning, so students should be let in on the “secret.”

What were the key experiences that led me to this strong stance? Let me share….

In May of 2016, I was fortunate to attend a keynote by Dr. Saundra McGuire at High Point University. In her keynote address and in her book, Teach Students How to Learn (2015), McGuire shared stories of interactions with students as they became aware of Bloom’s Taxonomy and applied it to their learning. She also shared data showing how this, coupled with a variety of other metacognitive strategies, led to large increases in student academic success. Her work served as the first “ah ha” moment for me, and I realized that I needed to start more explicitly discussing Bloom’s Taxonomy with my students.

An additional way to highlight Bloom’s Taxonomy and support student metacognitive learning was shared this past October (2017) when Dr. Karl Wirth led a workshop as part of our 9th Annual Scholarship of Teaching and Learning (SoTL) Forum at the U. S. Air Force Academy. In his workshop he shared examples of knowledge surveys along with data supporting their use as a powerful learning tool. Knowledge surveys are collections of questions that support student self-assessment of their knowledge, understanding, and skills. When answering the questions, students rate themselves on their ability to answer the question (similar to a confidence rating) rather than fully answering the question. Research shows that most students are able to accurately self-assess (confidence ratings correlate strongly with actual performance; Nuhfer, Fleisher, Cogan, Wirth, & Gaze, 2017). However, most students do not take the time to carefully self-assess their knowledge and abilities without formal guidance and encouragement to do so. In order to be effective, knowledge surveys need to ask targeted, granular questions rather than global questions. Importantly, knowledge survey questions can span the full range of Bloom’s Taxonomy, and Dr. Wirth incorporates best practices by taking the time to explain Bloom’s Taxonomy to his students and explicitly share how his knowledge survey questions target different levels.

Sharing Bloom’s Taxonomy in our classes is a great first step, but ultimately, we hope that students use the taxonomy on their own, applying it to assignments across all their courses. However, just telling them about the taxonomy or explaining how aspects of our course tap into different levels of it may not be enough to support their use of the taxonomy beyond our classrooms. In response to this need, and as part of an ongoing Scholarship of Teaching and Learning (SoTL) project at my institution, one of my student co-investigators (Leslie Perez, graduated May 2017) created a workshop handout that walks students through a series of questions that help them apply Bloom’s Taxonomy as a guide for their learning and academic efforts. This handout was also printed in a larger, poster format and is now displayed in the student dorms and the library. Students use the handout by starting in the middle and asking themselves questions about their assignments. Based on their answers, they walk through a path that helps them determine what level of Bloom’s Taxonomy they likely need to target for that assignment. It should help them become more explicitly aware of the learning expectations for their various assignments and support their informed selection of learning strategies, i.e., help them engage in metacognitive learning.

Figure 1. Snapshot of the handout we use to guide students in applying Bloom’s Taxonomy to their learning.  (full-sized version here)

As someone who is a strong proponent of metacognitive learning, I have become increasingly convinced that instructors should more often and more explicitly share this taxonomy, and perhaps even more importantly, share how it can be applied by students to raise their awareness of learning expectations for different assignments and guide their choice of learning strategies. I hope this post motivates instructors to share Bloom’s Taxonomy (and other science of learning information) with their students. Feel welcome to use the handout we created.

————

McGuire, S. (2015). Teach Students How to Learn. Stylus Publishing, LLC, Sterling, VA.

Nuhfer, E., Fleisher, S., Cogan, C., Wirth, K., & Gaze, E. (2017). How random noise and a graphical convention subverted behavioral scientists’ explanations of self-assessment data: Numeracy underlies better alternatives. Numeracy, 10(1), Article 4. DOI: http://dx.doi.org/10.5038/1936-4660.10.1.4

* Disclaimer: The views expressed in this document are those of the author and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.


Mind Mapping: A Technique for Metacognition

by Charlie Sweet, Hal Blythe, Rusty Carpenter, Eastern Kentucky University

Background

The Provost at Eastern Kentucky University invited Saundra McGuire to speak on metacognition as part of our University’s Provost’s Professional Development Speaker Series. Our unit was tasked with designing related programming both before and after McGuire’s visit. Our aim was to provide a series of effective workshops that prepared the ground for our university’s Quality Enhancement Plan 2.0 on metacognition as a cross-disciplinary tool for cultivating reading skills. The following mind mapping exercise, from one of four workshops, was taught to over 50 faculty from across campus and the academic ranks. Feedback rated it highly and suggested it was appropriate for any level of any discipline with any size class.

Scientific Rationale

The Mind Map, a term invented by Tony Buzan in The Mind Map Book (1993), “is a powerful graphic technique which provides a universal key to unlocking the potential of the brain” (9). For that reason, Buzan’s subtitle is How to Use Radiant Thinking to Maximize Your Brain’s Untapped Potential. A mind map provides a way for organizing ideas either as they emerge or after the fact. Perhaps the mind map’s greatest strength lies in its appeal to the visual sense.

We chose to share mind mapping with our faculty because according to Brain Rules (2008), rule number ten is “Vision trumps all other senses” (221). For proof, the author, John Medina, cites a key fact: “If information is presented orally, people remember about 10%, tested 72 hours after exposure. That figure goes up to 65% if you add a picture” (234). Because of its visual nature, mind mapping provides a valuable metacognitive tool.

How Mind Mapping Supports Metacognition

Silver (2013) focuses on reflection in general and in particular “the moment of meta in metacognition—that is the moment of standing above or apart from oneself, so to speak, in order to turn one’s attention back upon one’s own mental work” (1). Mind mapping allows thinkers a visual-verbal way to delineate that moment of reflection and in capturing that moment to preserve its structure. Because analysis is one of Bloom’s higher-order learning skills, mind mapping leads to deep thinking, which makes self-regulation easier.

Method

Essentially, a mind map begins with what Gerry Nosich in Learning to Think Things Through (2009) calls a fundamental and powerful concept, “one that can be used to explain or think out a huge body of questions, problems, information, and situations” (105). To create a mind map, place the fundamental and powerful concept (FPC) you wish to explore in the center of a piece of paper and circle it. If at all possible, do something with color or the actual lettering in order to make the FPC even more visual. For instance, if you were to map the major strategies involved in metacognition, metacognition is the FPC, and you might choose to write it as such:

M E T A
Cognition

Lines that run from the FPC to additional circled concepts increase its visual effect. These Sputnik-like appendages are what Buzan calls basic ordering ideas, “key concepts within which a host of other concepts can be organized” (84). In our metacognition example, the lines might radiate out to a host of also-circled metacognitive strategies, such as retrieving, reflection, exam wrappers, growth mindset, and the EIAG process of Event selection-Identification of what happened-Analysis-Generalization of how the present forms future practice (for a fuller explanation see our It Works for Me, Metacognitively, pp. 33-34). And if you wanted to go one step further, you might radiate lines from, for instance, retrieving to actual retrieval strategies (e.g., flashcards, interleaving, self-quizzing).
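
The radial structure just described is, at bottom, a tree: the FPC at the root, basic ordering ideas one level down, and concrete strategies at the leaves. As a minimal sketch in Python, using the metacognition example above (the `outline` helper is our illustration, not Buzan's):

```python
# A mind map as a tree: the fundamental and powerful concept (FPC) at the
# root, basic ordering ideas one level down, concrete strategies at the leaves.
mind_map = {
    "METACOGNITION": {
        "retrieving": ["flashcards", "interleaving", "self-quizzing"],
        "reflection": [],
        "exam wrappers": [],
        "growth mindset": [],
        "EIAG": ["Event", "Identification", "Analysis", "Generalization"],
    }
}

def outline(node, depth=0):
    """Return the map as indented outline lines, the FPC first."""
    lines = []
    for concept, branches in node.items():
        lines.append("  " * depth + concept)
        if isinstance(branches, dict):
            lines.extend(outline(branches, depth + 1))
        else:
            lines.extend("  " * (depth + 1) + leaf for leaf in branches)
    return lines

print("\n".join(outline(mind_map)))
```

Radiating one step further, as with the retrieval strategies, is simply one more level of nesting.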

Uses for Mind Maps

Mind mapping has many uses for both students and faculty:

  • Notetaking: mind mapping provides an alternative form of notetaking, whether for students or for professors participating in committee meetings. It can be done before a class session by the professor, during the session by the student, or afterwards as a way of checking whether the fundamental and powerful concept(s) were taught or understood.
  • Studying: instead of rereading notes, a method destined for failure, try reorganizing them into a mind map or two. Mind mapping not only offers a visual alternative but also provides retrieval practice, another metacognitive technique.
  • Assessing: instead of giving a traditional quiz at the start of class or five-minute paper at the end, ask students to produce a mind map of concept X covered in class. This alternative experiment will demonstrate to students a different approach and place another tool in their metacognitive toolbox.
  • Prioritizing: when items are placed in a mind map, something has to occupy center stage. Lesser items are contained in the radii.

Outcomes

Mind maps are easy, deceptively simple, fun, and they produce a deep learning experience. Don’t believe it? Stop reading now, take out a piece of paper, and mind map what you just read. We’re willing to bet that if you do, the result will provide a moment of reflection.

References

Buzan, T. (1993). The mind map book: How to use radiant thinking to maximize your brain’s untapped potential. New York: Plume Penguin.

McGuire, S. Y., & McGuire, S. (2015). Teach students how to learn: Strategies you can incorporate into any course to improve student metacognition, study skills, and motivation. Sterling, VA: Stylus.

Medina, J. (2008). Brain rules. Seattle: Pear Press.

Nosich, G. M. (2009). Learning to think things through: A guide to critical thinking across the curriculum. Upper Saddle River, NJ: Pearson.

Silver, N. (2013). Reflective pedagogies and the metacognitive turn in college teaching. In M. Kaplan, N. Silver, D. Lavaque-Manty, & D. Meizlish (Eds.), Using reflection and metacognition to improve student learning (pp. 1-17). Sterling, VA: Stylus.

Sweet, C., Blythe, H., & Carpenter, R. (2016). It works for me, metacognitively. Stillwater, OK: New Forums.

Appendix: How to Use Word to Create a Mind Map

  1. Click Insert.
  2. Click Shapes and select Circle.
  3. Click on desired position, and the circle will appear.
  4. Click on Draw Textbox.
  5. Type desired words in textbox (you may have to enlarge the textbox to accommodate words).
  6. Drag textbox into center of circle.
  7. Repeat as desired.
  8. To connect circles, click Insert, click Shapes, and select Line.
  9. Drag Line between circles.
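
For readers who prefer a scriptable alternative to the Word steps above, the same circle-label-line construction can be sketched in Python by emitting plain SVG. The layout values and the `mind_map_svg` helper below are our own illustration, not part of the original appendix.

```python
import math

def circle_with_label(x, y, r, label):
    """Steps 2-7 of the appendix: a circle with text centered inside it."""
    return (f'<circle cx="{x}" cy="{y}" r="{r}" fill="none" stroke="black"/>'
            f'<text x="{x}" y="{y}" text-anchor="middle">{label}</text>')

def connecting_line(x1, y1, x2, y2):
    """Steps 8-9 of the appendix: a line drawn between two circles."""
    return f'<line x1="{x1}" y1="{y1}" x2="{x2}" y2="{y2}" stroke="black"/>'

def mind_map_svg(fpc, ideas, cx=300, cy=200, spoke=140):
    """Place the FPC at the center and radiate the basic ordering ideas around it."""
    parts = [circle_with_label(cx, cy, 50, fpc)]
    for i, idea in enumerate(ideas):
        angle = 2 * math.pi * i / len(ideas)
        x = round(cx + spoke * math.cos(angle))
        y = round(cy + spoke * math.sin(angle))
        parts.append(connecting_line(cx, cy, x, y))
        parts.append(circle_with_label(x, y, 35, idea))
    return ('<svg xmlns="http://www.w3.org/2000/svg" width="600" height="400">'
            + "".join(parts) + "</svg>")

svg = mind_map_svg("METACOGNITION",
                   ["retrieving", "reflection", "exam wrappers", "growth mindset"])
# e.g., open("mindmap.svg", "w").write(svg), then open the file in a browser
```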

Hate-Inspired Webforums, PTSD, and Metacognition

by Roman Taraban, Texas Tech University

In linguistics, a register is a variety of speech used for distinct purposes in particular social settings. Consistent with that terminology, I use the term discourse register here to refer to the sets of specific terms, meanings, and vocabularies that groups use to achieve distinct purposes. Unlike a dictionary, a register is concerned less with the meanings of words than with their association with cognitions, affects, and behaviors. The concept can link together phenomena as disparate as hate speech, PTSD, and metacognition because each has a distinct discourse register, that is, each applies a specific vocabulary and manner of speech. The purpose of this blog post is twofold: first, to suggest that these disparate phenomena are similar in the way they operate; second, to suggest a way of increasing our understanding of metacognitive processing by beginning to apply some of the technology that has already been extensively applied to hate-inspired webforums and trauma-related therapies.

Regarding hate speech, the internet has given radical right groups the means to organize networks that often promote bias, bigotry, and violence. An example is Stormfront (https://www.stormfront.org/forum/), established by white supremacist and ex-felon Don Black in 1996 (Figea, 2016). Right-wing extremists use the internet to build identity and unity with “like-minded” individuals, which has prompted researchers and government analysts to study extremist communications in order to understand these groups. Importantly, analysts search these communications for key indicators that could signal future events (Figea, 2016; Figea et al., 2016).

What are the key indicators in extremist communications? The answer lies in part in the concept of a discourse register: the specific vocabulary and ways of communicating that characterize the shared conversations and practices of a group. For example, Figea (2016) applied machine learning to Stormfront forum exchanges in an attempt to assess the level of three affects: aggression toward an outgroup, racism, and worries about the present or future. A sample of forum posts was first classified by human coders for these affects; a model was then trained on the human classifications and tested on a new sample of posts. Key indicators for the racism affect were black, race, Jew, protest, and Zionist, corresponding to forum topics associated with Black inferiority, Jewish conspiracy, and government corruption (Figea, 2016).
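
The train-on-human-labels, test-on-new-posts procedure just described can be sketched with a toy bag-of-words naive Bayes classifier. The labeled example posts below are invented, innocuous stand-ins for human-coded forum data, and the two-label scheme is a deliberate simplification of the study's richer affect categories.

```python
import math
from collections import Counter, defaultdict

# Invented, innocuous stand-ins for a human-coded training sample.
train = [
    ("i am worried about what comes next", "worry"),
    ("the future scares me and i fear losing my job", "worry"),
    ("had a great lunch with friends today", "neutral"),
    ("the weather was pleasant this afternoon", "neutral"),
]

def fit(labelled):
    """Learn per-label word counts and label frequencies (bag-of-words)."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in labelled:
        label_counts[label] += 1
        for w in text.split():
            word_counts[label][w] += 1
            vocab.add(w)
    return word_counts, label_counts, vocab

def predict(model, text):
    """Naive Bayes with add-one smoothing: pick the most probable label."""
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        n = sum(word_counts[label].values())
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / (n + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

model = fit(train)                      # "train on the human classifications"
print(predict(model, "i fear the future"))   # "test on a new sample" -> worry
```

The key indicators Figea reports are, in effect, the words whose counts most strongly separate the labels in a model of this general kind.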

The idea of a shared discourse provides the theoretical glue that binds together the activities, speech, and shared identity of groups of individuals. In some cases, the analysis of discourse has provided insights into the motivations and behaviors of extremist and terrorist groups, as described by Figea and colleagues (Figea, 2016; Figea et al., 2016). In other cases, researchers have applied discourse analysis to prosocial activities involving counseling and therapy. Pennebaker and King (1999) proposed that “the way people talk about things reveals important information about them” (p. 1297). To assist in their analyses, Pennebaker and colleagues developed and tested the LIWC (Linguistic Inquiry and Word Count) software, which has been successfully applied to texts in a variety of contexts and across a wide range of dimensions, including emotionality, social status, group processes, close relationships, deception and honesty, thinking styles, and individual differences (Tausczik & Pennebaker, 2010).

Jaeger et al. (2014) examined the associations between trauma-related experiences (e.g., PTSD, depression, anxiety) and the content of the narratives written by trauma patients. The researchers found significant differences between daily versus trauma-related narratives in the use of cognitive-mechanism words (e.g., cause, know, ought) and negative emotion words (e.g., hate, worthless, enemy). There were also strong associations between the words that patients used and the severity of their trauma. The approach and outcomes in Jaeger et al. were similar to those in the work of Figea and colleagues.

A strength of the LIWC software is that it allows individuals to develop their own specialized dictionaries and import them into LIWC to analyze language use for evidence of the target constructs. When individuals express sadness, for example, they use words like sad, loss, cry, and alone (Pennebaker & King, 1999); sadness is part of a person’s emotion register. Can we apply this analytic approach to metacognition and ask, what is the discourse of metacognition? As instructors, how do the ways we talk about teaching reflect a metacognitive register – that is, words that reflect an understanding of cognitive functioning, learning, limitations, self-regulation, monitoring, scaffolding, and so on? How do the ways we talk about students, classrooms, homework, and student collaboration mirror metacognitive understanding and processing? Current technology allows us to begin exploring these questions. Following the model provided by Figea (2016; Figea et al., 2016), one place to start might be this Improve With Metacognition (IWM) forum. Published scholarship on metacognition would be another source of texts for training a machine to detect key metacognitive indicators. Human coders would code sentences in a sample of the texts as involving or not involving metacognition; these classifications would be used to train a machine, which would then be tested on a new sample of texts.
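
The dictionary-based core of a LIWC-style analysis is straightforward to sketch: count what fraction of a text's tokens fall in a hand-built category word list. The metacognitive word list and sample sentence below are hypothetical illustrations only; a real metacognitive register would require the expert construction and validation discussed next.

```python
import re

# A hypothetical starter list for a "metacognitive register" -- a validated
# LIWC-style category would require expert construction and testing.
METACOG_WORDS = {
    "monitor", "monitoring", "self-assess", "self-regulation", "regulate",
    "scaffold", "scaffolding", "reflect", "reflection", "strategy",
    "understand", "understanding", "aware", "awareness", "plan", "revise",
}

def category_rate(text, dictionary):
    """LIWC-style score: the fraction of tokens that match the category dictionary."""
    tokens = re.findall(r"[a-z][a-z-]*", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in dictionary)
    return hits / len(tokens)

sample = "I monitor my understanding and revise my strategy when I am not aware of gaps."
print(round(category_rate(sample, METACOG_WORDS), 2))  # -> 0.33
```

Real LIWC dictionaries also support wildcard stems (e.g., reflect*) and dozens of categories scored in one pass; the proportion-of-matching-words logic, however, is essentially this.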

Development of a metacognitive register is subject to the same constraints as any good scholarship. The developers need to be experts in the area of metacognition, and they need to have a clear grasp of how metacognition works. The linguistic analysis dictionary that they develop needs to be accurate and comprehensive. It needs to be a team effort – one individual cannot do it alone. The dictionary needs to be tested for construct validity, internal consistency, and for reliable test results across a variety of participants and contexts. In spite of the challenges inherent in the task, the prospect of a ready analytic tool for metacognition could help in advancing the application of the powerful cognitive suite of metacognitive processes in classrooms.


References

Figea, L. (2016). Machine learning for affect analysis on white supremacy forum. Retrieved from https://uu.diva-portal.org/smash/get/diva2:955841/FULLTEXT01.pdf

Figea, L., Kaati, L., & Scrivens, R. (2016). Measuring online affects in a white supremacy forum. In 2016 IEEE Conference on Intelligence and Security Informatics (ISI). DOI: 10.1109/ISI.2016.7745448

Pennebaker, J. W., & King, L. A. (1999). Linguistic styles: Language use as an individual difference. Journal of Personality and Social Psychology, 77(6), 1296-1312.

Tausczik, Y. R., & Pennebaker, J. W. (2010). The psychological meaning of words: LIWC and computerized text analysis methods. Journal of Language and Social Psychology, 29(1), 24-54.