From Faculty to Chair: Lessons Learned

by Dr. Scott Santos, a faculty member who became Chair of the Department of Biological Sciences in 2018 and was thereby thrust into leadership of the curricular redesign project.  

In the final post of “The Evolution of Metacognition in Biological Sciences” guest series, Dr. Scott Santos shares his experience of moving from being a faculty member in the department when the effort to improve metacognition began, to becoming chair and suddenly being in a position to lead it. He also shares key lessons the department learned over the course of the project and looks ahead to what’s next for the Biology department.

Learning about Metacognition

“Metacognition… Huh? What’s that?!” is what popped into my mind the first time I came across the word, in an email announcing that our department would be investigating ways to integrate it into courses at our 2017 Auburn University (AU) Department of Biological Sciences (DBS) Annual Faculty Retreat. As testimony to my naïveté on metacognition at the time, that email is the first to contain the word among the 100,000+ pieces of correspondence I have accumulated since starting as a faculty member in 2004.

The email prompted me to spend a few minutes researching the word and, not surprisingly, discovering a wealth of internet resources. One of the most useful came from the Center for Teaching at fellow Southeastern Conference school Vanderbilt University, which defined metacognition as, “simply, thinking about one’s thinking” (Chick, 2015). I found this interesting because it reminded me of a recurring comment I have heard over the years from individuals who have successfully defended their Ph.D. dissertations: that one’s defense makes you realize how much you know about one particular area of knowledge while realizing how little you know about everything else.

The point that jumped out at me concerning this potential analogy was that, if it represented a genuine example of metacognition, such awareness evolves in an individual over multiple years as they experience the trials and tribulations (as well as the rewards and eventual success) associated with obtaining a terminal degree. Ambitiously, we were taking on the challenge of instilling in early-career students an awareness and recognition of their strengths and weaknesses across the spectrum of learning, writing, reading, and more. As you can imagine, this was my first indicator that we had some work to do.  

Educating Our Department

So how did the DBS faculty at AU approach this seemingly daunting task of bringing metacognition “to the masses”?

First, our previous departmental leadership had the foresight to start the process by having the retreat facilitated by highly qualified individuals. Specifically, Dr. Ellen Goldey (currently Dean, Wilkes Honors College, Florida Atlantic University; formerly Department Chair, Wofford College, SC) and Dr. April Hill (Chair, Department of Biology, University of Richmond), two nationally recognized leaders involved in the National Science Foundation (NSF)-funded Vision and Change (V&C) Report (Brewer & Smith, 2011), were recruited to conduct a workshop that included integrating metacognition into our curricula. This proved highly useful in helping our faculty begin to wrap our collective minds around what metacognition was (and could be), along with how we might approach integrating it into our existing and future courses. We have followed up with general faculty meetings and subcommittee meetings, such as those of the DBS Curriculum Committee, held at regular intervals to continue this process. 

photo of Bio Dept faculty fall 2019
Fall 2019 Biology faculty retreat group photo. Scott Santos, front and center, giving the Shaka sign.

Overall, I am happy to report that we have made significant progress in this area. We have held specialized workshops on the topic; discussed approaches to incorporating metacognitive prompts into midterms, finals, and surveys of undergraduate research experiences (which collect responses for future qualitative analyses); worked toward integrating metacognition development and assessment into our budding ePortfolio initiative; and pursued other activities, like this blog series. However, these modest successes have not come without challenges: as our “metacognition massacre” experience taught us, it takes significant time and energy for such efforts to come to fruition and to seed and foster support among the faculty charged with bringing metacognition into the classroom.  

Key Insights

It has now been several years since AU DBS started our metacognition initiatives, and during this time I have transitioned from an individual faculty member “in the trenches” to Chair of the department, charged with “leading the troops.” While I would be well-off financially if I had received a nickel for every time I have been offered “congratulations and condolences” in the year and a half since becoming Chair, the role has given me a new and different perspective on our metacognitive efforts:  

  1. First, a passionate and dedicated team is needed for initiatives like this to prosper, and we, as a department, have been fortunate to have one in the form of our AU colleagues who are also contributing blogs in this series. Importantly, they belong to multiple units outside DBS and thus bring expertise and perspectives that we lacked or might otherwise have missed. We are greatly indebted to them, and department chairs and heads intending to start similar initiatives would be wise to establish and cultivate such collaborations early in the process. It is very helpful to have expert advice when tackling issues that are unfamiliar to most of the faculty.
  2. Second, working with your faculty to understand what metacognition is, along with defining expectations and assessment for initiatives around it, is paramount for your department’s immediate success in implementing such efforts. In our case, the fact that many AU DBS faculty wrestled with the concept of metacognition meant that we had to invest more time in calibration before discussion could move forward.
  3. Third, the significance of soliciting undergraduate student participation during the development and implementation stages should not be undervalued, since students are the constituents our efforts ultimately target and thus deserve a voice at the table. Although our posts in this series have highlighted inflection points for the faculty as we moved our curriculum toward more metacognition, it is critical to note that we involved students as partners throughout the process. Strategies we used included organizing student focus groups led by facilitators outside the department, conducting surveys, and inviting students to some meetings and department retreats.

Importantly, this should not be considered an exhaustive list; rather, it should serve as a general guide to issues worth considering, from someone who has had the opportunity to witness and participate in the process from both the faculty and leadership perspectives.  

Looking Toward the Future

What does the future hold for AU DBS when it comes to metacognition? In the short term, we will continue to implement the initiatives described above while being opportunistic about improving them, a strategy consistent with the current stage of our efforts to develop metacognitive abilities in students enrolled in our programs. The long-term forecast, at least from the departmental standpoint, is more amorphous, in part because of the need to involve a large number of newly recruited faculty. We look forward to new directions and possibilities as we learn new strategies from our colleagues, though we recognize the need to balance and maintain synergy between our undergraduate and graduate programs in the face of limited resources.

Finally, a key element of metacognition highlighted by Vanderbilt’s Center for Teaching is “recognizing the limit of one’s knowledge or ability and then figuring out how to expand that knowledge or extend the ability.” Given that, I would like to think that Auburn’s Department of Biological Sciences is itself attempting to be metacognitive in its approach to preparing and fostering metacognition in students, and it will be interesting to see how our current efforts evolve in the future.  

References:

Chick, N. (2015). Metacognition: Thinking about one’s thinking. Vanderbilt University Center for Teaching. 

Brewer, C. A., & Smith, D. (2011). Vision and change in undergraduate biology education: A call to action. American Association for the Advancement of Science, Washington, DC.


Assessing Metacognition: A Plan for Learning Improvement

In the fourth post of “The Evolution of Metacognition in Biological Sciences” guest series, Dr. Lindsay Doukopoulos describes the goals and outcomes of the spring meetings of the Biology curriculum committee that were led by the Biggio Center and the Office of Academic Assessment. Namely, they sought to create a questionnaire, to be given to all graduating students, that would allow them to reflect on their learning over the course of their academic careers, and a rubric to measure the quality of metacognition in their responses.

by Dr. Lindsay Doukopoulos, Assistant Director of the Biggio Center for the Enhancement of Teaching and Learning. In collaboration with the Office of Academic Assessment on the Learning Improvement Initiative, she leads faculty development initiatives designed to connect faculty with strategies, resources, and partners to support their teaching. 

In Spring of 2019, Katie Boyd and I led four meetings with Biology’s curriculum committee with the two-part goal of producing a set of metacognitive-reflection questions to be completed by every graduating student in their capstone course and a rubric to assess the quality of metacognition evidenced in the reflections.  

Developing the Rubric

In the first workshop, our goal was to help faculty unpack the definition of metacognition into categories and decide how many levels or standards to maintain within the rubric. In other words, we were hoping to fill in the x and y axes of the Metacognition rubric.

To facilitate this discussion, we brought two rubrics designed to measure metacognition. One came from the General Learning Outcome site of Cal State University-San Bernardino (CSUSB). The other came from the AAC&U VALUE rubric on Lifelong Learning, specifically the two elements called Transfer and Reflection. Both rubrics appeared to offer valuable ways of assessing the metacognition evident in a written reflection. Rather than choose one or the other, we combined the two categories (rows) of the AAC&U VALUE rubric with the three categories (rows) of the CSUSB rubric. We also decided to use four standards of quality (columns) and, after discussing terminology, settled on: Beginning or N/A, Emerging/Developing, Mastery, and Exceeding Expectations.  

photo of a curriculum meeting
Spring 2019 Biology undergraduate curriculum meeting number 1: creating a rubric to assess metacognition using the newly defined and approved SLO 6.

In the second workshop, our goal was to fill in the performance criteria, or behavioral anchors, that would make up the rubric. After much discussion, we again decided to leave the full five-row rubric intact and pilot it in our next meeting to determine whether the AAC&U elements or the CSUSB elements would be preferable.

In our third workshop, we piloted the rubric by scoring a packet of student reflections that had come out of the Biology undergraduate research capstone course the previous year. In practice, the faculty found the two elements of the AAC&U rubric easier to apply and more valuable for differentiating the quality of metacognition in the student responses. Thus, we reduced the final rubric to those two elements.

chart showing a metacognition rubric for biological sciences
This is the rubric to assess metacognition that came out of Biology’s spring curriculum committee meetings with Biggio Center and Academic Assessment.
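For readers who want to operationalize a rubric like this, the final structure (two elements by four quality levels) can be sketched as a simple data structure. The element names follow the AAC&U labels mentioned above; the scoring helper, the numeric 0–3 scale, and the example ratings are purely hypothetical and not part of the department's actual process.

```python
# Hypothetical sketch of the final metacognition rubric:
# two elements (rows) x four standards of quality (columns).
LEVELS = ["Beginning or N/A", "Emerging/Developing",
          "Mastery", "Exceeding Expectations"]
ELEMENTS = ["Transfer", "Reflection"]

def score_reflection(ratings: dict) -> float:
    """Average per-element level ratings (0-3) into one score."""
    assert set(ratings) == set(ELEMENTS), "rate every element"
    return sum(ratings.values()) / len(ratings)

# Example: a rater judges Transfer at "Mastery" (index 2) and
# Reflection at "Emerging/Developing" (index 1).
print(score_reflection({"Transfer": 2, "Reflection": 1}))  # 1.5
```

Averaging across elements is just one possible design choice; a committee could equally well report the two element scores separately, as the faculty's preference for differentiating quality might suggest.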

Developing the Reflection Questions

In the final workshop, our goal was to draft and finalize questions that would be given to sophomores and seniors in the program. These questions would parallel those already being used in the undergraduate research capstone course. These are the questions the committee created:

  1. What has been your favorite learning moment in your major? Please describe it in detail and explain why it was your favorite.
  2. What were the most useful skills you learned in your major and why?
    1. Regarding the skills you listed in question 2: how do you know you learned them? Please provide specific examples.
    2. How do you plan to apply these skills in future courses or your career?
  3. As a student, what could you have done to learn more? If you could go back in time and give yourself advice, what would you say?
  4. Evaluate your capacity to design an experiment and generate hypotheses. Please provide specific examples of aspects of the scientific process you’re most and least confident about.
  5. Reflect on your view of science. How has your participation in your Biological Sciences major changed your view of science, if at all? Please provide specific examples.
  6. Reflecting on your learning journey, what do you value most about your major curriculum (i.e. the courses you took and the order you took them in)?

This question-writing process concluded the initial phase of the Learning Improvement Initiative as it led to the creation of the instrument the department will use to gather baseline data on the metacognition SLO. Moving forward, all students (roughly 75 majors per year) will complete the questionnaire during their capstone course and the curriculum committee will lead assessment using the rubric we created.

The goal is to have every student scored on the rubric every year beginning with baseline data collection in spring 2020 with students who have not experienced the “treatment” conditions, i.e. courses redesigned by faculty to improve metacognition. Over time, we expect that the faculty development workshops around transparent assignment design, reflective writing assignments, and ePortfolio pedagogy will result in graduates who are more metacognitive and data that reflects the learning improvement.  
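The comparison described above, baseline scores from students who have not experienced redesigned courses versus scores from later cohorts, can be sketched in a few lines. All numbers here are hypothetical placeholders, not actual DBS data.

```python
from statistics import mean

# Hypothetical rubric scores (0-3 scale) for two graduating cohorts.
baseline_2020 = [1.0, 1.5, 0.5, 2.0, 1.0]  # before course redesigns
cohort_2022 = [1.5, 2.0, 2.5, 1.5, 2.0]    # after "treatment" courses

# Learning improvement = change in mean rubric score across cohorts.
improvement = mean(cohort_2022) - mean(baseline_2020)
print(round(improvement, 2))  # 0.7
```

In practice, a department would also want inter-rater reliability checks and a larger sample (the roughly 75 majors per year mentioned above) before treating a difference in means as evidence of improvement.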


Re-Defining Metacognition: Generating Faculty Engagement

In the third post of “The Evolution of Metacognition in Biological Sciences” guest series, Dr. Chris Basgier describes the workshop series, led by the Office of University Writing and the Biggio Center, that helped the department redefine metacognition in such a way that they felt they could understand it, teach it, and assess it. He also unpacks the value of the new definition and points to the work ahead as Biology embraces ePortfolios as part of a pedagogical strategy to increase metacognition in their students.

by Christopher Basgier, Associate Director of University Writing 

In the fall semester of 2018, Lindsay Doukopoulos and I had the opportunity to guide faculty from Auburn University’s Department of Biological Sciences (DBS) through a series of workshops devoted to metacognition. These workshops were a direct response to the “metacognition massacre” that had occurred at the August 2018 faculty retreat, as Dr. Robert Boyd recounted in the first blog post in this series.

Fall 2018 Biology faculty workshop number one: using the TILT Higher Ed transparent assignment design framework to improve metacognition.

Essentially, DBS faculty were uneasy with the definition of metacognition contained in the department’s student learning outcome (SLO), and unsure how to implement metacognitive activities in their courses. Working together, Lindsay and I decided to use these workshops to introduce faculty to the principles of transparent assignment design, offer guidance on integrating reflective writing into courses, and work with them to redefine the metacognition SLO in more familiar terms. 

Transparent Design

We began with transparent assignment design and reflective writing—rather than the SLO—to generate faculty engagement in metacognition. With such concrete strategies for promoting metacognition under their belts, we decided, faculty would be more invested in redefining the SLO and more willing to commit to aligning their courses to that outcome.  

Lindsay led the effort to introduce transparent assignment design to DBS workshop participants. According to Mary-Ann Winkelmes with TILT Higher Ed (2016), transparent assignment design invites faculty to clarify how and why students are learning course content in particular ways. Transparently designed assignments include   

  1. The assignment’s purpose, including the skills students will practice and the knowledge they will gain  
  2. The task, including what students will do and the steps they should take to complete the assignment  
  3. Criteria for success, including a checklist or rubric and examples of successful student work  

From our perspective, transparently designed assignments can promote metacognition. They make explicit what is often implicit in course assignments, and they help students see how a given assignment fits within the larger context of a course, and even a curriculum. The best designed assignments show students how to draw on what they already know, and help them imagine future implications of their work.  

We gave faculty ample time during the first workshop to consider how they would revise one or more assignments using the transparent assignment design framework, but we also knew that students needed to take an active role in their learning if they were to enhance their metacognitive capabilities. Therefore, I led a second workshop on reflective writing.   

Reflective Writing Component

In Auburn’s Office of University Writing, where I work as Associate Director, we spend a lot of time introducing principles of reflective writing to faculty, largely because we are in charge of the ePortfolio Project, Auburn’s Quality Enhancement Plan required for accreditation by SACSCOC.

The ePortfolios we support are polished, integrative, public-facing websites that students can use to showcase their knowledge, skills, and abilities for a range of audiences and purposes. A key component of ePortfolios, reflective writing is a metacognitive practice that invites students to articulate learning experiences, ask questions, draw connections, imagine future implications, and repackage knowledge for different audiences and purposes. After introducing DBS faculty to various levels of reflective writing, I gave them time to develop a reflective writing activity that would support a project or experience already in play in the courses.   

Our hope in these first two workshops was to give DBS faculty practical tools for promoting metacognition in their courses that would not require wholesale course redesign. Transparent assignment design and low-stakes reflective writing are fairly easy to implement in most course contexts.

Redefining the Metacognition Learning Objective

Our third workshop required more intellectual heavy lifting, as it focused on redefining the metacognition SLO. The original metacognition SLO read as follows:  

Students will develop metacognitive skills and be able to distinguish between broad categories of metacognition as applied to their major. In particular, they will distinguish between foundational (i.e., knowledge recall) and higher order (i.e., creative, analysis, synthesis) metacognitive skills.  

The trouble with this definition is that it seems to require students to be able to define different kinds of metacognition (which is difficult enough for faculty), rather than put different kinds of metacognition into practice, regardless of whether or not they can name the metacognitive “categories” they are using.

As an alternative, I turned to research by Gwen Gorzelsky and colleagues, scholars in writing studies who developed a taxonomy of kinds of metacognition. In their framework, the richest form of metacognition is constructive metacognition, which they define as “Reflection across writing tasks and contexts, using writing and rhetorical concepts to explain choices and evaluations and to construct a writerly identity” (2016, p. 226). 

Attracted to the notion that metacognition involves reflection on choices and the construction of identity, Lindsay and I tried our hand at a revised definition:  

Metacognition is defined as the process by which students reflect on and communicate about their role in learning. Reflection and communication may include: 1. students’ choices made in response to the affordances and constraints on learning, and/or 2. students’ evaluations of the success of such choices, particularly across tasks and contexts. Ultimately, these activities should help students develop and articulate identities as scientists.  

Our goal in composing this definition was not to suggest to DBS faculty that it was the right one, only that alternatives were possible. During the final workshop, we asked them to review the original SLO as well as our alternative, and then apply some “critical resistance” to each by reflecting on which terms or ideas made sense, which did not, and what language they might like to include. After much discussion, the group developed a revised SLO:  

Students will develop their metacognitive skills. Metacognition is defined as the process by which students reflect on and communicate about their role in learning. Reflection and communication may include: 1. Awareness of choices made in response to the opportunities (i.e., homework, office hours, review sessions) and constraints (i.e., challenging problems, short time frames) on learning, and/or; 2. Evaluation of the success of such choices, particularly across tasks and contexts. Ultimately, these activities should help students develop and articulate their science knowledge and its value to their professional and lifelong learning goals.   

This definition includes some key changes and additions. It eliminates jargon like “knowledge recall” and “affordances” in favor of more accessible language like “opportunities,” which are further defined in parentheses. Faculty also pushed back on the idea that all students should develop identities as scientists. A great number of students who take DBS courses plan to go into medical fields, so instead, they wanted to put the emphasis on science knowledge, a much more portable focus than science identity. They also added the notion of professional and lifelong learning goals to acknowledge the varied contexts in which their science knowledge might be relevant.   

In the end, our metacognition workshop was a success: the department approved the new definition in December 2018, and many commented on how much clearer and easier to implement and assess it appeared. But our work is not done. Faculty still need to integrate metacognition throughout the curriculum—or at least in courses where it is feasible. The department has agreed that ePortfolios are an effective vehicle for doing so.

ePortfolios to Support Implementation

DBS had joined the ePortfolio Cohort (the group of departments and units committed to implementing ePortfolios) in 2017 and has been working steadily on implementation. Valerie Tisdale, the department’s academic advisor, began the effort to introduce ePortfolios in BIOL 2100, a professional practice course for undergraduate biology majors, in fall 2018. Most recently, in spring 2019, DBS faculty applied for and were awarded a grant to support an intensive summer workshop to further the integration of ePortfolios in support of metacognition and written communication. My colleague Amy Cicchino and I met with three department members—Lamar Seibenhener, Joanna Diller, and Valerie Tisdale—for four weeks in summer 2019. Utilizing the resources of the ePortfolio Project, the departmental team developed a host of materials for a new, required course that asks students to complete their final ePortfolios during their senior year. 

In the interest of transparent assignment design, they also created an ePortfolio “roadmap” that would help DBS majors understand what an ePortfolio is, why it is important for students in the sciences, and where in the curriculum they might encounter artifacts that could be used as evidence of their knowledge, skills, and abilities. The department approved the new course and completed the roadmap at a retreat in late 2019.

At this point, we are awaiting university-level approval of the new course. In the meantime, we are also planning workshops for DBS faculty on designing meaningful assignments that can be used as ePortfolio artifacts. Taken together, these efforts will help DBS support metacognition through ePortfolios in the years to come.  

References

Gorzelsky, G., Driscoll, D. L., Paszek, J., Jones, E., & Hayes, C. (2016). Cultivating constructive metacognition: A new taxonomy for writing studies. In Critical Transitions: Writing and the Question of Transfer, 215.

Winkelmes, M. A., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K. H. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 18(1/2), 31-36. 


Project Beginnings

In the second post of “The Evolution of Metacognition in Biological Sciences” guest series, Dr. Katie Boyd describes the activities of the year prior to the “Metacognition Massacre.” These early activities started with the Learning Improvement Initiative, which marked the beginning of the collaboration among Biology, the Office of Academic Assessment, and the Biggio Center. She also describes how the initial definition of the metacognition learning outcome came about, how the department came to a greater understanding of metacognition, and how that understanding prompted a redefinition of what they believe metacognition is and should be within their context.

by Katie Boyd, Associate Director of the Office of Academic Assessment 

Luckily, the work of the Department of Biological Sciences to increase their graduating students’ metacognitive skills did not simultaneously begin and end with the “Metacognition Massacre” of 2018! If we back up just one year, the department was coming off of a strong fall faculty retreat and was ready to turn attention to the thoughtful examination of their curriculum and the knowledge, skills, and abilities expected of all students graduating from their program(s).    

In 2017, each undergraduate degree program in the Department of Biological Sciences (Marine Biology; Microbial, Cellular, & Molecular Biology; and Organismal Biology) had two student learning outcomes, which addressed critical reading, information literacy, and communication skills. Metacognition had only just entered the conversation: it was not a thoughtful component of the curriculum, nor was it a learning outcome for their graduating students. The Department of Biological Sciences needed help. 

Partnering with the Teaching & Learning Center

Enter Auburn University’s Biggio Center for the Enhancement of Teaching and Learning and the Office of Academic Assessment. That fall semester, the two offices joined forces to support programs interested in evidencing learning improvement and jointly issued a request for learning improvement proposals.  

The learning improvement initiative was a way for programs to demonstrate positive impact by showing how investment in innovative curricular experiences could lead to improved student learning. The Biggio Center and the Office of Academic Assessment wanted to help programs evidence this improvement. Of note, most departments redesign their curriculum infrequently or lack the data to inform a redesign, delaying their ability to showcase the improved preparedness of their graduates. We anticipated that the joint support of a teaching and learning center AND an assessment office would provide programs with many benefits, such as:   

  • A streamlined approach to aligning assessment processes with curricular innovation(s)  
  • The possibility of improving their students’ learning  
  • Strengthened program reputation  
  • Faculty satisfaction with process and outcome(s)  
  • Demonstrated good stewardship of departmental/college resources  
  • Opportunity for presentation/publication  

Biological Sciences submitted a proposal asking for support to define, measure, and improve metacognition among their graduates, and they were chosen as one of six programs to participate in the inaugural cohort of learning improvement teams. Their specific reasons for choosing this outcome for their Learning Improvement Project are outlined below: 

  1. Metacognition was an element in the Action Plan developed by departmental representatives at the PULSE Institute in June 2016. It was selected for the Action Plan because it had been a neglected element in the department’s curricular planning.  
  2. This SLO was new to the department-wide list of SLOs and, of all the SLOs, was the one with which faculty were least familiar. Specifically, the program felt they would need the most assistance integrating this outcome into their degree programs. 
  3. A final reason was the hope that working with the Office of Academic Assessment and Biggio Center on improving students’ metacognition would eventually provide a model by which Biological Sciences could plan and implement curricular changes for their other SLOs.   

Writing the Learning Outcomes

Thus began the learning improvement project and, throughout the fall semester, the Office of Academic Assessment facilitated a number of Biological Sciences curriculum committee meetings to rewrite all of the department’s student learning outcomes (SLOs). The committee made incremental progress through bi-weekly meetings led by the department’s intrepid chair, who quickly guided the committee through writing six of the seven student learning outcomes; conversation, however, continued around the metacognition outcome. 

Bob Boyd showcases Biology’s Learning Improvement Project at the year one celebration event hosted by the Biggio Center and Academic Assessment in fall 2018.

A number of committee members advocated for the importance of metacognition and reflection and admitted to embedding reflection components in weekly lectures and/or assignments. On the other hand, the former department chair advocated for a definition that would be easily measurable and liked the idea of students being able to identify the level/type of learning being assessed in specific types of questions on exams or similar instruments (knowledge, comprehension, application). Bloom’s taxonomy drove much of this conversation. 

Eventually, the committee finalized a metacognition SLO (SLO 6) and completed their list of seven department-wide SLOs (eight or nine if you include major-specific outcomes). At the time, the curriculum committee defined the metacognition SLO as:  

Students will develop metacognitive skills and be able to distinguish between broad categories of metacognition as applied to their major. In particular, they will distinguish between foundational (i.e., knowledge recall) and higher order (i.e., creative, analysis, synthesis) metacognitive skills. 

The list of outcomes was shared with all program faculty during a fall faculty meeting and they voted to accept the list as the new set of outcomes.  There were few questions regarding the outcomes during this meeting.  However, I think we can all agree that this is pretty typical when these sorts of items/topics are brought up in faculty meetings.   

Creating the Curriculum Map

A secondary goal of the curriculum committee was to draft a curriculum map aligning the new student learning outcomes with the required courses in each of the three undergraduate curricula.  In the first few meetings, the committee finalized the list of classes to include in the map and then discussed how accurate a curriculum map could be when drafted by only a subset of the faculty.  The committee entered the mapping conversations with some apprehension because the faculty in the room did not represent or teach all of the courses within the curriculum map.

Eventually, it was decided that the committee would draft an aspirational curriculum map in which ideal alignments would be suggested and then discussed at a future faculty retreat. When it came to the metacognition outcome, the committee felt strongly that it should be covered in every required course and that each course should truly contribute to students’ lifelong learning.

Starting to Consider Assessment

With a set of student learning outcomes agreed upon, and a drafted curriculum map, the learning improvement conversation finally began to move towards assessment and measurement. Essentially, there needed to be a way to evaluate whether students were thinking about thinking and knowing about knowing. Enter the Office of University Writing.  It was at this point that Biological Sciences seriously considered ways in which ePortfolios could be used to both teach and assess metacognition.

Initial conversations targeted ePortfolios as a way to encourage reflective writing and simply “house” student assignments. The idea has since blossomed into much more than a data warehouse, and Chris Basgier (Office of University Writing) will expand on it in the next blog post.  This brings us to the fall 2018 faculty retreat, which allowed for a guided and thoughtful discussion around each outcome and the aspirational curriculum map.  It was this discussion that led to the thorough massacre of SLO 6, ultimately pointing to the need for a better definition of metacognition as a learning outcome.


“The Metacognition Massacre”

In the first post of “The Evolution of Metacognition in Biological Sciences” guest series, Dr. Bob Boyd reflects on the 2018 faculty retreat where Biology faculty rejected responsibility for teaching metacognition in their courses. He also shares where and how Biology’s journey to learning improvement around metacognition began.

By Robert Boyd, Professor of Biological Sciences and former Undergraduate Program Officer for Department of Biological Sciences (DBS). Currently, Associate Dean for Academic Affairs, College of Sciences and Mathematics 

My most memorable moment regarding metacognition occurred at a departmental faculty retreat in August 2018, right before the start of Fall Semester. Before this retreat, our departmental Curriculum Committee had created an “aspirational” curriculum map that purported to show which required courses addressed our brand-new list of eight or nine Student Learning Outcomes (SLOs) for each of the three majors in our department.

The Massacre

Metacognition, our new SLO 6, was selected as being a part of every required course. At the retreat, breakout groups were assigned to discuss and describe some aspects of several SLOs (one SLO per group, including a group assigned to “metacognition”) and put their ideas on a flipchart. When all the breakout groups reported, the metacognition group presented a blank flipchart page and said that they had been unable to decide what metacognition was.

Later during the retreat, when we discussed our “aspirational” curriculum map in order to convert it into a map showing which SLOs were actually addressed in our core classes, almost all the checkmarks for metacognition were removed. We asked faculty to place Post-its over the SLOs that they felt their courses did not need to address. In my mind, that retreat was a metacognition massacre. It showed that we needed to do some serious work to define that SLO, as well as to decide how to integrate and measure it in our curricula.

Photo of a chart showing a curriculum map from a faculty retreat

Image 1: Biology’s ideal curriculum map presented at the 2018 Retreat. Faculty used slips of pink paper to indicate rejection of an SLO they didn’t think their individual course (left hand column) addressed. SLO 6, metacognition, was almost entirely stricken from the curriculum.

This blog series will present my department’s work on metacognition, mainly focusing on how we have proceeded since the memorable metacognition massacre at that faculty retreat. But I want to take some time now to set the stage by describing my department and some of our work prior to that retreat.

Setting the Stage for the Metacognition Massacre

Auburn University is a land-grant school with about 30,000 students and has recently achieved Carnegie R1 status (meaning that research is an important part of our mission). My department, Biological Sciences, has 43 faculty, and our courses are vital to the university’s educational mission as well: in an academic year we teach about 45,000 student credit hours.

This new outcome effort began in January 2016, when one of our faculty, Jason Bond, became Chair and encouraged us to review our curricula, something that had not been done since 2008. Coincidentally, also in January 2016, our department was invited to participate in an NSF-funded Institute at Wofford College in South Carolina designed to help us begin the process of reviewing and revamping our programs.

In June 2016, a small departmental team attended the retreat, which was focused on a report by the American Association for the Advancement of Science (AAAS) on undergraduate biology education in the US. The report, entitled “Vision and Change in Undergraduate Biology Education: A Call to Action” (referred to as V&C below) and available from this link (https://live-visionandchange.pantheonsite.io/wp-content/uploads/2013/11/aaas-VISchange-web1113.pdf), pointed out that undergraduate biology education needed reform, and the workshop involved assessing our department and its curricula. The assessment used a rubric that listed “Student Metacognitive Skills” as one of the ten elements evaluated, with an exemplary department described as “Instructors regularly integrate practice of effective metacognitive strategies within assignments. Most students become adept at reflecting upon, and improving, their own learning and coaching their peers.”

To begin the work of moving as a department from having no outcomes related to metacognition to one that placed it squarely in the SLOs for all of our programs, we held a retreat in 2017 which focused on High Impact Practices (HIPs). This retreat was facilitated by two nationally known educational leaders: Dr. Ellen Goldey (Dean, Wilkes Honors College, Florida Atlantic University) and Dr. April Hill (Chair, Department of Biology, University of Richmond). Faculty engagement at this event was strong and led us to begin the work of formally committing to a curriculum that would address metacognition as an outcome of our undergraduate programs.

In the spring of 2018, we held faculty meetings to introduce V&C concepts and ask the faculty in each of our three majors to evaluate our programs. In every case we decided we were at a “Beginning” stage. According to the V&C rubric, this means “Rarely are students encouraged to reflect on their learning strategies and skills. Study strategies, when discussed, may not be specifically geared to STEM learning or the particular student’s needs.” These meetings led to the 2018 faculty retreat described earlier, which showed us how challenging it would be for us to understand and embrace our metacognition SLO.

Citations

Brewer, C. A., & Smith, D. (2011). Vision and change in undergraduate biology education: A call to action. American Association for the Advancement of Science, Washington, DC.


The Evolution of Metacognition in Biological Sciences

By Lindsay Doukopoulos, Assistant Director of the Biggio Center for the Enhancement of Teaching and Learning at Auburn University, and blog mini-series editor.

Much of the literature on metacognition focuses on strategies that faculty can use to improve metacognitive skills in their students and the benefits of such skills. Our mini-series tackles a different kind of problem: how can a department redesign its curriculum to improve metacognition for all students and how will it know if improvement has actually occurred?  We believe our efforts can inform others across a variety of disciplines.

Our answer to this question takes the form of a case study in five parts about our collaborative and ongoing efforts to redesign the Department of Biological Sciences’ undergraduate curriculum and program assessment with a goal of improving metacognition for its students and demonstrating that improvement with data. We use a narrative structure to present the key inflection points in this process as well as lessons learned and best practices from our diverse perspectives.

Our collaborators include: Associate Dean for Academic Affairs for the College of Sciences and Mathematics, Bob Boyd (also a Biological Sciences professor and formerly the department’s Undergraduate Program Officer); Associate Director of Academic Assessment, Katie Boyd; Associate Director of the Office of University Writing, Chris Basgier; Chair of the Department of Biological Sciences, Scott Santos; and Assistant Director of the Biggio Center for the Enhancement of Teaching and Learning, Lindsay Doukopoulos.  

This timeline provides an overview of our efforts while our individual posts go into more detail about specific strategies and outcomes:  

Ideation: 

June 2016: Department leaders attend PULSE Institute and decide to make metacognition a student learning outcome (SLO) for all undergraduate programs in the Department of Biological Sciences (hereafter, Biology) 

May 2017: Program assessment reports at this time include only two student learning outcomes (metacognition not one of them) for each of the three undergraduate programs in Biology 

August 2017: Faculty retreat led by NSF Vision & Change experts introducing metacognitive teaching strategies  

Commitment: 

October 2017: Learning Improvement Initiative launched by Biggio Center and Office of Academic Assessment: Biology proposes to improve SLO 6 – Metacognition  

Spring 2018: Biology’s curriculum committee develops a plan for improvement and creates an ideal (“aspirational”) curriculum map to share at the 2018 fall faculty retreat 

Lindsay Doukopoulos leading faculty development on metacognition at the 2018 Biology Faculty Retreat

Conflict: 

August 2018: Faculty retreat, aka “Metacognition Massacre” – widespread faculty rejection of the metacognition SLO on the curriculum map 

A New Approach: 

Fall 2018: A three-part workshop series created by Office of University Writing (OUW) and the Biggio Center leads faculty to redefine the metacognition SLO and introduces strategies to support faculty teaching  

Turning Point:  

December 2018: Outcomes of the workshop series, including the new definition of SLO 6, are presented at a faculty meeting and the faculty vote to approve the new definition  

Assessing Metacognition:  

January – April 2019: Office of Academic Assessment and the Biggio Center lead Biology’s curriculum committee in creating a metacognitive questionnaire for graduating students and a rubric to assess the level of metacognition evidenced in the responses 

Improving Metacognition: 

Summer 2019: Biology invests in comprehensive strategy to promote metacognition across the curriculum using ePortfolios and several faculty participate in an intensive course redesign program 

What now?  

Fall 2019: OUW and Biggio provide ongoing support of teaching interventions to improve metacognition; Office of Academic Assessment provides ongoing support of the assessment of this work 

What’s next? 

Spring 2020: Gather baseline data on graduates’ metacognitive capabilities 

Goals: Based on our efforts and an ongoing collection of data, we expect to see increases in students’ metacognitive abilities over time 


Enhancing Medical Students’ Metacognition

by Leslie A. Hoffman, PhD, Assistant Professor of Anatomy & Cell Biology, Indiana University School of Medicine – Fort Wayne

The third post in this guest editor miniseries examines how metacognition evolves (or doesn’t) as students progress into professional school.  Despite the academic success required to enter professional programs such as medical school or dental school, some students still lack the metacognitive awareness and skills to confront the increased academic challenges these programs impose.  In this post Dr. Leslie Hoffman reflects on her interactions with medical students and incorporates data she has collected on self-directed learning and student reflections on their study strategies and exam performance. ~Audra Schaefer, PhD, guest editor

————————————————————————————————-

The beginning of medical school can be a challenging time for medical students.  As expected, most medical students are exceptionally bright individuals, which means that many did not have to study very hard to perform well in their undergraduate courses.  As a result, some medical students arrive in medical school without well-established study strategies and habits, leaving them overwhelmed as they adjust to the pace and rigor of medical school coursework.  Even more concerning is that many medical students don’t realize that they don’t know how to study, or that their study strategies are ineffective, until after they’ve performed poorly on an exam.  In my own experience teaching gross anatomy to medical students, I’ve found that many low-performing students tend to overestimate their performance on their first anatomy exam (Hoffman, 2016).  In this post I’ll explore some of the reasons why many low-performing students overestimate their performance and how improving students’ metacognitive skills can help improve their self-assessment skills along with their performance.

Metacognition is the practice of “thinking about thinking” that allows individuals to monitor and make accurate judgments about their knowledge, skills, or performance.  A lack of metacognitive awareness can lead to overconfidence in one’s knowledge or abilities and an inability to identify areas of weakness.  In medicine, metacognitive skills are critical for practicing physicians to monitor their own performance and identify areas of weakness or incompetence, which can lead to medical errors that may cause harm to patients.  Unfortunately, studies have found that many physicians seem to have limited capacity for assessing their own performance (Davis et al., 2006).  This lack of metacognitive awareness among physicians highlights the need for medical schools to teach and assess metacognitive skills so that medical students learn how to monitor and assess their own performance. 

Cartoon of a brain thinking about a brain

In my gross anatomy course, I use a guided reflection exercise that is designed to introduce metacognitive processes by asking students to think about their study strategies in preparation for the first exam and how they are determining whether those strategies are effective.   The reflective exercise includes two parts: a pre-exam reflection and a post-exam reflection.  

The pre-exam reflection asks students to identify the content areas in which they feel most prepared (i.e., their strengths) and the areas in which they feel least prepared (i.e., their weaknesses).  Students also discuss how they determined what they needed to know for the upcoming exam and how they went about addressing their learning needs, then assess their confidence level and predict their expected performance on the upcoming exam.  After receiving their exam scores, students complete a post-exam reflection, which asks them to discuss what changes, if any, they intend to make to their study strategies based on their exam performance.

My analysis of the students’ pre-exam reflection comments found that the lowest performing students (i.e. those who failed the exam) often felt fairly confident about their knowledge and predicted they would perform well, only to realize during the exam that they were grossly underprepared.  This illusion of preparedness may have been a result of using ineffective study strategies that give students a false sense of learning.  Such strategies often included passive activities such as re-watching lecture recordings, re-reading notes, or looking at flash cards.  In contrast, none of the highest performing students in the class over-estimated their exam grade; in fact, many of them vastly underestimated their performance. A qualitative analysis of students’ post-exam reflection responses indicated that many of the lowest performing students intended to make drastic changes to their study strategies prior to the next exam.  Such changes included utilizing different resources, focusing on different content, or incorporating more active learning strategies such as drawing, labeling, or quizzing.  This suggests that the lowest performing students hadn’t realized that their study strategies were ineffective until after they’d performed poorly on the exam.  This lack of insight demonstrates a deficiency in metacognitive awareness that is pervasive amongst the lowest performing students and may persist in these individuals beyond medical school and into their clinical practice (Davis et al., 2006).

So how can we, as educators, improve medical students’ (or any students’) metacognitive awareness to enable them to better recognize their shortcomings before they perform poorly on an exam?  To answer this question, I turned to the highest performing students in my class to see what they did differently.  My analysis of reflection responses from high-performing students found that they tended to monitor their progress by frequently assessing their knowledge as they were studying.  They did so by engaging in self-assessment activities such as quizzing, either using question banks or simply trying to recall information they’d just studied without looking at their notes.  They also tended to study more frequently with their peers, which enabled them to take turns quizzing each other.  Working with peers also provided students with feedback about what they perceived to be the most relevant information, so they didn’t get caught up in extraneous details. 

The reflective activity itself is a technique to help students develop and enhance their metacognitive skills.  Reflecting on a poor exam performance, for example, can draw a student’s attention to areas of weakness that he or she was not able to recognize, or ways in which his or her preparation may have been inadequate.   Other techniques for improving metacognitive skills include the use of think-aloud strategies in which learners verbalize their thought process to better identify areas of weakness or misunderstanding, and the use of graphic organizers in which learners create a visual representation of the information to enhance their understanding of relationships and processes (Colbert et al., 2015). 

Ultimately, the goal of improving medical students’ metacognitive skills is to ensure that these students will go on to become competent physicians who are able to identify their areas of weakness, create a plan to address their deficiencies, and monitor and evaluate their progress to meet their learning goals.   Such skills are necessary for physicians to maintain competence in an ever-changing healthcare environment.

Colbert, C.Y., Graham, L., West, C., White, B.A., Arroliga, A.C., Myers, J.D., Ogden, P.E., Archer, J., Mohammad, S.T.A., & Clark, J. (2015).  Teaching metacognitive skills: Helping your physician trainees in the quest to ‘know what they don’t know.’  The American Journal of Medicine, 128(3), 318-324.

Davis, D.A., Mazmanian, P.E., Fordis, M., Harrison, R., Thorpe, K.E., & Perrier, L. (2006). Accuracy of physician self-assessment compared with observed measures of competence: A systematic review. JAMA, 296, 1094-1102.

Hoffman, L.A. (2016). Prediction, performance, and adjustments: Medical students’ reflections on the first gross anatomy exam. The FASEB Journal, 30(1 Supplement), 365.2.


Metacognition v. pure effort: Which truly makes the difference in an undergraduate anatomy class?

by Polly R. Husmann, Ph.D., Assistant Professor of Anatomy & Cell Biology, Indiana University School of Medicine – Bloomington

Intro: The second post of “The Evolution of Metacognition” miniseries is written by Dr. Polly Husmann, and she reflects on her experiences teaching undergraduate anatomy students early in their college years, a time when students have varying metacognitive abilities and awareness.  Dr. Husmann also shares data collected that demonstrate a relationship between students’ metacognitive skills, effort levels, and final course grades. ~ Audra Schaefer, PhD, guest editor

————————————————————————————————–

I would imagine that nearly every instructor is familiar with the following situation: After the first exam in a course, a student walks into your office looking distraught and states, “I don’t know what happened.  I studied for HOURS.”  We know that metacognition is important for academic success [1, 2], but undergraduates often struggle with how to identify study strategies that work or to determine if they actually “know” something.  In addition to metacognition, recent research has also shown that repeated recall of information [3] and immediate feedback also improve learning efficiency [4].  Yet in large, content-heavy undergraduate classes both of these goals are difficult to accomplish.  Are there ways that we might encourage students to develop these skills without taking up more class time? 

Online Modules in an Undergraduate Anatomy Course

I decided to take a look at this through our online modules.  Our undergraduate human anatomy course (A215) is a large (400+ students) course taken mostly by students planning to go into healthcare fields (nursing, physical therapy, optometry, etc.).  The course comprises both a lecture (3x/week) and a lab component (2x/week), with about forty students in each lab section.  We use the McKinley & O’Loughlin text, which comes with access to McGraw-Hill’s Connect website.  This website includes an e-book, online quizzes, A&P Revealed (a virtual dissection platform with images of cadavers), and instant grading.  Also available through the MGH Connect site are LearnSmart study modules.

These modules were incorporated into the course along with the related electronic textbook as optional extra credit assignments about five years ago as a way to keep students engaging with the material and (hopefully) less likely to just cram right before the tests. Each online module asks questions over a chapter or section of a chapter using a variety of multiple-choice, matching, rank order, fill-in-the-blank, and multiple answer questions. For each question, students are not only asked for their answer, but also asked to rank their confidence for their answer on a four-point Likert scale. After the student has indicated his/her confidence level, the module will then provide immediate feedback on the accuracy of their response. 

During each block of material (4 total blocks/semester) in our anatomy course during the Fall 2017 semester, 4 to 9 LearnSmart modules were available and 2 were chosen by the instructor after the block was completed to be included for up to two points of extra credit (total of 16 points out of 800).  Given the frequency of the opening scenario, I decided to take a look at these data and see what correlations existed between the LearnSmart data and student outcomes in our course.

Results

The graphs (shown below) illustrated that the students who got As and Bs on the first exam had done almost exactly the same number of LearnSmart practice questions, which was nearly fifty more questions than the students who got Cs, Ds, or Fs.  However, by the end of the course the students who ultimately got Cs were doing almost the exact same number of practice questions as those who got Bs!  So they’re putting the same effort into the practice questions, but where is the problem? 

The big difference is seen in the percentage of these questions for which each group was metacognitively aware (i.e., accurately confident when putting the correct answer or not confident when putting the incorrect answer).  While the students who received Cs were answering plenty of practice questions, their metacognitive awareness (accuracy) was often the worst in the class!  So these are your hard-working students who put in plenty of time studying, but don’t really know when they accurately understand the material or how to study efficiently. 
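The calibration measure described above can be sketched in a few lines of code. This is a hypothetical illustration, not LearnSmart's actual scoring: a response counts as "metacognitively aware" when confidence matches correctness (confident and right, or unconfident and wrong). The data format and function name are assumptions for the example.

```python
def metacognitive_accuracy(responses):
    """Fraction of (was_correct, was_confident) pairs where the two agree."""
    if not responses:
        return 0.0
    aware = sum(1 for correct, confident in responses if correct == confident)
    return aware / len(responses)

# An overconfident student: sure of every answer, but right only half the time
overconfident = [(True, True)] * 5 + [(False, True)] * 5
print(metacognitive_accuracy(overconfident))   # 0.5

# A well-calibrated student: confidence tracks correctness
calibrated = [(True, True)] * 7 + [(False, False)] * 3
print(metacognitive_accuracy(calibrated))      # 1.0
```

On this measure, a hard-working student who answers many questions confidently but incorrectly scores low, which is exactly the pattern the C students showed.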

Graphs showing questions completed as well as accuracy of self-assessment.

The statistics further confirmed that both the students’ effort on these modules and their ability to accurately rate whether or not they knew the answer to a LearnSmart practice question were significantly related to their final outcome in the course. (See right-hand column graphs.) In addition to these two direct effects, there was also an indirect effect of effort on final course grades through metacognition.  So students who put in the effort through these practice questions with immediate feedback do generally improve their metacognitive awareness as well.  In fact, over 30% of the variation in final course grades could be predicted by looking at these two variables from the online modules alone.

Flow diagram showing direct and indirect effects on course grade

Effort has a direct effect on course grade while also having an indirect effect via metacognition.
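The direct-plus-indirect structure in the diagram can be made concrete with a small mediation sketch. This uses simulated data with made-up coefficients, not the study's data or analysis; it only shows the logic that effort affects grade both directly and through metacognition, and that for ordinary least squares the total effect decomposes exactly into the two paths.

```python
import numpy as np

# Simulated stand-ins for the study's variables, with assumed coefficients
rng = np.random.default_rng(0)
n = 400
effort = rng.normal(size=n)                                # standardized effort
metacog = 0.5 * effort + rng.normal(size=n)                # "a" path
grade = 0.3 * effort + 0.4 * metacog + rng.normal(size=n)  # "c'" and "b" paths

def ols(y, *xs):
    """Least-squares coefficients for y ~ intercept + xs."""
    X = np.column_stack([np.ones(len(y)), *xs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

a = ols(metacog, effort)[1]                 # effort -> metacognition
_, direct, b = ols(grade, effort, metacog)  # c' (direct) and b paths
indirect = a * b                            # effect mediated through metacognition
total = ols(grade, effort)[1]               # c: total effect of effort on grade

# For OLS these decompose exactly: c = c' + a*b
print(f"direct={direct:.2f}, indirect={indirect:.2f}, total={total:.2f}")
```

With enough students, the estimated paths recover the simulated coefficients, and comparing `indirect` to `total` shows how much of effort's benefit flows through improved calibration.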

Take home points

  • Both metacognitive skills (ability to accurately rate correctness of one’s responses) and effort (# of practice questions completed) have a direct effect on grade.
  • The direct effect between effort and final grade is also partially mediated by metacognitive skills.
  • The amount of effort between students who get A’s and B’s on the first exam is indistinguishable.  The difference is in their metacognitive skills.
  • By the end of the course, C students are likely to be putting in just as much effort as the A & B students; they just have lower metacognitive awareness.
  • Students who ultimately end up with Ds & Fs struggle to get the work done that they need to.  However, their metacognitive skills may be better than many C level students.

Given these points, the need to include instruction in metacognitive skills in these large classes is incredibly important as it does make a difference in students’ final grades.  Furthermore, having a few metacognitive activities that you can give to students who stop into your office hours (or e-mail) about the HOURS that they’re spending studying may prove more helpful to their final outcome than we realize.

Acknowledgements

Funding for this project was provided by a Scholarship of Teaching & Learning (SOTL) grant from the Indiana University Bloomington Center for Innovative Teaching and Learning. Theo Smith was instrumental in collecting these data and creating figures.  A special thanks to all of the students who participated in this project!

References

1. Ross, M.E., et al., College Students’ Study Strategies as a Function of Testing: An Investigation into Metacognitive Self-Regulation. Innovative Higher Education, 2006. 30(5): p. 361-375.

2. Costabile, A., et al., Metacognitive Components of Student’s Difficulties in the First Year of University. International Journal of Higher Education, 2013. 2(4): p. 165-171.

3. Roediger III, H.L. and J.D. Karpicke, Test-Enhanced Learning: Taking Memory Tests Improves Long-Term Retention. Psychological Science, 2006. 17(3): p. 249 – 255.

4. El Saadawi, G.M., et al., Factors Affecting Feeling-of-Knowing in a Medical Intelligent Tutoring System: The Role of Immediate Feedback as a Metacognitive Scaffold. Advances in Health Sciences Education, 2010. 15: p. 9-30.


Learning about learning: A student perspective

by Caroline Mueller, B.S., Clinical Anatomy PhD student, University of Mississippi Medical Center

Intro: In this guest editor miniseries, “The Evolution of Metacognition”, we will be discussing a progression of metacognitive awareness and development of metacognition in multiple stages of education, from undergraduate, to graduate and professional students, and even faculty. In this first post Caroline Mueller, a doctoral student in an anatomy education program, is providing a student perspective.  She shares reflections on learning about metacognition, how it has shaped her approaches to learning, and how it is influencing her as an emerging educator.  ~Audra Schaefer, PhD, guest editor

———————————————————————————————-

As a second-year graduate student hearing the word “metacognition” for the first time, I thought the idea of “thinking about thinking” seemed like another activity imposed by teachers to take up more time. After looking into what metacognition actually means and the processes it entails, my mindset changed. It is logical to think about the thought processes that occur during learning. Engaging in metacognitive thought seems like an obvious, efficient way for students to test their knowledge—yet very few do it, myself included. In undergrad, I prided myself on getting high grades, thinking that my method of reading, re-writing, memorizing, and then repeating was labor-intensive but effective. It did the job, and it resulted in high grades. However, if my goals included retaining the content, this method failed me. If someone asked me today about the Krebs cycle, I could not recite it like I could for the test, and I definitely could not tell you about its function (something to do with glucose and energy?).

Upon entering graduate school, what I thought were my “fool-proof” study methods soon proved insufficient and fallible. The workload in medical gross anatomy and medical histology increased by at least twenty times (well, it felt that way, anyway). It was laborious to keep up with taking notes in lecture, re-writing, reading the text, and then testing myself with practice questions. I felt as though I was drowning in information, and I saw crippling arthritis in my near future. I then faced my first devastating grade. I felt cheated that my methods did not work, and I wondered why. Needing a change, I started trying different study methods. I kept reviewing the information, still re-writing, but self-quizzing with a small group of classmates instead of by myself. We would discuss what we got wrong and explain answers when we knew them. It helped improve my grades, but I wish I had had more guidance about metacognition at that point.

As I begin studying for my terrifying qualifying exams this semester, I am facing the daunting task of reviewing all the material I have learned in the last two years of graduate school. Easy task, right? Even though you may sense my dread, I have a different approach to studying because of what I’ve recently learned about metacognition. An important aspect of metacognition is self-assessment, using tools such as the pre-assessment and the muddiest (most confusing) point. The pre-assessment allows students to examine their current understanding of a topic and directs them to think about what they do and do not know, helping them focus their efforts on the elements they do not know or understand well (Tanner, 2012). The muddiest point tool can be used at the end of a long day of studying: students reflect on the information covered in a class or study session and identify what was muddiest (Tanner, 2012).

Both tools have shaped my approach to studying. Now I study by human body system, starting each system by writing down what I do know about the subject and then what I want to know by the end of my review. This helps me assess what I do and do not know, so that I can orient myself toward where I struggle most. At first it seemed like a time-intensive activity, but I quickly realized it was more efficient than rewriting and rereading content I already knew. I implemented the muddiest point in my studies too, because after a strenuous day of trying to grasp dense information, I often feel like I still do not know anything. After reviewing the information and filling in the gaps, at the end of my week of review I quiz myself and ask what I found most confusing. This helps me plan future study sessions.

Metacognition feels like it takes a lot of time when you first start because it makes the learner confront the difficult parts of a subject. Students, myself included, want the act of acquiring new information to be rewarding, quick, and an affirmation of their mastery of the material. For example, when I got an answer correct on practice questions while preparing for an exam, I never thought about why the correct answer was correct. Getting it right could have been pure luck; in my mind, I must have known the material. Thinking about the “why” prompts students to examine the thought process behind picking an answer. This act alone helps solidify understanding of the topic. If students can explain how they arrived at an answer, or why they believe an answer to be true, they can assess how well they understand the content.

[Cartoon of a brain working out, using books as weights]

My role as a student is beginning to change—I have become a teaching assistant, slowly on my way to full-on teacher status. After learning about metacognition and applying it as a student, I began trying it with the students I teach.

For example, an important part of metacognition is learning to recognize what you do and do not know. In anatomy lab, to prompt students to think more deeply about the material, I ask them what they know rather than just answering their questions. I let them describe the structure and ask them to explain why they think the structure is what it is.

When I first did this, students resisted—the stress of the first year of medical school makes students want the answer immediately so they can move on. But I persisted in asking questions, explaining that finding out what you do and do not know allows you to focus your studying on filling in those gaps.

Since I have only recently gone from student to teaching assistant, students often ask me about the best ways to study and how I studied. I again urge them to take an approach that helps identify gaps in their knowledge. I encourage them to go over the chapter headings and write down what they know about each one, essentially completing the pre-assessment I mentioned earlier.

At this point, my approach to instilling the incredible power of metacognitive skills in students may be a little rough, but I am still working out the kinks. I am still learning—learning to be an effective teacher, learning the content as a student, and learning to learn about teaching and learning. As both a student and a teacher, my hope is to learn to implement metacognitive methods effectively, to assess those methods, and to keep improving on them.

Tanner, K.D. (2012). Promoting student metacognition. CBE-Life Sciences Education, 11, 113-120. [https://www.improvewithmetacognition.com/promoting-student-metacognition/]


Paired Self-Assessment—Competence Measures of Academic Ranks Offer a Unique Assessment of Education

by Dr. Ed Nuhfer, California State Universities (retired)

What if you could do an assessment that simultaneously revealed the content mastery and intellectual development of students across your entire institution, without taking class time or costing your institution money? This blog offers a way to do this.

We know that metacognitive skills are tied directly to successful learning, yet metacognition is rarely taught in content courses, even though it is fairly easy to do. Self-assessment is neither the whole of metacognition nor of self-efficacy, but it is an essential component of both. Direct measures of students’ self-assessment skills are very good proxy measures of metacognitive skill and intellectual development. A school that develops measurable self-assessment skill is likely also developing self-efficacy and metacognition in its students.

This installment comes with lots of artwork, so enjoy the cartoons! We start with Figure 1A, which is only a drawing, not a portrayal of actual data. It depicts an “Ideal” pattern for a university educational experience in which students progress up the academic ranks and grow in content knowledge and skills (abscissa) and in metacognitive ability to self-assess (ordinate). In Figure 1B, we now employ actual paired measures. Postdicted self-assessment ratings are estimated scores that each participant provides immediately after seeing and taking a test in its entirety.

Figure 1.

Figure 1. Academic ranks’ (freshman through professor) mean self-assessed ratings of competence (ordinate) versus actual mean scores of competence from the Science Literacy Concept Inventory or SLCI (abscissa). Figure 1A is merely a drawing that depicts the Ideal pattern. Figure 1B registers actual data from many schools collected nationally. The line slopes less steeply than in Fig. 1A and the correlation is r = .99.

The result reveals that reality differs somewhat from the ideal in Figure 1A. The actual lower division undergraduates’ scores (Fig. 1B) do not order on the line in the expected sequence of increasing ranks. Instead, their scores are mixed among those of junior rank. We see a clear jump up in Figure 1B from this cluster to senior ranks, a small jump to graduate student rank and the expected major jump to the rank of professors. Note that Figure 1B displays means of groups, not ratings and scores of individual participants. We sorted over 5000 participants by academic rank to yield the six paired-measures for the ranks in Figure 1B.

We underscore our appreciation for large databases and the power of aggregating confidence-competence paired data into groups. Grouping attenuates noise in such data, as we described earlier (Nuhfer et al. 2016), and enables us to perceive clearly the relationship between self-assessed competence and demonstrable competence. Figure 2 employs a database of over 5000 participants but depicts them in 104 randomized groups of 50 (drawn from all institutions) from within each academic rank. The figure confirms the general pattern shown in Figure 1: a general upward trend from novices (freshmen and sophomores) through developing experts (juniors, seniors, and graduate students) to experts (professors), but with considerable overlap between novices and developing experts.

Figure 2

Figure 2. Mean postdicted self-assessment ratings (ordinate) versus mean science literacy competency scores by academic rank.  Figure 2 comes from selecting random groups of 50 from within each academic rank and plotting paired-measures of 104 groups.

The correlation of r = .99 seen in Figure 1B comes down a bit, to r = .83, in Figure 2. We can understand why by examining Figure 3 and Table 1. Figure 3 comes from our 2019 database of paired measures, which is now about four times larger than the database used in our earlier papers (Nuhfer et al. 2016, 2017); the earlier results we reported in this same kind of graph continue to be replicated here in Figure 3A. People generally appear good at self-assessment, and the figure refutes claims that most people are either “unskilled and unaware of it” or “…are typically overly optimistic when evaluating the quality of their performance….” (Ehrlinger, Johnson, Banner, Dunning, & Kruger, 2008).

Figure 3

Figure 3. Distributions of self-assessment accuracy for individuals (Fig. 3A) and of collective self-assessment accuracy of groups of 50 (Fig. 3B).

Note that the range of the abscissa has gone from 200 percentage points in Fig. 3A to only 20 percentage points in Fig. 3B. In groups of fifty, 81% of the groups estimate their mean scores within 3 ppts of their actual mean scores. While individuals are generally good at self-assessment, the collective self-assessment means of groups are even more accurate. Thus, the collective averages of classes on detailed course-based knowledge surveys seem to be valid assessments of the mean learning competence achieved by a class.

The larger the groups employed, the more accurately the mean group self-assessment rating is likely to approximate the mean competence test score of the group (Table 1). In Table 1, reading across the three columns from left to right reveals that, as group sizes increase, greater percentages of each group converge on the actual mean competency score of the group.
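The convergence described above is what basic sampling statistics predicts: the spread of a group's mean error shrinks roughly with the square root of group size. The following minimal simulation sketches the effect; the individual error spread of 15 ppts is a hypothetical parameter chosen for illustration, not a value fit to the SLCI data.

```python
import random
import statistics

def fraction_within(group_size, n_groups=1000, indiv_sd=15.0, tol=3.0, seed=42):
    """Fraction of simulated groups whose mean self-assessment error
    lands within +/- tol percentage points of zero.

    indiv_sd is a hypothetical spread (in ppts) of individual
    self-assessment error; it is not derived from the SLCI data.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_groups):
        # Each member's self-assessment error is modeled as Gaussian noise.
        group_errors = [rng.gauss(0.0, indiv_sd) for _ in range(group_size)]
        if abs(statistics.fmean(group_errors)) <= tol:
            hits += 1
    return hits / n_groups

# Group-mean error shrinks as 1/sqrt(group size).
for n in (1, 50, 200):
    print(f"group size {n:>3}: {fraction_within(n):.0%} of groups within ±3 ppts")
```

Under these assumed parameters, single individuals rarely land within ±3 ppts, while nearly all simulated groups of 200 do, consistent with the pattern reported in Table 1.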

Table 1

Table 1. Groups’ self-assessment accuracy by group size. Groups’ postdicted mean self-assessed confidence ratings (in ppts) closely approximate the groups’ actual demonstrated mean competency scores (SLCI). In groups of 200 participants, the mean self-assessment accuracy of every group is within ±3 ppts. To achieve such results, researchers must use aligned instruments that produce reliable data, as described in Nuhfer (2015) and Nuhfer et al. (2016).

From Table 1 and Figure 3, we can now understand how the very high correlations in Figure 1B are achievable by using sufficiently large numbers of participants in each group. Figure 3A and 3B and Table 1 employ the same database.

Finally, we verified that we could achieve high correlations like those in Figure 1B in single institutions, even when we examined only the four undergraduate ranks within each. We also confirmed that the rank orderings and best-fit line slopes formed patterns that differed measurably by institution. Two examples appear in Figure 4. The ordering of the undergraduate ranks and the slope of the best-fit line in graphs such as those in Fig. 4 are surprisingly informative.

Figure 4

Figure 4. Institutional profiles from paired measures of undergraduate ranks. Figure 4A is from a primarily undergraduate, public institution. Figure 4B comes from a public research-intensive university. The correlations remain very high, and the best-fit line slopes and the ordering pattern of undergraduate ranks are distinctly different between the two schools. 

In general, steeply sloping best-fit lines in graphs like Figures 1B, 2, and 4A indicate that significant metacognitive growth is occurring together with the development of content expertise. In contrast, nearly horizontal best-fit lines (these do exist in our research results but are not shown here) indicate that students in such institutions are gaining content knowledge through their college experience but are not gaining metacognitive skill. We can use such information to guide the assessment stage of “closing the loop,” and it helps in taking informed action. In all cases where undergraduate ranks appear ordered out of sequence in such assessments (as in Fig. 1B and Fig. 4B), we should seek to understand why.

In Figure 4A, “School 7” appears to be doing quite well. The steeply sloping line shows clear growth between lower-division and upper-division undergraduates in both content competence and metacognitive ability. The school might want to explore how it could extend the gains of the sophomore and senior classes. “School 3” (Fig. 4B) should probably try to steepen its best-fit line by focusing first on increasing self-assessment skill development across the undergraduate curriculum.

We recently used paired measures of competence and confidence to understand the effects of privilege on varied ethnic, gender, and sexual orientation groups within higher education. That work is scheduled for publication by Numeracy in July 2019. We are next developing a peer-reviewed journal article to use the paired self-assessment measures on groups to understand institutions’ educational impacts on students. This blog entry offers a preview of that ongoing work.

Notes. This blog follows on from earlier posts: Measuring Metacognitive Self-Assessment – Can it Help us Assess Higher-Order Thinking? and Collateral Metacognitive Damage, both by Dr. Ed Nuhfer.

The research reported in this blog distills a poster and oral presentation created by Dr. Edward Nuhfer, CSU Channel Islands & Humboldt State University (retired); Dr. Steven Fleisher, California State University Channel Islands; Rachel Watson, University of Wyoming; Kali Nicholas Moon, University of Wyoming; Dr. Karl Wirth, Macalester College; Dr. Christopher Cogan, Memorial University; Dr. Paul Walter, St. Edward’s University; Dr. Ami Wangeline, Laramie County Community College; Dr. Eric Gaze, Bowdoin College, and Dr. Rick Zechman, Humboldt State University. Nuhfer and Fleisher presented these on February 26, 2019 at the American Association of Behavioral and Social Sciences Annual Meeting in Las Vegas, Nevada. The poster and slides from the oral presentation are linked in this blog entry.


Setting Common Metacognition Expectations for Learning with Your Students

by Patrick Cunningham, Ph.D., Rose-Hulman Institute of Technology

We know that students’ prior subject knowledge impacts their learning in our courses. Many instructors even give prior knowledge assessments at the start of a term and use the results to tailor their instruction. But have you ever considered the impact of students’ prior knowledge and experiences with learning on their approaches to learning in your course? It is important for us to recognize that our students are individuals with different expectations and learning preferences. Encouraging our students’ metacognitive awareness and growth can empower them to target their own learning needs and establish common aims for learning.

[Image of a target with four colored arrows pointed at the center]

Among other things, our students often come to us having experienced academic success using memorization and pattern-matching approaches to material, i.e., rehearsal strategies. Because they have practiced these approaches over time and have gotten good grades in prior courses or academic levels, these strategies are firmly fixed in their learning repertoire and are their go-to strategies. Further, when they get stressed academically, they spend more time employing these strategies – they want more examples, they re-read and highlight notes, they “go over” solutions to old exams, they memorize equations for special cases, and more. And many of us did too, when we were in their shoes.

However, rehearsal strategies result only in shorter-term memory of concepts and surface-level understanding. Building more durable memory of concepts and deeper understanding requires more effortful strategies. Recognizing this and doing something about it is metacognitive activity – knowing how we process information and making intentional choices to regulate our learning and learning approaches. One way to engage students in building such metacognitive self-awareness and to set common expectations for learning in your course starts with a simple question:

“What does it mean to learn something?”

I often ask this at the start of a course. In an earlier post, Helping Students Feel Responsible for Their Learning, I introduced students’ common responses. Learning something, they say, means being able to apply it or explain it. With some further prompting, we get to applying concepts to real situations and explaining material to a range of people, from family members to bosses to cross-functional design teams. These are great operational definitions of learning, and I affirm my students for coming up with them.

Then I go a step further, explaining how transferring to new applications and explaining to a wide range of audiences requires a richly interconnected knowledge framework. For our knowledge to be useful and available, it must be integrated with what we already know.

So, I tell my students, in this class we will be engaging in activities to connect and organize our knowledge. I also try to prepare my students for doing this, acknowledging it will likely be different than what they are used to. In my engineering courses students love to see and work more and more example problems – i.e., rehearsal. Examples are good to a point, particularly as you engage a new topic, but we should be moving beyond just working and referencing examples as we progress in our learning. Engaging in this discussion about learning helps make my intentions clear.

I let my students know that as we engage with the material differently it will feel effortful, even hard at times. For example, I ask my students to come up with and explore variations on an example after we have solved it. A good extension is to have pairs working different variations explain their work to each other. Other times I provide a solution with errors and ask students to find them and take turns explaining their thinking to a neighbor. In this effortful processing, they are building connections. My aim is to grow my students’ metacognitive knowledge by expanding their repertoire of learning strategies and lowering the ‘activation energy’ to using these strategies on their own. It is difficult to try something new when there is so much history behind our habitual approaches.

Another reason I like this opening discussion is that it welcomes opportunities for metacognitive dialogue and ongoing conversations about metacognition. I have been known to stop class for a “meta-moment,” where we take time to become collectively more self-aware, recognizing growth or monitoring our level of understanding. The discussion about what it means to learn something also sets a new foundation and changes conversations about exam, quiz, and homework preparation and performance. You might ask, “How did you know you knew the material?” Instead of suggesting “working harder” or “studying more,” we can talk meaningfully about the context and the choices made, and how effective or ineffective they were.

Such metacognitive self-examination can be challenging for students and even a little uncomfortable, especially if they exhibit more of a fixed mindset toward learning. It may challenge their sense of self, their identity. It is vital to recognize this. Some students may exhibit resistance to the conversation or to the active and constructive pedagogies you employ. Such resistance is challenging, and we must be careful with our responses. Depersonalizing the conversation by focusing on the context and choices can make it feel less threatening. For example, if a student only studied the night or two before an exam, instead of thinking they are lazy or don’t care about learning, we can acknowledge the challenge of managing competing priorities and ask them what they could choose to do differently next time. We need to be careful not to assume too much, e.g., a student is lazy. Questions can help us understand our students better and promote student self-awareness. For more on this approach to addressing student resistance see my post on Addressing Student Resistance to Engaging in their Metacognitive Development.

Students’ prior learning experiences impact how they approach learning in specific courses. Engaging students early in a metacognitive discussion can help develop a common set of expectations for learning in your course, clarifying your intentions. It also can open doors for metacognitive dialogue with our students; one-on-one, in groups, or as a class. It welcomes metacognition as a relevant topic into the course. However, as we engage in these discussions, we must be sensitive to our students, respectfully and gently nudging their metacognitive growth. Remember, this is hard work and it was (and often still is) hard for us too!

Acknowledgements This blog post is based upon metacognition research supported by the National Science Foundation under Grant Nos. 1433757 & 1433645. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.


Metacognitive links connecting the Arts and STEM

by Jessica Santangelo and Ilona Pierce, Hofstra University

We may be an unlikely pair at first glance – an actor and a biologist. We met after Jess gave a talk about the role of metacognition in supporting student learning in biology. Ilona realized during the talk that, though she was unfamiliar with the term metacognition, what she does with theatre students is inherently metacognitive. This has led to rich conversations about metacognition, the role of metacognition in teaching, and the overlap between the arts and STEM (Science, Technology, Engineering, and Mathematics).

Here we offer a condensed version of one of our conversations in which we explored the overlap between metacognition in the arts and STEM (STEAM).

Ilona: In actor training (or voice/speech training, which is my specialty), self-reflection is a core part of an actor’s growth. After a technique is introduced and application begins, we start to identify each student’s obstacles. In voice work, we examine the different ways we tighten our voices and bodies, then explore pathways to address the tension. As tension is released, I’ll typically ask, “What do you notice? How are things different than they were when we began?” This is what hooked me at your lecture…you worked with the students, uncovering their shortcomings (their version of TENSION), and you watched their test scores go up. It was a great thing to see, but I sat there thinking, “Doesn’t every teacher do that?”

Jess: In my experience, most STEM courses do not intentionally or explicitly support students reflecting on themselves, their performance, or their learning strategies. I’m not entirely sure why that is. It may be a function of how we (college-level STEM educators) were “brought up,” that many of us never had formal training in pedagogy, and/or that many STEM educators don’t feel they have time within the course to support students in this way.

When you contacted me after the lecture, I had an “aha!” moment in which I thought “Woah! She does this every day as an inherent part of what she does with her students. It’s not something special, it’s just what everyone does because it’s essential to the students’ learning and to their ability to grow as actors.” Though you hadn’t been aware of the term “metacognition” before the talk, what you are having your students do IS metacognitive.

Ilona: Of course, the students have to be taught to notice, and prodded into verbalizing their observations. In the beginning, when I ask, “What do you notice?” I’m typically met with silence. They don’t know what they notice. I have to guide them: “How has your breathing changed? Are you standing differently? What emotions arose?” As the course goes on, I’ll ask for deeper observations like, “How does your thinking/behavior during class help you/hinder you? What patterns are arising?” It’s not unusual to hear things like, “I realized I talk fast so that people don’t have the chance to interrupt me,” or “If I speak loudly, I’m afraid people will think I’m rude.”

Jess: I think that highlights a difference in the approach that educators within our respective fields take to our interactions with students. Your class is entirely focused on the student, the student’s experience, and having the student reflect on their experience so they can adjust/adapt as necessary.

In contrast, for many years, the design of STEM courses and our interactions with students focused on the conveyance of content and concepts to students. Some STEM classes are becoming more focused on having the students DO something with content/concepts in the classroom (i.e., active learning and flipped classrooms), but that hasn’t always been the case. Nor does having an active learning or flipped classroom mean that the course intentionally or explicitly supports student metacognitive development.

Ilona: Principles and content are an important part of my coursework as well, but most of it is folded into the application of the skills they’re learning. The environment helps to support this kind of teaching. My students are hungry young artists, and the class size is 16–18 max. This allows me to begin by “teaching” to the group at large, and then transition to one-on-one coaching.

When you work with your students, do you work individually or in small groups?

Jess: I am pretty constrained in terms of what I can do in the classroom as I generally have 44-66 students/section (and faculty at other institutions are even more constrained with 100+ students/section!). However, even with my class size, I generally try to minimize whole-group lecture by having students work in small groups in the classroom, prompting them to discuss how they came to a conclusion and to make their learning visible to each other. One-on-one “coaching” generally occurs during office hours.

I’m really drawn to the word “coaching” here. I feel like you literally coach students – that you work with them, meeting them wherever they are in that moment, and help them gain awareness and skills to get to some endpoint. Does that accurately capture how you view yourself and your role? How does that play out in terms of your approach to your classes and to your interactions with students?

Ilona: I think it’s “teacher” first, and then I transition to “coach.” But I also use one-on-one coaching to teach the entire class. For example, one student gets up to share a monologue or a poem. Afterwards, I ask a question, maybe a couple: “What did you notice about your breathing? Your body? Your emotions?” If the student has difficulty answering, I’ll guide them to what I noticed: “Did you notice, for example, that your hands were in fists the whole time?” I might turn to the class and say, “Did you guys notice his hands?” The class typically will notice things the performer doesn’t. I’ll ask the class, “As an audience member, how did his clenched hands make you feel (emotionally, physically)? Did you want him to let them go, or did it help the piece?” So the coaching bounces from the individual to the group, asking for self-reflection from everyone.

Jess: It sounds like we do something similar in that, as I prompt one student in a small group to explain how they arrived at a conclusion, I’m using that as an opportunity to model a thought process for the rest of the group. Modeling the thought process alone isn’t necessarily metacognitive, but I take it a step farther by asking students to articulate how the thought process influenced their ability to come to an accurate conclusion and then asking them to apply a similar process in other contexts. I’m essentially coaching them toward a thought process that is inquisitive, logical, and evidence-based – I’m coaching them to think like a scientist.

When I reflect on my title: professor/teacher/instructor/educator versus coach, I’m struck that the title brings up very different ideas for me about my role in the classroom – it shifts my perspective. When I think of professor/teacher/instructor/educator, I think of someone who is delivering content. When I think of a coach, I think of someone standing on the sidelines, observing an athlete perform, asking the athlete to do various exercises/activities/drills to improve various aspects of their performance. You seem to fit squarely in the “coach” category to me – you are watching the students perform, asking students to reflect on that performance, and then doing exercises to improve performance via the release of tension.

Ilona: I definitely do both. Coaching, to me, implies individualized teaching that is structured to foster independence. Eventually, a coach may just ask questions or offer reminders. It’s the last stop before students leave to handle things on their own. Like parenting, right? We start with “hands on,” and over time we teach our children to become more and more independent, until they don’t need us anymore.

Jess: I wonder how often STEM educators think of themselves as coaches. How does viewing oneself as a coach alter what one does in the classroom? Is there a balance to be struck between “teaching” and “coaching”? How much overlap exists between those approaches?

In thinking about myself, I can wear both hats depending on the circumstance. I can “teach” content and “coach” to help students become aware of their level of content mastery. When I think of myself as a teacher, I feel responsible for getting students to the right answer. When I think of myself as a coach, I feel more responsible for helping them be aware of what they know/don’t know and supporting their use of strategies to help them be successful. Isn’t that the point of an athletic coach? To help an athlete be aware of their bodies and their abilities and then to push an athlete to do and achieve more within their sport? The academic analogy then would be to push a student to be aware of what they know or don’t know and to effectively utilize strategies to increase their knowledge and understanding. The goal is to get students doing this on their own, without guidance from an instructor.   

The other piece to this is how the students respond and use the metacognitive skills we are trying to help them develop. I wonder: Are your students, who are being encouraged to develop strong metacognitive skills in their theatre classes, naturally transferring those skills and using them in other disciplines (like in their bio class!)? If not, and if they were prompted to do so, would they be more likely to do so (and do so successfully) than non-theatre students who haven’t been getting that strong metacognitive practice?

Ilona: One would hope so. My guess is that when they get into non-acting classes, they revert to the student skills they depended on in high school. Although, I often get “metacognitive success stories” after summer break. Students will report that during their lifeguard or food-service gig, they realized their growing skills of self-awareness helped them to do everything from using their voices differently to giving them greater insight into their own behavior. If they can make connections like this during a summer job, perhaps they can apply these skills in their bio class.

 


Encouraging Metacognition in the Advanced Physics Lab

by Melissa Eblen-Zayas, Carleton College

 

Description of activity:

I have incorporated metacognitive support activities in the form of written reflections and class discussions to help students develop better approaches to dealing with challenges that arise in open-ended experimental work in an advanced lab course in physics.

Motivations and context:

The advanced lab course is the third of three required intermediate/advanced courses for the physics major that have a significant lab component. This course typically enrolls 18–24 physics majors, and the labs are significantly less scripted than in the other required lab courses. The laboratory activities consist of three two-week, instructor-designed labs and four weeks in which students carry out an experimental project of their own design.

While some students welcome the move to more open-ended laboratory work, others struggle. Some students are reluctant to take initiative; rather than trying to problem solve on their own, they seek help from course instructors as soon as problems arise. Other students have difficulty developing a strategic approach to troubleshoot the challenges they encounter. To encourage independence in the lab, I have introduced reflection prompts to support student metacognition. Encouraging students to reflect on how they approach challenges and how they will do things differently going forward helps students develop more thoughtful problem-solving approaches in open-ended laboratory work, thereby increasing self-sufficiency and reducing frustration.

Nuts and bolts:

One of the four course goals for the advanced lab course is that students will demonstrate the ability to reflect on the practice of experimental physics. I introduce the importance of reflective practice on the first day of the course and incorporate reflection activities in both the two-week instructor-designed labs and throughout the final project. These reflection activities account for 10% of the course grade, and most are graded using a rubric.

1. First day of class. Prior to the first class, I ask students to respond to the prompt: “In two sentences, describe your definition of a successful experiment.” I then select a number of student statements and share them on the first day of class. Although student definitions of a successful experiment vary widely, many responses fall into one of two categories: a successful experiment is (a) an experiment that gives a result in agreement with what is expected, or (b) an experiment in which the experimenter learns something (maybe not what they intended). We discuss these two definitions of successful experiments, and I encourage students to adjust their expectations and appreciate that learning from things that go wrong is still a “success” in the experimental realm. These conversations allow me to introduce the importance of metacognition and the course goal of helping students become reflective practitioners.

2. Reflections on the instructor-designed labs. At the end of every two-week instructor-designed lab activity, I ask students to reflect on their most recent lab and respond to five questions designed to foster metacognition:

  1. Tell me a bit about how you approached the lab.
  2. When you ran into problems, what was the strategy your group employed for troubleshooting the problems you encountered?
  3. What types of pre-reading or additional research did you do to prepare for this lab?
  4. When you asked for help, who did you seek help from (other members of your group, other groups, your lab assistant, your instructor) and what kinds of questions did you ask?
  5. What is one thing that you will do differently when tackling labs going forward?

Students write individual responses to these prompts, and I provide feedback using the rubric. When I first began using these reflective prompts, I did not grade them. Grading the responses has increased the quality and depth of the reflections.

3. Reflections on the final project. I ask students to reflect on their final project work throughout the course of the project. Here is a sample of the questions used:

  1. What did you learn from the process of identifying and refining your final project proposal? What are you most looking forward to and what do you anticipate the biggest challenge will be as you begin working on your final project?
  2. What aspect of your contributions to the final project demonstrates your strengths and talents and why?
  3. What is one significant problem that your group encountered when working on your project in the past week, and how did you overcome it or redesign your project to work around it?
  4. What are your main project goals for the coming week, and how do you plan to pursue those goals?

The format for the responses has varied over the years. Sometimes lab groups respond to one of these prompts during a short oral report to the whole class. Other times, students write individual responses. Still other times, one of these questions serves as the starting point for an in-class discussion. I have found benefits and drawbacks to each of these approaches, and I continue to experiment with the format.

Outcomes:

Including metacognitive support activities in the advanced lab course, being explicit about why reflection is important in experimental physics, and grading student reflective responses have had a positive impact on the quality of student reflections and on student attitudes towards the course. Students develop a more self-sufficient approach to tackling challenges they encounter in the lab, and frustration is reduced. I reported some of the outcomes in a paper presented at the 2016 Physics Education Research Conference, published in the conference proceedings:

Eblen-Zayas, M. (2016). The impact of metacognitive activities on student attitudes towards experimental physics. In D. L. Jones, L. Ding, & A. Traxler (Eds.), 2016 PERC Proceedings, 104. doi:10.1119/perc.2016.pr.021


Addressing Metacognition Deficits in First Semester Calculus Students: Getting Students to Effectively Self-Evaluate their Understanding

by Derek Martinez, University of New Mexico


Motivations and context: 

The problem I chose to tackle as a UNM Teaching Fellow was to develop methods for teaching first-semester calculus students to effectively self-test and develop metacognitive skills. One of the biggest issues I have seen over the years is students thinking they understand the material, growing overconfident, and then performing poorly on an exam. The practices described below were carried out during the spring 2016 semester in two Math 162 (Calculus I) classes.

Method: Metacognitive strategies were incorporated into the curriculum in two main ways: (1) daily “Test Yourself” exercises and (2) exam skills check self-assessments (essentially practice tests) before each exam. The “Test Yourself” exercises were designed as a daily reminder that students should not confuse the ability to follow a lecture with the ability to solve a problem on their own. The purpose of the self-assessments was to help students identify gaps in their understanding before taking each exam.

The “Test Yourself” exercises (example attached) were e-mailed to students the night before each lecture and were designed to let students assess whether or not they understood that lecture's fundamental concepts. For example, if the lecture was about rates-of-change applications, the exercise would focus on an easy-to-medium example testing whether students had grasped the fundamental concepts of the section before attempting the more challenging homework problems.

Sending the exercises out the night before was effective in getting many students to read ahead in the text and try to solve parts of the exercises (this was especially true of students who were struggling or had math anxiety). If a student had solved an exercise before class, they were instructed to bring in a blank copy and make sure they could duplicate their results without notes. A typical class consisted of about 40 minutes of lecture, after which students worked on these exercises (some in groups, some by themselves).

The self-assessments (example attached) were given about five days before each exam. Participation was voluntary. I reserved a room outside of class, and students took the assessment like an actual exam. I made it clear to the students that the material on these assessments covered the fundamental ideas and basic examples but was at a lower level of difficulty than the actual exams; the reasoning was to help students pinpoint which core skills they still needed to work on. I graded these assessments just like exams so students could get feedback on their work as well as on their use of proper notation. To help gauge their level of metacognition, at the end of each assessment the students were asked to rank their performance on a scale of 1-5 (5 being best performance). In many cases, comparing this ranking with actual exam scores gave students further evidence that they tended to be overconfident in their preparedness and needed to study more. In the beginning, students tended to over-rank their performance, but by the final exam assessment their rankings were more in line with their actual results.
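The calibration check at the end of each assessment — comparing a student's 1-5 self-ranking to the band implied by their actual score — can be sketched in a few lines of Python. This is a hypothetical illustration, not part of the course materials; the function names and the equal 20% score bands are assumptions.

```python
def score_to_band(score, max_score=100):
    """Map a raw score to a 1-5 band (5 = best), in equal 20% steps."""
    return min(5, score * 5 // max_score + 1)

def calibration_gap(self_rank, score, max_score=100):
    """Positive = overconfident, negative = underconfident, 0 = well calibrated."""
    return self_rank - score_to_band(score, max_score)

# A student who ranks themselves 5 but scores 59/100 is flagged as overconfident.
print(calibration_gap(5, 59))  # 2
```

Tracking this gap across the semester would show the convergence the author describes: large positive gaps early on, shrinking toward zero by the final exam assessment.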

Outcomes: Students in my spring 2016 sections had a final exam pass rate more than 11% higher than all other sections (the exams were group-graded without me to avoid any possible bias). These students also had a final exam pass rate about 10% higher than my fall/spring 2015 students, whom I taught before incorporating these activities. The self-assessments seemed to have the biggest measurable impact on student success: students who took them consistently outscored those who did not by 10-20% on the exams. Further, scores on the actual exams were 15-65% higher than on the self-assessments. I believe this was because the assessments guided and motivated students' learning, and also simply scared some students into studying harder.

Lessons learned: “Buy-in” from the beginning is essential. Sharing the data with the students after the first assessment significantly increased the number of students taking the remaining assessments. These were mainly STEM majors so the statistical evidence went a long way with them. It was also crucial to make time throughout the semester to talk about what metacognition is and remind the students why they were doing these exercises.


Weekly Status Reports to Promote Awareness

by David Woods and Beth Dietz, Miami University


Motivation for the activity or process: Teaching an introductory Information Technology (IT) course involves several goals that focus on creating metacognitive awareness and cognitive monitoring (Flavell, 1979; Schraw, 1998). The main goal of the course is to introduce students to several IT topics (e.g., data representations, computer architecture, and assembly language) that are foundational to the IT curriculum. Other goals include analyzing and solving problems using a computer programming language, as well as applying written and oral communication skills to IT. Teaching these skills also helps address misconceptions about what IT professionals actually do. Students are often surprised to learn that IT professionals usually work in teams, either for specific projects or on an ongoing basis. Status reports are a key communication tool for such teams, and good status reports require the individual to reflect on and analyze what they have done and to plan for the future. Considering the course as a project, the status report should prompt the planning and evaluation aspects of metacognitive regulation (Flavell, 1979).

Context: A metacognitive-awareness activity was used in an introductory IT course. The course is a 100-level course and is one of the first courses taken by students considering a major in Computer and Information Technology. Typically, the class size is 20 – 25 students. While the instructor was only in his second year of full-time teaching, he also had over 15 years' experience working as an IT professional.

Description of activity: Weekly status reports are common activities in many IT positions, especially when an individual is part of a larger project team. They are a basic way for an employee to document what they have accomplished and what they are currently working on. This is valuable in the IT field since work such as writing software or configuring a server does not produce physical objects that provide visual evidence of progress.

The requirements for the status report were simple and made use of several metacognitive processes (Fogarty, 1994). Students were asked to discuss three specific items:

  • Current week activity: List the main course related activities since the last status report and provide a brief discussion of each along with the amount of time spent on the activity. This prompts the student to evaluate their learning from the past week.
  • Upcoming activity: List major course related activities planned for the next week with a brief discussion of the activity and what will be completed during the week. This prompts the student to plan the learning for the next week.
  • Issues and Overdue items: List any problems with the course materials or assignments. If there are no issues, this should be clearly stated. This prompts the student to monitor their understanding of the issues or problems.
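The three-part structure lends itself to a simple template. A minimal sketch as a Python data structure (the class and field names are illustrative inventions, not from the course materials):

```python
from dataclasses import dataclass, field

@dataclass
class StatusReport:
    """One weekly status report with the three required sections."""
    current_week: list   # completed activities, with time spent (evaluate)
    upcoming: list       # activities planned for next week (plan)
    issues: list = field(default_factory=list)  # problems / overdue items (monitor)

    def is_complete(self):
        # All three items must be addressed; "no issues" must be stated
        # explicitly, so an empty issues list counts as incomplete.
        return all([self.current_week, self.upcoming, self.issues])

report = StatusReport(
    current_week=["Read data-representation notes (2 hrs)", "Finished Lab 3 (3 hrs)"],
    upcoming=["Start assembly-language exercises"],
    issues=["No issues this week"],
)
print(report.is_complete())  # True
```

The explicit "no issues" requirement mirrors the assignment description: students must actively monitor and report, not merely omit the section.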

During the semester, students completed 13 status reports. The status reports made up 5% of the final grade and students were allowed to skip three reports (or alternatively earn extra points by doing all of the assigned status reports).

Outcomes and Lessons Learned: The assignment met the immediate goal of prompting metacognitive reflection by asking students to evaluate their prior learning, plan for future learning, and monitor the learning process (Fogarty, 1994). In addition, the status reports gave the instructor good feedback on the amount of work that students did outside of the scheduled class meetings. An additional benefit was the opportunity to provide feedback to students who submitted status reports with limited content and limited evidence of planning and evaluation.

Many status reports showed clear evidence of evaluation and planning as students reported challenges with specific concepts or assignments and then planned activities in response. Some students failed to mention class meetings or submitted assignments in the current week activity. When this was mentioned in grading feedback, later status reports from these students showed improved tracking of completed work.

As the semester progressed and a few students missed assignments, there was an opportunity to ensure that these were noted and discussed in the overdue items section. In several instances, instructor comments led to students evaluating root causes including poor time management and mandatory overtime at work. Not all of the root causes had obvious solutions, but discussing the root causes offered a chance to plan ways to address the issue and was more productive than simply reminding students about late assignments.

The simple structure for the status reports should work well for courses at all levels. In courses where students have more than a week to complete assignments, status reporting could require students to break assignments down into smaller tasks, which is a useful skill to develop.

References:

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive-developmental inquiry. American Psychologist, 34 (10), 906-911.

Fogarty, R. (1994). How to teach for metacognition. Palatine, IL: IRI/Skylight Publishing.

Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science, 26(1-2),113-125.


Practice with a Reasoning Process to Make Learning Visible and Improve Academic Performance

by Jessica Santangelo, Hofstra University

 


Description of Activity

Motivations and context: I teach a fast-paced, content-heavy introductory biology course. Many students struggle in the course – not because they are not capable, but because they lack a repertoire of learning strategies that best support learning within the structure of the course. Rather than discuss “study strategies” as an add-on to course content, this activity has students model behaviors that make their learning visible, reduce their reliance on memorization, and empower them with a process to improve academic performance.

My basic goal with this activity was to give students a very specific process to counter a tendency I saw: when faced with a challenging question or concept, students would simply guess or give up. Namely, students remember one key fact about a complex system from which all other pertinent facts can be derived. In this specific example, they organize those facts in a table and (critically) use the table when faced with questions regarding the system. The process of reasoning from a key fact to a deeper or more applied understanding is not metacognitive in and of itself. In this case, metacognitive development is promoted by structuring the in-class work to allow multiple opportunities for practice with the reasoning process.

Nuts and bolts of an example application

In the course we cover the urinary system. The most challenging aspect of this topic is the function of antidiuretic hormone (ADH). It involves understanding the effects of a diuretic (so students can then understand the effects of an antidiuretic) and osmosis – the movement of water across a semipermeable membrane. It further involves blood pressure, blood osmolarity, stimuli that either cause or inhibit release of ADH from the hypothalamus, and impacts of ADH (or lack thereof) on the kidney. Needless to say, there are a lot of moving parts.

I structure two class periods around one concept: Diuretics promote urine production. I tell students that this is the one thing they should memorize. Everything else follows from that one statement. So, rather than memorizing the entire table below, they memorize one statement, then reason their way through all the other information. Making students aware of this general strategy can greatly reduce the amount of time spent memorizing while increasing the amount of time spent making connections between interrelated facts or processes. Indeed, it’s worth asking students to self-identify one key starting point for any concept such that, if they remember that one key point, they can reason through the rest of the information.

The one concept to remember: Diuretics promote urine production.

|                  | Diuretic                                | Antidiuretic                             |
|------------------|-----------------------------------------|------------------------------------------|
| urine production | Increases / promotes                    | Decreases / inhibits                     |
| water loss       | Increases / promotes                    | Decreases / inhibits                     |
| water retention  | Decreases                               | Increases                                |
| blood osmolarity | Increases (saltier as water is removed) | Decreases (less salty as water is added) |
| blood pressure   | Decreases (as water is removed)         | Increases (as water is added)            |
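The derivation students practice — memorize one fact, then invert it repeatedly — can even be written out mechanically. This is a toy sketch for illustration only (the function and dictionary names are invented, not part of the course):

```python
OPPOSITE = {"increases": "decreases", "decreases": "increases"}

def build_table():
    # The one memorized fact: diuretics promote urine production.
    diuretic = {"urine production": "increases"}
    # More urine means more water lost, hence less water retained.
    diuretic["water loss"] = diuretic["urine production"]
    diuretic["water retention"] = OPPOSITE[diuretic["water loss"]]
    # Removing water makes the blood saltier (higher osmolarity)
    # and lowers blood volume, hence blood pressure.
    diuretic["blood osmolarity"] = OPPOSITE[diuretic["water retention"]]
    diuretic["blood pressure"] = OPPOSITE[diuretic["blood osmolarity"]]
    # Every antidiuretic entry is simply the opposite of the diuretic entry.
    antidiuretic = {row: OPPOSITE[effect] for row, effect in diuretic.items()}
    return diuretic, antidiuretic
```

The point of the sketch is that only one cell is "memorized"; every other cell is produced by a reasoning step, which is exactly the behavior the whiteboard exercise rewards.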

I introduce the one concept, then have students work in groups to fill in the table on large wall-mounted whiteboards. Throughout their group work I ask questions to promote their metacognitive development, such as “What do you already know?” and “How did you come to that conclusion?” This is a key step in the metacognitive process: asking them to make their reasoning visible to themselves and their group-mates. Though students may get stuck, being metacognitive (i.e., asking “what do I know, how do I know it, and how does that help me?”) helps them reason their way through more effectively. At the end of the class session I remind students to test themselves on their ability to start with the one key concept and subsequently explain the table before coming to the next class session.

The next class session, students put all of their notes and other resources away, and recreate the table on the wall-mounted whiteboards using only their brains. Invariably, most groups jump right into filling out the table. But one or two groups will take the time to write “Diuretics promote urine production” on their board before filling in the table. The groups who write this tend to complete the table more quickly and more accurately. I use this as a teachable moment for all the groups by reminding them that they have a simple tool – the one phrase to remember – to guide them in completing the table.

The groups then use their tables as a guide to answer a series of challenging questions about the stimuli for ADH release/inhibition and the associated outcomes. Most groups get bogged down in the questions – they discuss possible answers with their neighbors but go round and round and get confused. I let this happen for a question or two and then I remind students to use the table they put on the board. I ask one student from each group to stand up and model (with their group’s help) how to use the table to answer the next question. At this point, there are lots of “oh”s and “aha”s as students realize it is much easier to arrive at the correct answer using the table.

I then tell students: “You just used a tool (the table) to help you answer this question. What tools do you have available to you when you face a question like this on the exam?” Most of them look around in bewilderment as I don’t allow them to use any outside resources on exams. I then ask “What about the table?” and they say “But we aren’t allowed to bring anything with us to the exam”. And I say “But where did that table come from today?” and they respond “our brains” and I reply, “Exactly. You remembered ONE sentence and then you filled out that whole table with just your brain. So why not jot that table down on your exam?” And their eyes light up…

This is another key step in the metacognitive process: making it obvious to students how they can use this approach on their own to support learning and achievement. The behaviors they modeled in class (remembering one key concept from which to derive all other relevant information, organizing information into an easy-to-reference format, and utilizing that organized information to answer applied questions) should not be used solely in class or when I ask them to do it. They can use those behaviors on their own to promote learning outside of class or on an exam. I have found that unless I make this explicit to students, they rarely use an approach from the classroom on their own.

Outcomes

I’ve been incorporating a variety of activities and practices to promote student metacognitive development into the course for a few years with success. As a result, many students who would not have passed (or would have barely passed) the course have altered their learning strategies and improved their grades – some to A’s and B’s. As I’ve incorporated this specific example with the urinary system I’ve noticed that students are more willing to attempt the challenging ADH questions and are more likely to reason out the answer than to simply guess.

Lessons learned and future directions

Modeling behaviors in a group context works well for these students. Most of them were not challenged in high school the way they are challenged in this course. Embedding tips and tricks that enhance their ability to make their thought process visible (i.e., that promote metacognition) within the very context of the course (1) makes the tips/tricks an inherent part of learning biology rather than “add-ons” and (2) increases the likelihood that students will use these metacognitive tips/tricks (self-regulation). My goal is to have students model these behaviors with more topics in the course, constantly reinforcing the thought/reasoning process so it is ingrained by the end of the semester.


The impact of metacognitive activities on student attitudes towards experimental physics

This article by Melissa Eblen-Zayas, Ph.D., shares the implementation of metacognitive activities in an advanced physics lab. She reports that “the introduction of metacognitive activities in an advanced lab where the laboratory work is not carefully scripted may improve students’ enthusiasm for experimental work and confidence in their ability to be successful in such work.” Check out this article to see the metacognitive prompts they used as well as learn about other metacognition-related activities.

Eblen-Zayas, M. (2016). The impact of metacognitive activities on student attitudes towards experimental physics. In D. L. Jones, L. Ding, & A. Traxler (Eds.), 2016 PERC Proceedings. doi:10.1119/perc.2016.pr.021


A Whole New Engineer: A Whole New Challenge

by Roman Taraban, Ph.D.,  Texas Tech University

Cognitive psychologists Kahneman and Tversky (1973) wanted to present their study participants with a stereotypical description of an engineer:

Jack is a 45-year old man. He is married and has four children. He is generally conservative, careful, and ambitious. He shows no interest in political and social issues and spends most of his free time on his many hobbies, which include home carpentry, sailing, and mathematical puzzles. (p. 241)

When asked if they thought Jack was an engineer, 90% of the participants thought he was.

Whatever stereotypes of engineers may persist to the present day (e.g., geek, introvert, asocial: http://www.thecreativeengineer.com/2008/12/16/a-few-engineering-myths/), various parts of the engineering community are trying to create “a whole new engineer” (Goldberg & Somerville, 2014). Cross-disciplinary centers have been established at universities, like iFoundry, launched in 2008 at the University of Illinois, to prepare engineering students for working in the 21st century. One mandate was to promote “deep reflection and attention to the complex system in which engineering education is embedded” (https://ifoundry.illinois.edu/who-we-are/what-ifoundry).

On a larger scale, the Franklin W. Olin College of Engineering admitted its first class in 2002 in order to implement a full-scale, hands-on, project- and design-based curriculum. Olin College provides students with funding for “passionate pursuits,” which are personal projects of academic value proposed by students (https://en.wikipedia.org/wiki/Franklin_W._Olin_College_of_Engineering). STEM is being transformed into STEAM, where the added A represents Artful Thinking in the context of Science, Technology, Engineering, and Mathematics (Radziwill et al., 2015). To develop artful thinking, a facilitator might present a painting and ask students: What do you see? What does it make you think? What is happening? Why do you think so? These questions help learners develop dispositions to observe, describe, question, reason, and reflect. The whole new engineer is becoming a whole lot of things, but is the new engineer becoming more metacognitive?

We know that engineering students can be metacognitive when solving textbook problems (Taraban, 2015). Indeed, by now there is an extensive corpus of research on students’ textbook problem-solving in introductory physics and other areas of STEM. Explaining the material to oneself with the knowledge that this will help one better understand it, or testing oneself with the knowledge that this will help one more reliably retrieve the information later, are examples of metacognitive processes and knowledge. Case and Marshall (2004) described a developmental pathway by which students transition towards deeper understanding of domain concepts and principles, which they labeled the conceptual deep approach to learning: “relating of learning tasks to their underlying concepts or theory” with the intention “to gain understanding while doing this” (p. 609). Basically, their suggestion is that over the course of development students recognize that a goal of learning is to understand the material more deeply, and that this recognition guides how they learn. Draeger (2015), and others, have suggested that this kind of monitoring of the effectiveness of learning strategies and regulating of one’s behavior are characteristic of metacognitive thinking.

The current re-design of the traditional engineer involves sweeping changes, in the classroom, in the university, and in professional practice, and it aims to do this, in part, by infusing more reflection into engineering training and practice. So, what is a reflective practitioner, and are reflective practitioners metacognitive thinkers?

Schön (1987) suggested that reflective practitioners think carefully about what they are doing as they are doing it. Reflective practitioners assess and revise their existing practices and strive to develop more effective behaviors. They critically assess their behavior as a means to improving it. As Schön (1987) puts it, reflective practice is a “dialogue of thinking and doing through which I become more skillful” (p. 31). Schön maintained “that there is a core of artistry, an exercise of intelligence, and a kind of knowing inherent in professional practice, which we can only learn about by carefully studying the performance of extremely competent professionals” (Osterman, 1990, p. 133).

Through reflective practice we submit our behaviors to critical analysis, asking questions like these: What am I doing? What effect is it having? (Osterman, 1990). This very much reminds one of the distinction that Draeger (2015) made between metacognition and critical thinking. Specifically, one can be a critical thinker without being metacognitive. The two processes can overlap but are not identical. Simply, to be metacognitive, one would need to think about the reflective processing itself. Metacognitions would involve knowledge of the benefits of reflective practice, how it relates to self, and metacognitive processes related to monitoring and controlling the reflective practices. Imagine observing any expert – an expert teacher, an expert golfer, an expert acrobat – and striving to mimic that expertise through carefully observing and critiquing one’s own performance. That’s reflective practice. It’s about trying to get a job done in the best possible way. In a complementary fashion, metacognitive knowledge and processing involve intentionally and consciously monitoring and regulating those reflective practices.

In A Whole New Engineer (Goldberg & Somerville, 2014) the authors assert that

Here we are calling attention to the importance of the Whole New Engineer’s ability to do three things:

  • Notice and be aware of thoughts, feelings, and sensations.
  • Reflect and learn from experience.
  • Seek deeper peace, meaning, and purpose from noticing and reflection. (p. 114)

Goldberg and Somerville (2014) make a call to be more attentive and sensitive to surroundings, to notice and reflect, but not necessarily to be metacognitive in those contexts – they are not clear about the latter point. Thus, it may be safe to say that being metacognitive doesn’t automatically come through reflective practice, critical thinking, mindfulness, or artful thinking strategies. Metacognition represents a distinct type of knowledge and process that can potentially enhance the effects of the aforementioned. The whole new engineer can be a whole lot of things, but is not automatically a metacognitive engineer. Simply, an engineering student, or even a practicing engineer, can be good at certain design projects, for instance, and develop a critical eye for that work, but without necessarily developing metacognitive awareness around when to shift strategies or techniques in order to be more effective.

References

Draeger, J. (2015). Two forms of ‘thinking about thinking’: metacognition and critical thinking. Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking/ .

Kahneman, D., & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80(4), 237-251. http://dx.doi.org/10.1037/h0034747

Osterman, K. F. (1990). Reflective practice: A new agenda for education. Education and Urban Society, 22(2), 133-152.

Radziwill, N. M., Benton, M. C., & Moellers, C. (2015). From STEM to STEAM: Reframing what it means to learn. The STEAM Journal, 2(1), Article 3.

Schön, D. (1987). Educating the reflective practitioner. San Francisco: Jossey-Bass.

Taraban, R. (2015). Metacognition in STEM courses: A developmental path. Retrieved from https://www.improvewithmetacognition.com/metacognition-in-stem-courses-a-developmental-path/


Metacognition in STEM courses: A Developmental Path

by Roman Taraban, PHD, Texas Tech University

There is a strong focus in science, technology, engineering, and math (STEM) courses on solving problems (Case & Marshall, 2004). Does problem solving in STEM involve metacognition? I argue that the answer must surely be ‘yes’, because metacognition involves monitoring the effectiveness of learning and problem-solving strategies and using metacognitive knowledge to regulate behavior (Draeger, 2015). But when does metacognition become part of problem solving, and how does it come about? Can we discern development in metacognitive monitoring and regulation? In this post, I will present some qualitative data from a study on problem-solving in order to reflect on these questions. The study I draw from was not about metacognition per se; however, it may provide some insights into the development of metacognition.

The study I conducted involved freshman engineering majors, who were asked to solve typical problems from the mechanics course in which they were currently enrolled (Taraban, 2015). Not surprisingly, students varied in how they began each problem and how they proceeded toward a solution. To gain some insight into their problem-solving strategies, after students had solved the problems I asked them simply to state why they started with the equation they chose and not some other equation.

Students’ responses fell into at least three types, using labels from Case and Marshall (2004): surface, algorithmic, and deep conceptual. When asked why they started with their first equation, some students responded:

  • “I don’t know, it’s just my instinct.”
  • “No special reason. I’m just taking it randomly.”
  • “It’s just habit.”
  • “The first thing that came to my mind.”

Of interest here, these students did not appear to reflect on the specific problem or show evidence of modulating their behavior to fit it. Their responses fit a surface learning approach: “no relationships sought out or established, learn by repetition and memorization of formulae” (Case & Marshall, 2004, p. 609).

Other students’ responses reflected an algorithmic approach to learning — “identifying and memorizing calculation methods for solving problems” (Case & Marshall, 2004, p. 609):

  • “I am getting three variables in three unknowns so I can solve it.”

Here the student verbally expresses a more structured approach to the problem: he believes that he needs three equations in three unknowns and uses that as a goal. Students who take an algorithmic approach appear to be more reflective and strategic about their solutions than surface problem solvers.

Case and Marshall (2004) regarded both the surface and algorithmic pathways as part of a development toward deeper understanding of domain concepts and principles, which they labeled the deep conceptual approach to learning: “relating of learning tasks to their underlying concepts or theory” with the intention “to gain understanding while doing this” (p. 609). Basically, their suggestion is that at some point students recognize that a goal of learning is to understand the material more deeply, and that this recognition guides how they learn. Case and Marshall’s description of deep conceptual learning fits Draeger’s (2015) earlier suggestion that monitoring the effectiveness of learning and regulating one’s behavior is characteristic of metacognitive thinking. Once students reach this level, we should be able to observe more readily their intentions to understand the material, and their overt attempts to grasp it through explicit reflection and reasoning. Examples of this type of reflection from my study could be gleaned from those students who did not jump directly to writing equations without first thinking about the problem:

  • “If I choose the moment equation first, then directly I am getting the value of F. So in the other equations I can directly put the value of F.”

As students progress from surface to algorithmic to deep conceptual processing, there is certainly development. However, in the present examples that track that development, it is difficult to partial out students’ thinking about the problem content from their thinking-about-thinking, that is, their metacognitions. Draeger (2015) helps here by distinguishing between metacognition and critical thinking; the latter often requires domain-specific knowledge. Draeger suggests that “many students are able to solve complex problems, craft meaningful prose, and create beautiful works of art without understanding precisely how they did it” (p. 2). Basically, critical thinking is about methodology within a domain – e.g., the person knows how to format a narrative or select an appropriate statistical procedure – without necessarily reflecting on the effectiveness of those choices, that is, without metacognition. In the examples I provided above from my work with undergraduates on problem solving, there is invariably a mix of critical thinking and metacognition. Draeger’s distinction signals a need to decouple these two distinct kinds of cognitive processes in order to better clarify the developmental trajectory of metacognitive processing in problem solving.

Finally, why do we observe such wide variance in students’ approaches to problem solving, and, relatedly, to metacognition? One reason is that instructors may emphasize assessment and grades (Case & Marshall, 2004). As a consequence, students may focus more on gaining points for the correct answer than on the process. Welsh (2015) has suggested that course structure can act as a barrier to deeper learning: “high stakes assessments may overshadow resources designed for metacognitive development” (p. 2). Welsh found that students were more concerned with test performance than with reflecting upon their study strategies and implementing learning strategies recommended by the instructor.

How are we to understand this discord between concern with test performance and metacognition? At some level, when students set goals to do well on tests they are regulating their behavior. Metacognitive resources from the instructor may be in competition with students’ perceived resources (e.g., access to old tests, study buddies, cramming the night before). The instructor can facilitate change, but the leap from surface and algorithmic learner to deep conceptual learner must be undertaken by the student.

Passion and commitment to a topic are strong motivators to find the means to access and acquire deeper conceptual understanding. One measure of teacher success is class test performance, but another can be found in student comments. Here is one that I recently received and found encouraging: “Despite the fact that I was a bit uninterested in the subject matter, this was one of my favorite classes. By the end of the semester, not only was I interested in the subject matter, I was fascinated by it.” Perhaps as instructors we need not only to facilitate good metacognitive practices but also to nurture interest in what we teach, so that students are motivated to pursue it more deeply.

References

Case, J., & Marshall, D. (2004). Between deep and surface: procedural approaches to learning in engineering education contexts. Studies in Higher Education, 29(5), 605-615.

Draeger, J. (2015). Two forms of ‘thinking about thinking’: metacognition and critical thinking. Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking/

Taraban, R. (2015, November). Transition from means-ends to working-forward problem solving. Paper presented at the 56th Annual Conference of the Psychonomic Society, Chicago, IL.

Welsh, A. (2015). Supports and barriers to students’ metacognitive development in a large intro chemistry course. Retrieved from https://www.improvewithmetacognition.com/supports-and-barriers-to-students-metacognitive-development-in-a-large-intro-chemistry-course/