Metacognition and the Development of Self-Identity

by Roman Taraban, Ph.D., Texas Tech University

The question “What do you want to be when you grow up?” should be familiar to all of us, as well as the typical responses: a firefighter, a pilot, a doctor, a nurse, a teacher, an astronaut. We playfully pose this question to children, not fully realizing we are inquiring about their ultimate self-identity – the deep and personal awareness of who they are. Children may not have a self-identity beyond “child,” “son,” “daughter,” “student,” “soccer goalie,” “Girl Scout.” But over time, that will change.

[Image: outline of a woman with words related to self-identity. Source: https://www.nextcallings.com/solutions/2017/8/24/my-self-is-changing-myselfhow-making-life-or-business-transitions-can-produce-new-parts-of-the-self]

So when does self-identity emerge, and how does metacognition help it along its developmental path? In this post, I propose that the emergence of self-identity is a lifelong process that begins in early childhood and has strong underpinnings in memory research. Flavell (1987) brings in the metacognitive factor, in part, through his discussion of metacognitive experiences. We all have a self-identity; however, we know little about how to monitor and regulate it metacognitively in order to develop and maintain a healthy and adaptive sense of self.

Who Am I? Where Is My Life Going?

Self-identity emerges out of a specific kind of memory, known as episodic memory. Episodic memory enables a person to recall personally experienced events and to re-live those experiences in the here-and-now (Tulving, 2002). Fivush (2011) refers to the organized coherent sense of self that emerges from episodic experiences as autobiographical memory. Autobiographical memory allows a person to construct an evolving life story that creates a coherent sense of self-identity, of who we are. Thinking about these memory processes would seem to be a perfect place for metacognition to play a major role.

Autobiographical memory and, with it, narrative identity develop starting in early childhood. A child’s identity is influenced, in part, by opportunities to relate personal events through conversations with caregivers and friends. Mothers who are elaborative with their children before the preschool years have children who produce more coherent self-narratives by the end of those years (Fivush, 2011). One way to be elaborative, for example, is to ask open-ended questions that supply some guiding information – e.g., “What did we do at the park today?” Parents, teachers, and friends continue to shape identity long into adulthood through the questions they ask and the personal experiences they share. These interactions prompt reflection on one’s own experiences and resonate with the questions Who am I? Where is my life going?

Metacognitive Experiences

John H. Flavell, an American developmental psychologist, labeled higher-level cognition as metacognition and is regarded as a founding scholar in metacognitive research. A major component in Flavell’s theory is a metacognitive experience, which is “any kind of [a]ffective or cognitive conscious experience that is pertinent to the conduct of intellectual life” (Flavell, 1987, p. 24). Flavell suggests that there is a developmental element in individuals’ adaptive responsiveness to these experiences: “As one grows older one learns how to interpret and respond appropriately to these experiences” (p. 24). When do we have metacognitive experiences? According to Flavell, “when the cognitive situation is something between completely novel and completely familiar…where it is important to make correct inferences, judgments, and decisions” (p. 24).

The question of how and when self-identity evolves in college students was explored in an edited book on undergraduate research experiences (Taraban & Blanton, 2008). The student accounts in that volume have the character of metacognitive experiences – i.e., conscious experiences in which inferences, judgments, and decisions are critical. It is metacognitive experiences like these that help us theoretically bridge the development of self-identity from the nurturing discourses of mothers with young children, to the choice of fields of study in high school and college, and ultimately to a relatively stable identity as an adult professional:

Wyatt McMahon: Thus, as I grew up, when people asked me what I wanted to be, I realized that I wanted to help improve society, but I was not sure how.

Robin Henne: Before the tour [of Texas Tech Biology], I had no idea that research was even possible for biology majors; following the tour, I was convinced that research was what I wanted to do for my career.

Susan Harrell Yee: When I first started as a freshman at Texas Tech University, I chose environmental engineering as my major. It seemed a wise decision – I liked math and I liked ecology, and environmental engineering seemed to be a logical combination of the two. But after a single day, I knew the engineering route was not for me.

Engineering Identity

An area of great interest in current scholarly research involves engineering identity. Engineering educators are interested in how engineering students view themselves early on in their training (Loshbaugh & Claar, 2007), as well as what it means more generally to think of oneself as an engineer (Godwin, 2016; Morelock, 2017). The poignancy of this issue struck me when I was leading a discussion with graduate engineering students. The topic of discussion was, in part, personal narrative – the autobiographical narrative we create about ourselves and which is the basis of self-identity. It was evident from their comments that embracing a self-identity was not instantaneous upon choosing professional training. The following comments convey a sense of the struggle:

For the majority of my life, I have always been a “student” studying to become [insert profession].

I sometimes to this day don’t consider myself as an engineer. I feel like throughout my time [here], I’ve always just been an “engineering student”.

I have struggled to see myself as an engineer but the older I get and the more secure I become in my field the easier it is to own and step into that narrative.

The Role of Metacognition

We are surrounded by instances of introspection regarding self-identity. Neil Diamond, the 20th-century pop singer, presented his reflections as an existential crisis in “I Am… I Said.” Walt Whitman, the 19th-century poet, gave a transcendental response in 52 parts in “Song of Myself,” and Reverend William Holmes Borders, Sr., a civil-rights activist, in the 1950s proclaimed “I Am Somebody” in a poem of self-identity. Although we all have a sense of self-identity, very little explicit attention has been given in research to ways of metacognitively monitoring and guiding the development of a healthy and adaptive sense of self. This is one area where extending metacognitive theory beyond its current bounds could have a significant role in helping us to know who we are and to reach our true potential.

References

Fivush, R. (2011). The development of autobiographical memory. Annual Review of Psychology, 62, 559-582.

Flavell, J. H. (1987). Speculations about the nature and development of metacognition. In F. E. Weinert, & R. Kluwe (Eds.), Metacognition, motivation, and understanding (pp. 21-29). Hillsdale, NJ: Lawrence Erlbaum Associates.

Godwin, A. (2016). The development of a measure of engineering identity. In Proceedings, ASEE Annual Conference & Exposition, New Orleans, LA.

Loshbaugh, H., & Claar, B. (2007). Geeks are chic: Cultural identity and engineering students’ pathways to the profession. In Proceedings, ASEE Annual Conference & Exposition, Honolulu, HI.

Morelock, J. R. (2017). A systematic literature review of engineering identity: Definitions, factors, and interventions affecting development, and means of measurement. European Journal of Engineering Education, 42(6), 1240-1262.

Taraban, R., & Blanton, R. L. (Eds.). (2008). Creating effective undergraduate research programs in science: The transformation from student to scientist. New York: Teachers College Press.

Tulving, E. (2002). Episodic memory: From mind to brain. Annual Review of Psychology, 53, 1-25.


The Metacognitive Reading Strategies Questionnaire (MRSQ): Cross-Cultural Comparisons

by Roman Taraban, Ph.D., Texas Tech University

When you read, do you ask yourself whether the material is contributing to your knowledge of the subject, whether you should revise your prior knowledge, or how you might use the new knowledge you are acquiring? Do you highlight information or make notes in the margins to better remember and find information later on? Prior research by Pressley and colleagues (e.g., Pressley & Afflerbach, 1995) suggested that the kinds of metacognition prompted by reading strategies like these are critical for effective reading comprehension.

[Photo: a stack of books with a pair of reading glasses on top]

Inspired by that research, Taraban et al. (2000) conducted a study involving 340 undergraduates and 35 reading strategies like those suggested by Pressley and colleagues and found that self-reports of strategy use were significantly associated with grade-point averages (GPA). Specifically, students who reported higher use of reading strategies also had higher GPAs.  Additionally, responses to open-ended questions showed that students who could name more reading strategies and reading goals also had significantly higher GPAs. 

The data in Taraban et al. (2000) overwhelmingly suggested a strong positive relationship between students’ knowledge and use of reading goals and strategies and their academic performance. More generally, data by Taraban et al. and others suggest that effective reading depends on metacognitive processing – i.e., on directed cognitive effort to guide and regulate comprehension. Skilled readers know multiple strategies and when to apply them. In the remainder of this post, I review subsequent developments associated with metacognitive reading strategies, including cross-cultural comparisons, and raise a question about the relevance of these strategies to present-day text processing and comprehension, given widespread technological developments.

Analytic vs. Pragmatic Reading Strategies

In 2004, my students and I created a questionnaire, dubbed the Metacognitive Reading Strategies Questionnaire (MRSQ) (Taraban et al., 2004). The questionnaire drew on the strategies tested earlier in Taraban et al. (2000) and organized the strategies into two subscales through factor analytic methods: analytic strategies and pragmatic strategies.  The analytic scale relates to cognitive strategies like making inferences and evaluating the text (e.g., After I read the text, I consider other possible interpretations to determine whether I understood the text.). The pragmatic scale relates to practical methods for finding and remembering information from the text (e.g., I try to underline when reading in order to remember the information.). Students respond to these statements using a five-point Likert-type scale: Never Use, Rarely Use, Sometimes Use, Often Use, Always Use.
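A minimal sketch can show how responses on a two-subscale, five-point Likert questionnaire of this kind might be scored. The item keys and the item-to-subscale assignment below are illustrative stand-ins, not the actual MRSQ items, and the questionnaire itself contains more items per subscale.

```python
# Map the five Likert labels to numeric codes 1-5.
LIKERT = {"Never Use": 1, "Rarely Use": 2, "Sometimes Use": 3,
          "Often Use": 4, "Always Use": 5}

# Hypothetical item keys, each assigned to one of the two subscales.
SUBSCALE = {
    "consider_other_interpretations": "analytic",
    "draw_inferences_while_reading":  "analytic",
    "underline_to_remember":          "pragmatic",
    "take_notes_to_remember":         "pragmatic",
}

def subscale_means(responses):
    """Average the 1-5 Likert codes separately for each subscale."""
    totals, counts = {}, {}
    for item, label in responses.items():
        scale = SUBSCALE[item]
        totals[scale] = totals.get(scale, 0) + LIKERT[label]
        counts[scale] = counts.get(scale, 0) + 1
    return {scale: totals[scale] / counts[scale] for scale in totals}

one_student = {
    "consider_other_interpretations": "Often Use",
    "draw_inferences_while_reading":  "Always Use",
    "underline_to_remember":          "Sometimes Use",
    "take_notes_to_remember":         "Rarely Use",
}
print(subscale_means(one_student))  # {'analytic': 4.5, 'pragmatic': 2.5}
```

A student scored this way would show a clear analytic-over-pragmatic profile, which is the kind of contrast the subscales are designed to surface.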

Initial applications of the MRSQ suggested that the two-factor model could aid in better understanding students’ use of metacognitive comprehension strategies. Specifically, students’ self-reports of expected GPA for the coming academic year correlated significantly and positively with analytic strategy use but not with pragmatic strategy use, suggesting that students who reported higher use of analytic strategies also anticipated doing well academically in the coming year.

Cross-Cultural Explorations of Reading Strategies

Vianty (2007) used the MRSQ to explore differences in students’ use of metacognitive reading strategies in their native language, Bahasa Indonesia, and their second language, English. Participants were students in a teacher education program who completed the MRSQ in both English and Bahasa Indonesia. Vianty found that students processed language differently in their native language than in a non-native language.

In comparing mean use of analytic strategies when reading in the native language versus English, Vianty found that nearly all means were higher for Bahasa Indonesia. T-tests showed significant differences favoring Bahasa Indonesia for eight out of sixteen analytic strategies. Conversely, four of the six pragmatic strategies were favored when reading English; however, only one difference (I take notes when reading in order to remember the information) was significant on a t-test. Vianty concluded that students used analytic strategies significantly more in Bahasa Indonesia than in English. Use of pragmatic strategies, conversely, was higher when reading in English, but the effect was weak.
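Since the same students rated each strategy in both languages, comparisons like these are paired-samples t-tests. The sketch below computes the paired t statistic from fabricated ratings (they are not Vianty’s data) using only the standard library.

```python
# Paired-samples t-test sketch: the same students rate one strategy's use
# when reading in each of two languages. All ratings below are fabricated.
from math import sqrt
from statistics import mean, stdev

bahasa  = [4.2, 3.8, 4.5, 4.0, 3.9, 4.4, 4.1, 3.7]  # hypothetical ratings
english = [3.6, 3.5, 4.0, 3.4, 3.8, 3.9, 3.5, 3.3]

diffs = [b - e for b, e in zip(bahasa, english)]
n = len(diffs)
# t = mean difference divided by the standard error of the differences.
t = mean(diffs) / (stdev(diffs) / sqrt(n))  # df = n - 1
print(f"t({n - 1}) = {t:.2f}")
```

With 7 degrees of freedom, a t statistic beyond roughly 2.36 in magnitude would be significant at the two-tailed .05 level, which this toy data comfortably exceeds.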

Taraban et al. (2013) compared US and Indian engineering undergraduates on their application of analytic and pragmatic strategies. The language of instruction in Indian universities is English; however, this is not typically the native language (the mother tongue) of the students. Therefore, reasoning from the findings in Vianty (2007), the researchers predicted lower use of analytic strategies and higher use of pragmatic strategies among Indian students compared to US students. The latter prediction was supported, but not the former: contrary to prediction, Indian students applied analytic strategies significantly more frequently than US students. Pragmatic strategy use was significantly lower than analytic strategy use for US students but not for Indian students, who applied analytic and pragmatic strategies equally often. Contrary to the findings in Vianty (2007), these results suggest that students can make significant use of both analytic and pragmatic strategies in a non-native language.

The most comprehensive cross-linguistic comparison was conducted recently by Gavora et al. (2019), who compared analytic and pragmatic strategy use, measured by variants of the MRSQ, among 2692 students from Poland, Hungary, Slovakia, and the Czech Republic, enrolled in education programs, primarily teacher training and counseling. Students in Hungary, Slovakia, and the Czech Republic reported significantly higher use of pragmatic over analytic strategies. Students in Poland showed the converse preference, reporting significantly more frequent use of analytic strategies. Quite striking in the results were the significant correlations, in all four countries, between pragmatic strategy use and GPA and between analytic strategy use and GPA. Specifically, the correlations showed that more frequent use of both pragmatic and analytic strategies was associated with more successful academic performance.

Gavora et al. (2019) suggest that “In order to succeed academically, students direct their reading processes not towards comprehension but to remembering information, which is the core component of the pragmatic strategy” (p. 12). Their recommendation, that “educators’ attention should be focused on developing especially analytic strategies in students,” is strongly reminiscent of the ardor with which Pressley and colleagues promoted metacognitive reading strategies beginning in the elementary grades.

However, given the significant correlations between both analytic and pragmatic strategy use with GPA, it may be that the predominance of analytic strategies is not what is important, but whether application of either type of strategy – analytic or pragmatic – aids students in their academic achievement. The data from Vianty (2007) may be informative in this regard, specifically, the finding that those students applied pragmatic strategies more frequently than analytic strategies when the context – reading outside their native language – dictated a more pragmatic approach to reading and comprehension.

A relevant point made by Gavora et al. relates to the samples that have been tested to date and the relevance of context to strategy use. They point out that in fields like engineering (e.g., Taraban et al., 2013), the context may support more analytic thinking and analytic strategy use. The Gavora et al. sample consisted of humanities students, which, on their argument, may have resulted in an overwhelming affirmation of pragmatic strategies. Further comparisons across students in different programs are certainly warranted.

Changing Times: The Possible Influence of Technology on Reading

An additional question comes to mind: the effect of widespread technology in instructional settings. When I, like others, am uncertain about a definition, algorithm, theory, etc., I find it very easy to simply Google the point or look for a YouTube video, which I need only read or watch for an explanation. This personal observation suggests that the strategies probed in the MRSQ may, at this point, be incomplete and, in some instances, somewhat irrelevant. The next step should be to ask current students what strategies they use to aid comprehension. Their responses may lead to new insights into contemporary student metacognitions that assist them in learning.

In conclusion, there is no doubt that metacognitive strategies are essential to effective information processing.  However, there may be room to reconsider and update the strategies that students employ when reasoning and searching for information and insights to guide and expand comprehension and learning.  It may be that current technology has made students more pragmatic and a promising goal for further research would be to uncover the ways in which that pragmatism is being expressed through new search strategies.

References

Gavora, P., Vaculíková, J., Kalenda, J., Kálmán, O., Gombos, P., Świgost, M., & Bontová, A. (2019). Comparing metacognitive reading strategies among university students from Poland, Hungary, Slovakia and the Czech Republic. Journal of Further and Higher Education, 1-15.

Pressley, M., & Afflerbach, P. (1995). Verbal protocols of reading: The nature of constructively responsive reading. Hillsdale, NJ: Erlbaum.

Taraban, R., Kerr, M., & Rynearson, K. (2004). Analytic and pragmatic factors in college students’ metacognitive reading strategies. Reading Psychology, 25(2), 67-81.

Taraban, R., Rynearson, K., & Kerr, M. (2000). College students’ academic performance and self-reports of comprehension strategy use. Reading Psychology, 21, 283–308.

Taraban, R., Suar, D., & Oliver, K. (2013). Information literacy of US and Indian engineering undergraduates. SpringerPlus, 2(1), 244.

Vianty, M. (2007). The comparison of students’ use of metacognitive reading strategies between reading in Bahasa Indonesia and in English. International Education Journal, 8(2), 449-460.


How Metacognition Helps Develop a New Skill

by Roman Taraban, Ph.D., Texas Tech University

Metacognition is often described in terms of its general utility for monitoring cognitive processes and regulating information processing and behavior. Within memory research, metacognition is concerned with assuring the encoding, retention, and retrieval of information. A sense of knowing-you-know is captured in tip-of-the-tongue phenomena. Estimating what you know through studying is captured by judgments of learning. In everyday reading, monitoring themes and connections between ideas in a reading passage might arouse metacognitive awareness that you do not understand a passage that you are reading, and so you deliberately take steps to repair comprehension.  Overall, research shows that metacognition can be an effective aid in these common situations involving memory, learning, and comprehension (Dunlosky & Metcalfe, 2008).

[Image from https://www.champagnecollaborations.com/keepingitreal/keeoing-it-real-getting-started]

But what about new situations? If you are suddenly struck with a great idea, can metacognition help? If you want to learn a new skill, how does metacognition come into play? Often we want to develop fluency – to solve problems accurately and quickly. The classic model of skill development proposed by Fitts and Posner (1967) did not explicitly incorporate metacognition into the process. A recent model by Chein and Schneider (2012), however, does give metacognition a prominent role. In this post, I will review the Fitts and Posner model, introduce the Chein and Schneider model, and suggest ways that the latter model can inform learning and development.

In Fitts and Posner’s (1967) classic description of the development of skilled performance there are three overlapping phases:

  • Initially, facts and rules for a task are encoded in declarative memory, i.e., the part of memory that stores facts and knowledge.
  • The person then begins practicing the task, which initiates proceduralization, i.e., the encoding of action sequences into procedural memory, the part of memory dedicated to action sequences. Errors are eliminated during this phase and performance becomes smooth. This phase is conscious and effortful and gradually shifts into the final phase.
  • As practice continues, the action sequence, carried out by procedural memory, becomes automatic and does not draw heavily on cognitive resources.

An example of this sequence is navigating from point A to point B, like from your home to your office. Initially, the process depends on finding streets, paying attention to where you are at any given time, correcting for wrong turns, and other details. After many trials, you leave home and get to the office without a great deal of effort or awareness. Details that are not critical to performance fall out of attention. For instance, you might forget the names of minor streets once they are no longer necessary for finding your way. Another, more academic, example of the Fitts and Posner sequence is learning how to solve math problems (Tenison & Anderson, 2016). There, retrieval of relevant facts from declarative memory and calculation via procedural memory become accurate and automatic, and processing speeds up.

Chein and Schneider (2012) present an extension of the Fitts and Posner model in their account of the changes that take place from the outset of learning a new task to the point where performance becomes automatic. What is distinctive about their model is how they describe metacognition. Metacognition, the first stage of skill development, “guides the establishment of new routines” (p. 78) through “task preparation” (p. 80) and “task sequencing and initiation” (p. 79). “[T]he metacognitive system aids the learner in establishing the strategies and behavioral routines that support the execution of the task” (p. 79). Chein and Schneider suggest that the role of metacognition could go deeper and become a characteristic pattern of a person’s thoughts and behaviors: “We speculate that individuals who possess a strong ability to perform in novel contexts may have an especially well-developed metacognitive system which allows them to rapidly acquire new behavioral routines and to consider the likely effectiveness of alternative learning strategies (e.g., rote rehearsal vs. generating explanations to oneself; Chi, 2000).”

In the Chein and Schneider model, metacognition is the initiator and the organizer. Metacognitive processing recruits and organizes the resources necessary to succeed at learning a task – cognitive resources, physical resources, and human resources. If, for example, I want to learn to code in Java, I should consider what I need to succeed, which might include YouTube tutorials, a MOOC, a tutor, a time-management plan, and so on. Monitoring and regulating the cognitive processes that follow this setup are also part of the work of metacognition, as originally conceived by Flavell (1979). However, Chein and Schneider emphasize the importance of getting the bigger picture right at the outset. In other words, metacognition can work as a planning tool. We tend to fall into thinking of metacognition as a “check-in” for when things go awry. Of course, metacognition can be that, but it can also be helpful on the front end – in setting learning goals, tracking progress toward those goals, and garnering resources to achieve them – especially for the longer-term, challenging, and demanding goals that we set for ourselves. Often, success depends on developing and following a multi-faceted, longer-term plan of learning and development.

In summary, the significant contribution to our understanding of metacognition that Chein and Schneider (2012) make is that metacognitive processing is responsible for setting up the initial goals and resources as a person confronts a new task. With effective configuration of learning at this stage and sufficient practice, performance will become fluent, fast, and relatively free of error.  The Chein and Schneider model suggests that learning and practice should be preceded by thoughtful reflection on the resources needed to succeed in the learning task and garnering and organizing those resources at the outset. Metacognition as initiator and organizer sets the person off on a path of successful learning.

References

Chein, J. M., & Schneider, W. (2012). The brain’s learning and control architecture. Current Directions in Psychological Science, 21, 78-84.

Chi, M. T. (2000). Self-explaining expository texts: The dual processes of generating inferences and repairing mental models. In R. Glaser (Ed.), Advances in instructional psychology, (Vol. 5), pp. 161-238. Mahwah, NJ: Erlbaum.

Dunlosky, J., & Metcalfe, J. (2008). Metacognition. Los Angeles: SAGE.

Fitts, P. M., & Posner, M. I. (1967). Human performance. Belmont, CA: Brooks/Cole.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive-developmental inquiry. American Psychologist, 34, 906-911.

Tenison, C., & Anderson, J. R. (2016). Modeling the distinct phases of skill acquisition. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(5), 749-767.


Metacognitions About a Robot

by Roman Taraban, Ph.D., Texas Tech University

Imagine a time when intelligent robots begin interacting with humans in sophisticated ways. Is this a bit farfetched? Probably not, as there already exist compelling examples of just that. Sophia, a robot, so impressed her Saudi audience at an investment summit in 2017 that she was granted Saudi citizenship. Nadine, another robot, is an emotionally intelligent companion whose “intelligent behavior is almost indistinguishable from that of a human”. The coming exponential rise of artificial intelligence into all aspects of human behavior requires a consideration of possible consequences. If a machine is a billion times more intelligent than a human, as some predict will happen by 2045, what will cognitive and social interactions with such superhuman machines be like? Chris Frith (2012) argues that a remarkable human capacity is metacognition that concerns others. However, what if the “other” is an intelligent machine, like a robot? Is metacognition about a robot feasible? That is the question posed here. Four aspects of metacognition are considered: the metacognitive experience, theory of mind, teamwork, and trust. Other aspects could be considered, but these four should be sufficient to give a sense of the human-machine metacognitive possibilities.

[Image: robot and human hand fist bump]

Flavell (1979) defined metacognitive experiences as follows: “Metacognitive experiences are any conscious cognitive or affective experiences that accompany and pertain to any intellectual enterprise. An example would be the sudden feeling that you do not understand something another person just said” (p. 906). Other examples include wondering whether you understand what another person is doing, or believing that you are not adequately communicating how you feel to a friend. We can easily apply these examples to intelligent machines. For instance, I might have a sudden feeling that I did not understand what a robot said, I might wonder if I am understanding what a robot is doing, or I may believe that I am communicating poorly with the robot. So it appears to be safe to conclude that we can have metacognitive experiences involving robots.

Other instances of metacognition involving intelligent machines, like robots, are problematic. Take, for instance, mentalizing, or Theory of Mind. In mentalizing, we take account of (monitor) others’ mental states and use that knowledge to predict (control) others’ and our own behavior. In humans, the ability to reason about the mental states of others emerges between the ages of 4 and 6 years and continues to develop across the lifespan. In a typical test of this ability, a child observes a person place an object in drawer A. The person then leaves the room. The child observes another person move the object to drawer B. When the first person returns, the child is asked to predict where that person will look for the object. Predicting drawer A is evidence that the child can think about what the other person believes, and that the child recognizes that the other person’s beliefs may not match the child’s own knowledge. Theory of mind metacognition directed towards humans is effective and productive; however, theory of mind metacognition directed towards intelligent machines is not likely to work. The primary reason is that theory of mind is predicated on having a model of the other person and being able to simulate the other person’s experience. Because intelligent machines process information using algorithms and representations that differ from those humans use, it is not possible to anticipate the “thinking” of these machines and thereby predict their behavior in a metacognitive manner, i.e., by having a theory of the other’s mind. Presently, for instance, intelligent machines use deep learning networks and naïve Bayes algorithms to “think” about a problem. The computational methods employed by these machines differ from those employed by humans.
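The logic of the false-belief test can be made concrete in a toy simulation: each agent’s belief is updated only by events that agent actually witnesses, so an absent agent’s belief diverges from the true state of the world. The code below is an illustrative sketch of that logic, not a model of any actual experiment.

```python
# Toy simulation of a false-belief (drawer A / drawer B) task: beliefs update
# only from witnessed events, so the absent person's belief becomes stale.
def run_false_belief_task():
    actual_location = "drawer A"
    beliefs = {"person": "drawer A", "child": "drawer A"}  # both saw placement

    # The person leaves; the object is moved while only the child watches.
    present = {"child"}
    actual_location = "drawer B"
    for agent in present:
        beliefs[agent] = actual_location  # only witnesses update their belief

    # A child with a theory of mind predicts the returning person will search
    # where that person *believes* the object is, not where it actually is.
    prediction = beliefs["person"]
    return prediction, actual_location

prediction, actual = run_false_belief_task()
print(prediction, "vs actual:", actual)  # drawer A vs actual: drawer B
```

The divergence between `prediction` and `actual` is exactly what the child must represent to pass the test: another agent’s belief can differ from the child’s own knowledge.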

What about teamwork? According to Frith (2012), humans are remarkable in their ability to work together in groups. Teamwork accounts for humans’ incredible achievements. The ability to work together is due, in large part, to metacognition. The specific factor cited by Frith is individuals’ willingness to share and explain the metacognitive considerations that prompted their decision-making behavior. For group work to succeed, participants need to know the goals, values, and intentions of others in the group. As has been pointed out already, machine intelligence is qualitatively different from human knowledge, so that is one barrier to human-machine group work. Further, the benefits of group work depend on a sense of shared responsibility. It is currently unknown whether or how a sense of cooperation and shared responsibility would occur in human-machine decision making and behavior.

There is one more concern related to machine intelligence that is separate from the fact that machines “think” in qualitatively different ways compared to humans. It is an issue of trust. In some cases of social interaction, understanding information that is being presented is not an issue. We may understand the message, but wonder if our assessment of the source of the information is reliable. Flavell (1979) echoed this case when he wrote: “In many real-life situations, the monitoring problem is not to determine how well you understand what a message means but to determine how much you ought to believe it or do what it says to do” (p. 910). When machines get super smart, will we be able to trust them? Benjamin Kuipers suggests the following: “For robots to be acceptable participants in human society, they will need to understand and follow human social norms.  They also must be able to communicate that they are trustworthy in the many large and small collaborations that make up human society” (https://vimeo.com/253813907).

What role will metacognitions about super-intelligent machines play in the future? Here I argue that we will have metacognitive experiences involving these machines; those experiences will occur when we monitor and regulate our interactions with them. However, it is not clear that we will attain the deeper aspects of metacognition, like theory of mind, because the computations underlying machine intelligence are qualitatively different from human computation. Finally, will we be able to trust robots with our wealth, our children, our societies, our lives? That will depend on how we decide to regulate the construction, training, and deployment of super-intelligent machines. Flavell (1979) often brings affect, emotion, and feelings into the discussion of metacognitive experiences, and Kuipers emphasizes trust and ethics. These are all factors that computer scientists have not begun to address in their models of intelligent machine metacognition (Anderson & Oates, 2007; Cox, 2005). Hopefully solutions can be found that enable rich and trustworthy relationships with smart machines.

References

Anderson, M. L., & Oates, T. (2007). A review of recent research in metareasoning and metalearning. AI Magazine, 28(1), 12.

Cox, M. T. (2005). Field review: Metacognition in computation: A selected research review. Artificial Intelligence, 169(2), 104-141.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906-911.

Frith, C. D. (2012). The role of metacognition in human social interactions. Philosophical Transactions of the Royal Society B: Biological Sciences, 367(1599), 2213-2223.


Practicing Metacognition on a Chatbot

by Roman Taraban, Ph.D., Texas Tech University

Cognition involves many kinds of processes. There is categorization, problem solving, decision making, and comprehension, among others. Metacognition may involve these processes, but it is different from them. Metacognitive thinking is thinking about the processes themselves (Draeger, 2015): thinking about the processes involved in categorization, comprehension, and so on, and how these processes relate to one’s information-processing capabilities. John Flavell, who coined the term metacognition, suggested that metacognitive processing relates not only to the individual thinker but to others as well: “Metacognitive knowledge is one’s stored knowledge or beliefs about oneself and others as cognitive agents, about tasks, about actions or strategies, and about how all these interact to affect the outcomes of any sort of intellectual enterprise” (Flavell, 1979, p. 906). Yet how thinking in another person informs one’s own metacognitive knowledge is seldom considered in discussions of metacognition. In this post, I describe how reflecting on the way others, and specifically machines, process information can inform a person’s understanding of his or her own information processing. The metacognitive processes of interest here are those related to language, and the machine processing of interest is that of systems called chatbots.

Chatbots are computer programs that interact with a person auditorily or through text. They are designed to communicate as much as possible like humans, in order to convey a sense of natural language communication. Chatbots are typically developed for commercial purposes, to provide customer service, for instance, or information about products or places. You will find chatbots on websites for companies, organizations, and events.
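At their simplest, chatbots of this kind amount to pattern matching over the user’s words. The sketch below is a deliberately minimal, hypothetical example (the rules and replies are invented); commercial chatbot platforms add intent classification, dialogue state, and much more:

```python
import re

# A minimal pattern-matching chatbot. Rules and replies are invented for illustration.
RULES = [
    (re.compile(r"\b(hi|hello)\b", re.I), "Hello! Ask me about movie showtimes."),
    (re.compile(r"\bmovies?\b", re.I), "Now playing tonight at 7:00 and 9:30."),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Goodbye!"),
]

def reply(user_input: str) -> str:
    """Return the canned reply for the first matching rule."""
    for pattern, response in RULES:
        if pattern.search(user_input):
            return response
    return "Sorry, I don't understand."  # fixed fallback: no handling of novel input
```

The fixed fallback reply is the telling part: anything outside the scripted patterns receives the same canned response, which is exactly the kind of constraint on language interaction that a reflection exercise can surface.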

Recently I taught a graduate seminar on psycholinguistics, which is concerned with language acquisition, production, and comprehension. I assigned students the task of building chatbots for an application that interested them, for instance, a chatbot that could inform a user of the movies currently playing around town and show times. After students had built their chatbots and demonstrated them to the class, I assigned a written take-home metacognitive activity in which students had to discuss some aspect of the nature of chatbot language, for example, ways in which chatbot language might reduce moral relativity, constrain language interactions, or homogenize language. Students essentially had to think about the language processing constraints in chatbots and how that might affect their language interactions.

Students built chatbots to do everything from helping a student choose colleges for graduate work, college courses, movies, and restaurants, to guiding workouts or choosing a football game to watch. In their subsequent metacognitive reflection assignment, students had plenty to say:

  • Chatbots are peculiar devices.
  • Chatbots do not process language as humans would.
  • Chatbots, because of their limited cognitive capabilities, cannot respond to novel stimuli in conversations and therefore cannot problem-solve or be socially engaging.
  • Chatbots have higher potential of providing logical, true and precise answers than humans.
  • The nature of chatbot language has positive characteristics that reinvent the notion of interaction, and negative characteristics that create many confusions and misinterpretations about the use of a language.

Using chatbots as a foil prompted students to consider the nature of their own language processes. As a few examples:

For example, human communication is not a mere string of words put together to make meaning, rather it employs many other resources which feed communication such as the extralinguistic, paralinguistic and metalinguistic cues in order to achieve successful communication. I think that chatbots cannot perform such complex task as efficiently as most people do.

Although chatbots may serve humans as they interact with them, I think they do so with a structured sort of language which is intended to perform very specific tasks. As human language is inherently relative and creative, I think chatbots need much improvement to sound like humans if we need them to interact more “naturally.” In terms of human language, a unique characteristic is the ability to process linguistic and non-linguistic inputs. As humans we can process such inputs with the help of our background knowledge, working memory and other brain functions. Our judgements are further constrained, shaped or developed by moral relativity, i.e. the philosophical standpoints given or attributed by the cultures and societies we belong.

The students’ reflections on chatbot language processing fit Flavell’s (1979) suggestion that metacognition includes beliefs about others as cognitive agents, that is, as intelligent communicative actors. Often, learning about metacognitive strategies begins by observing others and implicitly mimicking their behaviors. For instance, as children we may notice someone writing down or looking up a phone number, and we adopt these specific processes for managing information. Knowledge of the strategy becomes more explicit the first time we fail to apply it and cannot remember a phone number. We observe classmates repetitively reviewing notes and self-testing, and we adopt these methods of regulating and monitoring study behavior. We rarely, if ever, create objects like chatbots and use them, as in the present case, to reflect on others’ and our own cognitive processes as a way of learning. However, as AI technology and products become more prevalent, many natural opportunities arise to think about and compare machines’ processes to our own. Of course, to qualify as metacognitive thinking, reflections on human versus machine processing have to go beyond superficial comments like “My Alexa is not too smart.” To be metacognitive, thinking has to be about the processes themselves, in the machine and in the person.

The theme of this post is that metacognition is not only thinking about one’s own thinking, but also thinking about thinking in the entities, human or machine, with whom we communicate. Building a chatbot gives students direct contact with the processes in the machine and a bridge to reflecting on their own processes by comparison; it forces them to consider the strengths and limitations of both kinds of language. There are other instances where this type of metacognitive knowledge comes into play naturally. Take child-directed speech (a.k.a. motherese, or baby talk). Caretakers adjust their intonation, vocabulary, and rhythm when speaking to infants. They sense that an infant processes language differently, so they adjust their own speech to accommodate. Similarly, in the classroom or at a conference, we become aware (sometimes depressingly) that our message is not connecting, and we may adjust our speed, terminology, examples, and so on. The difference between those situations and the present one is that they may lack a moment of deliberate metacognitive reflection: How is the other person processing information compared to how I am processing it? Flavell reminds us that this, too, is metacognitive. Here I am suggesting that we can make those moments more deliberate; indeed, we can turn them into class assignments!

References

Draeger, J. (2015). Two forms of ‘thinking about thinking’: metacognition and critical thinking. Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking/.

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906-911. doi.org/10.1037/0003-066X.34.10.906


Hate-Inspired Webforums, PTSD, and Metacognition

by Roman Taraban, Texas Tech University

In linguistics, a register is a variety of speech used for distinct purposes in particular social settings. Consistent with that terminology, I use the term discourse register here to refer to the specific terms, meanings, and vocabularies that groups use in order to achieve distinct purposes. Unlike a dictionary, a register is concerned not so much with the meanings of words as with their association with cognitions, affects, and behaviors. A discourse register can link together phenomena as disparate as hate speech, PTSD, and metacognition, because each applies a specific vocabulary and manner of speech. The first purpose of this blog post is to suggest that these disparate phenomena are similar in the way they operate. The second is to suggest a way of increasing our understanding of metacognitive processing by adapting technology that has already been extensively applied to hate-inspired webforums and trauma-related therapies.

Regarding hate speech, the internet has given radical-right groups the means to organize networks that often promote bias, bigotry, and violence. An example is Stormfront (https://www.stormfront.org/forum/), established by white supremacist and ex-felon Don Black in 1996 (Figea, 2015). Right-wing extremists use the internet to build identity and unity with “like-minded” individuals. This has prompted researchers and government analysts to analyze extremist communications in order to gain an understanding of these groups and, importantly, to seek out key indicators that could signal future events (Figea, 2015; Figea et al., 2016).

What are the key indicators in extremist communications? The answer lies in part in the concept of a discourse register. It consists of the specific vocabulary and ways of communicating that characterize the shared conversations and practices of a group. For example, Figea (2015) applied machine learning to analyze Stormfront forum exchanges in an attempt to assess the level of three affects: aggression towards an outgroup, racism, and worries about the present or future. A sample of forum posts was classified by humans for the affects, then a machine was trained on the human classifications and tested on a new sample of forum posts. Key indicators for the racism affect were black, race, Jew, protest, and Zionist, corresponding to topics in the forums associated with Black inferiority, Jewish conspiracy, and government corruption (Figea, 2015).
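The classify-then-train-then-test pipeline just described can be sketched with a bag-of-words naïve Bayes classifier, a common baseline for this kind of affect classification. The code below is a generic illustration; the labels and any example posts used with it are invented placeholders, not data or indicators from the actual study:

```python
import math
from collections import Counter, defaultdict

def train(posts):
    """posts: list of (text, label) pairs that humans have hand-coded."""
    word_counts = defaultdict(Counter)  # per-label word frequencies
    label_counts = Counter()            # how many posts carry each label
    for text, label in posts:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    vocab = {w for counts in word_counts.values() for w in counts}
    return word_counts, label_counts, vocab

def classify(text, model):
    """Score each label by log prior + log likelihood (add-one smoothing)."""
    word_counts, label_counts, vocab = model
    total_posts = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total_posts)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

A machine trained this way simply learns which words co-occur with which human-assigned affect labels, which is why the “key indicators” reported by Figea fall out of the model as the most diagnostic words.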

The idea of a shared discourse provides the theoretical glue that binds together the activities, speech, and shared identity of groups of individuals. In some cases, the analysis of discourse has provided insights into the motivations and behaviors of extremist and terrorist groups, as described by Figea and colleagues (Figea, 2015; Figea et al., 2016). In other cases, researchers have applied discourse analysis to prosocial activities such as counseling and therapy. Pennebaker and King (1999) proposed that “the way people talk about things reveals important information about them” (p. 1297). To assist in their analyses, Pennebaker and colleagues developed and tested the LIWC (Linguistic Inquiry and Word Count) software. LIWC has been successfully applied to the analysis of texts in a variety of contexts and across a wide range of dimensions, including emotionality, social status, group processes, close relationships, deception and honesty, thinking styles, and individual differences (Tausczik & Pennebaker, 2010).

Jaeger et al. (2014) examined the associations between trauma-related conditions (e.g., PTSD, depression, anxiety) and the content of narratives written by trauma patients. The researchers found significant differences between daily and trauma-related narratives in the use of cognitive-mechanism words (e.g., cause, know, ought) and negative-emotion words (e.g., hate, worthless, enemy). There were also strong associations between the words that patients used and the severity of their trauma. The approach and outcomes in Jaeger et al. were similar to those of Figea and colleagues.

A perk of the LIWC software is that it allows individuals to develop their own specialized dictionaries and to import those dictionaries into LIWC to analyze language use for evidence of the target constructs. When individuals express sadness, they use words like sad, loss, cry, and alone (Pennebaker & King, 1999); sadness is part of a person’s emotion register. Can we apply this analytic approach to metacognition and ask: What is the discourse of metacognition? As instructors, how do the ways we talk about teaching reflect a metacognitive register, i.e., words that reflect an understanding of cognitive functioning, learning, limitations, self-regulation, monitoring, scaffolding, and so on? How do the ways we talk about students, classrooms, homework, and student collaboration mirror metacognitive understanding and processing? Current technology allows us to begin exploring these questions. Following the model provided by Figea (2015; Figea et al., 2016), one place to start might be this Improve With Metacognition (IWM) forum. Published scholarship on metacognition would be another source of texts for training and testing. Human coders would code sentences in a sample of the texts as involving or not involving metacognition; these classifications would be used to train a machine, and after training, the machine would be tested on a new sample of texts.
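The kind of custom-dictionary analysis that LIWC supports boils down to counting the proportion of a text’s words that fall in each category list. A minimal sketch follows; the category word lists here are invented examples, not a validated metacognition dictionary of the sort the expert team described below would have to produce:

```python
# LIWC-style category analysis: the proportion of a text's words that match
# each custom dictionary category. The word lists are invented illustrations.
DICTIONARY = {
    "metacognition": {"monitor", "regulate", "strategy", "plan", "reflect"},
    "sadness": {"sad", "loss", "cry", "alone"},
}

def category_rates(text):
    """Return, for each category, the fraction of words in text that match it."""
    words = text.lower().split()
    return {
        category: sum(w in wordset for w in words) / len(words)
        for category, wordset in DICTIONARY.items()
    }
```

Running this over a corpus of instructors’ talk about teaching would yield a per-category rate for each text, which is the raw signal a metacognitive register analysis would work from.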

Development of a metacognitive register is subject to the same constraints as any good scholarship. The developers need to be experts in the area of metacognition, with a clear grasp of how metacognition works. The linguistic-analysis dictionary that they develop needs to be accurate and comprehensive, and it needs to be a team effort: one individual cannot do it alone. The dictionary needs to be tested for construct validity, internal consistency, and reliable results across a variety of participants and contexts. In spite of the challenges inherent in the task, the prospect of a ready analytic tool for metacognition could help advance the application of this powerful suite of metacognitive processes in classrooms.

 

References

Figea, L. (2016). Machine learning for affect analysis on white supremacy forum. Retrieved from https://uu.diva-portal.org/smash/get/diva2:955841/FULLTEXT01.pdf

Figea, L., Kaati, L., & Scrivens, R. (2016). Measuring online affects in a white supremacy forum. In 2016 IEEE Conference on Intelligence and Security Informatics (ISI). DOI: 10.1109/ISI.2016.7745448

Pennebaker, J. W., & King, L. A. (1999). Linguistic styles: Language use as an individual difference. Journal of Personality and Social Psychology, 77(6), 1296-1312.

Tausczik, Y. R., & Pennebaker, J. W. (2010). The psychological meaning of words: LIWC and computerized text analysis methods. Journal of Language and Social Psychology, 29(1), 24-54.


Does a Machine Have Metacognition?

by Roman Taraban, Ph.D.,  Texas Tech University

In the movie Arrival, the character Louise Banks is portrayed as a linguist who can decipher an alien language. For much of this task, Louise and colleagues are doing pattern matching, trying to establish a correspondence between English and the alien language. A critical piece of the plot is the interpretation given to the translation of the alien message “offer weapon.” Louise’s competitors interpret this as “use weapon” and propose to attack and destroy the aliens. Louise, by contrast, considers whether weapon might instead mean something like “tool” or “technology.” From a metacognitive perspective, we might describe the competitors as thinking at a cognitive level, interpreting the phrase literally and acting accordingly. Louise, we could say, acted metacognitively, questioning whether the literal cognitive process was sufficient. Throughout the movie, Louise questions the sufficiency of her cognitive resources for the task at hand and figures out how to overcome her limitations. In the end, metacognition saves the day.

Normally, we think of metacognition as a valuable add-on to everyday thinking. The metacognitive individual in a sense transcends his or her own limitations: the person recognizes the limits of memory storage and processing speed, and the value of external memory, spaced practice, planning, and so on. With this recognition comes a search for, and discovery of, strategies for managing cognition. This “higher-order” processing is metacognitive, and in the movie Arrival, Louise Banks is our metacognitive hero.

Although we are inclined to attribute metacognition to bright individuals like Louise Banks, can we dismiss the possibility that metacognition exists in “dumb” machines – dumb in the sense that they do not have human-like understanding? Intelligent machines, like computers, process patterns mechanically. Does a computer need to experience metacognition as a human does in order for the process to be regarded as metacognitive? Is a jolt of adrenalin a necessary part of the recognition process that signals we should monitor and check our calculations? The present blog is not about some distant aliens, but about a smart machine that is widely used in many different applications today: IBM’s Watson.

There are clearly some areas in which today’s computers do not need to be metacognitive. Humans can hold only about 7 ± 2 chunks of information in short-term memory. An intelligent system like IBM’s Watson (https://www.ibm.com/watson/developercloud/nl-classifier.html) has 15 terabytes of cache memory and runs at 80 teraflops, so neither short-term memory capacity nor processing speed is an issue. Metacognitive processes for recognizing and preserving short-term memory would seem to be pointless, as would many of the metacognitive resource-management strategies that humans depend on. Would IBM Watson need to grab a pencil and jot a phone number onto a scrap of paper? Not likely.

There may be other ways, though, that machines could exhibit metacognitive behaviors. For instance, a machine like IBM Watson might know that humans are limited in how much information they can process in a unit of time. As a metacognitive strategy, Watson might control and monitor the rate at which it verbalizes in conversation. Watson might change its linguistic register when conversing with a young child (https://www.youtube.com/watch?v=vqjndtS8jQU). Watson could attempt to choose an appropriate theme with specific speakers; in a commercial with Bob Dylan, Watson wisely chose to talk about Dylan (https://www.youtube.com/watch?v=oMBUk-57FGU). Watson apparently can monitor and modulate its own behavior depending on the context, goal, and particular interaction.

What about monitoring its own resources? If we gave Watson a set of test questions, it is not likely that Watson would reason about them metacognitively as a human would – for example, by recognizing that word problems are more difficult than straight calculations and so attacking the calculations first. However, it is not difficult to imagine situations in which Watson could reason metacognitively about its own processing and plan, control, and monitor those processes. For instance, recognizing that certain information is more critical in the context of a crisis, Watson could modify the order in which information is compiled and provided to, say, paramedics at the scene of a disaster. This would involve prioritizing specific information, queueing it up in a specific order, delivering it, and monitoring its effective transfer to the paramedics.
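The prioritize-queue-deliver sequence imagined here maps naturally onto a priority queue. A minimal sketch, with invented triage categories and priorities (this is an illustration of the data structure, not anything Watson actually runs):

```python
import heapq

def deliver_in_priority(messages):
    """messages: (priority, text) pairs; a lower number means more critical.
    Returns the texts in the order a system like the one imagined above
    would release them to responders."""
    heap = list(messages)
    heapq.heapify(heap)          # order by priority, most critical first
    delivered = []
    while heap:
        _, text = heapq.heappop(heap)
        delivered.append(text)   # a real system would also monitor receipt here
    return delivered
```

The metacognitive part would not be the queue itself but deciding, from the context, which priorities to assign and monitoring whether the delivery order is actually helping the paramedics.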

The irony, perhaps, is that Watson is not exhibiting “transcendent” behavior, which is how we might view metacognition in humans. Instead, Watson is simply carrying out mechanical computations which, in a sense, are like any other computations that Watson carries out. The existence of machines like Watson should prompt us to ask whether our own metacognitive ruminations may also simply be computational processes. Perhaps the sense that metacognition involves “higher-order” thinking, the self-pride we take in thinking about thinking, is an add-on, an epiphenomenon on which actual metacognitive processing in no way depends. In any case, the question of whether computers can be designed to use metacognitive strategies, to plan and modulate behaviors depending on circumstances, and to monitor the success of their efforts, may deserve an affirmative “yes.”


Bringing a Small Gift – The Metacognitive Experience

by Roman Taraban, Ph.D.,  Texas Tech University

In the Christmas song “The Little Drummer Boy,” the young boy brings his humble gift to the “mighty king,” presenting it from the heart. This is an apt situation to bring up at this time of year, for you, too, might have received a small gift. For ‘tis the season for metacognition.

John Flavell and others generally describe metacognition as thinking about thinking. More specifically, “Metacognitive knowledge is one’s stored knowledge or beliefs about oneself and others as cognitive agents, about tasks, about actions or strategies, and about how all these interact to affect the outcomes of any sort of intellectual enterprise” (Flavell, 1979, p. 906). Flavell (1979) broadened metacognitive theory to include affect: “Metacognitive experiences are conscious cognitive or affective experiences that occur during the enterprise and concern any aspect of it—often, how well it is going” (p. 906). Affect is important as part of metacognitive experiences because the feeling that something is difficult to comprehend, remember, or solve may trigger careful metacognitive reflection and changes in goals or strategies (Papaleontiou-Louca, 2008). Nuhfer (2014), in a related vein, affirms the crucial role of affect in developing students’ metacognitive skills: “[A]ttempts to develop students’ metacognitive proficiency without recognizing metacognition’s affective qualities are likely to be minimally effective.”

So what is that gift I mentioned earlier? It’s your end-of-semester evaluations, of course. There we ask students to evaluate and comment on whether the course objectives were specified and followed by the instructor, whether the instructor was an effective teacher, and whether the course was a valuable learning experience. These questions prompt students to think about their thinking in the course. Without prompting, students spontaneously comment on their affect as well. And here come the gifts, some of them encouraging, pleasant, and precious as gold. Here are a few examples: “I very much enjoyed the discussions and deeper exploration of the material.” “I felt that the papers pushed me to genuinely consider and critically evaluate the material in a way I may not have otherwise. Thank you for an enjoyable and thought-provoking seminar.” “This has been my favorite psychology class. The work assignments were challenging and (dare I say) fun.”

But sometimes the gift can be a bit disconcerting. There was one unfortunate December when I received my course evaluations just before leaving on a family vacation to Las Vegas. I had gone through the semester thinking how wise I was and how well things were going. The students told me otherwise. They explained why I deserved those low ratings, which meant they had to think about their metacognitive experience – what it was like learning the material in my course and how they felt about the process. For a week, I was inconsolable. But the students had gotten my attention. I realized I had become too complacent. I had to think deeply about my thinking about how to organize and deliver the course; I had to engage in metacognitions about teaching. And it wasn’t just about the knowledge I had and they had (or had not). It was also about the affect – how I felt about the course, myself, and the students, in the context of those metacognitions.

That semester was a gift. Every semester is a gift. But we have to accept the gift for it to be meaningful and make a difference. So…all good tidings for the season – I mean, end of the semester.

References

Flavell, J. H. (1979). Metacognition and cognitive monitoring: A new area of cognitive–developmental inquiry. American Psychologist, 34(10), 906-911. doi.org/10.1037/0003-066X.34.10.906

Nuhfer, E. (2014). Self-assessment and the affective quality of metacognition: Part 1 of 2. Retrieved from https://www.improvewithmetacognition.com/self-assessment-and-the-affective-quality-of-metacognition-part-1-of-2/

Papaleontiou-Louca, E. (2008). Metacognition and theory of mind. Newcastle, UK: Cambridge Scholars Publishing.

 


A Whole New Engineer: A Whole New Challenge

by Roman Taraban, Ph.D.,  Texas Tech University

Cognitive psychologists Kahneman and Tversky (1973) presented their study participants with a stereotypical description of an engineer:

Jack is a 45-year old man. He is married and has four children. He is generally conservative, careful, and ambitious. He shows no interest in political and social issues and spends most of his free time on his many hobbies, which include home carpentry, sailing, and mathematical puzzles. (p. 241)

When asked if they thought Jack was an engineer, 90% of the participants thought he was.

Whatever stereotypes of engineers persist to the present day (e.g., geek, introvert, asocial: http://www.thecreativeengineer.com/2008/12/16/a-few-engineering-myths/), various parts of the engineering community are trying to create “a whole new engineer” (Goldberg & Somerville, 2014). Cross-disciplinary centers have been established at universities, like iFoundry, launched in 2008 at the University of Illinois, in order to prepare engineering students for working in the 21st century. One mandate was to promote “deep reflection and attention to the complex system in which engineering education is embedded” (https://ifoundry.illinois.edu/who-we-are/what-ifoundry).

On a larger scale, the Franklin W. Olin College of Engineering admitted its first class in 2002 in order to implement a full-scale, hands-on, project-based design curriculum. Olin College provides students with funding for “passionate pursuits,” personal projects of academic value proposed by students (https://en.wikipedia.org/wiki/Franklin_W._Olin_College_of_Engineering). STEM is being transformed into STEAM, where the added A represents Artful Thinking in the context of Science, Technology, Engineering, and Mathematics (Radziwill et al., 2015). To develop artful thinking, a facilitator might present a painting and ask students: What do you see? What does it make you think? What is happening? Why do you think so? These questions help learners develop dispositions to observe, describe, question, reason, and reflect. The whole new engineer is becoming a whole lot of things, but is the new engineer becoming more metacognitive?

We know that engineering students can be metacognitive when solving textbook problems (Taraban, 2015). Indeed, by now there is an extensive corpus of research on students’ textbook problem-solving in introductory physics and other areas of STEM. Explaining the material to oneself with the knowledge that this will help one better understand it, or testing oneself with the knowledge that this will help one more reliably retrieve the information later, are examples of metacognitive processes and knowledge. Case and Marshall (1995) described a developmental pathway by which students transition towards deeper understanding of domain concepts and principles, which they labeled the conceptual deep approach to learning, and which is: “relating of learning tasks to their underlying concepts or theory” with the intention “to gain understanding while doing this” (p. 609). Basically, their suggestion is that over the course of development students recognize that a goal of learning is to understand the material more deeply, and that this recognition guides how they learn. Draeger (2015), and others, have suggested that this kind of monitoring of the effectiveness of learning strategies and regulating one’s behavior are characteristic of metacognitive thinking.

The current re-design of the traditional engineer involves sweeping changes, in the classroom, in the university, and in professional practice, and it aims to do this, in part, by infusing more reflection into engineering training and practice. So, what is a reflective practitioner, and are reflective practitioners metacognitive thinkers?

Schön (1987) suggested that reflective practitioners think carefully about what they are doing as they are doing it. Reflective practitioners assess and revise their existing practices and strive to develop more effective behaviors. They critically assess their behavior as a means to improving it. As Schön (1987) puts it, reflective practice is a “dialogue of thinking and doing through which I become more skillful” (p. 31). Schön maintained “that there is a core of artistry, an exercise of intelligence, and a kind of knowing inherent in professional practice, which we can only learn about by carefully studying the performance of extremely competent professionals” (Osterman, 1990, p. 133).

Through reflective practice we submit our behaviors to critical analysis, asking questions like these: What am I doing? What effect is it having? (Osterman, 1990). This very much reminds one of the distinction that Draeger (2015) made between metacognition and critical thinking. Specifically, one can be a critical thinker without being metacognitive. The two processes can overlap but are not identical. Simply, to be metacognitive, one would need to think about the reflective processing itself. Metacognitions would involve knowledge of the benefits of reflective practice, how it relates to self, and metacognitive processes related to monitoring and controlling the reflective practices. Imagine observing any expert – an expert teacher, an expert golfer, an expert acrobat – and striving to mimic that expertise through carefully observing and critiquing one’s own performance. That’s reflective practice. It’s about trying to get a job done in the best possible way. In a complementary fashion, metacognitive knowledge and processing involve intentionally and consciously monitoring and regulating those reflective practices.

In A Whole New Engineer (Goldberg & Somerville, 2014) the authors assert that

Here we are calling attention to the importance of the Whole New Engineer’s ability to do three things:

  • Notice and be aware of thoughts, feelings, and sensations.
  • Reflect and learn from experience.
  • Seek deeper peace, meaning, and purpose from noticing and reflection. (p. 114)

Goldberg and Somerville (2014) call for being more attentive and sensitive to surroundings, for noticing and reflecting, but not necessarily for being metacognitive in those contexts – they are not explicit on the latter point. Thus, it may be safe to say that being metacognitive doesn’t automatically come through reflective practice, critical thinking, mindfulness, or artful thinking strategies. Metacognition represents a distinct type of knowledge and process that can potentially enhance the effects of each of these practices. The whole new engineer can be a whole lot of things, but is not automatically a metacognitive engineer. Simply, an engineering student, or even a practicing engineer, can be good at certain design projects, for instance, and develop a critical eye for that work, without necessarily developing metacognitive awareness of when to shift strategies or techniques in order to be more effective.

References

Draeger, J. (2015). Two forms of ‘thinking about thinking’: metacognition and critical thinking. Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking/

Goldberg, D. E., & Somerville, M. (2014). A whole new engineer: The coming revolution in engineering education. Douglas, MI: ThreeJoy Associates.

Kahneman, D., & Tversky, A. (1973). On the psychology of prediction. Psychological Review, 80(4), 237-251. http://dx.doi.org/10.1037/h0034747

Osterman, K. F. (1990). Reflective practice: A new agenda for education. Education and Urban Society, 22(2), 133-152.

Radziwill, N. M., Benton, M. C., & Moellers, C. (2015). From STEM to STEAM: Reframing what it means to learn. The STEAM Journal, 2(1), Article 3.

Schön, D. (1987). Educating the reflective practitioner. How professionals think in action. London: Temple Smith.

Taraban, R. (2015). Metacognition in STEM courses: A developmental path. Retrieved from https://www.improvewithmetacognition.com/metacognition-in-stem-courses-a-developmental-path/


Don’t “Just Do It” – Think First

by Roman Taraban, PHD, Texas Tech University

“Just Do It” has been a great slogan for selling athletic equipment and has also spawned some humorous spinoffs, like Bart Simpson’s “Can’t someone else just do it?” And isn’t that how we sometimes solve problems – “Don’t think, just do it”? Although just doing it (or getting someone else to do it) may have some visceral appeal, models for teaching argue against just doing it when it comes to solving problems.

One of the most influential problem-solving models is Polya’s (1957) 4-step model: i) understand the problem, ii) develop a plan, iii) carry out the plan, and iv) look back. In this model, solvers don’t “do it” until the third step. What is really striking about this model is that it is mostly about critical thinking and metacognitive processing. The principles of understanding the problem, planning one’s approach to solving the problem, and reflecting on the solution after “doing it” all require critical thinking and metacognition (Draeger, 2015). STEM disciplines have generally embraced the Polya model, suggesting that commitments to metacognitive thinking by researchers and instructors are widespread and well-entrenched. Two disciplines will be considered here to make that point: mathematics and engineering.

In a research study in mathematics, Carlson and Bloom (2005) collected and analyzed the problem solving behaviors of twelve expert mathematicians. The data showed that the mathematicians engaged in metacognitive behaviors and decisions that were organized within a general problem-solving framework consisting of Orienting, Planning, Executing, and Checking. One of the phases, Executing, is where one “does it” – the others are more metacognitive. Researchers have developed comparable models for problem-solving in engineering. These models preface equation-crunching with understanding the problem and planning a solution, and follow up with reflection on the solution. This is exemplified in the six-step McMaster model: Engage, Define the Stated Problem, Explore, Plan, Do It, and Look Back (Woods et al., 1997).

In spite of teachers’ best intentions, might students still just do it? Certainly! An alternative to metacognitive planning before doing, and monitoring, regulating, and reflecting, is to apply a purely rote strategy (Garofalo & Lester, 1985), also termed a “plug and chug” method (Maloney, 2011). Plug and chug in physics and engineering involves a mental search for equations that will solve the problem, but with little conceptual understanding of the nature of the problem, little strategic decision-making, and little metacognitive self-reflection and regulation of the solution process. In disciplines not involving equations, various matching and cut-and-paste strategies could qualify as plug-and-chug. James Stice, a distinguished professor in chemical engineering, described part of his own engineering training (Stice, 1999) that suggests how plug-and-chug may come about:

“When I was an undergraduate student, many of my professors would derive an equation during lecture, and then would proceed to work an example problem. They would outline the situation, invoke the equation, plug in the numbers and arrive at a solution. What they did always seemed very logical and straightforward, I’d get it all down in my notes, and I’d leave the class feeling that I had understood what they had done. Later I often was chagrined to find that I couldn’t work a very similar problem for homework.” (p. 1)

Much of the motivation for research on how experts solve problems, like that of Carlson and Bloom (2005), has been to develop didactic models for the classroom, like the six-step McMaster model in engineering (Woods et al., 1997). These didactic models have been developed largely in response to the absence of metacognitive thinking among students.

Although teaching methods could account for some of the absence of metacognitive thinking in beginning students, domain-specific knowledge may also be a factor. Few would disagree that domain-specific knowledge plays a key role in successful problem solving. Indeed, Carlson and Bloom attributed the expertise of their mathematicians, in part, to “a large reservoir of well-connected knowledge, heuristics, and facts” (p. 45). Can a novice student readily access domain-related facts, organize information within the problem, muse, imagine, and conjecture over possible strategies, apply heuristics, and effectively monitor progress? Of course not. Obviously, the absence of domain-specific knowledge in beginning students enables and motivates the teaching of domain-specific knowledge. But I would like to argue that the absence of domain-specific knowledge also enables and motivates teaching students metacognitive processes. This may seem illogical, but it’s not. The point is that an absence of domain-specific knowledge provides instructors with a great opportunity to teach the domain-specific knowledge but also how to think about thinking about that knowledge, that is, how to be metacognitive while learning facts and procedures.

Getting students to “Think, then Do It” will require more than working examples for them on the blackboard in order to convey domain-specific knowledge. Instead, within a framework like that provided by Carlson and Bloom, the metacognitive processes at each step of solving the problem should also be modeled. Some students may show metacognitive behaviors early on, and all successful students will eventually catch on. However, to truly function as a pedagogical principle, metacognitive modeling needs to be part of the learning situation. A model of metacognitive instruction for students (Scharff, 2015) could be guided by the seminal work on scaffolding metacognitive processes by Brown and Palincsar (1982). The point is to take the domain-specific knowledge that you are trying to convey and to model and scaffold it for students, along with the metacognitive decisions and control that go with expert problem solving, and to do so early in instruction. It is worth mentioning that James Stice, who was taught to plug and chug, became a follower and proponent of the six-step McMaster model as a professor of chemical engineering.

There is an old Jack Benny joke. Jack Benny was a comedian known for being a cheapskate. One night a thug stopped him – “Don’t make a move bud, your money or your life.” After a long pause, the thug, clearly annoyed, repeated – “Look bud, I said….Your money or your life.” Jack Benny: “I’m thinking it over.” Just to be fair, sometimes we should just Do It and not think too much about it. When it comes to teaching and learning, though, thinking about thinking is better.

References

Brown, A. L., & Palincsar, A. S. (1982). Inducing strategic learning from texts by means of informed, self-control training. Tech. Report No. 262. Urbana: University of Illinois Center for the Study of Reading.

Carlson, M. P., & Bloom, I. (2005). The cyclic nature of problem solving: An emergent multidimensional problem-solving framework. Educational Studies in Mathematics, 58, 45-75.

Draeger, J. (2015). Two forms of ‘thinking about thinking’: metacognition and critical thinking. Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking/

Garofalo, J., & Lester Jr., F. K. (1985). Metacognition, cognitive monitoring, and mathematical performance. Journal for Research in Mathematics Education, 16(3), 163-176.

Maloney, D. P. (2011). An overview of physics education research on problem solving. In Getting Started in PER, Reviews in PER, Vol. 2. College Park, MD: American Association of Physics Teachers. http://opus.ipfw.edu/physics_facpubs/49

Polya, G. (1957). How to solve it. Princeton, NJ: Princeton University Press.

Scharff, L. (2015). What do we mean by ‘metacognitive instruction’? Retrieved from https://www.improvewithmetacognition.com/what-do-we-mean-by-metacognitive-instruction/

Stice, J. (1999). Teaching problem solving. In Teachers and students – A sourcebook (Section 4). University of Texas at Austin: Center for Teaching Effectiveness. Retrieved from http://www.utexas.edu/academic/cte/sourcebook/teaching3.pdf

Woods, D. R., Hrymak, A. N., Marshall, R. R., Wood, P. E., Crowe, C. M., Hoffman, T. W., Wright, J. D., Taylor, P. A., Woodhouse, K. A., & Bouchard C. G. (1997). Developing problem solving skills: The McMaster problem solving program. Journal of Engineering Education, 86(2), 75–91.


Unskilled and Unaware: A Metacognitive Bias

by John R. Schumacher, Eevin Akers, & Roman Taraban (all from Texas Tech University).

In 1995, McArthur Wheeler robbed two Pittsburgh banks in broad daylight, with no attempt to disguise himself. When he was arrested that night, he objected, “But I wore the juice.” Because lemon juice can be used as an invisible ink, Wheeler thought that rubbing his face with lemon juice would make it invisible to the banks’ surveillance cameras. Kruger and Dunning (1999) used Wheeler’s story to exemplify a metacognitive bias through which relatively unskilled individuals overestimate their skill, being both unaware of their ineptitude and holding an inflated sense of their knowledge or ability. This is called the Dunning-Kruger effect, and it also seems to apply to some academic settings. For example, Kruger and Dunning found that some students can accurately predict their performance prior to taking a test: they predict that they will do well and actually do well. Other students predict that they will do well but perform poorly; these students have an inflated sense of how well they will do and thus fit the Dunning-Kruger effect. Because their predictions do not match their performance, we describe them as poorly calibrated. Good calibration involves metacognitive awareness. This post explores how note taking relates to calibration and metacognitive awareness.

Some of the experiments in our lab concern the benefits of note taking. In these experiments, students were presented with a videotaped college lecture. Notetakers recalled more than non-notetakers, who simply watched the video (Jennings & Taraban, 2014). The question we explored was whether good note-taking skills improved students’ calibration of how much they know and thereby reduced the unskilled-and-unaware effect reported by Kruger and Dunning (1999).

In one experiment, participants watched a 30-minute video lecture while either taking notes (notetakers) or simply viewing the video (non-notetakers). They returned 24 hours later, predicted the percentage of information they believed they would recall on a scale of 0 to 100, and then took a free-recall test, without being given an opportunity to study their notes or mentally review the prior day’s lecture. They then studied their notes (notetakers) or mentally reviewed the lecture (non-notetakers) for 12 minutes and took a second free-recall test. To assess the Dunning-Kruger effect, we subtracted the actual percent of lecture material recalled on each test (0 to 100) from participants’ predictions of how much they would recall (0 to 100). For example, if a participant predicted he or she would correctly recall 75% of the material on a test and actually recalled 50%, the calibration score would be +25 (75 – 50 = 25). Values close to +100 indicated extreme overconfidence, values close to -100 indicated extreme underconfidence, and values close to 0 indicated good calibration. To answer our question about how note taking relates to calibration, we compared the calibration scores for the two groups (notetakers and non-notetakers) in the two situations (before and after reviewing notes or reflecting). These analyses indicated that the two groups did not differ in calibration on the first free-recall test. However, to our surprise, notetakers became significantly more overconfident, and thus less calibrated in their predictions, than non-notetakers on the second test. After studying, notetakers’ calibration became worse.
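The calibration arithmetic above is simple enough to sketch in a few lines of code (a hypothetical helper for illustration, not code from the study itself):

```python
def calibration(predicted_pct: float, recalled_pct: float) -> float:
    """Calibration score: predicted recall (0-100) minus actual recall (0-100).

    Positive scores indicate overconfidence, negative scores indicate
    underconfidence, and scores near 0 indicate good calibration.
    """
    return predicted_pct - recalled_pct

# The example from the text: predicting 75% but recalling 50% gives +25.
score = calibration(75, 50)  # 25
```

Comparing mean scores of this kind across the notetaker and non-notetaker groups, before and after study, is what the analyses above report.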

Note taking increases test performance. So why doesn’t note taking improve calibration? Since note takers are more “skilled”, that is, have encoded and stored more information from the lecture, shouldn’t they be more “aware”, that is, better calibrated, as the Dunning-Kruger effect would imply? One possible explanation is that studying notes immediately increases the amount of information processed in working memory. The information that participants will be asked to recall shortly is highly active and available. This sense of availability produces the inflated (and false) prediction that much information will be remembered on the test. Is this overconfidence harmful to the learner? It could be harmful to the extent that individuals often self-generate predictions of how well they will do on a test in order to self-regulate their study behaviors. Poor calibration of these predictions could lead to the individual failing to recognize that he or she requires additional study time before all material is properly stored and able to be recalled.

If note taking itself is not the problem, then is there some way students can improve their calibration after studying in order to better regulate subsequent study efforts? The answer is “yes.” Research has shown that predictions of future performance improve if there is a short delay between studying information and predicting subsequent test performance (Thiede, Dunlosky, Griffin, & Wiley, 2005). In order to improve calibration after studying notes, students should be encouraged to wait, after studying their notes, before judging whether they need additional study time. In order to improve metacognitive awareness with respect to calibration, students need to understand that immediate judgments of how much they know may be inflated. They need to be aware that waiting a short time before judging whether they need more study will result in more effective self-regulation of study time.

References
Jennings, E., & Taraban, R. (2014, May). Note-taking in the modern college classroom: Computer, paper and pencil, or listening? Paper presented at the Midwestern Psychological Association (MPA), Chicago, IL.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.

Thiede, K. W., Dunlosky, J., Griffin, T. D., & Wiley, J. (2005). Understanding the delayed-keyword effect on metacomprehension accuracy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31(6), 1267-1280.


Metacognition in STEM courses: A Developmental Path

by Roman Taraban, PHD, Texas Tech University

There is a strong focus in science, technology, engineering, and math (STEM) courses on solving problems (Case & Marshall, 2004). Does problem solving in STEM involve metacognition? I argue that the answer must surely be ‘yes’, because metacognition involves monitoring the effectiveness of learning and problem-solving strategies and using metacognitive knowledge to regulate behavior (Draeger, 2015). But when does metacognition become part of problem solving, and how does it come about? Can we discern development in metacognitive monitoring and regulation? In this post, I will present some qualitative data from a study on problem solving in order to reflect on these questions. The study I draw from was not about metacognition per se; however, it may provide some insights into the development of metacognition.

The study I conducted involved freshman engineering majors, who were asked to solve typical problems from the mechanics course in which they were currently enrolled (Taraban, 2015). Not surprisingly, students varied in how they began each problem and how they proceeded toward a solution. To gain some insight into their problem-solving strategies, after they had solved the problems I asked students to state why they started with the equation they chose and not some other equation.

Students’ responses fell into at least three types, using labels from Case and Marshall (2004): surface, algorithmic, and deep conceptual. When asked why they started with their first equation, some students responded:

  • “I don’t know, it’s just my instinct.”
  • “No special reason. I’m just taking it randomly.”
  • “It’s just habit.”
  • “The first thing that came to my mind.”

Of interest here, these students did not appear to reflect on the specific problem or show evidence of modulating their behavior to the specific problem. Their responses fit a surface learning approach: “no relationships sought out or established, learn by repetition and memorization of formulae” (Case & Marshall, 2004, p. 609).

Other students’ responses reflected an algorithmic approach to learning — “identifying and memorizing calculation methods for solving problems” (Case & Marshall, 2004, p. 609):

  • “I am getting three variables in three unknowns so I can solve it.”

Here the student verbally expresses a more structured approach to the problem. The student believes that he needs three equations involving three unknowns and uses that as a goal. Students who take an algorithmic approach appear to be more reflective and strategic about their solutions to problems, compared to surface problem solvers.

Case and Marshall (2004) regarded both the surface and algorithmic pathways as part of a development toward deeper understanding of domain concepts and principles, which they labeled the deep conceptual approach to learning: “relating of learning tasks to their underlying concepts or theory” with the intention “to gain understanding while doing this” (p. 609). Basically, their suggestion is that at some point students recognize that a goal of learning is to understand the material more deeply, and that this recognition guides how they learn. Case and Marshall’s description of deep conceptual learning fits Draeger’s (2015) earlier suggestion that monitoring the effectiveness of learning and regulating one’s behavior is characteristic of metacognitive thinking. Once students reach this level, we should be able to more readily observe their intentions to understand the material and their overt attempts to grasp it through explicit reflection and reasoning. Examples of this type of reflection from my study could be gleaned from those students who did not jump directly to writing equations without first thinking about the problem:

  • “If I choose the moment equation first, then directly I am getting the value of F. So in the other equations I can directly put the value of F.”

As students progress from surface to algorithmic to deep conceptual processing, there is certainly development. However, in the present examples that track that development, it is difficult to partial out students’ thinking about the problem content from their thinking-about-thinking, that is, their metacognitions. Draeger (2015) helps here by distinguishing between metacognition and critical thinking. The latter often requires domain-specific knowledge. Draeger suggests that “many students are able to solve complex problems, craft meaningful prose, and create beautiful works of art without understanding precisely how they did it” (p. 2). Basically, critical thinking is about methodology within a domain – e.g., the person knows how to format a narrative or select an appropriate statistical procedure, without necessarily reflecting on the effectiveness of those choices, that is, without metacognition. In the examples I provided above from my work with undergraduates on problem solving, there is invariably a mix of critical thinking and metacognition. Draeger’s distinction signals a need to better decouple these two distinct kinds of cognitive processes in order to better clarify the developmental trajectory of metacognitive processing in problem solving.

Finally, why do we observe such wide variance in students’ approaches to problem-solving, and, relatedly, to metacognition? One reason is that instructors may emphasize assessment and grades (Case & Marshall, 2004). As a consequence, students may focus more on gaining points for the correct answer rather than on the process. Welsh (2015) has suggested that course structure can act as a barrier to deeper learning: “high stakes assessments may overshadow resources designed for metacognitive development” (p. 2). Welsh found that students were more concerned with test performance than with reflecting upon their study strategies and implementing learning strategies recommended by the instructor.

How are we to understand this discord between concern with test performance and metacognition? At some level, when students set goals to do well on tests they are regulating their behavior. Metacognitive resources from the instructor may be in competition with students’ perceived resources (e.g., access to old tests, study buddies, cramming the night before). The instructor can facilitate change, but the leap from surface and algorithmic learner to deep conceptual learner must be undertaken by the student.

Passion and commitment to a topic are strong motivators to find the means to access and acquire deeper conceptual understanding. One measure of teacher success is class test performance, but another can be found in student comments. Here is one that I recently received that I found encouraging: “Despite the fact that I was a bit uninterested in the subject matter, this was one of my favorite classes. By the end of the semester, not only was I interested in the subject matter, I was fascinated by it.” Perhaps as instructors we need to facilitate good metacognitive practices but also nurture interest in what we teach, in order to motivate students to pursue it more deeply through more effective metacognitive practices.

References

Case, J., & Marshall, D. (2004). Between deep and surface: procedural approaches to learning in engineering education contexts. Studies in Higher Education, 29(5), 605-615.

Draeger, J. (2015). Two forms of ‘thinking about thinking’: metacognition and critical thinking. Retrieved from https://www.improvewithmetacognition.com/two-forms-of-thinking-about-thinking-metacognition-and-critical-thinking/

Taraban, R. (2015, November). Transition from means-ends to working-forward problem solving. 56th Annual Conference of the Psychonomic Society. Chicago, IL.

Welsh, A. (2015). Supports and barriers to students’ metacognitive development in a large intro chemistry course. Retrieved from https://www.improvewithmetacognition.com/supports-and-barriers-to-students-metacognitive-development-in-a-large-intro-chemistry-course/


Metacognitive Judgments of Knowing

by Roman Taraban, Ph.D., Dmitrii Paniukov, John Schumacher, & Michelle Kiser (all from Texas Tech University)

“The more you know, the more you know you don’t know.” Aristotle

Students often make judgments of learning (JOLs) when studying. Essentially, they make a judgment about future performance (e.g., a test) based on a self-assessment of their knowledge of studied items. Therefore, JOLs are considered metacognitive judgments. They are judgments about what the person knows, often related to some future purpose. Students’ accuracy in making these metacognitive judgments is academically important. If students make accurate JOLs, they will apply just the right amount of time to mastering academic materials. If students do not devote enough time to study, they will underperform on course assessments. If students spend more time than necessary, they are being inefficient.

As instructors, we would find it helpful to know how accurate students are in making these decisions. There are several ways to measure the accuracy of JOLs. Here we will focus on one of these measures, termed calibration. Calibration is the difference between a student’s JOL related to some future assessment and his or her actual performance on that assessment. In the study we describe here, college students made JOLs (“On a scale of 0 to 100, what percent of the material do you think you can recall?”) after they read a brief expository text. Actual recall was measured in idea units (IUs), the chunks of meaningful information in the text (Roediger & Karpicke, 2006). Calibration is here defined as JOL – Recalled IUs, or simply, predicted recall minus actual recall. If the calibration calculation yields a positive number, you are overconfident to some degree; if it yields a negative number, you are underconfident to some degree. If the result is zero, you are perfectly calibrated in your judgment.

The suggestion from Aristotle (see quote above) is that gains in how much we know lead us to underestimate how much we know, that is, we will be underconfident. Conversely, when we know little, we may overestimate how much we know, that is, we will be overconfident. Studies using JOLs have found that children are overconfident (predicted recall minus actual recall is positive) (Lipko, Dunlosky, & Merriman, 2009; Was, 2015). Children think they know more than they know, even after several learning trials with the material. Studies with adults have found an underconfidence-with-practice (UWP) effect (Koriat et al., 2002): the more individuals learn, the more they underestimate their knowledge. The UWP effect is consistent with Aristotle’s suggestion. The question we ask here is which it is: if you lack knowledge, do your metacognitive judgments reflect overconfidence or underconfidence, and vice versa? Practically, as instructors, if students are poorly calibrated, what can we do to improve their calibration, that is, to recalibrate this metacognitive judgment?

We addressed this question with two groups of undergraduate students, as follows. Forty-three developmental-reading participants were recruited from developmental integrated reading and writing courses offered by the university, including Basic Literacy (n = 3), Developmental Literacy II (n = 29), and Developmental Literacy for Second Language Learners (n = 11). Fifty-two non-developmental participants were recruited from the Psychology Department subject pool. The non-developmental and developmental readers were comparable in mean age (18.3 and 19.8 years, respectively) and number of completed college credits (11.8 and 16.7, respectively), and each sample represented roughly fifteen academic majors. All participants received course credit. The students were asked to read one of two expository passages and to recall as much as they could immediately. The two texts used for the study were each about 250 words in length, had an average Flesch-Kincaid grade level of 8.2, and contained 30 idea units each.

To answer our question, we first calculated calibration (predicted recall – actual recall) for each participant. Then we divided the total sample of 95 participants into quartiles, based on the number of idea units each participant recalled. The mean proportions of correctly recalled idea units, out of 30 possible, with standard deviations in parentheses, were as follows for the total sample: Q1: .13 (.07); Q2: .33 (.05); Q3: .51 (.06); Q4: .73 (.09). Using quartile as the independent variable and calibration as the dependent variable, we found that participants were overconfident (predicted recall > actual recall) in all four quartiles. However, there was also a significant decline in overconfidence from Quartile 1 to Quartile 4: Q1: .51; Q2: .39; Q3: .29; Q4: .08. Very clearly, the participants in the highest quartile were nearly perfectly calibrated, that is, they were over-predicting their actual performance by only about 8%, compared to the lowest quartile, who were over-predicting by about 51%. This monotonic trend of decreasing overconfidence and improving calibration also held when we analyzed the two samples separately:
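The quartile analysis can be illustrated with a small sketch: rank participants by recall, split them into four equal groups, and average calibration within each group. The participant data below are invented for illustration; they are not the study’s raw data.

```python
def quartile_calibration(participants):
    """participants: list of (recalled, predicted) proportions (0-1).

    Sorts participants by recall, splits them into four quartiles,
    and returns the mean calibration (predicted - recalled) per quartile.
    """
    ranked = sorted(participants, key=lambda p: p[0])
    n = len(ranked)
    quartiles = [ranked[i * n // 4:(i + 1) * n // 4] for i in range(4)]
    return [sum(pred - rec for rec, pred in q) / len(q) for q in quartiles]

# Eight invented participants, as (recalled, predicted) proportions:
data = [(0.10, 0.60), (0.15, 0.65), (0.30, 0.70), (0.35, 0.70),
        (0.50, 0.75), (0.55, 0.80), (0.70, 0.80), (0.75, 0.85)]
means = quartile_calibration(data)
# Mean overconfidence shrinks from Q1 to Q4, mirroring the pattern above.
```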

NON-DEVELOPMENTAL: Q1: .46; Q2: .39; Q3: .16; Q4: .10;

DEVELOPMENTAL: Q1: .57; Q2: .43; Q3: .39; Q4: .13.

The findings here suggest that Aristotle may have been wrong when he stated that “The more you know, the more you know you don’t know.” Our findings would suggest that the more you know, the more you know you know. That is, calibration gets better the more you know. What is striking here is the vulnerability of weaker learners to overconfidence. It is the learners who have not encoded a lot of information from reading that have an inflated notion of how much they can recall. This is not unlike the children in the Lipko et al. (2009) research mentioned earlier. It is also clear in our analyses that typical college students as well as developmental college students are susceptible to overestimating how much they know.

It is not clear from this study what variables underlie low recall performance. Low background knowledge, limited vocabulary, and difficulty with syntax, could all contribute to poor encoding of the information in the text and low subsequent recall. Nonetheless, our data do indicate that care should be taken in assisting students who fall into the lower performance quartiles to make better calibrated metacognitive judgments. One way to do this might be by asking students to explicitly make judgments about future performance and then encouraging them to reflect on the accuracy of those judgments after they complete the target task (e.g., a class test). Koriat et al. (1980) asked participants to give reasons for and against choosing responses to questions before the participants predicted the probability that they had chosen the correct answer. Prompting students to consider the amount and strength of the evidence for their responses reduced overconfidence. Metacognitive exercises like these may lead to better calibration.

References

Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6(2), 107-118.

Koriat, A., Sheffer, L., & Ma’ayan, H. (2002). Comparing objective and subjective learning curves: Judgments of learning exhibit increased underconfidence with practice. Journal of Experimental Psychology: General, 131, 147–162.

Lipko, A. R., Dunlosky, J., & Merriman, W. E. (2009). Persistent overconfidence despite practice: The role of task experience in preschoolers’ recall predictions. Journal of Experimental Child Psychology, 102(2), 152-166.

Roediger, H., & Karpicke, J. D. (2006). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255.

Was, C. (2015). Some developmental trends in metacognition. Retrieved from https://www.improvewithmetacognition.com/some-developmental-trends-in-metacognition/.

 


To Test or Not to Test: That is the Metacognitive Question

by John Schumacher & Roman Taraban at Texas Tech University

In prepping for upcoming classes, we are typically interested in how best to structure the class to promote the most effective learning. Applying best-practice recommendations from the literature, we try to implement active learning strategies that go beyond simple lecturing. One such strategy that research has shown to be effective is testing. The inference to draw from the research literature is quite simple: test students frequently, informally, and creatively, over and above standard course tests like a mid-term and final. Testing is a useful assessment tool, but research has shown that it is also a learning tool, promoting learning above and beyond simply rereading material (Roediger & Karpicke, 2006a). This is called the testing effect. In controlled studies, researchers have shown testing effects with a variety of materials, including expository texts and multimedia presentations (e.g., Carrier & Pashler, 1992; Huff, Davis, & Meade, 2013; Johnson & Mayer, 2009; Roediger & Karpicke, 2006b). Testing has been found to increase learning when implemented in a classroom setting (McDaniel, Anderson, Derbish, & Morrisette, 2007) and is a useful learning tool for people of all ages (Meyer & Logan, 2013). The theoretical explanation for the benefits of testing is that testing strengthens retrieval paths to the stored information in memory more than simply rereading the material does. Therefore, a person can later recover the information from memory more effectively.

Although implementing testing and other active learning strategies in the classroom is useful in guiding and scaffolding student learning, it is important that we develop an understanding of when and for whom these strategies are most helpful. Specifically, regarding testing, research from our lab and others is starting to show that testing may not always be as beneficial as past research suggests. Characteristics of the students themselves may nullify or even reverse the benefits of testing. Thus, the first question we address is whether frequent classroom testing will benefit all students. Yet a more pertinent question, which is our second question, is whether frequent testing develops metacognitive practices in students. We will discuss these in turn.

In a formal study of the testing effect, or in an informal test in any classroom, one needs two conditions: a control condition in which participants study the material on their own for a fixed amount of time, and an experimental condition in which participants study and are tested over the material, for instance, in a Study-Test-Study-Test format. Both groups spend an equal amount of time either simply studying or studying and testing. All participants take a final recall test over the material. Through a series of testing-effect studies incorporating expository texts as the learning material, we have produced a consistent grade-point average (GPA) by testing-effect interaction. This means that the benefits of testing (i.e., better later retrieval of information) depend on students’ GPAs! A closer look at this interaction showed us that students with low GPAs benefited most from the implementation of testing, whereas mid- to high-GPA students benefited just as much by simply studying the material.
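The GPA-by-testing-effect interaction just described can be illustrated with a simple difference-of-differences contrast on cell means. The numbers below are invented to mirror the reported pattern (low-GPA students gain from testing; mid- to high-GPA students do about as well by studying alone) and are not the study's data.

```python
from statistics import mean

# Hypothetical final-recall scores (idea units) for a 2 x 2 design:
# condition (study-only vs. study+test) crossed with a GPA median split.
# Invented numbers chosen to mirror the pattern described in the text.
recall = {
    ("low_gpa",  "study_only"): [6, 7, 8, 7],
    ("low_gpa",  "study_test"): [10, 11, 12, 11],
    ("high_gpa", "study_only"): [12, 13, 12, 13],
    ("high_gpa", "study_test"): [12, 13, 13, 12],
}

cell = {k: mean(v) for k, v in recall.items()}

# Testing benefit within each GPA group ...
benefit_low = cell[("low_gpa", "study_test")] - cell[("low_gpa", "study_only")]
benefit_high = cell[("high_gpa", "study_test")] - cell[("high_gpa", "study_only")]

# ... and the interaction contrast: a nonzero value means the size of the
# testing effect depends on GPA.
interaction = benefit_low - benefit_high
print(benefit_low, benefit_high, interaction)
```

In a formal analysis this contrast would be tested as a condition-by-GPA interaction term in an ANOVA or regression; the sketch only shows what the interaction means.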

While at this preliminary stage it is difficult to ascertain why exactly low GPA students benefit from testing in our experiments while others do not, a few observations can be put forth. First, at the end of the experiments, we asked participants to report any strategies they used on their own to help them learn the materials. Metacognitive reading strategies that the participants reported included focusing on specific aspects of the material, segmenting the material into chunks, elaborating on the material, and testing themselves. Second, looking further into the students’ self-reports of metacognitive strategy use, we found that participants in the medium to high GPA range used these strategies often, while low GPA students used them less often. Simply, the self-regulated use of metacognitive strategies was associated with higher GPAs and better recall of the information in the texts that the participants studied. Lower GPA students benefited when the instructor deliberately imposed self-testing.

These results are interesting because they indicate that the classroom implementation of testing may be beneficial only to low-achieving students, because they either do not have metacognitive strategies at their disposal or are not applying these strategies. High-achieving students may have metacognitive strategies at their disposal and may not need that extra guidance set in place by the instructor.

Another explanation for the GPA and testing-effect interaction may simply be motivation. Researchers have found that GPA correlates with motivation (Mitchell, 1992). It is possible that implementing a learning strategy may be beneficial to low-GPA students because it forces them to work with the material. Motivation may also explain why GPA correlated with metacognitive strategy use. Specifically, if lower-GPA students are less motivated to work with the material, it stands to reason that they would be less likely to employ learning strategies that take time and effort.

This leads to our second question: Does frequent testing develop metacognitive skills in students, particularly self-regulated self-testing? This is a puzzle that we cannot answer from the current studies. Higher-GPA students appear to understand the benefits of applying metacognitive strategies and do not appear to need additional coaxing from the experimenter/teacher to apply them. Will imposing self-testing, or any other strategy, on lower-GPA students lead them to eventually adopt the use of these strategies on their own? This is an important question and one that deserves future attention.

While testing may be useful for bolstering learning, we suggest that it should not be blindly utilized in the classroom as a learning tool. A consideration of what is being taught and to whom will dictate the effectiveness of testing as a learning tool. As we have suggested, more research also needs to be done to figure out how to bring metacognitive strategies into students’ study behaviors, particularly low-GPA students.

References

Carrier, M., & Pashler, H. (1992). The influence of retrieval on retention. Memory & Cognition, 20(6), 633-642.

Huff, M. J., Davis, S. D., & Meade, M. L. (2013). The effects of initial testing on false recall and false recognition in the social contagion of memory paradigm. Memory & Cognition, 41(6), 820-831.

Johnson, C. I., & Mayer, R. E. (2009). A testing effect with multimedia learning. Journal of Educational Psychology, 101(3), 621-629.

McDaniel, M. A., Anderson, J. L., Derbish, M. H., & Morrisette, N. (2007). Testing the testing effect in the classroom. European Journal of Cognitive Psychology, 19(4-5), 494-513.

Meyer, A. D., & Logan, J. M. (2013). Taking the testing effect beyond the college freshman: Benefits for lifelong learning. Psychology and Aging, 28(1), 142-147.

Mitchell, J. V., Jr. (1992). Interrelationships and predictive efficacy for indices of intrinsic, extrinsic, and self-assessed motivation for learning. Journal of Research and Development in Education, 25(3), 149-155.

Roediger, H., & Karpicke, J. D. (2006a). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181-210.

Roediger, H., & Karpicke, J. D. (2006b). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17(3), 249-255.


Mind the Feedback Gap

by Roman Taraban (Texas Tech University)


The saying “Mind the Gap” originated in 1969 to warn riders on the London Underground of the gap between the platform and the subway car. Since then, it has been broadly applied to situations in which there may be something missing or lacking between where you are and where you want to be. The cautionary message sounded loudly this semester when I realized that my undergraduate students were not particularly interested in the constructive feedback they were receiving on their bi-weekly formative evaluations over the course content, consisting of short-answer and brief essay responses. This was troubling since I was trying to promote metacognition through my feedback. But I am getting a bit ahead of myself.

Feedback in the Classroom

Technology now affords instructors easy-to-use means of providing timely and detailed feedback on work that is submitted digitally. As one example, assignments can be sent to a website and the instructor can use tools like “Track Changes” and “New Comment” in Microsoft Word™ to insert edits and comments in a clear and readable fashion. Beyond these basic digital tools, the coming of age of automated instructional tutors has brought with it a science of just-in-time feedback, synced with the computer’s best guess as to what a student knows at any given moment, and providing little to extensive feedback and guidance, depending on a student’s ability and prior experience (Graesser et al., 2005; Koedinger et al., 1997). In terms of technology, there are broad options available to instructors, from easy markup tools to software that will automatically grade papers. Indeed, there has never been a better time for developing and delivering effective feedback to students.

Students’ Perceptions of Feedback

The utility of feedback has been examined empirically and has produced several practical suggestions (Koedinger et al., 1997; Shute, 2008). Students’ perceptions of feedback, however, have not been extensively researched, though a few things are known. Weaver (2006) reported that students found several aspects of feedback to be unhelpful: when the comments provided were general or vague, when the comments did not provide guidance for rethinking or revising, when they focused on the negative, and when they were unrelated to the task. On a more positive note, Higgins and Hartley (2002) conducted a survey of college students and reported the criteria that over 75% of students considered important:

  • Comments that tell you what you could do to improve – 92%
  • Comments that explain your mistakes – 91%
  • Comments that focus on the level of critical analysis – 90%
  • Comments that focus on your argument – 89%
  • Comments that focus on the tutor’s overall impressions – 87%
  • Comments that tell you what you have done badly – 86%
  • Comments that focus on the subject matter – 82%
  • Comments that correct your mistakes – 80%
  • Feedback that tells you the grade – 79%
  • Comments that focus on your use of supporting evidence – 79% (p. 60)

Students’ Reactions to Feedback

For several semesters I have been following Weaver’s and Higgins and Hartley’s dictums, using formative evaluations in an undergraduate class that prompt critical, reflective, and evaluative thinking for many of the questions. This semester, I dutifully edited and commented on students’ responses and electronically delivered these back to students. After the second formative evaluation, I announced to students that grades had been posted and that if they wanted more detailed comments they should let me know and I would email them, as I had done for the first exam. Here is the irony: only 2 out of 30 students wanted the feedback. Assuring students that sending commented responses would not create extra work for me did not change the outcome on subsequent evaluations. Students simply did not care to hear my thoughts on their work. As it turns out, Higgins and Hartley (2002) had already anticipated my situation when they suggested that students may be extrinsically motivated to achieve a specific grade and to acquire related credentials, and may not be intrinsically motivated to reflect on their understanding of the material through the critical lens afforded by instructors’ comments.

Perceptions of Feedback – A Touchstone

Feedback may be a touchstone of metacognition. Often, to boost metacognition in the classroom, we implement tasks intended to evoke critical thinking. But what better way to increase metacognition than by developing in students a keener sense for feedback? In a way, deeply considering the teacher’s feedback requires “thinking about someone else’s thinking” in order to improve one’s own “thinking about thinking.” It appears that for too long, I have been over-estimating students’ interest in thinking critically about their own work. And as is true with the development of other cognitive abilities, several things will need to happen for change to occur. From my side, more “demandingness” may be required: being explicit about what I want, sensitizing students to my feedback through questioning and prompting, and scaffolding the process of reflecting on feedback through directed exercises. Most importantly, the feedback needs to have carry-over value to future student work.

It is generally accepted that feedback is an essential component of learning, providing a vehicle for thinking about one’s own thinking. Logically, alerting students to their strengths and weaknesses can provide the means by which they can reflect on how they thought through a task and how to constructively modify their approach in future work. None of this will happen, though, if students fail to consider the feedback. Wojtas (1998) warned of this possibility some years ago, when he reported on the research findings in one university, suggesting that some students were concerned only with their grade and not with substantive feedback. It may be helpful to pose the same stark question to our students in order to begin to close the feedback gap: Are you only interested in your grade?

My own experience has led me to other researchers confronting similarly disconcerting situations. Jollands et al. (2009) write, “teachers often feel their time is wasted when it is invested in marking work and making comments on assignments, only to see work not collected in class and then left at their doorstep at the end of semester. Even if it is collected the students might not read the feedback, and even if it is read, they might not act on it.” As Shute (2008) points out, “Feedback can promote learning, if it is received mindfully” (p. 172). In sum, feedback is necessary because it can give students something to think about and can prompt deeper levels of reflection. Feedback needs to be good if the gap is going to be closed. But it is also the case that good feedback alone is not enough. Metacognition is necessary if feedback is going to lead to meaningful improvement. Students must process the feedback via metacognition if they are to close the gap. (Thanks to John Draeger for these summary points!)

References

Graesser, A. C., McNamara, D., & VanLehn, K. (2005). Scaffolding deep comprehension strategies through AutoTutor and iSTART. Educational Psychologist, 40, 225–234.

Higgins, R., & Hartley, P. (2002). The Conscientious Consumer: Reconsidering the role of assessment feedback in student learning. Studies in Higher Education, 27(1), 53-64. DOI:10.1080/03075070120099368

Jollands, M., McCallum, N., & Bondy, J. (2009). If students want feedback why don’t they collect their assignments? 20th Australasian Association for Engineering Education Conference, University of Adelaide, Australia.

Koedinger, K., Anderson, J. R., Hadley, W. H., & Mark, M. (1997). Intelligent tutoring goes to school in the big city. International Journal of Artificial Intelligence in Education, 8, 30-43.

Shute, V. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189. DOI:10.3102/0034654307313795

Weaver, M. R. (2006). Do students value feedback? Student perceptions of tutors’ written responses. Assessment & Evaluation in Higher Education, 31(3), 379-394. DOI:10.1080/02602930500353061

Wojtas, O. (1998). Feedback? No, just give us the answers. Times Higher Education Supplement, September 25 1998.

 


What Metacognitive Skills Do Developmental College Readers Need?

by Roman Taraban, Dmitrii Paniukov, and Michelle Kiser

Texas Tech University

In a recent post to the CASP (College Academic Support Programs) listserv, a skeptical developmental programs instructor asked why more attention can’t be given to remedial readers when designing instruction for developmental education. The instructor’s concern highlights the question: What do we know about students who are not “college-ready” and who enroll in developmental coursework? In particular, where does metacognition fit into their development as skilled readers?

We know that reading ability, as measured by standardized instruments like the SAT reading test for high-school students, is significantly associated with reading comprehension (Taraban, Rynearson, & Kerr, 2000). But what underlies this reading ability, and can it be enhanced in college students? Prior research has revealed several things. As university students progress from their freshman to senior years, they show small but significant growth in their use of metacognitive reading strategies (Taraban, 2011). This growth happens naturally – i.e., college students typically do not take courses that teach metacognition. In trying to deliberately develop metacognitive reading strategies in developmental reading students, however, we found that the process can be slow and costly, but it can be done! In a study of developmental college readers, it took roughly one semester of regular practice with a look-back reading strategy (Garner, 1987) to show significant improvement in reading comprehension (Taraban et al., 1997). In addition to semester-long practice, the intervention was implemented in one-on-one tutoring, pointing to the instructional costs of bringing about detectable gains in reading skills in a remedial population.

Recently my colleagues and I had an opportunity to work with developmental readers who were enrolled in a developmental reading course at a major public research university. The students were primarily freshmen (mean number of completed credits = 16.7). We were primarily interested in three questions: 1) Could a teacher-implemented intervention improve these students’ comprehension and retention of ideas from expository texts? 2) Which metacognitive reading strategies did these students apply on their own? and 3) Was students’ use of metacognitive strategies associated with better retention of information?

The students were asked to read two expository passages and to recall as much as they could either immediately or after a 48-hour delay. They were told that they would be asked later to recall the information from the texts, but they were not prompted to apply any specific learning strategies. The two texts used for the study were each about 250 words in length and had an average Flesch-Kincaid grade level of 8.2. The passages contained 30 idea units each. Idea units are simple units of meaning derived from the text, and here were used to score the recall data. The participants read and studied one of the passages without interruption (Uninterrupted Condition), and read and studied the other paragraph-by-paragraph and then as a whole (Segmented Condition). Participants spent an equal amount of total time (10 minutes) reading and studying each of the texts. After they recalled the information, we asked them to report the strategies they used to learn the information. The specific self-reported strategies were organized into six types, as shown in Table 1. To score the strategy-use data, participants were given credit for multiple strategy types, but not for repetitions of the same strategy for the same text.
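Idea-unit scoring can be roughly illustrated by crediting a recall protocol with each idea unit whose content words it contains. The units and recall text below are invented, and the actual scoring was done by human raters rather than keyword matching.

```python
# Invented idea units (as sets of content words) and an invented recall
# protocol; real scoring was done by raters, not by keyword matching.
idea_units = [
    {"glaciers", "move", "slowly"},
    {"ice", "compresses", "snow"},
    {"meltwater", "lubricates", "base"},
]

recall_text = "Glaciers move slowly because compressed snow turns to ice."
words = set(recall_text.lower().replace(".", "").split())

# Credit an idea unit if at least two of its content words were recalled.
score = sum(1 for unit in idea_units if len(unit & words) >= 2)
print(score)  # number of idea units credited
```

Note how crude the keyword criterion is: “compressed” does not match “compresses,” which is exactly why human raters score meaning rather than surface form.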

 TABLE 1: Key Types of Self-Reported Strategies

  1. REPETITION: Re-Reading; Memorize; Repetition
  2. FOCUSING ON SPECIFIC ELEMENTS: Key words; Key concepts; Grouping terms or sentences; Identifying related concepts; Parts that stood out; Parts that were difficult
  3. SELF TESTING: Summarizing; Recalling; Quizzing self; Forming acronyms
  4. GENERATING COGNITIVE ELABORATIONS: Activating prior knowledge; Recalling related experiences; Re-explaining parts of the text in other ways; Comparing and contrasting ideas; Using analogies; Using mental imagery
  5. SEGMENTATION: Grouping sentences for purposes of study; Divide by paragraph
  6. GENERAL: Read slowly; Read thoroughly; Concentrate; Understand passage

Regression analyses were conducted in order to evaluate the effectiveness of the reading approach (Uninterrupted vs. Segmented) in conjunction with participants’ self-reported use of the six strategy types (see Table 1). Turning to the immediate test, the reading approach mattered. When participants read and studied a segmented text they had significantly higher recall of idea units (M = 11.64) compared to non-segmented text (M = 7.93). Further, all of the participants reported using reading strategies. Of the six strategy types, participants’ application of FOCUSING ON SPECIFIC ELEMENTS during reading was strongly associated with better recall of information from the text, and REPETITION was also important. Considering the delayed test next, the reading approach used for the text that was read two days earlier did not matter. However, using the strategy type SELF TESTING during reading was strongly associated with better recall, and FOCUSING ON SPECIFIC ELEMENTS was also helpful.
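The kind of association examined in these regressions can be sketched, in simplified form, as a correlation between a binary strategy-use indicator and recall scores. The data below are hypothetical; only the direction of the effect reflects the findings reported here.

```python
from math import sqrt

# Hypothetical data: a 0/1 indicator for reporting SELF TESTING, and
# delayed-recall scores (idea units) for eight participants.
self_testing = [0, 0, 0, 1, 1, 1, 0, 1]
recall_units = [5, 6, 7, 10, 11, 9, 6, 12]

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(self_testing, recall_units)
print(round(r, 2))  # positive r: self-testers recalled more in this sketch
```

A point-biserial correlation like this is the simplest case; the actual analyses entered the strategy types together with reading approach as predictors in a multiple regression.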

To address our skeptical developmental instructor, our data suggest that developmental reading instructors can structure how students process information in order to increase the number of ideas students retain, for follow-up activities like inferencing and brainstorming. The data also showed that developmental readers naturally use metacognitive reading strategies to boost their retention of information both immediately and at a delay.  Interestingly, there is no single best strategy.  Rather, FOCUSING ON SPECIFIC ELEMENTS during reading is most effective for immediate retention and SELF-TESTING during reading is most effective for longer-term retention.  Developmental students’ natural disposition to apply strategies may open opportunities for instructors to further guide, enhance, and channel these metacognitive skills to better benefit students.  What is heartening in these data is the finding that these academically-challenged students self-initiate metacognitive activities to monitor and regulate their study behaviors in order to enhance their academic performance.

References

Garner, R. (1987). Metacognition and reading comprehension. Norwood, NJ: Ablex.

Taraban, R. (2011). Information fluency growth through engineering curricula: Analysis of students’ text-processing skills and beliefs. Journal of Engineering Education, 100(2), 397-416.

Taraban, R., Becton, S., Shufeldt, M., Stirling, T., Johnson, M., & Childers, K. (1997). Developing underprepared college students’ question-answering skills. Journal of Developmental Education, 21(1), 20-22, 24, 26, 28.

Taraban, R., Rynearson, K., & Kerr, M. (2000). College students’ academic performance and self-reports of comprehension strategy use. Journal of Reading Psychology, 21, 283-308.


Are College Students Picky About Using Metacognitive Reading Strategies?

 by Roman Taraban, Texas Tech University

“Picky, picky” is a phrase we use to gently chide someone for being overly selective when making an apparently simple choice.  However, being picky is not always a bad thing, as I will try to show. Oddly enough, this phrase comes to mind when thinking about thinking about thinking, i.e., thinking about metacognition.  To explain the connection, I would like to consider the idea of being picky from two perspectives: research on metacognition and students’ metacognitive behaviors.

My students and I were first attracted to research on metacognition upon reading the work of Michael Pressley and colleagues, which focused on metacognitive strategies for reading comprehension. Noteworthy in those early efforts were projects involving elementary school teachers and classroom interventions geared toward young students, in an effort to teach them how to be more metacognitive in their daily schoolwork (Pressley et al., 1995). Other work by Pressley and colleagues analyzed adult metacognitions when reading, using a think-aloud method (e.g., Pressley & Afflerbach, 1995), and the metacognitions of experts when reading in their discipline (Wyatt et al., 1993). This research made a lot of sense, as it fit nicely within the broader constructs of active learning and constructivism — the belief that students need to actively engage with materials in order to benefit from study. A simple inference to make is that the application of any active learning strategy will benefit students. That was our assumption when we constructed and tested the Metacognitive Reading Strategies Questionnaire (MRSQ) (Taraban et al., 2000), drawing on the work of Pressley and others. Data from 324 undergraduates from a variety of majors and levels were telling. Of the 35 strategies that we tested, only seven were significantly associated with students’ grade-point averages (GPA). The strategies were Evaluate text for goals, Set goals for reading, Draw on my prior knowledge, Vary reading style based on goals, Search out information for goals, Infer information, and Look for important information (here presented in order of greatest to smallest effect sizes). It was clear that not all metacognitive strategies predicted GPA equally well, and that the successful strategies were mostly related to reading goals. The significant correlations of academic proficiency, measured by GPA, with goal-related reading strategies are consistent with Garner’s (1987) suggestion that skilled readers know multiple strategies and also know when to apply them.

Recent work on text recall (Schumacher & Taraban, 2014) with an undergraduate sample similar to the earlier study gave us another opportunity to examine students’ strategy use.  We asked students to read and study two expository texts and to recall as much as they could either immediately or after a 48-hour delay.  After they recalled the information, we asked them to report the strategies they used to learn the information. We organized the specific self-reported strategies into five types, as shown in the table below.  A hypothesis that application of any of these strategies would benefit subsequent performance was again not supported.  Of the five strategy types, Self-Testing was the only one that was significantly and positively correlated with recall.  We might infer that for this sample of readers and the criterion measure, which was recall, the most appropriate strategies were those related to Self-Testing.

Key Types of Self-Reported Strategies

1. REPETITION:  Re-Reading; Memorize; Repetition
2. FOCUSING ON SPECIFIC ELEMENTS: Key words; Key concepts; Grouping terms or sentences; Identifying related concepts; Parts that stood out; Parts that were difficult
3. SELF TESTING: Summarizing; Recalling; Quizzing self; Forming acronyms
4. GENERATING COGNITIVE ELABORATIONS: Activating prior knowledge; Recalling related experiences; Re-explaining parts of the text in other ways; Comparing and contrasting ideas; Using analogies; Using mental imagery
5. SEGMENTATION: Grouping sentences for purposes of study; Divide by paragraph

In conclusion, we can draw a few observations.  As researchers, as instructors, as students, it is important to be cognizant of three interacting factors when students choose and apply metacognitive reading strategies: the criterion measure, reader-selected goals in light of the criterion measure, and readers’ sense of their own ability as it affects their choices of strategies.


The assumption that the application of any metacognitive strategy will always enhance performance is too simplistic.  It does not acknowledge the complexity of strategy choice, and it does not do justice to picky students, who are attempting to choose appropriate strategies for specific circumstances.  Some strategies lead to better retention of information and some to better grades. While these will often go together, it might further be the case that picky students know when to employ which strategy.  So maybe sometimes it’s good to be picky.

 

References

Garner, R. (1987). Metacognition and reading comprehension. Norwood, NJ: Ablex.

Pressley, M., & Afflerbach, P. (1995). Verbal protocols of reading: The nature of constructively responsive reading.  Hillsdale, NJ: Erlbaum.

Pressley, M., Brown, R., El-Dinary, P. B., & Afflerbach, P. (1995).  The comprehension instruction that students need: Instruction fostering constructively responsive reading.  Learning Disabilities Research and Practice, 10, 215-224.

Schumacher, J., & Taraban, R. (2014, April). Strategy use complements testing effects in expository text recall. Paper presented at Southwestern Psychological Association (SWPA) Conference. San Antonio, TX.

Taraban, R., Rynearson, K., & Kerr, M. (2000).  College students’ academic performance and self-reports of comprehension strategy use. Journal of Reading Psychology, 21, 283-308.

Wyatt, D., Pressley, M., El-Dinary, P., Stein, S., Evans, P., & Brown, R. (1993). Comprehension strategies, worth and credibility monitoring, and evaluations: Cold and hot cognition when experts read professional articles that are important to them.  Learning and Individual Differences, 5, 49-72.