The Metacognitive Reading Strategies Questionnaire (MRSQ): Cross-Cultural Comparisons

by Roman Taraban, Ph.D. Texas Tech University

When you read, do you ask yourself whether the material is contributing to your knowledge of the subject, whether you should revise your prior knowledge, or how you might use the new knowledge that you are acquiring? Do you highlight information or make notes in the margins to better remember and find information later on? Prior research by Pressley and colleagues (e.g., Pressley & Afflerbach, 1995) suggested that the kinds of metacognition involved in reading strategies like these are critical for effective reading comprehension.


Inspired by that research, Taraban et al. (2000) conducted a study in which 340 undergraduates reported their use of 35 reading strategies like those suggested by Pressley and colleagues, and found that self-reported strategy use was significantly associated with grade-point average (GPA). Specifically, students who reported higher use of reading strategies also had higher GPAs. Additionally, responses to open-ended questions showed that students who could name more reading strategies and reading goals also had significantly higher GPAs.

The data in Taraban et al. (2000) strongly suggested a positive relationship between students’ knowledge and use of reading goals and strategies and their academic performance. More generally, data from Taraban et al. and others suggest that effective reading depends on metacognitive processing – i.e., on directed cognitive effort to guide and regulate comprehension. Skilled readers know multiple strategies and when to apply them. In the remainder of this post, I review subsequent developments associated with metacognitive reading strategies, including cross-cultural comparisons, and raise a question about the relevance of these strategies to present-day text processing and comprehension, given widespread technological developments.

Analytic vs. Pragmatic Reading Strategies

In 2004, my students and I created a questionnaire, dubbed the Metacognitive Reading Strategies Questionnaire (MRSQ) (Taraban et al., 2004). The questionnaire drew on the strategies tested earlier in Taraban et al. (2000) and organized them, through factor-analytic methods, into two subscales: analytic strategies and pragmatic strategies. The analytic scale relates to cognitive strategies like making inferences and evaluating the text (e.g., “After I read the text, I consider other possible interpretations to determine whether I understood the text.”). The pragmatic scale relates to practical methods for finding and remembering information from the text (e.g., “I try to underline when reading in order to remember the information.”). Students respond to these statements using a five-point Likert-type scale: Never Use, Rarely Use, Sometimes Use, Often Use, Always Use.
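For readers who work with instruments of this kind, scoring is straightforward: Likert responses, coded 1 (Never Use) through 5 (Always Use), are averaged within each subscale. A minimal sketch in Python (the item names and groupings below are illustrative stand-ins, not the published MRSQ items or factor loadings):

```python
# Hypothetical MRSQ-style scoring: average Likert responses (1 = Never Use
# ... 5 = Always Use) within each subscale. Item groupings are illustrative,
# not the actual MRSQ factor structure.
ANALYTIC_ITEMS = ["make_inferences", "evaluate_text", "reinterpret"]
PRAGMATIC_ITEMS = ["underline", "take_notes"]

def subscale_means(responses):
    """Return (analytic_mean, pragmatic_mean) for one respondent."""
    analytic = [responses[item] for item in ANALYTIC_ITEMS]
    pragmatic = [responses[item] for item in PRAGMATIC_ITEMS]
    return sum(analytic) / len(analytic), sum(pragmatic) / len(pragmatic)

student = {"make_inferences": 4, "evaluate_text": 3, "reinterpret": 5,
           "underline": 2, "take_notes": 4}
analytic_mean, pragmatic_mean = subscale_means(student)
# analytic_mean = 4.0, pragmatic_mean = 3.0
```

Subscale means computed this way are what get compared across groups or correlated with outcome measures such as GPA in the studies discussed below.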

Initial applications of the MRSQ suggested that the two-factor model could aid in better understanding students’ use of metacognitive comprehension strategies. Specifically, students’ self-reports of expected GPA for the coming academic year correlated significantly and positively with analytic strategy use but not with pragmatic strategy use, suggesting that students who reported higher use of analytic strategies also anticipated doing well academically in the coming academic year.

Cross-Cultural Explorations of Reading Strategies

Vianty (2007) used the MRSQ to explore differences in students’ use of metacognitive reading strategies in their native language, Bahasa Indonesia, and their second language, English. Participants were students in a teacher education program who completed the MRSQ in both English and Bahasa Indonesia. Vianty found that students processed text differently in their native language than in a non-native language.

In comparing mean use of analytic strategies when reading in Bahasa Indonesia versus English, Vianty found that nearly all means were higher for Bahasa Indonesia. T-tests showed significant differences favoring Bahasa Indonesia for eight of the sixteen analytic strategies. Conversely, four of the six pragmatic strategies were favored when reading English; however, only one difference (I take notes when reading in order to remember the information) was significant on a t-test. Vianty concluded that students used analytic strategies significantly more in Bahasa Indonesia than in English. Conversely, use of pragmatic strategies was higher when reading in English, but the effect was weak.

Taraban et al. (2013) compared US and Indian engineering undergraduates on their application of analytic and pragmatic strategies. The language of instruction in Indian universities is English; however, this is not typically the students’ native language (mother tongue). Reasoning from the findings in Vianty (2007), the researchers therefore predicted lower use of analytic strategies and higher use of pragmatic strategies among Indian students compared to US students. The latter prediction was supported, but not the former: Indian students actually applied analytic strategies significantly more frequently than US students. Pragmatic strategy use was significantly lower than analytic strategy use for US students but not for Indian students, who applied analytic and pragmatic strategies equally often. Contrary to the findings in Vianty (2007), these findings suggest that students can make significant use of analytic and pragmatic strategies in a non-native language.

The most comprehensive cross-linguistic comparison was conducted recently by Gavora et al. (2019), who compared analytic and pragmatic strategy use, measured by variants of the MRSQ, among 2,692 students from Poland, Hungary, Slovakia, and the Czech Republic enrolled in education programs, primarily teacher education and counseling. Students in Hungary, Slovakia, and the Czech Republic reported significantly higher use of pragmatic over analytic strategies. Students in Poland showed the converse preference, reporting significantly more frequent use of analytic strategies. Quite striking in the results were the significant correlations, in all four countries, between pragmatic strategy use and GPA and between analytic strategy use and GPA. Specifically, the correlations showed that more frequent use of both pragmatic and analytic strategies was associated with more successful academic performance.

Gavora et al. (2019) suggest that “In order to succeed academically, students direct their reading processes not towards comprehension but to remembering information, which is the core component of the pragmatic strategy” (p. 12). Their recommendation, that “educators’ attention should be focused on developing especially analytic strategies in students,” is strongly reminiscent of the ardor with which Pressley and colleagues promoted metacognitive reading strategies, beginning in the elementary grades.

However, given the significant correlations of both analytic and pragmatic strategy use with GPA, it may be that the predominance of analytic strategies is not what matters, but whether application of either type of strategy – analytic or pragmatic – aids students in their academic achievement. The data from Vianty (2007) may be informative in this regard: specifically, the finding that students applied pragmatic strategies more frequently than analytic strategies when the context – reading outside their native language – dictated a more pragmatic approach to reading and comprehension.

A relevant point made by Gavora et al. relates to the samples that have been tested to date and the relevance of context to strategy use. They point out that in fields like engineering (e.g., Taraban et al., 2013), the context may support more analytic thinking and analytic strategy use. The Gavora et al. sample consisted of humanities students, which, on their argument, may have resulted in an overwhelming affirmation of pragmatic strategies. Further comparisons across students in different programs are certainly warranted.

Changing Times: The Possible Influence of Technology on Reading

An additional question comes to mind: what is the effect of widespread technology in instructional settings? When I, like others, am uncertain about a definition, algorithm, or theory, I find it very easy to simply Google the point or look for a YouTube video, which I need only read or watch for an explanation. This personal observation suggests that the strategies probed in the MRSQ may, at this point, be incomplete and, in some instances, somewhat irrelevant. The next step should be to ask current students what strategies they use to aid comprehension. Their responses may lead to new insights into contemporary student metacognitions that assist them in learning.

In conclusion, there is no doubt that metacognitive strategies are essential to effective information processing. However, there may be room to reconsider and update the strategies that students employ when reasoning and searching for information and insights to guide and expand comprehension and learning. It may be that current technology has made students more pragmatic, and a promising goal for further research would be to uncover the ways in which that pragmatism is being expressed through new search strategies.

References

Gavora, P., Vaculíková, J., Kalenda, J., Kálmán, O., Gombos, P., Świgost, M., & Bontová, A. (2019). Comparing metacognitive reading strategies among university students from Poland, Hungary, Slovakia and the Czech Republic. Journal of Further and Higher Education, 1–15.

Pressley, M., & Afflerbach, P. (1995). Verbal protocols of reading: The nature of constructively responsive reading. Hillsdale, NJ: Erlbaum.

Taraban, R., Kerr, M., & Rynearson, K. (2004). Analytic and pragmatic factors in college students’ metacognitive reading strategies. Reading Psychology, 25(2), 67–81.

Taraban, R., Rynearson, K., & Kerr, M. (2000). College students’ academic performance and self-reports of comprehension strategy use. Reading Psychology, 21, 283–308.

Taraban, R., Suar, D., & Oliver, K. (2013). Information literacy of US and Indian engineering undergraduates. SpringerPlus, 2(1), 244.

Vianty, M. (2007). The comparison of students’ use of metacognitive reading strategies between reading in Bahasa Indonesia and in English. International Education Journal, 8(2), 449–460.


Just-in-Time for Metacognition

By John Draeger, SUNY Buffalo State

This post brings metacognition to an already valuable teaching tool. Just-in-time techniques require that students submit short assignments prior to class. Instructors review those answers before class and use them to shape class time. In my philosophy classes, for example, I assign two short questions via a course management system (e.g., Blackboard). At least one of the questions is directly related to the reading. Students are required to submit their answers electronically by 11:00 p.m. the night before class. When I wake up in the morning, I read through their responses and use them to make decisions about how class time will be used. If students seem to have grasped the reading, then I spend less time reviewing the basic arguments and more time exploring deeper content and connections. If student responses display a misunderstanding of the reading, then we spend class time carefully examining passages in the text and digging out the relevant arguments.

Just-in-time techniques have been used in a variety of disciplines, and they have been shown to increase the likelihood that students will complete their reading assignments, read more carefully, and take ownership of their learning (Novak et al., 1999; Simkins & Maier, 2009; Scharff et al., 2011). However, just-in-time assignments are typically used to prompt students to complete their assigned reading and to gauge their basic comprehension. While both are valuable, I argue that the technique can also be used to promote other important skills.

For example, pre-class questions can be used to develop higher-order thinking skills. Students can be asked to examine an author’s point of view, underlying assumptions, or the implications of her view. Such questions prompt students to move beyond their knowledge of what is contained in the text toward active engagement with that text. Students can be asked to apply concepts in the reading (e.g., stereotype bias) to something in the news. And students can be asked to analyze the connections between related course ideas. In a previous post, “Using metacognition to uncover the substructure of moral issues,” I argued that students begin to “think like a philosopher” when they can move beyond the surface content (e.g., hate speech and national security) toward the underlying philosophical substructure (e.g., rights, well-being, dangers of governmental intrusion). Like other skills, higher-order thinking skills require practice to develop. Because just-in-time assignments are a regular part of a student’s week, incorporating higher-order thinking questions into them can give students regular opportunities to practice and hone those skills.

Likewise, pre-class assignments can give students a regular outlet to practice and develop metacognition. Students can be asked to reflect on how they prepared for class and whether that preparation was effective (Tanner, 2012). Pre-class questions might include: How long did you spend with the reading? Did you finish? Did you annotate the text? Did you write a summary of the central argument? Did you formulate questions based on the reading for class discussion? Was this reading more difficult than the previous one? If so, why? Did you find yourself having an emotional reaction to the reading? If so, did this help or hinder your ability to understand the central argument? Are your reading techniques adequately preparing you for class? Or are you finding yourself lost in class discussion despite having spent time doing the reading? If pre-class questions related to higher-order thinking ask students to do more than simply “turn the pages,” then pre-class questions related to metacognition ask students to do more than simply engage with the material: they ask students to engage with their own learning processes.

When just-in-time questions are a regular part of the ebb and flow of a course, students must regularly demonstrate how much they know, and instructors can regularly use that information to guide course instruction. These techniques work because there is a consistent accountability measure built in. I suggest that just-in-time assignments can also be used to give students regular practice developing both higher-order thinking and metacognition skills. I have been incorporating higher-order thinking into just-in-time assignments for years, but I confess that I have only given metacognition prompts when things have “gone wrong” (e.g., poor performance on exams, consistent misunderstanding of the reading). Responses to these questions have led to helpful conversations about the efficacy of various learning methods. Writing this blog post has prompted me to see the potential benefits of asking such questions more often. I pledge to do just that and to let you know how my students respond.

 

References

Novak, G., Patterson, E., Gavrin, A., & Christian, W. (1999). Just-in-time teaching: Blending active learning with web technology. Upper Saddle River, NJ: Prentice Hall.

Scharff, L., Rolf, J., Novotny, S., & Lee, R. (2011). Factors impacting completion of pre-class assignments (JiTT) in Physics, Math, and Behavioral Sciences. In C. Rust (Ed.), Improving Student Learning: Global Theories and Local Practices: Institutional, Disciplinary and Cultural Variations. Oxford Brookes University, UK.

Simkins, S., & Maier, M. (2009). Just-in-time teaching: Across the disciplines, across the academy. Stylus Publishing, LLC.

Tanner, K. D. (2012). Promoting student metacognition. CBE–Life Sciences Education, 11(2), 113–120.


Creating a Metacognitive Movement for Faculty

by Charity Peak, U.S. Air Force Academy*

Faculty often complain that students don’t complete reading assignments.  When students do read, faculty yearn for deeper analysis but can’t seem to get it.  With SAT reading scores reaching a four-decade low (Layton & Brown, 2012) and nearly forty percent of postsecondary learners taking remedial coursework (Bettinger & Long, 2009), it’s not surprising that college students are increasingly unable to meet the reading expectations of professors.  Faculty sense the waning reading abilities of their students, but they struggle to identify how to address the problem.  After all, they weren’t trained to be reading teachers.

In February 2012, a group of faculty gathered for a Scholarship of Teaching and Learning (SoTL) Circle at the U.S. Air Force Academy to discuss how to get students to read more critically.  The topic spurred such great interest that an interdisciplinary faculty learning community on Reading Critically was formed to investigate the issue and share strategies to use in the classroom.  What evolved was a collective movement by faculty to become metacognitively aware of why and how they were assigning and apprenticing students to read more critically within their disciplines.

Our first meeting tackled the big question, “What do we want to know about college reading?”  Despite our interdisciplinary nature, we easily identified several common areas of concern:  Compliance (completing reading assignments), Comprehension (understanding what they read), and Critical Analysis.  These Three C’s of College Reading guided our discussions over the next two academic years and eventually led to the creation of a website to assist other faculty members struggling with the same issues.

As academics, our first inclination was to dive into the literature to determine what other institutions had discovered about this issue.  Surely we weren’t the only faculty grappling with these concerns. Not surprisingly, the research literature confirmed that the vast majority of college students do not read assignments ahead of time and do not consider the textbook to be a critical component of learning (Berry et al., 2010).  In fact, a number of studies find that college students only read textbooks about six hours per week (Spinosa et al., 2008), with just 20-30% reading compliance for any given day and assignment (Hobson, 2004).  Faculty hoping to set the stage prior to class and engage learners in meaningful discussions during class must first address reading compliance among students.

Unfortunately, reading is not indicative of comprehension. The combination of students’ weak reading abilities (particularly among marginalized students) and difficult textbook structure produces unskilled learners, whom faculty are unprepared to support. Hobson (2004) explains that most college teachers – content specialists – do not realize their students are struggling to comprehend assigned texts. Furthermore, if faculty insist on emphasizing reading as part of their course structure, then “helping students improve their reading skills should be the responsibility of every college-level teacher” (p. 4). Without specific strategies to address the reading needs of students – needs typically far outside the expertise of the usual subject-area specialist – faculty are rendered helpless in creating deep-thinking environments in the classroom.

Because low reading compliance predicts nonparticipation (Burchfield & Sappington, 2000), college faculty must address the issue in an effort to drive deeper learning.  Over the course of two years, our Reading Critically faculty learning community identified and shared several research-based strategies to assist faculty in improving reading compliance, comprehension, and critical analysis.  With no budget and nothing more than a dedication to the cause, we invited speakers to our meetings from our own institution to share how they were apprenticing readers within their courses. We discovered the value of pre-class reading guides, concept mapping, equation dictionaries, and even reading aloud in class. The interdisciplinary connectedness and learning through a common academic concern became a welcome respite from the typical silos that exist in higher education.

By the end of our first year together, our faculty learning community had gathered a wealth of research-based practices that could be implemented in courses across all disciplines.  While each of the group’s participants had learned a great deal, we weren’t sure how to spread the word and continue the movement.  Then, we discovered Carnegie Mellon’s Solve a Teaching Problem website.  At last, a model for us to follow!  We set out to design a website for faculty to Solve a Reading Problem.   Collaboratively, we created a step-by-step way for faculty to address reading issues they were encountering in their courses:

Step 1: Identify a reading problem

Step 2: Investigate a reason for the problem

Step 3: Initiate a strategy to address the problem

Our learning community pooled resources by suggesting various problems and solutions, along with research-based literature to support our ideas. Faculty then submitted lesson ideas and classroom strategies they had found successful in their own courses to support better reading compliance, comprehension, and critical analysis. While the website is still very much a work in progress, it represents two years of metacognition around why faculty assign readings and how to maximize those opportunities in the classroom.

Ultimately, our faculty learned that we have a responsibility to be metacognitive about our own teaching practices in order to improve learning.  This group’s commitment to the cause created an interdisciplinary metacognitive movement among our faculty that is still developing.  What metacognitive movement can you lead at your institution?

References:

Berry, T., Cook, L., Hill, N., & Stevens, K. (2010). An exploratory analysis of textbook usage and study habits: Misperceptions and barriers to success. College Teaching, 59(1), 31–39.

Bettinger, E., & Long, B. (2009). Addressing the needs of underprepared college students: Does college remediation work? Journal of Human Resources, 44(3), 736-771.

Burchfield, C. M., & Sappington, J. (2000). Compliance with required reading assignments. Teaching of Psychology, 27(1), 58–60.

Hobson, E. H. (2004). Getting students to read: Fourteen tips. IDEA Paper No. 40. Manhattan, KS: The IDEA Center.

Layton, L., & Brown, E. (2012, September 24). SAT reading scores hit a four-decade low. Washington Post. Washington, D.C.

Spinosa, H., Sharkness, J., Pryor, J. H., & Liu, A. (2008). Findings from the 2007 administration of the College Senior Survey (CSS): National aggregates. Los Angeles: Higher Education Research Institute, UCLA.

 

* Disclaimer: The views expressed in this document are those of the authors and do not reflect the official policy or position of the U. S. Air Force, Department of Defense, or the U. S. Govt.