Contact Fritz Mayr at email@example.com for more information or visit HCTA Information
Employers, educators, and the general public all agree that critical thinking is an essential skill for citizens of the 21st century. It is the primary objective of education and a core ability that employers want in their prospective and current employees. A study by the Association of American Colleges and Universities (Hart Research Associates, 2013, p. 1) found that “nearly all employers surveyed (93 percent) say that ‘a demonstrated capacity to think critically, communicate clearly, and solve complex problems is more important than [a candidate’s] major.’” Similar results appear in a wide range of employer and academic reports. But what is critical thinking, and how can we assess it?
The Halpern Critical Thinking Assessment (HCTA) was designed to help educators and employers assess critical thinking skills in their students and employees.
Unique characteristics of the HCTA
The following manual explains the unique properties of the HCTA. The central question is: what makes a measure of critical thinking good? A good measure can predict how people act in the real world. The HCTA is the only measure of critical thinking with demonstrated real-world validity: it can predict what adults do (more precisely, what they say they do) in their daily lives. These studies are described in the validity section of the HCTA Manual.
The HCTA provides a measure of how people think when they contemplate information that relates to real-world experiences. The HCTA uses two response formats: constructed response, which captures how people first respond to a situation (in their own words), and forced choice, which measures how well they can recognize a good response. It is the only measure of critical thinking to use both types of response format. Although most experts agree that constructed responses are usually the best measure of what people actually think and do, test developers often avoid them because of the time needed to grade the responses and the difficulty of achieving good interrater reliability. The HCTA addresses both problems with a computerized grading system that prompts the grader with questions about each constructed response and then computes the numerical score automatically, making grading easy and relatively fast for anyone; interrater reliabilities are very high as a result.
Finally, the HCTA is easy to use, and there are several options for users. The HCTA can be administered online or offline. If test administrators opt to administer the test online, respondents are sent a link (via e-mail or another online method, such as a link in a spreadsheet) that opens a screen where the respondent provides basic demographic information and then takes the HCTA online. The constructed responses can be graded by the test administrator with the aid of grading prompts. Test administrators receive HCTA test scores (along with subscores and norms) in various output formats (e.g., SPSS, CSV). The easiest alternative involves very little work on the part of the administrator: provide test takers with a link and then get results fully scored with norms. Alternatively, test administrators who want more control over the testing environment can install the Vienna Test System on their own computers and use it to administer and grade the tests. The choice is yours. The HCTA is also culturally fair: it is currently being used in many countries and languages around the world with comparable norms.
Structure of the HCTA
The HCTA consists of 20 everyday scenarios, each briefly described in common language. For each scenario, respondents are first asked an open-ended (i.e., constructed response) question, followed by a forced choice question (e.g., multiple choice, ranking, or rating of alternatives), such as selecting the best alternative, rating each of the alternatives in terms of their relevance, or indicating which two of the alternatives represent a good response. Cognitive psychologists differentiate between free recall and recognition processes in memory, and these two types of questions are designed to take advantage of those different cognitive processes. The total score is (approximately) equally weighted between constructed response and forced choice questions.
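The equal weighting between the two response formats can be illustrated with a short sketch. The point values below are invented for illustration only; they are not the HCTA's actual scoring parameters.

```python
def combined_total(cr_points, cr_max, fc_points, fc_max, total_max=200):
    """Combine constructed-response (CR) and forced-choice (FC) subscores
    so that each format contributes half of the total score, as described
    above. All maxima here are hypothetical values for the sketch."""
    cr_share = (cr_points / cr_max) * (total_max / 2)  # CR worth half the total
    fc_share = (fc_points / fc_max) * (total_max / 2)  # FC worth the other half
    return cr_share + fc_share

# A test taker earning 75% of the points in each format scores 75% overall,
# regardless of how many raw points each format offers.
print(combined_total(cr_points=60, cr_max=80, fc_points=45, fc_max=60))  # 150.0
```

The normalization step is what makes the weighting "approximate" in practice: when raw point totals differ between formats, each is rescaled to the same share of the total before being summed.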
The questions of the HCTA represent five categories of critical thinking skills: verbal reasoning (e.g., recognizing the use of persuasive or misleading language), argument analysis (e.g., recognizing reasons, assumptions, and conclusions in arguments), thinking as hypothesis testing (e.g., understanding sample size, generalizations), using likelihood and uncertainty (e.g., applying relevant principles of probability such as base rates), and decision making and problem solving (e.g., identifying the problem goal, generating and selecting solutions among alternatives). Although there is an equal number of scenarios for each critical thinking category, some categories contribute more total points than others to the total critical thinking score. The categories were weighted with the following rationale as to their relative importance and contribution to critical thinking:
·Decision making and problem solving: In some sense, all of the subtypes of critical thinking skills are involved in decision making (generating and selecting from alternatives based on relevant criteria) and problem solving (finding solutions to a situation or, more colloquially, moving from a start state to a goal). Because this category relies on subsets of the other critical thinking skills (e.g., recognizing that an unlikely event is not an optimal choice when making decisions, or examining the reasons for a course of action), it was weighted with more total points than the other categories.
·Thinking as hypothesis testing: The skills of hypothesis testing are not restricted to evaluating formal research; they are (or should be) used in multiple everyday situations. Faulty thinking often involves hasty generalizations from small samples of behavior (e.g., a new friend is late and the respondent generalizes that the new friend must be habitually late) or failure to consider control conditions (e.g., a cold gets better after taking a vitamin supplement, but there is no consideration that it might have gotten better without the supplement).
·Argument analysis: Too often people reach conclusions without consideration of the reasons that support or fail to support the conclusion. The ability to seek and provide reasons and to recognize the differences between conclusions and assumptions is critical for good thinking. It is the difference between uninformed opinions and reasoned thinking.
·Likelihood and uncertainty: A basic understanding of probabilities, how they affect the likelihood of an outcome, and how to use them in uncertain situations is an essential component of critical thinking, but these skills are unlikely to develop beyond a rudimentary level without formal instruction. Many concepts relating to likelihood and uncertainty, such as regression to the mean (an extreme event is likely to be followed by a less extreme event) and the gambler's fallacy (if a fair coin comes up heads on 3 flips in a row, a tail is not more likely on the 4th flip), are counterintuitive. Thus, although these are important concepts, the likelihood and uncertainty category was given a lower weight than some of the other categories so as not to penalize test takers who have not had any formal education in understanding likelihood and uncertainty.
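The counterintuitive nature of the gambler's fallacy is easy to demonstrate empirically. The short simulation below (a standalone illustration, not part of the HCTA or its scoring) estimates the probability of tails immediately after a run of three heads; for a fair coin it stays near 0.5, not above it, because each flip is independent of the ones before.

```python
import random

def tails_after_three_heads(n_flips=200_000, seed=42):
    """Estimate P(tails on the next flip | the previous 3 flips were heads)
    for a fair coin. The gambler's fallacy predicts this exceeds 0.5;
    independence of flips says it remains 0.5."""
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads
    runs, tails_next = 0, 0
    for i in range(3, n_flips):
        if flips[i - 3] and flips[i - 2] and flips[i - 1]:  # 3 heads in a row
            runs += 1
            tails_next += not flips[i]                      # next flip is tails
    return tails_next / runs

print(round(tails_after_three_heads(), 2))  # ≈ 0.5, not above it
```

Running the same idea by hand is what the likelihood-and-uncertainty scenarios ask of test takers: recognizing that the coin has no memory, however strongly intuition insists otherwise.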
Unlike other tests of critical thinking, the HCTA uses both open-ended and forced choice questions. Both response formats have advantages and limitations. Forced choice questions assess the ability to recognize a correct response, but there are few instances in real life where people are presented with an array of answers to select from. Recognition is a lower-level cognitive skill and is expected to yield higher estimates of critical thinking skill than constructed response questions, which require higher-level cognitive processing. The disadvantage of constructed response questions is that they benefit people with good writing skills, and thus may underestimate the critical thinking skills of mediocre writers. There is evidence that multiple-choice and open-ended responses measure separate cognitive abilities (Bridgeman & Moran, 1997). The constructed response portion of the HCTA attempts to reveal more of the dispositional component of thinking, as it allows test-takers to demonstrate whether they are inclined to apply the appropriate skills (Ku, 2009). Essentially, the constructed response format measures “free recall,” as there are few constraints on the type of response that the test-taker may generate, whereas the multiple-choice format measures recognition memory. “The former requires test-takers to consciously search and select appropriate knowledge and skills from their own memory in constructing an answer, whereas the latter requires test-takers to identify the appropriate response from a given list of alternatives” (Ku, 2009, p. 74).
After responding to this constructed response prompt, test takers are then asked to select the best alternative from a short list of alternatives. A sample alternative might be:
We only know that being alcoholic is correlated with being depressed. Although relieving the depression could help alcoholics become sober, there is no reason to believe that depression causes alcoholism or that alcoholism causes depression or that by alleviating depression, alcoholics will find it easier to beat their addiction.
References
Butler, H. A. (2012). Halpern Critical Thinking Assessment predicts real-world outcomes of critical thinking. Applied Cognitive Psychology, 26, 721-729. doi:10.1002/acp.2851
Halpern, D. F. (2006). Is intelligence critical thinking? Why we need a new construct definition for intelligence. In P. Kyllonen, I. Stankov, & R. D. Roberts (Eds.), Extending intelligence: Enhancement and new constructs. Mahwah, NJ: Erlbaum Associates.
Halpern, D. F. (2010). Halpern Critical Thinking Assessment. SCHUHFRIED (Vienna Test System). http://www.schuhfried.com/vienna-test-system-vts/all-tests-from-a-z/test/hcta-halpern-critical-thinking-assessment-1/
Ku, K. Y. L. (2009). Assessing students’ critical thinking performance: Urging for measurements using multi-response format. Thinking Skills and Creativity, 4, 70-76.
Marin, L., & Halpern, D. F. (2010). Pedagogy for developing critical thinking in adolescents: Explicit instruction produces greatest gains. Thinking Skills and Creativity. doi:10.1016/j.tsc.2010.08.002
Actual HCTA format: To view a sample of the actual test from the test publisher, contact Mayr@Schuhfried.at
Critical Thinking Workshops with Dr. Diane Halpern
Ideas to Action (i2a) Welcomed Renowned Psychologist and Critical Thinking Expert Dr. Diane Halpern
Thanks to the faculty and staff who participated in making Dr. Halpern’s visit to the University of Louisville a success! See below to access her workshop materials.
How to Make Learning Stick: Applications from the Science of Learning
Thursday, September 20, 2012 | 9:00 a.m.–noon
This session focused on the science of learning and provided the opportunity to apply empirically validated principles to participants' own work with students and colleagues as a way of building an effective learning-centered institution.
Presentation [PPSX] - 25 Principles Handout [PDF]
Teaching and Assessing Critical Thinking: Helping College Students Become Better Thinkers
Friday, September 21, 2012 | 9:00 a.m.–noon
The data are clear: we can teach critical thinking skills so that they generalize across domains and last long into the future. During this session, we discussed ways to enhance critical thinking for college students. Participants identified the critical thinking skills they wanted to develop in their classes and created a plan so that these skills are practiced throughout the curriculum.
Presentation [PPSX] - CT Handout [PDF]
201 Miller Information Technology Center (MITC), Belknap campus.
About Dr. Halpern
Diane F. Halpern, Ph.D. is a past-president of the American Psychological Association, the largest psychological association in the world, with over 150,000 members and affiliates in 80 countries. Diane is also a past-president of the Western Psychological Association, the Society for General Psychology, and the Society for the Teaching of Psychology. She is the McElwee Family Professor of Psychology and was the founding Director of the Berger Institute for Work, Family, and Children at Claremont McKenna College. Diane has published hundreds of articles and many books, including Thought and Knowledge: An Introduction to Critical Thinking (5th ed. coming soon!), Sex Differences in Cognitive Abilities (4th ed.), and Women at the Top: Powerful Leaders Tell Us How to Combine Work and Family (co-authored with Fanny Cheung). Her other recent books include Psychological Science (4th ed., with Michael Gazzaniga and Todd Heatherton) and the edited book Undergraduate Education in Psychology: A Blueprint for the Future of the Discipline.
Diane has won many awards for her teaching and research, including the 2013 James McKeen Cattell Fellow Award from the Association for Psychological Science for "a lifetime of outstanding contributions to applied psychological research," the Outstanding Professor Award from the Western Psychological Association, the American Psychological Foundation Award for Distinguished Teaching, the Distinguished Career Award for Contributions to Education given by the American Psychological Association, the California State University’s State-Wide Outstanding Professor Award, the Outstanding Alumna Award from the University of Cincinnati, the Silver Medal Award from the Council for the Advancement and Support of Education, the Wang Family Excellence Award, and the G. Stanley Hall Lecture Award from the American Psychological Association. Diane’s most recent projects are the development of a computerized learning game that teaches critical thinking and scientific reasoning (with Keith Millis at Northern Illinois University and Art Graesser at the University of Memphis; available from Pearson Publishers, ara.pearsoncmg.com) and the Halpern Critical Thinking Assessment (Schuhfried Publishers; DianeHalpern.com), which uses multiple response formats that allow test takers to demonstrate their ability to think about everyday topics using both constructed response and recognition formats.