Current Research Activities:
An investigation of reasoning in professional evaluators' application of culturally responsive evaluation (CRE) principles
A paradigm used in the field of scientific reasoning (Koslowski, 1996) is proposed to investigate how culturally responsive evaluation (CRE) professionals reason about culture in evaluation practice. Evaluators will be presented with six program scenarios, each paired with two options for conducting an evaluation. One option for each scenario will integrate CRE principles derived from earlier research. Participants will rate each option, give a qualitative explanation of their ratings, and identify the approach "most likely" to result in desirable evaluation outcomes. In addition, participants will be provided with anomalous evidence regarding their "most likely" choice and will be asked to re-rate the option they chose as best. By adapting this research paradigm, we intend to 1) determine the extent to which these CRE principles are applied across varying contexts, 2) determine the extent to which approaches that utilize CRE principles are discernible from other evaluation approaches, 3) investigate the role of culturally situated background knowledge in how evaluators reason about culture in their evaluation work, and 4) verify that the CRE principles systematically derived in earlier research (Casillas & Trochim, in preparation) represent how evaluators think about culture in evaluation practice.
Evaluating Causal Explanations of Historical Events (EEE)
This research is an effort to increase the evidence base that informs science program administrators and policy-makers as they plan science, technology, engineering, and mathematics (STEM) initiatives and their evaluations. We draw on the scientific reasoning literature to frame a discussion of how individuals are believed to reason about science problems, and we propose a study to investigate the role of cultural experience in causal reasoning as a necessary step toward understanding disparate performance and participation in the sciences.
In the causal reasoning literature, Klahr and Dunbar (1998) propose a positivist perspective on scientific reasoning consisting of three components: hypothesis formation, experimental design, and evaluation of evidence. In the current research, we focus on the third component, evaluation of evidence. Traditional experiments on evaluation of evidence require participants to identify and/or rate pieces of evidence provided as potential causal explanations for an occurrence or event (e.g., Amsel & Brock, 1996; Ruffman, Perner, Olson, & Doherty, 1993). Often, participants are then provided with anomalous information and asked to rate each piece of evidence again, in order to test the relative strength of an individual's belief in, or confidence about, a given piece of evidence. We will use a similar experimental paradigm in the proposed research. Unlike past research on scientific reasoning, we argue for a logical qualification of Lipton (2000): that experience and background information situated in the cultural reality of some ethnic/racial groups affect an individual's evaluation of evidence in particular ways. We believe this to be especially true when reasoning about social events or problems that allow some flexibility of interpretation (i.e., historical or social events whose occurrence can be viewed as the result of multiple causes, depending on an individual's socio-historical position). Specifically, this study is designed to 1) show that membership in a cultural group affects what an individual perceives as relevant evidence in explaining the causes of a social event, and 2) investigate the relative strength of an individual's confidence that a piece of information is relevant to the causal mechanisms of an event, even when faced with an anomaly to that evidence.
Participants. Participants will be undergraduates at Cornell University and California State University, Monterey Bay (CSUMB). We will recruit students from Hispanic and Native American backgrounds through Latino and Native American organizations on both campuses. Our sample will include substantial proportions of Hispanic, non-Hispanic white, African-American, and Native American students.
Materials. For each historical occurrence, participants will respond to an open-ended prompt in which they list everything they can remember about a specific historical event or figure. Prompts will be followed by short narratives that accurately portray historical events. Each narrative will be accompanied by two explanations of the event's occurrence, both valid explanations drawn from legitimate historical sources. Explanation B represents information as it is presented in textbooks currently used in American high schools. Explanation A represents obscure information that is not widely presented in textbooks and is potentially more relevant to diverse students. Participants will rate each explanation on a 7-point Likert scale. Individuals will then be presented with information that is anomalous to the explanation they chose as most relevant and will be asked to rate the chosen explanation again.
Procedure. Individuals will be briefed on the study and asked to consent to participate. Participants will then complete demographic items covering their racial/ethnic identification, amount of education, generational position (first, second, etc. generation in the U.S.), name and region of the high school they attended, current major, classification, and age. Participants will then be presented with the survey materials. Upon completing the survey, participants will also be asked to complete a validated acculturation scale and the School Quality of Life Scale.
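As a hypothetical illustration of how the rate/re-rate design above could be analyzed (all names and ratings below are invented for the sketch, not study data), belief revision can be quantified as the drop in each participant's Likert rating after the anomalous information is introduced:

```python
# Illustrative sketch of the anomaly re-rating analysis.
# Ratings are hypothetical values on the study's 7-point Likert scale.
participants = [
    # (rating before anomaly, rating after anomaly) for the chosen explanation
    (7, 4),
    (6, 6),
    (5, 2),
    (7, 7),
]

# A small (or zero) drop after the anomaly indicates relatively strong
# confidence in the chosen explanation; a large drop indicates weaker
# confidence in the face of anomalous evidence.
revisions = [pre - post for pre, post in participants]
mean_revision = sum(revisions) / len(revisions)
print(f"mean rating drop after anomaly = {mean_revision:.2f}")
```

Group-level comparisons (e.g., by cultural group membership) would then ask whether mean revision differs systematically across groups.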
Cultural responsiveness in STEM program planning, implementation, and evaluation
The recent increase in STEM and science-based programs targeted to minority students must be met with appropriate culturally responsive practices (Johnson, 2005). The current project was designed to investigate ways in which program evaluators and staff implement culturally responsive practices. Evaluators across the country and program staff in New York, California, and Texas were invited to participate in a multiphase concept mapping project to 1) brainstorm culturally responsive practices, 2) sort or organize these statements according to themes of their own choosing, and 3) rate each statement on importance and feasibility with respect to their practice. We summarize the results of the structured conceptualization effort in comparison to the theoretical literature, discuss statistical differences between perceptions of importance and feasibility of practices, and suggest activities that consolidate and align practices, as conceptualized by each group, in a way that makes the principles actionable.
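One simple form the importance-versus-feasibility comparison could take (a sketch with invented ratings, not project data; the study's actual statistical approach may differ) is a paired t-test on each rater's two ratings of the same statements:

```python
# Hypothetical sketch: paired comparison of importance vs. feasibility
# ratings of the same brainstormed statements (1-5 scale, invented values).
from statistics import mean, stdev
from math import sqrt

importance = [5, 4, 5, 3, 4, 5, 4, 5]
feasibility = [3, 4, 2, 3, 3, 4, 2, 3]

# Paired differences: a positive mean suggests practices are perceived as
# more important than they are feasible to implement.
diffs = [i - f for i, f in zip(importance, feasibility)]
d_mean = mean(diffs)
d_sd = stdev(diffs)
t_stat = d_mean / (d_sd / sqrt(len(diffs)))  # paired t statistic, df = n - 1

print(f"mean difference = {d_mean:.2f}, t = {t_stat:.2f}")
```

A significant positive difference would flag practices seen as important but hard to implement, which is where the suggested consolidation activities would be most useful.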
Casillas, W. D., Hopson, R. K., & Gomez, R. (under review). Making culturally responsive decisions in evaluation practice.
Casillas, W. D., & Trochim, W. M. K. (in preparation). Cultural responsiveness in program planning, implementation, and evaluation of STEM programs.
Casillas, W. D., & Koslowski, B. M. (in preparation). The role of culture in evaluation of causal mechanisms.
Johnson, M., & Casillas, W. (2010). Capturing quality: Rubrics for logic models and evaluation plans. Paper presented at the 2010 American Evaluation Conference.
Trochim, W. M. K., Casillas, W., Johnson, M., & Urban, J. (2010). Using systems thinking concepts in evaluation of complex programs. Paper presented at the 2010 American Evaluation Conference.
Casillas, W. D. (2009). Developing criteria for addressing diversity in evaluation of science, technology, engineering, and mathematics (STEM) programs. Paper presented at the 2009 American Evaluation Conference.
Reyna, V.F. & Casillas, W. (2009). Development and dual processes in moral reasoning: A Fuzzy-trace theory account. In Bartels, D. M., Bauman, C. W., Skitka, L. J., & Medin, D. L. (Eds.) Moral Judgment and Decision Making: The Psychology of Learning and Motivation, Vol. 50. San Diego: Elsevier.