By Kristen Davidson and Caitlin Farrell
One aspect of improving research use in education relates to the capacity of practitioners to interpret and use research. Educators need some knowledge of and skill with research methods in order to be critical consumers who can draw justifiable conclusions, apply findings appropriately, and consider the limitations of any given study. But how much knowledge do they need?
To answer that question, we need to be able to gauge educators’ knowledge of research, but getting a sense of individuals’ research knowledge and skills presents challenges. We reached out to colleagues in educational research, practice, and policy through social media to ‘crowdsource’ the question, “How would you get a sense of someone’s knowledge and skills to use research?” We heard approaches that ranged from using credentials as a proxy, to creating scenario-based tasks, to situating an assessment of knowledge and skills in the purpose and context for using research. Below, we discuss some of what we learned from our colleagues.
Credentials as a Proxy
One means of gauging research knowledge and skills is using credentials as a proxy. For instance, when Joel Malin and Chris Lubienski of the University of Illinois studied whether people who are cited by the media in educational policy discussions have “educational expertise,” they used a proxy to represent expertise that incorporated a Google Scholar metric, years of experience, and the attainment of a doctoral or similar degree.
In another case, Elizabeth Farley-Ripple of the University of Delaware found that district leaders with advanced degrees were likely to access and share relevant research with others in districts where information sharing was common. In both these studies, possession of a credential helped explain some part of the various ways in which school districts access and use research-based information.
Scenario-Based Tasks
To get a deeper sense of a person's knowledge and skills to use research, some colleagues suggested using scenario-based tasks. Here, a practitioner would be asked to draw conclusions from a specific study and explain how they might apply the study's findings to a real-world problem.
One approach focused on an educator’s ability to synthesize findings in light of a school or district problem. Brandon Palmer of the Principal Research Center suggested, “Give them a performance task in which they have to solve a hypothetical school or district issue by reviewing relevant research and then synthesizing it in a presentation or report addressing/solving the problem.” This task would focus on skills to synthesize relevant findings for a particular issue.
Others suggested an approach that emphasized a practitioner's skills to critically assess findings and interpret the research within their own context. Cara Jackson of Urban Teachers shared, "One of the key concepts I'd want educators to grapple with is whether they can imagine alternative explanations for the findings, and the extent to which the researchers address alternative explanations. The other key concept would be the ability to relate the research back to their work, either in the classroom or as an administrator."
Colleagues at the Center for Research Use in Education suggested survey or interview questions that would ask about a variety of scenarios concerning the rigor with which individuals evaluate research products. For example, they envision scenarios in which educators report on which aspects of research they value when using it (or not) in their work.
A Situated Approach
As Daniel Ginsberg, professional fellow of the American Anthropological Association, pointed out, the purpose of educators' use of research leads to very different needs for knowledge and skills. For example, improving classroom instructional practices and developing district-wide teacher evaluation policies would be informed by different kinds of research, and accessing each requires different kinds of skill and preparation. Chris Mazzeo, director of REL Northwest, suggested that someone's knowledge and skills may need to be considered alongside other factors, like the degree to which that person values research or the political considerations of the context.
The John W. Gardner Center described engaging educator partners in a ‘cycle of inquiry’ after articulating an aim or question specific to their own district context:
- Collect data: What data do we need? Who collects? How? When? Who monitors?
- Analyze data: Who analyzes the data? How? When? Who monitors?
- Learn: What insights will be gained? Do we have the expertise to interpret? Can priorities be identified? Can decisions be made? Are more data needed?
- Make decisions: Who makes decisions? What decision rules are in place? Who informs stakeholders? Are more data needed?
- Take action: What actions are appropriate to respond to findings? Who is responsible for execution? Who monitors effectiveness? How?
This five-step process allows educators not only to take part in the research process, but also to gain a deeper understanding of appropriate uses of research for particular purposes.
At NCRPP, we have been challenged by studying educators’ knowledge and use of research in districts. Stay tuned for our next blog post, where we’ll share the way we went about getting a sense of knowledge and skills to use research, the issues we faced, and what we’ve learned. A big thank you to everyone who shared their ideas with us!