MME Week: Terri Anderson on Using Best Practices for Mixed Methods Research in Evaluation
Hello! I’m Terri Anderson, Director for Evaluation at the University of Massachusetts Medical School’s (UMMS) Center for Health Policy and Research. I want to share our evaluation team’s experience using the National Institutes of Health’s guide, Best Practices for Mixed Methods Research in the Health Sciences, to understand an unexpected evaluation result.
When combining survey data with in-depth interviews, national guidelines can help. Our UMMS evaluation team, with expertise in both quantitative and qualitative methods, is studying the Massachusetts Patient Centered Medical Home (PCMH) Initiative. In this project, 46 primary care practices with varying amounts of PCMH experience will transform over a 3-year period and achieve National Committee for Quality Assurance PCMH recognition. Three members from each practice completed a quantitative survey as the baseline assessment of medical home competency.
The assessment results surprised us. A group of practices with two years of PCMH experience scored lower than the novice groups, when we expected just the opposite. So we turned to our qualitative results, comparing code summary reports to the quantitative findings. The NIH mixed methods guide calls this approach to integrating multiple forms of data ‘merging’.
The guide also describes ‘connecting’. To connect, we incorporated the quantitative analyses into the semi-structured guides used for subsequent qualitative data collection. With these results we came to understand the novice groups’ advantage. Integrating the data further reinforced the importance of teamwork in evaluation work.
Lessons Learned:
- Form an interdisciplinary team. We established a ‘mixed methods subgroup’ in which quantitative and qualitative team members work jointly rather than in parallel. On a team, the focus shifts from ‘this approach versus that approach’ to ‘what approach works best’. Regular meeting times allow the members to learn to work together. Our team originally formed to investigate a single puzzling result but has since expanded its work to merge quantitative and qualitative staff satisfaction data.
- Connect your data. We plan to continue using quantitative results in semi-structured interview guides to collect qualitative data. The qualitative results provided an in-depth understanding of the quantitative assessment and gave interviewees the opportunity to comment on their practices’ transformation.
Rad Resources:
- Best Practices for Mixed Methods Research in the Health Sciences: The National Institutes of Health Office of Behavioral and Social Sciences Research commissioned this recently released guide in 2010. Easily accessible online, it contains seven sections of advice for conducting a mixed methods project, along with lists of key references and resources.
- Mixed Methods Topical Interest Group: Through the AEA website we can communicate directly with experts in the growing mixed methods field whose work is referenced in the NIH guide.
The American Evaluation Association is celebrating Mixed Method Evaluation TIG Week. The contributions all week come from MME members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Related posts:
- MME Week: Leanne Kallemeyn, Daniela Schiazza, and Ann Marie Ryan on Using Mixed Methods to Conduct Integrated Data Analysis
- MME Week: Mika Yamashita on Mixed Methods Evaluation TIG Week
- MME Week: Hongling Sun on Mixed Methods Design
- Jim Dudley on Letting Go of Rigid Adherence to What Evaluation Should Look Like
- McQuiston, Lippin, and Bradley-Bull on Participatory Analysis