Colorectal Cancer Screening Program in South Carolina (CCSPSC): Connections to the NCCRT Evaluation Toolkit

Heather M. Brandt, PhD, CHES
Associate Dean for Professional Development, Graduate School
Associate Professor, Arnold School of Public Health
University of South Carolina
e: hbrandt@sc.edu | t: 803.777.7096

National Colorectal Cancer Round Table Meeting | December 7, 2017

The Colorectal Cancer Screening Program in South Carolina is funded by the Centers for Disease Control and Prevention (Grant #: NU58DP006137). The grant is awarded to Drs. Heather Brandt and Frank Berger of the Center for Colon Cancer Research at the University of South Carolina. Contact: Hiluv Johnson, program coordinator, hsjohnso@mailbox.sc.edu

Colorectal Cancer Screening Program in South Carolina (CCSPSC)
Long-term outcome: decreased CRC mortality through increased participation in CRC screening.
The purpose of the CCSPSC is to increase participation in CRC screening by working with partner health systems to implement priority evidence-based strategies.
http://cccr.sc.edu/outreach/ccspsc/ccspsc-program

Connections to the NCCRT Evaluation Toolkit
Access the toolkit at: http://nccrt.org/resource/evaluation-toolkit/

A few disclaimers…
• I have been involved in the development and editing of the evaluation toolkit because of my background in intervention implementation and evaluation and my work with community partners to do this.
• Our program has a team of professional evaluators: the Core for Applied Research and Evaluation (led
by Dr. Lauren Workman).
• We did not use the evaluation toolkit to guide our approach, but it is important to note that we are able to connect our approach to its steps.
  – We used the CDC Framework for Program Evaluation in Public Health.

NCCRT Evaluation Toolkit: Steps
• Step 1: Describe and map your program
• Step 2: Prioritize your evaluation questions
• Step 3: Design the evaluation
• Step 4: Identify or develop data collection instruments
• Step 5: Collect the data
• Step 6: Organize and analyze information
• Step 7: Using and sharing evaluation results

Step 1: Describe and map your program
• Describe the intervention:
  – Evidence-based interventions, supportive activities, and additional activities to increase CRC screening
• Map the intervention:
  – Developed a flexible, adaptive, phased approach to implementation of the interventions
  – Noted inputs, activities, and outputs (e.g., logic model, protocol, other tools)

CCSPSC Phased Approach to Implementation with Partners
• Phase 1, Building Partnerships: build partnership with FQHC system; complete MOA; select sites
• Phase 2, Collecting Baseline Data and Planning: collect baseline data; develop implementation plan
• Phase 3, Implementing Evidence-based Strategies: conduct professional education; conduct training; go live!
• Phase 4, Supporting and Monitoring Implementation: support implementation of evidence-based strategies; monitor implementation; conduct technical assistance (TA); collect annual data; evaluation activities
• Phase 5, Sustainability and Maintenance: annual review process; ongoing TA; focus on sustainability; collect annual data; evaluation activities

Evidence-Based Interventions
Select at least two priority, evidence-based interventions (multi-component interventions):
• Provider assessment and feedback
• Provider reminders and recall
• Client (patient) reminders
Optional supportive activities:
• Professional education
• Small media
Additional activities:
• Standard procedures (policies)
• 80% by 2018 pledge

Step 2: Prioritize your evaluation questions
• Main outcomes:
  – CRC screening; long-term CRC incidence and mortality, and disparities in incidence and mortality
• Short-term outcomes:
  – CRC screening, fidelity to CRC screening guidelines, satisfaction, implementation of evidence-based interventions, health system data use and quality
• Process:
  – Environmental scan, training and technical assistance, fidelity to action plans, engagement of stakeholders, policy implementation, use of resources (cost effectiveness), sustainability

Step 3: Design the evaluation
• Guidance provided by the CDC Colorectal Cancer Control Program
  – Submitted evaluation plan for review and approval by the CDC; approved in January 2016
• Used the CDC Framework for Program Evaluation in Public Health as a guide
• Data collection at multiple points, from multiple sources
• Mixed methods approach

Step 4: Identify or develop data collection instruments
• CRC screening data: guidance provided by the CDC for standardized collection
  – Collected at baseline, quarterly, and annually
• CRC incidence and mortality data
  – Working with the South Carolina Central Cancer Registry
• Process data tools (selected examples):
  – Implementation Status Tracking
  – Organizational Assessment and Environmental Scan
  – Readiness Assessment
  – Training Evaluation (pre-test, post-test, follow-up)
  – Observation Form (evidence-based intervention-specific forms)
  – Continuous Quality Improvement (CQI) Tracking tool
• Qualitative data tools: focus groups/interviews with stakeholders, providers, patients
CCSPSC Readiness Assessment Tool: FQHC Readiness Criteria
Based on R=MC2 (Dymnicki et al., 2014)

Step 5: Collect the data
• Protocol (in addition to the evaluation plan and other tools) prescribes when data are collected, how, and by whom
  – Iterative and ongoing process of considering what data must be collected (vs. what we would like to know) and how we are collecting the data
• For example, CRC screening data:
  – Baseline and annual data reported to the CDC
  – Quarterly data used for our program and with partner FQHCs to monitor short-term outcomes

CBARS Timeline (as of September 2017)
[Timeline figure: CDC program years PY1 (7/1/15–6/30/16) through PY5 (7/1/19–6/30/20). Administrative and CRC screening (CRCS) data are collected in annual windows (10/1/16–3/31/17, 10/1/17–3/31/18, 10/1/18–3/31/19, 10/1/19–3/31/20, and 10/1/20–3/31/21). Each cohort of FQHC systems and sites reports baseline CRCS data in its first program year and annual CRCS data each year thereafter; new cohorts were added in PY2, PY3, and PY4.]
• PY1 FQHC systems and sites: CareSouth – Lake View; Carolina Health – McCormick; Eau Claire – Eastover; HopeHealth – Kingstree; Little River – Main; New Horizon – Greer; Regenesis – 750 Langdon; Sandhills – Lugoff; Sandhills – McBee
• PY2 FQHC systems and sites: Regenesis – 750 Church St; Eau Claire – Waverly; HopeHealth – Timmonsville; Little River – Carolina Forest; New Horizon – West Faris

Step 6: Organize and analyze information
• Evaluation team leads data management and analysis
• Protocol specifies processes for managing data
• Evaluation team provides
regular updates and offers opportunities for interpretation:
  – UofSC team meetings
  – Evaluation Committee meetings
  – Advisory Council meetings
  – Conference calls with CDC program consultants

Step 7: Using and sharing evaluation results
• Data inform the overall program approach (feedback loop)
• Dissemination strategies (selected examples):
  – Routine updates to partners, including summary reports of evaluation data
  – CCSPSC newsletters (quarterly)
  – CRC screening data shared (collected at baseline, quarterly, and annually)
  – Presentations at professional and scientific meetings
  – Baseline data snapshots

Example of content from quarterly newsletters
[Figure: sample newsletter content]

"Take Home" Points*
• Evaluation requires iterative and ongoing planning, and being intentional in ensuring the plan meets the needs of the program
  – Evaluation can evolve over time
• Stakeholder engagement is critical
• The complexity of FQHC data systems must be accounted for
• Capacity within partner FQHCs to enter, provide, and retrieve data is important (see also the importance of stakeholder engagement)
* I asked Dr. Lauren Workman for her "take home" points about the evaluation of our program.

Working together to increase colorectal cancer screening in South Carolina!
Acknowledgments
• Hiluv Johnson, Cindy Calef, Jay Whitmore, Minjee Lee, Amanda Collins
• Core for Applied Research and Evaluation (CARE), Arnold School of Public Health, University of South Carolina (led by Dr. Lauren Workman)
• South Carolina Primary Health Care Association
• American Cancer Society
• Colorectal Cancer Prevention Network of the Center for Colon Cancer Research
• Eight federally qualified health center (FQHC) systems in South Carolina (15 FQHC sites across the eight systems)
• Advisory Council
• Evaluation Committee
• Other partners