Objectives: Reaction and Planned Action
Level 1 objectives focused on securing support for, and satisfaction with, the North Star Electronic Performance Support System through targeted training. Participants were oriented to the tool, and their satisfaction was measured with a questionnaire of eight formal questions rated on a five-point Likert scale plus two informal open-ended questions. Stakeholders aimed for an average rating of four out of five, reflecting a strong intent to use the system, which is crucial to the success of differentiated instruction. The collected data will not only assess participant satisfaction but also inform improvements for future training cohorts.
Objectives: Learning and Confidence
Level 2 learning objectives were established to communicate expected outcomes and define the criteria for competent performance (Phillips & Phillips, 2008). At the start of the North Star Electronic Performance Support System orientation, the facilitator explained that participants would complete a simulation exercise to demonstrate their learning. Participants received a performance checklist as a reference tool, detailing each item that would be assessed during the simulation. The facilitator explained and demonstrated each checklist item, allowing learners to follow along and apply their knowledge throughout the training. To ensure all participants could confidently meet the checklist requirements, training facilitators tailored their instruction and provided ample practice time for the new skills.
During the training, facilitators observed participants as they entered, printed, and selected interventions, using an electronic performance checklist to track competencies. To complete the course successfully, learners needed to demonstrate all 26 measures on the checklist during a training simulation (see Appendix D for objectives). If a participant scored below 26/26, immediate remedial training was provided, followed by a second evaluation by a neutral facilitator using the same checklist. If participants still failed to achieve the required score, additional follow-up sessions were arranged until they could consistently meet the 26/26 standard, ensuring mastery of the learning objectives before applying them in real-world scenarios.
Objectives: Application and Implementation
The application and implementation objectives were clearly defined, focusing on observable and measurable behaviors resulting from the North Star intervention (Phillips & Phillips, 2008). Participants applied their training by accurately entering and using quarterly benchmark data for instructional purposes. They used a self-assessment performance checklist to gauge their performance during data entry and submitted it to the district trainer upon completion. The district trainer then reviewed the data using a matching performance checklist; any discrepancies between the coach's and the participant's checklists were addressed through re-teaching during coaching sessions, ensuring that every participant achieved a minimum score of nine out of ten on the checklist (see Appendix D for objectives).
Before each coaching session, participants filled out the right side of an action plan, selecting a balanced literacy tool aimed at enhancing student reading and writing. The coach evaluated the teachers' use of data in their instruction, requiring a minimum score of seven out of ten on the data analysis and application coaching rubrics established by Fountas & Pinnell (2005).
Objectives: Business Impact
Impact objectives should include specific measures that reflect the skills and knowledge acquired through a program (Phillips & Phillips, 2008). These measures must be results-oriented, easy to collect, and clearly demonstrate the outcomes participants achieve as a result of the intervention (Phillips & Phillips, 2007, 2008, 2010).
The North Star Educational Tools Targeted Assistance Zone Report tracks changes in student achievement by analyzing data from all targeted assistance zones. Notably, the percentage of students in the red zone, those performing below the 20th percentile, was projected to decrease by 20%, based on the classroom targeted assistance reports and the Targeted Assistance Zone Report trend line (see Appendix E for does-not-meet-expectation trend lines).
Students identified in the red zone who had not previously been classified as special education were referred to child study after being documented in the red zone for three consecutive testing periods. It was anticipated that enhancing teachers' ability to differentiate instruction using data from the North Star System would reduce the number of students in the red zone by 20% by the end of the 2009-2010 school year. The financial impact of this reduction was assessed by comparing the cost of educating special education students to that of general education students, using standardized cost measures (see Appendix F for standardized values). The difference between the actual and projected numbers of students in the red zone was calculated as cost savings.
A projected 20% reduction in students falling within the yellow zone (below the 50th percentile) was anticipated, as indicated by the Targeted Assistance Trend Line Report. By enhancing teachers' abilities to differentiate instruction using data from the North Star System, this reduction was expected by the end of the 2009-2010 school year. The financial impact of the decrease was assessed by comparing the cost of educating Title I students with that of general education students, using standardized cost values. The difference between the actual and projected numbers of students in the yellow zone was calculated as cost savings.
Students in the green (students at the mean) and blue (students above the high benchmark) zones were projected to increase by 40%, as measured by the Targeted Assistance Trend Line.
The report indicates a promising trend in special education, with expectations that only basic general education funding will be necessary for these students. Projections suggested that the number of special education students in kindergarten through second grade would decline by 20% by the end of the instructional year, attributed to the effective implementation of the North Star Electronic Performance Support System intervention.
Objectives: Return on Investment
The ultimate goal is a positive return on investment (ROI) for the North Star Electronic Performance Support System. Demonstrating a favorable ROI is essential to validate the system as a worthwhile investment rather than merely an expense. A conservative ROI estimate enhances credibility: it is more reliable to underestimate and overdeliver than to overestimate and underdeliver.
The North Star intervention is anticipated to achieve a benefit-cost ratio of 1.25:1, translating to a 25% return on investment (ROI) based on the benefits realized in the initial five years.
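The arithmetic behind these two targets can be sketched in a few lines of Python. The dollar amounts below are placeholders chosen only to reproduce the 1.25:1 target, not the district's actual figures.

```python
# Standard ROI-methodology formulas (Phillips). The dollar figures are
# placeholders chosen to illustrate the 1.25:1 target, not actual data.

def benefit_cost_ratio(benefits: float, costs: float) -> float:
    """BCR = program benefits / fully loaded program costs."""
    return benefits / costs

def roi_percent(benefits: float, costs: float) -> float:
    """ROI (%) = net program benefits / program costs * 100."""
    return (benefits - costs) / costs * 100

benefits = 125_000.0  # placeholder five-year monetary benefits
costs = 100_000.0     # placeholder fully loaded program costs

print(benefit_cost_ratio(benefits, costs))  # 1.25
print(roi_percent(benefits, costs))         # 25.0
```

Note that a 1.25:1 benefit-cost ratio and a 25% ROI are the same statement: ROI counts only the benefits above cost, while the BCR counts all of them.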
Before collecting data, organizations must establish a comprehensive data collection and storage plan. This plan is essential for ensuring the proper use of data collection tools, timely data gathering, and effective training and assignment of data collectors. Developing the plan prior to implementation lets organizations gather feedback and secure stakeholder buy-in, which enhances accountability and ensures follow-through (Phillips & Phillips, 2008).
The data collection plan (see Appendix E) contains the program objectives and the forecasted measures of those objectives at each level.
Level 1 data was gathered through a questionnaire designed to assess participants' reactions and intended actions following the training. The questionnaire featured eight questions rated on a five-point Likert scale, along with two informal open-ended questions aimed at identifying barriers, enablers, and intangible factors.
Level 2 data assessed participants' proficiency with the North Star Electronic Performance Support System during training. A formal observation of learned actions took place through a simulation at the conclusion of the training, with a performance checklist used to evaluate all twenty-six skills acquired.
Level 3 data assessed participants' ability to apply the training in their work environment. Regular coaching rubrics documented the application of the new data tools on the job, with a minimum passing score of seven out of ten on the data analysis and application rubric indicating successful implementation.
Over a three-year period, Level 4 benchmark assessment data was gathered quarterly and entered into the North Star Assessment Tools (Swenson, 2007). Trend lines were then projected from the historical data collected prior to the implementation of the North Star Electronic Performance Support System (detailed in Appendix A), and they show a positive trend in cost savings for educating a specific cohort of children over the three years post-implementation. Using the standard costs in Appendices F and H, significant financial benefits were realized as more students transitioned out of special education and Title I programs than initially projected. The objectives included a 20% reduction in special education rates by the end of second grade, a 20% decrease in Title I rates, and a 20% increase in the number of students meeting or exceeding expectations, as indicated by the Targeted Assistance Report in Appendix D.
The Level 5 ROI data was derived by translating the Level 4 data into monetary savings. The project's objective was to achieve a benefit-cost ratio of 1.25:1 and an ROI of 25% based on the benefits realized in the first year.
The data collection tools were aligned with the appropriate data types, sources, timing, and responsible personnel. The Literacy Collaborative District Trainer oversaw the data collection plan and sent monthly progress update emails to all stakeholders.
To demonstrate the specific business impact of an intervention on performance improvement, its effects must be isolated, because many other factors can influence the observed data. The first step is to select the right isolation technique as part of the evaluation plan, ensuring that the chosen method is rigorous and suitable while considering cost, time, and effectiveness (Phillips & Aaron, 2008).
The Steps to Determine the Most Feasible Method(s) of Isolation job aid (see Appendix J) guided the selection of an isolation method.
The analysis of trend line data reveals the impact of the North Star Educational Tools Electronic Performance Support System on student growth. While control groups are typically the most reliable method for isolating effects, the district's commitment to implementing North Star data across all schools during the study period made this approach impractical. Using control groups would also be unethical, as it would prevent some schools from utilizing a potentially beneficial tool to help teachers address student achievement gaps.
Trend line analysis of performance data ranks as the second most accurate approach for isolating the effects of an intervention. With multiple years of historical student performance data readily available through North Star, and no other influencing factors present during the study, trend lines emerged as the most cost-effective isolation technique and were therefore adopted.
The Heinemann Benchmark Assessment System collects data quarterly from all district schools, and year-end benchmark data was used to analyze shifts in teaching. Historical data reveals trends for students categorized as not meeting, partially meeting, and meeting or exceeding expectations, indicating the level of intervention needed to address achievement gaps. Students in the "does not meet" category require intensive instruction, those in the "partially meets" category need additional support, and students in the "meets and exceeds" group can succeed with standard classroom instruction. Trend lines illustrate historical growth rates across all benchmark areas, with projections of average student growth over the next three years highlighting both pre-intervention and anticipated post-intervention averages.
To ensure accurate trend line analysis, it is essential to rule out other internal and external factors that could affect the measurements within the established timeframe. The trend line isolation method, detailed in Appendix J, was employed to pinpoint potential influences on performance data growth over time. The analysis revealed that the sole new intervention introduced in the summer of 2007 was the North Star Educational Tools Electronic Performance Support System. Other identified factors were monitored throughout the study, but none were present during the analysis period.
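As a concrete sketch of the trend-line technique, the toy calculation below fits a least-squares line to three pre-intervention years and projects it forward; the gap between actual and projected values is the portion credited to the intervention. All percentages are illustrative placeholders, not the district's data.

```python
# Toy illustration of trend-line isolation: fit a line to pre-intervention
# data, project it over the post-intervention years, and attribute the gap
# between actual and projected values to the program. Illustrative data only.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

pre_years, pre_pct = [1, 2, 3], [58.0, 60.0, 62.0]   # % meeting benchmark
slope, intercept = fit_line(pre_years, pre_pct)       # slope 2.0, intercept 56.0

post_actual = {4: 68.0, 5: 72.0, 6: 77.0}             # placeholder actuals
for year, actual in post_actual.items():
    projected = slope * year + intercept
    print(f"year {year}: projected {projected:.1f}, "
          f"credited to intervention {actual - projected:+.1f}")
```

This mirrors the study's design: three years of historical data set the trend, and the difference above the projection in each post-implementation year is the measured effect, valid only because no other influencing factors were present.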
Benefits and Cost Calculation
Intangible Benefits
Reaction and Planned Action
Learning Results
• At the end of training, learners participated in a simulation exercise, and facilitators formally observed participants entering, printing, and selecting interventions and checked off competencies using an electronic performance checklist.
• Stakeholders anticipated learners would be able to complete at least 28/29 items on the performance checklist.
• All learners successfully completed 27 out of 29 tasks on their first attempt. However, eleven learners required assistance in selecting the appropriate intervention based on OS data; after receiving guidance, these learners passed on their second attempt.
• Coaches followed up with the two learners and continued to coach them on how to use the Observation Survey data to identify the correct intervention for a child.
Table 5: Results of Performance Check List
Application Data
• Participants used a self-assessment performance checklist when entering data during the first quarter after training.
• When finished, participants sent the performance checklist to the coach to indicate that the data had been entered.
• The coach scored the checklist again to ensure all data was entered correctly. All learners received a superior rating of 100% on data entry.
• 83% of learners used action plans in coaching. The coaching rubrics have not been
Impact Data
Stakeholders established ambitious targets to enhance student achievement data through improved teacher performance in data-driven instruction. The initial objectives aimed for significant progress by the conclusion of the 2009-2010 academic year.
1. Students in the does-not-meet-expectations red zone (below the 20th percentile nationally) will have been reduced by 20%, as measured by the Classroom Targeted Assistance Reports. After implementation, the actual reduction was 51.83%.
2. Students in the yellow zone (below the 50th percentile nationally) will have been reduced by 20%, as measured by the Classroom Targeted Assistance Reports. After implementation, the actual reduction was 28.63%.
3. Students in the green and blue, meets-and-exceeds-expectations zones will increase by 20%. After implementation, the actual increase was 27.75%.
Table 6: Shifts in Students’ Targeted Assistance Zones After Implementation
Shifts in Trend Lines after Implementation
Figure 2: Trend Lines for End of the Year Benchmark: Does Not Meet Expectations
Figure 3: Trend Lines for End of the Year Benchmark: Partially Meets Expectations
Methods for Converting Data to Money
Student achievement data was translated into monetary values by:
• Multiplying the number of students in the red zone (below the 20th percentile nationally) by the cost of educating a special education student.
• Multiplying the number of students in the yellow zone (below the 50th percentile nationally) by the cost of educating a Title I student.
• Multiplying the number of students in the blue and green, meets-and-exceeds columns by the cost of educating a general education student.
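The bullets above amount to a per-zone multiplication. A minimal sketch follows; the per-pupil costs and student counts are placeholders, since the district's actual standardized values live in Appendices F and H.

```python
# Minimal sketch of converting zone counts to dollars. The per-pupil costs
# and student counts below are placeholders, not the district's standardized
# values from Appendices F and H.

PER_PUPIL_COST = {
    "special_education": 20_000,   # placeholder annual cost per student
    "title_i": 12_000,             # placeholder
    "general_education": 8_000,    # placeholder
}

ZONE_TO_PROGRAM = {
    "red": "special_education",    # below 20th percentile
    "yellow": "title_i",           # below 50th percentile
    "green": "general_education",  # at the mean
    "blue": "general_education",   # above the high benchmark
}

def zone_dollars(zone_counts):
    """Multiply each zone's student count by the matching per-pupil cost."""
    return {zone: count * PER_PUPIL_COST[ZONE_TO_PROGRAM[zone]]
            for zone, count in zone_counts.items()}

print(zone_dollars({"red": 10, "yellow": 25, "green": 100, "blue": 40}))
# {'red': 200000, 'yellow': 300000, 'green': 800000, 'blue': 320000}
```

Cost savings then fall out of the same table: the difference between projected and actual red- and yellow-zone counts, priced at the gap between the special education or Title I cost and the general education cost.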
Table 7: Converting Data into Monetary Values
Total Impact of the Intervention
Table 9: Total Impact of Intervention
Return on Investment of Implementation
The proposed return on investment was 25%. The implementation, over three years' time, had a return on investment of 579%.
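Since ROI and the benefit-cost ratio are two views of the same numbers, these two percentages can be read back into ratios with the standard identity BCR = ROI/100 + 1:

```python
# ROI (%) = (benefits - costs) / costs * 100 and BCR = benefits / costs,
# so the two are related by BCR = ROI/100 + 1.

def roi_to_bcr(roi_pct: float) -> float:
    """Convert an ROI percentage into the implied benefit-cost ratio."""
    return roi_pct / 100 + 1

print(round(roi_to_bcr(25), 2))    # 1.25 -> the proposed target ratio
print(round(roi_to_bcr(579), 2))   # 6.79 -> implied by the three-year actual
```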
ROI Business Impact Forecast for 2010-2011 School Year
In December 2010, the final report was presented, accompanied by a forecast for the 2010-2011 school year, detailed in Appendix Q. The forecast was developed from the trend lines established post-implementation and was shared with stakeholders on December 13, 2010, alongside the final report.
Figure 6: ROI Business Impact Forecast for 2011
Table 10: Converting Business Forecast Data into Money
Effective continuous improvement models facilitate ongoing evaluation and redesign of processes. The North Star Electronic Performance Support System faced initial tensions due to the time required for data organization and limited usage. A comprehensive two-month phase was dedicated to needs assessment, gap analysis, root cause analysis, and selection of the implementation strategy. Early meetings were crucial for securing buy-in and support. Within the first month of implementation, data across Levels 1 to 3 were collected and reported, all showing positive results. Following this, three years were spent gathering impact data to accurately calculate the final return on investment (ROI).
In the early stages of the project implementation, the school district faced significant challenges, losing a levy referendum that led to over nine million dollars in budget cuts. This resulted in the closure of two schools and the redistribution of students and staff from eight elementary schools into six buildings, causing high class sizes and low morale. The district eliminated fifty-four teaching positions and two administrative roles, heightening stress among staff, administration, parents, and students. Notably, the closed schools had been the most effective at implementing the Literacy Collaborative and using data for differentiated instruction, fostering a collaborative environment among general education teachers, special education teachers, Title I teachers, teaching assistants, and coaches.
Legislators, university deans, and school district teams from across the U.S. would visit these high-performing schools to observe the effective Response to Intervention (RTI) model in action. Despite the closure of the aging and smaller schools, the assistant superintendent noted the vibrant energy at both institutions, expressing hope that each staff member would take this enthusiasm and ignite a new flame in another building. He envisioned a future where the "phoenix would rise from the ashes," symbolizing the transformation and continued success of the educational community.
Prior to the restructuring, the special education leader was enthusiastic about teachers utilizing data for differentiated instruction and collaborated nationally with the district trainer to promote this model. However, after the failed referendum and school closures, a key principal retired due to downsizing. In the fall of 2007, the special education department attended a Response to Intervention conference, where they were told that Brainerd's model was ineffective and that a universal approach was preferred. Following the conference, some principals became skeptical of the program and reduced their support for the coaches and coaching efforts.
In 2008, the special education department sponsored school leadership teams to attend a conference where it was claimed that Reading Recovery was ineffective and detrimental to the school district's resources. This posed significant challenges for implementing the required first-grade intervention. Concurrently, middle schools adopted the intervention promoted at the conference as the standard for measuring student growth. The North Star assessment then faced scrutiny when the new tool indicated that half of fifth and sixth graders were unable to read. Consequently, the district trainer was tasked with conducting a study to investigate the reasons behind the reading difficulties and the lack of correlation between the new assessment tool and the North Star data.
The Literacy Collaborative District Trainer and the District Reading Recovery Teacher Leader dedicated over two thousand hours to analyzing student data across various assessments, including AIMS Web, North Star Benchmark, NWEA MAP, and the Minnesota Comprehensive Assessment. Their research revealed that AIMS Web data consistently followed a bell curve, meaning that half of the students would always fall below the mean target score, which was adjusted to reflect the district's average performance. Despite efforts by fifth and sixth grade teachers to improve reading speed, many students continued to struggle to meet the ever-changing goals. AIMS Web had been chosen for its claimed predictive validity regarding state test performance; however, a correlation study showed only a 27% relationship between AIMS Web scores and state test comprehension scores, demonstrating that students could achieve high comprehension scores regardless of their reading speed.
This was an important finding, because the number of students passing the Minnesota Comprehensive Assessment (MCA) state tests was higher in North Star, aligning with the Adequate Yearly Progress (AYP) made by all elementary schools under No Child Left Behind. A significant focus of the study was to explore why fifth and sixth grade teachers perceived students transitioning from second grade differently than primary teachers did. Many teachers had misunderstood the AIMS Web data, interpreting it as a measure of reading ability rather than reading speed, since the data follows a bell curve. To improve future studies, it is essential to educate all stakeholders about what the data truly represents.
Misinformation regarding student performance on NWEA tests began affecting third and fourth grade teachers, who claimed students had never scored lower. However, a subsequent study revealed that most schools in the district had maintained or improved their spring mean scores over time, with only one school experiencing a decline, due to significant demographic changes following the 2007-2008 school year. The study also highlighted a notable gap between fall and spring scores, indicating that actual growth surpassed the expected growth targets set by the testing company. Brainerd School District consistently performed above the state mean, despite testing in the earliest fall window and the latest spring window. The recommendation was to administer tests during the middle month to enhance score validity. Unfortunately, the available time was spent discussing the results of the other test, leaving no opportunity to present the North Star data to teachers in grades three through six.
Table 11: NWEA Testing Data Over Time Study
The Return on Investment Project faced significant challenges due to high staff and administration turnover over the three years. In 2007, the project began with the retirement of the original vision holder, the assistant superintendent. The new assistant superintendent, a former high school principal, worked diligently during the superintendent's second year, but the elementary curriculum coordinator's retirement forced the high school coordinator to assume additional responsibilities for the elementary schools. A year later, the superintendent resigned and the assistant superintendent succeeded him, while the curriculum coordinator was promoted to one of two assistant superintendent positions, further increasing her workload.
The state launched the Quality Compensation program to enable districts to fund professional learning communities (PLCs) and compensate teachers for their participation in learning initiatives. While this program helped cover the costs of coaching and sustaining PLC groups after the referendum's failure, it also led to a decline in principals' attendance at PLC staff development sessions focused on literacy.
Principals fell behind in their understanding of the K-2 Response to Intervention initiative, leading to a lack of accountability among teachers regarding its implementation. The administration struggled to address concerns from intermediate and junior high teachers because of insufficient knowledge of the North Star data. Instead of fostering collaboration to resolve challenges, the new administration opted for isolation and separate studies. Despite these obstacles, the Literacy Collaborative initiative thrived, demonstrating a significant return on investment. Literacy coaches and district trainers produced annual reports analyzing barriers and enablers and developed action plans for continuous improvement. As a result, kindergarten through second grade teachers successfully enhanced their use of data for differentiated instruction, maintaining momentum in their practices.
Table 12: Interim Barriers that Were Solved During Implementation
Barrier: Too much instructional time was spent on assessment.
Solution: A team of retired teachers traveled around to assess students and enter the data into North Star so teachers could keep teaching.

Barrier: It was hard to schedule plans for quarterly team meetings.
Solution: One substitute traveled from room to room implementing the same plan, freeing teachers to attend the quarterly meetings and focus on data instead of plans.

Barrier: Minutes for the meetings needed to be more timely.
Solution: Meeting minutes were typed directly into North Star and were available for all to access as soon as the meeting finished.

Barrier: It was hard to wait until the next team meeting to see who had implemented action plans.
Solution: Action plans are also on North Star; as people upload data plans and show results or completed projects, everyone can access and track accomplishments as they happen.

Barrier: People needed more clarification on the roles, responsibilities, and purpose of the ROI project.
Solution: The comprehensive implementation plan, a detailed roles and responsibilities chart, and a glossary of terms were made accessible online via North Star, and all materials distributed during meetings were promptly uploaded to North Star for easy reference.

Barrier: Summary data needed to be available each quarter.
Solution: Each monthly report the District Trainer wrote was posted on North Star for all to access ongoing results.

Barrier: Fear that the cost of the project and evaluation were too expensive and would take money away from potential raises.
Solution: Revenues and expenditures were consistently tracked and presented in a PowerPoint to the new superintendent, Director of Teaching and Learning, and Financial Director. Although monthly update meetings were planned, time constraints prevented updates; however, the district trainer ensured that all records were maintained and brought to each meeting.
A comprehensive study of the Return on Investment (ROI) process was conducted to identify the key barriers and enablers that could inform an effective action plan for future ROI studies. This final examination revealed numerous obstacles that hindered implementation, alongside various facilitators that contributed to the project's progress (see Table 13).
Level 5 ROI data was only calculated at the end of the entire three-year program. It should have been calculated and communicated every year.
The K-2 coaches and teachers kept on coaching, reading, and learning, so implementation kept growing.
Data was only collected on Levels 1-3 at the beginning. It should have been collected and tracked continuously throughout the three-year implementation, at least once a year.
Trend Lines for End of the Year Benchmark Assessment Performance Level: Meets
Figure 12: Projected Trend Lines for Meets and Exceeds Expectations
Using pre-intervention data as a base: a 61.33% pre-intervention average of children met or exceeded benchmark.
The timeframe must be sufficient to allow for performance change and normal influences in the operational environment.
Who is best positioned to know and assist in identifying these factors? Seek their input.
Appendix J: Job Aid: Steps to Determine the Most Feasible Method(s) of Isolation
Identify the key internal and external factors that could influence the measure in the performance setting during the established time frame (apply the 80/20 Rule). The factors are:
B) employee turnover; C) change in curriculum; E) change in program (dropped staff development); F) change in leadership
Select one of the L-4 measures that your program should influence
Increasing the number of students in the meets-and-exceeds benchmark performance levels.
Decreasing the number of students in the partially-meets and does-not-meet categories.
Identify the timeframe necessary to monitor progress and collect data to determine how the selected measure has changed.
Six years: three years to set the trend line before the intervention, and three years to measure impact after implementation.
Is it feasible to establish a control group arrangement? Yes
Can a trend be established from historical performance data on the selected measure and are the criteria for using trend analysis or forecasting methods met?
To establish the control group criteria, consider the factors identified in Step Three. Carefully select both the control and experimental groups, ensuring that the factors align with the established criteria. Consult experts who understand how these factors may affect the measurements; their insights will be valuable in designing an effective control group.
Use trend line or forecasting methods to isolate the effects
Consider logistical issues, ethical issues, contamination issues, and the ability to withhold the solution given the time frame in Step Two.
Use estimates from participants or other sources to isolate the effects
Identify who is best positioned to provide estimates of the impact of the training and of the other influencing factors from Step Three. Ask: are they positioned to know, are
If using trend analysis, be certain there are no other influencing factors. If using forecasting, verify the appropriate mathematical relationship with other
Isolation Methods Feasible Because… i.e Strengths Not Feasible Because… i.e Weaknesses
1 Control groups Could be done with matched pairs
In education it is difficult to have schools commit to several years of not using a tool (control group) that could help kids.
In education, performance is quantified through trend lines, which facilitate the collection and graphical representation of data Both teaching and learning generate substantial amounts of performance data over time, making trend lines a valuable tool for measuring educational outcomes.
To accurately assess the impact of an intervention on data trends, it is essential to cross-check trend data with the performer’s estimate of its effects This verification process helps determine the proportion of the total shift attributable to the intervention, accounting for various other influencing factors.
If other factors arise, a trend line is no longer feasible Forecasting would be appropriate as it could account for the additional variables.
Not needed unless other factors arise
To enhance the credibility of the trend line, it's essential to gather end users' or performers' estimates of impact in percentage form Ensuring that this feedback remains anonymous will help maintain its integrity and prevent bias.
5 Supervisor’s estimate of impact (percent) Easy to use/inexpensive Supervisors are too removed from the daily teaching to know what impacts teachers.
6 Management’s estimate of impact (percent) Easy to use/inexpensive Senior management is even farther removed from classrooms than supervisors.
7 Use of experts/previous studies: Previous studies were used to obtain conservative standardized values of the cost of educating Special Education, Title I, and General Education students.
Finding standardized estimates for each year is not feasible with limited resources, so very conservative baseline data (the 2000 and 2002 standardized values) were used throughout the project.
8 Calculate/estimate the impact of the other factors: Would help to isolate the effects of the tool if other factors arise.
Not needed unless other factors arise; in that case, a less expensive forecasting method would be implemented.
9 Customer inputs: Not applicable. Other North Star customers may have different interventions in place, so the data would not be applicable to Brainerd.
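The participant-estimate isolation method described above can be sketched as a calculation. Following the conservative Phillips convention, each person's estimate of the tool's impact is discounted by their own stated confidence before averaging. All figures below are hypothetical illustrations, not the study's actual data:

```python
# Conservative participant-estimate isolation (Phillips-style sketch).
# All numbers are hypothetical, not figures from this study.

# Each participant reports the % of improvement they attribute to the
# tool, and how confident (%) they are in that estimate.
estimates = [
    {"impact_pct": 60, "confidence_pct": 80},
    {"impact_pct": 50, "confidence_pct": 70},
    {"impact_pct": 75, "confidence_pct": 60},
]

# Discount each estimate by its confidence (the conservative adjustment),
# then average across participants.
adjusted = [e["impact_pct"] / 100 * e["confidence_pct"] / 100 for e in estimates]
isolation_factor = sum(adjusted) / len(adjusted)

# Apply the factor to the total observed improvement (hypothetical dollars).
total_improvement = 100_000
attributable = total_improvement * isolation_factor
print(f"Isolation factor: {isolation_factor:.2%}")
print(f"Improvement attributable to the tool: ${attributable:,.0f}")
```

Discounting by confidence deliberately understates the program's contribution, which is what keeps the later ROI figure defensible to skeptical stakeholders.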
Table copied by permission of the developer, K. Minchella, Ph.D. (2005).
Salaries and Employee Benefits, HRD Staff (No. of People x Average Salary x Employee Benefits Factor x No. of Hours on Project) $9850
Meals, Travel, and Incidental Expenses
Salaries and Employee Benefits (No. of People x Avg Salary x Employee Benefits Factor x No. of Hours on Project) $74,898
Meals, Travel, and Incidental Expenses
Participants (No. of Participants x Avg Salary x Employee Benefits Factor x Hours or Days of Training Time)
Meals, Travel, and Accommodations (No. of Participants x Avg Daily Expenses x Days of Training)
Participant Replacement Costs (if applicable)
Meals, Travel, and Incidental Expenses
Equipment Expenses (offsite, part of development expenses)
Salaries and Employee Benefits, HRD Staff (No. of People x Avg Salary x Employee Benefits Factor x No. of Hours on Project)
Meals, Travel, and Incidental Expenses (to conferences for marketing)
Printing and Reproduction (Brochures/ Ad) $570
Outside Services (Web site license fees) $150
Salaries and Employee Benefits, HRD Staff (No. of People x Avg Salary x Employee Benefits Factor x No. of Hours on Project)
Meals, Travel, and Incidental Expenses
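The salary lines in the worksheet above all follow the same formula (No. of People x Avg Salary x Benefits Factor x Hours on Project), and the fully loaded program cost is the sum of every line across the analysis, development, delivery, marketing, and evaluation categories. A minimal sketch; every figure other than the $9850 and $74,898 line items reported above is a hypothetical placeholder:

```python
# Fully loaded cost worksheet sketch. Figures other than the reported
# $9,850 and $74,898 line items are hypothetical placeholders.

def salary_cost(people: int, avg_hourly_salary: float,
                benefits_factor: float, hours: float) -> float:
    """Salaries and employee benefits for one worksheet line:
    No. of People x Avg Salary x Benefits Factor x Hours on Project."""
    return people * avg_hourly_salary * benefits_factor * hours

# Hypothetical illustration of one salary line item:
line = salary_cost(people=2, avg_hourly_salary=35.0,
                   benefits_factor=1.4, hours=50)
print(f"Line item: ${line:,.2f}")

# The program total sums all worksheet lines; here, just the two
# salary lines reported in the worksheet above:
partial_total = 9850 + 74898  # plus the remaining line items
print(f"Partial total: ${partial_total:,}")
```

Including the benefits factor and participant replacement time is what makes the cost "fully loaded," which the ROI methodology requires so that benefits are never compared against an understated denominator.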
Appendix M: Converting Trend Line Data to Monetary Values
Table 2: Average Per Pupil Cost per Target Zone
Average Yearly Per Pupil Cost of Educating a Child
DNM (Does Not Meet Expectations): Special Education $12,474
Partially (Partially Meets Expectations): Title I $7667
Meets (Meets Expectations): General Education $6556
Table 19: 2004-2005 Actual Student Cost Per Targeted Assistance Zone
2004 – 2005 Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Table 20: 2005-2006 Student Cost Per Targeted Assistance Zone
2005 – 2006 Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Table 21: 2006-2007 Student Cost Per Targeted Assistance Zone
2006-2007 Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Table 22: Calculating average number of students per year to use for projections
Year Average Number of Students
Table 23: 2007-2008 Projected Student Cost Per Targeted Assistance Zone
2007-2008 PROJECTED Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Average cost per student $7999 (rounded)
Table 24: 2008-2009 Projected Student Cost Per Targeted Assistance Zone
2008-2009 PROJECTED Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Table 25: 2009-2010 Projected Student Cost Per Targeted Assistance Zone
2009-2010 PROJECTED Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Average cost per student $7701 (rounded)
Table 26: Pre-Intervention Average Cost Per Child
Pre-Intervention PROJECTED Average Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Average cost per student $8217 (rounded)
Table 27: Projected 3 Year Average Per Child
Post Intervention PROJECTED Average Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Average cost per student $8074 (rounded)
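The zone averages in Tables 26 and 27 are percent-weighted means of the standardized per-pupil costs from Table 2 (Special Education $12,474, Title I $7667, General Education $6556). A minimal sketch of that weighting, using a hypothetical proficiency distribution rather than the study's actual percentages:

```python
# Standardized per-pupil costs from Table 2 of the report.
zone_cost = {
    "Does Not Meet (Special Education)": 12474,
    "Partially Meets (Title I)": 7667,
    "Meets (General Education)": 6556,
}

# Hypothetical proficiency distribution (fraction of children per zone);
# the study's actual percentages would be substituted here.
distribution = {
    "Does Not Meet (Special Education)": 0.20,
    "Partially Meets (Title I)": 0.30,
    "Meets (General Education)": 0.50,
}

# Weighted average cost per student across the three zones.
avg_cost = sum(zone_cost[z] * distribution[z] for z in zone_cost)
print(f"Average cost per student: ${avg_cost:,.0f} (rounded)")
```

As children shift from the Does Not Meet zone toward the Meets zone, the weighted average falls, which is why the post-intervention projected average ($8074) is lower than the pre-intervention average ($8217).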
Target Audience Key Message Objective Approach Frequency Responsibility Comments/
Roll out Superintendent and school board
Value of ROI; appreciation of results-based measures
Approval of intervention; sponsorship granting resources
Leadership Meeting face to face July 14, 2007 District Trainer Educate Supt on the process of ROI and the impact on district
Roll out Principals Value of ROI
Impact on teams Gain support Leadership Meeting face to face July 28, 2007 District Trainer Educate principals on the process of ROI and the impact on the district
Value of ROI Impact on kids and teams
Gain support and address needs or issues
August 25, 2007 District Trainer and Coaches
Educate them on the process of ROI and the impact on district
Planning principals Director of Learning District Trainer
Explain Needs Assessment Gap Analysis Performance Issues
Discussion face to face August 4
Sept 18 District Trainer Use online survey
Collection All Stakeholders Follow up
Gather follow-up data for Levels 1-2
Questionnaire Reported at June 8, 2007 Meeting
Follow up May 17, 2007 District Trainer Levels 1 and 2 data collection and analysis
Superintendent Principals Dir of Teaching District Trainer
Initial and ongoing findings keep informed so able to encourage other stakeholders
Face-to-face PowerPoint and Web presentations of data and impact
District Trainer Fidelity of collection how to have team meetings to analyze
Coaches: fidelity analysis, action research, data review and analysis, action planning, coaching
2 hr Training August 25, 2007 Team meetings (1.5 hr per class)
Logistics, Fidelity of collection, Team meetings to analyze, Levels 3-4 analysis
Superintendent and school board: report success, communicate results, seek support for ongoing revision
1 hr meeting PowerPoint and discussion Dec. 13, 2010 District Trainer; be sure to be conservative and re-explain the ROI process.
Director of Teaching and Learning, Principals: report success, seek feedback, communicate results, seek more input for future revisions
Face-to-face PowerPoint, ongoing blog, end-of-program reporting on all 5 levels Dec. 13, 2010 District Trainer; gather and analyze new needs assessment data
Coaches: report success, seek feedback, communicate results, seek more input for future revisions
Face-to-face PowerPoint, ongoing blog, end-of-program reporting on all 5 levels Dec. 13, 2010
How did it impact efficacy and stress? Gather new needs assessment data
Appendix O: Action Plan for Improvement
Table 29: Action Plan for Improvement
Develop a plan of implementation for improving measurement and evaluation in your organization.
Consider all of the items included in all modules. Identify a particular time frame and key responsibilities.
Issue Actions Media Selection Time Responsibility
1 Perception of HR: Deliver a PowerPoint that shows the benefits of the District Trainer’s role in the project.
PowerPoint and discussion with stakeholder leadership team
Gather data and analyze issues stakeholders are having with the project.
Online survey, informal interviews, focus groups
3 Objectives Develop Objectives together face-to-face work session with stakeholders
District Trainer, Stakeholder team
Five-point Likert-scale questionnaire. Meet as a team and analyze results of questionnaires and focus groups.
5 Learning measures: review quarterly data, action plans, and reflections. Team meetings, coaching action plans.
Review North Star data to see shifts in student achievement. Team meeting, action plan and implementation, North Star.
District Trainer Director of teaching and learning, Coaches
7 Impact measures Review North Star
Power Point, and North Star
District Trainer Director of teaching and learning, Coaches Leadership team
8 ROI measures Conduct ROI analysis Data, North Star, costs work sheets
9 Use of technology: use existing computer lab; develop and implement training on North Star. Computer lab, smart board, Mobi.
10 Communicating results: create finished ROI study; create executive summary or news articles. Work sessions with coaches; face-to-face presentation to stakeholders.
Executive summaries sprinkled in waiting rooms around town
Appendix P: End of Program Calculated Amounts
Table 2: Average Per Pupil Cost per Target Zone
Average Yearly Per Pupil Cost of Educating a Child
DNM (Does Not Meet Expectations): Special Education $12,474. Source: Chambers, J.G., Parrish, T &
Partially (Partially Meets Expectations): Title I $7667. Source: US Department of Education.
Meets (Meets Expectations): General Education $6556. Source: Chambers, J.G., Parrish, T &
Table 30: 2007-2008 Student Cost Per Targeted Assistance Zone
2007-2008 Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Table 31: 2008-2009 Student Cost Per Targeted Assistance Zone
2008-2009 Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Average cost per student $7453 (rounded)
Table 32: 2009-2010 Student Cost Per Targeted Assistance Zone
2009-2010 Student Cost Per Targeted Assistance Zone
Proficiency Levels Percent of Children Number of Children Cost to Educate for
Average cost per student $7201 (rounded)
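With pre- and post-intervention average costs in hand, benefits and ROI follow from the standard Phillips formulas: BCR = benefits / costs, and ROI% = (benefits - costs) / costs x 100. A sketch of the mechanics using the report's projected averages from Tables 26 and 27; the student count and fully loaded program cost below are hypothetical placeholders, not the study's actual figures:

```python
# Phillips ROI formulas applied to the projected averages (sketch).
# Student count and program cost are hypothetical placeholders.

pre_avg_cost = 8217     # pre-intervention projected average (Table 26)
post_avg_cost = 8074    # post-intervention projected average (Table 27)
n_students = 1000       # hypothetical
program_cost = 100_000  # hypothetical fully loaded cost

# Monetary benefit: per-student savings times the number of students.
benefits = (pre_avg_cost - post_avg_cost) * n_students

bcr = benefits / program_cost                        # benefit-cost ratio
roi_pct = (benefits - program_cost) / program_cost * 100

print(f"Benefits: ${benefits:,}")
print(f"BCR: {bcr:.2f}")
print(f"ROI: {roi_pct:.0f}%")
```

Because both the isolation factor and the standardized per-pupil costs were chosen conservatively earlier in the analysis, the resulting ROI percentage is a lower-bound estimate, which strengthens its credibility with stakeholders.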