Evaluation of the Achievement Levels for Mathematics and Reading on the National Assessment of Educational Progress

Evaluation of the Achievement Levels for Mathematics and Reading on the National Assessment of Educational Progress

Committee on the Evaluation of NAEP Achievement Levels for Mathematics and Reading

Christopher Edley, Jr., and Judith A. Koenig, Editors

Board on Testing and Assessment and Committee on National Statistics

Division of Behavioral and Social Sciences and Education

THE NATIONAL ACADEMIES PRESS, 500 Fifth Street, NW, Washington, DC 20001

This activity was supported by Contract No. ED-IES-14-C-0124 from the U.S. Department of Education. Any opinions, findings, conclusions, or recommendations expressed in this publication do not necessarily reflect the views of any organization or agency that provided support for the project.

International Standard Book Number-13: 978-0-309-43817-9
International Standard Book Number-10: 0-309-43817-9
Digital Object Identifier: 10.17226/23409

Additional copies of this report are available for sale from the National Academies Press, 500 Fifth Street, NW, Keck 360, Washington, DC 20001; (800) 624-6242 or (202) 334-3313; http://www.nap.edu

Copyright 2016 by the National Academy of Sciences. All rights reserved.

Printed in the United States of America

Suggested citation: National Academies of Sciences, Engineering, and Medicine. (2016). Evaluation of the Achievement Levels for Mathematics and Reading on the National Assessment of Educational Progress. Washington, DC: The National Academies Press. doi: 10.17226/23409.

The National Academy of Sciences was established in 1863 by an Act of Congress, signed by President Lincoln, as a private, nongovernmental institution to advise the nation on issues related to science and technology. Members are elected by their peers for outstanding contributions to research. Dr. Marcia McNutt is president.

The National Academy of Engineering was established in 1964 under the charter of the National Academy of Sciences to bring the practices of engineering to advising the nation. Members are elected by their peers for extraordinary contributions to engineering. Dr. C. D. Mote, Jr., is president.

The National Academy of Medicine (formerly the Institute of Medicine) was established in 1970 under the charter of the National Academy of Sciences to advise the nation on medical and health issues. Members are elected by their peers for distinguished contributions to medicine and health. Dr. Victor J. Dzau is president.

The three Academies work together as the National Academies of Sciences, Engineering, and Medicine to provide independent, objective analysis and advice to the nation and conduct other activities to solve complex problems and inform public policy decisions. The National Academies also encourage education and research, recognize outstanding contributions to knowledge, and increase public understanding in matters of science, engineering, and medicine. Learn more about the National Academies of Sciences, Engineering, and Medicine at www.national-academies.org.

Reports document the evidence-based consensus of an authoring committee of experts. Reports typically include findings, conclusions, and recommendations based on information gathered by the committee and committee deliberations. Reports are peer reviewed and are approved by the National Academies of Sciences, Engineering, and Medicine.
Proceedings chronicle the presentations and discussions at a workshop, symposium, or other convening event. The statements and opinions contained in proceedings are those of the participants and have not been endorsed by other participants, the planning committee, or the National Academies of Sciences, Engineering, and Medicine.

For information about other products and activities of the National Academies, please visit nationalacademies.org/whatwedo.

COMMITTEE ON THE EVALUATION OF NAEP ACHIEVEMENT LEVELS

CHRISTOPHER EDLEY, JR. (Chair), School of Law, University of California, Berkeley
PETER AFFLERBACH, Department of Teaching and Learning, Policy and Leadership, University of Maryland
SYBILLA BECKMANN, Department of Mathematics, University of Georgia
H. RUSSELL BERNARD, Institute for Social Science Research, Arizona State University, and Department of Anthropology, University of Florida
KARLA EGAN, National Center for the Improvement of Educational Assessment, Dover, NH
DAVID J. FRANCIS, Department of Psychology, University of Houston
MARGARET E. GOERTZ, Graduate School of Education, University of Pennsylvania (emerita)
LAURA HAMILTON, RAND Education, RAND Corporation, Pittsburgh, PA
BRIAN JUNKER, Department of Statistics, Carnegie Mellon University
SUZANNE LANE, School of Education, University of Pittsburgh
SHARON J. LEWIS, Council of the Great City Schools, Washington, DC (retired)
BERNARD L. MADISON, Department of Mathematics, University of Arkansas
SCOTT NORTON, Standards, Assessment, and Accountability, Council of Chief State School Officers, Washington, DC
SHARON VAUGHN, College of Education, University of Texas at Austin
LAURESS WISE, Human Resources Research Organization, Monterey, CA

JUDITH KOENIG, Study Director
JORDYN WHITE, Program Officer
NATALIE NIELSEN, Acting Board Director (until June 2015)
PATTY MORISON, Acting Board Director (from June 2015)
KELLY ARRINGTON, Senior Program Assistant

BOARD ON TESTING AND ASSESSMENT

DAVID J. FRANCIS (Chair), Texas Institute for Measurement, Evaluation, and Statistics, University of Houston
MARK DYNARSKI, Pemberton Research, LLC, East Windsor, NJ
JOAN HERMAN, National Center for Research on Evaluation, Standards, and Student Testing, University of California, Los Angeles
SHARON LEWIS, Council of Great City Schools, Washington, DC
BRIAN STECHER, Education Program, RAND Corporation, Santa Monica, CA
JOHN ROBERT WARREN, Department of Sociology, University of Minnesota

NATALIE NIELSEN, Acting Director (until June 2015)
PATTY MORISON, Acting Director (from June 2015)

COMMITTEE ON NATIONAL STATISTICS

LAWRENCE D. BROWN (Chair), Department of Statistics, Wharton School, University of Pennsylvania
JOHN M. ABOWD, School of Industrial and Labor Relations, Cornell University
FRANCINE BLAU, Department of Economics, Cornell University
MARY ELLEN BOCK, Department of Statistics, Purdue University
MICHAEL E. CHERNEW, Department of Health Care Policy, Harvard Medical School
DON A. DILLMAN, Department of Sociology, Washington State University
CONSTANTINE GATSONIS, Center for Statistical Sciences, Brown University
JAMES S. HOUSE, Survey Research Center, Institute for Social Research, University of Michigan
MICHAEL HOUT, Survey Research Center, University of California, Berkeley
THOMAS L. MESENBOURG, U.S. Census Bureau (retired)
SUSAN A. MURPHY, Department of Statistics, University of Michigan
SARAH M. NUSSER, Department of Statistics, Center for Survey Statistics and Methodology, Iowa State University
COLM A. O'MUIRCHEARTAIGH, Harris Graduate School of Public Policy Studies, University of Chicago
RUTH D. PETERSON, Criminal Justice Research Center, Ohio State University
ROBERTO RIGOBON, Sloan School of Management, Massachusetts Institute of Technology
EDWARD H. SHORTLIFFE, Biomedical Informatics, Columbia University and Arizona State University

CONSTANCE F. CITRO, Director
BRIAN HARRIS-KOJETIN, Deputy Director

PREFACE

Since 1969 the National Assessment of Educational Progress (NAEP) has been providing policy makers, educators, and the public with reports on the academic performance and progress of the nation's students. Over the years, NAEP has been updated in ways that kept it current with the ever-changing educational and policy context but maintained the trend lines it was designed to track. By the late 1980s, there were concerns about the extent to which U.S. students were learning what they need to be globally competitive in the 21st century. There were calls to set high, world-class achievement goals for U.S. students. In response, NAEP adopted standards-based reporting for the subjects and grades it assessed. Today, some 24 years later, the focus on achievement standards is even more intense.

At its best, standards-based reporting can provide a quick way to summarize students' achievement and track their progress. It can clearly demarcate disparities between what we expect our students to know and be able to do, as articulated in the standards, and what they actually know and can do. It can stimulate policy conversations about educational achievement across the country, identifying areas and groups with high performance as well as those with troubling disparities. It can inform policy interventions and reform measures to improve student learning.

There are potential downsides, however. Standards-based reporting can lead to erroneous interpretations. It can overstate or understate progress, particularly when the goal is to monitor the performance of subgroups. In its attempt to be easily understood by all audiences, it can lead to over-simplifications, such as when users do not do the necessary work to ensure their understandings are correct.

At a time when policy makers are vitally interested in ensuring that all of the country's students achieve high standards, it is critical that test results are reported in a way that leads to accurate and valid interpretations, and that the standards used in reporting deserve the confidence of policymakers, practitioners, and the public. The U.S. Department of Education turned to the National Academies of Sciences, Engineering, and Medicine to address these questions, and the Academies set up the Committee on the Evaluation of NAEP Achievement Levels for Mathematics and Reading. The purpose of the project was to evaluate the extent to which NAEP's standards (or achievement levels) are reliable and valid. Are they reasonable? Are they informative to the public? Do they lead to appropriate interpretations?
The committee of 15 brought a broad range of education and assessment experience. Our consensus findings, conclusions, and recommendations are documented in this report.

The committee benefited from the work of many others, and we wish to thank the many individuals who assisted us. We first thank the sponsor who supported this work: the U.S. Department of Education and the staff with the department's Institute of Education Sciences who oversaw our work, Jonathon Jacobson and Audrey Pendleton. During the course of its work, the committee met three times in person and four times by video conference. The committee's first meeting included a public session.

Mullis, I.V.S., Martin, M.O., and Foy, P. (2008). TIMSS 2007 International Mathematics Report: Findings from IEA's Trends in International Mathematics and Science Study at the Fourth and Eighth Grades. Chestnut Hill, MA: TIMSS & PIRLS International Study Center, Boston College.
The Nation's Report Card. (2013). 2013 Mathematics and Reading: Grade 12 Assessments. Available: http://www.nationsreportcard.gov/reading_math_g12_2013/#/whatknowledge [August 2016].
The Nation's Report Card. (2015a). Knowledge and Skills: Grade 4. Available: http://www.nationsreportcard.gov/reading_math_2015/#mathematics/questions?grade=4 [August 2016].
The Nation's Report Card. (2015b). National Achievement Level Results: Grade 4 Mathematics. Available: http://www.nationsreportcard.gov/reading_math_2015/#mathematics/acl?grade=4 [August 2016].
The Nation's Report Card. (2015c). National Achievement Level Results: Grade 4 Reading. Available: http://www.nationsreportcard.gov/reading_math_2015/#reading/acl?grade=4 [August 2016].
The Nation's Report Card. (2015d). National Achievement Level Results: Grade 8 Mathematics. Available: http://www.nationsreportcard.gov/reading_math_2015/#mathematics/acl?grade=8 [August 2016].
The Nation's Report Card. (2015e). National Achievement Level Results: Grade 8 Reading. Available: http://www.nationsreportcard.gov/reading_math_2015/#reading/acl?grade=8 [August 2016].
The Nation's Report Card. (2015f). State Achievement Level Results: Grade 4. Available: http://www.nationsreportcard.gov/reading_math_2015/#mathematics/state/acl?grade=4 [August 2016].
The Nation's Report Card. (2015g). State Results Overview: Grade. Available: http://www.nationsreportcard.gov/reading_math_2015/#mathematics/state?grade= [August 2016].
The Nation's Report Card. (2016). Homepage. Available: http://www.nationsreportcard.gov/ [March 2016].
National Academy of Education. (1993a). Setting Performance Standards for Student Achievement: A Report of the National Academy of Education Panel on the Evaluation of the NAEP Trial State Assessment: An Evaluation of the 1992 Achievement Levels. Washington, DC: Author.
National Academy of Education. (1993b). Setting Performance Standards for Student Achievement: Background Studies. Washington, DC: Author.
National Assessment Governing Board. (1990). Setting Appropriate Achievement Levels for the National Assessment of Educational Progress: Policy Framework and Technical Procedures. Washington, DC: Author.
National Assessment Governing Board. (1993). Setting Achievement Levels on the National Assessment of Educational Progress: Policy Statement. Washington, DC: Author.
National Assessment Governing Board. (1995). Developing Student Performance Levels for the National Assessment of Educational Progress: Policy Statement. Washington, DC: Author.
National Assessment Governing Board. (2006). Reporting, Release, and Dissemination of NAEP Results: Policy Statement. Washington, DC: Author.
National Assessment Governing Board. (2015). Mathematics Framework for the 2015 National Assessment of Educational Progress. Available: http://www.edpubs.gov/document/ed005557p.pdf?ck=15 [February 2016].
National Center for Education Statistics. (2012). NAEP: Looking Ahead—Leading Assessments into the Future. Washington, DC: Author.
National Center for Education Statistics. (2013a). Are the Nation's 12th-Graders Making Progress in Mathematics and Reading? Washington, DC: Institute of Education Sciences, U.S. Department of Education.
National Center for Education Statistics. (2013b). The Nation's Report Card: A First Look: 2013 Mathematics and Reading (NCES 2014-451). Washington, DC: Institute of Education Sciences, U.S. Department of Education.
National Center for Education Statistics. (2016). History of Framework Changes. Available: https://nces.ed.gov/nationsreportcard/mathematics/frameworkcomparison.aspx [October 2016].
National Commission on Excellence in Education. (1983). A Nation at Risk: The Imperative for Educational Reform. Washington, DC: U.S. Department of Education.
National Education Goals Panel. (1999). The National Education Goals Report: Building a Nation of Learners, 1999. Washington, DC: U.S. Government Printing Office.
National Governors' Association. (1991). Time for Results: The Governors' 1991 Report on Education. Washington, DC: Author.
National Research Council. (1999). Grading the Nation's Report Card: Evaluating NAEP and Transforming the Assessment of Educational Progress. Committee on the Evaluation of National and State Assessments of Educational Progress. J.W. Pellegrino, L.R. Jones, and K.J. Mitchell (Eds.). Board on Testing and Assessment, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academy Press.
National Research Council. (2001). Adding It Up: Helping Children Learn Mathematics. J. Kilpatrick, J. Swafford, and B. Findell (Eds.). Mathematics Learning Committee, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: National Academies Press.
National Research Council. (2002). Scientific Research in Education. Committee on Scientific Principles for Education Research. R.J. Shavelson and L. Towne (Eds.). Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academy Press.
National Research Council. (2005). Measuring Literacy: Performance Levels for Adults. Committee on Performance Levels for Adult Literacy. R.M. Hauser, C.F. Edley, Jr., J.A. Koenig, and S.W. Elliott (Eds.). Board on Testing and Assessment, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Research Council. (2008). Assessing Accomplished Teaching: Advanced-Level Certification Programs. Committee on Evaluation of Teacher Certification by the National Board for Professional Teaching Standards. M.D. Hakel, J.A. Koenig, and S.W. Elliott (Eds.). Board on Testing and Assessment, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academy Press.
National Research Council. (2011a). Allocating Federal Funds for State Programs for English Language Learners. Panel to Review Alternative Data Sources for the Limited-English Proficiency Allocation Formula under Title III, Part A, Elementary and Secondary Education Act. Committee on National Statistics; Board on Testing and Assessment; Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Research Council. (2011b). Successful K-12 STEM Education: Identifying Effective Approaches in Science, Technology, Engineering, and Mathematics. Committee on Highly Successful Schools or Programs in K-12 STEM Education. Board on Science Education, Board on Testing and Assessment, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Research Council. (2013). Monitoring Progress toward Successful K-12 STEM Education: A Nation Advancing? Committee on the Evaluation Framework for Successful K-12 STEM Education. Board on Science Education; Board on Testing and Assessment; Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academy Press.
Nitko, A.J. (Ed.). (1991). June 1991 [Special Edition]. Educational Measurement: Issues and Practice, 10(2), 2-41.
Norcini, J.J. (2003). Setting standards on educational tests. Medical Education, 37(5), 464-469.
Norcini, J.J., Lipner, R.S., Langdon, L.O., and Strecker, C.A. (1987). A comparison of three variations on a standard-setting method. Journal of Educational Measurement, 24, 56-64.
Norcini, J.J., and Shea, J.A. (1992). The reproducibility of standards over groups and occasions. Applied Measurement in Education, 5, 63-72.
Norcini, J.J., Shea, J.A., and Kanya, D.T. (1988). The effect of various factors on standards setting. Journal of Educational Measurement, 25, 57-64.
OECD. (2009). PISA 2009 Assessment Framework: Key Competencies in Reading, Mathematics and Science. Washington, DC: OECD Publishing.
OECD. (2014). PISA 2012 Results: What Students Know and Can Do – Student Performance in Mathematics, Reading and Science (Volume I, Revised edition, February 2014). Washington, DC: OECD Publishing.
Pearson, D., and DeStefano, L. (1993). An evaluation of the 1992 NAEP reading achievement levels, report two: An analysis of the achievement-level descriptions. In L.A. Shepard, R. Glaser, R. Linn, and G. Bohrnstedt, Setting Performance Standards for Student Achievement. Report of the NAE Panel on the Evaluation of the NAEP Trial State Assessment: An Evaluation of the 1992 Achievement Levels (pp. 183-203). Stanford, CA: The National Academy of Education.
Pellegrino, J.W. (2000). A response to ACT's technical advisors on NAEP standard settings. Educational Measurement: Issues and Practice, 19(2), 14-15.
Perie, M. (2008). A guide to understanding and developing performance-level descriptors. Educational Measurement: Issues and Practice, 27(4), 15-29.
Phillips, G.W. (2007). Expressing International Educational Achievement in Terms of U.S. Performance Standards: Linking NAEP Achievement Levels to TIMSS. Washington, DC: American Institutes for Research.
Pitoniak, M., Dion, G., and Garber, D. (2010). Final Report on the Study to Draft Achievement-Level Descriptions for Reporting Results of the 2009 National Assessment of Educational Progress in Mathematics for Grade 12. Princeton, NJ: Educational Testing Service.
Plake, B.S., Huff, K., and Reshetar, R. (2010). Evidence-centered assessment design as a foundation for achievement-level descriptor development and for standard setting. Applied Measurement in Education, 23(4), 342-357.
Plake, B.S., and Impara, J.C. (2001). Ability of panelists to estimate item performance for a target group of candidates: An issue in judgmental standard setting. Educational Assessment, 7(2), 87-97.
Plake, B.S., Impara, J.C., and Potenza, M.T. (1994). Content specificity of expert judgements in a standard-setting study. Journal of Educational Measurement, 31, 339-347.
Plake, B.S., Mellican, G.J., and Mills, C.N. (1991). Factors influencing intra-judge consistency during standard-setting. Educational Measurement: Issues and Practice, 10(2), 15-16, 22, 25-26.
Reckase, M.D. (2001). The controversy over the National Assessment Governing Board standards. Brookings Papers on Education Policy 2001, 4, 231-265.
Schmidt, W.H., Houang, R., and Cogan, L. (2002). A coherent curriculum: The case of mathematics. American Educator, 26(2), 10-26.
Schmidt, W.H., Wang, H.C., and McKnight, C.C. (2005). Curriculum coherence: An examination of US mathematics and science content standards from an international perspective. Journal of Curriculum Studies, 37(5), 525-559.
Shepard, L.A. (1976). Setting standards and living with them. Florida Journal of Educational Research, 18, 28-32.
Shepard, L.A., Glaser, R., Linn, R., and Bohrnstedt, G. (1993). Setting Performance Standards for Student Achievement. Report of the NAE Panel on the Evaluation of the NAEP Trial State Assessment: An Evaluation of the 1992 Achievement Levels. Stanford, CA: The National Academy of Education.
Schulz, E.M., Lee, W., and Mullen, K. (2005). A domain-level approach to describing growth in achievement. Journal of Educational Measurement, 42(1), 1-26.
Schulz, E.M., and Mitzel, H.C. (n.d.). The Mapmark Standard Setting Method. Available: http://files.eric.ed.gov/fulltext/ED490643.pdf [April 2016].
Silver, E.A., and Kenney, P.A. (1993). Expert panel review of the 1992 NAEP mathematics achievement levels.
Simmons, C., and Mwalimu, M. (2000). What NAEP's public have to say. In M.L. Bourque and S. Byrd (Eds.), Student Performance Standards on the National Assessment of Educational Progress: Affirmations and Improvements (pp. 183-219). Washington, DC: National Assessment Governing Board.
Sireci, S.G., Hauger, J.B., Wells, C.S., Shea, C., and Zenisky, A.L. (2009). Evaluation of the standard setting on the 2005 grade 12 National Assessment of Educational Progress mathematics tests. Applied Measurement in Education, 22(4), 339-358.
Skorupski, W., and Hambleton, R.K. (2005). What are panelists thinking when they participate in standard setting studies? Applied Measurement in Education, 18(3), 233-255.
Smith, R.L., and Smith, J.K. (1988). Differential use of item information by judges using Angoff and Nedelsky procedures. Journal of Educational Measurement, 25, 259-274.
Stufflebeam, D.L. (2001). Lessons in contracting for evaluations. American Journal of Evaluation, 21(3), 293-314.
Stufflebeam, D.L., Jaeger, R.M., and Scriven, M. (1991). Summative Evaluation of the National Assessment Governing Board's Inaugural 1990-91 Effort to Set Achievement Levels on the National Assessment of Educational Progress. Washington, DC: National Assessment Governing Board.
Subkoviak, M.J., Kane, M.T., and Duncan, P.H. (2002). A comparative study of the Angoff and Nedelsky methods: Implications for validity. Mid-Western Educational Researcher, 15(2), 3-7.
U.S. Department of Education. (2011a). Percentage of 4th-grade students reaching the TIMSS international benchmarks in mathematics, by education system: 2011. Washington, DC: Institute of Education Sciences, National Center for Education Statistics. Available: http://nces.ed.gov/timss/figure11_2.asp [April 2016].
U.S. Department of Education. (2011b). Percentage of 8th-grade students reaching the TIMSS international benchmarks in mathematics, by education system: 2011. Washington, DC: Institute of Education Sciences, National Center for Education Statistics. Available: http://nces.ed.gov/timss/figure11_4.asp [August 2016].
U.S. Department of Education. (2012). Mathematics literacy: Proficiency levels. Washington, DC: Institute of Education Sciences, National Center for Education Statistics. Available: https://nces.ed.gov/surveys/pisa/pisa2012/pisa2012highlights_3_1.asp [April 2016].
U.S. General Accounting Office. (1992). National Assessment Technical Quality. Report No. GAO/PEMD-92-22R. Washington, DC: Author.
U.S. General Accounting Office. (1993). Educational Achievement Standards: NAGB's Approach Yields Misleading Interpretations. Report No. GAO/PEMD-93-12. Washington, DC: Author.
Vinovskis, M.A. (1998). Overseeing the Nation's Report Card: The Creation and Evolution of the National Assessment Governing Board. Washington, DC: National Assessment Governing Board.
Watanabe, T. (2007). In pursuit of a focused and coherent school mathematics curriculum. The Mathematics Educator, 17(1), 2-6.
Wixson, K.K., Valencia, S.W., Murphy, S., and Phillips, G.W. (2013). A Study of NAEP Reading and Writing Frameworks and Assessments in Relation to the Common Core State Standards in English Language Arts. Washington, DC: American Institutes for Research.
Wyatt, J., Kobrin, J., Wiley, A., Camara, W.J., and Proestler, N. (2011). SAT Benchmarks: Development of a College Readiness Benchmark and its Relationship to Secondary and Postsecondary School Performance. New York, NY: The College Board. Available: https://research.collegeboard.org/sites/default/files/publications/2012/7/researchreport-2011-5-sat-college-readiness-benchmark-secondary-performance.pdf [April 2016].
Zieky, M.J. (2001). So much has changed: How the setting of cutscores has evolved since the 1980s. In G.J. Cizek (Ed.), Setting Performance Standards: Concepts, Methods, and Perspectives. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Zieky, M.J. (2012). So much has changed: An historical overview of setting cut scores. In G.J. Cizek (Ed.), Setting Performance Standards: Foundations, Methods, and Innovations, Second Edition (pp. 15-33). New York, NY: Routledge.
Zieky, M.J., Perie, M., and Livingston, S. (2008). Cutscores: A Manual for Setting Standards of Performance on Educational and Occupational Tests. Princeton, NJ: Educational Testing Service.
Zenisky, A., Hambleton, R.K., and Sireci, S.G. (2009). Getting the message out: An evaluation of NAEP score reporting practices with implications for disseminating test results. Applied Measurement in Education, 22(4), 359-375.

APPENDIX A
Agenda for Public Forum

INTERPRETATIONS AND USES OF NAEP ACHIEVEMENT LEVELS
May 27, 2015, 1:00-5:00

Committee on the Evaluation of NAEP Achievement Levels in Reading and Math
National Academy of Sciences Building, Lecture Room
2101 Constitution Ave., NW, Washington, DC

AGENDA

This session is sponsored by the National Academy of Sciences' Committee on the Evaluation of NAEP Achievement Levels in Reading and Math, which is charged with evaluating the extent to which NAEP achievement levels are reasonable, reliable, valid, and informative to the public. The Committee's goal for the session is to gather information on uses/interpretations of NAEP results that will help to guide their evaluation. The session is separated into parts, each led by a group of panelists from a variety of perspectives. The panel discussions will each be facilitated by a committee member, with the goal of having a free-flowing, moderated conversation among the panelists, audience, and committee members.

1:00 WELCOME, OVERVIEW OF AGENDA
Brian Junker, Carnegie Mellon, Committee Member

1:10 PANEL DISCUSSION 1: EDUCATION WRITER PERSPECTIVES
Facilitator: Brian Junker, Carnegie Mellon, Committee Member
• Sarah Butrymowicz, Hechinger Report
• Catherine Gewertz, Education Week
• Lyndsey Layton, Washington Post
• Emily Richmond, Education Writers Association
• Bob Rothman, Alliance for Excellent Education

1:55 PANEL DISCUSSION 2: STATE AND LOCAL POLICY PERSPECTIVES
Facilitator: Scott Norton, CCSSO, Committee Member
• Michael Casserly, Council of Great City Schools
• Scott Jenkins, National Governors Association
• Wendy Geiger, Virginia Department of Education
• Nathan Olson, Washington Department of Education

2:40 Break

2:55 PANEL DISCUSSION 3: EDUCATION POLICY AND ADVOCACY PERSPECTIVES
Facilitator: Laura Hamilton, RAND, Committee Member
• Patte Barth, National School Board Association
• Renee Jackson, National PTA
• Sonja Brookins Santelises, Education Trust
• Dara Zeehandelaar, Fordham Institute

3:40 PANEL DISCUSSION 4: USES OF NAEP ACHIEVEMENT LEVELS FOR ASSESSMENTS OF THE COMMON CORE STATE STANDARDS
Facilitator: Suzanne Lane, University of Pittsburgh, Committee Member
• Enis Dogan, Partnership for Assessment of Readiness for College and Careers (PARCC)
• Jacqueline King, Smarter Balanced Assessment Consortium

4:10 PANEL DISCUSSION 5: SYNTHESIS
Facilitator: Brian Junker, Carnegie Mellon, Committee Member
• Michael Kane, ETS
• Lorrie Shepard, University of Colorado

4:50 Wrap Up, Final Q&A

5:00 Adjourn open session

Instructions for Panelists at the Public Forum Held on May 27, 2015

We know that NAEP achievement level results are used for the following general purposes:
• To compare student achievement overall and by student groups for the nation, states, and urban districts
• To compare U.S. student achievement with international benchmarks
• To compare with students' performance on the state assessment and serve as an audit of the results
• To serve as models/external benchmarks in developing achievement levels (and their descriptions) for other assessments

For the workshop, we would like to delve into these (and other) uses more deeply and explore the interpretations, decisions, and actions that result from the information. As a very simplistic example, when state NAEP results are released, state officials compare their current results with their past results, those of other states, and those from their own state assessment. They interpret the comparisons and make inferences about student performance. They communicate those inferences to others, and decisions may be made or actions may be taken.

Given that example, please consider how you (or the constituency you represent) use NAEP results. What comparisons do you make? What inferences do you draw from them? Who do you communicate them to? What decisions are made and what actions are taken?

NAEP reports performance results as mean scale scores and cumulative percentages of students who score at each achievement level (percent Basic and above, percent Proficient and above, percent Advanced). How do you use scale score information versus achievement level information?

Are there any particular groups that you track, such as at-risk students? If so, please discuss the information that you use, the kinds of inferences you make, and the actions/decisions that result.

When you examine the results, do you use any of the NAEP questionnaire data (e.g., to cross-tabulate the test scores by questionnaire responses)? If so, please explain what you do.

The performance level categories (Basic, Proficient, Advanced) each include a description (called Achievement Level Descriptions or Performance Level Descriptions). Do you use these descriptions? If so, please explain how you use them.
Appendix B
Biographical Sketches of Committee Members and Staff

CHRISTOPHER EDLEY, JR. (Chair) is the Honorable William H. Orrick, Jr., distinguished professor and faculty director at the Chief Justice Earl Warren Institute on Law and Social Policy at the School of Law of the University of California at Berkeley, where he previously served as dean. Earlier, he was a professor at Harvard Law School. His academic work is in administrative law, civil rights, education policy, and, generally, domestic public policy. His public policy work has included policy and budget positions under Presidents Jimmy Carter and Bill Clinton. He also served as a senior policy adviser in the presidential campaign of Barack Obama and on the transition board, with responsibility for education, immigration, and health. More recently, he cochaired the congressionally chartered National Commission on Education Equity and Excellence, which was charged to revisit the 1983 report, A Nation at Risk, and recommend future directions for reform. He is a member of the American Academy of Arts and Sciences, the National Academy of Public Administration, the Council on Foreign Relations, and the Gates Foundation's National Programs Advisory Panel. He has a B.A. in mathematics from Swarthmore College, an M.A. from the Harvard University John F. Kennedy School of Government, and a J.D. from Harvard Law School.

PETER AFFLERBACH is a professor of reading in the Department of Teaching and Learning, Policy and Leadership in the College of Education at the University of Maryland. His early career was as a teacher, first in grades K-6, then in remedial reading and writing in junior high school, and then in high school English. His research interests focus on reading assessment, reading comprehension strategies, and the verbal reporting methodology, especially aspects of individual differences that are sometimes neglected in reading theory and practice, including motivation and engagement, metacognition, student self-efficacy and self-concept, and epistemic beliefs. He is a long-time member of the Standing Reading Committee of the National Assessment of Educational Progress, and he was a member of the 2009 NAEP Reading Framework Committee and of the Feedback Committee for the Common Core State Standards/English Language Arts. He has served on numerous committees and panels for the Program for International Student Assessment (PISA), the National Assessment of Adult Literacy, and the National Accessible Reading Assessment Projects. He is an elected member of the International Reading Association's Reading Hall of Fame. He has an M.S. in developmental reading and a Ph.D. in reading psychology from the State University of New York at Albany.

SYBILLA BECKMANN is the Josiah Meigs distinguished teaching professor and director of the Mathematicians Educating Future Teachers program in the Department of Mathematics at the University of Georgia. Previously, she was a J.W. Gibbs instructor of mathematics at Yale University. Her major current research interests are in mathematical cognition, the mathematical education of teachers, and mathematics content from prekindergarten through grade. She also works on helping college faculty learn to teach mathematics for elementary and middle grade teachers. She served as a member of the writing team for mathematics curriculum for prekindergarten through grade for the National Council of Teachers of Mathematics, on the development of several state mathematics standards, and as a member of the mathematics writing team for the Common Core State Standards for Mathematics.
She is currently a member of the U.S. National Commission on Mathematics Instruction. She is the recipient of several teaching awards from the University of Georgia and of the Louise Hay Award for Contributions to Mathematics Education from the Association for Women in Mathematics. She has a Ph.D. in mathematics from the University of Pennsylvania.

H. RUSSELL BERNARD is a research professor of anthropology and director of the Institute for Social Science Research at Arizona State University. He is also professor emeritus of anthropology at the University of Florida. Previously, he has taught or done research at other U.S. universities and at universities in Greece, Japan, and Germany. He has also conducted field research in Greece, Mexico, and the United States. His areas of research include technology and social change, language death, and social network analysis. He has participated for many years in summer courses, sponsored by the U.S. National Science Foundation, on research methods and research design. He is a member of the National Academy of Sciences, and he is a recipient of the Franz Boas Award from the American Anthropological Association. He has a Ph.D. in anthropology from the University of Illinois at Urbana-Champaign.

KARLA EGAN is currently an independent consultant. Previously, she was an associate with the National Center for the Improvement of Educational Assessment, which provides technical support to national, state, and local education agencies on issues related to the design, development, implementation, and documentation of general assessments and alternate assessments. Before that, she was a senior research scientist for CTB/McGraw-Hill. Her work focuses in particular on standard setting, achievement-level descriptors, and test security. She also has worked on developing achievement-level descriptions and setting standards for the alternative assessments given to students with significant learning disabilities. She has designed and led over 40 standard setting workshops, created a nationally recognized framework to develop achievement-level descriptors, and implemented almost all major standard setting methods. She has a Ph.D. in sociology from the University of Massachusetts, Amherst.

DAVID J. FRANCIS is the Hugh Roy and Lillie Cranz Cullen distinguished university chair and director of the Texas Institute for Measurement, Evaluation, and Statistics at the University of Houston. At the university he also directs the Center for Research on Educational Achievement and Teaching of English Language Learners and the English-language learners strand of the Center on Instruction. His research is focused on language and literacy development in Spanish-speaking children, reading and reading disabilities, attention problems, developmental consequences of brain injuries and birth defects, and adolescent alcohol abuse. He is a member of the Independent Review Panel for the National Assessment of Title I, a former chair of the Advisory Council on Education Statistics, a member of the Technical Advisory Group of the What Works Clearinghouse, and a former member of the National Literacy Panel for Language Minority Children and Youth. He is a recipient of the Albert J. Harris Award from the International Reading Association and several awards from the University of Houston for career accomplishments in research, teaching, and service. He has a B.S. in psychology from Kalamazoo College and an M.A. and a Ph.D. in clinical neuropsychology from the University of Houston.
MARGARET E. GOERTZ is a professor emerita of education policy and a senior researcher at the Consortium for Policy Research in Education in the Graduate School of Education at the University of Pennsylvania. Previously, she taught at the Bloustein School of Planning and Public Policy at Rutgers University and was executive director of the Education Policy Research Division of Educational Testing Service. She has conducted research and led national and state-level studies on education policy and policy implementation, including state and local implementation of Title I, the No Child Left Behind Act, and the Individuals with Disabilities Education Act; the design and implementation of standards-based reform by state education agencies, school districts, and elementary and secondary schools; and the interface between federal and state accountability and school improvement policies. Most recently, she completed a study of how state education agencies are organized to manage and use evidence in their policies or practices to improve low-performing schools. She has a Ph.D. in social science from Syracuse University.

LAURA HAMILTON is a senior behavioral scientist and associate director for education at the RAND Corporation, a professor at the Pardee RAND Graduate School, and an adjunct faculty member in the Learning Sciences and Policy program at the University of Pittsburgh. Her research addresses educational assessment, accountability, the measurement and evaluation of instruction and school leadership, the use of data for instructional decision making, and evaluation of technology-based curriculum reforms. She has focused on the collection and analysis of interview, focus group, survey, and student outcome data, particularly in several large multisite studies. Most recently, she led an investigation of how districts and charter management organizations are implementing new teacher and principal evaluation and compensation reforms and an evaluation of personalized-learning initiatives. She serves on several state and national panels on topics related to assessment, accountability, educator evaluation, and data use and recently served as a member of the committee that revised the Standards for Educational and Psychological Testing. She has an M.S. in statistics and a Ph.D. in educational psychology from Stanford University.

BRIAN JUNKER is a professor in the Department of Statistics at Carnegie Mellon University. His major interest is in highly multivariate data that show an interesting and interpretable dependence structure, especially when it reveals connections among ideas in seemingly unrelated fields. He is also interested in capture-recapture models for estimating the size of wildlife and human populations, which share many features with models for multiple-choice tests. He has studied latent variable models used in the design and analysis of standardized tests, such as the SAT and the Graduate Record Examination; in the analysis of small-scale experiments in psychology and psychiatry; and in the analysis of large-scale educational surveys such as the National Assessment of Educational Progress. Some of his recent work aims to characterize the dependence structure implied by these models, so that one can quickly decide whether they are the right tool for a particular problem. He has a B.A. in mathematics from the University of Minnesota and an M.S. in mathematics and a Ph.D. in statistics from the University of Illinois.
JUDITH KOENIG (Study Director) is on the staff of the Board on Testing and Assessment of the National Academies of Sciences, Engineering, and Medicine, where she directs measurement-related studies designed to inform education policy. Her work has included other studies on the National Assessment of Educational Progress; teacher licensure and advanced-level certification; inclusion of special-needs students and English-language learners in assessment programs; setting standards for the National Assessment of Adult Literacy; assessing 21st-century skills; and using value-added methods for evaluating schools and teachers. Previously, she worked at the Association of American Medical Colleges and as a special education teacher and diagnostician. She has a B.A. in elementary and special education from Michigan State University, an M.A. in psychology from George Mason University, and a Ph.D. in educational measurement, statistics, and evaluation from the University of Maryland.

SUZANNE LANE is a professor in the research methodology program at the University of Pittsburgh. Her research and professional interests are in educational measurement and testing, with a focus on technical and validity issues in large-scale assessment programs and the effectiveness of education and accountability programs. She has served as president of the National Council on Measurement in Education and on its Joint Committee for the Revision of the Standards for Educational and Psychological Testing. She also served on the Management Committee for the next revision of the Standards for Educational and Psychological Testing published in 2014. She has also served as vice president of the measurement and research methodology division of the American Educational Research Association. She has been on a number of technical advisory committees for the College Board, ETS, the Partnership for Assessment of Readiness for College and Careers, the U.S. Department of Education's evaluation of the National Assessment of Educational Progress and its technical review panel for the Race to the Top, and the National Center for Educational Outcomes. She has a Ph.D. in educational psychology from the University of Arizona.

SHARON J. LEWIS recently retired from the position of director of research for the Council of the Great City Schools in Washington, D.C., which works to improve teaching and learning in the nation's urban schools as well as help develop education policy. She has previously worked as a national education consultant. Earlier, she was assistant superintendent of research, development, and coordination with the Detroit Public Schools. She has served on numerous NAS committees, most recently an evaluation of D.C. public schools, and is currently a member of the Board on Testing and Assessment. Her work focuses on improving learning for disadvantaged and at-risk students. She has an M.A. in educational research from Wayne State University.

BERNARD L. MADISON is a professor in the Department of Mathematical Sciences at the University of Arkansas, where he earlier served as dean of the Fulbright College of Arts and Sciences. Previously, he was at Louisiana State University. His major current research interests are in articulation, assessment, quantitative literacy, and teacher education. Previously, he structured and directed the program "Mathematical Sciences in the Year 2000" at the National Research Council, including the national colloquium, Calculus for a New Century. He has worked extensively with the calculus program of the College Board and as chair of its Mathematics Academic Advisory Committee. He also served on the National Commission on the Future of the Advanced Placement Program and as a member of the writing team for the Common Core State Standards for Mathematics. He has a bachelor's degree in mathematics and physics from Western Kentucky University and master's and doctoral degrees in mathematics from the University of Kentucky.

SCOTT NORTON is strategic initiative director of standards, assessment, and accountability at the Council of Chief State School Officers. In this role, he works with states to implement the Common Core State Standards and assessments and to create and implement new student-focused accountability systems. The team is also responsible for the State Collaboratives on Assessment and Student Standards, the National Conference on Student Assessment, and collaborative work with the assessment consortia. Previously, he served as the assistant superintendent of the Office of Standards, Assessments, and Accountability at the Louisiana Department of Education, where his responsibilities included the implementation of content standards and development of the Louisiana Comprehensive Curriculum as well as the state's transition to full implementation of the Common Core State Standards. He has a Ph.D. in educational administration and supervision from Louisiana State University.

SHARON VAUGHN is the H.E. Hartfelder/Southland Corp. Regents chair and executive director of The Meadows Center for Preventing Educational Risk at the University of Texas. Previously, she held positions at the University of Miami and the University of New Hampshire and as a classroom teacher in public schools in Arizona and Missouri. Her research focuses on strategies and educational interventions in teaching reading to students who are at risk, particularly students with learning difficulties and behavior problems and students who are English-language learners. Her work spans from the middle grades to the secondary grades. She is the recipient of numerous awards, including the Council for Exceptional Children research award and faculty and research awards from the University of Texas. She has a B.S. in education from the University of Missouri and an M.Ed. in education and a Ph.D. in education and child development from the University of Arizona.

JORDYN WHITE is a program officer for the Committee on National Statistics. Currently she is the study director for a workshop sponsored by the Office of Minority Health on Improving Collection of Criminal Justice System Involvement Indicators in Population Health Data Collections. She is also currently working on the Standing Committee on Reengineering Census Operations and the Panel to Modernize the Nation's Crime Statistics. She comes to CNSTAT from having worked five years at the U.S. Census Bureau; she first worked on the methodology, design, implementation, and production of the American Community Survey, and later served on the 2020 Census Non-Response Follow-Up development and operations planning team. She has a B.S. in psychology from the University of Pittsburgh and an M.S. in criminal justice from St. Joseph's University.

LAURESS WISE is a principal scientist at the Human Resources Research Organization (HumRRO), where he previously served as president.
At HumRRO, he currently directs a project to provide quality assurance for the National Assessment of Educational Progress (NAEP). His work has ranged broadly in educational research and educational policy and assessment. He serves on technical advisory committees for the Hawaii, Oklahoma, and Rhode Island departments of education, and is currently directing the independent evaluation of California's new high school exit exam. He recently served as cochair for the committee that revised the Standards for Educational and Psychological Testing of the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education, and he is currently serving as president of the National Council on Measurement in Education. He previously served on the Panel for the Evaluation of the NAEP Trial State Assessment of the National Academy of Education. He has a Ph.D. in mathematical psychology from the University of California, Berkeley.


Contents

  • Cover

  • Evaluation of the Achievement Levels for Mathematics and Reading on the National Assessment of Educational Progress

  • ©

  • Summary

  • 1 Introduction

  • 2 Setting Achievement Levels: History

  • 3 Setting Achievement Levels: NAEP Process

  • 4 Reliability of the Achievement Levels

  • 5 Validity of the Achievement Levels

  • 6 Interpretations and Uses of NAEP Achievement Levels

  • 7 Setting New Standards: Considerations

  • 8 Conclusions and Recommendations

  • References

  • Appendix A: Agenda for Public Forum

  • Appendix B: Biographical Sketches of Committee Members and Staff
