Academic Assessment of Critical Thinking in Distance Education Information Technology Programs

Mina Richards, Trident University, USA
Indira R. Guzman, Trident University, USA
DOI: 10.4018/978-1-4666-9455-2.ch005. Copyright © 2016, IGI Global.

ABSTRACT

The purpose of this book chapter is to elucidate the process of assessment in higher education with a focus on distance learning and information technology programs. Its mission is to bring awareness of academic assessment concepts and best practices in the use of standard assessment tools and direct measures to evaluate student learning. The chapter provides definitions of academic assessment and presents the application of signature assignments and rubrics in the Computer Science and Information Technology Management programs to demonstrate student learning results.

INTRODUCTION

While the growth of distance education programs has been significant in the past two decades, the assessment of student learning in online schools is still emerging. The topic of assessment is important because, in the last few years, academic assessment has become an integral part of student learning at traditional and online institutions (WASC, 2013; Miller, 2012). From the perspective of accrediting bodies, an assessment process must be implemented to demonstrate continuous improvement in the quality of teaching programs. In the context of globalization, American universities teaching abroad also transfer the assessment process and best practices to their foreign branches (Noori & Anderson, 2013). The role of faculty is critical through taking ownership, developing strategies, and ensuring that educational goals are met and constantly refined.

When assessment methods and techniques are incorporated into the classroom, student learning is evaluated and measured against a set of learning outcomes. Formative and summative assessments are used to collect data and compare results against some form of metrics or benchmarks. Formative assessments capture students' knowledge at some point in time during class, whereas summative assessments are conducted at the end of the course or program of study. These evaluations also detect possible gaps and opportunities to drive course improvement and learning activities. Assessment is a continuous improvement process that should be aligned with the institution's mission to provide an exceptional educational experience for students (Walvoord & Banta, 2010, p. 27). This outcome-based approach equally benefits faculty and students by institutionalizing a culture of instruction and learning.

An important element in assessment is Learning Outcomes (LOs). These are defined as educational objectives concisely articulated to convey the fundamental learning and skills that students should achieve during a course or program of study (WASC, 2013, p. 12). LOs must be task specific and measurable, and they are widely recognized as the prerequisite for a "learner centered" approach. To measure LOs, rubrics are among the most common direct methods used to collect assessment data and evaluate student learning. There are a variety of rubric designs, all of which are meant to inform mastery of skills or areas needing improvement. The Association of American Colleges and Universities (AAC&U) has been recognized as the leading developer of rubrics. The association has designed
a series of Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics for the purpose of measuring student achievement in 15 intellectual and practical skill areas of study (AAC&U, 2014).

To emphasize the application of rubrics to learning outcomes, this chapter discusses a university assessment process in terms of signature assignments selected for the Computer Science (CS) and Information Technology Management (ITM) programs. The university is an online institution granting undergraduate, master's, and Ph.D. degrees. During assessment planning, a curriculum map was prepared to determine alignment between courses and specific Institutional Learning Outcomes (ILOs). Within the ILOs to be measured, the critical thinking ILO was chosen since it was the academic theme for the year. Signature assignments were then carefully selected by course to ensure that critical thinking could be measured at the introduced, reinforced, or emphasized levels. The assessment results are discussed as evidence that critical thinking abilities can be assessed through rubrics and across the first and last course of each program. The findings also indicate that relevant gains were obtained from students attending the emphasized-level courses when compared to the students sampled in the introduced-level course. The study methodology is explained in this chapter. The chapter also develops some standard practices in the assessment process and presents conclusions about the advantages and challenges of the assessment tools and practices from the perspective of faculty and university administration. The authors have experience serving on the university's Assessment Committee and serving as Program Chair of these academic programs.

ASSESSMENT OF STUDENT LEARNING

Assessment Lifecycle

The adoption of Internet technologies has given the opportunity to extend higher education to a larger portion of adult learners through e-learning modalities (WASC, 2013; Miller, 2012). Similar to traditional universities, online schools must also assess the quality of instruction, educational activities, and teaching strategies. In recent years, the connection between academic assessment and teaching institutions has been more apparent than ever. While assessment is multifaceted, it requires a framework to guide student learning and curricula. Accrediting bodies and pressure from peer institutions compel universities to adopt an assessment framework. Assessment is also changing the emphasis from a teaching-centered approach to a student-centered experience to prepare students for professional careers, and it is considered an ongoing process of refinement through planning and operation of instructional methods.

The literature on assessment conveys many different rationales serving different ends, but all lead to a common purpose of improving student learning. For example, a program chair uses assessment to evaluate degree programs and curricula to enhance program competencies and meet accreditation requirements. An administrator considers assessment a way to meet accreditation requirements. Therefore, assessment is an essential component of virtual instruction, and it is the basis for determining whether a learner meets educational goals. Assessment becomes central to the scope of academic roles and a direct link to responsibilities for student grades, curricula planning, program evaluation, and program auditing. Among the
described dimensions, an educator focuses on assessment as a means to improve a course or how a subject can be better taught (Bell & Farrier, 2008). Capacho (2014) concurs that a systematic approach is needed in assessment to evaluate online learning and to converge evaluation methods for e-learners with best practices in instruction. As with any process, assessment requires factual evidence to measure internalization of student knowledge and, therefore, to use those results to design instructional objectives.

The purpose of student learning can be defined in terms of learning outcomes. Proitz (2010) described learning outcomes from a historical perspective, beginning with the "objective movement" of the 1900s to articulate learning goals and continuing with the refinement inspired by "mastery learning" theories along with Bloom's models in the 1950s. To anchor learning outcomes to assessment, the literature describes assessment as a lifecycle to understand its process (New York University, 2014). Although there has been a tendency to see assessment as disjointed phases, in the context of completeness each phase feeds into the next, as shown in Figure 1.

Figure 1. Ongoing cycle of academic assessment. Adapted from New York University (2014).

Table 1. The assessment lifecycle (adapted from UC Davis, 2015)

Phase: Planning Assessment Activities.
Process: Align learning outcomes with the curriculum at the program and course level; develop a timeline of assessment deliverables.
Responsibilities: Identify the academic strategic plan and the role of ILOs (Accreditation Liaison Officer); train faculty (Assessment Office); provide guidance to faculty in planning for the assessment cycle (Assessment Office).

Phase: Collecting Program Data.
Process: Gather program data from direct and indirect assessments through dashboards and instruments; review curriculum maps to match data to courses.
Responsibilities: Collect and store student learning data (Assessment and Institutional Research Offices); gather evidence by program (Institutional Research, Faculty, Program Chair).

Phase: Analyzing Program Data.
Process: Examine and evaluate data against learning outcomes and previous improvements, and plan for future improvements; identify the analytical approach and interpret results from data dashboards.
Responsibilities: Review analysis and plans for improvement (Faculty and Program Chair); review completeness of reports and report variances (Assessment Office).

Phase: Implementing Improvements.
Process: Evaluate student learning outcomes and program data to determine gaps in student learning; plan programmatic and learning improvements with faculty and course developers; establish a timeline to implement improvements and the LOs to be assessed in the next cycle.
Responsibilities: Coordinate improvement plans and implementation (Faculty, Program Chair, Assessment Office, and in some cases the Curriculum Office if the identified changes affect the curriculum).

Phase: Enforcing Quality.
Process: Review improvements and plan to gather relevant data for the next assessment cycle.
Responsibilities: Collect and store student learning data to determine improvement and measure against the baseline (Assessment Office, Institutional Research, Program Chair, and faculty).

The basic assessment process is organized into a four-phase cycle consisting of planning, inquiring, analyzing, and applying (UC Davis, 2015). To illustrate the process and accountabilities for each of the assessment phases, Table 1 provides a summary of the assessment lifecycle. The UC Davis assessment stages were adapted and expanded to include monitoring quality and
to delineate processes and responsibilities. As noted in Table 1, the assessment lifecycle for a single learning outcome can take about two years to evaluate, since assessment is usually conducted once a year. The first two phases can take one year to complete, while implementing improvements and monitoring quality can take one extra year or more. The later phases need careful attention and can be the most challenging in terms of analysis, adoption, and tracking of results. Having access to accurate data from a Learning Management System is crucial. Documentation and benchmarking are also needed to compare previous and current results. An institution needs to establish an ongoing and consistent culture of assessment, since accreditors need 3-5 years of assessment data to demonstrate continuous improvements. Consequently, assessing student learning provides viable evidence to support the quality of learning throughout a degree program and also a measure of degree quality and university rigor. The board of directors of the Association of American Colleges and Universities (AAC&U, 2014) affirmed this notion by stating that "concerns about college affordability and return on investment are legitimate, but students and families also need good information as they make decisions and evaluate schools."

STUDENT LEARNING OUTCOMES AND CURRICULUM MAPPING

Student Learning Outcomes

The first step in assessment is to define a list of Program Learning Outcomes (PLOs) to support student-centered teaching. Through learning outcomes, the assessment process evaluates not only the quality of programs and student learning but also teaching effectiveness. Learning outcomes are similar to establishing goals, and for this reason the SMART criteria (specific, measurable, attainable, realistic, and timely) are used to guide their writing. Asking questions such as "What is a program meant to teach?" or "What competencies and skills should students demonstrate at the end of a program of study?"
is helpful to begin formulating the initial list of learning outcomes. Davis and Wong (2007) conducted studies into the factors that contribute to a student-centered experience, indicating that student learning is an interaction between learning activities and approaches to learning. Building outcomes that meet those criteria is an iterative process among faculty until a consensus is reached and final outcomes are endorsed. When developing a list of learning outcomes, the statements must be clear and specific, and they should be aligned with the curriculum and focused on learning products, not processes (University of Richmond, 2008). The following examples illustrate how program learning outcomes should be articulated by incorporating active verbs.

Example 1. Poor: The students will understand computing hardware configurations and application software. Better: Analyze computing hardware configurations and application software to identify information technology solutions that meet business needs.

Example 2. Poor: Faculty will teach communication skills. Better: Communicate effectively with a range of audiences to propose information technology management solutions.

Bloom's Taxonomy is also a commonly used framework to define educational levels and build learning outcomes. The levels of performance are identified and classified into multiple constructs (Allen, 2006). Bloom's Taxonomy assists in categorizing how students learn in the various domains at the cognitive, affective, and behavioral levels. Cognitive domains establish what students should know; affective represents their way of thinking; and behavioral means how students should perform (University of Richmond, 2008). Bloom's cognitive criteria are shown in Table 2.

Table 2. Bloom's taxonomy (CSU Bakersfield, PACT Outcomes Assessment Handbook, as cited in University of Richmond, 2008)
Knowledge: To know specific facts, terms, concepts, principles, or theories.
Comprehension: To understand, interpret, compare and contrast, explain.
Application: To apply knowledge to new situations, to solve problems.
Analysis: To identify the organizational structure of something; to identify parts, relationships, and organizing principles.
Synthesis: To create something, to integrate ideas into a solution, to propose an action plan, to formulate a new classification scheme.
Evaluation: To judge the quality of something based on its adequacy, value, logic, or use.

Once learning outcomes are identified, the use of assessment tools such as rubrics is particularly helpful to collect program data and benchmark results against a baseline to measure academic program strength (Capacho, 2014). The outcome results must show which outcomes rate above or below the baseline, thereby providing insight into the learning gaps that must be addressed. A learning outcome can also be measured in a given course to identify course strength. For institutions with a mature assessment process, practicing assessment creates a forum of inquiry to seek improvement among faculty, students, and other stakeholders (Bell & Farrier, 2008). These forums provide a venue to enhance curriculum development and improve the evaluation cycle of student learning. By engaging all parties in assessment units, the institution shows collaboration and fosters a culture of student learning and academic improvement across colleges.

Curriculum Mapping

Evidence of student learning must be shown at various stages
during the program of study to demonstrate competence and knowledge growth. For this purpose, a curriculum map is developed by analyzing the learning outcomes and curriculum at the introduced (I), reinforced (R), and emphasized (E) levels. A curriculum map is a tool, in the form of a matrix, used to graphically represent the extent to which program learning outcomes are being addressed across the curriculum. By having a matrix, faculty can determine which courses are aligned as introduced, reinforced, and emphasized and ensure that programs offer several learning opportunities related to each outcome. Curriculum maps are helpful tools to show the relationship between ILOs and PLOs or the alignment between courses and PLOs. Faculty can also evaluate how each course fits into the program and which courses provide the most support to master a concept, as in the case of capstone courses. Lam and Tsui (2013) offered the perspective that curriculum maps are widely used to evaluate the extent of curriculum courses and their connection to learning outcomes. An example of a conventional curriculum map format showing alignment of courses with PLOs is illustrated in Table 3.

The process of developing a curriculum map includes the following steps. Once the program learning outcomes have been developed, create a curriculum map to illustrate how these outcomes are addressed across the curriculum. Enter the courses within the program on the vertical axis and the learning outcomes on the horizontal axis. Specify the level to which each course contributes to a learning outcome:

I = Introduced. Students are introduced to key concepts or skills related to the outcome.
R = Reinforced. Students build on introductory knowledge related to the outcome. They are expected to demonstrate their knowledge or ability at increasingly proficient levels.
E = Emphasized. Students are expected to demonstrate their ability to perform the outcome with a reasonably high level of independence and sophistication.

Table 3. Hypothetical example of a curriculum map showing alignment between PLOs and courses (courses such as 207, 208, 304, 309, 421, 429, 430, 434, 437, 439, 444, 447, and 481 on the vertical axis, program learning outcomes on the horizontal axis, and each cell marked I, R, or E).

Curriculum maps can be automatically created through assessment software or manually prepared as shown in Table 3. Regardless of the method, one of the most essential tasks is for faculty to prepare the map each year and analyze the skills, content, and assessment opportunities within a degree program. Krajcik (2011) supported that standard curriculum maps align curriculum development, instruction, and assessments. Accrediting institutions also audit curriculum maps to ensure that specific learning outcomes are being covered in the coursework. Additionally, the following guidelines are useful as a starting point for building curriculum maps and identifying student learning outcomes (a small scripted check of these guidelines is sketched after the list):

• Coverage of Learning Outcomes: Each learning outcome should be reinforced and emphasized across multiple courses.
• Avoidance of Redundancy: A curriculum map makes redundancies visible, e.g., when all courses address a specific learning outcome.
• Alignment of Courses: Each course should support at least one (and ideally more) learning outcomes.
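To make the matrix and the three guidelines concrete, here is a minimal sketch in Python. It assumes the map is kept as a simple in-memory dictionary; the course numbers and PLO labels are hypothetical placeholders, not the programs discussed in this chapter.

```python
# Minimal sketch: a curriculum map as a matrix of courses x program learning outcomes.
# Course numbers and PLO labels are hypothetical, for illustration only.

curriculum_map = {
    # course: {PLO: level}, where level is "I", "R", or "E"
    "ITM101": {"PLO1": "I", "PLO2": "I", "PLO3": "I"},
    "ITM206": {"PLO1": "R", "PLO2": "R"},
    "ITM330": {"PLO1": "R", "PLO2": "R"},
    "ITM491": {"PLO1": "E", "PLO2": "E"},  # capstone
    "ITM250": {},                          # not yet mapped to any outcome
}

plos = ["PLO1", "PLO2", "PLO3"]

# Guideline 1: each PLO should be reinforced and emphasized somewhere in the curriculum.
for plo in plos:
    levels = [mapping[plo] for mapping in curriculum_map.values() if plo in mapping]
    if "R" not in levels or "E" not in levels:
        print(f"{plo}: not reinforced and emphasized across the curriculum")

# Guideline 2: redundancy becomes visible when every course addresses the same PLO.
for plo in plos:
    if all(plo in mapping for mapping in curriculum_map.values()):
        print(f"{plo}: addressed by every course (possible redundancy)")

# Guideline 3: each course should support at least one learning outcome.
for course, mapping in curriculum_map.items():
    if not mapping:
        print(f"{course}: supports no learning outcome; review, remove, or remap")
```

Running the sketch flags the hypothetical PLO3 (never reinforced or emphasized) and the unmapped course, which is exactly the kind of gap a faculty review of the map is meant to surface.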
It is difficult to meaningfully address all learning outcomes in a single course unless it is an introductory or capstone course. If a course does not appear to address any learning outcomes, the course should be further reviewed to determine whether it should be required, whether it should be eliminated, or whether an important learning outcome has been missed.

CRITICAL THINKING SKILLS AS MEASURE OF STUDENT LEARNING

Critical Thinking Definition

Critical thinking, as defined in the literature, conveys a multidimensional process that is better understood if its parts are examined for assessment purposes. The many definitions are trans-disciplinary in nature. For example, the Critical Thinking Community (2014) webpage includes an excerpt from a seminal study conducted in education in 1941 by Edward Glaser. In arguing for the need to develop students' critical thinking skills, Glaser defined critical thinking as follows:

The ability to think critically, as conceived in this volume, involves three things: (1) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one's experiences, (2) knowledge of the methods of logical inquiry and reasoning, and (3) some skill in applying those methods. Critical thinking calls for a persistent effort to examine any belief or supposed form of knowledge in the light of the evidence that supports it and the further conclusions to which it tends. It also generally requires ability to recognize problems, to find workable means for meeting those problems, to gather and marshal pertinent information, to recognize unstated assumptions and values, to comprehend and use language with accuracy, clarity, and discrimination, to interpret data, to appraise evidence and evaluate arguments, to recognize the existence (or non-existence) of logical relationships between propositions, to draw warranted conclusions and generalizations, to put to test the conclusions and generalizations at which one arrives, to reconstruct one's patterns of beliefs on the basis of wider experience, and to render accurate judgments about specific things and qualities in everyday life. (p. 14)

In the last ten years, higher education has engaged in educating students through a series of learning outcomes ranging from communications to knowledge acquisition. One of the most popular outcomes in practice is training students to think critically, that is, to evaluate the pros and cons of something and come up with an informed opinion or a recommendation to solve a problem. Therefore, critical thinking is the ability to be curious and to look at a situation or argument from different angles while providing the best workable solution. Systemic thinking is only achieved through training, application, and practice. Earlier beliefs about critical thinking held that only individuals with logical and mathematical intelligences made better critical thinkers; however, new pedagogical views have changed that belief, and it is possible to turn any student into a well cultivated critical thinker (Paul & Elder, 2008). These authors described the elements of thought in eight dimensions, as noted in Table 4.

Critical thinking (CT) has become an organizing core in instruction for the development of students' reasoning skills. Although each teaching discipline has its own perspective on critical thinking, online schools use classroom discussions to enhance it. For example, the use of Socratic questioning helps
elevate discussions and uncover issues and assumptions in a logical manner. Faculty can also contribute by formulating questions that touch on analysis and evaluation. Through this exchange, discussions are helpful for conducting analytical reasoning while evaluating an argument from multiple viewpoints. A well designed discussion forum always considers Bloom's Taxonomy. Similarly, the presence of critical thinking must be evident in quizzes, summaries, discussions, and assignments. Thus, the enhancement of this skill must be evaluated and measured through assessment techniques to determine the extent of student practice at the introduced, reinforced, and emphasized levels. The next sections illustrate the critical thinking learning outcome through signature assignments and assessment measures.

Table 4. Elements of thought (Paul & Elder, 2008)
Question at Issue: Problem or issue.
Purpose: Goal and objective.
Information: Data, facts, observations, experiences.
Concepts: Theories, definitions, axioms, laws, principles, models.
Assumptions: Presuppositions taken for granted.
Implications: Consequences.
Point of View: Frame of reference, perspective, orientation.
Interpretation and Inference: Conclusions, solutions.

Assessing Student Learning through Formative Assessment

Assessments provide evidence of the knowledge, skills, and values learned and of what remains to be learned. The most accurate way to assess student learning is by combining multiple types of assessments. Accrediting bodies require at least one direct measure for each learning outcome, and course grades are not accepted as evidence of student learning. Baroudi (2007) and Johnson and Jenkins (2013) define two types of assessments, formative and summative, which provide two basic approaches to measure learning:

• Formative Assessment: Activities used by the instructor to determine students' knowledge while providing them with developmental feedback, which can also be used to improve instruction.
• Summative Assessment: Activities to evaluate and grade student learning at some point in time during the course or program of study.

Assessment data can be gathered through direct and indirect measures:

• Direct Measures: Assessments based on actual student work, including reports, exams, demonstrations, performances, and completed works. The strength of direct measures is that faculty members can capture a sample of what students can do, which can be very strong evidence of student learning.
• Indirect Measures: Assessments based upon a report of perceived student learning. These reports are available from many sources, including students, faculty, and employers. Indirect measures are not as strong as direct measures because one needs to make assumptions about the context of the report.

Direct and indirect measures are both used to improve programs and courses in a continuous improvement cycle. Table 5 provides a summary of assessment measures and types.
Table 5. Direct and indirect measures

Direct: Faculty grades and feedback on course assignments, quizzes, and papers that students submit in each course. Indirect: Student self-reflective essays and end-of-course reflective online discussions.
Direct: Signature assignments in selected courses across the program, i.e., assignments that assess student achievement of learning outcomes in selected courses at the beginning, middle, and end of the program. Indirect: Email communications with students; CAFA (Course and Faculty Assessment) Survey.
Direct: Grading and student feedback; rubrics are used to assess the achievement of the PLOs. Indirect: Student Exit Survey, Persistence Study, Complaint Resolution Systems, Priorities Survey for Online Learners (PSOL), Student Satisfaction Tracker Survey, Survey of Non-persisting Students, and Alumni Survey.
Direct: Capstone course, with a course assignment that spans the course duration and rubrics for grading and feedback. Indirect: Capstone course self-reflective essays and end-of-course reflective online discussions.

Using Rubrics for Formative Assessment

Part of the assessment strategy to examine learning outcomes is to use a set of rubrics. The term rubric has mostly been associated with grading rubrics used to monitor students' knowledge and assist instructors with ongoing feedback; however, rubrics are multiuse. Andrade (2000) defined rubrics as instructional tools to give students informative feedback on course assignments such as essays or class projects. Conversely, analytical rubrics are considered assessment tools to measure student learning of a given learning outcome at the institutional, program, or class level (AAC&U, 2014). Rubrics as tools have wide adoption in education and are very popular among students and instructors. Wolf and Stevens (2007) stated that among the most significant properties of rubrics is that they must clearly identify performance assessment as observable and measurable.

Rubrics consist of four components: levels, score points or weights, criteria, and descriptors. The levels signify the progression of performance quality, ranging from excellent to emerging, and weights can be assigned. Levels are associated with score points to measure performance according to the weights. The criteria are short statements describing the elements being measured in each assignment, and each criterion includes descriptors according to gradation and performance rank. Table 6 is an example of a rubric component designed exclusively for this chapter and related to critical thinking, which could be adapted to online classrooms and other learning outcomes. A brief code sketch of this structure follows the table.

Table 6. Example of a critical thinking component in rubrics

Criterion: Critical Thinking. The writer's conclusions are logical and reflect informed evaluations.
Excellent: Demonstrates mastery conceptualizing the problem; viewpoints and assumptions of experts are analyzed, synthesized, and evaluated thoroughly. Conclusions are logically presented with appropriate rationale.
Good: Demonstrates considerable proficiency conceptualizing the problem; viewpoints and assumptions of experts are analyzed, synthesized, and evaluated proficiently. Conclusions are presented with the necessary rationale.
Fair: Demonstrates partial proficiency conceptualizing the problem; viewpoints and assumptions of experts are analyzed, synthesized, and partially evaluated. Conclusions are somewhat consistent with the analysis and findings.
Poor: Demonstrates limited or poor proficiency conceptualizing the problem; viewpoints and assumptions of experts are analyzed, synthesized, and evaluated in a limited or poor manner. Conclusions are either absent or poorly conceived and supported.
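To illustrate the level, score point, criterion, and descriptor structure just described, the following minimal sketch represents a small analytical rubric in code and converts a grader's level ratings into a weighted score. The criterion names, level labels, score points, and weights are hypothetical and are not taken from the AAC&U VALUE rubrics.

```python
# Minimal sketch of an analytical rubric: criteria with leveled score points and optional weights.
# Names and point values are hypothetical, not an official rubric.

rubric = {
    "Critical Thinking": {"Excellent": 4, "Good": 3, "Fair": 2, "Poor": 1},
    "Evidence":          {"Excellent": 4, "Good": 3, "Fair": 2, "Poor": 1},
}

# Optional weights let some criteria count more toward the total score.
weights = {"Critical Thinking": 0.6, "Evidence": 0.4}

def score_submission(ratings: dict) -> float:
    """Convert per-criterion level ratings into a single weighted score."""
    total = 0.0
    for criterion, level in ratings.items():
        total += rubric[criterion][level] * weights[criterion]
    return total

# Example: a grader rates one assignment "Good" on critical thinking and "Fair" on evidence.
print(score_submission({"Critical Thinking": "Good", "Evidence": "Fair"}))  # 0.6*3 + 0.4*2 = 2.6
```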
Preparing Signature Assignments

Signature assignments are usually prepared by faculty and course developers, or they can simply be chosen from the existing course syllabus. A signature assignment is a report, exam, project, or milestone created to measure competency or progress in achieving specific learning outcomes. Signature assignments are purposely assigned to core courses to gather evidence along the main path, although any course will work well as long as the specific learning outcome is being addressed. One of the benefits of using core courses is the ability to compare student learning and results across programs and teaching faculty.

Once the curriculum map is complete, faculty must identify the appropriate courses for offering signature assignments. A signature assignment should be selected at three different levels in the program (i.e., introduced, reinforced, and emphasized). In the curriculum sequence, it is recommended that signature assignments be selected after students have had the opportunity to develop and refine the skills for a given outcome. Faculty must discuss and agree on the components of the curriculum map, the signature assignment design or selection, and the rubrics. It is also recommended to embed a simple narrative in the syllabus notifying students that a certain assignment will be used to collect program data. Faculty should also consider reviewing the Association of American Colleges and Universities VALUE rubrics (AAC&U, 2014) to guide signature assignment scoring and assessment. Planning for signature assignments must start when the annual program reports and plans are prepared to ensure focus and coordination.

VALUE Rubrics and Critical Thinking Assessment

The next step in evaluating signature assignments is to select analytical rubrics such as the Critical Thinking VALUE rubrics. These rubrics offer more in-depth analysis and have been created for national use to compare findings across schools. The Association of American Colleges and Universities (AAC&U) has been the pioneer in developing rubrics for multipurpose applications. The development includes several VALUE rubrics with the intent to measure student learning in 15 intellectual and practical skill areas of study (AAC&U, 2014). The association is headed by a diverse advisory board and partner campuses forming the development teams. Depending on the application, these rubrics can be used as analytical or instructional tools. AAC&U also engages in several practices to validate the reliability of the VALUE rubrics and ensure a common understanding of criteria and descriptors among users. VALUE rubrics have become a trend in higher education for evaluating formative assessments because of their direct-measure properties, and they include progressively more sophisticated criteria for meeting student learning outcomes. A sample of this rubric can be found on the AAC&U website, whose address is listed in the reference list.

When VALUE rubrics are used as assessment tools, the evaluator must be familiar with critical thinking definitions before attempting to grade any work sample or collection of work. Consistent with Bloom's Taxonomy, Scriven and Paul (1987) defined "critical thinking as the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. In its exemplary form, it is based on universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness." Thus, the Critical Thinking VALUE rubrics connect the following components of thinking, which are part of the development of critical thought and which have been used in the
present study:

• Explanation of Issues: Interpretation/Knowledge;
• Evidence: Evidence/Application;
• Influence of Context and Assumptions: Analysis;
• Student Position (Perspective): Position/Synthesis;
• Conclusions (Implications and Consequences): Conclusions/Evaluation.

Operationalizing Critical Thinking with Signature Assignments

The assessment of any program learning outcome is directly related to the Institutional Learning Outcomes. In the case of critical thinking, this outcome should be present in both the program and institutional outcomes. A recommended way to assess both outcomes is to have signature assignments scored with rubrics that measure critical thinking criteria, such as the VALUE rubrics. The signature assignment implementation can be done as follows.

Example of a Critical Thinking Learning Outcome: Students will demonstrate the ability to use the elements of critical thinking in their thinking process to effectively solve problems and make informed decisions.

Measuring the above learning outcome requires selecting courses that demonstrate the outcome at least at the introduced and emphasized levels. As mentioned earlier, the signature assignment can be specifically designed or chosen from the curriculum. To assess the achievement of the learning outcome, faculty need to adhere to the following steps:

• Each program has a learning outcome that is aligned to the institutional learning outcome.
• Critical thinking "signature assignments" must be developed to directly assess learning at a minimum of the introduced and emphasized levels within the program.
• Faculty must have uniform "stand alone" rubrics to be used for grading the signature assignments; the Critical Thinking VALUE rubrics are shown to be calibrated to signature assignments.
• Assessment data can be collected and analyzed from the rubrics' results.

Table 7 assists in documenting the signature assignments based on each program's curriculum map; a brief sketch of how such a plan might be recorded follows the table.

Table 7. Signature assignment selection by program. For each of the three signature assignments (Signature Assignment One in a course at the introduced level, Signature Assignment Two in a course at the reinforced level, and Signature Assignment Three in a course at the emphasized level), the table records the program learning outcome, the module or week number, the session in which the signature assignment will be implemented, and the faculty responsible.
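As a minimal way of recording the plan behind Table 7 in machine-readable form, the sketch below stores one hypothetical program's signature assignment selections. Every course number, week, session, and name is an invented placeholder, not data from the programs assessed in this chapter.

```python
# Minimal sketch: recording a signature assignment plan (cf. Table 7).
# All course numbers, weeks, sessions, and names are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class SignatureAssignment:
    level: str               # "Introduced", "Reinforced", or "Emphasized"
    course: str              # course in which the assignment is embedded
    program_outcome: str     # PLO being measured
    module_or_week: int      # module or week number
    session: str             # session in which it will be implemented
    faculty_responsible: str

plan = [
    SignatureAssignment("Introduced", "ITM101", "Critical thinking", 2, "Winter", "J. Doe"),
    SignatureAssignment("Reinforced", "ITM330", "Critical thinking", 5, "Winter", "J. Doe"),
    SignatureAssignment("Emphasized", "ITM491", "Critical thinking", 8, "Spring", "A. Roe"),
]

for sa in plan:
    print(f"{sa.level:11s} {sa.course}: week {sa.module_or_week}, {sa.session}, {sa.faculty_responsible}")
```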
The rest of the chapter puts emphasis on using the AAC&U rubrics for evaluating the critical thinking learning outcome for institutional assessment purposes.

CRITICAL THINKING STUDY METHODOLOGY

Assessing Critical Thinking

To check students' readiness in critical thinking, two studies were carried out in the College of Information Systems (CIS) of a regionally accredited online university. The college offers two undergraduate degrees, one in Information Technology Management (BSITM) and one in Computer Science (BSCS), and one master's degree in Information Technology Management (MSITM). The goal of the study was to assess the extent to which students have acquired critical thinking skills during a program of study.

The first study selected the first and last core course for each program, considering that students had the opportunity to develop and refine critical thinking skills over the period of time they took to complete the program. A case assignment was chosen in each course to serve as the signature assignment. Each assignment consisted of 3-5 pages of written text analyzing and proposing solutions to the assigned case study. Case studies were chosen because they require deeper theory deliberation and practical application. The second study selected two elective courses in the master's program to examine the level of critical thinking skills gained among students attending courses in close sequence. Since these courses were electives and taken in close proximity, critical thinking development was assumed to be acquired in a shorter time. The same case assignment selection protocol was used in the second study. Both studies included students enrolled in the 2013 winter session. Table 8 summarizes the critical thinking Program Learning Outcomes, the related institutional outcome, the case studies by course, and the number of students per class.

Sample Selection

As previously described, signature assignments are used to assess students' competences or the progress achieved during a program of study or at the institutional level. A total of 78 critical thinking signature assignments were completed. As shown in Table 8, 36 students were selected from the BSITM and CS programs, and 30 more subjects were chosen from the MSITM. Additionally, 12 graduate students were chosen for the second study with elective courses. The number of students evaluated in the different courses was based on course enrollment. During data collection, all student names were deleted from the case studies prior to evaluating the signature assignments. The graders read each paper twice and analyzed them carefully before applying the rubrics.

Table 8. Critical thinking study sample (Institutional Learning Outcome: Critical Thinking)

BSCS. Program outcomes: Analyze a computer problem and identify an appropriate solution; design, evaluate, implement, and maintain computer-based systems, processes, or components to meet desired organizational needs. Signature assignments: a case assignment in CSC111 (first course in Computer Science) and a case assignment in CSC425 (capstone).
BSITM. Program outcome: Describe how information is used in organizational decision making and how its strategies interact with information management strategies. Signature assignments: a case assignment in ITM206 (first course in BSITM, 10 students) and a case assignment in ITM491 (capstone, 10 students).
MSITM. Program outcome: Perform critical analysis of complex information technology situations and offer and evaluate alternative solutions. Signature assignments: a case assignment in ITM524 (first course in MSITM, 15 students) and a case assignment in ITM591 (capstone, 15 students); for the second study, a case assignment in ITM515 (elective) and a case assignment in ITM530 (elective).

Methods

All signature assignments were scored using slightly modified levels for the Critical Thinking VALUE rubrics on a scale of 1-4, where 1 = weak, 2 = marginal, 3 = adequate, and 4 = strong. These rubrics provided the critical thinking criteria to effectively score the learning outcome. All faculty graders were initially introduced through training to the Institutional Learning Outcome of critical thinking (CT) and the use of the VALUE rubrics. Each program carefully selected one assignment from a group where CT was introduced and one from an advanced or capstone course where CT was emphasized. The same VALUE rubrics were used by all the faculty graders. Data collection included both rubric scores and course groupings. Statistical means were calculated per criterion and per student. To observe the variation in means, the means were accumulated for each course and compared between the first and last class for the undergraduate and master's courses. Also, percentages were calculated by level within each criterion to determine which level was the highest and lowest within each criterion. Tables 9-13 show these results. The first and second course numbers listed in Tables 9-11 represent the assignments where CT was introduced or emphasized, respectively.
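The sketch below illustrates the calculation just described: per-criterion means are computed for an introduced-level and an emphasized-level course, and the gain in means is reported. The scores and course labels are hypothetical and do not reproduce the study's data.

```python
# Minimal sketch: per-criterion means and gains between an introduced and an emphasized course.
# Scores below are hypothetical; they do not reproduce the study's data.
from statistics import mean

criteria = ["Interpretation/Knowledge", "Evidence/Application", "Analysis",
            "Position/Synthesis", "Conclusions/Evaluation"]

# scores[course][criterion] = list of 1-4 rubric ratings, one per student paper
scores = {
    "introduced course": {
        "Interpretation/Knowledge": [3, 2, 3], "Evidence/Application": [2, 2, 3],
        "Analysis": [3, 3, 2], "Position/Synthesis": [2, 2, 2],
        "Conclusions/Evaluation": [2, 3, 2],
    },
    "emphasized course": {
        "Interpretation/Knowledge": [3, 4, 3], "Evidence/Application": [3, 3, 4],
        "Analysis": [3, 3, 3], "Position/Synthesis": [3, 4, 3],
        "Conclusions/Evaluation": [3, 3, 3],
    },
}

first, last = scores["introduced course"], scores["emphasized course"]
gains = []
for criterion in criteria:
    m_first, m_last = mean(first[criterion]), mean(last[criterion])
    gains.append(m_last - m_first)
    print(f"{criterion:26s} {m_first:.2f} -> {m_last:.2f}  gain {m_last - m_first:+.2f}")
print(f"Average gain across criteria: {mean(gains):+.2f}")
```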
Table 9. MSITM courses, first study: Critical Thinking VALUE rubric results (average ratings)
Interpretation/Knowledge: ITM524 2.80, ITM590 3.33, gain in mean 0.53
Evidence/Application: ITM524 2.47, ITM590 3.20, gain in mean 0.73
Analysis: ITM524 2.73, ITM590 3.13, gain in mean 0.40
Position/Synthesis: ITM524 2.20, ITM590 3.27, gain in mean 1.07
Conclusions/Evaluation: ITM524 2.47, ITM590 3.13, gain in mean 0.66
Average gain: 0.68

Table 10. BSCS courses, first study: Critical Thinking VALUE rubric results (average ratings)
Interpretation/Knowledge: CSC111 2.25, CSC425 2.75, gain in mean 0.50
Evidence/Application: CSC111 2.25, CSC425 3.25, gain in mean 1.00
Analysis: CSC111 2.25, CSC425 3.00, gain in mean 0.75
Position/Synthesis: CSC111 2.75, CSC425 2.75, gain in mean 0.00
Conclusions/Evaluation: CSC111 2.00, CSC425 3.00, gain in mean 1.00
Average gain: 0.65

Table 11. BSITM courses, first study: Critical Thinking VALUE rubric results (average ratings)
Interpretation/Knowledge: ITM206 2.20, ITM491 2.60, gain in mean 0.40
Evidence/Application: ITM206 2.20, ITM491 2.60, gain in mean 0.40
Analysis: ITM206 2.00, ITM491 2.40, gain in mean 0.40
Position/Synthesis: ITM206 1.80, ITM491 2.60, gain in mean 0.80
Conclusions/Evaluation: ITM206 2.40, ITM491 2.80, gain in mean 0.40
Average gain: 0.48

Results of the Study

All courses assessed in the first study showed an increase in means except for the "position" criterion in Computer Science, as indicated in Table 10. Computer Science courses teach many technologies and programming languages where knowledge is more procedural, and there is little room to "synthesize" principles in technical courses. Significant increases were observed in "conclusions and evaluation," demonstrating that students improve their critical thinking abilities between the first and last course in the BSCS program. The criterion of "analysis" also showed a substantial increase, validating that analytical abilities improved over time. The mean variations in the undergraduate BSITM program showed moderate increases, but the "position" or synthesis criterion showed a notable increase over the other criteria.
The BSITM details are shown in Table 11. The MSITM program showed that all rubric criteria had mean increases; additional details for these courses are displayed in Table 9. Overall, both ITM programs indicate an increase in critical thinking skills, as depicted in Table 12. It is also interesting to note that all criteria in the ITM programs consistently showed gains in means. Table 12 provides a concise summary of the increases in means for the courses evaluated in the first study.

Table 12. Comparing CIS programs: mean summary gains by program
Interpretation/Knowledge: BSCS 0.50, BSITM 0.40, MSITM 0.53
Evidence/Application: BSCS 1.00, BSITM 0.40, MSITM 0.73
Analysis: BSCS 0.75, BSITM 0.40, MSITM 0.40
Position/Synthesis: BSCS 0.00, BSITM 0.80, MSITM 1.07
Conclusions/Evaluation: BSCS 1.00, BSITM 0.40, MSITM 0.66
Average: BSCS 0.65, BSITM 0.48, MSITM 0.68

For the second study, Table 13 indicates that the elective courses achieved only modest advances, and the analysis shows a different pattern. Thus, critical thinking results were not as evident as in the first study. ITM515 and ITM530 can be taken in sequence and do not have prerequisites. In addition, the case study evaluated for ITM515 was at the beginning of the course, while the case study evaluated for ITM530 was in the middle of the course.

Table 13. MSITM elective courses comparison, second study: Critical Thinking VALUE rubric results (gain in means)
Interpretation/Knowledge/Comprehension: ITM515 2.17, ITM530 2.50, gain in mean 0.33
Evidence/Application: ITM515 2.50, ITM530 2.50, gain in mean 0.00
Analysis: ITM515 2.50, ITM530 2.50, gain in mean 0.00
Position/Synthesis: ITM515 2.50, ITM530 2.50, gain in mean 0.00
Conclusions/Evaluation: ITM515 2.17, ITM530 2.33, gain in mean 0.16
Average gain: 0.10

The different results can be explained by analyzing the two studies. The first study sampled courses taught as the first and last course in a program, which indicates that students had several courses of study in between the two courses. In the second study, the elective courses showed identical mean results in "application, analysis, and synthesis," which could indicate that students did not connect these constructs to support how they synthesized course knowledge with other course experiences. These results remain consistent with the lack of synthesis observed in written essays (Norton & Pitt, 2011) while students attend difficult courses through online schools. The elective courses had a small variation in the first and last criteria, "interpretation of materials" and "conclusions." The greatest increase was in "interpretation," showing the ability to comprehend new topics. The lowest increase was in "conclusions," or creating a new point of view, which could also be explained by the difference in the level of difficulty of the course materials between the two courses.

ASSESSMENT IMPLICATIONS AND CONCLUSION

The importance of assessment is now well established at traditional face-to-face institutions but less mature in virtual academic settings. A sound academic strategic plan will include assessment as one of the main components for continuous improvement. Faculty and administrators need to have a clear understanding of the reasons why assessment is important for their institution and of its alignment with strategic goals (Dickeson, 2013). In return, this understanding will increase their engagement and commitment to each assessment activity. Assessment results are part of the program metrics that gauge internal progress as well as the ways the programs follow academic standards, as expected by
policy makers and the public (Dickeson, 2013).

In this chapter, the authors offer some general observations to support the factors that make assessment successful for distance learning programs. First, administrators and faculty must have a clear understanding of the purpose and culture of assessment. Collaboration and involvement of those who directly participate in the learning process and evaluations is imperative. Assessment of student learning is integral to curricula design to improve the quality of academic programs. Students advance in their degree program by demonstrating that they have acquired the skills commensurate with learning outcomes. The use of learning management systems to gather program data facilitates the task of tracking student learning and provides the ability to compare assessment results across degree programs and across time. The major impediments in the assessment cycle are not using reliable assessment tools to collect data and not closing the assessment loop through continuous improvements.

Second, in addition to accrediting bodies, other external institutions now identify standard sets of knowledge areas and knowledge units that programs can follow. This is the example of the Association for Information Systems (AIS), ACM, and IEEE, which jointly developed model curricula in Information Systems for undergraduate and graduate programs. It is also the case of the most recent CS2013 model curriculum for Computer Science programs, and of even more specific domain areas such as those developed by the International Information Systems Security Certification Consortium (ISC2) for security programs and certifications. Program chairs and faculty need to understand those model curricula and identify the differences and similarities with their own programs. Each institution also needs to develop its own Institutional Learning Outcomes and program learning outcomes to best align with external standards from different accreditors. Identifying the priorities of alignment is a task that should match the university's academic strategic plan and mission. Some accreditations have more priorities than others depending on the institution.

Third, the assessment cycle requires team brainstorming. Given the different standards that each academic program must follow, it is important for program chairs and faculty to map Institutional Learning Outcomes to program learning outcomes and other external academic standards following a strategic plan. Through this process, having an Institutional Learning Outcome like the one assessed in this chapter serves to analyze measures for improvement as far as course content. Once the analysis is completed, faculty and program chairs need to identify the curricular changes that should be implemented in the program. For example, when it comes to critical thinking, the study findings indicate that case assignments could be modified to improve position/synthesis in Computer Science and evidence/application in BSITM. One way of addressing that gap is by requiring students to complete more hands-on exercises and by adding a course on business analytics that could provide the skills needed to use and apply evidence. Establishing round table discussions with different stakeholders to improve the curriculum is important because these discussions provide opinions and expertise leading to consensus decisions. Since assessment is a team effort, it is also necessary to separate assessment results from faculty performance evaluations and other similar campus activities (Hutchings, 2010). The assessment process focuses
on student learning, programs, and courses, which in return makes the full process a mechanism for institutional accountability.

REFERENCES

AAC&U. (2014). AAC&U urges higher education to take the lead to develop better evidence on college learning and completion. Retrieved from http://www.aacu.org/press/press-releases/aacu-urges-highereducation-take-lead-develop-better-evidence-college-learning

Allen, M. (2006). California State University Bakersfield assessment workshop materials. Retrieved from http://www.skylinecollege.edu/sloac/assets/documents/AllenWorkshopHandoutJan06.pdf

Andrade, H. G. (2000). Using rubrics to promote thinking and learning. Educational Leadership, 57(5). Retrieved from http://www.ascd.org/publications/educational-leadership/feb00/vol57/num05/Using-Rubrics-to-Promote-Thinking-and-Learning.aspx

Association of American Colleges and Universities. (2014). Critical thinking VALUE rubric. Retrieved from http://www.aacu.org/value/rubrics/pdf/CriticalThinking.pdf

Baroudi, Z. (2007). Formative assessment: Definition, elements, and role in instructional practice. Post-Script: Postgraduate Journal of Education Research, 8(1), 37-48.

Bell, M., & Farrier, S. (2008). Measuring success in e-learning: A multi-dimensional approach. Electronic Journal of e-Learning, 6(2), 99-109.

Capacho, J. (2014). Representative model of the learning process in virtual spaces supported by ICT. Turkish Online Journal of Distance Education (TOJDE), 15(4), 75-89. doi:10.17718/tojde.63316

Davis, R., & Wong, D. (2007). Conceptualizing and measuring the optimal experience of the eLearning environment. Decision Sciences Journal of Innovative Education, 5(1), 97-126. doi:10.1111/j.1540-4609.2007.00129.x

UC Davis. (2015). Academic assessment. Retrieved from http://assessment.ucdavis.edu/why/index.html

Dickeson, R. C. (2013). Making metrics matter: How to use indicators to govern effectively. Proceedings of the Western Association of College and University Business Officers, Anchorage, Alaska. Retrieved from http://www.academicstrategypartners.com/wp-content/uploads/2014/02/Making-Metrics-Matter.pdf

Glaser, E. M. (1941). An Experiment in the Development of Critical Thinking. New York: Bureau of Publications, Teachers College, Columbia University.

Hutchings, P. (2010, April). Opening doors to faculty involvement in assessment (NILOA Occasional Paper No. 4). National Institute for Learning Outcomes Assessment.

Johnson, E., & Jenkins, J. (2013). Formative and summative assessments. Retrieved from http://www.education.com/reference/article/formative-and-summative-assessment/

Krajcik, J. (2011). Learning progressions provide road maps for the development and validity of assessments and curriculum materials. Measurement, 9(2/3), 155-158. doi:10.1080/15366367.2011.603617

Lam, B., & Tsui, K. (2013). Examining the alignment of subject learning outcomes and course curricula through curriculum mapping. Australian Journal of Teacher Education, 38(12). doi:10.14221/ajte.2013v38n12.8

Miller, M. A. (2012, January). From denial to acceptance: The stages of assessment (NILOA Occasional Paper No. 13). National Institute for Learning Outcomes Assessment.

New York University. (2014). How to assess an academic program. Retrieved from http://www.nyu.edu/academics/academic-resources/academic-assessment/guidance-for-academic-programs.html

Noori, N., & Anderson, P. (2013). Globalization,
governance, and the diffusion of the American model of education: Accreditation agencies and American-style universities in the Middle East. International Journal of Politics, Culture, and Society, 26(2), 159-172. doi:10.1007/s10767-013-9131-1

Norton, L., & Pitt, E. (2011). Writing essays at university: A guide for students by students. Write Now Centre for Excellence in Teaching and Learning. Retrieved from http://www.writenow.ac.uk/assessmentplus/documents/WritingEssaysAtUni-11.pdf

Paul, R., & Elder, L. (2008). The Miniature Guide to Critical Thinking Concepts and Tools. The Foundation for Critical Thinking. Retrieved from http://www.criticalthinking.org/files/Concepts_Tools.pdf

Proitz, T. (2010). Learning outcomes: What are they? Who defines them? When and where are they defined? Educational Assessment, Evaluation and Accountability, 22(2), 119-137. doi:10.1007/s11092-010-9097-8

Scriven, M., & Paul, R. (1987). Critical thinking as defined by the National Council for Excellence in Critical Thinking. Retrieved from http://www.criticalthinking.org/pages/defining-critical-thinking/766

The Critical Thinking Community. (2014). Defining critical thinking. Retrieved from http://www.criticalthinking.org/pages/defining-critical-thinking/766

University of Richmond. (2008). Assessment workbook for academic programs. Retrieved from http://www.nyu.edu/content/dam/nyu/academicAssessment/documents/Student%20Learning%20Outcomes/Creating%20Learning%20Outcomes-University%20of%20Richmond.pdf

Walvoord, B. E., & Banta, T. W. (2010). Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education. San Francisco, CA: John Wiley & Sons.

WASC Senior College and University Commission. (2014). 2013 Handbook of Accreditation. Retrieved from http://www.wascsenior.org/resources/handbook-accreditation-2013
