
Red herrings and real problems: Some reflections on the evaluation of Aimhigher

The authors

Dr Chilosi is Research Assistant for Aimhigher and Associate Lecturer in the Department of Education at the University of Greenwich (corresponding author; address: QMC 162, University of Greenwich, Old Royal Naval College, Park Row, Greenwich, London, SE13 9LS; e-mail: d.chilosi@gre.ac.uk). Dr Noble is Pro Vice Chancellor (Learning and Quality) at the University of Greenwich. Dr Broadhead is Pro Warden (Students) and Senior Lecturer in the Department of History at Goldsmiths College, University of London. Dr Wilkinson is Pro Vice Chancellor (Infrastructure and External Relations) at London South Bank University.

Abstract

Aimhigher is a key government initiative to widen participation in higher education (HE). Drawing on our experience of evaluating an Aimhigher partnership, this discussion paper engages with current methodological debates on the evaluation of Aimhigher. After summarising how Gorard et al have criticised Widening Participation (WP) research, it shows how the Higher Education Funding Council for England (HEFCE) has responded to the criticisms. It then argues that the debate is distorted by red herrings, but that there are also real problems. Doubts are raised on whether the Aimhigher partnerships are well placed to research their contribution to WP, and on whether currently employed data-gathering and analytical strategies are sound. It is recommended that academic departments and Aimhigher partnerships collaborate to evaluate the initiative, and that serious consideration is given to employing an experimental design. Suggestions are made on how this can be done.

Increasing and broadening HE participation is a dominant trait of current HE policy in the UK (Dearing Report 1997; DfES 2003a; 2003b; 2006). The focus of the policy is on raising the attainment and aspirations of young non-traditional learners through the provision of targeted activities. Aimhigher is a key government initiative to widen, and therefore increase, participation in HE. It organises cultural activities to ease the educational progress of young non-traditional learners aged between 14 and 19. These activities range from mentoring and summer schools to visits to universities, and are designed to encourage young people to attain in school, develop HE awareness and aspire to enter HE. They are delivered by partnerships between HE institutions, further education colleges, local education authorities, schools and other educational agents located in deprived and low-participation areas. Since 2004, following the integration of two previous initiatives, Aimhigher: Excellence Challenge and Partnership for Progression, Aimhigher has been the majority provider of WP activities for young people. The volume of the activities has increased exponentially in recent years. The extent to which Aimhigher has been demonstrated to be an effective means of widening HE participation is contested.

Gorard et al's criticism of the evaluation of WP initiatives

In an influential review of Widening Participation (WP) research written for the HEFCE and published in July 2006, Gorard et al (2006; see also Gorard and Smith 2006) harshly criticised the evaluation of WP initiatives. The authors complain that the quality of the WP literature is below acceptable standards, as it is typically marred by defects in reporting the evidence, makes unwarranted claims of causality and is not based on the experimental method. As a result it is difficult to quantify the impact of WP interventions and assess their cost-effectiveness, or otherwise (see also Gorard 2002a; 2002b; Gorard and Cook 2007). The low quality of WP research is partly blamed on the fact that practitioners contribute to it. Practitioners, Gorard et al argue, often produce descriptive reports, do not base themselves on the experimental method and are under pressure 'to produce results in accord with some predetermined plan' (Gorard
2002c: 381). Given these premises, it will not come as a surprise that for Gorard et al (2006) the evaluation of Aimhigher, much of which is undertaken by practitioners not relying on the experimental method, is wanting. In their view, to date no convincing evidence of impact has been produced on pre-entry interventions for school pupils and partnership-based interventions, such as Aimhigher. This is due to the predominance of a wishful, rather than sceptical, approach to the evidence, the lack of experimental research, over-reliance on the attitudes of the target groups of the initiatives, and insufficient attention paid to their behaviour in relation to actual HE participation. To remedy this situation Gorard et al make two basic recommendations. First, they make a plea for evaluators to change attitude and take a sceptical approach towards the evidence. And, second, they recommend that evaluators rely less on secondary and attitudinal analyses and more on experimental designs, so as to make robust claims of causality.

The HEFCE's response

Gorard et al's criticisms were addressed by the HEFCE in another review of WP research, published later in the same year, in November 2006, and based on a survey of the evidence collected by the HEIs. The stated objective of the report was to address 'concerns about lack of progress in widening participation (WP)' (HEFCE 2006: 9). The Higher Education Initial Participation Rate (HEIPR) and the Universities and Colleges Admissions Service (UCAS) data on HE applications and entries, in fact, provided (and continue to provide) sobering indications of the extent to which HE participation in England had increased, let alone widened, during the previous years: 'The HEIPR had fallen from 42.3 per cent in 2003-04 to 42 per cent in 2004-05; the performance indicators … showed falls in the proportion of entrants from lower socio-economic groups and from low participation neighbourhoods of 0.3 per cent and 0.2 per cent respectively between 2003-04 and 2004-05' (HEFCE 2006: 9).

In response to Gorard et al's criticisms, the HEFCE stressed three points. First, it acknowledged that there are caveats associated with attitudinal analyses of the impact of Aimhigher and other WP initiatives. However, it reasserted their value as a monitoring and evaluating device and emphasised that, to date, the attitudes of learners and teachers have been consistently and overwhelmingly positive. Specifically, the caveats associated with attitudinal analyses stressed by the HEFCE are that they are a weak means of establishing causal connections, that positive responses may be elicited simply by paying attention to people, and that the impact on attitudes may not be long-lasting. Second, the HEFCE acknowledged that not enough has been done to assess the impact of WP initiatives on schooling attainment and HE participation, while stressing that reasonable expectations need to be placed on the effect of the interventions on the former. The one exception to the rule is Aimhigher, with regard to which the HEFCE feels satisfied that convincing and precise evidence has been produced on attainment by the national evaluation carried out by the National Foundation for Educational Research (NFER), and, to a lesser extent, on HE participation by the NFER and the HEIs. For example, it has been found that participating in Aimhigher activities was associated with '[a]n average improvement of 2.5 points in GCSE total point scores' and a '3.9 percentage point increase in Year 11 pupils intending to progress to HE' (HEFCE 2006: 23). Third, the WP evidence, according to the HEFCE, should be taken seriously. Practitioners have a sceptical attitude towards the evidence and acknowledge caveats; it is possible, but unlikely, that the evidence is reported selectively. Moreover, '[i]f the "evidence bar" is set too high', the HEFCE (2006: 6-7) pointed out, 'we run the risk of discouraging any attempt to estimate the effectiveness of the interventions'. In conclusion,
recommendations were made that evaluation strategies for Aimhigher and other WP initiatives be strengthened and that the quantitative impact of individual initiatives on HE participation be investigated.

How exactly, according to the HEFCE, the evaluation of Aimhigher should be strengthened was set out in three documents: Measuring Success: A Guide to Evaluation for Aimhigher, written by Sue Hatt and published in November 2007, and the recently issued Guidance for Aimhigher Partnerships and the Guidance to Aimhigher Partners Paper C. In these, a plea was made for more systematic evaluation of Aimhigher, as distinct from monitoring. In contrast to the previous report, however, the expectation of measuring the discrete impact of Aimhigher was relaxed. The viability of the experimental method was also explicitly rejected: 'The key evaluation question', the HEFCE (cited in Johnson and Paton 2008: 6) pointed out, 'often seems too difficult to answer: did the wp programme lead to these outcomes or was it due to other influences? There seems no scope for setting up a social science experiment in which the experiences of a wp group is compared with a control group. However, we may have made too much of the problem. It may be good enough for the practical purposes of policy makers and politicians if we can make an evidence based judgement on the balance of probabilities that a given programme is producing reasonable results. And if this is all that is available to us the choices are clear.'

Instead, Hatt (2007) and the HEFCE (2008) recommended that a coherent evaluation strategy be deployed at the national and the local levels. In particular, they see the national evaluation as being concerned with secondary indicators of educational progression and widened participation, such as trends in HE applications by social class; analysis of overall attitudinal changes of Aimhigher participants based on the data provided by the partnerships; and investigating the possibility of examining statistical associations between Aimhigher interventions and HE participation. The Aimhigher partnerships were advised to focus on participants' attitudes and behaviour through survey data and tracking studies, investigate the differential impact of different types of activity, examine the impact of Aimhigher on providers and gauge the schools' perspective: 'As relationships are strengthened among schools, colleges and HEIs, funders expect more of the evidence for the effectiveness of WP will come from schools. If schools value WP activities, build them into their plans for school improvement and can provide evidence of improved learner outcomes, the case for WP is made' (HEFCE 2008: 45).

Reflexive research

Legitimate doubts can be raised on whether the Aimhigher partnerships are well placed to research their contribution to WP. The HEFCE understates the risk that practitioners report selectively and make unwarranted claims of causality. Aimhigher partnerships have a strong vested interest in claiming success. Indeed,
particularly as the HEFCE places more emphasis on evaluation, their very survival is at stake. The situation is exacerbated by the fact that Aimhigher partnerships are fragile spaces of collaboration relying strongly on trust and belief (Chilosi et al 2008a). As a result, partnerships have powerful financial and normative incentives to report success and to under-explore the unintended effects of, and potential improvements to, current WP policies. Moreover, practitioners hold administrative posts and research is supervised by managers. The results are that Aimhigher partnerships often lack the expertise in social research and statistical methods necessary to carry out experiments and the complex manipulation of existing datasets aimed at measuring the discrete effects of determinate interventions; that Aimhigher researchers are under particularly intense pressure, 'trying to deliver bricks without straw' (Bone and McNay 2005: 33); and that, as stressed by Gorard et al, much research is descriptive, rather than analytical, and pays scant attention to previous findings. Finally, and related to the last point, short-term funding makes it difficult to carry out long-term research projects, thereby restricting the scope for longitudinal studies and large-scale, in-depth, comparative analyses.

In other respects, however, partnerships exhibit distinct institutional advantages with respect to conventional academic settings. Accountability to the partnership favours ethical research, as it leads researchers to research 'with', rather than 'on', Aimhigher. Although there is a serious risk of 'going native' and doing research 'for' Aimhigher, this dynamic implies that the views and interests of at least some of the subjects (namely, the Aimhigher practitioners) are easy to access and thus can be adequately taken into account when carrying out the research and reporting the findings. Supervision by managers provides researchers with a valuable practical, as opposed to 'armchair', perspective, and reflexive research offers the opportunity to improve one's own approach to practice. Moreover, social links spanning HEIs and educational sectors facilitate access to an otherwise difficult-to-reach evidence base. Notoriously, partnerships are singularly difficult to penetrate, partly reflecting their fragile institutional status. As a result, Aimhigher partnerships are ideally placed to address the lack of cross-institutional studies, which Gorard et al see as one of the main defects of current WP research and, in our view wrongly, blame on practitioners' reflexive research. Hence, there are grounds to encourage collaboration between Aimhigher partnerships and academic departments with a scholarly interest in WP towards the evaluation of the initiative.

Data and analysis

There are doubts also about the extent to which currently employed data-gathering and analytical strategies are sound. Secondary data on schooling attainment and HE participation have been abused, both by the promoters and by the detractors of Aimhigher. '[S]econdary data … help partnerships identify the direction and pace of change' (Hatt, Baxter and Tate 2007: 287). However, unless the comparison of secondary data on schooling attainment and HE participation includes areas not exposed to the initiative and is accompanied by in-depth study of the context, this is as far as they can go. Aimhigher is neither the sole nor, realistically, the most important factor influencing HE applications and entries. For example, increasing numbers of HE applicants and entrants may merely signal demographic shifts, rather than increased HE participation, and given that the latest figures on borough population date back to the 2001 national census, one can only make imprecise assertions on this. The expected effect of Aimhigher on borough- and regional-level data, let alone national data, is limited, one might argue even negligible. To our knowledge two measures of the quantitative impact of Aimhigher on HE entry are available to
date: the one produced by the NFER cited earlier, and one produced by the authors (Chilosi et al 2007a). Both place the increased probability of applying as a result of being exposed to Aimhigher in the order of per cent. Assuming that the estimate is correct, an optimistic estimate would place the Aimhigher effect at the borough level in the order of sixty applicants, where the total number is in the order of 1,000 (the computation is based on the case of South East London, where Aimhigher is targeting about 10,000 learners each year across six boroughs). Even when school- or ward-level data are examined, the measures fall short of accounting for arguably more important intervening factors. Thus, for instance, Johnson and Paton (2008) found that while Aimhigher claimed credit for increased attainment in some Southampton schools, this was not the perception of the schools themselves. Basing, as Hatt, Baxter and Tate (2007) do, assertions of the success of the programme on a declining difference between applicants from high and low socio-economic status in a region of intervention is a dangerous game to play. The finding is hardly typical: in South East London, for example, there is no evidence of such changes taking place. The same goes for national data, which, as mentioned earlier, have to date given rise to little evidence that in recent years HE participation has risen or widened, indicating that the current government's focus on attainment and aspirations as a means of widening and increasing HE participation may be ill thought out, and thus that the cost-effectiveness of Aimhigher cannot be taken for granted.

Attitudes provide more direct indications of impact than secondary data on schooling attainment and HE applications and entries. They are counterfactual insofar as learners and other actors are at liberty to state that participating in Aimhigher activities was not a positive experience and that it did not change their attitudes towards HE. Unless a control group is included in the study, attitudinal measures do not permit one to quantify the impact of the initiative in a precise manner. However, quantification is a useful, but not indispensable, means of evaluating the effectiveness of the initiative, and an experimental design permits robust claims of causality only under certain conditions, which, as the HEFCE pointed out, are difficult to meet in a WP context, a point to which we shall return later. Without sharing the view that a vitalist ontology prevents one from quantifying the impact of WP activities (Paczuska 2004), qualitative studies make it possible to gain insights into the reasons underlying the results of quantitative analysis and to capture some aspects of WP policy more precisely than quantitative measures, such as, for example, the student ambassadors' perceptions of learners (Staetsky 2008; Chilosi et al 2008b; 2008c; Gorard 2002d; 2002e; Hatt 2007). They also give learners and other actors the opportunity to express their views in a relatively unstructured manner, thereby highlighting unexpected effects. Indeed, it can be argued that the data on learners' attitudes are currently underused. For instance, to date, the differential impact of various activities has been mainly evaluated through the potentially misleading perceptions of the providers. Survey analysis carried out by the authors (Chilosi et al 2007b) has suggested that, contrary to the views of the providers, those individuals taking part in an activity which did not involve a visit to a university tended to find it more helpful than the others did. Nonetheless, doubts remain about how representative a few quotations are and about the reliability of the surveys, given that small samples and low response rates are typical. Thus, for example, Johnson and Paton (2008) found that the pre-activity survey they examined was too flawed for its results even to be presented, thereby undermining the claim that learners' attitudes towards HE changed as a result of the intervention. Tracking surveys (Chilosi et al 2007b) have suggested
that enthusiasm for Aimhigher activities does tend to wane over time, and thus we can expect the 'euphoria effect' to significantly influence the results. This was not the case for attitudes towards HE. And yet doubts remain about the extent to which positive attitudes towards HE were the result of participating in an Aimhigher activity. There was a tendency on the part of the respondents towards pleasing the interviewer, as evidenced by the fact that very few individuals expressed views on what put them off going to university, whilst almost all of them were happy to answer the question on what attracted them to it. In other research (Chilosi et al 2008d) based on interviews with school learners, it was found that virtually all the respondents declared a positive intention to go into HE, regardless of whether they had participated in an Aimhigher activity or not. Most participants stressed that Aimhigher encouraged educational progression but did not change their mind about HE, as they had intended to go into it before taking part in the activity. Similar remarks apply to the study of the attitudes of the providers of activities, who, as argued earlier, have a strong incentive to report success. Notoriously, there is an imperfect correspondence between stated and actual behaviour, and Aimhigher participants are no exception; it is common for them to declare an intention to enter HE and subsequently fail to do so. The national evaluation of Aimhigher: Excellence Challenge (Emmerson et al 2006; Morris, Rutt and Yeshanew 2004, 2005; Morris and Golden 2005; Morris and Rutt 2005, 2006) found that in most cases activities had a statistically significant impact on GCSE results and progress into post-compulsory education, and promoted positive attitudes towards HE. The one study addressing the effect on HE participation (Emmerson et al 2006), however, found that Aimhigher did not have a statistically significant impact on this variable. In a tracking study of Aimhigher participants conducted by the authors (Chilosi et al 2008e), the proportions of individuals declaring an intention to enter HE were compared between potential entrants and individuals who had enrolled at the University of Greenwich. The figures turned out to be very similar.

Measuring the effect of Aimhigher

As argued earlier, the current institutional setting implies that it is difficult for Aimhigher partnerships to examine statistical associations between Aimhigher interventions and HE participation. However, if the 'evidence bar' is set too low, the potential for fruitful research carried out at the local level would be lost, and there is a risk that misleading messages are given to the public. In theory, the effect of Aimhigher on the HE entry of past cohorts can be measured through the complex manipulation of existing datasets. This approach has the advantages of being relatively economical to implement and of being unobtrusive, as it does not require any new data to be produced. In practice, the authors' attempt to follow such a route (Chilosi et al 2007a) suggests that there are two main difficulties. First, the monitoring data are good for the purpose for which they were gathered, namely monitoring how many participants have been targeted and how. They are less fit for the purpose of carrying out complex statistical analysis, the results of which are sensitive to inaccuracies embedded in the datasets. Second, there are not enough data available to control for all relevant factors. Hence, the most one can achieve is measures of association, as opposed to causality. An experimental design can help to address these limitations when investigating the impact of Aimhigher on current and future cohorts. It may be that in a context of scarce resources there is a case for evaluators to focus on Aimhigher participants. However, as argued by Gorard (2002f), the ethical implications of deliberately excluding individuals from a given intervention are less serious than is often thought. Insofar as one cannot
assume a priori that the intervention is indeed beneficial, as opposed to having no effect or being harmful, and given that, notwithstanding what the HEFCE states, the available evidence to date falls short of proving the case for Aimhigher, failing to exploit the opportunity to investigate its implications appropriately represents a more serious ethical infringement than deliberately excluding some individuals from it. Moreover, the HEFCE overstates the difficulties of finding appropriate comparators. Our experience with interviewing learners suggests that, perhaps unsurprisingly, there are many individuals from a social background similar to that of Aimhigher participants who have not been exposed to the programme.

There are, however, two obstacles that prevent partnerships from adopting an experimental design for the local evaluation. First, as mentioned earlier, short-term funding restricts the scope for long-term projects. Particularly since Aimhigher cohorts often take unconventional educational routes, longitudinal studies need to be carried out over a number of years to provide solid evidence of impact; this obstacle could be partly overcome by carrying out small-scale research projects involving those Aimhigher participants and control groups who are in their final school years. Second, a rigorous experimental design demands that either all eligible subjects be included in the study or a random sample be selected, and then that the chosen individuals or relevant institutions be compelled to provide information on their educational choices. Arguably this would not be ethically justified in the case of Aimhigher, particularly as there is an issue of stigma associated with participating in WP activities, reflecting the fact that current WP policies embody a deficit model of the excluded (Gorard et al 2006; Chilosi et al 2008c; 2008d; Taylor 2008), and thus there is a case for minimising direct contact with them. One possible way forward would be to employ a design similar to that used by Maras et al (2007), namely asking school learners to answer a set of questions regardless of whether they took part in Aimhigher activities. In this way, convincing evidence was produced that being exposed to WP activities improves attitudes towards studying and HE. The fact that individuals self-select themselves for the study on grounds other than being Aimhigher participants implies that sampling bias and intrusion are limited. Investigating the impact of Aimhigher on HE participation would imply surveying, perhaps combining qualitative and quantitative techniques, university-age individuals, both in HE and not, and investigating whether they have participated in Aimhigher activities and whether these have had an impact on their educational choices. Although individuals not in HE are difficult to reach, as shown by Heath, Fuller and Paton (2008), it is not impossible.

Word count: 4,727

Acknowledgements

We would like to thank Gwenlian Evans for reading and commenting on a previous draft of the paper.

References

Bone, J. and McNay, I. 2005. Higher Education and the Human Good. Bristol: Tockington Press.

Chilosi, D., Noble, M., Broadhead, P. and Wilkinson, M. 2007a. Measuring the impact of Aimhigher on participation in higher education from South East London. Paper presented at the FACE conference. London: University of East London.

——— 2007b. How long is the impact of Aimhigher?
An analysis of three surveys carried out by Aspire South East London Aimhigher. Paper presented at the Participation Research Group seminar. Southampton: University of Southampton.

——— 2008a. On partnership and governance: The case of an Aimhigher WP partnership. Paper presented at the Exploring Learning Trajectories conference. Ormskirk: Edge Hill University.

——— 2008b. Employability and the student ambassador scheme in South East London. In University Life Uncovered: Making Sense of the Student Experience. Southampton: The Higher Education Academy.

——— 2008c. The kid behind the hood: An analysis of the views of the student ambassadors. Unpublished article. London: University of Greenwich.

——— 2008d. HE choice and the Aspire experience. Unpublished paper. London: University of Greenwich.

——— 2008e. The impact of Aimhigher on learners in South East London. Paper presented at the Tree by Sea conference. Brighton: University of Sussex.

Dearing Report 1997. Higher Education in the Learning Society: Report of the National Committee into Higher Education. London: HMSO. http://www.ncl.ac.uk/ncihe/index.htm

Department for Education and Skills 2003a. White Paper: The Future of Higher Education. London: DfES. http://www.dfes.gov.uk/hegateway/hereform/index.cfm

——— 2003b. Widening Participation in Higher Education. Nottingham: DfES.

——— 2006. Widening Participation in Higher Education. Nottingham: DfES.

Emmerson, C., Frayne, C., McNally, S. and Silva, O. 2006. Aimhigher: Excellence Challenge: A policy evaluation using the Labour Force Survey. Research Report RR813. London: DfES.

Gorard, S. 2002a. Fostering scepticism: the importance of warranting claims. In Evaluation and Research in Education, 16:3, 136-49.

——— 2002b. The role of causal models in education as a social science. In Evaluation and Research in Education, 16:1, 51-65.

——— 2002c. Political control: A way forward for educational research? In British Journal of Educational Studies, 50:3, 378-89.

——— 2002d. The role of secondary data in combining methodological approaches. In Educational Review, 54:3, 231-37.

——— 2002e. Can we overcome the methodological schism? Four models for combining qualitative and quantitative evidence. In Research Papers in Education, 17:4, 345-61.

——— 2002f. Ethics and equity: pursuing the perspective of non-participants. In Social Research Update, 39, 1-4.

Gorard, S. and Smith, E. 2006. Beyond the 'learner society': what have we learnt from Widening Participation research? In Journal of Lifelong Education, 25:6, 575-94.

Gorard, S., Adnett, N., May, H., Slack, K., Smith, E. and Thomas, L. 2006. Review of Widening Participation Research: Addressing the Barriers to Participation in Higher Education. Bristol: HEFCE.

Gorard, S. and Cook, T. 2007. Where does good evidence come from? In International Journal of Research and Method in Education, 30:3, 307-23.

Hatt, S. 2007. Measuring Success: A Guide to Evaluation for Aimhigher. Aimhigher.

Hatt, S., Baxter, A. and Tate, J. 2007. Measuring progress: an evaluative study of Aimhigher South West 2003-2006. In Higher Education Quarterly, 61:3, 284-305.

Heath, S., Fuller, A. and Paton, K. 2008. Network-based ambivalence and educational decision-making: a case study of 'non-participation' in higher education. In Research Papers in Education, 23:2, 219-29.

HEFCE 2006. Widening Participation: A Review. Bristol: HEFCE.

——— 2008. Guidance for Aimhigher Partnerships. Bristol: HEFCE.

Johnson, R. and Paton, K. 2008. Chalk and cheese? Evaluating the impact of Aimhigher in six Southampton schools. Paper presented at the Participation Research Group seminar. Southampton: University of Southampton.

Maras, P., Carmichael, K., Patel, S. and Wills, J. 2007. The trouble with year 10: 13-16 year old school students' attitudes to higher education. In Social Psychology of Education, 10, 375-97.

Morris, M. and Golden, S. 2005. Evaluation of Aimhigher: Excellence Challenge Interim Report 2005. Research Report RR648. London: DfES.

Morris, M. and Rutt, S. 2005. Evaluation of Aimhigher: Excellence Challenge: Aspirations to HE: One year on. Research Report RR651. London: DfES.

——— 2006. Evaluation of Aimhigher: Excellence Challenge longitudinal pupil analysis report. Brief No: RB814. London: DfES.

Morris, M., Rutt, S. and Yeshanew, T. 2004. Pupil outcomes: The impact of Aimhigher. Excellence in Cities evaluation consortium. London: DfES.

——— 2005. Pupil outcomes one year on. Research Report 649. London: DfES.

Morris, M., Golden, S., Ireland, E. and Judkins, M. 2004. Evaluation of Aimhigher: Excellence Challenge: The views of Partnership Coordinators. Research Report RR650. London: DfES.

Paczuska, A. 2004. Learning communities: student mentoring as a 'site' for learning about higher education. In Journal of Access, Policy and Practice, 2:1, 59-70.

Staetsky, L. 2008. Participation in HE: Review of Quantitative Literature. Southampton: University of Southampton.

Taylor, Y. 2008. Good students, bad pupils: constructions of "aspiration", "disadvantage" and social class in undergraduate-led widening participation work. In Educational Review, 60:2, 155-68.
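
A note on the borough-level estimate: the figure of 'sixty applicants' cited in the 'Data and analysis' section is back-of-envelope arithmetic. The sketch below makes the computation explicit; the uplift value (3.6 percentage points) is an assumed, purely illustrative parameter chosen only to reproduce the stated order of magnitude, not a result reported in the paper.

```python
# Back-of-envelope sketch of the borough-level Aimhigher effect discussed
# in 'Data and analysis'. The `uplift` value is an ASSUMED illustrative
# parameter, not a figure reported in the paper.

def extra_applicants_per_borough(learners_per_year: int,
                                 n_boroughs: int,
                                 uplift: float) -> float:
    """Extra HE applicants per borough if exposure to Aimhigher raises
    each targeted learner's probability of applying by `uplift`."""
    return (learners_per_year / n_boroughs) * uplift

# South East London case: ~10,000 learners targeted per year, six boroughs.
# An assumed uplift of 0.036 yields an effect in the order of sixty
# applicants, against a total of roughly 1,000 applicants per borough.
effect = extra_applicants_per_borough(10_000, 6, uplift=0.036)
print(round(effect))  # 60
```

Even under this optimistic assumption the effect is small relative to the borough total, which is the point made in the text: borough-level secondary data are a blunt instrument for detecting Aimhigher's impact.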
