Background
The past decade has seen rising numbers of students seeking degree-level study outside their home countries.
According to the Institute of International Education (IIE), the number of international students rose from 2 million in 2000 to 2.5 million in 2006, with an estimated 2.9 million for the 2006/7 academic year. Among the eight most popular destination countries for international study, four are English-speaking: the US, the UK, Australia, and Canada. The UK ranks as the second most favoured destination, accounting for 13% of all international students in 2006/7. Despite variations in enrolment figures across countries, the UK has seen a consistent upward trend in international student numbers, rising from 305,395 in 2002/3 to 376,190 in 2006/7. The leading countries of origin for international students in the UK in 2006/7 included China, India, the US, Germany, France, Ireland, Greece, Malaysia, and Nigeria.
The effects of the internationalisation of higher education have been particularly pronounced in the UK.
UK government policies have significantly promoted the growth of international student enrolment in higher education. Initiatives launched by former Prime Minister Tony Blair in 1999 and 2006 aimed to increase non-UK student numbers; the first initiative exceeded its targets, and the second sought to add a further 70,000 international students by 2011. This surge in overseas students coincided with the diversification of the UK higher education sector and a shift in funding responsibility from the state to individual students. In addition, the statutory cap on tuition fees for home students has created economic incentives for higher education institutions (HEIs) to explore alternative income sources, leading to the development of internationalisation strategies. As a result, government policies have indirectly motivated HEIs to enhance their financial stability by attracting more international students.
According to the Higher Education Statistics Agency (HESA, 2008), the UK has seen significant growth in international student enrolment, particularly in postgraduate programmes. In the academic year 1999/2000, there were 408,620 postgraduate students in the UK, of whom 23% were non-UK students. By 2006/07, the number of postgraduate students had increased further, highlighting the growing international participation in UK higher education.
In recent years, the total number of postgraduate students has reached 559,390, with 8.6% originating from the non-UK EU and 24.4% from other regions. The significant increase in overall student numbers is driven primarily by the latter group, as the proportion of non-UK EU students has remained stable since the 2002/3 academic year. Over the past nine years, both the absolute and proportional numbers of non-UK students have risen. While HESA statistics do not differentiate between research and taught postgraduate students, a 2004 report by the Higher Education Policy Institute indicates that the surge in enrolment is largely attributable to international students joining taught Master's programmes, whereas postgraduate research student numbers have remained steady over this period.
A UK degree is highly valued by international students for its prestigious reputation and the quality of education it represents. Obtaining a postgraduate degree in the UK not only demonstrates advanced English language proficiency, removing the need for future formal testing, but the one-year Master's programme is also more cost-effective and time-efficient than the two-year programmes typical of mainland Europe. Furthermore, the British Council highlights the diverse range of courses, flexible study options, and the multicultural environment of UK universities, where international students often make up over 30% of the student body.
The recruitment of non-native English speaking (NNES) students into UK higher education raises concerns about maintaining degree standards, as highlighted in a recent debate involving the Parliamentary Select Committee, the Quality Assurance Agency (QAA), and the BBC. In his inaugural lecture at the University of Buckingham, Professor Alderman criticised the low levels of English literacy among international students, whose tuition fees are vital for university funding. This claim received support from several BBC articles featuring unnamed academics. In contrast, the QAA noted the difficulty of verifying such claims given confidentiality and legal constraints, while also conducting its own research on the English language proficiency of international students.
In the 1990s, Australia experienced a significant increase in NNES students in higher education, sparking debates similar to those now heard in the UK. Coley (1999) highlighted the diverse sources of evidence used to assess English proficiency entry requirements across Australian higher education institutions (HEIs), revealing a lack of standardisation. Discrepancies between selection processes, entry requirements, and reported language proficiency raise concerns about the criteria and evidence used in the selection of NNES students.
UK university entry requirements for NNES applicants
IELTS test
As part of their admission criteria, UK universities typically require NNES applicants to produce evidence of language skills in the form of formal English test scores.
The International English Language Testing System (IELTS) is the most commonly referenced language proficiency test on university websites. While other tests are accepted, IELTS remains the standard against which alternative scores are evaluated. The test is jointly managed by the British Council, IDP: IELTS Australia, and the University of Cambridge ESOL.
IELTS, a globally recognised examination, has many years of experience in assessing English language proficiency. With a presence in 120 countries through a network of 500 test locations, it serves approximately 6,000 organisations worldwide.
In the academic IELTS used for university entry, scores are reported in whole and half bands, each accompanied by a qualitative description of language ability (IELTS, 2007). Rather than setting a specific pass score, IELTS grades performance, encouraging institutions to interpret test scores in the light of course requirements, their experience with international students, and the analysis of sub-test scores. In view of this, IELTS leaves to academic stakeholders the responsibility for setting their own entrance requirements in terms of test scores.
IELTS has established guidelines linking test scores to courses according to their linguistic and academic demands. A band score of 7.5 is deemed 'acceptable' for demanding programmes such as medicine and law, while a score of 7 is considered 'probably acceptable'. In contrast, scores of 6 and 5.5 are viewed as suitable for less demanding fields such as animal husbandry and catering. Courses in the intermediate categories of academic or linguistic demand occupy the middle ground. From the perspective of UK higher education institutions, it remains to be evaluated whether these classification guidelines accurately represent the diverse array of courses on offer in the UK.
Band Score  Qualitative description of capability
7  Good user: Has operational command of the language, though with occasional inaccuracies, inappropriacies and misunderstandings in some situations. Generally handles complex language well and understands detailed reasoning.
6  Competent user: Has generally effective command of the language despite some inaccuracies, inappropriacies and misunderstandings. Can use and understand fairly complex language, particularly in familiar situations.
5  Modest user: Has partial command of the language, coping with overall meaning in most situations, though is likely to make many mistakes. Should be able to handle basic communication in own field.
4  Limited user: Basic competence is limited to familiar situations. Has frequent problems in understanding and expression. Is not able to use complex language.
The growth of interdisciplinary courses and the differences in course types and levels across higher education have raised questions about the applicability of the IELTS band-score guidelines. In view of this, IELTS provides guidance for institutions and departments to establish score standards that better fit their specific contexts.
While limiting university admissions to a single English language test might seem ideal for consistency, it is impractical; thus, many UK higher education institutions (HEIs) accept a variety of alternative tests as evidence of proficiency. Commonly recognised assessments include the Test of English as a Foreign Language (TOEFL), Cambridge ESOL certificates, and the Test of English for International Communication (TOEIC). Each test varies in structure, complicating standardisation, although equivalence tables and specific requirements are available on university websites. Notably, TOEFL is offered in both paper-based and internet-based formats, each emphasising different language skills. In addition, some UK HEIs have developed their own English language examinations, such as the Test of English for Educational Purposes (TEEP) at the University of Reading and the University of Warwick English Test (WELT).
UK university entrance requirements for English test scores
IELTS recommends scores between 6 and 7.5 for entry into tertiary study, and most higher education institutions (HEIs) adhere to these guidelines. Searches on the IELTS website reveal that a score of 7.5 is rarely required and is predominantly associated with medical and veterinary programmes. The majority of institutions accept scores between 6 and 6.5, indicating that students have a generally effective command of the English language. However, some institutions accept lower scores, with a few allowing as low as 4.5, below the level of the Modest user (band 5) (Brown, 2008).
UK higher education institutions (HEIs) thus currently accept a wide spectrum of English language proficiency for university admission, as the varying test score requirements indicate. At the lower end this includes limited users, whose basic competence is restricted to familiar situations and who struggle with complex language use.
The variability in entry test scores reflects the sensitivity to disciplinary, programme, and institutional diversity that IELTS encourages HEIs to exercise when establishing score requirements. This is borne out by individual HEI websites, which show differing entry requirements both within and among institutions. Research students and certain programmes, particularly at postgraduate level, may require higher scores or specific sub-score levels. This diversity of requirements helps address the difficulties associated with rigid score thresholds, where small changes in the underlying measurement can carry disproportionate consequences, an issue familiar from pass-fail assessment decisions in education.
Several UK higher education institutions (HEIs) have adopted internationalisation strategies that include the establishment of academic English departments. These departments play a crucial role in the selection process, deliver pre-sessional or foundation courses, and provide continuing support throughout students' degree studies. Pre-sessional courses are designed to help students with borderline entry scores reach the necessary language proficiency before starting their programmes, which partly explains why UK HEIs accept lower entry test scores. Notably, students who complete pre-sessional courses at their admitting institution may not need to take a further formal English test. Research indicates that improving by one band score requires 200-300 hours of full-time English study (Gillett, 2008).
This review of UK higher education institution (HEI) entry requirements aims to provide an overview of the current regulatory landscape rather than an exhaustive analysis of entry criteria. The surprising variety of acceptable English test scores raises questions about the selection processes institutions use. It is evident that regulatory information does not capture the actual decision-making processes, criteria, or justifications behind selection. There is therefore a compelling case for a deeper exploration of selection rationales, to better understand the connection between degree standards and linguistic proficiency.
The extensive literature on the internationalisation of higher education focuses primarily on the student perspective, reflecting current UK higher education policies. However, there is a notable lack of research examining the academic staff perspective, particularly regarding academic practices beyond teaching and learning, as well as the pedagogical implications of diverse nationalities, cultures, and varying levels of English language proficiency in the classroom.
The literature review encompasses three key areas: first, it examines predictive validity studies related to the IELTS test; second, it explores the role of formal test scores in higher education admissions; and finally, it presents a recent investigation into the selection process for postgraduate students at a UK university.
Predictive validity studies of IELTS and academic progress
The diversity of entry scores for tertiary study raises questions about the evidence supporting these requirements. Predictive validity studies, which correlate entry test scores with academic outcomes for non-native English speakers (NNES), have commonly been used to address this issue. An influential study by Criper and Davies (1987) suggested that a score of 6.5 effectively distinguished between academic success and failure. However, later research has shown weaker correlations between English entry test scores and academic performance (Cotton and Conroy, 1998; Hill et al, 1999; Kerstjens and Neary, 2000; Lee and Greene, 2007). Methodological issues further complicate these findings, leaving higher education institutions (HEIs) uncertain about how to establish effective cut-off points for admissions (Banerjee, 2003). Some researchers argue that linguistic ability is only one of many factors influencing academic success (Rea-Dickins et al, 2007; O'Loughlin, 2008), and that the predictive validity model fails to capture the complexity of the learning process. Consistent with this, a recent study found that NNES students' language performance during their studies correlated with their pre-entry test scores, suggesting that IELTS reliably measures language proficiency even though proficiency alone does not determine academic outcomes (Ingram and Bayliss, 2007).
Academic success is significantly influenced by various factors, including motivation, subject discipline, programme structure, socio-cultural context, and the need for ongoing language support. Each of these elements plays a complex role in shaping a student's educational experience.
Outcomes research indicates that focusing on the learning process and context can clarify the link between language proficiency and academic progress. It is therefore essential to investigate whether admissions personnel consider factors beyond English language test scores in their selection decisions.
The knowledge of admissions staff about English language tests
Research has examined the attitudes and knowledge of admissions staff regarding the test instruments used at their institutions in Australia, China, and the UK, revealing that staff understanding of the tests and the significance of scores is often limited (Coleman et al, 2003; Rea-Dickins et al, 2007; O'Loughlin, 2008). Despite the authors' expertise in English language testing, there is a consensus that institutions and test providers need to raise awareness of test structures and scores. The underlying assumption is that better-informed admissions decisions would lead to better selection processes and outcomes, although this hypothesis has not been validated by intervention studies. O'Loughlin (2007) highlights the need for further research into how admissions staff use their knowledge in interpreting test scores for selection purposes, given the limited understanding demonstrated in previous studies and their reliance on questionnaires as a research method.
Sub-scores and their interpretation are often highlighted as an area of limited understanding. A study by Rea-Dickins et al (2007) examined the linguistic development of non-native English speaking postgraduate students from the Departments of Education and Politics at a UK university. The students felt that their listening skills were undervalued by formal assessments, and they found the IELTS reading test an inadequate reflection of their academic reading abilities, particularly in the context of demanding postgraduate study. Unlike the IELTS reading test, which evaluates immediate comprehension through direct questions, academic reading requires synthesising multiple texts to produce original written work, a task for which the students felt unprepared. The authors propose that sub-scores in listening and reading may serve as more accurate predictors of academic success than those in speaking and writing. While additional research is needed to validate these findings, the study indicates that these sub-scores could play a significant role in selection processes.
Decision making processes of student selection
A singular example of micro-level research on student selection is found in Banerjee's doctoral thesis.
In her 2003 study, Banerjee explored the selection rationales of admissions tutors for MBA and MA in Politics programmes at a UK university through semi-structured interviews. She found that these tutors employed a balanced-judgement approach rather than the algorithmic selection model typical at undergraduate level. Their decisions were influenced by various, sometimes competing, criteria, particularly for borderline applicants whose experience and qualifications did not fit neatly into categories. Factors considered included academic history, work experience, secondary education, references, and details of the application. Tutors occasionally conducted interviews to verify application information, requiring nuanced judgement across multiple competencies, including English language proficiency. Such detailed decision-making was necessary mainly in unusual circumstances or when demand for places was high, aligning with the IELTS recommendation that test scores be used flexibly according to the individual case.
Banerjee conducted a study on eight non-native English speaking (NNES) students selected by admissions tutors, categorizing them based on their level of 'academic risk.' Through student interviews and critical incident diaries, she discovered that the admissions tutors' evaluations of risk were accurate, as those identified as having the highest linguistic risk faced more challenges and incurred greater time costs in overcoming their difficulties.
Two distinct models of selection emerge from studies of university admissions: one treats the language requirement as a simple yes/no decision, while the other involves a complex interplay of multiple criteria. The first model reflects a controlled, risk-free environment, whereas the second captures the complexities of real-life decision-making. The relationship between these models and actual practice remains speculative owing to limited research evidence. It appears that the scale of operations influences selection processes: undergraduate admissions, with larger applicant pools, tend to align with the simpler model, while postgraduate admissions, characterised by smaller, more diverse classes, may better fit the complex decision-making model. Further research is needed to explore the factors and circumstances that selection personnel weigh when evaluating candidates for admission.
Cranfield University, a wholly postgraduate institution unique in the UK, is recognised by the Higher Education Funding Council for England (HEFCE) for its expertise in engineering and aerospace. Unlike traditional universities, Cranfield emphasises applied knowledge and is structured into four specialised Schools: the School of Management (SOM), the School of Applied Sciences (SAS), the School of Engineering (SOE), and the College of Health (CH). Its multidisciplinary Master's programmes emphasise practical application and maintain strong connections with industry and management. The campus's rural location also distinguishes it from typical UK higher education institutions, requiring students to take the initiative in their social and extracurricular engagements, a contrast with the setting of Banerjee's study.
In the 2007/8 academic session, Cranfield University had 1,646 students enrolled on taught postgraduate Master's programmes, distributed across the four Schools: Management (33%), Applied Sciences (29%), Engineering (31%), and Health (7%). The university has seen a steady increase in overseas students, contributing to its long-established international character. In 2008, Cranfield was ranked second in the world for its international student community by Times Higher Education (THE).
In the 2007/8 academic year, Cranfield University had a diverse student body, with over 110 nationalities represented on campus: 36% UK students, 31% from non-UK EU countries, and 33% from other regions. This distribution indicates a lower percentage of UK students and a higher proportion of non-UK EU students than the sector averages reported by HESA. However, these institutional statistics do not fully capture the national diversity within individual Master's programmes.
In the 2007/8 academic session, Cranfield University offered approximately 65 taught Master's programmes, some of which comprise multiple options, typically four to six. Each programme is normally led by a dedicated Course Director, with a few exceptions where the directorship is shared. It is uncommon for one individual to oversee more than one Master's course, while responsibility for directing options varies, sometimes remaining with the overall Course Director.
The role of Director is usually assigned to a member of the teaching staff with relevant academic expertise, although Options Directors may occasionally handle selection responsibilities. During the 2007/8 academic year, the Schools of Health and Management provided approximately 10 programmes, with the remainder divided evenly between the Schools of Engineering and Applied Sciences.
The disciplinary specialisms of Cranfield are associated with a bias towards men amongst staff and students (28% women).
Cranfield Course Directors are responsible for student selection, in the manner of Admissions Tutors at larger institutions, and play a vital role in upholding academic standards. They serve as a bridge between classroom practice and institutional policy, and their oversight of assessment allows them to evaluate the impact of selection decisions on academic outcomes. Course Directors also often have the final say in borderline selection cases, underlining their unique position within the academic framework.
Cranfield's one-year Master's programmes share a consistent modular structure across subjects and Schools, with modules lasting from one to four weeks. The first term focuses on lectures and practicals, the second term involves a collaborative group project, and the final term is dedicated to an individual research project and thesis. Assessment takes place after each module, with examinations in January and April; students have short breaks at Christmas and Easter and devote the summer to their research projects. The group project is highly valued for its practical application in real-world settings, particularly in the multinational fields of management and aerospace. Class sizes range from 10 to 80 students, with the MBA cohort exceeding 100. These small classes contributed to Cranfield being ranked first in the UK and eleventh in the world for staff-student ratio in the THE World University Rankings.
A recent study at Cranfield highlighted the significant impact of student diversity on teaching and learning, with English language proficiency identified as a key concern for both students and staff (Lloyd-Jones et al, 2007). Lecturers noted that while listening and speaking skills improved over the initial months, students continued to struggle with academic writing. Group dynamics also influenced learning: students tended to socialise within mother-tongue groups, which may hinder English language development and cultural integration. The issue extends beyond the immediate group, and non-native English speakers expressed a strong desire to improve their English and appreciated constructive feedback. There is therefore a compelling institutional need to explore English language proficiency further.
Cranfield University offers a unique setting for research, given its exclusive focus on postgraduate education, diverse international student population, commitment to applied knowledge, and emphasis on science and engineering disciplines. This environment presents an opportunity for comparative case study research that can validate and extend our understanding of non-native English speaker (NNES) student selection procedures and rationales (Ward Schofield, 2000).
This study aims to investigate current admission practices concerning English language testing and assess the impact of selection decisions on academic progress, highlighting the necessity for continuous academic English support at a UK university for international postgraduate students.
The study has the following aims:
1 To describe and explain Course Directors’ admission practices and experience in relation to IELTS scores
2 To examine the relationship between non-native English speaking students' pre-admission IELTS scores and their
- ongoing English language support needs
3 To compare the consequences of different admission criteria and practices upon postgraduate students’ academic progress in a variety of courses
A case study approach was selected as appropriate for examining a contemporary phenomenon, in line with previous research, particularly Banerjee's study (2003). The research design adopted an inductive methodology, allowing findings to shape subsequent inquiry and data collection; this proved beneficial, as early pilot study results were unexpected and influenced the overall direction of the research. The investigation evolved into inter-related studies, each with distinct objectives and methods, highlighting challenges encountered during the research process and their implications. Two foundational assumptions guided the research: first, that selection practices would vary between Schools and Master's programmes, supported by the prior findings of Lloyd-Jones et al (2007); second, that all non-native English speakers (NNES) would possess formal English test scores on entering their programme. The latter assumption proved false, affecting the research aims and limiting comparisons of NNES students' academic progress, and prompting a shift in focus to those who participated in the pre-sessional Summer Programme in English.
Data collection methods
The pilot study investigated the regulations and practices governing the admission of non-native English speaking (NNES) students to taught Master's programmes at Cranfield University, focusing on the pre-sessional Summer Programme English course and the ongoing academic English support available on campus. Initial data were collected from the university's webpages and supplemented by informal interviews with four Registry staff members involved in the admission process, including linguists, administrators, and academic English teachers. Notes were taken during these unrecorded interviews to capture insights into the selection process and the Summer Programme.
The Registry provided student statistics detailing nationalities, MSc programmes, and English language conditions, with additional insights from the European Partnership Programme (EPP) and the 2007 English language Summer Programme. Because English language test sub-scores were available only in paper format, a manual review of admissions files was undertaken for 177 students, representing 10.5% of all taught Master's students. The review covered students whose admission was conditional on demonstrating English proficiency, drawn from 10 MSc programmes across the four Schools: one from CH, five from SAS, and two each from SOM and SOE. The selection of Master's programmes matched the initial focus of Course Directors in the interview study, aiming to identify potential differences in admissions practices among Schools and programmes. In addition, the admissions files of 10 Summer Programme students were reviewed, concentrating on individual student progress rather than extending the search to all NNES students in the newly included Master's programmes.
Findings
Selection procedures
Cranfield University aligns with other UK higher education institutions in requiring minimum English test scores for non-native speakers: IELTS 6.5, TOEFL 580 (paper-based), 237 (computer-based), or 92 (internet-based), with alternatives such as TOEIC (830) and Cambridge ESOL certificates also accepted. These standards have been in place for nine years, and Schools and Course Directors may set higher requirements if they wish. Test results must have been obtained within two years of the course start date. To counter concerns over fraudulent certificates, scores and sub-scores are verified against the tests' electronic databases.
The pilot study revealed a distinctive entry route for European MSc students through the European Partnership Programme (EPP), which is associated with the EU Erasmus Programme. This established scheme allows EU undergraduates to apply to spend their final year at Cranfield, earning a double degree from their home institution and Cranfield. Participation is limited to continental institutions with a partnership agreement with Cranfield, currently 64 institutions across 15 countries, with further additions from central Europe in progress. The EPP has grown significantly, and the majority of non-UK EU Master's students now enrol at Cranfield through this route: 330 of 508 students in the 2007/8 academic year. The shift is also evident in the nationalities of participants in the English language Summer Programme, where approximately 50% of students now come from Europe, a change from the previous predominance of students from developing countries and the Pacific Rim.
The admission and selection process for non-native English speakers entering through the EPP route at Cranfield differs from that of other applicants. After applying first to their home institution and then to Cranfield, candidates are interviewed in their home country by Cranfield staff, with Course Directors occasionally involved. English language proficiency is assessed by Academic English staff and Course Directors on the basis of interview evaluations and written applications, although not all students have taken formal English tests. For instance, while a TOEIC score of 750 is required for the Diploma in the French Grandes Écoles, Cranfield's entry requirement is higher at 830, a significant point given the many French students enrolled through partnerships with 29 French higher education institutions. The EPP procedure offers several advantages: Course Directors become increasingly familiar with applicants' undergraduate courses, enabling detailed assessment of their academic capabilities, and the process provides early identification of students who may require pre-sessional academic English tuition, allowing conditional offers based on successful completion of the Summer Programme, which included 14 EPP taught Master's students in 2007/8.
Students whose first language is not English and who cannot participate in the EPP require tailored arrangements for the assessment of English language proficiency. European students attending institutions outside the EPP must provide a satisfactory test score, as the regulations require. However, for non-European applicants with prior experience of using English in work or education, formal testing may be unnecessary. In cases of uncertainty, the Course Director is advised to conduct a telephone interview to evaluate the applicant's language skills and, based on the outcome, decide whether to request a test or waive the requirement.
Academic English provision
The increasing enrollment of international students at Cranfield has highlighted the necessity for academic English support. Students with an IELTS score below 6.5 can enroll by completing an intensive pre-sessional English programme, known as the Summer Programme, which runs from July to September. This programme enhances students' proficiency and confidence in the key language skills essential for academic success. Generally, students with an IELTS score of 6 require one month of tuition, those with 5.5 need two months, and students scoring 5 must participate in the full three-month course. While students typically improve by 0.5 band scores for each month of tuition, results can vary. The programme accommodates up to 6 students for three months, 12 for two months, and a maximum of 30 for the final month, featuring morning classes and afternoon activities, including regular writing tasks. Assessment is primarily formative, with weekly meetings between students and tutors to set goals and track progress. At the conclusion of the course, a report detailing each student's achievements and ongoing English support needs is provided to their Course Director, with language skills graded from A to E. A grade of E signifies inadequate language proficiency for successful Master's level study, and approximately 10% of Summer Programme participants do not meet the necessary standards for admission to a Master's programme.
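The placement rule above (6.0 → one month, 5.5 → two months, 5.0 → the full three months, 6.5 or above → no pre-sessional tuition) can be sketched as a simple threshold lookup. This is a minimal illustrative sketch only: the function name, the treatment of scores below 5 as outside the Programme's range, and the exact band boundaries between the stated scores are assumptions, not documented Cranfield procedure.

```python
def presessional_months(ielts_overall: float) -> int:
    """Months of Summer Programme tuition implied by an overall IELTS score.

    Hypothetical sketch of the placement rule described in the text;
    the thresholds follow the stated bands, everything else is assumed.
    """
    if ielts_overall >= 6.5:
        return 0  # meets the direct-entry threshold, no pre-sessional tuition
    if ielts_overall >= 6.0:
        return 1  # one month of tuition
    if ielts_overall >= 5.5:
        return 2  # two months of tuition
    if ielts_overall >= 5.0:
        return 3  # the full three-month course
    # Scores below 5 fall outside the bands described in the text.
    raise ValueError("score below the range the Summer Programme admits")
```

Note that the bands are consistent with the rule of thumb that each month of tuition raises the overall band by roughly 0.5, lifting each group to the 6.5 entry level, although the text stresses that actual gains vary.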
Students may be advised to pursue weekly English tuition for academic writing, facilitated by academic English staff. Since 2008, these sessions have been available to other non-native English speaking (NNES) students, leading to a demand so high that a screening test for eligibility became necessary. Occasionally, one-on-one tuition is available, depending on resources. Additionally, the School of Management provides oral English classes, which are neither free nor unlimited, but students from other schools can join if approved by their Course Director.
Analysis of selected application forms
The admissions and selection process reveals complexities, as evidenced by the analysis of 177 records where English language proficiency was a prerequisite. If algorithmic selection were the standard, most students would need to present test scores, yet only two-thirds (118) of the sampled students had such scores, and 50 underwent interviews. The records do not clarify whether the test or the interview holds more weight. Notably, for over a third of applicants (68), a test score is the only evidence of English language proficiency provided.
The Registry records indicate that interviews are commonly used to assess English language proficiency, occasionally by telephone. Of 61 interviewed students, 11 were exempted from the English test score requirement. While the records lack detailed explanations for these waivers, factors such as previous degree-level education or work experience in England may have influenced the decisions.
In postgraduate education, the diversity of the population means that English language waivers do not necessarily indicate a breach of institutional policy, as many applicants possess strong English skills that make formal tests unnecessary (Sastry, 2004). The absence of recorded reasons for waivers likely means that interviews are under-reported. The records do, however, confirm that interviews are conducted by Cranfield representatives, including Course Directors, who share the responsibility, although the distribution of this work may vary with the number of EPP students: the more non-EU students a programme has, the more interviewing falls to the Course Directors.
All students with an English test score 118
Students with an English test score alone 68
All students interviewed and with an English test score 50
Students interviewed but without an English test score 11
Interview with Course Director and linguist 1
Total number of recorded English test scores 141
Students with more than one English test score 20
Students with other tests – Cambridge ESOL etc 19
Students with overall IELTS scores of 6/6.5 with a sub-score of 5.5 or below 9
Table 2: Methods used to assess students' English language proficiency for admission to Cranfield, from a purposive sample of 177 students in 2007/8 (taken from Cranfield Registry data)
Differences in selection processes were evident among the various Schools and programmes, particularly in the School of Management (SOM), where applicants were more likely to submit multiple test scores, including the Graduate Management Admission Test (GMAT). While the GMAT is not specifically an English language test, it evaluates analytical writing, verbal ability, and numerical reasoning, making it a common requirement in business and management disciplines; all students who submitted GMAT scores were applying to SOM courses. There were also notable variations in the use of interviews among different Master's programmes, with some Course Directors emphasizing interviews as a key method for evaluating candidates. For instance, one Course Director in the School of Engineering (SOE) interviewed nearly half of the non-native English-speaking (NNES) applicants, while other programmes, such as those in Cranfield Health (CH) and SOM, did not prioritize Course Director interviews, with some interviewing fewer than 10% of NNES students.
School Sampling code Total number of students Number of NNES students Number of students interviewed by CD
Table 3: Numbers of recorded interviews conducted by Course Directors in the selection of NNES students in 10 Masters programmes (taken from Registry data)
Although the IELTS test is the most frequently used test of English proficiency amongst the sample, its popularity is only slightly higher than that of the TOEFL. Moreover, an overall score may conceal significant discrepancies in sub-scores. Among 24 students with overall scores of 6 or 6.5, nine had at least one sub-score of 5.5 or lower, with six of these deficiencies occurring in writing. Notably, one student achieved an overall score of 6.5 yet received a writing sub-score of only 4, while two students with overall scores of 6 had writing and speaking sub-scores below 6.
The pilot study confirmed that selection practices and procedures vary between Schools and Masters programmes, revealing differences among Course Directors regarding the use of selection interviews. These findings were integrated into the next phase of the research by adding the topics of selection interviews and sub-scores to the interview schedule for Course Directors. As a result, the pilot study effectively influenced the direction of the subsequent enquiry.
The diverse measures used to evaluate English language proficiency for institutional entry affected the study's second aim. Initially, it was believed that most non-native English speaking (NNES) students would have English test scores; the pilot study revealed otherwise. The absence of a consistent baseline for NNES students' English proficiency at entry made the planned comparisons unreliable, and the limited number of students with test scores would likely invalidate any statistical analysis. Consequently, alternative methods were explored to achieve the study's second objective.
School Overall IELTS score Listening score Reading score Writing score Speaking score
Table 4: IELTS scores for NNES students scoring 6 or above overall but below 6 on one sub-score (taken from Registry data)
While the findings of this pilot study provide useful insights, they should not be overgeneralized: the purposive sampling limits broader applicability. Additionally, the admissions data is likely incomplete, as it primarily records decisions without requiring explanations or justifications of interview processes. A deeper investigation into Course Directors' selection practices and experiences is necessary for a comprehensive understanding, and this is explored in the following section.
7 INTERVIEW STUDY WITH COURSE DIRECTORS
Method
Sampling
The research team, drawing on the extensive experience of two of its members within the institution, established the sampling rationale and selection criteria for Course Directors in the interview study. They also considered insights from admissions staff involved in the pilot study on admission practices. The preliminary findings supported a purposive sampling approach aimed at capturing a diverse range of viewpoints rather than a singular typical perspective. Leveraging contextual knowledge and the socio-economic landscape of UK higher education, the team identified specific criteria that could shape a Course Director's perspective on the selection of non-native English speaking (NNES) students:
demand for places on the programme
courses with large groups sharing a first language other than English
Assessing demand for places at the institution requires a detailed analysis of the admission and selection processes, focusing on the number of applicants, offered places, and accepted enrollments. While such data may exist at departmental or school level, it was not accessible during the study. Consequently, the assessment of demand for individual courses relied on the expertise and insights of research team members (CN and SM). To validate sampling decisions, the interviews included questions about student applications, offers, and conversion rates, although the availability of this data varied.
One Course Director was selected for the study because of his interest in English language testing, although his Masters programme also met the sampling criteria. As the research unfolded, it became evident that another factor to consider was length of experience in the directorship role, particularly since one director was in his first year. This realization raised questions about how staff responsible for selection decisions acquire their skills and highlighted the potential need for academic development and training in this area. Additionally, the common practice of conducting selection interviews, typically in one-on-one settings, further justified the inclusion of this category. For this specific MSc programme, two interviews were conducted: one with the current Course Director and another with his predecessor.
Table 5 details the Course Directors involved in the study, highlighting their respective Schools, the variations in criteria, and the corresponding interviewee codes. Notably, all Course Directors were male, with the exception of the Course Director from Cranfield Health. The interview study included eight programmes from the pilot study.
Of 16 Course Directors approached to participate, one declined on the grounds that he was about to relinquish the role, suggesting that the incoming staff might contribute instead. As this programme had been designated a low-demand course, it was important to pursue alternatives to avoid omitting a potentially significant perspective. Two solutions were found. In the first, a departmental staff member with four years' experience of admission and selection for one of the programme options consented to an interview. In the second, it was decided to follow up the outgoing Director's suggestion and convene a group interview with the departmental staff taking on admission and selection responsibilities. A group interview was preferred because of the shared situation facing the group members (Morgan, 1997) and the limitations that their short experience in the role would place upon data gathering in a one-to-one situation. As well as exploring departmental staff attitudes, the focus group aligned with the emergent criterion of role expertise already discussed above and brought into the study the views of academic staff who were themselves multilingual, non-native English speakers (German and Egyptian). Some of the focus group members had experience of teaching in other UK HEIs and abroad and so were able to offer comparative viewpoints. Another MSc programme was omitted (although it had been used in the pilot study) when it was realised that the Director was on maternity leave and her role was temporarily being undertaken by a colleague.
Code School Demand Monolingual groups Class size Experience in post
CH Cranfield Health unspecified None 14 4 years
Sciences high 14 French 49 In first year of post
SOE1 School of Engineering high None 36 10 years +
SOE2 School of Engineering low 10 Polish 42 10 years +
SOE3 School of Engineering high 19 French 52 4 years
SOE4 School of Engineering high None 50 10 years +
SOM1 School of Management high None 42 3 years
SOM2 School of Management high None 30 1 year
SOM3 School of Management high None 55 5 years
Table 5: Sampling for Course Directors’ Interview study
The selected interviewees were contacted by email and/or telephone, receiving a brief overview of the research, an explanation of their selection, and an invitation to discuss the admission and selection practices for NNES students. They were assured that their interviews would be audio-recorded and transcribed, with their identities protected in future presentations and publications. While School titles were retained for reporting purposes, MSc programme titles were coded to maintain participant confidentiality.
Interview schedules
The interview schedule (Appendix 1) was designed to investigate Course Directors' admission and selection practices concerning English language proficiency, focusing on how they use English language test scores in their selection decisions. It comprised three sections addressing key topics related to these practices:
factual information about the MSc programme and the interviewee’s experience of
selection and admission practices of non-native English speaking students on the programme
the relationship between international classes, pedagogy and academic progress of NNES students
The third topic was pursued to capture Course Directors' perspectives on the academic and linguistic development of non-native English speakers (NNES) and to investigate practical strategies teachers can employ to facilitate learning in international postgraduate settings. For instance, the extensive use of group projects in Cranfield MSc programmes raises challenges that may arise in classes with multilingual students. Understanding how educators manage and address these challenges can provide insights into the existing literature on cultural differences and second language acquisition.
During the study, the institution was engaged in a contentious debate regarding the online publication of all Master's theses, prompting the inclusion of a question to gather participants' opinions. The use of semi-structured interviews facilitated a flexible and engaging questioning style, enabling the interviewer to explore new and emerging ideas, examples, and concepts as they developed.
The Course Director SOE1 was selected as the pilot interviewee because his well-articulated views on the subject promised a fruitful discussion. Anticipating a productive interview, additional questions were incorporated to explore the programme's history, its standing in the UK higher education sector, and its structure and assessment methods, particularly its group project work, which provided rich insights into language and cultural dynamics.
The focus group guide, designed to encourage open discussion, was adapted from the previously used interview schedule, featuring fewer questions and prompts. Although the focus group method did not allow for piloting the guide, its derivation from an already tested schedule mitigated potential drawbacks. Ultimately, the discussions within the focus group were productive and interactive.
Interviews and focus groups, conducted by GL-J, lasted between 45 and 60 minutes and were primarily held in her office between March and December 2008. Two interviews, with SAS2 and SOM3, were conducted at the interviewees' request in their own offices; SAS2 opted out of audio recording, so notes were taken during that session. All other interviews were audio-recorded and transcribed by GL-J.
The analysis of the data was conducted with MAXQDA software, employing both deductive and inductive approaches. The deductive approach aligned with the predetermined questions in the schedules, while the inductive approach facilitated the discovery of new and unexpected themes within the data.
This article is organized into three thematic sections linking the research literature with the empirical data. The first section examines the various selection practices at Cranfield and the impact of the sampling criteria on these practices. The second focuses on Course Directors' understanding and application of English test scores in the selection process, highlighting their relationship with other selection factors. The final section explores Course Directors' perspectives on the academic and linguistic progress of NNES students, considering the implications for programme structure, delivery, and pedagogy. Recurring subsidiary topics are referenced across the sections to provide a comprehensive overview.
Models of selection practices
Demand for places
The hypothesis that demand for university places influences selection practices is pertinent to the current UK discussion on degree standards and internationalisation. While there was no evidence of entry requirements being lowered in response to declining course demand, Directors of high-demand courses had raised language entry requirements, using them as a filter for allocating places; on the SOE1 course, for example, high demand allowed the Director to increase the required English test scores. Additionally, high demand may make it easier for Course Directors to make decisions about candidates with borderline English proficiency.
One Course Director in SOE describes how demand and English language entry requirements are related on his course:
Course Director: “We were able to more or less guarantee that because with more than 400 applicants we could just pick the ones. You can really cherry-pick.”
Interviewer: “Could you raise the entry level requirement?”
Course Director: “I don’t know that, I haven’t thought about it but I don’t know whether we would need to.”
Similar examples occurred in SOM and SOE but only in courses designated as high demand.
Large student groups with a shared language other than English
Course Directors emphasized that the social and cultural integration of non-native English speaking (NNES) students is crucial for their academic and linguistic development. They noted that a significant minority sharing a first language other than English poses challenges, as this can delay integration. The formation of social cliques among these students limits opportunities for improving English language skills and hinders the process of enculturation.
Approximately one-third of our students are French, and our European partner universities typically impose a maximum limit of four or five students per MSc program Within each option, they prefer no more than two French students, as past experiences indicate that larger groups tend to form cliques, communicate primarily in their native language, and consequently hinder their English language improvement.
The negative impact on learning and academic progress is evident for both native English speakers and NNES students, particularly in group projects. Course Directors prioritized balanced group allocation based on nationality, English proficiency, and academic ability. The SAS1 and SOE3 courses exemplify how thoughtful group selection can reduce barriers for large minority groups within a cohort: in the SOE3 programme, students deliver weekly oral presentations for the first six months, while SAS1 encourages collaboration among students from different options, fostering interaction with diverse peers. These considerations also applied to smaller options and programmes with limited class sizes.
Class size
There was no evidence that class size affected selection decisions independently of demand or of the presence of a majority NNES student group with a shared first language other than English.
Course Directors' use of English test scores in the selection process
General selection criteria
Course Directors focus on evaluating students' academic abilities and potential for success in Master's level studies, highlighting the distinct differences between undergraduate and postgraduate education. Several referred to the ‘intensity’ of Masters study, such that study time was jealously protected, as described in Extract 7:
Balancing the demands of a nine-to-five schedule, five days a week, poses challenges in effectively managing time While there are numerous suggestions for course enhancements, it's essential to recognize that introducing new elements requires the removal of existing ones We cannot continuously expand the curriculum without making adjustments elsewhere.
Course Directors seek motivated and confident students who can express themselves effectively and present arguments relevant to their field, reflecting the importance of these skills in group projects and the value placed on independence and critical thinking in UK higher education. Proficiency in English is crucial for developing and demonstrating these abilities, and this shapes the interview process. One seasoned SAS Course Director emphasized the importance of assessing an applicant's adaptability to the UK higher education system, offering a distinct perspective on the interview's role:
When evaluating a student's potential to adapt to a new academic environment, it's essential to consider their academic performance alongside their language skills, confidence, and motivation While language proficiency is important, it should not be the sole criterion for rejection; a holistic assessment of the student's capabilities is crucial for making informed decisions.
Course Directors adhered to institutional and School regulations when evaluating students' English test scores, viewing this primarily as a matter of compliance. While most recognized that English tests assess various language skills, there was limited understanding of the differences among the tests. Only the Directors in the School of Management (SOM) and two from the School of Applied Sciences (SAS1B and SAS3) regularly examined sub-scores; others tended to do so only in borderline situations.
Borderline cases
Course Directors approached borderline cases and applicants without test scores with distinct strategies, scrutinizing application forms for corroboration and for discrepancies between qualifications and skills. A too well-crafted personal statement could raise suspicions about its authenticity. Evaluations often drew on prior experience with students from specific institutions or countries. While preferences for interviews varied, those who supported the method emphasized its effectiveness in assessing speaking skills. Evaluating writing skills posed a greater challenge because of authenticity concerns, prompting many Course Directors to look for evidence of applicants' spontaneous writing in emails and instant messages.
Receiving emails from students expressing interest in a course can be concerning, especially when they use text speak. This raises questions about their English proficiency, as effective communication should involve proper sentences. However, after further correspondence with the student, clarity and coherent language often emerge, indicating their true understanding and capability.
Two Course Directors, one each in SOE and SAS, had invited applicants to write a piece on a topic related to the course at interview.
Sceptics
Two Directors, one each from SOE and SOM, expressed skepticism towards English tests after careful consideration; although they did not represent the majority view, their perspectives merited further exploration.
The Course Director in SOE, with over a decade of experience, had previously relied solely on test scores to evaluate applicants' English language proficiency, foregoing interviews. Concerned about his students' writing standards, he had raised the course's entry requirement to a score of 7 two years earlier, but this change did not yield the expected improvements. Doubting the effectiveness of his previous assessment methods, he was now exploring alternative evaluation strategies, including additional tests, writing tasks, or interviews.
Despite the raised entry standards, many students continue to struggle with the course, prompting a reconsideration of whether the current assessment methods effectively evaluate their capabilities. This situation raises the question of whether to raise the barriers further or to acknowledge that the testing measures may not accurately reflect the skills and knowledge expected of students at Cranfield.
The SOM Director, with three years of experience, had focused on enhancing the admissions and selection process during his tenure. While adhering to institutional guidelines on entry test scores, he had observed a lack of correlation between these scores and candidates' linguistic abilities or academic progress. To address this, he introduced interviews for most applicants, believing this approach led to an improvement in candidate quality. His pragmatic perspective on testing is illustrated in Extract 11.
While IELTS scores can indicate proficiency, they are not entirely reliable, as two individuals may achieve the same score of 6.5 yet demonstrate vastly different abilities. A significant concern with such examinations is that they reflect performance under specific conditions on a particular day; if the same test were taken a month later under different circumstances, the results could vary considerably.
While examinations can provide some insight, they are not entirely reliable for assessing communication skills. Therefore, I prefer to conduct interviews as a supplementary evaluation. Although interviews do not directly test English language proficiency, they offer valuable indicators of an individual's ability to communicate effectively, which is crucial.
Course Directors frequently shared cautionary examples of students whose test scores did not accurately reflect their linguistic and academic development. One notable instance involved a student who, despite holding an undergraduate degree from the UK, ultimately failed an MSc programme. In Course Directors' thinking, such practical experience carried more weight than the statistical validity of formal testing.
Management of the selection process and learning how to select
Some Course Directors strongly believed that interviews provided valuable insights for selection decisions, while others preferred to depend solely on test scores. The latter attitude reflected a view of selection as primarily an administrative task, shaped by considerations of efficiency and resource management, as illustrated by one participant's experience at another UK university:
In our department, admissions tutors focused on recruiting for various degrees, but we found that much of our work was administrative rather than academic. To streamline the process, we appointed an administrator to handle the admissions tasks, except for cases that required academic judgement. She would sort applications into clear accept and reject groups based on set criteria, while consulting academics on borderline cases. Personally, I realised that only about 20% of my admissions decisions were based on academic considerations.
While this participant is describing experience beyond the institution in question, the perspective aligns with that of a minority of Course Directors. This viewpoint contrasts with the research literature, which suggests that better stakeholder knowledge of admissions tests leads to better selection decisions, and it indicates potential resistance among some academics to further training in this area. The interviews also revealed a lack of consensus among participants on whether extra training in selection was needed.
The confidentiality surrounding selection decisions raises the question of how Course Directors acquire knowledge about admissions and selection processes. Most Course Directors, when starting in their roles, relied on mentorship from their predecessors and gained insight through a blend of personal experience and guidance from colleagues.
Gaining insight about candidates often comes from conversations with colleagues and from initial impressions when reviewing applications. Drawing on past experience with similar applications can guide decision-making, leading to opportunities for promising candidates.
Mentoring of a new Course Director works best when the previous director remains at the institution; if they leave, the new director may find themselves largely unsupported. This lack of continuity can hinder the transition and the overall effectiveness of the course leadership.
One Course Director whose predecessor had left the institution felt there was a place for formal training for new Course Directors.
Course Directors' views of NNES students' academic and linguistic progress
Relationship between test scores and academic progress
As well as the two Course Directors quoted in section 8.2.3, opinions varied about the extent to which English proficiency test scores reliably indicate a student's ability to succeed academically. While some Course Directors were content with existing selection procedures, others expressed skepticism. The numerous reported instances of discrepancy between test scores and actual academic performance align with research findings that fail to establish a clear link between English language proficiency and academic success, and they underline Course Directors' view that academic achievement is influenced by a complex combination of factors and circumstances.
Course Directors emphasized that immersion in the UK educational culture is crucial for both linguistic and academic growth. The diverse student population at Cranfield facilitates this immersion, as English serves as the common language among international students. However, some Master's programmes may still have a notable number of students sharing a first language other than English. Directors also raised concerns about NNES students who live with fellow compatriots in the UK, as this can limit their opportunities to use and practise English.
Limited exposure to English can hinder linguistic development; however, intervening in students' extracurricular lives is deemed inappropriate, leaving institutions and Course Directors with little control over this aspect. The significance of English immersion is discussed further in relation to Summer Programme students in Section 9.2.3.
Speaking skills
An exploration of Course Directors' perspectives on the language skill development of NNES students revealed that writing skills are the primary concern. While speaking skills may start off limited, most students improve significantly within the first two to three months, enabling them to engage in conversation satisfactorily for course requirements. Instances of students with severely poor pronunciation who could not complete the course are infrequent. An NNES lecturer's observations on students' spoken English illustrate this point:
While spoken English may initially pose some challenges for students, they tend to adapt rapidly. In my experience, spoken English is not a significant issue, and I have not encountered major problems in this area.
NNES Lecturer, SAS, focus group
Cultural references often influence student participation in class discussions, which can complicate a lecturer's ability to evaluate each student's language development accurately. A Course Director reflects on the interplay between culture and language, particularly concerning the challenges faced by Chinese students in their course.
Interviewer: “How long does it take before the Chinese students’ spoken English is reasonably good?”
The Course Director notes that while some students may be quiet, others are willing to stand out and engage. This behavior may be influenced by cultural factors or a lack of comfort with the English language, suggesting that improved language skills could lead to greater participation and visibility among students.
Interviewer: “Do they make any further progress or remain like that throughout the course?”
The Course Director believes that students often adapt by the second semester, suggesting that their initial struggles may stem more from cultural differences than language proficiency. While it's possible that their English improves after three months, the Director remains uncertain about this assertion.
While there is consensus that oral skills are not a significant issue, this may be attributed to the unique context of Cranfield, particularly its geographical isolation. This environment encourages students to depend more on themselves, the campus, and the surrounding area for extracurricular activities.
Writing skills
One topic on which there was broad agreement amongst Course Directors was concern about standards of written English (Extracts 3 and 16).
Among 40 students, a few may struggle with report writing if not properly supervised. This lack of management can lead to subpar written reports. Therefore, it is essential to identify and support weaker students to ensure quality outcomes.
Academic performance often correlates with thesis outcomes, highlighting the need for focused supervision of borderline students. By monitoring their report structure and writing closely, we can ensure they receive the support necessary for success, despite the small number of students requiring this attention.
The thesis presents a persistent challenge as a publicly accessible document that reflects academic standards subject to external evaluation. The discussion surrounding the online publication of all Masters theses heightened Course Directors' concerns regarding these standards. Additionally, the burden of poor thesis writing significantly affects supervisors and their workloads, an often overlooked aspect of academic responsibilities. Since the thesis supervision process is largely undocumented and there are strong incentives for producing quality work, there is a lack of evidence to show variations in workloads based on students' writing abilities.
Additionally, the timing of the thesis allows little scope for substantive improvements in writing proficiency.
While there was agreement on the importance of student writing, opinions varied on how to address the issue, as some academic staff did not see it as their responsibility to correct English language skills. To support at-risk students, several Course Directors implemented pedagogical strategies, such as early writing assignments to identify those needing additional English support. Many courses also created more opportunities for writing practice, with one course modeling group project reports on thesis structures to enhance learning. Additionally, a long-standing Course Director noted that there is now significantly more learning support available for thesis writing than in the past.
Recent discussions about writing skills revealed unexpected issues, highlighting that poor writing is not limited to non-native English speakers (NNES), as UK students also struggle more than in the past. Additionally, the assessment of written English varies by School, with marks often not specifically allocated for language but included under broader criteria such as 'presentation' or 'structure'. This inconsistency has led to differing approaches among Course Directors regarding the correction of grammatical and spelling errors in assignments and theses. For instance, SAS5 and SOM2 actively proofread their students' work, believing that providing corrections serves as valuable feedback and may motivate students to enhance their writing skills.
In grading coursework, we focus on identifying English language issues within students' reports and encourage improvements in future submissions. While language clarity is crucial, it only affects the overall mark if the report becomes unreadable. There is no specific mark allocated for English proficiency; however, presentation elements may contribute to the grade, primarily assessing the structure of the report and the quality of accompanying map work in certain assignments.
Course Directors across various Schools, particularly those from the School of Education (SOE), have emphasized the need for more willing and capable proofreaders on campus to alleviate the burden on supervisors. However, the interviews did not delve deeply into the specific nature of proofreading, leaving unclear whether the focus should be on grammar or on more substantial editing issues regarding the expression of ideas and arguments. The findings suggest that theses are the academic products most affected by the tension between maintaining academic standards and the increasing number of non-native English speaking (NNES) students, indicating that finding effective solutions is complex and not immediate.
This section outlines how the study's second aim was achieved by examining the relationship between NNES students' entry test scores and their subsequent academic progress. Due to the limited number of students with English test scores, the research shifted focus to the Summer Programme students. This alternative strategy was justified as these students formed a distinct group with identified borderline language abilities, allowing for effective tracking of their progress.
To assess the progress of Summer Programme students in their English language proficiency, profiles were created to track their development throughout the year. Although the initial plan aimed to correlate these profiles with academic performance, such as assignment and project grades, institutional regulations prohibited the publication of individual marks. As a result, alternative methods of evaluation were necessary.
Data collection methods
Documentary sources
The study utilized documents and questionnaires as data sources, focusing on the 2007 Summer Programme students who underwent various linguistic assessments facilitated by academic English staff. These assessments included English test scores and individual reports detailing grades in specific language skills, along with comments and recommendations for future studies. Additionally, all students participated in a pre-test IELTS exercise in late September 2007, evaluating their academic writing, listening, and reading skills, though speaking was not included. This pre-test aimed to trial new items for future IELTS papers, and while the results were not directly comparable to official IELTS tests, they provided an approximate measure of students' abilities, especially for those lacking entry scores. The relative differences in individual pre-testing scores may still offer valuable insights into students' linguistic capabilities despite being based on a single examination.
Examination scripts
Apart from a student's need for ongoing language tuition, there were no specific measures of students' language abilities once they had commenced the MSc course. The earlier diversity study by Lloyd-Jones (2007) and the Course Director interviews highlighted concerns regarding students' writing skills, prompting a focus on these abilities. Course assessments were selected as the primary data source due to their contextual relevance, while additional testing was deemed unacceptable to students. Among the three written assessment formats (course assignments, examinations, and theses), examinations were chosen for investigation due to their authenticity and standardized timing across courses. Access to examination scripts was granted by the Registry, contingent on consent from students and Course Directors. After obtaining consent from all Course Directors of Summer Programme students and approaching students individually, 22 participants were included in the study, with two declining. Course Directors or Administrators provided the necessary exam scripts, which were copied and returned, although some scripts were unavailable due to external examining requirements. It was agreed that copies would be destroyed on completion of the study.
The analysis of the scripts focused on two primary questions: first, to assess the extent of textual content that could support the interview data for specific programs, and second, to identify examiner comments regarding language, whether critical or positive. Two reviewers, an Academic English lecturer and a research team member, independently evaluated the scripts, resulting in strong consensus between them, with only minor uncertainties regarding the interpretation of certain stylistic or structural comments.
Questionnaire for thesis supervisors
Due to the impracticality of evaluating over 20 theses within the given timeframe, a new approach was taken to assess the quality of students' thesis writing. Insights from interviews revealed that supervisors of Summer Programme students could provide valuable feedback on students' writing proficiency. An electronic questionnaire was developed to gather supervisors' opinions on whether their supervisees' English language skills affected thesis marks and supervisory workload. Respondents who noted a negative effect were prompted to elaborate, while an open-ended question allowed for additional comments. The questionnaire was tested with a former Course Director from the CPLT, leading to minor adjustments in its wording.
In November 2008, following the final thesis submission, questionnaires were electronically distributed to 22 supervisors of Summer Programme students, as identified by the Course Directors. An oversight led to the exclusion of the supervisors of the two students who opted out of the exam script study. Each supervisor received a personalized email identifying the specific student they supervised, although the questionnaires were otherwise identical. To encourage participation, reminders were sent to non-responders one and two weeks after the initial invitation.
Academic progress
In January 2009, coinciding with the notification of degree outcomes, the research team accessed records for Summer Programme students, which included crucial details on thesis marking outcomes, essential for the focus of the present study on thesis writing at Cranfield.
The university does not differentiate grading for Master's programmes, resulting in three potential outcomes: a straight pass, a pass with minor corrections, and a revise and resubmit instruction requiring significant thesis revisions within approximately three months. While minor corrections do not hinder degree attainment, the need to revise and resubmit creates uncertainty until the revised thesis is reassessed. Additionally, questionnaire data offers insights into the challenges Summer Programme students face in improving their language skills.
Findings
Summer Programme students – language assessment at entry
In 2007, 29 students enrolled in the Summer Programme, with 26 intending to pursue taught Master's courses. However, one student withdrew before the Master's began, and another switched to a research programme early in the year. Ultimately, 24 students completed both the Summer Programme and their Master's degrees, contributing to the current research, though the small sample size was deemed insufficient for statistical analysis.
Summer Programme student   School   Language assessment        Entry test score   Test sub-scores
SOE1                       SOE      English test               TOEFL CBT 220      L 20 : St W 22
SOE3                       SOE      English test               TOEFL CBT 243      L 25 : St W 22
SOE4                       SOE      English test               IELTS 6.5          L 7 : R 6.5
SOE6                       SOE      English test               TOEFL IBT 92       R 27 : L 26
SOE7                       SOE      English test               IELTS 6            L 5.5 : R 6.5
SOM1                       SOM      English test               IELTS 6            L 6 : R 6
SOM2                       SOM      English test               IELTS 6
SOM3                       SOM      English test               IELTS 5.5          L 6 : R 5
SOM4*                      SOM      English test               IELTS 6            L 6 : R 6.5 : W 6 : S 6
SOM5                       SOM      English test + Interview   IELTS 6            L 6 : R 6.5
SOM6                       SOM      English test               IELTS 5.5          L 5 : R 6.5
* SOM4 IELTS score 3 months earlier: 5.5
Bold type: low sub-scores
IBT: Internet-based test; CBT: Computer-based test
L: Listening; St W: Structured Writing; R: Reading; E: Essay
Table 6: Summer Programme students’ language assessments at entry
Table 6 presents the details of 24 Summer Programme students involved in the research, including their language assessment methods, entry test scores, and sub-scores where applicable. The findings align with previous pilot and interview studies, highlighting variations in admission practices among Schools for borderline non-native English speakers (NNES). Specifically, SAS assesses students solely through interviews, SOM relies on English test scores, while SOE employs both assessment methods but favors formal English tests. Additionally, challenges were encountered in retrieving the records of Student SOE5, who transferred to another MSc program within the same School.
Students SOE3, SOE4, and SOE6 meet the university's English entry requirements; however, SOE3 and SOE6 have not achieved the necessary standards for their MSc programs. SOE4 submitted two IELTS scores nearly a year apart, with the first score being 6 and the subsequent score improving to 6.5.
Student SOE1 achieved an IELTS score between 6 and 6.5, while the other students fell below the minimum entry requirement, with all but one scoring at least one sub-score of 5.5 or lower, indicating a range between modest and competent user levels in the IELTS band descriptors. Notably, student SOM4 took two IELTS tests three months apart, showing a one-band improvement in the later result, as detailed in the table.
The majority of Summer Programme students attended the Summer Programme for four weeks, but four students (SAS2, SAS10, SOM2 and SOM6) attended for eight weeks.
Summer Programme students – pre-test IELTS
In September, all Summer Programme students took a pre-test IELTS exam just before the conclusion of the program and the start of their Master's course. It's important to note that the pre-test differs from the official IELTS exam, making direct score comparisons unreliable. The marking system also varies, as only the Writing skill receives a band score akin to IELTS, while Listening and Reading are reported as raw scores, with 49 total marks for Listening and 45 for Reading. Table 7 presents the pre-test IELTS results alongside the students' IELTS entry scores where available.
The Summer Programme students' performance is notably below pre-entry test scores and university entry requirements, with Writing band scores averaging between 5 and 5.5. Among the students with pre-entry IELTS scores of 6 or higher, only one achieved a score of 6 in Writing, while the others scored between 5.5 and 4. Extremes in scores are evident, with two students from SAS (SAS1 and SAS4) and one from SOE (SOE6) scoring 6, while two students from SAS (SAS2 and SAS11) and one from SOM (SOM3) scored between 4 and 4.5. Listening and Reading raw scores exhibit a similar distribution pattern.
The analysis reveals that students scored 27 out of 49 (55%) in Listening and 26 out of 45 (58%) in Reading. Notably, the three students who achieved a score of 6 in Writing also excelled in Listening and Reading when their scores were combined. Conversely, among the three lowest Writing scorers, two had similarly low combined Listening and Reading scores (36 and 38), while the third scored higher at 47. Additionally, three students with Writing scores of 5.5 had combined Listening and Reading scores below 40. Despite the caution warranted by the pre-test conditions, these findings suggest that the students' English language skills remain at a borderline level.
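The raw-score percentages quoted above can be reproduced with a short calculation. This is an illustrative sketch only: it assumes the percentages are simply each raw score expressed against the maximum marks stated earlier (49 for Listening, 45 for Reading), and the function name is not taken from the study.

```python
# Illustrative sketch: expressing pre-test raw scores as percentages of the
# maximum marks stated in the text (Listening 49, Reading 45).

MAX_MARKS = {"Listening": 49, "Reading": 45}

def as_percentage(skill: str, raw_score: int) -> int:
    """Return a raw sub-score as a whole-number percentage of the maximum."""
    return round(100 * raw_score / MAX_MARKS[skill])

print(as_percentage("Listening", 27))  # 55
print(as_percentage("Reading", 26))    # 58
```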
Summer Programme student   Language assessment        Entry test score   Pre-test IELTS scores (09.07): W  L  R  Total
SOE1                       English test               TOEFL CBT
SOE3                       English test               TOEFL CBT
SOE6                       English test               TOEFL IBT 92       6    37  37  74
SOM6                       English test + Interview   IELTS 5.5          5.5  24  21  45
Table 7: Summer Programme students’ pre-test IELTS and entry scores
Summer Programme students – reports
Upon completion of the Summer Programme, academic English staff produce individual reports for each student, grading them on an alphabetic scale from A to C across four key components: academic reading, speaking, writing, and listening comprehension with note-taking. These reports provide detailed feedback on each student's progress and attendance, together with tailored advice for future study, including any necessity for ongoing English language support; typical recommendations include attending weekly academic writing tutorials and avoiding excessive time spent with peers who share the same first language. The reports are forwarded to the respective Course Director for further guidance. Table 8 compares students' Summer Programme report grades with their entry test scores and pre-test IELTS scores.
Entry test score   Pre-test IELTS scores   Summer Programme reports
Rec: Recommendation for continuing language support
W: Writing; L: Listening; R: Reading; LN: Listening and note-taking; S: Speaking
Table 8: Summer Programme students’ entry and pre-test IELTS scores and Report grades
There is a notable correlation between pre-test IELTS scores and Summer Programme reports. However, students SOM4, SAS9, and SAS10 exhibit pre-test IELTS scores that are less favorable than their report grades, with SOM4 showing the most significant discrepancy. Overall, students with lower pre-test scores tend to struggle in their assessments.
Students with low Writing or combined Listening and Reading scores were advised to attend weekly English language tutorials. Student SAS2 was referred to the Disability Learning Support Officer due to writing difficulties. Additionally, students SAS4 and SAS7, who were entering courses with many peers sharing their first language, were encouraged to take advantage of opportunities for English language practice.
Exam scripts of Summer Programme students
A review of exam scripts revealed that most examinations required students to produce extensive written responses, with the exception of one course in the School of Education (SOE), which featured minimal textual content. In contrast, other SOE exam scripts predominantly included lengthy passages. Interestingly, exam scripts from a quantitatively focused course in the School of Management (SOM) contained significantly less text than those from SOE.
Examiner comments regarding written language were frequently noted, with three students (SOM1, SOM2, and SAS3) receiving critical feedback highlighting issues such as superficiality, inaccurate English usage, and potential language barriers. Specific remarks included concerns about weak language, lack of explanation, and poor writing style, indicating significant problems with written English. Two other students, SOM6 and SAS9, were also critiqued for poor style in their writing.
One examiner's comment, "dumping not answering", highlighted a lack of detailed responses, with requests for further explanation beyond simple bulleted lists. Two students gave over-long answers to questions, while one student (SOM5) earned commendation for presentation. Some statements in other students' scripts suggested potential language issues; however, these could not be definitively attributed to language difficulties. Written feedback was most commonly found on the scripts of SOM students.
Exam scripts were not initially expected to reveal significant language difficulties, as examiners are not required to assess English language proficiency specifically. Additionally, the high-pressure environment of exams may lead to a more lenient evaluation of language quality. Variability in how Course Directors prioritize English in assignments suggests similar inconsistencies among exam markers. After rigorous evaluation to eliminate ambiguous comments, the findings presented are likely to underrepresent the impact of language issues in exam scripts, and can therefore be regarded as conservative evidence. Results are detailed in Table 9.
Summer Programme students – workload of thesis supervisors
In a recent survey, 15 out of 22 supervisors participated, yielding a response rate of 68%. Their experience in supervising Master's theses varied significantly, spanning from two to 20 years, with five supervisors having over 15 years of experience. While the number of theses supervised per academic session ranged from one to 16, 13 supervisors were responsible for between four and seven theses.
Only half of the respondents addressed the relationship between students' English language proficiency and thesis marks, indicating a lack of clarity in the question. Among supervisors, 20% believed that language proficiency positively affected thesis marks, while 33.3% disagreed. Notably, none of the supervisors indicated that language proficiency had an adverse effect, but seven provided critical comments regarding the writing skills of non-native English speakers: two comments were general critiques, while five were specific to individual supervisees.
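The survey figures above follow directly from the raw counts. A minimal sketch, assuming the response rate is taken over the 22 invited supervisors and the opinion splits over the 15 respondents (so 3 and 5 supervisors respectively); the helper name is illustrative only.

```python
# Minimal sketch of the questionnaire arithmetic quoted above.
# Assumed bases: 22 supervisors invited, 15 responded.

def pct(count: int, base: int) -> float:
    """Express a count as a percentage of its base, to one decimal place."""
    return round(100 * count / base, 1)

print(pct(15, 22))  # response rate: 68.2, reported as 68%
print(pct(3, 15))   # saw a positive effect on thesis marks: 20.0
print(pct(5, 15))   # disagreed: 33.3
```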
Summer Programme student   Entry test score   Exam script            Effect upon supervisor workload   Thesis outcome
SAS1                                          Unavailable            None                              Revise and resubmit
SAS2                                          Partial access         Adverse                           PG Diploma subject to conditions
SAS3                                          Critical of language   Adverse                           Revise and resubmit
                                              Partial access         Adverse                           Revise and resubmit
SAS6                                          Unavailable            Unknown                           Revise and resubmit
SAS8                                          None                   Unknown                           Revise and resubmit
SAS9                                          Critical of style      Adverse                           Revise and resubmit
SOE3                       TOEFL CBT 243      None                   None                              Revise and resubmit
SOE4                       IELTS 6.5          Unavailable            Unknown                           Minor corrections
SOE6                       TOEFL IBT 92       None                   Adverse                           Revise and resubmit
SOE7                       IELTS 6            Unavailable            Unknown                           Revise and resubmit
SOM1                       IELTS 6            Critical of language   Unknown                           Revise and resubmit
SOM2                       IELTS 6            Critical of language   Equivocal                         Revise and resubmit
SOM3                       IELTS 5.5          None                   Unknown                           Revise and resubmit
SOM4                       IELTS 6            Critical of language   Adverse                           Revise and resubmit
SOM5                       IELTS 6            Positive on style      Adverse                           Minor corrections
SOM6                       IELTS 5.5          Critical of style      Unknown                           Revise and resubmit
Table 9: Summer Programme students’ entry scores compared with exam script comments, supervisors’ reported workloads and thesis outcomes
Eighty percent of supervisors reported that their students' English language proficiency negatively affected their supervisory workload, while twenty percent noted no effect. One supervisor indicated that, despite no increase in workload, he had to read several drafts, focusing more on language issues than content. Twelve respondents provided detailed comments about their supervisory experiences, highlighting the extra time required for reviewing multiple drafts and the need for additional feedback with each iteration. These comments underscored that the challenge lies not only in the volume of drafts but also in the significant qualitative shift in the reading process, which necessitates interpreting poorly written work.
In the process of refining their work, supervisors highlighted the need for multiple revisions to achieve clarity of thought and expression. Specific challenges related to data presentation were noted by two supervisors, emphasizing the importance of clear communication. One supervisor dedicated significant time to early drafts and recommended that their supervisee engage a proofreader to improve the final version. The supervisors' responses are detailed in Table 9.
Appendix 4 presents 12 responses to the final open question, highlighting the significance of individual supervisory relationships while also addressing broader concerns regarding the impact of internationalisation on higher education. The findings emphasize the crucial role of supervisors and the compromises required to meet disciplinary and language standards in thesis production. Notably, the English language proficiency of half of the Summer Programme students adversely affected their supervisors' workload.
Summer Programme students – thesis outcomes
As of January 2009, Table 9 highlights the outcomes of the Examination Boards for Summer Programme students regarding degree awards and thesis status. Seven students successfully received their Masters qualifications, while 14 are required to revise and resubmit their theses for further assessment. Additionally, one student, SAS2, was awarded a PG Diploma instead of a Masters degree, contingent upon passing an additional examination. These results reinforce the pilot study's findings that Summer Programme students face a higher risk of academic failure.
Tables 10, 11 and 12 collate the results of the various studies employed to assess the progress of the Summer Programme students for individual Schools.
In the Summer Programme for SOE students, four out of seven successfully earned their degrees without needing to revise their theses. However, language challenges were evident, as illustrated by the case of SOE5, whose thesis was approved but led to an increased workload for the supervisor due to the student's language difficulties.
Despite successful academic outcomes, the absence of responses from supervisors of three students raises concerns about potential language issues. Among those required to edit their theses, one student shows no English language difficulties, indicating that language deficiencies are not the sole reason for lack of success. However, supervisory feedback for SOE5 highlights that English proficiency negatively affected this student's academic performance. Overall, the group exhibits a high revise-and-resubmit rate of 43%, with two students identified as having language problems, one without, and uncertainties regarding the remaining students despite their academic achievements.
SP St   Entry test score   PI: W L R T     SP Rep               Exam Scr   Supervisor workload   Thesis outcome
SOE3    TOEFL CBT 243      5.5  36 32 68   B+/A- A- A-/B+ B+    None       None                  Revise and resubmit
SOE6    TOEFL IBT 92       6    37 37 74   B B+ A- B+           None       Adverse               Revise and resubmit
SOE7    IELTS 6            5.5  24 23 47   B+ C B B+                                             Revise and resubmit
SP St: Summer Programme student
PI: Pre-test IELTS; W: Writing; L: Listening; R: Reading; T: Total
SP Rep: Summer Programme Reports; LN: Listening and note-taking; S: Speaking
Exam Scr: Exam Scripts; None: no comments about language
Table 10: SOE Summer Programme students’ entry scores and measures of progress
The SOM results indicate that 83% of students required revisions, with only one of six graduating without significant thesis revisions. Among the students, three faced language challenges during supervision, and critical feedback regarding language and style was noted in four exam scripts, suggesting that five students struggled with written communication. Additionally, the supervisor of the fifth student, SOM3, did not complete the questionnaire, leaving uncertainties regarding this student's performance despite low scores and grades.
SP St   Entry test score   PI: W L R T     SP Rep           Exam Scr               Supervisor workload   Thesis outcome
SOM1    IELTS 6            5.5  34 33 67   B B B+ B         Critical of language                         Revise and resubmit
SOM2    IELTS 6            5    36 27 63   B+ A- A- B+      Critical of language   Equivocal             Revise and resubmit
SOM3    IELTS 5.5          4    29 18 47   C-/C C C C-      None                                         Revise and resubmit
SOM4    IELTS 6            5.5  21 18 39   A- B+ A- B       Critical of language   Adverse               Revise and resubmit
SOM5    IELTS 6            5.5  27 25 52   B B+ B+ B/B+     Positive on style      Adverse               Minor corrections
SOM6    IELTS 5.5          5.5  24 21 45   B B B B          Critical of style                            Revise and resubmit
SP St: Summer Programme student
PI: Pre-test IELTS; W: Writing; L: Listening; R: Reading; T: Total
SP Rep: Summer Programme Reports; LN: Listening and note-taking; S: Speaking
Exam Scr: Exam Scripts; None: no comments about language
Table 11: SOM Summer Programme students’ entry scores and measures of progress
Student SAS2 received a PG Diploma instead of a Master's degree, contingent upon retaking an exam. Among the remaining students, 54% (six students) need to revise their work, with four having documented language issues noted by their supervisors. In contrast, SAS1's supervisor indicates no language difficulties, aligning with the student's pre-test IELTS scores and Summer Programme grades, suggesting that the need for significant thesis revisions is academically motivated. More concerning are three students (SAS5, SAS10 and SAS11) who passed their theses but exhibit writing problems. Overall, seven students have been identified with language issues negatively affecting their academic performance, while two students show satisfactory writing skills, and two are revising their theses without supervisor feedback.
SP St   PI: W L R T     SP Rep          Exam Scr               Supervisor workload   Thesis outcome
SAS1    6    33 42 75   A A- A- B+                             None                  Revise and resubmit
SAS3    5.5  8  24 32   B- C+ B- C+     Critical of language   Adverse               Revise and resubmit
SAS4    6    37 35 72   A- B+ B+ B+                            Adverse               Revise and resubmit
SAS8    5.5  34 34 68   B+ B+ B B+      None                                         Revise and resubmit
SAS9    5    21 19 40   B B B+ A-       Critical of style      Adverse               Revise and resubmit
SP St: Summer Programme student
PI: Pre-test IELTS; W: Writing; L: Listening; R: Reading; T: Total
SP Rep: Summer Programme Reports; LN: Listening and note-taking; S: Speaking
Exam Scr: Exam Scripts; None: no comments about language
Table 12: SAS Summer Programme students’ entry scores and measures of progress
The findings highlight the challenges researchers encounter when isolating the impact of English language skills on academic progress. While insights from supervisors offer valuable triangulation, the incomplete data and the complexities of correlating various assessment methods impede definitive conclusions regarding the role of linguistic competence in students' academic performance. Nonetheless, it can reasonably be inferred that, collectively, the Summer Programme students maintained a borderline level of English language proficiency throughout their year of study.
The discussion is framed by the three aims of the research study with references to the literature included where relevant.
Course Directors' admissions practices and experiences
The study revealed significant variations in selection practices across the institution, influenced by School affiliation and historical organizational structures linked to the presence of two campuses. Selection procedures differed notably in their reliance on English test scores, with the School of Management (SOM) emphasizing English testing, while the School of Arts and Sciences (SAS) primarily relied on assessments conducted by linguist staff through interviews. The School of Education (SOE) exhibited a blend of both approaches, utilizing formal English tests alongside evaluations by linguist staff for Erasmus Programme participants. Although these differences were rooted in School affiliation, Course Directors, particularly in SOE, demonstrated autonomy in selection practices, leading to diverse methods across Masters programs. SOM's strong collective culture is evident in its supplementary regulations for admitting non-native English-speaking students, allowing Course Directors to modify procedures only by adding measures that enhance, rather than replace, existing requirements. The implications of these varied practices among Schools and programs will be explored further in relation to the research's third aim.
The study revealed that decision-making in selection rationales, as described by Banerjee (1995), involves complex criteria that often compete with one another, independent of School affiliation or program. Notably, insights from applicant interviews highlight how Course Directors assess both language and academic skills, allowing for a comprehensive evaluation of an applicant's disciplinary knowledge through their English proficiency. This finding contradicts the literature suggesting that English language ability is considered in isolation, as evidenced by the School of Management (SOM), which, despite emphasizing English entry test scores, opts for a diverse range of assessments rather than relying solely on one test.
The findings may stem from varying perspectives among academics and administrators, with academics favoring complex judgments and administrators prioritizing transparency and quality assurance. Postgraduate applicants are diverse not only in nationality and language but also in age, educational background, academic qualifications, and career experience. These factors influence a Course Director's decision on admissions, yet they remain unrecorded. Consequently, applicants often meet some criteria while falling short on others, a nuance overlooked in existing literature and policy discussions. Unlike typical UK undergraduate applicants, who share similar ages and qualifications, postgraduate candidates present unique challenges that necessitate different selection procedures.
High demand for course places significantly influences applicant compliance with entry requirements, reducing the need to evaluate borderline candidates with diverse abilities and experiences. Data indicates that Course Directors of popular programs are more inclined to raise English entry test score thresholds, ensuring all students meet the criteria. While language selection becomes more straightforward in these scenarios, navigating other selection criteria may present greater challenges.
Course Directors displayed a lack of consensus regarding the predictive value of entry test scores for future academic success. While some expressed skepticism about relying on a single test score, many shared anecdotes of students whose English entry test scores bore little relation to their actual academic performance, regardless of whether those scores were high or low.
Directors often relied on their own experiences and those of their colleagues, particularly concerning students from specific countries or institutions, which significantly influenced their selection decisions. These decisions were based on a comprehensive assessment of various criteria, rather than evaluating each factor in isolation. Overall, the behavior of Course Directors was consistent with that of the Admissions Tutors in Banerjee's (1995) study and with the inconclusive results of studies of the relationship between English tests and academic outcomes.
The knowledge of English tests and their structure among Course Directors varied, with only one showing interest in learning more about the available English language proficiency tests. A small number of respondents viewed further development activities as unnecessary. All Course Directors were aware of the institutional and School requirements for non-native English speaking (NNES) applicants. Although Course Directors in the SAS had less detailed knowledge of the tests, this was deemed redundant as linguist staff were responsible for assessing English language competence. These findings align with previous studies by Coleman et al (2003) and Rea-Dickins et al (2007).
In the current study, Course Directors believed their knowledge was adequate for making selection decisions, challenging the assumption that increased awareness of English tests would enhance these judgments. Their understanding of English tests aligned with the balanced judgment model of decision-making, indicating that while a single test result plays a role, it is not the sole factor in determining whether to extend an offer to an applicant.
The perception of non-native English speaking (NNES) students' writing abilities is often negative, as test scores do not guarantee satisfactory academic writing, especially for extended texts like theses. This raises concerns about how to effectively assess writing skills, with no clear solutions available. While selection interviews can provide insights into oral skills, they do not address writing abilities. For instance, a student from the Summer Programme, whose English proficiency was evaluated through an interview, later raised suspicions of a learning disability, highlighting the risks of current assessment practices. Additionally, uncertainty about the authenticity of submitted application forms and papers complicates the evaluation of writing skills. Course Directors, anxious about writing assessments, have attempted to engage students in electronic interactions and spontaneous writing. Ultimately, writing assessment often relies on implicit knowledge, with the applicant's previous educational background and institution serving as factors in selection decisions.
Course Directors have made significant modifications to their programs to identify at-risk students early and enhance writing skills. While some initiatives, such as introducing a portfolio with an initial literature review, have successfully improved writing abilities, results vary across courses. In one instance, limited writing opportunities before the thesis may have contributed to the production of poorly written theses.
Concerns about writing skills prompt a debate on whether postgraduate entry requirements should be elevated. Course Directors generally believe that the rigorous nature of Master's programs leaves little room for students to engage in non-academic pursuits. Evidence from the Summer Programme students supports this notion. Although some UK higher education institutions mandate higher English test scores for postgraduate studies compared to undergraduate programs, this practice is not universally adopted. The experience of Course Director SOE1, who increased his course's entry requirements, has not yielded positive outcomes.
A consensus emerged that students possess adequate oral skills for Master's study, although further research in diverse contexts is needed to validate this finding. Factors such as student diversity, the rural campus location, and the emphasis on group work significantly contribute to the enhancement of oral skills. In environments where these elements are less pronounced, results may vary. Additionally, all interviewees emphasized the importance of immersion in the English language and UK higher education culture, which, while often beyond a Course Director's control, should be considered in student selection processes.
The relationship between Summer Programme students' entry assessments and subsequent linguistic and academic progress
The diverse selection practices at the institution generated valuable data for analysis in the pilot and interview studies; however, they complicated efforts to compare students' entry test scores with their later academic performance due to the small number of participants. Moreover, excluding non-native English speakers (NNES) without formal English entry test scores would have led to the omission of several MSc programs from the study, significantly narrowing its scope.
The alternative option focused on the progress of Summer Programme students identified as borderline in English language skills. Although final degree results are not yet available, the overall progress indicates that nine out of 24 students have successfully earned their Master's degrees, while 14 are revising their theses and one student may obtain a PG Diploma after passing an exam. Typically, about 10% of Summer Programme students do not achieve MSc degrees, but if all students pass their resubmitted theses, the overall pass rate will exceed expectations.
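As a rough consistency check, the reported cohort outcomes can be tallied in a few lines. This is a sketch only: the counts are those reported above, and the 90% figure stands in for the "about 10% do not achieve MSc degrees" historical rate.

```python
# Progress of the 24 Summer Programme students, as reported:
# 9 awarded the MSc, 14 revising theses, 1 possible PG Diploma.
awarded, revising, pg_diploma = 9, 14, 1
total = awarded + revising + pg_diploma
assert total == 24  # all students accounted for

# Historically, about 10% of Summer Programme students do not
# achieve the MSc, i.e. a typical pass rate of roughly 90%.
typical_pass_rate = 0.90

# Best case: every resubmitted thesis passes; the PG Diploma
# candidate would still not receive an MSc.
best_case_pass_rate = (awarded + revising) / total
print(f"{best_case_pass_rate:.0%}")  # prints "96%"
```

On these figures the best-case pass rate (23 of 24) would indeed exceed the typical 90%.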
The findings of this study align with Banerjee's (2005) research on students deemed 'at risk' due to English language proficiency, revealing that eight such students in the Lancaster study faced persistent language challenges and had lower success rates compared to other non-native English speakers. This data supports the Admissions Tutors' initial assessments, demonstrating their effective diagnostic skills in identifying students likely to struggle in Master's programs. Similarly, the current study highlights that Summer Programme students mirror the 'at risk' group, as they too encounter ongoing language difficulties, particularly in writing. This indicates that Cranfield staff are proficient in recognizing students vulnerable to language-related issues during the selection process.
Banerjee's research highlighted that students in the 'at risk' group experienced challenges related to additional study time and effort. While the current study did not focus on the student experience, it revealed increased supervisory workloads for teaching staff. Among the Summer Programme students, 11 faced writing difficulties that negatively affected their thesis supervision, while only three students showed no issues with written language. This study thus supports and expands upon Banerjee's findings by identifying the additional costs from the perspective of teaching staff.
The Summer Programme students demonstrate a borderline status in both academic and linguistic progress. While the findings highlight their difficulties, it remains uncertain how much linguistic issues specifically affect individual academic success within this group.
The consequences of different admission criteria and practices upon postgraduate students' academic progress
This study aimed to compare the impact of various admission criteria and practices on students' academic progress across different courses. While program-level differences could not be assessed due to small sample sizes, significant disparities were observed at the School level. The analysis revealed that students from the School of Management (SOM) performed worse academically, with 83% of SOM students required to revise their theses, compared to 63% of students from the School of Arts and Sciences (SAS) and 42% from the School of Education (SOE). Specifically, out of six SOM students, five faced revisions, while among eleven SAS students, one did not achieve a Master's degree and six are revising their work. In contrast, three out of seven SOE students are also revising their theses.
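The School-level percentages above follow directly from the raw counts. A minimal sketch (counts as reported; the SAS figure assumes the student without a Masters is counted alongside those revising, which is what reproduces the reported 63%):

```python
# Students needing thesis revision (or not achieving a Masters),
# per School, as reported: SOM 5 of 6; SAS 7 of 11 (6 revising
# plus 1 without a Masters); SOE 3 of 7.
cohorts = {"SOM": (5, 6), "SAS": (7, 11), "SOE": (3, 7)}

def affected_rate(affected: int, total: int) -> int:
    """Percentage of affected students, truncated to a whole number."""
    return (100 * affected) // total

rates = {school: affected_rate(a, t) for school, (a, t) in cohorts.items()}
print(rates)  # prints {'SOM': 83, 'SAS': 63, 'SOE': 42}
```

Note that truncating rather than rounding reproduces the reported 63% and 42% figures exactly.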
Academic challenges were concentrated among students admitted with borderline English skills, particularly those with lower entry test scores such as IELTS 5.5 and TOEFL IBT 64, found primarily in the School of Management (SOM). In contrast, students in the School of Education (SOE) exhibited higher entry scores, meeting institutional requirements such as IELTS 6.5 and TOEFL IBT 92. This discrepancy may be attributed to varying English proficiency levels at entry and to disciplinary differences, as fields like Management studies emphasize written communication and impose stricter assessment standards. The prevalence of language-related feedback on SOM students' exam scripts further supports this notion. However, the situation is complicated by the interdisciplinary nature of many courses at Cranfield, which blend management with scientific and technological content, as seen in the SAS and SOE programs.
The limited sample size necessitates caution in drawing definitive conclusions about the relationship between School admission criteria and academic outcomes. To better understand the diverse factors influencing the academic progress of borderline non-native English speaking (NNES) students, further longitudinal research is essential.
This study investigates the admission process for non-native English speaking (NNES) students into taught postgraduate Master's programs at a specific UK higher education institution (HEI). It also tracks the progress of a cohort of NNES students whose English language proficiency was deemed borderline at the time of entry. Conducted amid growing public concern regarding the impact of increasing internationalization on academic literacy and degree standards in the UK, the research highlights critical issues surrounding language proficiency and academic success.
Limited literature suggests that UK academic admissions staff use complex selection criteria, in which English test scores are important but not the sole determining factor in evaluating applicants for Masters programs. Various factors influence judgments about an applicant's potential for success, with selection practices differing across Schools. Some Schools prioritize English test scores, while others rely on assessments from linguist specialists. No single assessment method proved more effective in identifying students at risk due to borderline English proficiency, although this finding needs validation through larger studies. Additionally, high demand for specific programs may lead to increased test score requirements, potentially restricting admissions for non-native English speakers with marginal language skills.
Academic staff expressed significant concerns regarding writing standards, particularly in relation to the Masters thesis that students complete at the end of their programs. Students with borderline English language skills exhibited ongoing writing challenges, with over half needing to revise their theses. This situation adversely impacted the workloads of their thesis supervisors, a trend also noted among supervisors whose students passed on first submission. These findings suggest that writing issues may affect a broader range of students beyond just those with borderline skills, raising questions about the effectiveness of current methods for assessing the writing abilities of non-native English speaking (NNES) students at entry.
Current research is limited by the absence of evidence from students whose entry test scores meet but do not exceed the required thresholds. Future studies should investigate the writing abilities of all non-native English speaking (NNES) students in taught Master's programs to better understand the extent of the issue. Additionally, the case study approach used in this research cautions against generalizing the findings to other higher education institutions (HEIs).
A previous UK study reveals similarities that suggest opportunities for comparative case study research in diverse settings, including urban areas, undergraduate institutions, and environments with varying national group compositions, to further test and enhance the findings.
We wish to thank the British Council, IELTS Australia and the University of Cambridge for granting us the opportunity to carry out this research.
We extend our heartfelt gratitude to the Course Directors, lecturers, and Registry staff at Cranfield University for their invaluable contributions to the study, with special thanks to Chris Binch for his assistance in reviewing the examination scripts.
Banerjee, LV, 2003, Interpreting and using proficiency test scores, unpublished PhD thesis,
Department of Linguistics and Modern English Language, Lancaster University
British Council, 2008, ‘Why study a UK postgraduate degree?’, accessed 16 December 2008 from www.educationuk.org/pls/hot_bc/page_pls_user_article?x3348511603&y=0&a=0&d@42
British Council, 2009, ‘PM takes up the initiative on international education’, accessed 14 January
2009 from www.britishcouncil.org/eumd-news-pmi-ie-launch-takes-up.htm
Brown, L, 2008, ‘The incidence of study-related stress in international students in the initial stage of the international sojourn’, Journal of Studies in International Education, vol 12, no 5, pp 5-28
Coghlan, S, 2008, ‘Whistleblower warning on degrees’, accessed 18 November 2008 from news.bbc.co.uk/1/hi/education/7358528.stm
Coley, M, 1999, ‘The English language entry requirements of Australian universities for students of non-English speaking background’, Higher Education Research and Development, vol 18, no 1, pp 7-17
Coleman, D, Starfield, S and Hagan, A, 2003, ‘The attitudes of IELTS stakeholders: student and staff perceptions of IELTS in Australian, UK and Chinese tertiary institutions’, in IELTS Research Reports,
Volume 5, IELTS Australia Pty and British Council, Canberra, pp 160-207
Cotton, F and Conrow, F, 1998, ‘An investigation into the predictive validity of IELTS amongst a group of international students studying at the University of Tasmania’ in IELTS Research Reports,
Volume 1, IELTS Australia Pty Ltd Canberra, pp 72-115
Criper, C and Davies, A, 1988, ELTS validation project report, British Council and University of Cambridge Local Examinations Syndicate, London
Department for Innovation, Universities and Skills, 2008, ‘Prime Minister’s Initiative’, accessed 14 December 2008 from www.dius.gov.uk/international/pmi/index.html
Gillett, A, 2008, ‘Using English for academic purposes A guide for students in higher education’, accessed 18 December 2008 from www.uefap.com/index.htm
Hammersley, M, 2006, ‘Ethnography: problems and prospects’, Ethnography in Education, vol 1, no 1, pp 3-14
Higher Education Statistics Agency, 2008, ‘Students and Qualifiers Data Table’, accessed 14
December 2008 from www.hesa.ac.uk/index.php?option=com_datatables&Itemid1& task=show_category&catdex=3
IELTS, 2007, IELTS Handbook, co-produced by the University of Cambridge ESOL Examinations, the British Council and IELTS Australia Pty Ltd
Ingram, D and Bayliss, A, 2007, ‘IELTS as a predictor of academic language performance, Part 1’ in
IELTS Research Reports, Volume 7, IELTS Australia Pty and British Council, Canberra, 2007, pp
Institute of International Education, 2005a, ‘Global destinations for international students’, accessed 4 December 2008 from www.atlas.iienetwork.org/?pH027
Kerstjen, M and Neary, C, 2000, ‘Predictive validity in the IELTS test’ in IELTS Research Reports,
Volume 3, IELTS Australia Pty and British Council, Canberra, pp 85-108
Lee, Y and Greene, J, 2007, ‘The predictive validity of an ESL placement test’ Journal of Mixed
Methods Research, vol 1, no 4 pp 366-89
Lloyd-Jones, G, Odedra, H and Neame, C, 2007, ‘An exploration of diversity in a UK postgraduate university.’ in Education in a Changing Environment, Doherty, E, ed Informing Science Press,
Maxwell, J, 2004, Qualitative Research Design: An Interactive Approach, Sage Publications,
Morgan, DL, 1997, Focus groups as qualitative research, Sage Publications, Thousand Oaks
O’Loughlin, K, 2008 ‘The use of IELTS for university selection in Australia: a case study.’ in IELTS
Research Reports, Volume 8, IELTS Australia Pty and British Council, Canberra, pp 145-241
Rea-Dickins, P, Kiely, R and Yu, G, 2007, ‘Student identity, learning and progression (SILP): the affective and academic impact of IELTS on ‘successful’ candidates’ in IELTS Research Reports,
Volume 7, IELTS Australia Pty and British Council, Canberra, pp 59-136
Select Committee on Innovation, Universities, Science and Skills, 2008, Letter from Peter Williams,
Chief Executive, Quality Assurance Agency to the Chairman of the Committee, 30.10.08 accessed
16 December 2008 from www.publications.parliament.uk/pa/cm200708/cmselect/ cmdius/905/8071710.htm
Sastry, T, 2004, Postgraduate education in the United Kingdom, Higher Education Policy Institute, London
Ward Schofield, J, 2000, ‘Increasing the generalizability of qualitative research’ in Case Study
Method, Gomm, R, Hammersley, M and Foster, P, eds Sage Publications, London, pp 69-97
Yin, RK, 2003, Case study research: design and methods, Sage Publications, London
Interview schedule for course directors
Nature and reasons for research project
Can you tell me something about the course you direct?
How long has it been offered?
Are there similar courses around the UK?
How many students do you admit?
What is the employment rate of your graduates?
What type of employment do they enter?
In which countries do they work?
What is your experience of a) teaching on the MSc programme and b) working as a Course Director?
In relation to English language proficiency, how do you select students on your course?
What factors do you consider when assessing an applicant’s English language ability?
What evidence do you use in assessing an applicant’s English language ability?
What features of a student’s CV and background would prompt you to recommend the student takes the Summer Programme?
What part does English language testing (IELTS/TOEFL etc) play in selection and admission to your course?
Do you consider different language skills separately eg listening, speaking, reading and writing?
Do you admit students under the European Partnership Programme?
What are your views on current admission practices in relation to English language proficiency? How could current selection and admission practices be improved?
What do you think are the educational consequences of mixed English language ability classes for a) non-native English speaking students? b) native English speaking students? c) teaching staff?
Can you describe the programme structure and its assessment?
Is English language assessed on the course? If so, how?
Have you observed problems in group projects related to mixed English language ability?
Sometimes student cohorts may include a large group of students who share a language other than English. Have you observed any educational consequences of this?
Native English speaking students may adopt specific roles in group projects such as proof reading and editing What are your views on this?
Have you ever taught a non-native English speaking student who you thought might be dyslexic? Have you recommended students for academic English support during the course?
What are your views on the web publication of Masters theses?
What are your views on the current ongoing provision of academic English support?
How could ongoing academic English provision be improved?
Would training for teaching staff in admissions and selection be helpful? If yes, what form should it take?
Focus group guide
Nature and reasons for research project
Tell me about your experience of postgraduate teaching (Cranfield and elsewhere)? (particularly, any experience of teaching non-native English speakers)
Tell me about the new roles you are undertaking within the department?
What are you looking for in selecting students for a Masters course?
What evidence will you look for?
How will you assess English language ability?
Does the international mix of students at Cranfield affect learning and teaching?
More specifically: a) NNES students b) NES students c) teaching staff d) management of group projects
What do you know about existing English language support for students?
What are your views on the form English support for students should take?
Is there a place for training staff in the selection of non-native English speaking students?
Questionnaire for supervisors of masters theses of Summer Programme students
Thank you for participating in this survey, which is part of a research study funded by IELTS/British Council that investigates how students' English language proficiency impacts their progress in specific Master's programs at Cranfield University.
The survey aims to discover whether deficiencies in students' English language proficiency are reflected in thesis marking and supervisory workload
The survey should take about 5 minutes to complete. Please answer the questions with reference to the supervision of the student named in the accompanying email.
Once you press the submit button you will not be able to edit your responses. All data will be treated confidentially and anonymised in future publication.
1 I am a thesis supervisor on the
Title of the Masters programme
2 Please enter the number of years you have been supervising MSc theses on this programme
3 Please enter the number of MSc theses you supervised in 2007/8
4 My student's English language proficiency has contributed to the thesis mark
Adversely / not at all / beneficially
5 My student's English language proficiency affected my supervisory workload
6 Please provide any further information that you feel may be relevant to the topic
Thank you for responding to the survey
APPENDIX 4: SUPERVISORS’ RESPONSES TO QUESTION 6 OF THE
QUESTIONNAIRE FOR SUPERVISORS OF MASTERS THESES
Students are reminded that their thesis must be grammatically correct and are encouraged to engage with their supervisor early in the process. It is the student's responsibility to utilize the supervisor's services, recognizing that each student's needs may vary.
SAS Supervisor: English language proficiency varies considerably among my students – as noted above for some European students it has been a problem
SAS Supervisor: As I understand the role of supervisor, we do not correct English. So unless they are minor mistakes, I suggest that they get a proof reader.
SOE Supervisor: The technical writing skills necessary for the course are not being adequately met by most overseas students, both from the EU and beyond. This gap in English proficiency needs to be addressed before these students enrol in the programme.
SAS Supervisor: Her English was generally good
SOM Supervisor: I would appreciate any "great solutions" but I guess we just have to live with this only downside of our truly international, multicultural groups of students
SOM Supervisor: I emphasize the importance of writing the introductory chapter early in the process. I critically assess not only the content but also the quality of the English used in this chapter and, based on my evaluation, advise students on whether they should consider hiring a professional proofreader. It's essential for them to understand that I do not serve as their proofreader.
Despite the availability of support, some guidance was overlooked, resulting in unnecessary changes to previously corrected sections This led to further revisions and, ultimately, the inclusion of sub-standard elements in the final submission.
SAS Supervisor: My focus is on the written thesis document, which requires significant time to address English language issues. This often involves multiple rounds of revision, especially when new content is introduced that subsequently requires further corrections.
SAS Supervisor: I emphasize the importance of non-native English speakers striving for fluency in their writing. This fluency is essential for distinguishing between individuals who possess a solid understanding of scientific concepts but struggle with English expression, and those who may not fully grasp the material.
SOM Supervisor: Because I spent a lot of time draft-reading the work, it meant that I spent relatively less time on the evaluation of the actual content of the work
SOE Supervisor: The standard of English from China / Taiwan has improved in recent years although individual students will have variable skills.