Built Environment
Architecture; Building; Planning and Landscape Design
National Student Survey Discipline Report
May 2012

NSS Discipline Report – Built Environment

Contents

Foreword
How to use this report
2 Architecture
   2.1 Comparison with all subjects combined
   2.2 Comparison with STEM combined
   2.3 Relationships between aspects of the student experience
   2.4 Impact of aspects of the student experience on overall satisfaction
   2.5 Range of institutional results for overall satisfaction
   2.6 Comparison by nation
   2.7 Comparison by institution type
   2.8 Comparison by full-time/part-time
   2.9 Comparison by gender
   2.10 Comparison by domicile
   2.11 Comparison with selected items from the Postgraduate Taught Experience Survey
3 Building
   3.1 Comparison with all subjects combined
   3.2 Comparison with STEM combined
   3.3 Relationships between aspects of the student experience
   3.4 Impact of aspects of the student experience on overall satisfaction
   3.5 Range of institutional results for overall satisfaction
   3.6 Comparison by nation
   3.7 Comparison by institution type
   3.8 Comparison by full-time/part-time
   3.9 Comparison by gender
   3.10 Comparison by domicile
   3.11 Comparison with selected items from the Postgraduate Taught Experience Survey
4 Planning and Landscape Design
   4.1 Comparison with all subjects combined
   4.2 Comparison with STEM combined
   4.3 Relationships between aspects of the student experience
   4.4 Impact of aspects of the student experience on overall satisfaction
   4.5 Range of institutional results for overall satisfaction
   4.6 Comparison by nation
   4.7 Comparison by institution type
   4.8 Comparison by full-time/part-time
   4.9 Comparison by gender
   4.10 Comparison by domicile
   4.11 Comparison with selected items from the Postgraduate Taught Experience Survey
5 Comparison between cognate subjects
6 Additional HEA resources
   6.1 Research
   6.2 Case studies of enhancement activities
   6.3 Postgraduate surveys
   6.4 Consultancy and change programmes
7 Further reading
Appendix A: Brief description of analyses
Appendix B: Full list of subjects covered in this report
Appendix C: Information about the NSS
Appendix D: NSS items

Foreword

The National Student Survey (NSS) and its role

The National Student Survey (NSS) is an annual survey of all higher education students in their final year of study in England, Wales and Northern Ireland, and for some institutions in Scotland. It was introduced in 2005, and is an important source of information for anyone interested in the quality of an undergraduate degree programme. It is administered by Ipsos MORI on behalf of HEFCE, and aims to "gather feedback on the quality of students' courses in order to contribute to public accountability as well as to help inform the choices of future applicants to higher education".[1]

This report covers the wide-reaching discipline of Built Environment, which the Higher Education Academy (HEA) defines to cover Architecture, Architectural Technology, Construction, Facilities Management, Housing, Landscape Architecture, Spatial Planning, Surveying, Real Estate, Transport and Urban Design. It is one of a series of 28 NSS discipline-based reports initiated, compiled and written by the HEA and its survey team.

This report offers a high-level analysis of the discipline of Built Environment and aims to provide the higher education sector with a better understanding of the experience of this student community. Its general findings can be used as a first step into further qualitative investigation, which can ultimately lead to a genuine enhancement of the quality of students' learning experience. It must be stressed, however, that the NSS scores should be used only as an instigator for further investigation; the true worth of the NSS
is only apparent when the data it provides are used as a benchmark, and/or to compare like disciplines and institutions across the sector.

The analysis given in this report covers student responses for the following JACS codes: Architecture – K100; Building – K200; and Planning and Landscape Design (Landscape Design – K300, Planning – K400, and others in Architecture and Planning – K900). The NSS asks participants to rate their level of agreement with 22 positive statements on a five-point scale (in addition to 'not applicable'). The statements are grouped into six areas plus an overall statement: teaching; assessment and feedback; academic support; organisation and management; learning resources; and personal development. In addition, the survey invites free-text comments from students about particular aspects of their experience.

This report focuses on the quantitative data for the subjects to enable 'like with like' comparisons as far as possible. It is also useful to compare results for departments/faculties with similar students (while at the same time recognising the breadth of subjects included in Built Environment), and this has been approximated by comparing with mission groups or institution types. It is important to note that not all differences will be reliable or statistically significant, and caution must always be taken when interpreting or relying on small differences. These reports are designed to help subject communities to use information about the student experience for the continued enhancement of learning and teaching within the identified areas.

Highlighted features in the Built Environment discipline

It is interesting to note that within all three areas of the Built Environment discipline covered in this report, the overall satisfaction of students is most strongly correlated with their satisfaction on the quality of learning and teaching scale. The results for the Built Environment subjects alone highlight a significant variation in overall satisfaction between students studying the same subject grouping at different HEIs, but this variation is least evident in the Planning and Landscape Design subject grouping.

[1] http://www.hefce.ac.uk/whatwedo/lt/publicinfo/nationalstudentsurvey/ [Accessed 16 May 2012]

The Planning and Landscape Design subject group in this report comprises the highest proportion of female students (48%) compared with the other two Built Environment groupings. Initial results from this report indicate that male students are significantly more satisfied than their female counterparts when studying Architecture, an area that perhaps requires further analysis.

Overall, the results for the Built Environment subjects show lower levels of satisfaction compared with the experience of all other students responding to the NSS. Furthermore, students of Architecture and Building report lower levels of satisfaction with their experience compared with the experience of all other students in the wider subject area of STEM responding to the NSS. The report clearly shows differences between the different cognate degrees, as would be expected, and the diversity between subjects needs further longitudinal exploration. Overall, the survey suggests that strengths and weaknesses coexist across all subject areas and probably account for the 75–81% overall satisfaction with the quality of the course. The strengths are to be commended in the context of the changes experienced across the sector, and the weaker response areas provide the basis for further consideration in relation to the variables and practices of the local context.

Recommendations

This report, as one of a series of 28 discipline reports by the HEA, is most timely in its recognition of the importance of enhancing the student learning experience. This will become increasingly significant post-September 2012, when UK HEIs will feel the full impact of student fee increases, the introduction of Key Information Sets (KIS) and the ongoing challenges of attracting future students in the ever-changing economic environment within which HE operates.

We are pleased to recommend to you this report covering the NSS from 2010-11. The data as presented in this report aim to make it easy to compare any local results with comparable degrees elsewhere and to start to evaluate your own students' experience. However, these comparisons must be viewed as only one piece of the jigsaw in understanding and ultimately improving the student experience. We would stress that results from the NSS and findings from this report for the Built Environment discipline must be included in any quality assurance discussions, and must be assimilated not in isolation but together with other sources of information. These should include formal reviews and assessments such as accreditation visit reports, programme reviews, module evaluations and university-wide reviews, as well as informal reviews through student-staff panels, focus groups and other mechanisms where student views are expressed. Going forward, we would also stress that it is critical for any proposed changes resulting from such an integrated, systematic review to be discussed with students prior to implementation. The best long-term improvements in the quality of the student experience can only come through integrated student-staff initiatives, when everyone is engaged with the process.

Jane Kettle and Aled Williams
Discipline Leads for Built Environment at the Higher Education Academy
May 2012

How to use this report

This report presents data from the 2011 National Student Survey (NSS) for specific subjects, aggregated across all institutions. By providing information about how subjects are reflected nationally in the NSS, the charts and tables are designed to help departments, faculties and institutions to contextualise and understand their own results. This report includes NSS data for the following subjects, as classified
in the Joint Academic Coding System (JACS) (see Appendix B for a more detailed list):

Architecture (K100)
Building (K200)
Planning and Landscape Design (K300, K400, K900)

Note about students studying multiple subjects: unless otherwise indicated, all students studying a subject at 50% FPE (full-person equivalent) or more will have their responses allocated to that subject. Students studying two subjects at 50% FPE each may therefore have their responses allocated twice. In addition, students studying multiple courses, all at less than 50% FPE, will be excluded from the data. These decisions have been taken to ensure that a response is allocated to a subject only when the student has had a significant experience of that subject.

When used with an awareness of its limitations, NSS data can play a useful role in supporting improvements in learning and teaching. By allowing comparisons and benchmarking, the data can highlight areas that would reward further investigation, either as areas of apparent success or of challenge. NSS results can be a useful starting point for discussions about learning and teaching, whether with colleagues, senior managers, student representatives or students themselves. It is also advisable to triangulate the data with quantitative and qualitative information from other sources in order to effectively target, design and evaluate enhancement activities.

This report presents a high-level picture of the discipline through the lens of NSS data, broken down and analysed in a number of different ways. It does not provide a detailed picture of students' learning experiences, nor does it dictate specific areas for intervention. However, it can be used in conjunction with local NSS data to gain an overview of the views of a group of students, which can provide an excellent starting point for further investigation and discussion.

As with all uses of quantitative data, caution should be exercised when interpreting small differences between respondent groups. Small differences may be due to random variations in response, demographic characteristics of the respondents, method of response and many other factors, and small numerical differences may not in fact correspond to genuine differences in experience. A further reason for caution is that, for those comparisons involving institutional-level characteristics (UK nation, institution type and mission group), groups with small numbers of respondents may represent a cohort from one single institution rather than respondents from a range of institutions.

A standard method of evaluating whether patterns in the survey sample are likely to reflect patterns in the wider population is to use tests of statistical significance. The methods of analysis used in this report are detailed in Appendix A. Significance levels are included in the tables, but for ease of use significance levels of 0.05 or lower have been highlighted in bold; this is the level at which results are standardly taken to be significant, and it suggests that there is a 95% or greater probability that the patterns found in the survey sample are reflective of the final-year undergraduate population as a whole. Unless otherwise stated, where differences are significant (at the 0.05 level) the higher score is in bold text. Where more than two scores are being compared and the significance level is 0.05 or lower, the significance level itself is in bold text, indicating that there is at least one significant difference between two of the scores.

It should be noted, however, that significance testing assumes that the survey has been conducted using a random sample, or a design that approximates this. In fact, the NSS attempts to survey the whole final-year undergraduate population and, while all surveys may experience non-response bias, it can be more difficult to correct for this in a 'census'-type survey. A review by Paula Surridge for the HEA described tests for non-response bias that found no significant effect,[2] and the overall profile of NSS respondents is broadly representative of the wider student body. However, it is not possible to say whether each subgroup explored in this report (such as part-time students, or the results for HEI 'mission groups') is similarly representative. For this reason, the significance levels included in this report should only be taken as indications of confidence in the survey results, and we recommend that caution be exercised when interpreting, using or relying on small differences. Similarly, the error bars placed around institutional scores may, if anything, be too narrow where non-response bias is substantial.

In order to present the data in a more complete manner, tables rather than charts have been used for the majority of this report.[3] Because there are different response rates for each item in the NSS, no single number of responses can be given for each group in a table. Instead, the range between the lowest and the highest number of responses is shown. The percentage values included in the tables correspond to the proportion of students who agreed with the relevant statement (survey item), i.e. selected either 'definitely agree' or 'mostly agree'. The number of responses to each item includes all of the responses (including those who disagreed).

This report contains high-level analyses involving institutional and demographic characteristics. Other than The Open University, no institutions are identified anywhere in the report; in the section on part-time students, the OU's results have been separated out as they constitute such a large proportion of the part-time student responses. No group smaller than 23 students is reported, and every care has been taken to ensure that no student can be identified either directly or through implication.

The analyses included in this report were carried out by Mrs Gosia Turner. The HEA acknowledges the assistance of the Higher Education Funding Council for England (HEFCE) in
providing the NSS dataset used in this report.

[2] The National Student Survey three years on: What have we learned? (Surridge, 2009)
[3] The data contained in the tables can be used to create charts, if desired, by copying the entire table into a Microsoft Word document, and then copying the required data from that document into a Microsoft Excel spreadsheet.

2 Architecture

There are 3043 students in the NSS dataset who study Architecture at 50% FPE or more. 40.1% of the students who responded are women, 82.1% are from the UK, and 96.2% study full-time.

2.1 Comparison with all subjects combined

This table compares the experiences of students across the UK responding to the NSS in Architecture with the experience of all other students responding to the NSS. The percentages, in this table and all other tables in the report, correspond to the proportion of students who agreed with the relevant statement, i.e. selected either 'definitely agree' or 'mostly agree'. The number of responses to each item includes all of the responses (including those who disagreed).

Item | All (excl. Architecture) | Architecture | Sig.
Q1 Staff are good at explaining things | 88.1% | 81.3% | .000
Q2 Staff have made the subject interesting | 80.9% | 82.9% | .007
Q3 Staff are enthusiastic about what they are teaching | 85.4% | 84.5% | .262
Q4 The course is intellectually stimulating | 83.7% | 85.1% | .067
Q5 The criteria used in marking have been made clear in advance | 73.2% | 61.1% | .000
Q6 Assessment arrangements and marking have been fair | 74.6% | 62.8% | .000
Q7 Feedback on my work has been prompt | 62.6% | 61.1% | .000
Q8 I have received detailed comments on my work | 66.9% | 65.7% | .025
Q9 Feedback on my work has helped me clarify things I did not understand | 61.4% | 63.9% | .004
Q10 I have received sufficient advice and support with my studies | 75.0% | 72.6% | .011
Q11 I have been able to contact staff when I needed to | 83.0% | 78.1% | .000
Q12 Good advice was available when I needed to make study choices | 72.1% | 70.2% | .000
Q13 The timetable works effectively as far as my activities are concerned | 78.5% | 71.9% | .000
Q14 Any changes in the course or teaching have been communicated effectively | 73.5% | 65.5% | .000
Q15 The course is well organised and is running smoothly | 72.6% | 59.9% | .000
Q16 The library resources and services are good enough for my needs | 81.0% | 83.4% | .001
Q17 I have been able to access general IT resources when I needed to | 83.4% | 81.2% | .000
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to | 75.7% | 71.4% | .000
Q19 The course has helped me to present myself with confidence | 79.0% | 77.0% | .000
Q20 My communication skills have improved | 81.9% | 83.9% | .002
Q21 As a result of the course, I feel confident in tackling unfamiliar problems | 79.2% | 79.8% | .204
Q22 Overall, I am satisfied with the quality of the course | 83.1% | 78.1% | .000
Number of responses to each item (range lowest – highest) | 237817 – 261279 | 2963 – 3043 |

2.2 Comparison with STEM combined

This table compares the experience of students across the UK responding to the NSS in Architecture with the experience of all other students in the wider subject area of STEM responding to the NSS.

Item | STEM (excl. Architecture) | Architecture | Sig.
Q1 Staff are good at explaining things | 87.9% | 81.3% | .000
Q2 Staff have made the subject interesting | 78.4% | 82.9% | .000
Q3 Staff are enthusiastic about what they are teaching | 83.7% | 84.5% | .311
Q4 The course is intellectually stimulating | 84.2% | 85.1% | .448
Q5 The criteria used in marking have been made clear in advance | 72.6% | 61.1% | .000
Q6 Assessment arrangements and marking have been fair | 75.8% | 62.8% | .000
Q7 Feedback on my work has been prompt | 59.8% | 61.1% | .005
Q8 I have received detailed comments on my work | 61.6% | 65.7% | .000
Q9 Feedback on my work has helped me clarify things I did not understand | 58.6% | 63.9% | .000
Q10 I have received sufficient advice and support with my studies | 75.4% | 72.6% | .001
Q11 I have been able to contact staff when I needed to | 84.4% | 78.1% | .000
Q12 Good advice was
available when I needed to make study choices | 72.3% | 70.2% | .001
Q13 The timetable works effectively as far as my activities are concerned | 80.0% | 71.9% | .000
Q14 Any changes in the course or teaching have been communicated effectively | 77.2% | 65.5% | .000
Q15 The course is well organised and is running smoothly | 76.3% | 59.9% | .000
Q16 The library resources and services are good enough for my needs | 83.3% | 83.4% | .561
Q17 I have been able to access general IT resources when I needed to | 84.6% | 81.2% | .000
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to | 78.5% | 71.4% | .000
Q19 The course has helped me to present myself with confidence | 76.0% | 77.0% | .000
Q20 My communication skills have improved | 78.1% | 83.9% | .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems | 78.1% | 79.8% | .029
Q22 Overall, I am satisfied with the quality of the course | 83.7% | 78.1% | .000
Number of responses to each item (range lowest – highest) | 64464 – 69602 | 2963 – 3043 |

2.3 Relationships between aspects of the student experience

21 items in the NSS are grouped into scales, each measuring a different aspect of the student experience (see Appendix D), while item 22 examines overall satisfaction. This table shows the extent to which these different scales are correlated with one another; in other words, it gives an indication of the strength of the relationship between different aspects of the student experience. Values nearer 1 indicate a stronger relationship. However, because this analysis shows correlations rather than causal relationships, it is not possible to conclude that improving one aspect of the student experience will automatically lead to improvements in another aspect, even where the relationship appears strong.

Scale | Q22 Overall satisfaction | Quality of Learning and Teaching | Assessment and Feedback | Academic Support | Organisation and Management | Learning Resources
Quality of Learning and Teaching | .750 | | | | |
Assessment and Feedback | .654 | .620 | | | |
Academic Support | .704 | .688 | .643 | | |
Organisation and Management | .640 | .571 | .606 | .603 | |
Learning Resources | .401 | .376 | .361 | .405 | .394 |
Personal Development | .652 | .619 | .511 | .586 | .495 | .400

All correlations are statistically significant at the 0.01 level. The strongest relationship appears to be between overall satisfaction and the quality of learning and teaching.

4.9 Comparison by gender

Item | Female | Male | Sig.
Q1 Staff are good at explaining things | 88.5% | 86.3% | .137
Q2 Staff have made the subject interesting | 79.0% | 78.3% | .704
Q3 Staff are enthusiastic about what they are teaching | 84.5% | 83.6% | .904
Q4 The course is intellectually stimulating | 83.2% | 78.7% | .066
Q5 The criteria used in marking have been made clear in advance | 71.8% | 72.2% | .806
Q6 Assessment arrangements and marking have been fair | 72.7% | 70.6% | .206
Q7 Feedback on my work has been prompt | 59.9% | 57.7% | .160
Q8 I have received detailed comments on my work | 68.8% | 65.2% | .212
Q9 Feedback on my work has helped me clarify things I did not understand | 58.1% | 55.0% | .186
Q10 I have received sufficient advice and support with my studies | 73.6% | 74.1% | .824
Q11 I have been able to contact staff when I needed to | 83.2% | 82.3% | .600
Q12 Good advice was available when I needed to make study choices | 68.5% | 68.8% | .258
Q13 The timetable works effectively as far as my activities are concerned | 78.7% | 79.7% | .750
Q14 Any changes in the course or teaching have been communicated effectively | 75.4% | 75.4% | .960
Q15 The course is well organised and is running smoothly | 70.0% | 72.0% | .152
Q16 The library resources and services are good enough for my needs | 81.1% | 83.6% | .379
Q17 I have been able to access general IT resources when I needed to | 80.6% | 82.9% | .522
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to | 71.6% | 76.5% | .126
Q19 The
course has helped me to present myself with confidence | 76.8% | 80.4% | .249
Q20 My communication skills have improved | 81.6% | 83.0% | .655
Q21 As a result of the course, I feel confident in tackling unfamiliar problems | 75.2% | 78.8% | .256
Q22 Overall, I am satisfied with the quality of the course | 81.1% | 81.9% | .385
Number of responses to each item (range lowest – highest) | 616 – 664 | 686 – 729 |

4.10 Comparison by domicile

The following analysis breaks down the NSS results for the subject by students' place of residence. Students are allocated to one category only, so those based in the UK are not included in the EU category for the purpose of this analysis.

Item | UK | EU | Non-EU | Sig.
Q1 Staff are good at explaining things | 87.5% | 91.9% | 80.0% | .487
Q2 Staff have made the subject interesting | 78.9% | 78.4% | 71.1% | .655
Q3 Staff are enthusiastic about what they are teaching | 84.6% | 75.7% | 75.6% | .307
Q4 The course is intellectually stimulating | 81.4% | 62.2% | 80.0% | .015
Q5 The criteria used in marking have been made clear in advance | 72.2% | 67.6% | 71.1% | .403
Q6 Assessment arrangements and marking have been fair | 71.6% | 75.7% | 66.7% | .635
Q7 Feedback on my work has been prompt | 59.0% | 67.6% | 44.4% | .015
Q8 I have received detailed comments on my work | 66.9% | 70.3% | 64.4% | .966
Q9 Feedback on my work has helped me clarify things I did not understand | 56.6% | 63.9% | 46.7% | .406
Q10 I have received sufficient advice and support with my studies | 74.4% | 73.0% | 60.0% | .010
Q11 I have been able to contact staff when I needed to | 82.5% | 89.2% | 84.4% | .758
Q12 Good advice was available when I needed to make study choices | 68.9% | 67.6% | 62.2% | .285
Q13 The timetable works effectively as far as my activities are concerned | 79.7% | 70.3% | 73.3% | .079
Q14 Any changes in the course or teaching have been communicated effectively | 75.3% | 70.3% | 82.2% | .212
Q15 The course is well organised and is running smoothly | 71.8% | 62.2% | 57.8% | .114
Q16 The library resources and services are good enough for my needs | 83.3% | 75.7% | 62.2% | .005
Q17 I have been able to access general IT resources when I needed to | 81.7% | 75.7% | 88.9% | .206
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to | 74.6% | 65.7% | 68.9% | .532
Q19 The course has helped me to present myself with confidence | 79.4% | 78.4% | 60.0% | .002
Q20 My communication skills have improved | 82.6% | 86.5% | 73.3% | .265
Q21 As a result of the course, I feel confident in tackling unfamiliar problems | 77.7% | 70.3% | 64.4% | .098
Q22 Overall, I am satisfied with the quality of the course | 82.4% | 73.0% | 64.4% | .000
Number of responses to each item (range lowest – highest) | 1222 – 1310 | 35 – 37 | 45 – 45 |

Where there are statistically significant differences for an item, this is highlighted in bold in the 'Sig.' column.

4.11 Comparison with selected items from the Postgraduate Taught Experience Survey

The national Postgraduate Taught Experience Survey (PTES) is run annually by the Higher Education Academy in conjunction with institutions. This table shows comparisons between data from NSS items and data from relevant items in PTES. The PTES data are from the 2011 administration of the survey, and the full report can be accessed on the HEA's website. There are relevant items in PTES for all NSS items except items 10, 11, 12 and 22. For NSS items 7 and 16 there are multiple relevant items in PTES. Unless otherwise stated, the relevant item wording in PTES is either identical to the NSS item, or contains only insignificant differences. The relevant PTES item numbers are in square brackets.

Please note that whereas the NSS is compulsory for HE providers in England, Wales and Northern Ireland, PTES is voluntary: 80 institutions took part in PTES 2011, as opposed to the 253 institutions that took part in NSS 2011. Differences in results between PTES and the NSS may, therefore, reflect differences between the institutions taking part. Nonetheless, PTES includes many of the same questions as found in the NSS, as well as some that add further information to
the NSS-type questions. The results below compare the experience of those final-year undergraduates studying Planning and Landscape Design across the UK answering the NSS with the experience of taught postgraduates studying Architecture, Building and Planning.

Please also note that no tests for significance have been undertaken for this table; the differences between results for NSS and PTES items are provided for interest only, and should only be taken as indicative.

Item | NSS | PTES
Q1 Staff are good at explaining things [PTES Q4a] | 87.3% | 77.9%
Q2 Staff have made the subject interesting [PTES Q4b] | 78.6% | 75.4%
Q3 Staff are enthusiastic about what they are teaching [PTES Q4c] | 84.0% | 81.5%
Q4 The course is intellectually stimulating [PTES Q3d] | 80.9% | 83.4%
Q5 The criteria used in marking have been made clear in advance [PTES Q11a] | 72.0% | 68.8%
Q6 Assessment arrangements and marking have been fair [PTES Q11b] | 71.6% | 68.2%
Q7 Feedback on my work has been prompt [PTES Q11c] | 58.8% | 53.7%
I received feedback in time to allow me to improve my next assignment [PTES Q11d – no direct NSS equivalent] | N/A | 53.0%
Q8 I have received detailed comments on my work [PTES Q11e] | 67.0% | 66.6%
Q9 Feedback on my work has helped me clarify things I did not understand [PTES Q11f] | 56.5% | 56.8%
Q13* The timetable works effectively as far as my activities are concerned [PTES Q14a] | 79.2% | 73.0%
Q14 Any changes in the course or teaching have been communicated effectively [PTES Q14b] | 75.4% | 70.6%
Q15 The course is well organised and is running smoothly [PTES Q14c] | 71.0% | 67.3%
Q16 The library resources and services are good enough for my needs [PTES Q16a] | 82.4% | 73.2%
The library resources and services are easily accessible [PTES Q16b – no NSS equivalent] | N/A | 77.9%
I am satisfied with the quality of learning materials available to me (print, online material, DVDs etc.) [PTES Q16f – no NSS equivalent] | N/A | 70.7%
Q17 I have been able to access general IT resources when I needed to [PTES Q16c] | 81.8% | 75.4%
Q18 I have been able to access specialised equipment, facilities or rooms when I needed to [PTES Q16e] | 74.2% | 60.8%
Q19 The course has helped me to present myself with confidence [PTES Q17d] | 78.7% | 66.1%
Q20 My communication skills have improved [PTES Q17e] | 82.4% | 64.7%
Q21 As a result of my course, I feel confident in tackling unfamiliar problems [PTES Q17f] | 77.0% | 69.4%
Number of responses to each item (range lowest – highest) | 1302 – 1392 | 780 – 1027

* PTES Q14a, the equivalent of NSS item Q13, is slightly differently worded: 'The timetable fits well with my other commitments'.

5 Comparison between cognate subjects

This analysis compares NSS results for cognate subjects. Because of important differences in the demographic characteristics of students studying different subjects, and the large effect that subject of study has been shown to have on NSS results, caution should always be exercised when comparing NSS data between subjects. Cognate subjects have been selected for this comparison in order to provide potentially useful information and to suggest where there may be opportunities for learning between subjects, but the limitations of inter-subject comparison should be borne in mind. Unlike in previous tables, only students who study that subject at 100% FPE have been included in this comparison.

Item | Architecture | Building | Civil Engineering | Planning and Landscape Design | Sig.
Q1 Staff are good at explaining things | 81.3% | 80.7% | 83.0% | 86.8% | .000
Q2 Staff have made the subject interesting | 82.9% | 68.6% | 71.0% | 77.8% | .000
Q3 Staff are enthusiastic about what they are teaching | 84.4% | 75.3% | 78.6% | 83.1% | .000
Q4 The course is intellectually stimulating | 85.1% | 72.6% | 82.0% | 79.5% | .000
Q5 The criteria used in marking have been made clear in advance | 60.8% | 69.9% | 68.3% | 72.5% | .000
Q6 Assessment arrangements and marking have been fair | 62.7% | 67.0% | 72.2% | 71.2% | .000
Q7 Feedback on my work has been prompt | 61.1% | 47.5% | 52.3% | 58.0% | .000
Q8 I have received detailed comments on my work | 65.4% | 53.2% | 52.0% | 66.4% | .000
Q9 Feedback on my work has helped me clarify things I did not understand | 63.9% | 47.5% | 52.3% | 55.8% | .000
Q10 I have received sufficient advice and support with my studies | 72.4% | 69.7% | 73.6% | 72.5% | .010
Q11 I have been able to contact staff when I needed to | 77.9% | 75.5% | 82.0% | 81.9% | .000
Q12 Good advice was available when I needed to make study choices | 69.8% | 65.6% | 69.1% | 67.8% | .014
Q13 The timetable works effectively as far as my activities are concerned | 71.7% | 72.3% | 77.5% | 78.6% | .000
Q14 Any changes in the course or teaching have been communicated effectively | 65.4% | 68.4% | 75.1% | 75.6% | .000
Q15 The course is well organised and is running smoothly | 59.5% | 63.3% | 72.7% | 70.9% | .000
Q16 The library resources and services are good enough for my needs | 83.4% | 84.6% | 84.7% | 82.6% | .127
Q17 I have been able to access general IT resources when I needed to | 81.1% | 84.6% | 83.9% | 81.0% | .000
Q18 I have been able to access specialised equipment, facilities, or rooms when I needed to | 71.3% | 74.6% | 76.8% | 73.5% | .000
Q19 The course has helped me to present myself with confidence | 76.9% | 73.5% | 77.9% | 78.1% | .000
Q20 My communication skills have improved | 84.0% | 74.4% | 79.7% | 82.6% | .000
Q21 As a result of the course, I feel confident in tackling unfamiliar problems | 79.8% | 73.4% | 80.3% | 76.7% | .000
Q22 Overall, I am satisfied with the quality of the course | 77.9% | 75.3% | 80.4% | 80.7% | .000
Number of responses to each item (range lowest – highest) | 2849 – 2926 | 3330 – 3563 | 2505 – 2596 | 1119 – 1189 |

Where there are statistically significant differences for an item, this is highlighted in bold in the 'Sig.' column.

6 Additional HEA resources

The Higher Education Academy supports institutions and discipline communities in using student survey data to bring about the enhancement of the
student learning experience. For more about our work on the National Student Survey please visit http://www.heacademy.ac.uk/nss

6.1 Research

The HEA has produced a number of key pieces of research relating to the NSS:

Dimensions of Quality (2010)
Produced by Graham Gibbs, this report sets out to identify those factors that give a reliable indication of the quality of student learning. Its focus is broader than just the use of student survey data, but it provides a useful overview of different mechanisms of evaluating educational quality. Available from:
http://www.heacademy.ac.uk/assets/documents/evidence_informed_practice/Dimensions_of_Quality.pdf

The National Student Survey three years on: What have we learned? (2009)
This report by Paula Surridge summarises some key pieces of research to give an overview of findings relating to the NSS. It also gives recommendations for future work. It is a very useful guide to NSS data, especially regarding the important question of what it can and cannot tell us. Available from:
http://www.heacademy.ac.uk/assets/documents/research/surveys/nss/NSS_three_years_on_surridge_02.06.09.pdf

National Student Survey of Teaching in UK Universities: Dimensionality, multilevel structure and differentiation at the level of university and discipline: preliminary results (2008)
This report, by Herb Marsh and Jacqueline Cheng, is a technical investigation of a number of issues, focusing in particular on the relative effects on NSS scores of various factors such as institution and discipline. It is a rich source of information that can help to illuminate raw NSS data. Available from:
http://www.heacademy.ac.uk/assets/documents/research/surveys/nss/NSS_herb_marsh28.08.08.pdf

6.2 Case studies of enhancement activities

Through its Institutional Working Group, the HEA has collected case studies describing how NSS data have been used to enhance learning and teaching within institutions:

12 case studies from 2007, available from:
http://www.heacademy.ac.uk/assets/documents/subjects/bioscience/nss-case-studies.doc

Case studies from 2010, available from:
http://www.heacademy.ac.uk/assets/EvidenceNet/Case_studies/NSS_case_studies_Nov_2010.pdf

6.3 Postgraduate surveys

In addition to supporting the sector to use NSS data for the enhancement of learning and teaching, the HEA has also developed its own national surveys, looking at the postgraduate student experience.

Postgraduate Taught Experience Survey
PTES has been running since 2009, and in 2011 about 39,000 students from 80 institutions completed the survey. The survey asks students about a wide range of elements of their learning experience, including feedback, teaching and skills development. It also asks about the depth and sophistication of the learning they have engaged in. 85 institutions are taking part in the survey in 2012. For more information visit: http://www.heacademy.ac.uk/ptes

Postgraduate Research Experience Survey
PRES is the sister survey of PTES and is aimed at postgraduate research students. It runs every two years, and in 2011 over 31,000 students from 102 institutions completed the survey. The survey will next run in 2013. For more information please visit: http://www.heacademy.ac.uk/pres

6.4 Consultancy and change programmes

The HEA runs regular change programmes for departments and faculties wishing to explore their NSS results. More information can be found here: http://www.heacademy.ac.uk/change

The HEA is also currently developing an institutional consultancy service, which will provide senior managers with advice, tailored analysis and support to help them use survey data to strategically address issues in learning and teaching. If you are interested in this service then please email: nss@heacademy.ac.uk

7 Further reading

In addition to the research produced by the HEA described in the previous section, there are a number of other studies
and reviews that provide useful information about the strengths and limitations of NSS data.

Alan Fielding, Peter Dunleavy and Mark Langan (2010) Interpreting context to the UK's National Student (Satisfaction) Survey for science subjects. Journal of Further and Higher Education 34 (3), 347-368.
This is an investigation into the complex issues that can arise when interpreting NSS data. A number of important findings are contained in the article, such as the absence of a strong correlation between the experience of feedback and overall satisfaction, and the important subject differences in students' responses to the NSS items.

Abbi Flint, Anne Oxley, Paul Helm and Sally Bradley (2009) Preparing for success: one institution's aspirational and student focused response to the National Student Survey. Teaching in Higher Education 14 (6), 608-618.
This article discusses the involvement of students in the process of using NSS data for quality enhancement purposes. Various activities are described, including an event to allow academics to hear student perspectives in detail, and the publication of a 'You Said, We Did' document to inform students of the changes that had resulted from their feedback.

HEFCE (2011) National Student Survey: Findings and trends 2006 to 2010. Bristol: HEFCE.
This is the latest annual report on the NSS by HEFCE. It provides an overview of the 2010 data, as well as looking at trends in the data from 2006 to 2010 around various demographic characteristics of the student population.

Paul Ramsden, Denise Batchelor, Alison Peacock, Paul Temple and David Watson (2010) Enhancing and developing the National Student Survey: report to HEFCE. Bristol: HEFCE.
This report, commissioned by HEFCE, provided an interim evaluation of the functions and performance of the NSS, in order to arrive at recommendations about whether the survey should be updated or developed. The study proposed no substantial changes to the survey, but recommended that a full review be undertaken in 2015.
John Richardson (2005) Instruments for obtaining student feedback: a review of the literature. Assessment and Evaluation in Higher Education 30 (4), 387-415.
This is a very useful review of the research literature concerning the different kinds of survey tools that can be used to gather information about students' learning experiences.

John Richardson, John Slater and Jane Wilson (2007) The National Student Survey: development, findings and implications. Studies in Higher Education 32 (5), 557-580.
This article describes the history and development of the NSS, focusing on the mechanisms and findings of the two pilot surveys that took place in 2003 and 2004.

Ruth Williams and John Brennan (2003) Collecting and using student feedback on quality and standards of learning and teaching in HE. Bristol: HEFCE.
This is a report commissioned by HEFCE in order: i) to identify good practice in collecting feedback from students, for quality enhancement; and ii) to make recommendations about the design and implementation of a national survey of students. This report played an important role in the development of the NSS.

Mantz Yorke (2009) Student experience surveys: some methodological considerations and an empirical investigation. Assessment & Evaluation in Higher Education 34 (6), 721-739.
This article looks at a number of issues and controversies around the design and administration of sector-wide student surveys, including the NSS.

Appendix A: Brief description of analyses

Differences between subjects, and between various demographic groups within each subject, were analysed using the chi-square test. Each NSS item was recoded from the original five-point Likert scale into a smaller number of agreement categories and was treated as a discrete variable. The Pearson chi-square statistics were calculated for each NSS item separately and the relevant statistical significance values are reported in the tables. If the statistical significance value is equal to or less than 0.05, then the difference between groups on the particular NSS item is statistically significant.

NSS scale scores were calculated for each student as the averages taken over all items that belong to the same scale. The averages were taken over the original five-point Likert scale items, hence they take values between 1 and 5. Some missing values were allowed, i.e. a scale score was computed even if a student did not provide an answer to all questions in the scale, provided a minimum number of answers per scale was given. The correlation coefficients between the scales and Q22 (the overall satisfaction item) were calculated using the bivariate correlation (two-tailed) [4].

The regression analysis employs simple linear regression. The dependent variable – Q22 on the five-point Likert scale – was treated as a continuous variable, which is fairly common although not the only method used to analyse this type of data. This method was chosen for the simplicity of the output and interpretation of the regression coefficients. The NSS scale scores were used as the explanatory variables in the regression equation. Some of them are highly correlated, hence the regression analysis may be subject to the multicollinearity issue. Multicollinearity usually inflates the standard errors of the coefficients, but it does not affect the main conclusion from the model.

The graph(s) shown in the report are simple error bar graph(s). The dots represent the proportion of students who agreed with the overall satisfaction item (Q22). Error bars represent 95% confidence intervals. Please see the Introduction for further guidance on the interpretation of statistical significance and confidence intervals as used in this report.

[4] Mean scale scores can oversimplify Likert scale categories, and have only been used in this report where necessary – to undertake correlations and multiple regression analyses.

Appendix B: Full list of subjects covered in this report
Architecture includes:

K100 Architecture – The study of the design, construction and erection of structures. Combines design creativity with technical competence.
K110 Architectural Design Theory – Design of buildings for human activity, taking into account both internal and external environmental factors.
K120 Interior Architecture – The study of enclosed spaces; design, implementation and materials.
K130 Architectural Technology – The theory and practice of advanced techniques and new materials in architectural design and construction.
K190 Architecture not elsewhere classified – Miscellaneous grouping for related subjects which do not fit into the other Architecture categories.

Building includes:

K200 Building – The study of building materials and techniques. Includes building and environment law and economics, architectural engineering and quantity surveying.
K210 Building Technology – The understanding of building design and its relationship with production.
K220 Construction Management – The implementation of construction projects to the client's specification from inception to completion.
K230 Building Surveying – The analysis of a building's performance from design and construction, through to maintenance and repair.
K240 Quantity Surveying – The financial management of project design and construction, whether for client or contractor.
K250 Conservation of Buildings – The repair and restoration of old or damaged buildings.
K290 Building not elsewhere classified – Miscellaneous grouping for related subjects which do not fit into the other Building categories.

Planning and Landscape Design include:

K300 Landscape Design – The study of the design, construction and management of land-based scenery. Includes buildings within the landscape and the habitat surrounding them.
K310 Landscape Architecture – The scenic design of the natural environment and the layout of gardens and open spaces.
K320 Landscape Studies – The planning and management of the built and natural environment as landscape.
K390 Landscape Design not elsewhere classified – Miscellaneous grouping for related subjects which do not fit into the other Landscape Design categories.
K400 Planning (Urban, Rural and Regional) – The study of the interaction between town and country land use. Includes the use of land for building.
K410 Regional Planning – The preparation of strategic plans for the development of a region.
K420 Urban and Rural Planning – The planning of the infrastructure and development of settlements, including new towns and the management of change.
K421 Urban Planning – The planning of the infrastructure, development and management of settlement in towns.
K422 Rural Planning – The planning of the infrastructure, development and management of settlement in the country.
K430 Planning Studies – Reconciliation of the dynamics of the economic, environmental and social effects in the planning context.
K440 Urban Studies – The interaction of the planning process and management policies on the built environment.
K450 Housing – The development and management of housing projects in the private and social sectors and in land use planning.
K460 Transport Planning – The development and management of transportation systems.
K490 Planning (Urban, Rural and Regional) not elsewhere classified – Miscellaneous grouping for related subjects which do not fit into the other Planning (Urban, Rural and Regional) categories.
K900 Others in Architecture, Building and Planning – Miscellaneous grouping for related subjects which do not fit into the other Architecture, Building and Planning categories.
K990 Architecture, Building and Planning not elsewhere classified – Miscellaneous grouping for related subjects which do not fit into the Others in Architecture, Building and Planning categories.

Appendix C: Information about the NSS

The NSS is a survey of final-year students on undergraduate programmes. It is compulsory for publicly-funded HE providers in England, Wales and Northern Ireland, and
some Scottish institutions take part on a voluntary basis [5]. Ipsos-MORI administer the survey on behalf of HEFCE, and contact all suitable students using a variety of methods (including email and telephone). The survey was introduced in 2005; in 2011, 154 HEIs and 99 FECs took part, and 265,000 students responded – an overall response rate of 65%.

NSS data are currently available primarily from the Unistats website (http://unistats.direct.gov.uk), which allows visitors to compare overall satisfaction results at course and institutional level, as well as to download spreadsheets with more comprehensive information. In addition, HEFCE releases headline figures, as well as annual reports providing national-level analysis. From September 2012, course-level NSS data will be incorporated into Key Information Sets, which will be available on institutional websites and on a new central website.

For reasons of reliability and confidentiality, the threshold for public reportability of the results is 23 responses, which must also represent at least 50% of the eligible students. Where there are fewer than 23 responses, responses from more than one year, or from across different courses, can be aggregated to produce publicly reportable data. In addition to the public availability of the data, institutions receive their own data at a more detailed subject level. The reportability threshold for the data that institutions receive is 10 responses, rather than 23. Data at the individual student level are also available for researchers on application to HEFCE. Data at that level have been used in this report.

The NSS is based to a significant extent on the Course Experience Questionnaire (CEQ), which has been in use in Australia since 1993. There has been a significant amount of research on the CEQ, and a more limited amount on the NSS, and this research indicates that the two surveys are both reliable – they yield consistent and repeatable data – and valid – they measure what they purport to measure.

The NSS asks participants to rate their level of agreement with 22 positive statements, on a five-point scale (in addition to 'not applicable'): definitely disagree; mostly disagree; neither agree nor disagree; mostly agree; definitely agree. The statements are grouped into six areas, or 'scales', in addition to an overall statement: teaching; assessment and feedback; academic support; organisation and management; learning resources; personal development. As well as asking participants to rate their agreement with the 22 statements, the survey also invites them to add free-text comments about particular positive or negative aspects of their experience. Institutions can choose to utilise a bank of optional statements in addition to the 22 core statements, which are not publicly reported.

[5] 14 Scottish institutions took part in 2011.

Appendix D: NSS items

The teaching on my course
1 Staff are good at explaining things
2 Staff have made the subject interesting
3 Staff are enthusiastic about what they are teaching
4 The course is intellectually stimulating

Assessment and feedback
5 The criteria used in marking have been clear in advance
6 Assessment arrangements and marking have been fair
7 Feedback on my work has been prompt
8 I have received detailed comments on my work
9 Feedback on my work has helped me clarify things I did not understand

Academic support
10 I have received sufficient advice and support with my studies
11 I have been able to contact staff when I needed to
12 Good advice was available when I needed to make study choices

Organisation and management
13 The timetable works efficiently as far as my activities are concerned
14 Any changes in the course or teaching have been communicated effectively
15 The course is well organised and is running smoothly

Learning resources
16 The library resources and services are good enough for my needs
17 I have been able to access general IT resources when I needed to
18 I have been able to access specialised equipment, facilities, or rooms when I needed to

Personal development
19 The course has helped me to present myself with confidence
20 My communication skills have improved
21 As a result of the course, I feel confident in tackling unfamiliar problems

Overall satisfaction
22 Overall, I am satisfied with the quality of the course
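The analyses described in Appendix A can be illustrated with a short sketch. The data, group counts, and variable names below are entirely synthetic and hypothetical; this is not the code used to produce the report, only one plausible way to carry out each step (scale scores with missing answers allowed, a chi-square test on recoded responses, a least-squares regression of Q22 on a scale score, and a 95% confidence interval for an agreement proportion) with NumPy and SciPy:

```python
# Illustrative sketch (synthetic data) of the analyses described in Appendix A.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 200 students x 4 items of one NSS scale, on the original 1-5 Likert scale,
# with roughly 5% of answers missing.
items = rng.integers(1, 6, size=(200, 4)).astype(float)
items[rng.random(items.shape) < 0.05] = np.nan

def scale_score(block, min_answers=2):
    """Mean over a scale's items, allowing some missing answers;
    a minimum number of answers per scale is required (2 is an assumption)."""
    answered = np.sum(~np.isnan(block), axis=1)
    score = np.nanmean(block, axis=1)
    score[answered < min_answers] = np.nan
    return score

teaching = scale_score(items)            # scores lie between 1 and 5

# Chi-square test on recoded (agree / not agree) counts for one item,
# comparing two hypothetical groups of students.
table = np.array([[140, 60],             # group A: agree / not agree
                  [110, 90]])            # group B: agree / not agree
chi2, p, dof, _ = stats.chi2_contingency(table)
significant = p <= 0.05                  # the report's significance criterion

# Regression of Q22 (treated as continuous) on the scale score, via least squares.
q22 = np.clip(np.round(teaching + rng.normal(0.0, 0.7, 200)), 1, 5)
ok = ~np.isnan(teaching)
X = np.column_stack([np.ones(ok.sum()), teaching[ok]])
coef, *_ = np.linalg.lstsq(X, q22[ok], rcond=None)

# 95% confidence interval for an agreement proportion (the report's error bars),
# using the normal approximation.
p_hat = table[0, 0] / table[0].sum()
half_width = 1.96 * np.sqrt(p_hat * (1 - p_hat) / table[0].sum())
ci = (p_hat - half_width, p_hat + half_width)
```

In the report the regression uses all six scale scores as explanatory variables rather than the single score shown here; extending `X` with further columns follows the same pattern, and is where the multicollinearity caveat in Appendix A applies.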