The Irish Survey of Student Engagement (Suirbhé na hÉireann ar Rannpháirtíocht na Mac Léinn) has become an established feature of the higher education landscape in Ireland since its development in 2012-2013. Its development and implementation are driven by the intention to inform, support, and encourage enhancement discussions and activities throughout institutions, and to inform national policy.

Video Guide

Watch this Video Guide to help you use this site. 

[Video: How to use the Trends Over Time Research 2016-2021 Microsite]

Sometimes videos do not play due to the cookie settings selected for your browser. If the video above does not play, please follow the link to the video on YouTube here.

The results presented in the Trends Over Time Research 2016-2021 originate from six years of student feedback gathered by the Irish Survey of Student Engagement. Offering the same questions each year during this six-year period allows us to present national results to understand and explore any changes in students’ perceptions of their experiences and engagement with their higher education institutions. Meaningful change does not happen quickly, and it happens best when it is based on the results of multiple years of data.

Within institutions, enhancements are being made as a result of the survey, and staff and students are best placed to measure and understand the impact of those enhancements through interrogation of their institutional data, given their understanding of the local context. This research includes case studies from institutions highlighting how they have used the survey and the PGR survey to effect change. It is hoped that this body of case studies will continue to grow, providing examples and effective practices to support others who are looking to make the most of the valuable dataset the survey has created.

This research concludes with the contextualisation of results in Ireland against those of 10 other countries. Readers may find these analyses of interest, and they may spark further research.

This addition to the student engagement discourse in Ireland will stimulate discussion about the trajectory of student engagement developments and their impact over time, as well as provide participating institutions with inspiration and direction for their own analyses and actions.

Who takes part?

The survey invites responses from first year undergraduate, final year undergraduate and taught postgraduate students in higher education institutions in Ireland. Participation is encouraged by senior management, faculty and staff, as well as student representatives, with local coordination usually provided by a lead member of staff and national coordination provided by the Project Manager.


What data are collected?

The survey responses are securely collected for each participating higher education institution by a survey company. The data are anonymised and aggregated to national results. It is these national-level results that are presented in the annual National Report and in this research. The anonymous dataset of responses for each individual institution is returned to that institution for local analysis at the level of institution/ faculty/ school/ college/ department/ learning support unit/ etc.


The importance of local context

The comprehensive nature of the data gathered is a key strength of the national project. Interpretation of detailed results requires appreciation of the local context. Staff and students within participating institutions are best placed to interrogate their institutional data. Irish higher education institutions have multiple sources of data about their students, of which the survey dataset is a valuable component; these are used in varying and increasingly sophisticated ways to identify good practice and plan appropriate enhancement actions. Institutions can use the survey results as a tool to understand and improve the student experience and to measure the impact of recent interventions. Institutions are committed to interpreting and utilising the data to enhance the experiences of their students and do not support the use of student engagement results for any overly simplistic purpose that could be perceived as ranking institutions.


Use of the results at a national level

At a sectoral level, the survey data are used as part of consultations by the HEA and QQI with the higher education institutions they interact with, and by other stakeholders such as the National Forum for the Enhancement of Teaching and Learning, the National Student Engagement Programme, and the Department of Further and Higher Education, Research, Innovation and Science [government ministry]. The data are also used by the representative bodies of the higher education institutions, as well as the institutions themselves. For instance, the data sharing agreement between THEA and the higher education institutions it represents has facilitated sectoral analysis of the data by THEA. An example of cross-institutional collaboration is the report using data from the PGR survey published by the Association of Higher Education Careers Services (AHECS) Postgraduate Research Students Task Group (Lardner et al., 2020; available here). The Interim Results Bulletin 2021 (2021a; available here) is an example of how the survey brought student voices to the heart of national policy and decision-making at a time when the sector was preparing for the 2021/2022 academic year.


Use of the results internationally

Internationally, the survey's management has participated in a significant international collaboration which culminated in the publication of Global Student Engagement (Coates, Gao, Guo & Shi, 2022; available here), while the results of the PGR survey have been reflected in European research and policy (Sursock, Fuller, Michalik & Peterbauer, 2021).

Every year a National Report of the given year’s results is published and disseminated widely throughout the higher education sector (find them here). The anonymised dataset is archived with the Irish Social Sciences Data Archive (ISSDA) annually. In 2016, the format of the Irish Survey of Student Engagement was reviewed following three years of data collection fieldwork. The question set was revised and shortened to 67 questions to incorporate learning from this fieldwork and also changes to the parent survey, the National Survey of Student Engagement (NSSE).

The results presented in this research are from six years of student feedback, from 2016 to 2021, using the current question set. The use of the same questions every year allows for the presentation of national results to understand and explore any changes in students’ perceptions of their experiences and engagement with their higher education institutions.


Aims of the research

  1. Provide insight into the progress of student engagement from 2016 to 2021. This insight will be gleaned from:
    1. the presentation of quantitative data over the six years
    2. a mixture of quantitative and qualitative data from the participating institutions demonstrating the impact the survey results have had over the six years.
  2. Provide participating higher education institutions with a national comparison point against which to compare their own progress in addressing student feedback and enhancing the student experience.
  3. Provide participating higher education institutions with international comparator data, offering an international context against which they may reflect on their own results.

Readers unfamiliar with the survey or the PGR survey are advised to refer to Project rationale and governance and to read the introductions to the National Report 2021 or the PGR National Report 2021, which provide a more detailed policy and operational context for the surveys.

Higher education in Ireland during the COVID-19 pandemic

In response to the growing threat posed by the spread of COVID-19, all higher education institutions in Ireland were required to restrict access to campus for the majority of staff and students from 12 March 2020, in compliance with nationwide restrictions on movement, and begin the pivot to emergency online delivery of teaching and assessment. The unprecedented change brought about by the COVID-19 global crisis cannot be overlooked. Nic Fhlannchadha, Lau & Stanley (2022) present an examination of the results of questions included in the survey and the PGR survey designed to examine the impact of COVID-19 on students (see below and here). These results align closely with the results of the IUA Enhancing Digital Teaching and Learning (2021) report “Your Education, Your Voice, Your Vision”.

Public health guidance related to COVID-19 has necessitated a move away from the traditional on-campus higher education model towards a remote and blended/ hybrid model in the 2020-2021 and 2021-2022 academic years. Such changes may be short-lived, or they may change how higher education operates in the long term. In either case, feedback from a national survey over a number of years has enduring value in understanding the experience of students in higher education in Ireland.

The consideration of the experiences of undergraduate and postgraduate students during the COVID-19 pandemic offers the opportunity to learn from the experience. As institutions and students’ unions work together over the coming years, the feedback from the 2021 survey will serve as a crucial reminder of what is most valued by their students and what should therefore be retained under new approaches to delivery. It should also serve as a powerful measure of the national student experience of taught and research students during the COVID-19 pandemic and inform local and national efforts to minimise the negative impacts on students.


The Interim Results Bulletin 2021

The survey results from 2021 presented in the Interim Results Bulletin 2021 are valuable because they provide standardised data from 43,791 students across 25 HEIs in Ireland. These students included full-time and part-time students, Irish domiciled and internationally domiciled students, students from across a range of fields of study and undertaking a range of programme types (among other student and course characteristics). The results were all generated during national fieldwork carried out in February-March 2021, during which time Ireland was in Level 5 lockdown (the highest level of restrictions).


What did we learn?

The results suggest that respondents felt supported in terms of ongoing effective and timely communication (80.8% somewhat/ definitely agree), adequate online learning opportunities (84.3% somewhat/ definitely agree), and their ability to access the online learning sufficiently (86.3% somewhat/ definitely agree). Fewer respondents indicated that they had a suitable study environment at home (74.0% somewhat/ definitely agree). First year undergraduate respondents had the highest rate of agreement for most questions, followed closely by taught postgraduate respondents. For all questions, final year undergraduate respondents had the lowest level of agreement. The most striking difference was for how connected respondents felt to their HEI despite the restricted access to campus, with fewer than half of respondents agreeing (47.1% somewhat/ definitely agree). Here, the taught postgraduate students agreed most strongly (53.5% somewhat/ definitely agree), while the first and final year undergraduates responded more similarly and at a lower rate (46.7% and 43.8% somewhat/ definitely agree, respectively).


What makes this report special?

This interim report on the results for these specific additional questions brings a new and comprehensive evidence base into public view, and emphasises the exceptional circumstances experienced by students in higher education in Ireland during the 2020-2021 academic year.

How is the indicator score for each indicator calculated?

Indicator scores are NOT percentages but rather represent relative performance. They are calculated to enable interpretation of the data at a higher level than individual questions, i.e. to act as signposts that help the reader navigate the large dataset. Responses to questions are converted to a 60-point scale, with the lowest response placed at 0 and the highest response placed at 60. If a respondent selects “Quite a bit” as their response choice, their response converts to 40.

Indicator scores are calculated for a respondent when they answer all or almost all related questions. The exact number of responses required varies according to the indicator, based on psychometric testing undertaken for the North American National Survey of Student Engagement (NSSE). All responses are required for Higher-Order Learning, Quantitative Reasoning, Learning Strategies, Collaborative Learning, and Student-Faculty Interaction. All responses but one are required for Reflective and Integrative Learning, Effective Teaching Practices, Quality of Interactions, and Supportive Environment. The indicator score is calculated from the mean of (non-blank) responses given. Indicator scores for any particular student group – for example, the first year undergraduate cohort – are calculated as the mean of individual indicator scores.

Consequently, and crucially, indicator scores cannot be combined across indicators to calculate an average overall indicator score in any meaningful or statistically sound way.
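As a rough illustration, the scoring rules described above can be sketched in Python. This is a minimal sketch under stated assumptions, not the survey's actual implementation: the function names are invented for illustration, a four-option response scale is assumed (which makes "Quite a bit", the third of four options, convert to 40), and the true per-indicator response requirements follow the NSSE psychometric specifications noted above.

```python
def rescale(response_index, n_options=4):
    """Convert a 1-based response choice to the 0-60 scale.

    The lowest option maps to 0, the highest to 60, with the
    remaining options spaced evenly in between.
    """
    return (response_index - 1) * 60 / (n_options - 1)

def indicator_score(responses, required):
    """Individual indicator score: mean of rescaled non-blank responses.

    Returns None when the respondent answered fewer than `required`
    questions (all, or all but one, depending on the indicator).
    """
    answered = [r for r in responses if r is not None]
    if len(answered) < required:
        return None  # respondent excluded from this indicator
    return sum(rescale(r) for r in answered) / len(answered)

def group_score(individual_scores):
    """Cohort indicator score: mean of the individual indicator scores."""
    valid = [s for s in individual_scores if s is not None]
    return sum(valid) / len(valid) if valid else None
```

For example, `rescale(3)` returns 40, matching the "Quite a bit" conversion described above, and a respondent who skips two of four questions on an indicator requiring all responses would be excluded rather than averaged over the answers they did give.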


How can I best understand indicator scores for different groups?

Indicator scores provide greatest benefit when used as signposts to explore the experiences of different groups of students – for example, first year undergraduate students and final year undergraduate students, or Irish domiciled students and internationally domiciled students. Indicator scores also provide an insight into the experiences of comparable groups over multiple datasets – for example, the experiences of 2021 first year undergraduate students relative to 2020 first year undergraduate students.


How can I best understand indicator scores for different indicators?

Different indicators should not be compared to each other. For example, there is no simple, direct link between indicator scores for Higher-Order Learning and indicator scores for Reflective and Integrative Learning. No useful interpretation can be drawn from the fact that indicator scores for Higher-Order Learning are generally higher than indicator scores for Reflective and Integrative Learning.

However, the following differences could usefully be explored: Higher-Order Learning indicator scores for final year undergraduate students are higher than Higher-Order Learning indicator scores for first year undergraduate and taught postgraduate students; Reflective and Integrative Learning indicator scores appear notably lower for first year undergraduate students than Reflective and Integrative Learning indicator scores for final year undergraduate and taught postgraduate students. 


To date, analysis of the survey data demonstrates that the greatest variation is evident within higher education institutions rather than between them. This has also been found to be the case in other countries that have implemented comparable surveys.

This reinforces the view that students and staff within individual higher education institutions are best placed to interrogate their local data. They best understand the local context and are well-placed to plan appropriate enhancement actions on that basis.

Do you have questions?

Contact the Project Manager, or use any of the social media channels listed here.