What difference has StudentSurvey.ie made in your institution?

Since 2013, over 300,000 first and final year undergraduate students and taught postgraduate students have used StudentSurvey.ie to share their thoughts and feedback with their higher education institution. But in that time, what difference has StudentSurvey.ie made in your institution? How have the results been used to enhance the student experience across the 25 higher education institutions that take part?

We are asking staff and student representatives to reflect upon 9 years of StudentSurvey.ie and how this major national survey has made an impact in your institution.

What was the result that triggered a response from the institution?

The trigger for this action was the underuse of StudentSurvey.ie data at Faculty and Programme level. While there were very good examples of individual programmes making changes in response to StudentSurvey.ie, this use of the data was not embedded across the institution.


What did the institution do to respond to this feedback?

The Institute-wide Programmatic Review 2021, which takes place every five years, provided an ideal opportunity to embed StudentSurvey.ie data analysis in the student feedback loop. At Faculty level, it allowed each faculty to analyse the data and identify where interventions and improvements could be made. At programme level, it gave programme teams the opportunity to review students' responses and see how satisfied they were with their experience.

As part of student engagement activities, an analysis of student attendance and part-time working patterns was carried out and yielded interesting results. Analysis of the free-text data demonstrated the clear impact lecturers were having on students’ lives, showing how supportive and engaged they are. To enable this analysis, a programme-specific team analysed the data for their own programme. A special word of thanks to the academic staff who contributed to this report: Dr Denise Earle, Dr Gary Cahill and Ms Martina O’Gorman.


What was the impact of the response?

The response was very positive, as the academic staff could see the impact they were making. This was especially true of the qualitative responses.

As this was an Institute-wide Programmatic Review, it allowed the student voice to be heard and listened to.

Areas for improvement identified at programme level were submitted to the External Review Panel and are being implemented as part of the Programmatic Review. These will be reviewed annually.

This has highlighted to staff the impact of the StudentSurvey.ie data and how it can be interrogated to further enhance the learning experience. It has also prompted the development of an NFTL-funded Institute project to develop a dashboard that will enable staff to engage with the data more easily.


More information about the methodology

Faculty Level Analysis

For each faculty, Dr Fintan Bracken prepared an analysis of the nine indicator scores and their relevance. Table 1 gives a sample of the analysis of Faculty A scores relative to the Institute and the National results; an asterisk marks where the Faculty exceeds both the Institute and the National mean values. The time series analysis of the performance of Faculty B is given in Table 2.

Table 1 Faculty A StudentSurvey.ie indicator scores for 2020 compared to the whole of IT Carlow and nationally

Indicator                              Faculty A 2020   Institution Mean 2020   National Mean 2020
Higher-Order Learning                        36.1               36.8                  36.4
Reflective and Integrative Learning          31.3               32.1                  31.5
Quantitative Reasoning                       21.9*              20.0                  21.1
Learning Strategies                          30.5               32.0                  31.7
Collaborative Learning                       34.7*              33.0                  31.3
Student-Faculty Interaction                  17.4*              16.6                  13.9
Effective Teaching Practices                 36.5*              36.1                  34.9
Quality of Interactions                      38.6               39.1                  38.5
Supportive Environment                       29.7*              27.1                  28.0

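As an illustration of how this kind of comparison can be automated (a minimal sketch only, not the analysis prepared by Dr Fintan Bracken; pandas is assumed to be available), the Table 1 values can be placed in a data frame and flagged wherever the faculty score exceeds both comparator means:

```python
import pandas as pd

# Table 1 values transcribed from above; not an official data extract.
table1 = pd.DataFrame({
    "indicator": [
        "Higher-Order Learning", "Reflective and Integrative Learning",
        "Quantitative Reasoning", "Learning Strategies",
        "Collaborative Learning", "Student-Faculty Interaction",
        "Effective Teaching Practices", "Quality of Interactions",
        "Supportive Environment",
    ],
    "faculty_a_2020": [36.1, 31.3, 21.9, 30.5, 34.7, 17.4, 36.5, 38.6, 29.7],
    "institution_2020": [36.8, 32.1, 20.0, 32.0, 33.0, 16.6, 36.1, 39.1, 27.1],
    "national_2020": [36.4, 31.5, 21.1, 31.7, 31.3, 13.9, 34.9, 38.5, 28.0],
})

# Flag the indicators marked with an asterisk in Table 1: faculty mean
# above both the institution mean and the national mean.
table1["exceeds_both"] = (
    (table1["faculty_a_2020"] > table1["institution_2020"])
    & (table1["faculty_a_2020"] > table1["national_2020"])
)

print(table1[["indicator", "exceeds_both"]].to_string(index=False))
```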

Table 2 Faculty B StudentSurvey.ie indicator scores over time 2017-2020

Indicator                              2017   2018   2019   2020
Higher-Order Learning                  33.7   32.9   33.0   35.4
Reflective and Integrative Learning    26.3   26.4   27.3   29.2
Quantitative Reasoning                 19.0   20.4   21.4   22.1
Learning Strategies                    26.8   29.5   27.0   29.9
Collaborative Learning                 34.0   33.3   35.4   36.4
Student-Faculty Interaction            14.2   15.1   16.7   18.0
Effective Teaching Practices           34.0   31.6   32.7   35.0
Quality of Interactions                39.3   39.4   38.0   39.1
Supportive Environment                 28.6   28.8   29.0   30.2

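A similar illustrative sketch (the variable and column names are assumptions, not part of the original analysis) shows how the movement in Faculty B's indicators over the four survey years could be summarised from Table 2:

```python
import pandas as pd

# Table 2 values transcribed from above; not an official data extract.
years = ["2017", "2018", "2019", "2020"]
table2 = pd.DataFrame(
    [
        ("Higher-Order Learning", 33.7, 32.9, 33.0, 35.4),
        ("Reflective and Integrative Learning", 26.3, 26.4, 27.3, 29.2),
        ("Quantitative Reasoning", 19.0, 20.4, 21.4, 22.1),
        ("Learning Strategies", 26.8, 29.5, 27.0, 29.9),
        ("Collaborative Learning", 34.0, 33.3, 35.4, 36.4),
        ("Student-Faculty Interaction", 14.2, 15.1, 16.7, 18.0),
        ("Effective Teaching Practices", 34.0, 31.6, 32.7, 35.0),
        ("Quality of Interactions", 39.3, 39.4, 38.0, 39.1),
        ("Supportive Environment", 28.6, 28.8, 29.0, 30.2),
    ],
    columns=["indicator"] + years,
).set_index("indicator")

# Overall change across the four survey years, largest improvements first.
table2["change_2017_2020"] = table2["2020"] - table2["2017"]
print(table2.sort_values("change_2017_2020", ascending=False).to_string())
```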

Programme Level Analysis

The Programme Team in the sample below undertook a thematic analysis of the qualitative and quantitative StudentSurvey.ie data for the years 2018, 2019 and 2020. The overarching analysis, identifying the strengths and weaknesses of the programme, is summarised below.

Good features of programme

  • Practical work.
  • Lecturer-student interaction.
  • Interactive classes.
  • Project work.
  • Extra support offered.

Weaknesses of programme

  • Difficult to catch up if material is missed.
  • High workload.
  • Not enough tutorials.

Suggestions for Improvement

  • More tutorial classes.
  • More help with project work.
  • Reduce the pace of some classes.


Qualitative Analysis Outputs

These are the outputs from a sample programme.

Themes identified as being best for engaging students in learning:

  • Support – academic, social and personal.
  • Teamwork.
  • Blackboard.
  • Lecturers.
  • Practical classes.
  • Small class sizes.
‘The lecturers really get involved with the students and openly take questions and take time to explain what is not understood. The material we are taught is relevant to the course so I anyway find it interesting and engaging. Also some of the lecturers almost seem passionate about what they are teaching so it makes it easier to listen and learn from’.

Themes identified to improve students’ engagement in learning:

  • Extended library opening times.
  • More guest lectures/speakers.
  • More practical work.
  • More emphasis on continuous assessment rather than final exams.
  • More tutorials.
  • More use of Blackboard.
‘Perhaps there could be more regular short quizzes in my particular course, to help review and consolidate what we have just learned’.

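The thematic analysis itself was carried out by the programme team as a qualitative exercise. Purely as an illustration of how such theme counting could be supported in code, the sketch below tags free-text comments against the themes listed above; the keyword lists and example comments are assumptions, not StudentSurvey.ie data.

```python
from collections import Counter

# Theme names taken from the qualitative outputs above; the keyword lists
# and example comments are illustrative assumptions only.
THEME_KEYWORDS = {
    "Support": ["support", "help", "approachable"],
    "Teamwork": ["team", "group work"],
    "Blackboard": ["blackboard"],
    "Lecturers": ["lecturer", "teaching"],
    "Practical classes": ["practical", "lab"],
    "Small class sizes": ["small class", "class size"],
}

def tag_comment(comment):
    """Return the themes whose keywords appear in a free-text comment."""
    lowered = comment.lower()
    return {theme for theme, words in THEME_KEYWORDS.items()
            if any(word in lowered for word in words)}

comments = [
    "The lecturers really get involved and the practical classes are great.",
    "Small class sizes mean it is easy to ask for support.",
]

theme_counts = Counter(theme for c in comments for theme in tag_comment(c))
print(theme_counts.most_common())
```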
Quantitative Analysis Outputs

In the quantitative analysis, the themes explored include programme structure, programme facilities and programme delivery.

Student Attendance and Part-time Work

The following data are based on additional questions added to the survey by IT Carlow, which focus on attendance and the number of students with part-time jobs. The images below summarise the findings, and the impact of part-time work on attendance can be clearly seen.
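The additional questions and their response categories are not reproduced in this summary, so the column names and bands in the sketch below are assumptions for illustration only. A cross-tabulation along these lines is one way the relationship between part-time work and attendance could be examined:

```python
import pandas as pd

# Hypothetical responses to the IT Carlow attendance and part-time work
# questions; the bands and values are illustrative, not survey results.
responses = pd.DataFrame({
    "part_time_hours": ["None", "1-10", "1-10", "11-20", "11-20", "None"],
    "attendance_band": ["80-100%", "80-100%", "60-79%", "60-79%", "<60%", "80-100%"],
})

# Percentage of each part-time-work group falling into each attendance band.
crosstab = pd.crosstab(
    responses["part_time_hours"],
    responses["attendance_band"],
    normalize="index",
) * 100
print(crosstab.round(1))
```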