NTFS Project Final Report: TESTA (2009-12)


NTFS Project Final Report

TESTA (2009-12) Transforming the Experience of Students through Assessment

University of Winchester
Bath Spa University
University of Chichester
University of Worcester

TESTA Team: Dr Tansy Jessop, Yaz El Hakim, Professor Graham Gibbs, Professor Paul Hyland, Dr Duncan Reavey, Dr Ian Scott

31 July 2012

Final Report written by Project Leader: Dr Tansy Jessop


Table of contents

1. Executive Summary
2. Background
3. Aims and Objectives
4. Methodology
   4.1 Research Methodology
      4.1.1 Specific Research Tools
         a) The Programme Audit
         b) Assessment Experience Questionnaire
         c) Focus Groups
      4.1.2 Pre- and post-intervention data collection
      4.1.3 Research representation and participation
         a) Representation
         b) Participation
   4.2 Approach to Change
      4.2.1 Specifics of the Change Process
5. Planning and implementing the project work
   5.1 Project Leaders
   5.2 Programme Leaders
   5.3 Quality Assurance and Senior Management
   5.4 The Student Dimension of TESTA
   5.5 Managing Data Collection and Analysis
6. Outputs and Findings
   6.1 Research Findings
   6.2 Research Outputs
7. Outcomes
   7.1 Aims and Objectives against achievements
   7.2 Project Outcomes
   7.3 Impact on learning, teaching and research communities
   7.4 Beneficiaries: who, how, and why?
   7.5 Lessons learned
8. Conclusions
9. Implications
10. References

APPENDICES
APPENDIX A: TESTA programmes in the four partner universities
APPENDIX B: TESTA expansion beyond the partner universities
APPENDIX C: Assessment Experience Questionnaire (AEQ)
APPENDIX D: TESTA Researcher Network
APPENDIX E: Dissemination of TESTA
APPENDIX F: Publications
APPENDIX G: Network of TESTA-linked Institutions by Use and Group
APPENDIX H: Google Analytics Summary of TESTA website usage

1. Executive Summary

1. The Transforming the Experience of Students Through Assessment (TESTA) project is an NTFS research and development project designed to address assessment and feedback issues at programme level. It is built on a robust, triangulated research methodology with qualitative and quantitative elements, and underpinned by educational principles and research literature.

2. The main aim of TESTA is to enhance students' learning from assessment by providing evidence to programme teams about assessment and feedback patterns, and by helping teams to identify ways of improving assessment design in the interests of better learning outcomes.

3. TESTA is a partnership project, led by the University of Winchester and conducted within four similar institutions: Bath Spa, Chichester, Winchester and Worcester. TESTA set out to work on seven programmes in different disciplines across the four institutions. The project grew beyond the original four partners to more than 70 programmes in 17 universities in the UK and more than twenty programmes at one university in Australia.

4. The approach of TESTA has been to collect programme data, analyse and collate this into a readable case study, and engage in a conversation with the whole programme team about the findings. This discussion has welcomed lecturer clarification, contextualisation and challenge, and allowed ideas and solutions to emerge from discussion with the whole programme team.

5. The findings from TESTA, both in the partner institutions and beyond, are as follows:

• The ratio of summative to formative assessment is heavily weighted towards summative on almost all programmes. TESTA defines formative assessment as required, unmarked and eliciting feedback. This pattern of high summative and low formative assessment occurs in spite of overwhelming evidence that formative assessment helps students to learn.

• Student effort scores on the Assessment Experience Questionnaire (AEQ) are low, indicating that the assessment system's influence on study habits does not encourage high levels of effort, distributed within and across modules.

• The vast majority of students are not clear about the goals and standards of their programme by the final year of their degree. Factors which hinder students' capacity to make judgements about quality include inconsistencies between lecturer expectations and marking practices, and a seemingly 'thin', implicit and somewhat tacit relationship between assessment criteria and lecturer marking standards.

• Students on TESTA programmes describe feedback on assessment tasks as coming too late to be of use, partly because it often arrives after the close of a module. Both the timing and timeliness of feedback affect its usefulness. In some cases students describe feedback as so variable in quality, so lacking in developmental focus, or so general in tone, that it lacks usefulness.

• The TESTA process has shown that many programmes' assessment demands are built up from the module to the programme, so that lecturers often see the programme through the lens of their own modules. Modular assessments are often developed in isolation and without relation to the whole programme.

• The full assessment pattern is not apparent until it is mapped across the whole programme. In many instances, mapping demonstrates a lack of clear sequencing and progression through a whole programme, with students experiencing new and varied forms of assessment at odd points, and not gaining mastery of particular forms.

6. In the four partner universities, TESTA has collected post-intervention data on most programmes to try to understand the effects of programme-wide change. The data is complex, given that many other changes are often taking place at the same time, and that the interventions have not had long to settle in. It is also difficult to interpret the influence of interventions across different cohorts of students.

7. The methodology of TESTA, its participatory approach to change, its programmatic focus, and its open source website have given it wider appeal in the UK Higher Education sector. The project's expansion is reflected in these outputs:

• Members of the TESTA team have given more than 30 presentations and workshops, including 10 keynotes, to about 1,500 academics, mainly in the UK but also at universities in Australia and the Netherlands.

• The website www.testa.ac.uk has had more than 6,500 hits from 116 countries in two years, with users looking at an average of four pages in three minutes. The most popular pages are the research toolkit, videos, blogs and guidelines.

• TESTA team members have helped at least 70 programmes in the UK and Australia, and at least 20 UK institutions, to think more programmatically about their assessment and feedback patterns.

• The project leader/lead researcher has trained 12 researchers in five institutions to undertake the TESTA process, and colleagues from nine institutions have undergone intensive workshops on undertaking the research methodology.

• TESTA was the theme for the HEA Change Academy Programme on assessment and feedback in 2011/12, in which seven universities participated and have been supported to implement the research process.

• TESTA findings have been used by whole universities to inform institutional reviews of assessment and feedback, as well as by whole faculties to leverage programmatic changes to assessment and feedback.

8. Overall, the achievements of TESTA demonstrate the value of evidence-led approaches to change, when empirical methods are based on educational principles and the change process respects the professionalism and disciplinary specialisms of academics.

9. TESTA underscores the importance of a whole-systems approach to change, focusing on programmes, programme leaders, quality assurance managers and students working in collaboration with researchers and academic developers, and with the support of senior managers.

10. The TESTA project expanded beyond its funded remit to support the conduct of research in more than four times the number of universities envisaged, and more than ten times the number of programmes planned, demonstrating its value for money and strengthening the credibility and generalisability of its research findings. The expansion occurred in spite of the widespread perception, and the reality, that the TESTA methodology is complex and resource intensive, involving statistical analysis of survey data, qualitative analysis of significant volumes of textual data, and access to students and potentially sensitive data.

Key Words

Programme assessment, feedback, change process.

Acknowledgements

We are grateful to have been awarded a grant through the National Teaching Fellowship Scheme project strand initiative, funded by the Higher Education Funding Council for England (HEFCE) and managed by the Higher Education Academy. I am particularly grateful to Professor Graham Gibbs, our consultant, for his generosity, wisdom and experience, and his capacity to draw on a funny story to help guide the TESTA team and programmes through difficult terrain. Thank you to the project leaders at Bath Spa, Chichester and Worcester for their commitment and lively participation, and to the many programmes who opened their doors to us. Finally, thank you to all the students who give voice to this project, and for whom we hope it will make a difference in the longer term.


2. Background

Research demonstrates the centrality of assessment for learning (Ramsden 1992; Knight 1995; Boud and Falchikov 2006). Assessment influences student perceptions of, and satisfaction with, higher education, reflected year after year in low scores on the National Student Survey, which in turn feed university league table rankings. Assessment requirements profoundly influence the study behaviour of students (Innis 1996; Gibbs and Dunbar-Goddet 2007), while assessment outcomes affect graduate opportunities (Boud 1995). Assessment demonstrates what we value in teaching and learning, and significantly shapes the lives of our students.

The design of assessment systems may contribute to a host of unintended and unwanted consequences for student learning, which may only be visible at programme level. Across the sector, HEA Subject Centre and FDTL projects reflecting on the importance of assessment have largely focused at module level. Knight and Yorke (2003) describe one of the main consequences of modular degrees as the depletion of formative assessment opportunities (see also Rust 2000; Yorke 1998). The modular system has divided course credits into semester-long units in order to facilitate transferability across universities in the UK, and within countries that are signatories to the Bologna process. As a consequence, students study for degrees in discrete units, in compressed time, with the need to measure and accredit their achievements being the main driver of a largely summative assessment diet.

Recently, there has been a resurgence of research interest in programme-wide assessment, reflecting anxieties that degree coherence, progression and 'slow learning' are at risk in modular degrees (Bloxham and Boyd 2007; Claxton 1998; Gibbs and Dunbar-Goddet 2008, 2009; Knight and Yorke 2003; Knight 2000; Rust 2000). In the literature, there is recognition that changing assessment to improve student learning will require a whole-programme approach. In responding to sector-wide problems in the design of assessment, the ASKe CETL (Price, O'Donovan et al. 2008) developed an 'assessment manifesto' for improving student learning through assessment, emphasising formative assessment and feedback, reducing summative assessment and raising questions about written criteria and feedback. Research has demonstrated that institutions have distinctive 'assessment environments' that frame programme, discipline and module-level practice (Gibbs and Dunbar-Goddet 2007). One institution may have one sixtieth of the formative-only assessment of another, nine times as much summative assessment, and half as much written feedback delivered four times as slowly (ibid.).

The TESTA project grew out of evidence and literature suggesting that assessment regimes in UK higher education strongly favoured a diet of summative assessment over formative, and that this was detrimental to student learning. In parallel, a growing body of research was suggesting that modular degree structures were having deleterious effects on assessment design and student learning through an emphasis on each module's assessment rather than the coherence of the whole programme's assessment diet.

The importance of the TESTA project has been its emphasis on gathering empirical evidence on whole-programme assessment regimes, shared with programme teams in the context of tried and tested assessment and feedback principles (Gibbs and Simpson 2004; Nicol and Macfarlane-Dick 2007). The project design has underscored the value of collecting robust and particular research data, presented as a case study of the typical student's assessment experience across a whole degree programme. TESTA identified a need for programme teams to engage with the 'big picture' of what assessment and feedback regimes looked like from a student perspective, and to identify and address problematic patterns. In essence, it seemed that change required whole programmes to take a new approach.

3. Aims and Objectives

The aims and objectives of TESTA as originally envisaged in the bid are captured in Table 1 below. The table highlights some refining of our initial approach, for example in using postgraduate rather than undergraduate students as research assistants, and the complexity of measuring learning gains over a three-year research and development project with different cohorts of students and many different and intersecting variables. The final objectives section of the table demonstrates the value for money of the TESTA NTFS project in terms of its reach. The project grew from seven programmes to more than 70 programmes in 18 different institutions, more than ten times its original scope (see Appendices A and B).

Table 1: Aims and Objectives of TESTA

Aims of TESTA

1. To improve student learning by changing assessment patterns at the programme level.
How this aim changed: No change.

2. To develop a deeper understanding of patterns of assessment in different disciplines and universities.
How this aim changed: No change.

3. To work with whole programme teams in developing evidence-based assessment designs based on educational principles.
How this aim changed: No change.

4. To engage students in the research process, including as paid research assistants, facilitators of focus groups and participants.
How this aim changed: We used PhD and Masters students rather than undergraduates because of the complexity and time-consuming nature of the methodology. We trained about a dozen researchers and research assistants from more than ten universities to use the methodology.

5. To 'measure' or use proxy indicators to measure improvements in student learning.
How this aim changed: Measuring improvements in student learning proved more complex than we anticipated, for three main reasons: 1) the timeline for bedding in interventions was too short (one year conducting research, one year bedding in, one year measuring); 2) many other variables influence student learning from assessment; for example, in one partner institution the second year also marked a shift from 15-credit semesterised modules to 20-credit year-long and semester modules; 3) the cohort effect: we were 'measuring' the effects of changes with different student cohorts. In many senses TESTA became less about measuring effects than about changing the culture of how assessment is understood and executed, in line with evidence and educational principles.

6. To leverage change in institutional QA arrangements.
How this aim changed: No change.

Objectives of TESTA

1. To collect data from seven programmes in the four partner institutions.
How this objective changed: Expanded to serve 20 programmes in the partner universities and more than 50 programmes in a further 15 universities.

2. To disseminate the TESTA methodology to four additional institutions through the offer of free consultancy.
How this objective changed: Expanded to serve at least a further 15 universities.

3. To share the methodology and findings through an open source and user-friendly website.
How this objective changed: No change.

4. Methodology

4.1. Research Methodology

TESTA is based on a triangulated research methodology drawing on the conceptual foundation of Gibbs and Simpson's (2004) paper on the conditions under which assessment improves student learning. The reasons we adopted this methodology over others were that it combined qualitative and quantitative methods; it was a tried and tested, published methodology; and we needed to present robust programme data to teams. Gibbs and Dunbar-Goddet (2007, 2009) had published on distinguishing university and programme-level assessment environments using this method, and we knew that it discriminated features of programme environments.

4.1.1. Specific Research Tools

The TESTA methodology uses three tools to triangulate data: a programme audit, which consists of a qualitative interview with the programme leader and a content analysis of programme documents; a quantitative questionnaire to determine students' learning behaviour in relation to assessment and feedback patterns in the programme; and focus groups with between three and eight students to provide explanation for some of the phenomena suggested by the audit and questionnaire.


a) The Programme Audit

The programme audit consisted of a discussion with the programme leader over programme documents to elaborate and quantify the 'official' assessment regime, as well as sampling written feedback from the administrator. The purpose of the audit was to discover what assessment and feedback a typical student might expect over the course of a three-year degree programme. The audit captured data on the following aspects of assessment on the programme:

• number of summative tasks
• number of formative tasks
• varieties of assessment
• proportion of exams to coursework
• amount of written feedback in word counts
• calculation of 'formal' oral feedback through tutorials, generic feedback etc.
• overview of criteria, learning outcomes and course documents.
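As an aside for readers who want to automate parts of the collation, the sketch below shows one way module-level counts could be rolled up into the whole-programme audit figures listed above. It is a minimal illustration, not the TESTA instrument itself: the audit was compiled by hand from documents and the programme leader interview, and the `ModuleAudit` fields, module names and figures here are hypothetical.

```python
# Illustrative only: rolls hypothetical module-level audit records up into
# whole-programme figures of the kind the TESTA audit records by hand.
from dataclasses import dataclass

@dataclass
class ModuleAudit:
    name: str
    summative_tasks: int
    formative_tasks: int     # required, unmarked, eliciting feedback
    varieties: set           # e.g. {"essay", "exam", "portfolio"}
    exam_tasks: int
    written_feedback_words: int

def audit_summary(modules):
    """Collate module counts into whole-programme audit figures."""
    summative = sum(m.summative_tasks for m in modules)
    return {
        "summative tasks": summative,
        "formative tasks": sum(m.formative_tasks for m in modules),
        "varieties of assessment": len(set().union(*(m.varieties for m in modules))),
        "proportion of exams": sum(m.exam_tasks for m in modules) / summative if summative else 0.0,
        "written feedback (words)": sum(m.written_feedback_words for m in modules),
    }

# Two invented modules standing in for a typical student's pathway:
programme = [
    ModuleAudit("Module A", 2, 0, {"essay", "exam"}, 1, 800),
    ModuleAudit("Module B", 2, 1, {"essay", "presentation"}, 0, 1200),
]
print(audit_summary(programme))
```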

b) Assessment Experience Questionnaire (AEQ)

The AEQ is a survey developed by Gibbs and Simpson (2003) (see Appendix C). The AEQ has been used widely to measure the extent to which students experience various conditions of learning at module level. The version used in this project had been revised to distinguish between programme environments (Gibbs and Dunbar-Goddet 2007, 10). It consists of 28 statements clustered into nine scales linked to conditions of learning from assessment, with one overall satisfaction item. Students respond on a five-point Likert scale, ranging from strongly agree (5) to strongly disagree (1). The nine scales on the AEQ are:

• Quantity and distribution of effort
• Coverage of syllabus
• Quantity and quality of feedback
• Use of feedback
• Appropriateness of assessment
• Clear goals and standards
• Deep approach
• Surface approach
• Learning from exams
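As a minimal sketch of how such responses become scale scores, the snippet below averages 1-5 Likert answers into per-scale means, reverse-keying negatively worded items. The item numbers, the item-to-scale mapping and the reverse-keyed item are invented purely for illustration; the actual 28-item instrument and its scales are reproduced in Appendix C.

```python
# Illustrative AEQ-style scoring: the item-to-scale mapping and the
# reverse-keyed item below are invented, not the published instrument.
SCALE_ITEMS = {
    "Quantity and distribution of effort": [1, 2],   # hypothetical items
    "Quantity and quality of feedback": [3, 4],      # hypothetical items
}
REVERSED = {2}  # hypothetical negatively worded item

def scale_means(responses):
    """Turn one student's 1-5 Likert answers (5 = strongly agree) into scale means."""
    def score(item):
        return 6 - responses[item] if item in REVERSED else responses[item]
    return {scale: sum(score(i) for i in items) / len(items)
            for scale, items in SCALE_ITEMS.items()}

# One student's hypothetical answers to items 1-4:
print(scale_means({1: 4, 2: 2, 3: 3, 4: 5}))
# {'Quantity and distribution of effort': 4.0, 'Quantity and quality of feedback': 4.0}
```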

c) Focus Groups

We conducted two to five focus groups per programme, each with three to ten final-year students, in order to provide textual explanation for scores on the AEQ and to cross-check the audit data. The discussion ranged over types of assessment, how assessment influences effort, what feedback is like, when it reaches students, how useful it is, perceptions of online, oral and written feedback, ideas about 'feed forward', and how students develop a good 'nose' for quality.


Figure 1: Diagram of TESTA research process

4.1.2. Pre- and Post-intervention Data Collection

The design of the TESTA research process included pre- and post-intervention cycles of data collection to try to assess impact. Six out of seven programmes had undergone a full TESTA cycle of data collection and team discussion by June 2010. In most cases interventions ran from September 2010 for one year, before we started a second cycle of data collection. This entailed a repeat of the AEQ, programme audit and focus groups to try to capture data about the effects of various interventions.

4.1.3. Research Representation and Participation

a) Representation

The formulation of case studies drew on Richardson's (1990) work on academic writing, and on research on the pedagogic use of case studies (Kreber 2001), which suggests that "case studies should tell a good story, raise a thought-provoking issue, contain elements of conflict, promote empathy with the central characters, lack an obvious, clear-cut answer, take a position, demand a decision and be relatively concise" (Gross-Davis 1993, 162, in Kreber 2001). The formula we adopted for representing qualitative data from focus groups used the notion of headlines and supporting quotations, which proved a powerful incentive for lecturers to read the quotations. Richardson (1990) argues that in most social science research, readers skip over the quotations and read the interpretation; we wanted to reverse this trend and encourage lecturers to listen to the student voice.


b) Participation

The single most powerful moment in the TESTA process was the meeting with programme teams over their case studies, which they had not yet seen or read. Prior to this meeting we met with the programme leader over their case study to ensure that the data did not compromise ethics or confidentiality. The programme team meeting was framed as a collegiate conversation with a strong developmental focus. The TESTA researchers presented the case study as unfinished business rather than the 'final word' about a programme, because we knew that lecturers would add context to our interpretation, fill in gaps and correct misinterpretations. In every case the meeting proved to be a catalyst for thoughtful change by programme teams, and presented a rare opportunity for a whole team to catch sight of what the entire programme assessment pattern looked like and how the assessment process was experienced by students.

4.2. Approach to Change

In TESTA we used an approach to change similar to three categories in Land's 'Orientations to Academic Development' (in Eggins and Macdonald 2003). Our work drew on the following orientations:

• The researcher: this orientation sees evidence as the most influential way of persuading colleagues and assumes a rational-empirical approach to change.

• Interpretive-hermeneutic: this approach prizes dialogue to balance different views, and draws on multiple perspectives to bring about new shared understandings of practice. Land describes this orientation as conversational.

• Discipline-specific: this orientation works with the grain of disciplines, respects different disciplinary traditions and sees these as the product of a situated community of practice (Wenger 1991).

4.2.1. Specifics of the Change Process

The TESTA process developed a participatory approach to sharing, discussing and acting on the data. The first step of this process was for the researcher(s) to meet with the programme leader over the case study to cross-check for accuracy and to assure the anonymity of lecturers and students. Following this meeting, the programme leader and TESTA lead researcher assigned a two-hour slot for meeting with the whole programme team. The meeting was set up to be as collegiate as possible; in many cases it took place over lunch and allowed for live discussion over the papers, which team members were led through for the first time at the meeting. The purpose of the meeting was to set the evidence before teams while opening a discussion about contextual factors, discipline-specific issues and pedagogic principles. We set the tone of the meeting by inviting team members to corroborate, challenge and contextualise the case study using their own experience and expertise as lecturers. This allowed for a full and rich discussion. The TESTA team kept a record of the discussion, especially of priorities and suggested interventions, which was then sent back to the team. Following the meeting, discussions with the programme leader involved setting and steering teams towards whole-programme interventions, based on the evidence, and facilitating quality assurance mechanisms to enable teams to make quick changes to assessment patterns. After a year or so of the intervention being in process, the TESTA team used the same research methodology (audit, AEQ, focus groups) to evaluate the influence of changes. The following diagram illustrates the process:

Figure 2: Diagram of the change process

5. Planning and implementing the project work

5.1. Project Leaders

One key challenge for TESTA was how to work consistently across the four partner institutions. The project structure identified a lead person in each institution who was responsible for identifying programmes to work on, and who acted as a lynchpin for all of our activities. At Worcester and Bath Spa, the project leaders were both heads of Learning and Teaching; at Chichester the project leader was a principal lecturer with responsibility for Learning and Teaching in one faculty, who taught on the programme which underwent TESTA.

Initially we envisaged that the project leaders would have responsibility for conducting the research, but we decided to conduct the research from Winchester, for two reasons: (a) our externality provided strategic leverage for project leaders to use in facilitating working with and acting on the data; (b) consistency of approach was essential for the comparability and robustness of the data. The role of the project leaders became to facilitate access, to give validity and support to the research we conducted, to prioritise TESTA findings as important for programme leaders, lecturers and senior managers in each institution, and, as team members, to refine and critique our processes and methodology, ensuring they were 'fit for purpose'.

5.2. Programme Leaders

The second tier of leadership in the TESTA project was at the level of the programme leaders. In all four institutions, the project leaders identified possible programmes and had opening conversations with the programme leaders about participation. An informal criterion for programme selection was that the programme leader had a strong interest in learning and teaching; in some cases an additional criterion was that there appeared to be pressing assessment issues signalled by NSS scores. The TESTA budget provided funding for programme teams to participate, which in a few cases allowed for module buy-out, but in most cases was used to pay student research assistants and for student time to participate in focus groups. It also funded in-house programme team lunches at which the data was discussed.

5.3. Quality Assurance and Senior Management

We held meetings with Quality Assurance and Senior Managers to discuss fast-tracking processes in advance of the change interventions. In all four institutions, 'fast-track' protocols for evidence-led changes were put in place for TESTA programmes. In two institutions, Chichester and Worcester, major overhauls of degree programmes occurred within months of the findings being presented and were rolled out in the next academic year. At Winchester, findings about the influence on pedagogic practice of the non-specification of formative assessment in definitive documents have led to its compulsory inclusion in programme documents. Bath Spa University was undergoing a major institution-wide overhaul of its academic degree structure from 15 to 20 credit modules, some year-long, concurrent with the TESTA project. This led to programmes at Bath Spa engaging with pedagogic rather than structural interventions, given their context of widespread change and institutional prerogatives.

In the second year of the project, the project team held a successful Senior Manager Network. Early findings were compelling to senior managers, who redoubled their support for the project. Worcester invited Graham Gibbs to give a keynote talk about TESTA at their annual Learning & Teaching day, which led to five more programmes requesting to undergo TESTA. At Chichester, two additional researchers were trained to undertake TESTA and two further programmes are undergoing the process.

At Bath Spa and Winchester the first cycle of TESTA research generated additional interest. Graham Gibbs gave keynotes at Learning & Teaching events in both universities. His keynote at Bath Spa was filmed for the TESTA website. At Winchester, five additional programmes requested to undergo TESTA, with the support of the PVC (Academic). Winchester used TESTA to inform revisions to programmes in the light of changes to the academic year structure (2011/12). Project leaders were invited to present a one-day workshop on TESTA findings and programmatic strategies for assessment to 61 programme leaders in September 2010. The TESTA project leader trained a researcher at Bath Spa University to conduct the process on an additional five programmes.

5.4. The Student Dimension of TESTA

There were two ways in which students became involved in TESTA. The first was through participation in the project as research assistants. Eight PhD and Masters students at four universities (Bath Spa, Birmingham, Winchester and Worcester) were mentored through the research process, enabling them to become competent at conducting focus groups and programme audits and, in one or two cases, at analysing statistics from the AEQ (see Appendix D).

The second and main engagement of students was as research participants in focus groups following completion of the AEQ in class. The AEQ was handed out as a paper survey in class with an explanation of the significance of the project and the student voice. Small groups of students were recruited to participate in focus groups at convenient times, when the AEQ was distributed. The focus groups generated interest in assessment and feedback processes through the discussion, and some programme leaders reported in-class discussion about assessment processes after focus groups had taken place. The TESTA budget paid for student research assistants, and allocated book vouchers for student time in focus groups.

Recruiting students to participate in focus groups is a common challenge in research in Higher Education. In successfully conducting research over three years with 274 students in 47 focus groups at six universities, the lessons TESTA researchers learnt about securing student participation may be summarised as follows:

• The single most important factor in ensuring student participation was that lecturers on the programme voiced open support for the research project to their students.

• Recruitment of students is far more likely to be successful if researchers come into a class and provide face-to-face information about the project, negotiating opportunities for conducting the research at convenient times.

• Conversely, we found that using email or online appeals for participation was singularly unsuccessful.

• The idealistic pitch that students can help to improve the programme for future generations of students through their participation had some appeal.

• Students in their final year of a course are quite attracted to participating in research when it comes with the promise of learning new techniques for gathering data, which may have application to their dissertation work.

• Incentives like book vouchers, or better still Amazon vouchers, to pay for students' time (not their 'answers') are an obvious attraction, as are sweeteners like nibbles and drinks.

5.5. Managing Data Collection and Analysis

The TESTA project has generated huge volumes of data. Most of the statistics (AEQ: n ≈ 1,200) were entered and analysed by one researcher at Winchester under the guidance of the project co-leader with knowledge of statistics. The qualitative data (audits: n ≈ 19; focus groups: n ≈ 47, with 274 students) have been analysed mainly by the lead researcher/project leader, using Atlas.ti qualitative data analysis software. The lead researcher/project leader has taken the main responsibility for triangulating the data and crafting the case studies. Refinements and revisions occurred through discussion and use by members of the project team. A consistent approach has come about through assigning lead responsibility for data analysis and representation to a small core team. Simultaneously, we have sought to ensure sustainability and to support the expansion of TESTA by training up a network of researchers in various institutions capable of undertaking TESTA (Appendix D). We have also supported the research expansion by undertaking the statistical analysis of the AEQ on at least eight programmes at five additional universities through the Winchester TESTA team.
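As one illustration of the kind of pre/post comparison this dataset supports, the sketch below tests whether an AEQ scale mean shifted between cycles. An independent-samples t-test is a natural choice because the pre- and post-intervention respondents are different cohorts; all figures are invented and not drawn from TESTA data.

```python
# Invented pre/post AEQ scale means for two different student cohorts.
from scipy import stats

pre_effort = [2.8, 3.0, 2.5, 3.1, 2.9, 2.7]    # pre-intervention cohort
post_effort = [3.2, 3.4, 3.0, 3.6, 3.1, 3.3]   # post-intervention cohort

t, p = stats.ttest_ind(pre_effort, post_effort)
print(f"t = {t:.2f}, p = {p:.3f}")
# A small p-value would suggest a genuine shift in effort scores, though
# concurrent curriculum changes and cohort effects still demand caution.
```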

6. Outputs and Findings

6.1. Research Findings

The findings discussed below describe assessment phenomena on 22 programmes at eight UK universities with which the TESTA team has worked closely. The TESTA project expanded beyond its funded remit to support the conduct of research in more than four times the number of universities envisaged, and more than ten times the number of programmes planned, demonstrating its value for money and strengthening the credibility and generalisability of its research findings. The expansion occurred in spite of the widespread perception, and the reality, that the TESTA methodology is complex and resource intensive, involving statistical analysis of survey data, qualitative analysis of more than a thousand pages of focus group transcripts (47 focus groups at an average of 25 pages each), and negotiating access to students and potentially sensitive data. The first wave of expansion of TESTA, from seven programmes in four universities to 22 programmes in eight universities, is part of a bigger expansion (see Appendix B), but we feel confident to report findings from this sample because we have worked closely on its data analysis. The section below reflects on TESTA's key findings:

• The ratio of summative to formative assessment is heavily weighted towards summative on almost all programmes. TESTA defines formative assessment as required, unmarked and eliciting feedback from peers, tutors or through electronic feedback. This pattern of high summative and low formative assessment occurs in spite of overwhelming evidence that formative assessment helps students to learn.

• Student effort scores on the AEQ are low, indicating that the assessment system's influence on study habits does not encourage high levels of effort, distributed within and across modules. The assessment pattern on most degree programmes allows students to work in fits and starts. The common pattern of two summative assessments per module with very little formative assessment is consistent with low effort levels.

• In spite of well-documented modules and programmes, outlining aims, learning outcomes and assessment tasks reasonably clearly, the vast majority of students on programmes are not clear about goals and standards by the final year of their degree. Factors which hinder their capacity to make judgements about quality include inconsistencies between lecturer expectations and marking practices, and a seemingly 'thin' and somewhat tacit relationship between assessment criteria and lecturer marking standards. Without intentional strategies to help students internalise goals and standards, students lack confidence in their own judgement of what constitutes 'good'.

• Students on TESTA programmes describe feedback on assessment tasks as coming too late to be of use, partly because it often arrives after the close of a module. Both the timing and timeliness of feedback affect its usefulness. Electronic administrative procedures may also work against the use of feedback, especially where marks are released electronically and written feedback is administered on paper after the release of marks. In some cases students describe feedback as so variable in quality, so lacking in developmental focus, or so general in tone, that it lacks usefulness.

• The TESTA process has shown that many programmes' assessment demands are built up from the module to the programme, so that lecturers often see the programme through the lens of their own modules. Modular assessments are often developed in isolation and without relation to the whole programme. The full assessment pattern is not apparent until it is mapped across the whole programme. Mapping often demonstrates a lack of clear sequencing and progression through a whole programme, with students experiencing new and varied forms of assessment at odd points, and not gaining mastery of particular forms.

6.2. Research Outputs

• TESTA has helped more than 20 UK institutions and more than 70 degree programmes to think more programmatically about their assessment and feedback patterns, with tried and tested educational principles at hand (see Appendix G).

• TESTA has developed and refined a case study format, which can be found at www.testa.ac.uk/resources/case-studies/, and the lead researcher has written more than 20 case studies for use by programme teams.

• Members of the TESTA team have given more than 30 presentations and workshops about TESTA, including 10 keynotes, mainly in the UK but also at universities in Australia and the Netherlands (Appendix E).

• Members of the team have published three peer-reviewed publications and one online publication on the HEA Escalate subject centre website (Appendix F). Several publications are in the pipeline. The first is a methodological paper which provides a detailed description and analysis of the method and key findings; the second explores a thematic slice of data looking at findings on goals and standards, mainly in relation to formative processes, or their absence; the third explores findings about student effort in relation to assessment patterns; a fourth paper explores findings related to feedback; a fifth paper examines and compares TESTA data on two BA Primary programmes to elicit the 'signature pedagogy' elements (Shulman 2005); and finally we plan to write a paper exploring disciplinary differences in assessment patterns, comparing sciences, humanities, creative subjects and applied professional programmes.

• The project leader/lead researcher trained 12 researchers in six institutions to undertake the TESTA process (Appendix D), and colleagues from nine institutions have undergone intensive workshops on undertaking the research methodology (Appendix E).

• TESTA was the theme for the HEA Change Programme on assessment and feedback in 2011/12. This has generated a fresh community of practice across seven institutions which are using TESTA for a variety of purposes and contexts: as a tool for periodic review (Coventry University); on postgraduate programmes (Essex University); with flexible and craft/studio-oriented programmes (Dundee); to strengthen assessment and feedback processes across five faculties (Birmingham); to enhance feedback processes (London Metropolitan & Robert Gordon); and using less resource-intensive research processes (Keele).

• TESTA findings have been used by universities (for example, East Anglia) to inform institutional reviews of assessment and feedback, as well as by whole faculties (Arts and Social Sciences at the University of New South Wales) to leverage programmatic changes to assessment and feedback.

• The open source TESTA website www.testa.ac.uk has provided the HE community with research tools, case studies, videos, blogs and other salient resources. To date, 6,500 users from 116 countries have looked at an average of four pages of the website for three minutes at a time (Appendix H). The project leaders field questions from users on a regular basis. The University of Winchester has undertaken to sustain and resource the development of the website until 2014.

7. Outcomes

The main outcome of TESTA has been to change the way participating programme teams think about assessment and feedback processes. TESTA has challenged a view of assessment through the lens of 'my module' and restated the value of a collaborative programme-wide assessment design, which attends to sequencing, timing, linkages and student learning across the whole programme. Within this paradigm shift from a modular to a programmatic view of assessment, and through discussion with programme teams, the value of formative processes has been underlined, and many lecturers on programme teams have begun to wrestle with strategies to embed formative processes in ways which students take seriously.

One of the most successful interventions following TESTA has been the insertion of a final-year module which requires students to enter and comment on each other's academic blog posts across the year, and incorporates face-to-face discussion of these blog posts in seminar groups. The success of this process has been measured through post-intervention data collection and improved NSS scores on assessment and feedback. Other change processes have addressed variations in marking standards through team marking workshops and analysis; improving students' use of feedback by requiring self-reflection; and managing the return times of feedback by planning hand-in dates more carefully, for example. Finally, as mentioned in section 5.3, change processes have incorporated some institutional changes to quality assurance practices to facilitate 'fast-track' processes for evidence-informed, pedagogically driven changes to assessment design.

7.1. Aims and objectives against achievements

Table 2: TESTA aims and achievements

Aims of TESTA

1. To improve student learning by changing assessment patterns at the programme level.
Achievements: All the original TESTA programmes have made evidence-informed changes to their degree programmes, including:
• lengthening modules to facilitate more joined-up assessment patterns (Psychology, Worcester);
• streamlining variety and making more links between theory and practice in assessment (Nursing, Worcester);
• reducing summative assessment and increasing formative (Media Studies, Winchester);
• providing more dialogic peer and tutor feed-forward (American Studies, Winchester);
• streamlining and sequencing assessment (BA Primary, Chichester);
• developing more consistent and clear marking practices and standards (Creative Writing and History, Bath Spa);
• developing iterative and reflective cover sheets to encourage student engagement with feedback (History, Bath Spa).
The measurement of the effects of these changes on student learning is complex, and although several programmes have shown improvements on individual scores in the AEQ, the changes have been statistically significant in only one case.

2. To develop a deeper understanding of patterns of assessment in different disciplines and universities.
Achievements: TESTA has deepened understanding about assessment and feedback patterns as discussed in the findings. It has also begun to discriminate between different discipline trends in assessment patterns, student working habits, and feedback practices across social science, arts, creative, professional and 'hard' science programmes. A few examples include:
• more, smaller summative and formative tasks distributed across weeks in 'hard' science programmes, sometimes leading to higher effort scores;
• greater variety of assessment in arts, social science and creative arts degrees, sometimes leading to confusion among students about standards;
• more peer and tutor formative feedback practices on some creative degrees, leading to greater clarity about goals and standards;
• a higher ratio of exams in sciences and some professionally regulated degrees, sometimes leading to better coverage of the syllabus.

3. To work with whole programme teams in developing evidence-based assessment designs based on educational principles.
Achievements: This was achieved by TESTA project team members on 20 programmes in the partner institutions and with two teams beyond the partners, one at Birmingham and another at Queen Mary University of London. Programme team discussions were at the heart of the TESTA process, becoming a powerful catalyst for changes in thinking and practice about assessment and feedback.

4. To engage students in the research process, including as paid research assistants, facilitators of focus groups and participants.
Achievements: More than 270 students participated in TESTA focus groups, some 1,200 completed AEQs, and a network of more than 12 new researchers were trained to conduct TESTA, most of whom were PhD and Masters students (Appendix D).

5. To 'measure' or use proxy indicators to measure improvements in student learning.
Achievements: A second cycle of post-intervention data collection using the same tools was completed on five out of seven programmes. The data collected gives some indication of improvements but does not measure these as effectively as we would have liked. We are asking programme teams to give us three years of NSS scores on assessment and feedback to see if these could be used as proxy indicators, but the same issues of 'noise' from other big changes, and the need for longer bedding down, will be present.

6. To leverage change in institutional QA arrangements.
Achievements: We have had modest success at testing the concept of fast-track changes based on evidence through TESTA. Discussions with QA managers have led to slight shifts towards a stronger pedagogic focus in documents and processes to assure quality.

Objectives of TESTA

1. To collect data from seven programmes in the four partner institutions.
Achieved: Yes.

2. To disseminate the TESTA methodology to four additional institutions through the offer of free consultancy.
Achieved: Yes, and beyond this to 14 additional universities.

3. To share the methodology and findings through an open source and user-friendly website.
Achieved: Yes.

7.2. Project Outcomes



• The substantive change that TESTA has brought about has been a shift in understanding about assessment and feedback, and particularly its influence on student learning at the programme level. TESTA has underlined the importance of taking a whole-programme approach to assessment design, and illustrated the prominence of assessment for measurement over assessment for learning on modular degree programmes.

• TESTA's main research outcome has been to place in the hands of the HE academic community a credible method for viewing whole-programme assessment environments; a robust set of qualitative and quantitative tools for critically examining a programme's assessment pattern; and a participatory approach to using evidence to bring about local changes.

• In the four TESTA partner institutions, and particularly at the lead institution, academics from a variety of disciplines have been encouraged to think more programmatically about assessment and feedback, and to reassess the balance between summative and formative assessment. Some programmes have adopted gateway assessments to require students to engage in formative tasks; others are using portfolio or patchwork-type assessments; while others have moved towards problem-based learning formats. All of these approaches value formative processes and help students to clarify goals and standards, and most demand more 'time on task' from students.

7.3. Impact on learning, teaching and research communities

• TESTA has been disseminated to more than 30 live audiences (circa 1,500 people in total) and one major online event (a JISC Expert Session at its Annual Conference, circa 180 participants) (Appendix E), and the website has attracted approximately 6,500 visitors since it was launched in September 2010. This widespread dissemination indicates that TESTA is being widely used by the global higher education community, although it does not directly demonstrate impact.

• Stronger impact indicators flow from the fact that many institutions in the UK and one in Australia have asked the TESTA team to help them use the methodology. There is a network of 26 institutions (Appendix G) which are using TESTA and its principles to inform assessment design and the academic structuring of degree programmes.

• The publication of TESTA findings and the research process has been relatively slow, given that we have placed more emphasis on institutions using the methodology to benefit the student and staff experience of learning and teaching. We have published an article in the SEDA journal Educational Developments, looking at aspects of the research and change process adopted in TESTA. We have published a paper in Active Learning in Higher Education exploring the relationship between quality assurance processes and assessment design, and we have published a case study of a teacher training programme's TESTA findings on the Escalate website at http://escalate.ac.uk/studentfeedback. One disciplinary paper on Arts and Humanities students' perspectives from TESTA data has recently been accepted for publication and is in press in Arts and Humanities in Higher Education 12(1) (see Appendix F for a list of publications).

• Taken together, the impact on the wider teaching, learning and research community has been to re-emphasise programme assessment design with a focus on student learning, sequencing of tasks, enabling students to distribute their effort, raising expectations, and helping students to clarify goals and standards through iterative assessment tasks. One under-emphasised aspect of TESTA's impact has been to show how principled and pedagogically sound approaches to assessment and feedback can also increase staff efficiencies, by placing more responsibility on student participation in assessment processes, for example through cycles of peer review, students re-writing assessment criteria, or reflective tasks about their feedback.

• The impact on learning and teaching at the partner institutions has been to reinforce formative assessment (assessment for learning) as a sine qua non of assessment and feedback. Few academics in the partner institutions would now question the value of formative assessment, and many lecturers are now wrestling with convincing ways of making formative assessment matter to students. In the lead institution, managers talk about the 'TESTA effect', by which they mean the influence of viewing degrees from the programme perspective; the importance of having 'thick' descriptions of what NSS scores mean; and the value of using educational principles related to assessment design, feedback and deep learning. TESTA has generated conversations, discussions and conceptual shifts which are difficult to measure but evident in the redesign of many programmes.

7.4. Beneficiaries: who, how and why?

Who benefits: 4 partner universities
How they benefit: Institutional benefits of deepening relationships, and learning about how four similar universities address assessment and feedback and QA processes. Two of the partners have successfully bid for a JISC institutional change project on assessment and feedback, using technology to address challenges raised on TESTA programmes (see www.fastech.ac.uk).
Why they benefit: Because it reinforces collegial patterns of cross-institutional working, and exposes universities to new and different approaches to assessment and feedback.

Who benefits: 30 programme leaders
How they benefit: Engagement with their teams in an evidence-led change process, through which they are given data about their programme's assessment and feedback patterns and how these influence student learning.
Why they benefit: Because they are able to negotiate changes on the basis of evidence and educational principles; because the research is done by trustworthy 'outsiders'; and because the focus is developmental.

Who benefits: 350 lecturers in the partner institutions and at two universities where the team have worked directly with programme teams (QMUL and Birmingham)
How they benefit: Through participating in a team approach to using evidence and making whole-programme changes on the basis of that evidence; through discussion of evidence, experience and assessment and feedback principles in relation to learning and teaching.
Why they benefit: Because the focus is on the whole programme and the whole programme team, and because the data is particular and respects disciplinary difference.

Who benefits: 274 students who participated in TESTA focus groups
How they benefit: Through raising awareness about assessment and feedback in a discussion.
Why they benefit: Because it creates more dialogue and thoughtfulness about A&F processes among the student community.

Who benefits: 10,160 students in the partner institutions and the two universities where the team have worked directly
How they benefit: Through changes in assessment design to enhance learning, effort, and clarity of goals and standards, and to provide an environment that encourages deep learning.
Why they benefit: Because these changes are designed to improve the student learning experience and may produce better graduates.

Who benefits: 12 new researchers
How they benefit: Through learning how to undertake complex and detailed research, engaging with colleagues and students in collecting and sharing data.
Why they benefit: Because it enhances their research knowledge and skills, and enables researchers to strengthen the nexus between research and teaching.

7.5. Lessons learned

• TESTA has demonstrated the value of robust research in persuading academic colleagues to engage in learning and teaching projects. The fact that the research was both qualitative and quantitative, and contained the student 'voice' alongside academic perspectives, helped to make it compelling for academic colleagues.

• The presentation of data and findings were important elements in the successful reception of TESTA data. Programme teams found the case study format, with headlines and quotations from students, really helpful in focusing on key issues.

• TESTA's emphasis on participatory, collegial conversations over the findings, with space for lecturer contextualisation, corroboration and challenge, was powerful in ensuring that findings were negotiated, owned and refined by members of programme teams. This meant that the actions which followed from the evidence came from programme teams.

• The design of the project in the partner institutions included an intervention phase, which was helpful in crystallising a plan of interventions with the prospect of a second round of data collection. However, most programmes have not been as focused on discussing the post-intervention data and findings, or as keen on finding evidence of impact. The main influence of TESTA seems to have been in the first phase, as a catalyst for change, with the excitement that generated, rather than in reflecting on how the changes have played out. This may be the result of the compressed timeframe for evaluation, and/or scepticism as to whether any real effects can be shown among the many other variables at play.

• The expansion of the TESTA project flowed mainly from resourcing the offer of one day's free consultancy to four institutions by Graham Gibbs at a SEDA workshop, one year into TESTA. Coupled with the open source website, with its user-friendly, transparent and easily downloadable tools, the practice of sharing information in a collegial way has helped TESTA to gain a national reputation.

8. Conclusions

TESTA has gained a reputation as a valuable National Teaching Fellowship project because of its focus on programme-level assessment, its robust research process and its roots in educational principles about teaching and learning. The success and growth of TESTA in the four partner institutions, which started with seven programmes and grew to more than twenty, is matched by expansion beyond the partner institutions to a further 13 UK universities and one Australian university. Altogether some 70 programmes have undergone the TESTA process.

Does TESTA work? As a research tool, a way of thinking about assessment, and a call to more programmatic approaches to how students learn, TESTA has had a profound influence in the lead institution and the partner institutions. It has influenced the partners and many other universities and academics in their thinking about assessment: to view the whole programme, to prioritise assessment for learning, to sequence tasks carefully, to reduce summative assessment in favour of formative, and to encourage dialogue and closer working between colleagues about what standards mean.

So does TESTA work? As a way of improving student learning from assessment, we do not yet have enough evidence to make bold claims. Against a backdrop of widespread institutional change, it has been difficult to measure the influence of single programme interventions on student learning. Given that on most programmes interventions have only run for one year, it is difficult to make judgements based on data from one year of implementation.


So is TESTA worth doing? The value of TESTA seems to lie in getting whole programmes to discuss evidence and work together to address assessment and feedback issues as a team, drawing on their disciplinary knowledge, experience of students, and understanding of resource implications. The voice of students, corroborated by statistics and programme evidence, has a powerful and particular effect on programme teams, especially as discussion usually raises awareness of how students learn best. Again and again, TESTA evidence has shown itself to be a catalyst for action and change in the interests of improving student learning.

9. Implications

TESTA has re-focused attention on assessment design at the programme level as a way of encouraging coherence, collegiality and high standards of teaching and learning. In a context where many new lecturers are part-time or fractional, it seems all the more important that emphasis is placed on whole-programme planning, design and team dialogue to ensure consistent approaches and standards in assessment and feedback tasks. TESTA has underscored this notion of a community of practice.

TESTA’s findings also have some application to debates about the value of modularity and student choice. TESTA illustrates that modular assessment systems need to be well planned and managed to maximise sequencing, connections and coherence of learning from assessment. Slow, developmental learning over the years of a degree is compromised when modules are disconnected and students feel they can ‘tick off’ modules.

Quality Assurance systems often speak the language of the programme in validation and review processes, but findings from TESTA have shown that many lecturers view the programme through the lens of ‘their’ modules.



TESTA has generated valuable data on programme assessment patterns in different disciplines and fields of study. Winchester holds more than 1200 AEQ and audit returns on an SPSS database from 22 programmes at eight universities, and data from 47 student focus groups. The further research potential of this data is vast, for example:
o Discriminating between science, arts, humanities and professional course assessment patterns;
o Sharing best practice across disciplines;
o Exploring in more detail the statistical correlations between audit and AEQ scores (a sketch of one such analysis follows this list);
o Collating case studies to represent particular assessment phenomena to the sector.
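As an illustration of the third point, a correlation analysis might proceed along the following lines. This is a minimal sketch, assuming the AEQ returns and programme audit measures have been exported from the SPSS database to CSV files; the file names and column names here are hypothetical illustrations, not the project’s actual variable names.

```python
import pandas as pd

# One row per student AEQ return, keyed by programme (hypothetical columns:
# programme, use_of_feedback, deep_approach, satisfaction, ...).
aeq = pd.read_csv("aeq_returns.csv")

# One row per programme audit (hypothetical columns: programme,
# summative_count, formative_count, feedback_volume, ...).
audit = pd.read_csv("programme_audits.csv")

# Aggregate the AEQ scale scores to programme level, since the audit
# characterises whole programmes rather than individual students.
aeq_by_programme = aeq.groupby("programme").mean(numeric_only=True)

# Set audit measures alongside AEQ scale means and inspect the correlations.
merged = aeq_by_programme.join(audit.set_index("programme"))
print(merged.corr(numeric_only=True).round(2))
```

Programme-level aggregation is the natural unit of analysis here, because the audit describes whole programme assessment environments rather than individual students.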



Developing the research potential of TESTA and sharing it with the community might be facilitated by using the HEA’s doctoral scheme to appoint a dedicated researcher, competent in statistics and qualitative analysis, to pull together the threads and explore the data from different angles.

TESTA has generated a research community which will be sustained beyond the project life. This raises questions about how to fund activities in the future and how to support these researchers. Some universities are using internal funding mechanisms to support TESTA researchers (e.g. Birmingham), but others may not have access to this kind of support.

TESTA has used data from students powerfully and effectively, but a further refinement may be to use undergraduate students as research assistants and change agents, as is being done, for example, in the JISC-funded FASTECH project at Bath Spa and Winchester, which builds on TESTA’s approach, and at Exeter University, which has been a forerunner in student participation in research and change processes.

10. References

Adams, J. and McNab, N. (forthcoming) Understanding Arts and Humanities Students’ Experiences of Assessment and Feedback. Arts and Humanities in Higher Education. 12(1).
Bloxham, S. and Boyd, P. (2007) Planning a programme assessment strategy. In Developing Effective Assessment in Higher Education. Berkshire. Open University Press. 157-75.
Boud, D. and Falchikov, N. (2006) Aligning assessment with long-term learning. Assessment & Evaluation in Higher Education. 31(4) 399-413.
Boud, D. (Ed.) (1995) Enhancing Learning through Self-assessment. London. Kogan Page.
Claxton, G. (1998) Hare Brain, Tortoise Mind. London. Fourth Estate.
Feedback and Assessment for Students with Technology (FASTECH) (2011-14) A JISC-funded project. Website available at: www.fastech.ac.uk Accessed 2 July 2012.
Gibbs, G. and Dunbar-Goddet, H. (2007) The effects of programme assessment environments on student learning. Higher Education Academy. Available at: http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/gibbs_0506.pdf Accessed 1 July 2012.
Gibbs, G. and Dunbar-Goddet, H. (2009) Characterising programme-level assessment environments that support learning. Assessment and Evaluation in Higher Education. 34(4) 481-9.
Gibbs, G. and Simpson, C. (2004) Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education. 1(1) 3-31.
Gross-Davis, B. (1993) Tools for Teaching. San Francisco. Jossey Bass Higher and Adult Education Series.
Innis, K. (1996) Diary Survey: how undergraduate full-time students spend their time. Leeds. Leeds Metropolitan University.
Jessop, T., McNab, N. and Gubby, L. (2012) Mind the gap: An analysis of how quality assurance procedures influence programme assessment patterns. Active Learning in Higher Education. 13(3) 143-154.
Jessop, T., El Hakim, Y. and Gibbs, G. (2011) The TESTA Project: Research Inspiring Change. Educational Developments. 12(4) 12-16.
Jessop, T., Lawrence, P. and Clarke, H. (2011) TESTA: Case Study of a Change Process. BA Primary, University of Winchester. Available at: http://escalate.ac.uk/studentfeedback Accessed 1 July 2012.
Knight, P. (2000) The value of a programme-wide approach to assessment. Assessment and Evaluation in Higher Education. 25(3) 237-51.
Knight, P. (2001) Complexity and curriculum: A process approach to curriculum-making. Teaching in Higher Education. 6(3) 369-81.
Knight, P. and Yorke, M. (2003) Assessment, Learning and Employability. Maidenhead. Open University Press.
Knight, P. (Ed.) (1995) Assessment for Learning in Higher Education. London. Kogan Page.
Kreber, C. (2001) Learning Experientially through Case Studies? A Conceptual Analysis. Teaching in Higher Education. 6(2) 217-228.
Land, R. (2003) Orientations to Academic Development. In Eggins, H. and Macdonald, R. (Eds) The Scholarship of Academic Development. 34-46. The Society for Research into Higher Education & Open University Press.
Nicol, D.J. and Macfarlane-Dick, D. (2006) Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education. 31(2) 199-218.
Price, M., O’Donovan, B., Rust, C. and Carroll, J. (2008) Assessment standards: a manifesto for change. Brookes eJournal of Learning and Teaching. 2(3). Available at: http://bjelt.brookes.ac.uk Accessed 15 July 2012.
Ramsden, P. (1992) Learning to Teach in Higher Education. London. Routledge.
Richardson, L. (1990) Writing Strategies: Reaching Diverse Audiences. Newbury Park, California. Sage.
Rust, C. (2000) Opinion piece: A possible student-centred assessment solution to some of the current problems of modular degree programmes. Active Learning in Higher Education. 1. 126-31.
Shulman, L. (2005) Signature pedagogies in the professions. Daedalus. 134(3) 52-59.
TESTA (2009-12) Transforming the Experience of Students through Assessment. Higher Education Academy, National Teaching Fellowship Project. Available at: www.testa.ac.uk Accessed 5 July 2012.
Yorke, M. (1998) The management of assessment in higher education. Assessment and Evaluation in Higher Education. 23(2) 101-116.


APPENDIX A: TESTA programmes in the four partner universities

Summary of TESTA programme activity in partner universities. Key to cycles of data collection: pre and post data collected; pre completed, post in progress; completed; expansion programmes; expansion programmes in process.

1. University of Winchester (9): American Studies; BA Primary; Creative Writing; History; Law; Media Studies; Social Work; Theology and Religious Studies; English Language Studies
2. Bath Spa University (6): Creative Writing; Education Studies; Graphic Communications; Geography; History; Media Communications
3. University of Worcester (2): Nursing; Psychology
4. University of Chichester (3): Drama and Performing Arts; Midwifery; Biology; Computing; Early Childhood Education; Sport Studies; BA Primary; Sport Science; Music

(26 programmes)

APPENDIX B: TESTA expansion beyond the partner universities

Summary of TESTA expansion in non-partner universities. Key to routes of expansion: expansion post SEDA workshop; HEA Change Academy expansion; website and word-of-mouth expansion.

External universities:
1. University of Birmingham
2. University of Kent
3. Edge Hill University
4. Queen Mary University
5. University of New South Wales, Australia: 20 programmes in the Faculty of Arts and Social Sciences (FASS), a faculty with 7,000 UG students
6. Liverpool John Moores
7. Keele University
8. Dundee
9. Robert Gordon
10. Coventry
11. Essex
12. London Metropolitan
13. Roehampton
14. Durham

Programmes at the UK institutions: English; Biochemistry; Psychology; Civil Engineering; History; Modern Languages; Pharmacy; Psychology; Computer Science; Web Systems; Politics & International Relations; Film Studies; Sport Development; Business Studies; Building Surveying; Psychology; BA Childhood Practice; BDes Textile Design; Nursing; Social Work; Applied Social Studies; Occupational Therapy; Sport and Exercise Science; several programmes/TBC; Sport and Exercise; LLB; BSc Maths; Masters in Public Health; Education; Dance; Science programmes.

Total UK: 30. Total UK and Australia: 50.

APPENDIX C: Assessment Experience Questionnaire (V3.3)

By filling out this questionnaire I understand that I am agreeing to participate in a research study.

Please respond to every statement by circling 1 = strongly disagree; 2 = disagree; 3 = neutral; 4 = agree; and 5 = strongly agree to indicate the strength of your agreement or disagreement.

Programme of Study: …………………… …….
Biographical data (please tick as appropriate): Female………. Male……..
Age: (17-21…..) (21-30………) (30+……)
Average achievement on this course: (1st……) (2:1…….) (2:2…….) (3…….)

Please respond with respect to your experience so far of the programme named above, including all its assessment components. Each statement is answered on the 1-5 scale above.

1 I used the feedback I received to go back over what I had done in my work
2 The feedback I received prompted me to go back over material covered in the course
3 I received hardly any feedback on my work
4 You had to study the entire syllabus to do well in the assessment
5 The assessment system made it possible to be quite selective about what parts of courses you studied
6 The way the assessment worked you had to put the hours in regularly every week
7 It was always easy to know the standard of work expected
8 I paid careful attention to feedback on my work and tried to understand what it was saying
9 The teachers made it clear from the start what they expected from students
10 The staff seemed more interested in testing what I had memorised than what I understood
11 It was possible to be quite strategic about which topics you could afford not to study
12 It was often hard to discover what was expected of me in this course
13 On this course it was necessary to work consistently hard to meet the assessment requirements
14 Too often the staff asked me questions just about facts
15 I didn’t understand some of the feedback on my work
16 Whatever feedback I received on my work came too late to be useful
17 The way the assessment worked on this course you had to study every topic
18 To do well on this course all you really needed was a good memory

These questions are about the way you go about your learning on the course
19 When I’m reading I try to memorise important facts which may come in useful later
20 I usually set out to understand thoroughly the meaning of what I am asked to read
21 I generally put a lot of effort into trying to understand things which initially seem difficult
22 I often found myself questioning things that I heard in classes or read in books
23 I find I have to concentrate on memorising a good deal of what we have to learn
24 Often I found I had to study things without having a chance to really understand them

Learning from the exam (only to be completed if there were exams on the course)
25 Doing exams brought things together for me
26 I learnt new things while preparing for the exams
27 I understood things better as a result of the exams

Overall satisfaction
28 Overall I was satisfied with the quality of this course

Comments you would like to make:


Scales

Quantity of effort (alpha=0.69)
6 The way the assessment worked you had to put the hours in regularly every week
13 On this course it was necessary to work consistently hard to meet the assessment requirements

Coverage of syllabus (alpha=0.85)
4 You had to study the entire syllabus to do well in the assessment
5 The assessment system made it possible to be quite selective about what parts of courses you studied (Negative scoring)
11 It was possible to be quite strategic about which topics you could afford not to study (Negative scoring)
17 The way the assessment worked on this course you had to study every topic

Quantity and quality of feedback (alpha=0.61)
3 I received hardly any feedback on my work (Negative scoring)
15 I didn’t understand some of the feedback on my work (Negative scoring)
16 Whatever feedback I received on my work came too late to be useful (Negative scoring)

Use of feedback (alpha=0.70)
1 I used the feedback I received to go back over what I had done in my work
2 The feedback I received prompted me to go back over material covered in the course
8 I paid careful attention to feedback on my work and tried to understand what it was saying

Appropriate assessment
10 The staff seemed more interested in testing what I had memorised than what I understood (Negative scoring)
14 Too often the staff asked me questions just about facts (Negative scoring)
18 To do well on this course all you really needed was a good memory (Negative scoring)

Clear goals and standards
7 It was always easy to know the standard of work expected
9 The teachers made it clear from the start what they expected from students
12 It was often hard to discover what was expected of me in this course (Negative scoring)

Surface Approach
19 When I’m reading I try to memorise important facts which may come in useful later
23 I find I have to concentrate on memorising a good deal of what we have to learn
24 Often I found I had to study things without having a chance to really understand them

Deep Approach
20 I usually set out to understand thoroughly the meaning of what I am asked to read
21 I generally put a lot of effort into trying to understand things which initially seem difficult
22 I often found myself questioning things that I heard in classes or read in books

Learning from the examination (alpha=0.78)
25 Doing the exams brings things together for me
26 I learn new things while preparing for the exams
27 I understand things better as a result of the exams

Satisfaction
28 Overall I am satisfied with the teaching on this course
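Where items above are marked ‘(Negative scoring)’, disagreement counts towards a higher scale score, so raw responses on the 1-5 format are conventionally reverse-coded (6 minus the raw answer) before the scale mean is taken; the alpha values are Cronbach’s alpha reliabilities for the item groupings. The following is a minimal sketch of how such scoring might be implemented, assuming responses are held one item per column (item_1 … item_28); it is an illustration under those assumptions, not the project’s actual scoring script.

```python
import pandas as pd

# Scale definitions from the listing above; reverse-scored ("Negative
# scoring") items are marked with a minus sign.
SCALES = {
    "quantity_of_effort": [6, 13],
    "coverage_of_syllabus": [4, -5, -11, 17],
    "quantity_and_quality_of_feedback": [-3, -15, -16],
    "use_of_feedback": [1, 2, 8],
    "appropriate_assessment": [-10, -14, -18],
    "clear_goals_and_standards": [7, 9, -12],
    "surface_approach": [19, 23, 24],
    "deep_approach": [20, 21, 22],
    "learning_from_the_examination": [25, 26, 27],
    "satisfaction": [28],
}

def score_scales(responses: pd.DataFrame) -> pd.DataFrame:
    """Compute one scale-mean column per scale from raw 1-5 responses
    held in columns item_1 ... item_28 (hypothetical column names)."""
    out = {}
    for scale, items in SCALES.items():
        cols = []
        for item in items:
            raw = responses[f"item_{abs(item)}"]
            # Reverse-code negatively scored items: 1 becomes 5, 2 becomes 4, etc.
            cols.append(6 - raw if item < 0 else raw)
        out[scale] = pd.concat(cols, axis=1).mean(axis=1)
    return pd.DataFrame(out)
```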


APPENDIX D: TESTA Researcher Network

Penny Lawrence, University of Winchester
Sabine Bohnacker, University of Winchester
Vanessa Harbour, University of Winchester
Laura Gubby, University of Winchester
Nicole McNab, University of Winchester
Camille Shepherd, University of Winchester
Joelle Adams, Bath Spa University
Scott Denton, University of New South Wales
Judy Cohen, University of Kent [1]
Mark Tymms, University of Worcester [2]
Rob Fordham, University of Chichester
Sean Russell, University of Birmingham
Deborah Brewis, University of Birmingham
Komalirani Yenneti, University of Birmingham

Postgraduate Student (Full Time); Postgraduate Student (Part-Time)

[1] Trained by Laura Gubby.
[2] Trained by Dr Ian Scott, NTF and Project Leader at Worcester.

APPENDIX E: Dissemination of TESTA

Keynotes: 10
Conference presentations: 8
Workshops: 7
Staff development events: 8

Keynotes
University of Derby, 5 July 2012: Graham Gibbs presented a TESTA keynote at the annual L&T day.
City University, 13 June 2012: Yaz El Hakim presented a TESTA keynote at the annual L&T day.
Dundee University, June 2012: Graham Gibbs gave a keynote on TESTA at the annual L&T day.
Kingston University, April 2012: Tansy Jessop presented a TESTA keynote at their annual L&T day.
Southampton Solent University, March 2012: Yaz El Hakim presented a TESTA keynote at their annual L&T day.
Anglia Ruskin University, 23 Feb 2012: Graham Gibbs presented on TESTA.
Liverpool John Moores University, 14 June 2011: Graham Gibbs presented a keynote on TESTA at the annual L&T day.
University of Utrecht, 10 Mar 2011: Graham Gibbs presented a keynote at the L&T conference at Utrecht.
University of Worcester L&T Day, June 2010: Graham Gibbs gave a keynote on assessment and feedback using TESTA data.
University of Winchester Blue Skies L&T Day, 10 June 2010: Graham Gibbs led a keynote session on programme-wide changes to assessment and feedback.
(10 keynotes)

Conference presentations
PASS Event, University of Bradford, 24 July 2012: Yaz El Hakim gave a presentation on TESTA at the NTFS wrap-up conference.
HEA Annual Conference, 4 July 2012: Graham Gibbs presented on TESTA.
JISC Annual Online Expert Session, TESTA to FASTECH, 22 Nov 2011: Tansy Jessop, Yaz El Hakim and Paul Hyland presented an expert session about TESTA findings and principles, and the technological potential to improve assessment processes in FASTECH.
Australian Learning and Teaching Council, April 2011: Paul Hyland presented on TESTA at an ALTC meeting of History specialists in Sydney.
Higher Education Surveys for Enhancement Conference, 19 May 2011: Graham Gibbs and Yaz El Hakim presented on using the AEQ to develop programme assessment. http://www.heacademy.ac.uk/events/detail/2011/academyevents/19_May_Surveys_For_Enhancement
Fifth Biennial EARLI Assessment Conference, 1-3 September 2010: Tansy Jessop and Yaz El Hakim presented a joint symposium at the EARLI-SIG Assessment Conference in Northumbria with Bradford’s PASS project.
Higher Education Institutional Research Conference, Dublin, 28-29 June 2010: Yaz El Hakim and Tansy Jessop conducted a workshop exploring ‘Myths, Truths and Traditions: Barriers to Change in Assessment in HE’ through the programme audit.
Assessment and Feedback Swapshop, University of Chichester, 25 May 2010: Tansy Jessop and Yaz El Hakim presented TESTA findings, with Vini Lander and Duncan Reavey from the BA (Primary Education and Teaching) contributing their experience as participants in the project.
(8 conference presentations)

Workshops
HEA Change Academy, 28-29 Feb 2012: Tansy Jessop and Yaz El Hakim gave interactive workshop sessions on using the TESTA methodology at the Change Academy for the seven participating universities.
Liverpool John Moores University, September 2011: Tansy Jessop, Yaz El Hakim and Nicole McNab led a one-day workshop on using the TESTA methodology.
University of New South Wales, Sydney, September 2011: Tansy Jessop led a two-day workshop with Faculty of Arts and Social Sciences colleagues and trained a TESTA project researcher.
Southampton Solent University, May 2011: Tansy Jessop led a TESTA workshop at a Faculty of Art and Design away day.
Southampton Solent University, December 2010: Tansy Jessop and Yaz El Hakim led a workshop session at a faculty L&T event.
SEDA Workshop, Woburn House, October 2010: Graham Gibbs led a one-day workshop with Carol Smith, Programme Leader of American Studies from Winchester.
Workshop for Programme Leaders, University of Winchester, September 2010: Tansy Jessop and Yaz El Hakim led a one-day workshop for colleagues on the implications of TESTA findings for restructuring the academic year at Winchester.
(7 workshops)

Staff development events
University of Birmingham, 10 July 2012: Tansy Jessop presented to College Leads at Birmingham and colleagues from Nottingham University.
HEA Change Academy, January 2012: Graham Gibbs gave an overview of the principles underpinning TESTA at the Change Academy for the seven participating universities.
University of New South Wales, September 2011: Presentation to Associate Deans at UNSW.
Programme-wide assessment: doing ‘more with less’, 1 June 2011: Tansy Jessop, Yaz El Hakim and Carol Smith presented at the regional HEA assessment seminar hosted at Winchester.
Oxford Brookes University School of Business, 15 Mar 2011: Graham Gibbs presented on TESTA.
Nottingham Trent University, 25 Jan 2011: Graham Gibbs presented on TESTA.
University of East Anglia, Nov 2010 and Feb 2011: Graham Gibbs led several sessions and workshops.
PASS Event, Oxford Brookes University, 25 May 2010: Graham Gibbs spoke on TESTA.
(8 staff development events)


APPENDIX F: PUBLICATIONS

Adams, J. and McNab, N. (forthcoming) Understanding Arts and Humanities Students’ Experiences of Assessment and Feedback. Arts and Humanities in Higher Education.
Jessop, T., El Hakim, Y. and Gibbs, G. (in submission) The whole is greater than the sum of its parts: a study of students’ learning responses to differing programme assessment patterns. Assessment and Evaluation in Higher Education.
Jessop, T., McNab, N. and Gubby, L. (2012) Mind the gap: An analysis of how quality assurance procedures influence programme assessment patterns. Active Learning in Higher Education. 13(3) 143-154. Available at: http://alh.sagepub.com/content/13/2/143.full.pdf+html
Jessop, T., El Hakim, Y. and Gibbs, G. (2011) The TESTA Project: Research Inspiring Change. Educational Developments. 12(4) 12-16.
Jessop, T., Lawrence, P. and Clarke, H. (2011) TESTA: Case Study of a Change Process. BA Primary, University of Winchester. Available at: http://escalate.ac.uk/studentfeedback


APPENDIX G: Network of TESTA-linked Institutions by Use and Group

These institutions have either used the TESTA methodology on programmes or to inform institutional review, or asked members of the team to talk to academic staff about TESTA findings and implications for assessment design. The use of TESTA is given after each institution, with the university mission group in brackets.

1. University of Winchester: Progs/wider review (GuildHE/UUK)
2. Bath Spa University: Progs/wider review (Million+)
3. University of Worcester: Progs/wider review (GuildHE/UUK)
4. University of Chichester: Progs/wider review (GuildHE/UUK)
5. Southampton Solent University: Faculty & wider review (GuildHE/UUK)
6. University of Portsmouth: Staff Dev and meetings (University Alliance)
7. Queen Mary University London: Programme-level (Russell Group)
8. Edge Hill University: Programme-level (UUK)
9. Nottingham Trent University: Staff Development (University Alliance)
10. Kingston University: Institutional review (Million+/Univ Alliance)
11. University of Kent: Programme-level (UUK)
12. University of East Anglia: Institutional review (1994 group)
13. Liverpool John Moores University: Progs/wider review (University Alliance)
14. Utrecht University, the Netherlands: Institutional review
15. University of Roehampton: Programme-level (UUK)
16. University of Birmingham: Progs/wider review (Russell Group)
17. University of NSW, Sydney, Australia: Progs/faculty & institutional review (Universitas 21)
18. University of Nottingham: Staff development (Russell Group)
19. University of Dundee: Programme-level (UUK)
20. London Metropolitan University: Programme-level (UUK)
21. Coventry University: Progs/wider review (Million+)
22. Robert Gordon University: Programme-level (UUK)
23. Keele University: Programme-level (UUK)
24. University of Essex: Programme-level (1994 group)
25. University of Derby: Institutional review (Million+)
26. Durham University: Staff Dev and faculty review (Russell Group)
