
Monitoring standards in the arts: A testing program using an outcomes framework

Beverley J. Pascoe
Education Department of Western Australia
Since the inception of education standards in the form of Student Outcome Statements, the Education Department of Western Australia has conducted system level testing at Years 3, 7 and 10 in all learning areas apart from Languages Other than English. During 1996 the bold step was taken to test The Arts. This paper describes the program used to test The Arts with the Student Outcome Statements as a framework. It describes the processes used in the Monitoring Standards in Education project and the conceptualisation of a methodology for testing The Arts, the strategies used in the testing and their relationship to the Student Outcome Statements, and the methods of marking the tests and analysing the data. A brief discussion of the difficulties encountered in sampling is also included, together with an overview of the final results of the testing.

What is monitoring standards in education?

Monitoring Standards in Education, known as MSE, is a testing program undertaken by the Education Department of Western Australia which aims to give parents, teachers and the community precise information about the standard of student performance and the quality of learning in Western Australian Government schools. Each year, students in Years 3, 7 and 10 are tested in one or two of the eight learning areas. Testing was first undertaken in 1990 in English and Mathematics and since that time all learning areas apart from Languages Other than English (LOTE) have been tested. In 1996 the bold step was taken to test The Arts which, as far as the available evidence indicates, had not been tested at system level in any other state of Australia, or possibly anywhere else in the world. This is possibly due to the difficulties encountered in reaching consensus on how The Arts should be assessed, as well as the past failure of governments to recognise The Arts as an important and compulsory learning area.

During the initial stages of development, extensive research was conducted into current Arts education philosophy on assessment. Consultation with teachers and others with expertise in The Arts was carried out, together with piloting and trialing in schools, to establish appropriate methodologies for testing. The innovative strategies developed for the testing were based on philosophies similar to those being adopted in 1997 for the National Assessment of Educational Progress (NAEP) testing in the Arts in the United States of America. Extensive consultation with Dr Carol Myford, who is on the US National Assessment Governing Board, formed a part of the early development stage. The tests were developed by the Evaluation Branch of the Education Department of Western Australia in collaboration with the Educational Testing Centre and, importantly, with input from practising arts educators.

Many of the assessment methods developed for the MSE testing measure, for the first time, the depth and complexity of student knowledge and skills in the learning area, and innovative standardised assessment procedures have been developed which allow students to work in ways that reflect good classroom practice. Where possible, 'hands on' strategies are used to monitor student performance and open-ended assessment items provide students with the opportunity to perform to the maximum of their abilities. The MSE assessments permit reliable evidence to be gathered on outcomes that are not readily addressed through machine-marked paper and pencil tests.

MSE does not identify schools or individual students. However, at the completion of testing and reporting, school versions of MSE tests are produced to provide schools with a reference point to compare their students' performance against the state as a whole, and as a consequence to implement their own programs to address any specific areas of need. This procedure increases the value of MSE to teachers and schools and MSE materials are widely used by schools as part of the school assessment procedures for their management information systems. Schools are made aware that MSE tests, although well researched and highly regarded, are limited by the constraints of system level testing and should be regarded as a 'snapshot' only of student performance, to be used to substantiate teachers' own records and judgements.

On what framework is Monitoring Standards based?

MSE uses Student Outcome Statements (Education Department of Western Australia, 1996, draft version), the Western Australian version of the National Profiles (Curriculum Corporation, 1994), as a framework for reporting on student progress and achievement. These statements have undergone a period of trialing and upgrading since 1994. They describe significant skills, understanding and knowledge across eight levels of achievement, in the sequence in which they typically develop.

In Western Australia The Arts learning area comprises the five disciplines of dance, drama, media, music and visual arts, and statements are generic across these five areas. However, each is a discrete discipline with its own techniques and conventions. The statements describe student progress and achievement in four strands: 'Communicating Arts Ideas' and 'Using arts skills, techniques, technologies and processes', both of which relate to expression of The Arts; and 'Responding, reflecting on and evaluating the Arts' and 'Understanding the role of the Arts in society', which relate to appreciation of The Arts.

MSE reports the performance of representative samples of Years 3, 7 and 10 students and provides information about the achievements of subgroups of students: girls and boys, Aboriginal and Torres Strait Islander students, and students from non-English speaking backgrounds. As 1996 was the first year in which MSE data in The Arts was collected, this data will establish benchmarks for future testing in this learning area.

What is the profile of The Arts in Western Australian schools?

The Arts learning area focuses on aesthetic understanding and practice developed through the five art forms of dance, drama, media, music and visual arts and, although these art forms are often used in interrelated ways, each has its own techniques and conventions. Students use their senses, perceptions, feelings, values and knowledge to communicate through The Arts. They develop creative ways of expressing themselves as well as a critical appreciation of their own arts works and those of others.

As audience and critics they analyse and reflect on the use of arts elements and develop criteria for critically appreciating and making informed judgements about their own arts works and those of others. They should also recognise the role and value of the arts in their lives. Students need this type of participation in, and evaluation of, the various art forms to develop their aesthetic understanding and practice.

These objectives were foremost in the minds of test developers and it was agreed unanimously that they could not be assessed using the multiple choice, machine-scorable testing devices so often used in system-level testing.

To address all four Outcome Statement strands related to expression and appreciation of The Arts, students at Years 3, 7 and 10 were required to complete two test forms, the 'Analysis' and the 'Process'.

What is the Analysis form?

The Analysis form consisted of a set of stimulus material to which students responded, addressing the strands related to appreciation, and tasks were designed to reflect the range of achievement likely to be evident in students at Years 3, 7 and 10 in those strands. Students produced responses in relation to aesthetics, critical analysis, interpretation of meaning, mood and arts elements in the stimulus material presented, and marking keys reflected that a range of responses could be made to a stimulus.

An attempt was made to use as much local content as possible, as well as variety in culture and genre, in stimulus materials presented. The stimulus material included videotaped excerpts for dance, drama and media, audiotapes for music and colour prints of paintings for visual arts.

Because of the subjective nature of The Arts, when assessing such notions as aesthetics and critical analysis it was important that students were able to express their own opinions of mood, feeling, likes and dislikes. For this reason, answers expressing personal opinions were not assessed; instead, such questions were included in the tests as prompts for discussion and justification of responses. This provided the opportunity to gain an insight into students' knowledge and skills.

What is the Process form?

The Process form was designed to reflect the range of ability likely to be evident in students at Years 3, 7 and 10 in the Student Outcome strands related to expression. It offered a broad view of student abilities through their documentation of the steps in the learning which led to their final products. The tasks were intended to focus on students' abilities to explore and develop ideas and use creative problem solving techniques to plan, shape and share meaning through the presentation of a final performance within a group, in the case of the performing arts, or an individual piece of work, in the case of media or visual arts. Stimulus material and marking keys allowed for a wide range of interpretations of a single stimulus.

In the case of the performing arts of dance, drama and music, where students worked in groups, students were encouraged to brainstorm ideas related to the stimulus and to write down their ideas individually before joining their groups. They were then made aware of the evaluation criteria and given specific instructions on how to combine their ideas to plan their performance as a group within the guidelines, notating their plan using conventional notations and/or words and diagrams. After a specific time allowance for rehearsal, groups performed while teachers videotaped their performances. Afterwards, students individually completed a critique of their group's performance. Students were evaluated on their individual planning sheets, the group plan, the group's performance, and the individual critique of their group's performance.

In the case of media and visual arts, where students worked individually, students brainstormed ideas and were then made aware of the evaluation criteria and given specific instructions on how to plan their presentation within the guidelines. They then completed a plan before commencing their piece of work and, after completion, wrote a critique of their own work. Students were evaluated on their individual planning sheets, the piece of work presented, and the critique of their work.

Stimulus material ranged from the one-word stimulus 'Fire', for dance, to short stories, poems, videotape excerpts and paintings.

In order to standardise the marking of performances for the performing arts, teachers were trained in the use of the assessment materials and were required to videotape student performances. The videotapes were then marked centrally by markers with expertise in the relevant discipline who had been trained in marking procedures.

What item types did we use?

A combination of multiple choice, short answer and extended answer question types was included in the tests and, where possible, tasks were open-ended in order to provide students with the opportunity to demonstrate their maximum levels of ability. Some items were dichotomous, ie simply right or wrong, and others were polychotomous, ie scored across several categories, with answers showing partial understanding receiving some credit. Planning procedures and the presentation or performance of final productions were also a part of the assessment and, since this was an assessment of The Arts, student responses were not assessed for spelling or writing skills.
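
To make the distinction concrete, the following is a minimal sketch, in Python, of how dichotomous and polychotomous items might be scored. The item responses, marking key categories and scores are hypothetical illustrations only, not actual MSE items or keys.

    # Dichotomous item: one mark for a correct answer, zero otherwise.
    def score_dichotomous(response, correct):
        return 1 if response.strip().lower() == correct.lower() else 0

    # Polychotomous item: the marking key defines ordered categories, so
    # answers showing partial understanding receive partial credit.
    # These categories are hypothetical examples, not actual MSE keys.
    MARKING_KEY = {
        "no relevant response": 0,
        "identifies one arts element": 1,
        "identifies elements and links them to mood": 2,
        "analyses elements, mood and meaning": 3,
    }

    def score_polychotomous(category):
        # A trained marker assigns the category; the key converts it to a score.
        return MARKING_KEY[category]

    print(score_dichotomous("Crescendo", "crescendo"))          # 1
    print(score_polychotomous("identifies one arts element"))   # 1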

Through the use of common items and common stimulus material, both the 'Analysis' and 'Process' tasks allowed for linking of items through Years 3, 7 and 10, thus providing valuable information on student progression through the Student Outcome Statement levels. Each student was required to complete both the 'Analysis' and the 'Process' task in order to capture both groups of strands of the Student Outcome Statements, and data was analysed to report on students' overall results from the combination of both tests. Marking keys, which described the categories for each item indicating various levels of understanding, were developed.

How did we mark the tests?

Marking the completed tasks required seven markers with experience and knowledge of the discipline for each of the 'Analysis' tests and twelve for each discipline for the 'Process' tests. A full day's training was conducted for each group. Markers scored a set of common items, discussed differences and reached a consensus on judgement. Through this process a common understanding of the marking key was reached and modifications or additions to the marking keys were effected.

By the end of the day, markers had marked a series of common tests and felt confident they had reached a unified understanding of levels and a clear interpretation of the marking key. Nevertheless, they exchanged telephone numbers so that they could discuss any unusual or difficult examples which might not have emerged during marker training. Markers were requested to rotate Year 3, 7 and 10 tests when marking in order not to lose track of the development of levels through the year groups.

Marking of the Process tests in the performing arts involved viewing students' performances on video in relation to their individual plans, the group plan, and the student appraisals. Markers worked in pairs, with the whole group marking the same videotaped performance. When the marking of each student group was completed, discussion took place among all markers, with each pair giving explanations to justify the levels allocated. This procedure was repeated throughout the day, rotating through Year groups 3, 7 and 10, until markers were confident they had reached an understanding of the 'levelness' of performances.

Who did we test?

Traditionally, individual student samples have been drawn at random for MSE testing; however, because the 'Process' task for the performing arts was designed to be administered to groups of students, sampling of whole classes was necessary. Whole class samples of students from Years 3 and 7 were randomly selected from 40 government primary schools for each Year group.

Only those students at the Year 10 level who were studying the relevant discipline at the time of sampling were tested. Numbers for sampling were restricted by the limited number of students undertaking arts options at secondary schools and the limited number of schools offering the complete range of arts disciplines, together with the fact that a decision had been made to test each school in only one of the five arts disciplines. The following data demonstrate the limitations on the Year 10 students available for the arts sample:

Discipline      Total enrolled   Total population   % of population
Dance           1974             17140              11.5
Drama           3519             17140              20.5
Media           1938             17140              11.3
Music           1374             17140               8.0
Visual Art      5812             17140              33.9

Figure 1: Arts enrolments for Year 10, Semester one, 1996

No. of arts subjects offered in Year 10    No. of schools
5                                          32
4                                          25
3                                          19
2                                          20
1                                          17
0                                          58

Figure 2: Number of schools offering arts disciplines at Year 10

The final sample at Year 10 consisted of 20 classes for each of the five disciplines.

The low proportion of male enrolments to female enrolments in arts programs was cause for some concern. The breakdown of arts enrolments by gender is shown in the following table.

                  Females enrolled                        Males enrolled
Discipline     No.    % of enrolled  % of population   No.    % of enrolled  % of population
Dance          1944       98.5            11.3           30        1.5             0.2
Drama          2516       71.5            14.7         1003       28.5             5.9
Media          1084       55.9             6.3          854       44.1             5.0
Music           791       57.6             4.6          583       42.4             3.4
Visual Art     3759       64.7            21.9         2053       35.3            12.0

Figure 3: Year 10 Arts enrolments by gender, Semester one, 1996

Assessment results for Years 3, 7 and 10 were analysed to provide specific information on the performance of girls and boys, as well as Aboriginal and Torres Strait Islander students, students from remote schools and students from non-English speaking backgrounds.

How do we analyse the data?

MSE uses a Rasch model (Rasch, 1980) for analysis, arising from the development of Item Response Theory, as it allows the simultaneous scaling of item difficulties and student abilities on the same scale. The following sequence (Honeyman, 1996) describes the procedure by which the MSE testing program uses this property of Rasch models to link SOS levels, item difficulties and student abilities (an illustrative sketch follows the list):
  1. A set of items is designed to operationalise a particular SOS level.

  2. Tests derived from the items are administered to a sufficiently large sample of students and subsequently marked by a group of trained markers.

  3. Data from the test are analysed using a Rasch model and estimates of item difficulty and student ability are produced. Item difficulties are then plotted on a vertical scale known as a logit scale. Student ability is not plotted at this stage.

  4. The SOS level which was originally operationalised in step one is entered and a study of item clusters is used to establish SOS level boundaries. The relationship between student ability and item difficulty is adjusted to reflect a probability of 0.7 that a student will correctly respond to an item of equal difficulty, and the ability estimate of each student is then plotted.

  5. A study is then made of test scripts of students whose abilities are close to the established level boundaries and the levels of student test performances are compared to the levels derived from the study of item clusters.

  6. Logit scales are then rescaled to Western Australian Monitoring Standards in Education scores, known as WAMSE scores, which range from 0 to 800.
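
As an illustration of steps 3, 4 and 6 above, the following is a minimal sketch, in Python, of the Rasch probability of success, the ability offset implied by the 0.7 probability criterion, and a linear rescaling of logits to a 0 to 800 WAMSE-style score. The anchor points of the rescaling are hypothetical and chosen for illustration only; the actual scaling constants were set within the MSE program.

    import math

    # Rasch model: the probability that a student of ability theta answers
    # an item of difficulty b correctly (both lie on the same logit scale).
    def p_correct(theta, b):
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # Step 4: setting p_correct(theta, b) = 0.7 gives
    # theta - b = ln(0.7 / 0.3), roughly 0.85 logits. A student is therefore
    # placed at a level only when sitting about 0.85 logits above its items.
    offset = math.log(0.7 / 0.3)
    print(round(offset, 2))                  # 0.85
    print(round(p_correct(offset, 0.0), 2))  # 0.7

    # Step 6: rescale logits linearly to a 0 to 800 WAMSE-style score.
    # The logit range of -5 to 5 used here is a hypothetical anchor.
    def wamse(logit, lo=-5.0, hi=5.0):
        return (logit - lo) / (hi - lo) * 800.0

    print(wamse(0.0))  # 400.0, the midpoint of the scale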

The following figure demonstrates the method used by MSE to report results of the overall sample in relation to Student Outcome Statement levels.

Figure 4: MSE sample graph [figure not available in the source document]

References

Curriculum Corporation (1994). The Arts - a curriculum profile for Australian schools. Victoria: A.E. Keating (Printing) Pty Ltd.

Education Department of Western Australia (1996). Student Outcome Statements. The Arts (draft version). Perth: Education Department of Western Australia.

Honeyman, A. (1996). Monitoring standards in education: A handbook of the operations, policies and procedures of the Education Department of Western Australia (draft version). Perth: Education Department of Western Australia.

Pascoe, B. J. (1995). The influence of primary school music programmes on student choice of music studies in lower secondary schools. Perth: Edith Cowan University (unpublished thesis).

Rasch, G. (1980). Probabilistic models for intelligence and attainment tests (expanded edition). Chicago: University of Chicago Press (original work published in 1960).

Author: Beverley J. Pascoe MEd is a teacher who is currently on secondment to the Education Department of Western Australia Monitoring Standards in Education project. She has carried out research in music education in both primary and secondary schools in Western Australia and is currently undertaking PhD research in the measurement of classroom music knowledge and skills utilising an outcomes framework.

Please cite as: Pascoe, B. J. (1997). Monitoring standards in the arts: A testing program using an outcomes framework. Proceedings Western Australian Institute for Educational Research Forum 1997. http://www.waier.org.au/forums/1997/pascoe.html

