
The effect of using laptop computers on achievement, attitude to science and classroom environment in science

Darrell Fisher and Ed Stolarchuk
Science and Mathematics Education Centre,
Curtin University of Technology
and St Hilda's School, Southport Queensland
This study was part of an evaluation of the effectiveness of laptop computers in grades 8 and 9 science classrooms, in a sample of Australian independent schools. Effectiveness was determined in terms of the impact laptop computers have had on laptop students' attitudinal and achievement outcomes and on their perceptions of science classroom environment. Students' attitudes to science were assessed using a scale from the Test of Science-Related Attitudes (TOSRA), achievement was measured using scales from the Test of Enquiry Skills (TOES), and students' perceptions of the science classroom environment were assessed using the Science Classroom Environment Survey (SCES). These quantitative instruments were administered to 433 laptop and 430 non-laptop students in 14 independent schools across four Australian states. Descriptive statistics confirmed the reliability and validity of the SCES for science laptop classroom research. Qualitative data were collected by interviewing students and teachers in two of the fourteen schools. These data confirmed and offered explanations for the quantitative findings, which indicated that laptop science classrooms characterised by opportunities for individual students to interact with the teacher and by an emphasis on the skills and processes of inquiry best promoted positive student attitudes to science. Laptop science classrooms characterised by selective treatment of students least promoted students' cognitive achievement in science.

Introduction

The advent of relatively cheap, robust personal computers (laptops) over the past decade has resulted in an increasing number of Australian independent schools prescribing laptop computers as one of the standard items of equipment students must bring with them to the science classroom. However, to date, little research has been reported on the impact laptop computers have had on student attitude to science and on student cognitive achievement in science, and there are no published reports on the impact laptop computers have had on students' perceptions of science classroom environment. The few published laptop studies have focussed on the traditional areas of "computers in science classrooms" evaluation, for example, grade improvement, communication skills improvement, work presentation, content retention and so on (McMillan & Honey, 1993; Gardner, Morrison, & Jarman, 1993; Mitchell & Loader, 1993; Loader, 1993; Rowe, 1993; Shears, 1995).

Students' perceptions of science classroom environment have been favourably associated with student attitude to science and student cognitive achievement in science (Fraser, 1994; McRobbie & Fraser, 1993; Fraser, 1991; Fraser, Walberg, Welch, & Hattie, 1987; Haertel, Walberg, & Haertel, 1981). The availability of proven instruments for assessing science classroom environment, student attitude to science and student cognitive achievement has allowed this study to proceed and thus contribute to our understanding of the effects of laptop computers.

Classroom environment

Science education researchers have led the world in the field of science classroom environment research over the past twenty-five years (Fraser, 1994; Fraser & Walberg, 1991), and numerous science classroom environment research instruments have been validated for use in educational research. One such instrument is the Individualised Classroom Environment Questionnaire (ICEQ) developed by Fraser (1990). The initial version of the ICEQ long form had five scales with approximately 15 items per scale. The final published version of the ICEQ (Fraser, 1990) contained 50 items evenly distributed across the five scales of Personalisation, Participation, Independence, Investigation and Differentiation. A short form of the ICEQ was also constructed (Fraser, 1990). The short form retained all five scales of the long form, but contained only five items in each scale. As with the long form, each item is responded to on a five-point, Likert-type scale ranging from 1 (Almost never) to 5 (Very Often). There are some negative items for which the scoring is reversed. Two forms of the questionnaire exist: one assesses student perceptions of the actual classroom environment and the other measures the classroom environment students would prefer or consider ideal.

Validation statistics for the short form report that the Cronbach alpha reliability coefficients ranged from 0.69 to 0.85, indicating high internal scale consistency. Each scale's ability to differentiate between the perceptions of students in different classrooms was confirmed by calculating one-way ANOVAs for each scale, using class membership as the main effect. The ANOVA eta2 statistic calculated for each scale, representing the proportion of variance due to class membership, ranged from 0.21 to 0.39 (p<0.001), indicating adequate scale differentiation. The mean scale correlations ranged from 0.15 to 0.34, indicating that each scale sufficiently measured a different dimension of the classroom environment. These reliability and validity figures confirmed that the short form of the student-actual version of the ICEQ is a reliable and valid instrument and that it could be used with confidence to measure students' perceptions of science classroom environment.

The short form of the ICEQ was the basis for the construction of the Science Classroom Environment Survey (SCES) used in this study. The major change made to this form of the ICEQ was that the items were reworded, as necessary, to ensure each one was written in the personal form. For example, the item 'The teacher is unfriendly to students' was reworded to read 'The teacher is unfriendly to me'. This was done to elicit students' perceptions of their own experiences, rather than their perceptions of the class or group experience as a whole, which was thought to be important in classrooms where laptops are used. That individuals' perceptions are based on their own perspectives was recognised over four decades ago when Stern, Stein, and Bloom (1956) differentiated between private beta press (unique individual perceptions) and consensual beta press (group perceptions).

Table 1: Descriptive information for SCES scales
Personalisation: Emphasis on opportunities for individual students to interact with the teacher and on concern for the personal welfare and social growth of the individual. (Sample item: "The teacher helps me if I'm having trouble.")
Participation: Extent to which students are encouraged to participate rather than be passive listeners. (Sample item: "I ask the teacher questions.")
Independence: Extent to which students are allowed to make decisions and have control over their own learning and behaviour. (Sample item: "The teacher decides which students I work with.")
Investigation: Emphasis on the skills and processes of inquiry and their use in problem solving and investigation. (Sample item: "I explain the meaning of statements, diagrams and graphs.")
Differentiation: Emphasis on the selective treatment of students on the basis of ability, learning style, interests, and rate of working. (Sample item: "I move on to other topics if I work faster than other students.")
Negotiation: Emphasis on opportunities for students to explain and justify to other students their newly developing ideas, and to reflect self-critically on the viability of their own ideas. (Sample item: "Other students ask me to explain my ideas.")

Adapted from Fraser (1990) and Taylor, Dawson, & Fraser (1995)

Moreover, the ICEQ did not address an aspect of the science classroom environment that was of interest, namely whether the laptop science classroom allowed for more student interaction and peer learning/teaching. Various classroom environment instruments were reviewed to identify scales that could meet this need, and a suitable scale was found in the Constructivist Learning Environment Survey (CLES) (Taylor, Fraser, & White, 1994; Taylor, Dawson, & Fraser, 1995; Taylor, Fraser, & Fisher, 1997). This scale is referred to as the 'Negotiation' scale in the SCES. The final version of the SCES contains six scales: the five ICEQ scales and the one CLES scale. Each scale consists of five items, with some items being reverse scored. Each item is responded to on a five-point, Likert-type scale ranging from 1 (Almost never) to 5 (Very Often). Table 1 contains a description and a sample item for each scale.
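To illustrate how responses to an instrument of this kind are converted into scale scores, the following is a minimal sketch of reverse scoring and averaging five Likert items. It assumes Python; apart from the two sample items quoted above, the item wordings, the choice of reverse-scored item and the responses are hypothetical, not the published SCES keying.

```python
# Minimal sketch of Likert-scale scoring of the kind used by the SCES/ICEQ.
# Responses are coded 1 ("Almost never") to 5 ("Very Often"); some items are
# reverse scored. The keying below is hypothetical, not the published one.

REVERSE_SCORED = {"The teacher is unfriendly to me"}  # assumed reverse-scored item

def score_scale(item_responses: dict[str, int]) -> float:
    """Return the scale score as the mean of its items after reverse scoring."""
    total = 0
    for item, response in item_responses.items():
        # Reverse scoring maps 1<->5 and 2<->4 so all items point the same way.
        total += (6 - response) if item in REVERSE_SCORED else response
    return total / len(item_responses)

# Hypothetical responses for five Personalisation items from one student.
personalisation = {
    "The teacher helps me if I'm having trouble": 4,
    "The teacher is unfriendly to me": 2,              # scored as 6 - 2 = 4
    "The teacher considers my feelings": 3,
    "The teacher talks with me about my work": 4,
    "The teacher takes a personal interest in me": 3,
}
print(score_scale(personalisation))  # 3.6 on the 1-5 scale
```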

An 'attitude to science' scale, simply referred to as the Attitude scale, was attached to the SCES. It was adapted from the 'Enjoyment of Science Lessons' scale of the Test of Science-Related Attitudes (TOSRA) (Fraser, 1981) and contained five items.

Cognitive achievement was measured using three scales from the 'interpreting and processing information skills' section of the Test of Enquiry Skills (TOES) (Fraser, 1979), which was designed to test non-content specific enquiry skills of science students in grades 7 to 10. The three scales selected were 'Scales', 'Charts and Tables' and 'Graphs'.

Methodology

The aim of this study was to evaluate the effects laptop computers have had on students' perceptions of science classroom environment, on students' attitude to science and on students' cognitive achievement in science, in grades 8 and 9 in Australian Independent Schools.

A number of independent schools were contacted in all Australian states to determine whether they used laptop computers in grades 8 and 9 science and, if so, whether they would participate in this study. The final sample consisted of 863 students in 44 grade 8 and 9 science classes in 14 independent schools across four states. Of these students, 433 used laptops in 23 laptop science classrooms; the non-laptop sample consisted of 430 students in 21 non-laptop science classrooms. Each student in the sample completed the Science Classroom Environment Survey (SCES), with the attached Attitude scale and enquiry skills scales.

Qualitative data were collected from two of the 14 laptop schools, both having laptop and non-laptop classes. Laptop students and teachers were interviewed using pre-set questions, with some impromptu questioning to further explore the answers provided to the pre-set questions. Students and teachers were interviewed separately. In one school a group of eight students was interviewed during one single session. In the other school, two groups of four students were interviewed in two separate sessions. In both cases, this was determined by timetable constraints and student availability. A group of three teachers was interviewed in one school and a group of two teachers was interviewed in the other school. Student and teacher interviews were recorded on audio tape.

The reliability and validity of the SCES were confirmed by determining each scale's internal consistency (Cronbach alpha coefficient), discriminant validity (mean correlation of each scale with the other scales) and ability to differentiate between classrooms (ANOVA eta2). The associations between students' perceptions of science laptop classroom environment and students' attitudinal and cognitive achievement outcomes were investigated using both simple and multiple correlation analyses. The simple correlation (r) describes the bivariate association between a selected outcome and each scale of the instrument, the SCES in this instance. The multiple correlation, as expressed by the standardised regression weight (beta), describes the multivariate association between an outcome and a particular scale when all other scales are controlled.
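As a rough sketch of how the statistics just described could be computed, the fragment below derives Cronbach alpha, the mean correlation with the other scales, ANOVA eta2, simple correlations and standardised regression weights from a hypothetical student-level data set. The DataFrame layout, column names and the use of pandas and statsmodels are assumptions made for illustration; this is not the analysis code used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

SCALES = ["Personalisation", "Participation", "Independence",
          "Investigation", "Differentiation", "Negotiation"]

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency; `items` has one column per item of one scale."""
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

def mean_correlation(df: pd.DataFrame, scale: str) -> float:
    """Discriminant validity: mean correlation of one scale with the other five."""
    return np.mean([df[scale].corr(df[other]) for other in SCALES if other != scale])

def eta_squared(df: pd.DataFrame, scale: str) -> float:
    """Proportion of scale-score variance attributable to class membership."""
    grand_mean = df[scale].mean()
    ss_total = ((df[scale] - grand_mean) ** 2).sum()
    ss_between = df.groupby("class_id")[scale].apply(
        lambda g: len(g) * (g.mean() - grand_mean) ** 2).sum()
    return ss_between / ss_total

def associations(df: pd.DataFrame, outcome: str):
    """Simple r per scale, standardised betas and multiple R from one regression."""
    simple_r = {s: df[s].corr(df[outcome]) for s in SCALES}
    # Standardising all variables first makes the OLS coefficients beta weights.
    z = df[SCALES + [outcome]].apply(lambda col: (col - col.mean()) / col.std(ddof=1))
    fit = sm.OLS(z[outcome], sm.add_constant(z[SCALES])).fit()
    return simple_r, fit.params[SCALES], np.sqrt(fit.rsquared)
```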

The effects laptop computers had on students' perceptions of science classroom environment were investigated by calculating effect sizes (ES) for each of the SCES scales. Effect sizes were calculated using Cohen's (1977) d formula, in which the difference between the two group means for each scale is divided by the pooled standard deviation. Effect sizes of 0.2 are considered small, 0.5 medium, and 0.8 large.
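A minimal sketch of this effect size calculation, assuming the laptop and non-laptop scale scores are available as numeric arrays (the variable names are illustrative):

```python
import numpy as np

def cohens_d(laptop_scores: np.ndarray, non_laptop_scores: np.ndarray) -> float:
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    n1, n2 = len(laptop_scores), len(non_laptop_scores)
    pooled_sd = np.sqrt(((n1 - 1) * laptop_scores.var(ddof=1) +
                         (n2 - 1) * non_laptop_scores.var(ddof=1)) / (n1 + n2 - 2))
    return (laptop_scores.mean() - non_laptop_scores.mean()) / pooled_sd

# By the conventions cited above, |d| around 0.2 is small, 0.5 medium, 0.8 large.
```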

The qualitative data were transcribed by hand, then typed and categorised into a form suitable for analysis and interpretation.

Results

Validation of the SCES Questionnaire

In the descriptive statistics reported in Table 2, both the individual student and the class mean were used as units of analysis. Scale internal consistencies were confirmed by calculating Cronbach alpha coefficients for each scale. Values obtained for the six scales of the SCES questionnaire were satisfactory, ranging from 0.60 to 0.81 using the individual student as the unit of analysis and from 0.73 to 0.93 using the class mean.

Table 2: Internal Consistency (Cronbach Alpha Coefficient), Discriminant Validity (Mean Correlation with Other Scales), Ability to Differentiate Between Classrooms (ANOVA eta2), Scale Means and Standard Deviations for the Science Laptop Student Sample

Scale             Unit of Analysis   Alpha Reliability   Mean Correlation   ANOVA (eta2)   Scale Mean   Standard Deviation
Personalisation   Individual         0.81                0.34               0.34*          3.26         0.85
                  Class Mean         0.93                0.45                              3.27         0.50
Participation     Individual         0.67                0.32               0.15*          3.45         0.71
                  Class Mean         0.80                0.40                              3.46         0.27
Independence      Individual         0.65                0.08               0.39*          3.58         0.77
                  Class Mean         0.84                0.08                              3.57         0.49
Investigation     Individual         0.67                0.23               0.11*          2.89         0.71
                  Class Mean         0.73                0.23                              2.88         0.23
Differentiation   Individual         0.60                0.10               0.19*          1.99         0.65
                  Class Mean         0.77                0.25                              1.97         0.29
Negotiation       Individual         0.72                0.21               0.11*          3.02         0.72
                  Class Mean         0.87                0.34                              3.04         0.25

*p<0.01    student n = 433    class n = 23

Scale discriminant validity was confirmed by calculating the mean correlation of each of the instrument's six scales with the remaining scales. The correlations ranged from 0.08 to 0.34 using the individual student as the unit of analysis and from 0.08 to 0.45 using the class mean as the unit of analysis, indicating satisfactory scale discriminant validity. Each scale's ability to differentiate between the perceptions of students in different classrooms was confirmed by calculating one-way ANOVAs for each scale, using class membership as the main effect. The ANOVA eta2 statistic calculated for each scale, representing the proportion of variance due to class membership, ranged from 0.11 to 0.39 (p<0.01), indicating satisfactory scale differentiation.

Scale means are reported in Table 2 as this is the first reported use of the ICEQ in science laptop classrooms. Using the individual student as the unit of analysis, scale means ranged from a high of 3.58 for the Independence scale to a low of 1.99 for the Differentiation scale, and from a high of 3.57 to a low of 1.97 for the same scales, using the class mean as the unit of analysis. Fraser (1990) reported scale means, for the class mean unit of analysis, for grades 8 and 9 science students, ranging from a high of 3.60 for the Participation scale to a low of 2.20 for the Differentiation scale.

Associations between laptop student outcomes and science classroom environment

The statistical methods employed to investigate the associations between laptop students' perceptions of their classroom environment and their attitudinal and cognitive achievement outcomes were described earlier. The results of the analyses are presented in Table 3.

Both multiple correlation (R) statistics, between the set of SCES scales and each of the outcomes, are statistically significant. An examination of the simple correlation (r) results in Table 3 indicates that 10 of the 12 possible relationships between science classroom environment and the outcome variables of attitude and achievement are statistically significant (p<0.05). With 12 tests at the 0.05 level, only 0.6 significant relationships would be expected by chance, so this is over 16 times the chance expectation. A similar examination of the multiple correlation beta weights (beta), however, reveals that only three of the 12 possible relationships are statistically significant (p<0.05), five times the chance expectation.

The multiple correlation (R) of 0.69 (p<0.001) suggests that the association between students' perceptions of science laptop classroom environment, as measured by the SCES, and students' attitude to science is a strong one. Furthermore, the R2 statistic of 0.48 indicates that 48% of the variance in laptop students' attitude to science is explained by students' perceptions of science classroom environment.

The simple correlation (r) data in Table 3 indicate that all the associations between students' attitudinal outcomes and the SCES scales are statistically significant. The associations for the scales of Personalisation, Participation and Investigation are quite large, while the association for the Negotiation scale is somewhat smaller. The associations for the Independence and Differentiation scales are small. All of the associations are positive, except for the Differentiation scale. These findings suggest that positive student attitudes to science are promoted in science laptop classrooms where students perceive the classroom as being characterised by personalisation, participation, independence, investigation and negotiation. In contrast, students' attitudes to science decrease in science laptop classrooms where students perceive the classroom as being characterised by differentiation. However, as noted earlier, this is a small association.

An examination of the standardised regression weight (beta) data for the attitudinal outcome indicates that only two of the six scales retain their statistical significance. This more conservative analysis suggests that, of the science laptop classroom characteristics earlier identified as promoting positive student attitudes to science, the characteristics of personalisation and investigation are the most influential.

Table 3: Associations Between SCES Scales and Laptop Students' Attitudinal and Cognitive Achievement Outcomes in Terms of Simple Correlations (r) and Standardised Regression Coefficients (beta)

                           Strength of classroom environment-outcome association
                           Attitudinal              Achievement
Scale                      r          beta          r          beta
Personalisation             0.65**     0.52**        0.22**     0.12
Participation               0.47**     0.01          0.19**     0.08
Independence                0.11*      0.02          0.13*      0.08
Investigation               0.47**     0.24**        0.05      -0.02
Differentiation            -0.10*     -0.02         -0.26**    -0.22**
Negotiation                 0.33**     0.07          0.08       0.01

Multiple correlation (R)               0.69**                   0.33**
R2                                     0.48                     0.11

*p<0.05   **p<0.001  n = 433 (laptops)

The qualitative data collected supported and offered several insights into the quantitative findings. For example, when students were asked why they thought they had a better attitude to science than did their non-laptop counterparts, their answers included comments such as "easier, different, new, a bit more fun" and "it's not such a drag and everything." Students also felt that correcting errors was easier, that they could complete their work more quickly (once they learned how to use their laptops), that they were very pleased with their finished product (it looked like a professional report), and that they experienced reasonable success at learning on their own through trial and error, although this learning was more related to learning about the laptop than about science.

The qualitative data collected from teachers included comments such as "averaging, spreadsheets, graphing, statistics - all that early stuff, it came alive" and "at the stage where there wasn't much statistical data involved, they just enjoyed writing it up on the laptops so it looked like a nice scientific report - it helped them take an interest in what was going on." Teachers also felt that they could offer students more individual help and that students appeared to be more motivated to work on their own and discover how to do various things, but again this was more to do with learning about computers than the scientific concepts being taught.

The multiple correlation (R) of 0.33 (p<0.001) indicates a significant association between science laptop classroom environment, as measured by all the SCES scales, and student cognitive achievement. The R2 statistic indicates that 11% of the variance in students' cognitive achievement can be attributed to laptop students' perceptions of their science classroom environment.

The simple correlation (r) data indicate that four of the six correlations between students' cognitive achievement and the SCES scales are statistically significant. Of these four scales, Personalisation, Participation and Independence are positively correlated and Differentiation is negatively correlated. The correlation values are generally small, suggesting that, to a small extent, students' cognitive achievement is higher in science laptop classrooms where students perceive the classroom as being characterised by personalisation, participation and independence. Students' cognitive achievement is lower in science laptop classrooms where students perceive the classroom as being characterised by differentiation.

The standardised regression weight (beta) data for the achievement outcome in Table 3 indicate that only one of the four scales earlier identified as statistically significant retains its statistical significance. This more conservative analysis suggests that student cognitive achievement is most strongly influenced, and negatively so, in science laptop classrooms perceived to be characterised by differentiation.

The qualitative data collected reinforced the existence of a less positive relationship between laptop use and cognitive achievement than between laptop use and attitude. Students indicated that laptops helped them with things such as making tables, spreadsheets, graphing, presentation, editing of work, projects, note taking, writing up investigations and organisation, but did not really make it easier to learn science. They felt laptops were only used as a tool for graphing, note taking and so on; they were not used to 'teach' any of the science materials or topics. Following are some typical comments:

Laptops don't really help with our knowledge of science, except for our knowledge of computers.

Perhaps if we had a science program in it we could go home and look at it and learn more from it. It would be good if you could put encyclopaedias, Encarta and stuff like that on them but they are too small for that.

Like they expect you to know everything on the computer and if you don't then you have to waste time to find out how to do it, and you miss out on class things.

Teacher comments included:
Ah, I think that to start with the information they are putting in the computer, they don't actually take notice of it.

I guess what I'd say is that there are more questions about the actual computing than science, which is maybe a bit of a worry. It could actually detract from the learning of science - they are so busy trying to learn how to do a table they actually don't pay attention to the information going in.

We were teaching a lot of computer skills, like graphing which they then used later on - it was an unknown situation.

Teachers also felt that they often had to spend a disproportionate amount of time with students who could not use the computers efficiently. This led to some students feeling they did not get their 'fair share' of the teacher's time and that students were not treated equally in the class.

Effects of laptops on students' perceptions of classroom environment

Differences between laptop and non-laptop students' perceptions of science classroom environment were examined by calculating effect sizes for each of the SCES scales. An examination of the direction of the effect sizes in Table 4 reveals that the effects laptop computers have had on science students' perceptions of their classroom environment have all been positive, except for the Personalisation and Participation scales at the individual student unit of analysis.

Table 4: Effect Sizes (ES) for Laptop Computers on the Science Classroom Environment as Measured by the SCES Scales

Scale             Unit of Analysis   Effect Size (ES) a
Personalisation   Individual         -0.02
                  Class Mean          0.02
Participation     Individual         -0.04
                  Class Mean          0.00
Independence      Individual          0.01
                  Class Mean          0.15
Investigation     Individual          0.10
                  Class Mean          0.20
Differentiation   Individual          0.25
                  Class Mean          0.45
Negotiation       Individual          0.03
                  Class Mean          0.17

a ES was calculated by subtracting the non-laptop mean from the laptop mean and dividing the difference by the pooled standard deviation, Cohen's d (1977)

An examination of the magnitude of the effect sizes indicates that for the scales of Independence, Investigation and Negotiation, at the class mean unit of analysis, the effects are small (0.15 to 0.20). The effect size for the Differentiation scale, 0.45, approaches the medium effect category at the class mean unit of analysis and is small, 0.25, for the individual student unit of analysis. All other reported effect sizes are negligible, at 0.10 or smaller.

Conclusions

This study has confirmed the SCES as a valid and reliable instrument for collecting student perceptual data on science laptop classroom environment. As the SCES was based on the ICEQ, rewritten in the personal form, the study has also presented the first reliability and validity statistics for the ICEQ when written in the personal form and when used in science laptop classrooms.

The multiple correlation (R) statistics in Table 3 suggest that, of the two student outcomes, attitude and cognitive achievement, the association between students' attitudinal outcomes and their perceptions of science laptop classroom environment is just over twice as strong as that between students' cognitive achievement outcomes and their perceptions of science laptop classroom environment. Furthermore, the R2 statistics indicate that the percentage of variance in students' attitudinal scores explained by students' perceptions of science laptop classroom environment is over four times that for students' cognitive achievement scores.

The student and teacher interview data supported a more positive and stronger relationship between students' attitude and science classroom environment than between students' cognitive achievement and classroom environment. The use of computers resulted in students being more enthusiastic toward science, and made some of the mundane science-related tasks such as making tables and graphing 'come alive'. However, the qualitative data indicated that much was taught and learned about computers, often at the expense of science, and that the teacher was often perceived as spending an inordinate amount of class time with certain students at the expense of others.

The effect size data in Table 4 suggest that laptops have had minimal effects on students' perceptions of science classroom environment, especially at the individual student unit of analysis.

These findings are of practical significance as they indicate an overall positive association between students' perceptions of the science laptop classroom environment and their attitudinal and cognitive achievement outcomes. Schools considering the introduction of laptops into science classrooms would find this information valuable during their deliberations.

References

Cohen, J. (1977). Statistical Power Analysis for the Behavioral Sciences. New York: Academic Press.

Fraser, B.J. (1994). Research on Classroom and School Climate. In D.L. Gabel, (Ed.), Handbook of Research on Science Teaching and Learning. New York: Macmillan.

Fraser, B.J. (1991). Two Decades of Classroom Environment Research. In B.J. Fraser & H.J. Walberg (Eds.), Educational Environments: Evaluation, Antecedents and Consequences. Oxford: Pergamon Press.

Fraser, B.J. (1990). Individualised Classroom Environment Questionnaire: Handbook and Test Master Set. Hawthorn: The Australian Council for Educational Research Ltd, Radford House.

Fraser, B.J. (1981). TOSRA: Test of Science-Related Attitudes Handbook. Hawthorn: The Australian Council for Educational Research Limited.

Fraser, B.J. (1979). Test of Enquiry Skills Handbook. Hawthorn: The Australian Council for Educational Research Limited.

Fraser, B.J., & Walberg, H.J. (Eds.) (1991). Educational environments: Evaluation, Antecedents, and Consequences. Oxford: Pergamon Press.

Fraser, B.J., Walberg, H.J., Welch, W.W., & Hattie, J.A. (1987). Synthesis of educational productivity research. International Journal of Educational Research, 11(2), 145-252.

Gardner, J., Morrison, H., & Jarman, R. (1993). The impact of high access to computers on learning. Journal of Computer Assisted Learning, 9, 2-16.

Haertel, G.D., Walberg, H.J., & Haertel, E.H. (1981). Socio-psychological environments and learning: a quantitative synthesis. British Educational Research Journal, 7, 27-36.

Loader, D. (1993). Reconstructing an Australian School. The Computing Teacher, 20(7), 12, 14-15.

McMillan, K., & Honey, M. (1993). Year One of Project Pulse: Pupils Using Laptops in Science and English. A Final Report. New York: Bank Street College of Education. (ERIC Document Reproduction Service No. ED 358 822)

McRobbie, C.J., & Fraser, B.J. (1993). Associations between student outcomes and psychosocial science environment. Journal of Educational Research, 87, 78-85.

Mitchell, J., & Loader, D. (1993). Learning in a Learning Community: Methodist Ladies' College case study. Jolimont: Incorporated Association of Registered Teachers of Victoria.

Rowe, H.A.H. (1993). Learning with Personal Computers. Melbourne: The Australian Council for Educational Research.

Shears, L. (Ed.) (1995). Computers and Schools. Melbourne: The Australian Council for Educational Research Ltd.

Stern, G.G., Stein, M.I., & Bloom, B.S. (1956). Methods in personality assessment. Glencoe: Free Press.

Taylor, P., Dawson, V., & Fraser, B. (1995). Classroom learning environments under transformation: A constructivist perspective. Paper presented at the annual meeting of the American Educational Research Association, San Francisco.

Taylor, P.C., Fraser, B.J., & White, L.R. (1994). A classroom environment questionnaire for science educators interested in the constructivist reform of school science. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Anaheim.

Taylor, P.C., Fraser, B.J., & Fisher, D.L. (1997). Monitoring constructivist classroom learning environments. International Journal of Educational Research, 27(4), 293-302.

Please cite as: Fisher, D. and Stolarchuk, E. (1998). The effect of using laptop computers on achievement, attitude to science and classroom environment in science. Proceedings Western Australian Institute for Educational Research Forum 1998. http://www.waier.org.au/forums/1998/fisher.html

