An ABC News report today sparked my interest with the headline “Study finds no relationship between attendance and results at remote Indigenous schools.”
This interim finding comes out of an important five-year study, the Remote Education Systems Project, which is looking at how education can best meet the needs of remote communities.
Specifically, the researchers have concluded that increasing attendance at remote Indigenous schools will not necessarily improve results. According to a senior research fellow, Sam Osborne, analysis of NAPLAN results from more than 200 remote schools has not identified any established relationship between attendance and NAPLAN outcomes. This puzzling finding is the focus of this post, but first the research project itself is worthy of comment.
The Remote Education Systems Project
I have just come across this study. It looks very exciting and promising for a number of reasons.
Firstly, it runs over a number of years – its final report is not expected until the end of 2016. This is unusual in the world of Indigenous research. Most research attempts to measure the impact of a single initiative, sometimes trying to use randomised-trial-type approaches, as though the drug-trial paradigm can apply to such a complex context.
Secondly, it is focussing on remote Indigenous schooling only. I have long argued that the tendency to assume the challenges of Indigenous education disadvantage are the same everywhere, regardless of context, has been most unhelpful.
Thirdly, the research is not starting out with a standard deprivation model. It takes as its starting point the importance of community perspectives and understandings, and of building on community strengths. I am really looking forward to the insights that will emerge from fresh thinking and community engagement.
And finally, the research framework is ‘going outside the box’ and questioning whether the current outcomes/targets/standards based approach is the best way to go. I have recently made the case for the irrelevance of all the NIRA targets and output measures related to schools in the remote Indigenous context, so I strongly support this.
However, in questioning whether the dominant standards/target-based approach is the only way to go, I do hope that the researchers also remember to take a critical look at the data and not pass over what else it might be telling us, apart from its lack of fit.
The attendance/student learning puzzle
The issue of school attendance is an excellent case in point. Systems prioritise attendance because of the assumption that children cannot benefit from school unless they attend on a regular basis. This makes intuitive sense. So if the connection between attendance and learning progress is not obvious in remote schools, what else might be going on?
Now, I don’t know how this project is using the attendance data, but most educators who write on this matter argue that children need to attend school for over 80 per cent of the time to benefit; others insist on over 90 per cent. However, this is all rather irrelevant to the performance data available to most of us, because attendance data is rarely reported at the individual student level. It is only reported at the school or classroom level, so comparing high-attending individual students with low-attending individual students is not generally possible.
If this project has managed to analyse results by individual student attendance profiles this would be an important piece of work.
However, whether or not this is what the study has achieved, it is important to understand how attendance is usually reported and how that reporting is used in policy terms. Let me explain.
The MySchool and COAG reports both use the concept of average attendance. This might be ‘good enough’ in the majority of situations, where average attendance rarely falls below 90 per cent. But in remote communities, where average attendance rates of 52 per cent are common, it is not very helpful.
Average attendance is essentially the ratio of the actual number of daily attendances to the maximum possible number, given the enrolment. So if there are 200 days in the school year and 30 students enrolled, 100 per cent attendance would mean that over the school year there were 200 × 30, or 6,000, ‘student attendance events’.
However, a 52 per cent attendance rate could mean anything from 52 per cent of enrolled students attending 100 per cent of the time, to all students attending 52 per cent of the time, to anything in between. In the first extreme, 52 per cent of students should be benefitting but 48 per cent of children would get absolutely zero benefit. In the second extreme, none of the children would benefit, because 52 per cent attendance falls well below the 80 per cent threshold.
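The arithmetic above can be sketched in a few lines of Python. This is only an illustration using the hypothetical figures from the worked example (30 students, 200 days), not real school data:

```python
# Maximum possible 'student attendance events' in a school year:
# 30 enrolled students x 200 school days.
school_days = 200
enrolled = 30
possible_events = school_days * enrolled      # 6000

# A 52 per cent average attendance rate fixes only the total number
# of attendance events, not how they are spread across students.
actual_events = int(possible_events * 0.52)   # 3120

# Extreme A: 52 per cent of students attend every single day,
# and the rest never attend at all.
always = round(enrolled * 0.52)               # 16 students, full benefit
never = enrolled - always                     # 14 students, zero benefit

# Extreme B: every student attends 52 per cent of days,
# so no student reaches the 80 per cent benefit threshold.
per_student_rate = actual_events / enrolled / school_days  # 0.52

print(possible_events, actual_events, always, never, per_student_rate)
```

Both extremes produce exactly the same headline figure of 52 per cent, which is why the average alone tells us so little about who is benefitting.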
Of course, most schools are not at either extreme, but we still don’t know what an average attendance figure means in terms of its impact on student learning. When I last had access to the NT’s official and very detailed attendance database (over three years ago), I recall that for the larger remote community schools, on average only 27 per cent of students attended school more than 80 per cent of the time. Yet the average attendance rate was around 60 per cent.
If we delve further into what this might mean at the school and classroom level, some new questions emerge.
So, let us imagine a classroom in a remote Indigenous community school with a 60 per cent attendance rate where 27 per cent of the children attend over 80 per cent of the time.
Firstly, how many children would be on the roll for the average class if the official teacher-student ratio is 1:20?
In the NT, schools are allocated staff based not on enrolment numbers but on attendance. This impacts significantly on actual class sizes and the challenges facing remote teachers. For example, a primary school with 300 children enrolled, but an attendance rate of 60 per cent, would be allocated staff for 180 students, not 300. Yet the number of students who need to be assigned to teachers and classes is 300, not 180 – they just attend irregularly. This would push class sizes to about 33, not 20.
So on any one day, a teacher might have only 20 children in their class but about 33 children on the roll. If only about 27 per cent attend more than 80 per cent of the time, this class of 33 might have about 9 children who attend on a very regular basis, while the remaining 24 attend on a highly irregular basis.
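The staffing arithmetic in the two paragraphs above can be traced step by step. Again, the school (300 enrolled, 60 per cent attendance, 1:20 ratio, 27 per cent regular attenders) is the hypothetical example from the text, not real data:

```python
# Hypothetical remote school from the example above (not real data).
enrolled = 300
attendance_rate = 0.60
ratio = 20  # official teacher-student ratio of 1:20

# NT staffing is allocated on attendance, not enrolment.
funded_students = int(enrolled * attendance_rate)  # 180
teachers = funded_students // ratio                # 9 teachers

# But all 300 enrolled children must still be placed in classes.
actual_class_size = enrolled / teachers            # ~33 on the roll

# Only about 27 per cent attend more than 80 per cent of the time.
regular_attenders = round(actual_class_size * 0.27)            # ~9 per class
irregular_attenders = round(actual_class_size) - regular_attenders  # ~24

print(teachers, actual_class_size, regular_attenders, irregular_attenders)
```

The gap between the funded ratio (1:20) and the roll-based reality (about 1:33) is the mechanism the post is describing: the daily headcount looks manageable, but the churn of 24 irregular attenders per class disrupts the learning of the 9 regulars.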
Can you just imagine the chaos of such a classroom and how hard it would be to focus on the small number of students who are there regularly? Add to this mix inexperienced short-term principals, a high number of novice teachers, a generally non-English-speaking student body and cultural challenges, and the picture becomes even more daunting.
It appears that the finding that higher attendance does not necessarily lead to improved learning outcomes may well be a consequence of discriminatory school funding by the NT government, and of the impact that the highly irregular attendance of the majority of students has on the classroom learning environment.
Knowing whether this is the case matters because it affects the questions we ask and the solutions that are considered.
I do hope that this issue is investigated further as part of this exciting research project.
This, in my view, is a very serious case of indirect discrimination. It is also highly unethical, because the NT government signed a Memorandum of Understanding with the Commonwealth Government in September 2007 that included a commitment, on their part, to move from “staffing based on attendance” to “staffing based on an Agreed Student Number” (note: this would be based on estimates of the number of age-relevant children in the designated area, so it would be expected to be at least at the level of enrolment, but possibly higher). This work has never been done, which has enabled the NT Government to continue to underfund Australia’s most needy schools for years. One of the reasons they can get away with this is the COAG approach of only requiring output-based accountability. It is also worth noting that, had the recommendations of the Gonski Report been implemented, there would have been an independent body to monitor needs-based funding.
The savvy reader may have noticed that the MySchool data on FTE student numbers and FTE teacher numbers in NT remote schools does not bear this out – in fact, these ratios look very healthy. I hesitated to write about this issue because of that apparent contradiction, but I now understand how to make sense of it.
In the explanatory notes to Chapter 4 (Schools) of the Productivity Commission’s Report on Government Services (ROGS), the following note appears under the definition of teacher: “For the Northern Territory, Assistant Teachers in Homeland Learning Centres and community schools are included as teaching staff” (2013, p. 4.9.9). This labelling of unqualified Indigenous Education Workers as teachers is another NT sleight of hand. For instance, it allows the NT to create the impression that Homeland Learning Centres have daily access to a qualified teacher, when they do not.