Please Barry and Tony – listen to Jim (NAPLAN and MYSCHOOL)

Jim Angermeyr is the latest to add his voice to the growing number of experts who are troubled by how test scores are being used. He says that policy makers should have a healthy respect for error and use caution when interpreting results:

“That caution grows as the groups get smaller, like looking at a classroom instead of a whole school. And that caution grows even more when the stakes increase because increasing the stakes can lead to all kinds of distortions…

 Where the distortion comes in is that you can only test a limited amount of the domain. Even if it’s a domain like mathematics, you can’t cover everything. And so you make assumptions about kids’ skills in that broader domain….

 Testing professionals know that you’re just sampling the domain and you don’t try to make inferences further than that. But nonprofessionals do that all the time.”

If Angermeyr were running the world, he says he would:

  • severely reduce the accountability stakes for tests…
  • do away with standards…
  • put testing back as a local control issue in school districts
  • take the emphasis off evaluating and [compensating] teachers
  • put the emphasis on good training for principals, curriculum specialists and teachers on how to interpret data and use it for the kind of diagnosis and assessment it was originally intended for…

It’s politicians and some policymakers who believe tests can do more than they really can. And there are not enough people stopping and saying, wait a minute. When you can summarize a whole bunch of complicated things in a single number, that has a lot of power and it’s hard to ignore, especially when it tells a story that you want to promote. And that’s where it gets really twisted.

He concludes by saying:

“Perhaps if those designing the tests raise their voices alongside those of us who are giving the tests, and the students taking the tests, and their parents as well, we can bring about the change we need.”

Who is Jim Angermeyr? Why does what he says about value-added measures matter?

Well, Jim Angermeyr was one of the architects of value-added assessment. He worked with the Northwest Evaluation Association to develop tests.

What do you think, Barry McGaw and Tony Mackay? You were not the initiators of the current MySchool reporting regime, but you, through ACARA, are its custodians. Don’t you think you owe it to parents, to children, to teachers and to policy makers to confirm that the arguments put to us by Professor Margaret Wu, by Jim Angermeyr and by other psychometricians have weight and should be heeded?

I respectfully ask that you help us to return assessment to its proper and central place as a classroom diagnostic tool, and return testing to its proper place as a population measure of great value for big-picture analysis. We need your voice.

Source: Designer of Value-Added Tests a Skeptic About Current Test Mania – Living in Dialogue – Education Week Teacher.


NT indigenous topping the class: More rubbish education data from ABS

This article, NT indigenous topping the class | The Australian, notes that “the greatest advance in education in the past five years has been among Aborigines in the Northern Territory, with a 69.4 per cent increase in the number of indigenous students completing Year 12.”

This is yet another example of the crazy conclusions we arrive at with poorly-thought-through data. Now, this data comes from the Census, so it must be the MOST ACCURATE, right? After all, it is based on a head-count of everyone in Australia. But it is misleading. Surely the 69.4 per cent should alert us to the fact that something is not right.

If only one person had completed Year 12, then just two people would be a 100 per cent increase. But if we were trying to convey the actual meaning of this pretty meagre progress, wouldn’t we say: ‘in the five-year period between census takings, the figures on Year 12 completion remain dismal – moving from 1 person to 2 persons’? The figure we really need is either the raw number of Indigenous Year 12 completers or the percentage of 17- and/or 18-year-old Indigenous young people who had completed Year 12.

And of course we should be able to calculate that the 2006 figure is very low because, if even 60 per cent of, say, 3,000 Indigenous young people had completed Year 12 in 2006, a 69 per cent increase would take it to over 100 per cent. So we do know this increase is based on a low starting base – but how low? We have no idea.
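The low-base effect described above can be sketched in a few lines of Python. All the counts here are hypothetical, since neither the article nor the Census release gives us raw numbers – that is exactly the complaint.

```python
def percentage_increase(old, new):
    """Percentage increase when a count moves from old to new."""
    return (new - old) / old * 100

# Hypothetical counts only: the same headline percentage can come
# from very different absolute gains.
print(percentage_increase(1000, 1694))  # large base: 694 extra completers
print(percentage_increase(10, 17))      # tiny base: just 7 extra completers
```

Both calls print roughly the same “impressive” 69–70 per cent, which is why a percentage with no base figure attached tells us almost nothing.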

We also don’t know how many of the increased completers are in remote or very remote contexts. If the figures were low enough, it would be possible to get this increase with nil increase in remote and very remote Australia.

And finally, we need to be cautious about assuming that retention to Year 12 has any real meaning in terms of the extent to which Indigenous young people benefit from schooling. There is extensive anecdotal evidence that many of the small number of students in remote and very remote communities who stay on till the end of Year 12 still cannot read to Year 3 NAPLAN standard. Unfortunately, Year 12 completion is a very poor proxy for “benefiting from schooling”.

But the real idiocy of this kind of reporting is that we should not have to stuff around with head-count surveys every five years to get a sense of how we are progressing in remote/very remote Australia – trying to read meaning into data just because we have it.

In fact the NT Department of Education has excellent administrative data and, as Nigel Scullion points out in this article, the next “Close the Gap” report is due in February 2013.

However, this report will rely on Year 12 retention data, and this does not tell us much at all – except that we urgently need to negotiate a more useful measure of the extent to which Indigenous young people benefit from schooling.

What we measure and report, and what we fail to measure: topsy turvy data

In an article in The Conversation called “Is this progress? Watering down the Millennium Development Goals”[1], Kate Taylor notes that one of the problems in reporting on progress (or otherwise) against the very important Millennium Development Goals is the lack of reliable data to track and assess progress:

“Our assessment of progress for all of the MDG targets remains severely hampered by a lack of data. Donors have spent billions of dollars through their bilateral programs and through partnerships such as GAVI and the Global Fund to Fight AIDS, Tuberculosis and Malaria, but they have barely invested in systems to monitor their effect.”

She also quotes Dr Seth Berkley from the GAVI Alliance, who noted recently that:

“ … the household data that GAVI uses – essentially the same sources used to track MDG 7 – can be very problematic. For example, in some conflict-ridden countries national surveillance is so weak that you can’t tell if 86% of children are immunised as claimed by the World Health Organisation (WHO) and UNICEF or if the true figure is closer to the 36% estimated by other experts. These numbers matter because they translate into lives saved or lost.”

Targets are important. They fundamentally underpin transparency and accountability.

This is appalling, no question, and it made me think about all the other information that we do not collect or report on that is important in terms of public accountability. I can think of many examples, but here are a few that matter to me:

1. Overcrowded housing in NT remote communities

Recently Minister Macklin gave an upbeat report on the number of new houses built in remote communities in the Northern Territory[2]. On reading it, most readers could be forgiven for thinking ‘problem solved’. But the data provided does not really tell us anything at all.

It’s been a long time since new houses were built in all but a few NT remote communities, so this number of new houses has to compensate for all the non-development over past years, plus the population increase, before it even begins to affect the appalling overcrowding situation. But none of this data is available.

When 30 new houses were built for the Wadeye community in 2007–2008, John Taylor from CAEPR at ANU remarked that over that year alone there were over 200 new babies born in Wadeye, and the average number of persons per three-bedroom home was over 15. Even if 100 people died over that same period (unlikely), it is clear that the rate of building over this period barely kept pace with the population increase. But it was hailed as amazing progress.

So to understand the impact of the new houses we would really need to know the average number of persons per home (or per bedroom) at the start, and whether it has been reduced. It would also be nice to have a target. After all, if it had gone down from 15 per three-bedroom home to 14.5, we should be stating that we have made a start on addressing the overcrowding problem in NT remote communities, but that we have a long way to go.
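A rough occupancy sketch makes the point. Everything here except the figures quoted above (30 new houses, about 200 births, about 15 persons per three-bedroom home) is an assumption – in particular the size of the existing housing stock, which is exactly the figure the Minister’s report did not give us.

```python
homes_before = 150             # assumed existing stock (NOT in the source)
persons_per_home = 15          # quoted average occupancy
population = homes_before * persons_per_home  # implied population

births = 200                   # quoted births that year
new_homes = 30                 # quoted new houses

occupancy_after = (population + births) / (homes_before + new_homes)
print(round(occupancy_after, 1))  # still several times a modest 4-5 person target
```

Under these made-up numbers, occupancy falls only from 15 to about 13.6 persons per home – the 30 houses mostly just absorb the year’s population growth, which is why the before-and-after occupancy figures, not the raw house count, are what we need.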

2.  NAPLAN data

Unlike many, I support the concept of public transparency and accountability. By this I mean that where significant funds are expended (and education is one significant area of government expenditure), the public are entitled to know that the funds expended are delivering, in policy terms, efficiently and effectively.

So NAPLAN data could help us – the public – understand whether we are:

  • reducing the proportion of children who are failing to reach minimum standards of literacy
  • reducing the learning gap that is based solely on students’ levels of disadvantage
  • ensuring that children in different jurisdictions, sectors or geo-locations are not being disadvantaged by these factors
  • seeing whether a particular system-level literacy intervention or major policy change is having any impact on the pattern of student learning outcomes.

Although NAPLAN is a limited multiple-choice test on a narrow range of learning, it is a reasonable enough tool to use for these population-level review purposes.

But this means having the data at this population level. Now, we do collect this data and it is published. It is included in the COAG Reform Council’s report on the National Education Agreement[3], and I am pretty sure that the information in this report has not been read and absorbed by more than a handful of people.

This data tells us that we are not on track to halve the gap in Indigenous literacy and numeracy achievement, although there are gains in some states and in some subjects. It also tells us that the percentage of children who do not achieve the minimum literacy and numeracy benchmark is 5 to 8 times higher in the bottom SES quintile nationally, and the states’ results vary even more. However, when you look behind the data you can find enormous problems. These include the decision to use parent education data as the proxy for disadvantage, even though it is far from complete (with over 40% of NT information missing), and the fact that student absences are so high in some states as to render the results meaningless.
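To illustrate what that quintile comparison actually computes: the ratio of the below-benchmark rate in the bottom SES quintile to the rate in the top quintile. The counts below are hypothetical, chosen only to land at the top of the 5–8x range reported.

```python
# Hypothetical counts of students below the minimum standard.
below_q1, tested_q1 = 160, 1000   # bottom SES quintile
below_q5, tested_q5 = 20, 1000    # top SES quintile

rate_q1 = below_q1 / tested_q1    # 16% below standard
rate_q5 = below_q5 / tested_q5    # 2% below standard
ratio = rate_q1 / rate_q5
print(round(ratio, 1))            # 8x gap with these made-up numbers
```

The caveats in the paragraph above bite here: if the SES proxy (parent education) is missing for 40% of students, or absences remove many low-performing students from the tested counts, both rates – and therefore the ratio – become unreliable.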

But what do we do with the data? We report it at school level, even though no-one credible has yet been willing to state unequivocally that there is any decent level of reliability or validity in NAPLAN data reported at this level.

This school-level data doesn’t tell us anything about whether we are making progress in addressing educational disadvantage or closing the Indigenous student learning gap, and it can’t be rolled up to do this.

So there you have it. We collect poor data and barely draw on it at the population level, even though this is the most important level. But we spend all our energy, time and resources breaking it down to school level, where errors of many kinds render it totally meaningless.

We have a way to go before we are data-smart. Perhaps the National Curriculum can train all future citizens not to be fooled by such appalling misuse of data, through better data literacy. One can only hope.

NT the next jurisdiction to be bought off on school autonomy rhetoric

According to this article, $480 million plan for more school autonomy, the NT has now accepted $2.4 million to ‘implement school autonomy measures’ in 20 Territory schools.

But, just like in NSW, it appears that it is the money for schools they are really after, not school autonomy. The article quotes the Principal of Ludmilla Primary, who will use the funds for a family liaison project.

Now I ask you – what is stopping Ludmilla from implementing this program now? Is it lack of autonomy? No, it is lack of funds.

The more this program rolls out the more it is starting to look like a giant school bribery program.

The only question is – what is the Commonwealth getting from this expenditure?  Suggestions anyone?

Carol’s Indigenous students had such an “awesome time” at school that their friends started coming to class too

In this post, Aboriginal Engagement, teacher Carol Puskic from Geraldton Senior College in WA talks about using a very clever but simple tool which has transformed the energy and engagement of the students in her class – a class specifically for disengaged Indigenous students who come from Halls Creek, Port Hedland, Broome and beyond and board in Geraldton. The tool is called ClassMovies.

I first stumbled across the ClassMovies project in July 2010 and was so excited by its potential as a tool that teachers could use in so many useful ways that I wrote about it[1]. I introduced the article as follows:
