In an article in The Conversation called “Is this progress? Watering down the Millennium Development Goals”, Kate Taylor notes that one of the problems in reporting on progress against the very important Millennium Development Goals is the lack of reliable data to track and assess that progress.
“Our assessment of progress for all of the MDG targets remains severely hampered by a lack of data. Donors have spent billions of dollars through their bilateral programs and through partnerships such as GAVI and the Global Fund to Fight AIDS, Tuberculosis and Malaria, but they have barely invested in systems to monitor their effect.”
She also quoted Dr Seth Berkley from the GAVI Alliance, who noted recently that:
“ … the household data that GAVI uses – essentially the same sources used to track MDG 7 – can be very problematic. For example, in some conflict-ridden countries national surveillance is so weak that you can’t tell if 86% of children are immunised as claimed by the World Health Organisation (WHO) and UNICEF or if the true figure is closer to the 36% estimated by other experts. These numbers matter because they translate into lives saved or lost.”
Targets are important. They fundamentally underpin transparency and accountability.
This is appalling, no question, and it made me think about all the other information that we do not collect or report on that is important in terms of public accountability. I can think of many examples, but here are a few that matter to me:
1. Overcrowded housing in NT remote communities
Recently Minister Macklin gave an upbeat report on the number of new houses built in remote communities in the Northern Territory. On reading it, most readers could be forgiven for thinking ‘problem solved’. But the data provided does not really tell us anything at all.
It’s been a long time since new houses were built in all but a few NT remote communities, so this number of new houses has to compensate for all the non-development of past years, plus the population increase, before it even begins to affect the appalling overcrowding situation. But none of this data is available.
When 30 new houses were built for the Wadeye community in 2007–2008, John Taylor from the Centre for Aboriginal Economic Policy Research (CAEPR) at ANU remarked that over that year alone there were over 200 new babies born in Wadeye, and the average occupancy was over 15 persons per three-bedroom home. Even if 100 people died over that same period (unlikely), it is clear that the rate of building barely kept pace with the population increase. But it was hailed as amazing progress.
So to understand the impact of the new houses we would really need to know the average number of persons per home, or per bedroom, at the start, and whether it has been reduced. It would also be nice to have a target. After all, if occupancy had gone down from 15 per three-bedroom home to 14.5, we could state that we have made a start on addressing the overcrowding problem in NT remote communities but have a long way to go.
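The kind of baseline-versus-outcome tracking argued for above can be sketched in a few lines. The occupancy rate (15 per home), the 30 new houses and the roughly 200 births are from the Wadeye figures quoted earlier; the baseline dwelling count of 160 is a hypothetical assumption for illustration only, not a figure from any report.

```python
# Back-of-envelope sketch of the overcrowding indicator the article says is missing.
# Baseline dwelling count (160) is a hypothetical assumption, not from the article;
# occupancy (15 per home), new houses (30) and births (~200) are from the text.

def persons_per_dwelling(population, dwellings):
    """Average occupancy: the headline overcrowding indicator."""
    return population / dwellings

baseline_dwellings = 160          # hypothetical assumption
baseline_occupancy = 15           # from the article (Wadeye, 2007-08)
new_houses = 30                   # from the article
births = 200                      # from the article

population_start = baseline_dwellings * baseline_occupancy
population_end = population_start + births
occupancy_end = persons_per_dwelling(population_end,
                                     baseline_dwellings + new_houses)

print(f"start: {baseline_occupancy:.1f} per home, "
      f"end: {occupancy_end:.1f} per home")
```

The point is not the particular numbers, which depend on the assumed baseline, but that without a published baseline and target the change in occupancy cannot be computed at all.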
2. NAPLAN data
Unlike many, I support the concept of public transparency and accountability. By this I mean that where significant funds are expended (and education is one significant area of government expenditure), the public is entitled to know that those funds are delivering on policy goals efficiently and effectively.
So NAPLAN data could help us – the public – understand whether we are:
- reducing the proportion of children who are failing to reach minimum standards of literacy
- reducing the learning gap that is based solely on the levels of disadvantage of students
- ensuring that children in different jurisdictions or sectors or geo-locations are not being disadvantaged by these factors
- determining whether a particular system-level literacy intervention or major policy change is having any impact on the pattern of student learning outcomes.
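The population-level measures listed above are straightforward to compute once the data exists. A minimal sketch, with invented student records and an invented benchmark cut-off (the real scales and cut-offs belong to ACARA, not to this code):

```python
# Hypothetical illustration of population-level NAPLAN-style tracking.
# All records and the benchmark score are invented for this sketch.
from collections import defaultdict

BENCHMARK = 400  # hypothetical minimum-standard score

students = [
    # (score, ses_quintile, indigenous)
    (520, 5, False), (430, 4, False), (390, 1, True),
    (610, 5, False), (350, 1, False), (410, 3, True),
    (380, 2, True),  (450, 2, False), (360, 1, True),
]

def share_below(records):
    """Proportion of records failing to reach the benchmark."""
    return sum(score < BENCHMARK for score, *_ in records) / len(records)

by_quintile = defaultdict(list)
for rec in students:
    by_quintile[rec[1]].append(rec)

overall = share_below(students)
bottom = share_below(by_quintile[1])
indigenous = share_below([r for r in students if r[2]])

print(f"overall below benchmark: {overall:.0%}")
print(f"bottom SES quintile:     {bottom:.0%}")
print(f"Indigenous students:     {indigenous:.0%}")
```

Tracked year on year, proportions like these are exactly what would show whether a gap is closing; none of this requires publishing anything at school level.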
Although NAPLAN is a limited multiple choice test on a narrow range of learning, it is a reasonable enough tool to use for these population level review purposes.
But this means having the data at this population level. Now, we do collect this data and it is published: it is included in the COAG Reform Council’s report on the National Education Agreement, and I am pretty sure the information in that report has not been read and absorbed by more than a handful of people.
This data tells us that we are not on track to halve the gap in Indigenous literacy and numeracy achievement, although there are gains in some states and in some subjects. It also tells us that the percentage of children who do not achieve the minimum literacy and numeracy benchmark is up to 5 to 8 times higher in the bottom SES quintile nationally, and the states’ results vary even more. However, when you look behind the data you find enormous problems. These include the decision to use parent education data as the proxy for disadvantage even though it is far from complete (with over 40% of NT information missing), and student absences so high in some states as to render the results meaningless.
But what do we do with the data? We report it at school level, even though no-one credible has yet been willing to state unequivocally that there is any decent level of reliability or validity in NAPLAN data reported at that level.
This school-level data doesn’t tell us anything about whether we are making progress in addressing educational disadvantage or closing the Indigenous student learning gap, and it can’t be rolled up to do so.
So there you have it. We collect poor data and barely draw on it at the population level, even though this is the most important level. But we spend all our energy, time and resources breaking it down to school level where errors of many kinds render it totally meaningless.
We have a way to go before we are data smart. Perhaps the National Curriculum, through better data literacy, can train all future citizens not to be fooled by such appalling misuse of data. One can only hope.