Did Neoliberalism Kill Gender Equity in Australian Schools?

My starting point for this article is the following puzzle. We live in a world saturated with discussion about feminism across all media and many areas of focus – from gaming to global conflict. But in our schools it is largely an absent presence.

This is in spite of the fact that, in the two decades following the publication of the landmark Australian Schools Commission Report, Girls, Schools and Society (1975),[1] Australia was one of the world leaders in bringing together the best of feminist academic scholars, education policy makers and education practitioners to develop understandings about the nature of gender inequality, how school contributes to this inequality and the best ways to address gender issues in schools.

During this period, gender equity policies were in place nationally and in all states and there was a lively debate about gender equity priorities, drawing on practice and research, about how gender inequality is constructed and maintained, the implications for the education of boys and much more. Of course it was far from perfect and there were pockets of resistance, but it was never just ignored.

It is true that there are still some programs being implemented in Australian schools today that are informed by feminist understandings. Good examples include the Safe Schools program – a national program addressing homophobia – some, but not all, anti-bullying programs, and a Victorian state school program that focuses on sex education and gender-based violence.

However, these are not in place across all schools and there are no consistent broader gender equity programs. It appears to be entirely up to individual feminist educators to find suitable material and, more importantly, to find space in pressured teaching programs for any learning that prepares students to understand and respond to the patriarchal cultural, economic and social practices and structures that they both already experience and will come across in adulthood.

There are also no longer official systemic policies they can rely on to legitimize this work, and I have heard anecdotally that even strong feminist educators often decide not to raise issues relating to gendered practices in their school, or to suggest the inclusion of feminist perspectives where they may be relevant to a particular learning topic, because they fear the consequences.

Why this happened could be seen as less relevant than what to do about it. But what if the issues are connected? What if the ideas, assumptions, practices and forces that contributed to the demise of gender equity policy and practice continue to have an impact today?

In my previous article, I noted that I had always assumed the gender equity movement died because the ascendancy of the men’s rights backlash coincided with increasing evidence that, in crude terms of school-level academic outcomes, girls were actually faring better than boys – and because the Howard Government, voted in in 1996, backed the men’s rights view of the world.

Looking back I can see that this might have been the main driver in the first instance but this does not explain why there was no rebound effect.

Why has gender equity stayed off the schooling agenda for 20 years when there has been such a significant resurgence almost everywhere else?

In this article I make the case for the following propositions:

  1. That neoliberal understandings[2] have had a profound impact on the structures and cultures of schooling, and this has reduced the opportunities for the kinds of intellectual work required to bring feminist considerations to mainstream learning.
  2. That neoliberal understandings have also impacted on the kinds of feminist understandings that are most accessible to educators, by privileging individualistic perspectives and practices.
  3. That the impact of this has been significant and problematic for young people, who leave school and enter adulthood poorly prepared for negotiating the patriarchal structures, practices and assumptions they encounter as adults, particularly around work and family.

Neoliberalism and Schooling[3]

The gender equity policies and practices that were taken up by school systems right across Australia, in the period between 1975 and 1995, took place in school settings that lacked the significant elements of neoliberal understandings of schooling.

There were no national standardised tests, and even when they (the NAP, now NAPLAN) were first introduced in the mid 1990s, they did not become high stakes tests until the Labor Party, excited by Joel Klein’s vision, decided to report the results at school level through the MySchool website in 2008.

The Howard and Kennett visions of improving schooling through markets and school choice had not yet taken hold. While many parents decided to send their child to a non-state school, and a small number chose out-of-area public schools, the local community public school was still the default. Politicians were not yet spouting the idea that schools across all systems would improve by competing with each other for teachers and students, and that it was the responsibility of good parents to make an informed choice about their child’s schooling.

The National Curriculum Statements and Profiles, completed by the early 90s, looked very different to the more prescribed syllabus outlines we have today. They were firmly based on developmental understandings of children’s learning, and they rejected A-E grading and the notion that children must be assessed against year-level standards.

Australian participation in global testing – PISA and TIMSS – was in the early stages of negotiation. Indeed, we did not know how well our students were doing in a global context, as there were no global standards or comparative data.

Teachers were still accused of not being good enough, and there were vicious debates about how to teach reading and about the merits of progressive education vis-à-vis other methods. However, progressive education as understood in the broad traditions of John Dewey and Jerome Bruner dominated, and the idea of education for full democratic participation was not hotly contested within the profession.

Schooling was seen as being about much more than test scores and preparation for work. Issues that had importance beyond the world of work, like gender justice, could be prioritised in such an era. Teachers were not pressured to teach to the test and had the time to introduce broader learning themes.

I am not suggesting for a minute that things were perfect or even necessarily better. I know from my own experience during this period that our understanding of educational disadvantage was not well examined, and that deficit understandings of poor, Indigenous and disabled students may well have led to lowered expectations and complacency about poor outcomes. The ideas about teacher professional standards, school improvement, teacher collaboration and continuing professional development existed in the research but had not yet been comprehensively implemented across all systems. But there was an absence of the kind of pressures that have become associated with the global education reform initiatives of marketisation and high stakes testing, as documented by many researchers in Australia and internationally.

I will draw on just two such studies, both Australian.

The first was undertaken by the Whitlam Institute, in response to public concerns about the effects of high stakes testing. They conducted a survey of teachers and principals to ascertain their perceptions of how high stakes testing has impacted on students and classroom practice.

In relation to classroom practice, survey respondents noted the following impacts:

  • NAPLAN preparation is adding to an already crowded curriculum – over 85%
  • NAPLAN is affecting the range of teaching strategies they use – 59%
  • NAPLAN is impacting on the way in which school communities view curriculum areas, with subjects that are not tested reduced in importance – 75%
  • The focus of NAPLAN on literacy and numeracy has led to a timetable reduction for other subjects in their schools – over 66%

This suggests that the pressure to prepare students for the NAPLAN test is reducing the space for the kinds of enquiry that used to occur in the days when there was a gender equity policy and readily available relevant curriculum materials.

But it goes beyond this. The NAPLAN performance pressure does not just impact on individual teachers in individual schools. Schools are now in competition with one another for the most desirable enrolments and desirable parents. NAPLAN results are published on the MySchool website so that parents can make ‘informed choices’. On the website, schools are compared with ‘like schools’ – that is, those with similar student demographics – and parents can also compare a school’s NAPLAN scores with those of nearby schools.

The Whitlam survey also noted the following respondent views:

  • The publication of ‘weaker than expected’ results would negatively affect parental perception of the school – 95%
  • Poor NAPLAN results would negatively affect media reports about the school – 95%
  • Weak results would damage the school’s reputation in the community – 96%
  • Lower than expected results on NAPLAN would mean that a school would have trouble attracting and retaining students – over 90%
  • There would be a negative impact on staff morale – 90%
  • ‘Weaker than expected’ results would lead to a negative student perception of the school – 75%

This is consistent with the observations made by Susan Groundwater-Smith and Nicole Mockler in their book, Teacher Professional Learning in an Age of Compliance.[4] These researchers undertook extensive school-based research across NSW in 2009, and the book is based on this experience.

They observe that the impact of the global education reform agenda has been a retreat into a standardised, audited and backward-looking schooling culture; the rolling back of a more progressive educational philosophy; an increased acceptance of ‘common sense’ solutions; a reduced tolerance for ambiguity; and an increase in fear and distrust. They also make the following important observation:

Almost a century after the publication of Democracy and Education [John Dewey, 1916], we find ourselves in uncertain, ambiguous times. …On a policy level we appear to be once again retreating from a once-within-our-grasp vision of progressive education into safer, more measurable, more quantifiable territory. More worryingly we see the very notion of democracy at the heart of Dewey’s thinking under threat…at the hands of religious, economic and educational fundamentalists and a pervasive neoliberal agenda.

They concluded that ‘the press for compliance leaves little room for a more critical position to be adopted’.

Neoliberalism and Feminism

Neoliberalism has also influenced, profoundly, the dominant ideas of feminism. Eva Cox made this same point recently:

During the past few years, I have been seriously rethinking feminism. This intensified at last year’s Sydney Writers’ Festival when I attended a session by Michael Sandel on money, followed by a panel on feminism. Sandel analysed the damage done by more than two decades of neoliberal market models; the feminist panel ignored this and just complained about continued inequities, but not why this is so (my emphasis).

Cox is suggesting that the feminists who have the most exposure in popular discourse are not interested in how neoliberal economic and social policies, practices and ideas have impacted on women – and particularly on differently positioned women.

She goes on to say that the tamed-down version of feminism of today misses many of the important issues for women because the scope of its focus is far too narrow. Violence might be a fundamental women’s issue but many issues of mainstream economic, political, environmental importance also demand a feminist lens.

The shift to market models meant many women’s groups focused on raising the status of women via access to power in current macho terms. More women in male-defined areas of power – in politics or on boards – was erroneously claimed to be the route to feminist change. But we failed to see they were promoted because they posed no threat to the system that allowed them into the tent to share some of the power that men controlled. There are active women’s groups with current demands for remedies to violence and exclusion, access to childcare, improvements to bad media images and solutions to female poverty and lack of representation. But these are not radical demands and are defined as “women’s issues”, not general problems for society.

I want to see more action in devising solutions rather than just protest campaigns. Feminists need to lead so that we can counter the bipartisan bad policies of the major political parties: low welfare payments, bad indigenous programs, overlong working hours, too many market-based not community-based services.

There is an urgent need to solve many “wicked” policy problems – boat people, inequality, environmental damage. These issues need much better connectivity and social cohesion, so it is irrational that women are not there to contribute perspectives broader than the limited experiences of current leadership incumbents. We need wider views than macho neoliberal economics can offer to cope with the problems caused by an ageing population, mobile workers, single-person households, social inequities and growing personal care needs.

Nancy Fraser goes further and argues that feminism today actually promotes and legitimizes neoliberalism:

In a cruel twist of fate, I fear that the movement for women’s liberation has become entangled in a dangerous liaison with neoliberal efforts to build a free-market society. That would explain how it came to pass that feminist ideas that once formed part of a radical worldview are increasingly expressed in individualist terms. Where feminists once criticised a society that promoted careerism, they now advise women to “lean in”. A movement that once prioritised social solidarity now celebrates female entrepreneurs. A perspective that once valorised “care” and interdependence now encourages individual advancement and meritocracy.

Sarah Jaffe, a blogger for Dissent magazine, explains how feminist campaigns for equal pay and for equal access to male-dominated areas of work dominated feminist activism, at the expense of a focus on valuing women’s traditional work, including unpaid caring and community work and union organising. This was a feminism highly compatible with the neoliberal focus on undermining unions, seeing all activity in market terms, ignoring community and pushing down wages.

[T]he so-called “second wave” of feminism fought for women to gain access to work outside of the home and outside of the “pink-collar” fields. Yet in doing so, some feminists wound up abandoning the fight for better conditions in what had always been considered women’s work—whether that be as teachers and nurses, or the work done in the home for little or no pay.

…The devaluation of work that involves care, work for which women were assumed to be innately suited, continued apace when feminism turned its back. As other jobs have disappeared, the low wages that were acceptable when women were presumed not to need a “family wage,” because they ought to be married to a man who’d do the breadwinning, became the wages that everyone has to take or leave.

Equal pay for equal work means little when the wages for all are on the way down… [F]or a hotel housekeeper, a nurse, a janitor, the best way to improve your job isn’t to get promoted through the ranks, but to organise with your fellow workers.

What do we want young women (and men) to learn about feminism?

I am retired from the paid workforce now but I spent many years working in the public service on social policy, some of it on women’s desks. I constantly came across strong, smart, interesting women who stood up against sexism and homophobia in the workplace but whose paid intellectual work appeared to be gender, class and race blind. They did not see it as their role, in developing policy, to consider how particular design elements would impact on differently positioned individuals and families. We need scientists, economists, education and health policy workers to do better than this.

I want a schooling system that insists that students ask questions about what they learn and that equips them to apply a feminist and/or class and/or culture lens to all issues of importance. If we are committed to a fairer more just society we need nothing less.

Feminist questions and perspectives belong in the technology, music, art, English literature, science, legal studies, history, civics and citizenship, environmental studies and health classrooms, not just in wellbeing, sex education, school dress code and bullying policies.

We also need to better prepare students for what Leslie Cannold once described as ‘the equal opportunity train wreck that is motherhood’. Liberal feminism won’t help with negotiating the structural inequality issues enmeshed in the work and family conundrums that arise when a baby comes on the scene, even in a supposedly equal relationship.

Stephanie Coontz, professor of family history at Evergreen State College (USA), observes that:

…men and women…are stuck between a rock and a hard place when it comes to arranging their work and family lives. For more than two decades the demands and hours of work have been intensifying. Yet progress in adopting family-friendly work practices and social policies has proceeded at a glacial pace.

While the US has even worse family-friendly policies than Australia, the same tensions exist for young parents here.

While research[5] suggests that most young men and women leave school with similar expectations of work, family and careers, and remain committed to the ideal of an equal relationship, with shared care of children and equal opportunities to progress in their respective jobs and careers, the reality is that this is extremely difficult to manage and most fail. And when the ideal fails, the fall-back compromises are depressingly predictable.

When family and work obligations collide, mothers remain much more likely than fathers to cut back or drop out of work. But unlike the situation in the 1960s, this is not because most people believe this is the preferable order of things. Rather, it is often a reasonable response to the fact that our political and economic institutions lag way behind our personal ideals.

Women are still paid less than men at every educational level and in every job category. They are less likely than men to hold jobs that offer flexibility or family-friendly benefits. When they become mothers, they face more scrutiny and prejudice on the job than fathers do.

So, especially when women are married to men who work long hours, it often seems to both partners that they have no choice. Female professionals are twice as likely to quit work as other married mothers when their husbands work 50 hours or more a week, and more than three times more likely to quit when their husbands work 60 hours or more.

So what happens when young women – including young feminists, with high hopes for their careers – find themselves doing most of the care work and, because they are home more, most of the housework, and find themselves earning less or even being, for a period, economically dependent?

When people are forced to behave in ways that contradict their ideals, they often undergo what sociologists call a “values stretch” — watering down their original expectations and goals to accommodate the things they have to do to get by. This behaviour is especially likely if holding on to the original values would exacerbate tensions in the relationships they depend on.

When a couple backslides into more traditional roles than they originally desired, the woman resents that she is not getting the shared child-care she expected and envies her husband’s social networks outside the home. The husband feels hurt that his wife isn’t more grateful for the sacrifices he is making by working more hours so she can stay home. When you can’t change what’s bothering you, one typical response is to convince yourself that it doesn’t actually bother you. So couples often create a family myth about why they made these choices, why it has turned out for the best, and why they are still equal in their hearts even if they are not sharing the kind of life they first envisioned.

And when this happens, the frameworks and ideas most readily available to make sense of what has happened do not help, because what is a structural problem – the failure of work organisation to cater for the role of caring and the undervaluing of this role – gets framed as a personal choice.

What I have outlined above only covers the dilemmas experienced by young people who end up in hetero-normative coupledom. Others who traverse these pathways as teen mums, queer parents, divorced and single parents have an even more difficult time.


In my view, young people need exposure to the best analytical frameworks that feminism can provide: not a gender-blind education that leaves them to work it out alone, and not a feminism that binds them to the key assumptions and beliefs underpinning neoliberalism, but one that is able to look at issues from a structural perspective and from the point of view and experience of people living in very different contexts.

Schools can and should prepare our young people, men and women, for the challenge of negotiating work-life balance in an unequal world. We can’t just paint a nirvana of a gender-blind world where work and family options are equally open to all, with no detriment.

So how do we prepare them? Well, the reality is that we can’t – not explicitly. Try telling even the most highly educated person that having a baby will change their life, and not all in a good way. But we can equip them with the tools of feminist analysis that go beyond a liberal feminism of personal choice. We can study issues that will be relevant to their futures as workers and possibly parents.

In the 2014 budget, Treasurer Joe Hockey announced many unpopular proposals, but it is important to note that they were not only about savings. The reforms to higher education and the co-payments for GP visits were also driven by a belief, held by the Government, that all services should have market signals. If implemented, these understandings take us even further into an extreme neoliberal future where education and health are not investments in the common good, but private goods that must be purchased in a competitive market.

To respond to the problems created by neoliberal policies, people need to be able to name and understand the assumptions and beliefs that underpin such practices, and to understand their impacts.

Neoliberalism dominates our understandings today, but until recently it was hegemonic – so taken for granted that it was invisible, like the air we breathe. The term ‘neoliberalism’ was rarely used outside of leftist circles and was viewed by many as extreme-left jargon. But this is changing as its tensions, contradictions and problems become more and more apparent.

We now have increasing exposure to information showing how powerful business groups, drawing on neoliberal buzzwords about market forces and small government, have had an unequal impact on our democratic processes, as large corporations effectively ‘buy governments’ and use the system to amass huge wealth at the expense of most of the planet. This historic concentration of wealth in a fraction of the population, while the middle class is hollowed out and poverty increases, is in the popular press, and even the Pope speaks out about it.

Best sellers like Naomi Klein’s This Changes Everything[6] have popularised the notion that climate change is intrinsically connected to neoliberalism, and this is starting to change the conversation. She makes it clear that we cannot solve the urgent issues facing this planet, on which we all depend, just by lobbying for better climate policies. We have to change our thinking entirely:

…[W]e will not win the battle for a stable climate by trying to beat the bean counters at their own game – arguing, for instance, that it is more cost-effective to invest in emission reductions now than in disaster response later. We will win by asserting that such conversations are morally monstrous, since they imply that there is an acceptable price for allowing entire countries to disappear, for leaving untold millions to die on parched land, for depriving today’s children of their right to live in a world teeming with the wonders and beauties of creation.

Now I am not saying that we should indoctrinate our young people about the evils of neoliberalism and create revolutionary activists. But we can and must expose young people to the important ideas and perspectives of our time, and the significant associated debates. Tomorrow’s adults deserve nothing less.

[1] Girls, Schools and Society: Report by a Study Group to the Schools Commission, November 1975. Jean Blackburn was the most high-profile member of the group, and the foreword states that she did the final editing of the publication. The report had a significant influence on education policies and practices across all schooling systems in Australia and set in train a series of gender equity policy documents spanning the next two decades.

[2] Neoliberalism, sometimes referred to as unconstrained capitalism, is, basically, the belief that states ought to abstain from intervening in the economy and instead leave as much as possible up to individuals participating in free and self-regulating markets. This means that, as far as possible, all services should be run as user-pays businesses. Individuals are also seen as solely responsible for the consequences of the choices and decisions they freely make: instances of inequality and glaring social injustice are acceptable because they are, in the main, the result of freely made decisions.

[3] To read more about schools and the influence of neoliberalism, go to https://educatorvoices.wordpress.com/2013/05/22/what-have-schools-got-to-do-with-neo-liberalism/

[4] Susan Groundwater-Smith and Nicole Mockler, Teacher Professional Learning in an Age of Compliance: Mind the Gap, Springer, 2009.

[5] Stephanie Coontz, ‘Why Gender Equality Stalled’, New York Times Opinion, February 2013.



Hernán Cuervo and Johanna Wyn, Rethinking Youth Transitions in Australia, Youth Research Centre, University of Melbourne, March 2011. This is a detailed longitudinal study of young men and women from school leaving up to their late 30s. The report makes it clear that men and women had similar attitudes to careers, jobs and families, but that when children arrive, gendered patterns of work and care continue to operate along traditional lines – not because couples believe this is how things ought to be, but because of the complex choices and challenges they face under structurally constrained circumstances. http://web.education.unimelb.edu.au/yrc/linked_documents/RR33.pdf

[6] Naomi Klein, This Changes Everything: Capitalism versus the Climate, Penguin Group 2014

The NAPLAN Parliamentary Review’s ‘do nothing’ recommendations: We can do better

Many of us waited with a degree of eagerness – even excitement – for the release of the Parliamentary Inquiry Report into NAPLAN (Effectiveness of the National Assessment Program – Literacy and Numeracy Report). But what a disappointment!

While it makes a passable fist of identifying many, though by no means all, of the significant issues associated with how NAPLAN is currently administered and reported, it does miss some important details. This could be forgiven if the recommendations showed any evidence of careful thinking, vision or courage. But in my assessment they are trivial and essentially meaningless.

We know a lot about the problems with our current approach to standardised testing and reporting. This Report, with the help of over 95 submissions from a wide range of sources, manages to acknowledge many of them. The key problems include:

  • it is not valid and reliable at the school level
  • it is not diagnostic
  • the test results take five months to be reported
  • it is totally unsuitable for remote Indigenous students – our most disadvantaged students – because it is not multilevel, in the language they speak, or culturally accessible (Freeman)
  • now that it has become a high stakes test, it is having perverse impacts on teaching and learning
  • some of our most important teaching and learning goals are not reducible to multiple choice tests
  • there is a very real danger that it will be used to assess teacher performance – a task it is not at all suited to
  • some students are being harmed by this exercise
  • a few schools are using it to weed out ‘unsuitable enrolments’
  • school comparisons exacerbate the neoliberal choice narrative that has been so destructive to fair funding, desegregated schools and classrooms, and equitable education outcomes
  • there will always be a risk of league tables
  • the tests have an unequal impact on high needs schools
  • the results do not feed into base funding formulas. In spite of the rhetoric about equity and funding prioritisation being a key driver for NAPLAN, it is not clear that any state uses NAPLAN to inform its base funding allocations to schools[1]

However, the ‘solutions’ put forward by the report are limited to the following recommendations:

  1. develop online testing to improve test result turnaround – something that is happening anyway
  2. take into account the needs of students with a disability and English language learners – a recommendation so vague as to be meaningless
  3. have ACARA closely monitor events to ensure league tables are not developed and that the results feed into funding considerations – another vague do-nothing recommendation, and I am certain ACARA will say that they are already doing this

This amounts to a recommendation to do nothing – or at least nothing that is not already being done, and nothing of meaningful substance.

As an example of the paucity of its analysis, I offer the following. The report notes the lack of diagnostic power of the NAPLAN tests and then says that, even if they were diagnostic, the results come too late to be useful. It then argues, as its first and only strong recommendation, that there needs to be a quicker timeframe for making the results available. Did the writer even realize that this would still not make the tests useful as a diagnostic tool?

This Report, while noting the many problems, assumes that these can be addressed through minor re-emphasis and adjustments – a steady-as-she-goes refresh. However, the problems identified in the Report suggest that tiny adjustments won’t address the issues. A paradigm change is required.

We are so accustomed now to national standardised testing based on multiple choice questions in a narrow band of subjects being ‘the way we do things’, that it seems our deliberations are simply incapable of imagining that there might be a better way.

To illustrate what I mean, I would like to take you back to the 1990s in Australia – to the days when NAPLAN was first foisted on a very wary education community.

How many of us can remember the pre-national-testing days? Just in case, I will try to refresh your memory on some key elements and also provide a little of the ‘insider debate’ from before we adopted the NAPLAN tests.

1989 was the historic year in which all Education Ministers signed up to a shared set of goals under the now defunct Hobart Declaration. Australia was also in the process of finalising its first ever national curriculum – a set of Statements and Profiles about what all Australian children should learn. This was an extensive process, driven by an interstate committee headed by the then Director of School Education in NSW, Dr Ken Boston.

During this time, I worked in the mega agency created by John Dawkins, the Department of Employment, Education, Training and Youth Affairs, initially in the Secretariat for the Education Ministerial Council (then called the AEC) and a few years later heading up the Curriculum and Gender Equity Policy Unit.

The Education Division at that time was heavily engaged in discussion with ACER and the OECD about the development of global tests – the outcomes of which are PISA and a whole swag of other tests.

This was also when standardised testing was being talked about for Australian schools. Professor Cummings reminds us of this early period in her submission to the Parliamentary Inquiry when she says that:

…through the Hobart Declaration in 1989 … ‘Ministers of Education agreed to a plan to map appropriate knowledge and skills for English literacy. These literacy goals included listening, speaking, reading and writing….

National literacy goals and sub-goals were also developed in the National Literacy (and Numeracy) Plan during the 1990s, including: …comprehensive assessment of all students by teachers as early as possible in the first years of schooling…to ensure that…literacy needs of all students are adequately addressed and to intervene as early as possible to address the needs of those students identified as at risk of not making adequate progress towards the national…literacy goals….use [of] rigorous State-based assessment procedures to assess students against the Year 3 benchmark for…reading, writing and spelling for 1998 onward.

It is interesting to note that the commitment to on-entry assessment of children by teachers, referred to by Cummings, did result in some work in each state. But it never received the policy focus, funding or attention it deserved, which I regard as a pity. The rigorous assessments at Year 3, however, grew in importance and momentum; the key consequence of this commitment was the Year 3 state-based assessment. Professor Cummings goes on to say that in order to achieve this goal – a laudable goal – the NAP was born: state based at first, with very strict provisions against providing school-based data, and then eventually what we have today.

Professor Cummings may not have known that many of us working in education at the time did not believe that national multiple-choice standardised tests were the best and only answer, and that considerable work was undertaken to convince the then Education Minister, Kim Beazley, that there was a better way.

Over this same period, the US-based Coalition of Essential Schools was developing authentic classroom teaching and learning activities that were also powerful diagnostic assessment exercises. Its long-term goal was to make available to schools across the US a databank of benchmarked (that is, standardised) assessment activities, with support materials on how to use them as classroom lessons and how to use the results to a) diagnose a student’s learning, b) plan future learning experiences, and c) compare the student’s development against a US-wide standard of literacy development.

As the manager of the curriculum policy area, I followed these developments with great interest, as did a number of my work colleagues inside and outside the Department. We saw the potential of these assessments to provide a much less controversial, and less damaging way of meeting the Ministers’ need to show leadership in this area.

Our initiatives resulted in DEETYA agreeing to fund a trial to develop similar diagnostic, classroom-friendly literacy assessment units as the first part of this process. We planned to use these to demonstrate to decision makers that there was a better solution than standardised multiple-choice tests.

As a consequence, I commenced working with Geoff Masters (then an assessment expert at ACER) and Sharan Burrow (who headed the Australian Education Union at the time), exploring the potential power of well-designed formative assessments, based on authentic classroom teaching formats, to identify those at risk of not successfully developing literacy skills.

Unfortunately we failed to head off a decision to opt for standardised tests. We failed for a number of reasons:

  • the issue moved too quickly,
  • the OECD testing process had created a degree of enthusiasm amongst the data crunchers, who had louder voices,
  • our proposal was more difficult to translate to three word slogans or easy jargon,
  • multiple choice tests were cheaper.

At the time I thought these were the most important reasons. But looking back now, I can also see that our alternative proposal never had a chance because it relied on trusting teachers. Teachers had to teach the units and assess the students’ work. What was to stop them cheating and altering the results? Solutions could have been developed, but without the ICT developments we have access to today, they would have been cumbersome.

I often wonder what would have happened if we had initiated this project earlier and been more convincing. Could we have been ‘the Finland’ of education, proving that we can monitor children’s learning progress, identify students at risk early in their school lives and prioritise funding based on need – all without the distorting effects of NAPLAN and MySchool?

We can’t go back in time but we can advocate for a bigger, bolder approach to addressing the significant problems associated with our current NAPLAN architecture. The parliamentary report failed us here but this should not stop us.

I have written this piece because I wanted us to imagine, for a moment, that it is possible to have more courageous, bold and educationally justifiable policy solutions around assessment than what we have now. The pedestrian “rearrange the deck-chairs” approach of this Report is just not good enough.

So here is my recommendation, and I put it out as a challenge to the many professional education bodies and teacher education institutions out there.

Set up a project as follows:

Identify a group of our most inspiring education leaders through a collaborative peer nomination process. Ensure the group includes young and old teachers and principals, and teachers with significant experience in our most challenging schools, especially our remote Indigenous schools. Provide them with a group of expert critical friends – policy experts, testing and data experts, assessment and literacy experts – and ask them to:

  • Imagine there is no current assessment infrastructure
  • Devise an educationally defensible assessment architecture – taking a greenfields approach

I can almost guarantee that this working group would not invent NAPLAN and MySchool or anything like it, and we would be significantly better off.

We have dug ourselves into an educationally indefensible policy hole because we have allowed politicians and the media to drive key decisions. To my knowledge we have never established an expert group of educational practitioners, with access to specialist expertise, to develop better policy solutions in education. Why don’t we give it a try?

Any takers?

[1] I understand that NSW does use the NAPLAN results to channel some additional funds to low performing schools but these are above the line payments.

Is opting out of testing just selfish individualism?

In a recent article about American culture and the opt-out society, Alan Greenblatt described the growing and successful movement encouraging parents to refuse to allow their children to participate in national standardised testing as selfish individualism. It might be driven by a parent’s individual interest, he argues, but it is selfish and against collective interests:

“It’s probably true that the time spent on testing isn’t going to be particularly beneficial to the kids, but it’s very beneficial to the system,” says Michael Petrilli, executive vice president of the Fordham Institute, an education think tank. “If you have enough people opt out of these tests, then you have removed some important information that could make our schools better.”

I find this amusing, because the whole corporate reform movement, for which testing is the centrepiece, is built on the neoliberal belief that the best solution to everything – prisons, health, education and so on – is to turn it into a market and allow competition and individual choice to drive better value.

In fact, this was the prime motivation described by Kevin Rudd when he first announced the ‘school transparency agenda’ on 21 August 2008 at the National Press Club. The speech has mysteriously disappeared, but I am quite clear that Kevin Rudd said something along the following lines:

“If parents are unhappy with their local school because of the information in MySchool, and decide to transfer their child to another better performing school, then that is exactly what should happen.  This is how schools will improve, through parents voting with their feet.”

Now, nobody who works in a struggling school thinks this is the way schools improve. Australia has run an aggressive market-choice model of school funding for nearly two decades now, and all we have to show for it is a highly class-segregated schooling system and high levels of inequality.

So let me reassure parents who are concerned about our high-stakes NAPLAN testing regime. Opting out of having your child participate in these tests is much more of a community act than deciding to send your child to an elite school.


2008 was important for Indigenous education, because that was when all Australian states and the Commonwealth signed up to the National Indigenous Reform Agreement (NIRA) through the COAG reform process. I worked for the NT Department of Education at the time, and this development gave me a sense of cautious optimism.

The NIRA committed all states and territories to halving the gap for their Indigenous citizens on a number of key measures by 2020. For the school sector, the already agreed targets set out in the National Education Agreement (NEA) – improvements to student performance based on NAPLAN tests, and Year 12 retention and completion – were confirmed.

I now see that while the NIRA has given added focus and priority to a very important equity policy issue, it will not drive change for the most disadvantaged Australian citizens – those living in remote discrete communities in the Northern Territory.

There are many reasons for this but here I want to focus on just two:

  1. The unsuitability of the targets and measures that have been set; and
  2. The decision to drive change through an outcomes focus – a strategy that is silent on inputs and process measures.

Problem One: The suitability of the NIRA targets and measures for NT remote communities

Example One: NAPLAN Performance

According to Nicholas Biddle (2002), over 67 per cent of NT Indigenous people speak a language other than standard Australian English in their home. For children who grow up in discrete Indigenous communities in remote NT, this figure is nearer to 95-100 per cent. This doesn’t just mean that these children speak another language; it means that they don’t speak English and no one around them does, so they don’t hear it spoken in the home, in the playground, in the community, at social functions, on the radio, in shops or in church.

They live in a non-English speaking world, until they arrive at school.

When the children go to school, the school has to work out how to teach a whole class of children who do not understand English. In communities like Yirrkala where children speak a living Indigenous language or languages, and there is a tradition of two-way education, children learn in their own language, Yolnu Matha, using texts that have been developed through the school for this purpose. English exposure is largely oral at this stage.

A potted history is required here. The bilingual education program was once well funded and well supported, with trained linguists actively helping the school develop new resources, skilled two-way teachers, and Indigenous Education Workers with high levels of Indigenous language proficiency in classrooms. Over the years funding dwindled to a trickle. First the program was abolished, only to be reinstated without critical funded positions, and for many years it languished, unsupported. Then in 2010, it was briefly NT government policy to teach only in English for most of the school day.

This was introduced in haste by the former Indigenous Minister for Education, Marion Scrymgour, who later apologised for this ‘mistaken’ decision (Rawlinson, 2012). However, Yirrkala, along with a number of other schools, refused to comply, and the NT Education Department now appears to passively ‘allow it’, but with no support. Even those schools that did comply, in part or completely, still faced the overwhelming challenge of teaching a whole class of children who do not speak or understand any English. Whatever adaptations were made involved major departures from the standard approach to literacy education in Australian schools.

By Year 3, many remote NT classrooms are just starting to expose students to English-language texts and are still using community-language reading texts. Their English focus at this point is still oracy, as they rightly see this as a precondition to being able to read English. However, in Year 3 all these learners are forced to sit the NAPLAN tests – tests that are totally unsuited to their stage of English language development, no matter how their schools approach the ESL challenges. The vast majority of students either do not turn up on test days or get a zero score – meaning that they are unable to get even one answer right. Indeed, we are crazy even to expect them to.

Now let’s compare this treatment and experience with that of a group facing similar English-language challenges.

Children who are new arrivals from non-English-speaking countries can access up to one year of an intensive English program and can be exempted from sitting the NAPLAN tests for this period.

These children have come from a foreign land, but in many ways remote Indigenous children are still living in a foreign land; yet no parallel Commonwealth-funded intensive English program was ever provided for them, and the exemption definitions do not allow them to be excused from NAPLAN testing.

The solution to this problem is extremely easy, affordable and accessible. There are culturally sensitive, developmentally appropriate diagnostic assessments, developed by the Australian Council for Educational Research (ACER) with remote Indigenous students in mind, and sophisticated processes that would enable the results of these tests to be equated to mainstream NAPLAN results in ways that make sense. This simple but urgent change would put an end to the negative impact of the tests in remote schools. A class in which everyone scores zero provides the worst kind of useless feedback to parents, students, teachers and systems.

Example Two: Year 12 retention/completion targets

For this target, the reporting framework relies on two measures. For Year 12 completions, the target group is 20-24 year olds and the data source is the Survey of Education and Work (SEW) managed by the ABS. The data is collected through a telephone survey from which remote communities are excluded. This means that for our most disadvantaged cohort we have no data and therefore no performance targets.

This should be addressed through the initiation of a remote Indigenous survey as a matter of urgency.

The other measure is Year 12 retention. There are many issues with this measure, but for remote Indigenous children the key problem is that it means absolutely nothing.

A friend of mine running a government service in remote Australia went to the local Indigenous school and promised a guaranteed job to everyone who completed Year 12. Later she was taken aside by a teacher, who explained that ‘completing’ Year 12 just means that a student is still attending school in Year 12 – that is, they are still enrolled, and that is all. While the school could point to a few Year 12 completers in the community, most of them could not read well enough to be safe in the workplace.

In my view, it is quite mischievous to use a measure that appears to capture something of value but actually means nothing at all. We should stop this practice. Its existence means that the lack of meaningful data in this area is hidden from view and never prioritised.

Problem Two: The limitations of focusing only on outcome measures

There is an assumption that outcome measures are a magic bullet and will bring about the required changes on their own. This is not the case where good governance is lacking. Marcia Langton (2012), the Foundation Chair of Australian Indigenous Studies at the University of Melbourne, argues that the high levels of funds allocated to the NT by the Commonwealth Grants Commission on the basis of the disadvantage of its Indigenous citizens have been diverted to other purposes.

This has happened over a sustained period, no matter which party holds power. Other policy observers – journalists, social justice advocates, and researchers such as Nicolas Rothwell (2011), Rolf Gerritsen and Barry Hansen – support this view with data.

According to Michael Dillon and Neil Westbury, serious questions were raised about the level of funding and servicing for remote Indigenous schools in 2006 and, in response to queries, the Department claimed it was developing a new teacher staffing framework that would ensure transparent and consistent needs-based funding. In 2008, when I commenced with the Department, the staffing review was said to be in its final stages and about to be ticked off. In 2011, in response to a query from me, the Department again said that a new transparent needs-based staffing formula was nearly complete. Everyone I worked with knew it would never happen: taking money away from Darwin schools to give to schools in the bush would be political suicide, and no party wants to commit political suicide.

Rothwell argues that there are no votes in solving Indigenous disadvantage, and there are no strategies to make transparent what is happening or to hold the Territory accountable. That there are no votes in Indigenous issues is an oft-repeated NT Public Service mantra. The vast difference between the world of ‘white Darwin’ and the world of Indigenous Darwin and remote Indigenous communities is shocking.

Mainstream Darwin residents enjoy the laid-back lifestyle, visits to markets, a world-class conference precinct with a wave pool, state-of-the-art senior colleges and middle schools, and an extremely elaborate Parliament House and precinct – all for a Territory of fewer than 220,000 people.

Town Camp Darwin is different. Nine-Mile Town Camp, for example, is not marked on the map – it is just a blank space. It is a place where buses don’t visit; where the main power line to Darwin runs through the middle but, when I was there in 2009, there were no street lights (this may well still be the case); where many houses are condemned and several have no ablution facilities; where there are no footpaths and the grass is higher than a primary school child. The children who do manage to go to school have to be at the bus stop outside the community – with no bus shelter – by around 7.20am, because the only bus they can catch picks them up first and then all the other children. They are on the bus for 50 minutes to get to a school less than 15 minutes away.

Remote NT is worse. The average number of people per bedroom is around three, rubbish services are sporadic, there are no gutters, and I have seen children swimming in open drains. There is no parity of amenity.

In 2007 I attended the opening of a new high school that would never have been built in a Darwin suburb. It was built on the school’s only oval, taking that amenity away from the whole school. It had no footpaths or covered ways, no water taps, a very poor library, a staff room too small for the number of staff, and huge mud puddles between buildings.

Yet before this date, this community of over 2,900 people had no secondary school whatsoever.

My argument, in a nutshell, is that outcomes-based accountability measures will not put any real pressure on the NT to do the right thing by its Indigenous citizens, and real accountability is what is urgently needed here. The NT knows it can keep on failing, because this issue is already assigned by many to the too-hard basket.

At one of the COAG working party negotiations I attended, where states were arguing over funding shares, one state representative remarked that there was no point giving any funds to the NT because it wouldn’t deliver the goods, and that the close-the-gap target could be achieved by focusing on the Indigenous population in the eastern states alone.

Let’s not make this chilling black humour a reality.

A Tale of ACARA and the See-No-Evil Monkeys – or, There is no excuse for wilful ignorance

The Australian Senate’s Education, Employment and Workplace Relations Committee is currently holding an Inquiry into the effectiveness of the National Assessment Program – Literacy and Numeracy (NAPLAN).

Over 70 submissions of varying quality have been received. In this article, I focus on the submission from the Australian Curriculum, Assessment and Reporting Authority (ACARA), the custodian of NAPLAN and of how it is used for school transparency and accountability purposes on the MySchool website.

One of the focus questions in the Inquiry’s Terms of Reference refers to the unintended consequences of NAPLAN’s introduction. This is an important question given widespread but mainly anecdotal reports in Australia of: test cheating; schools using test results to screen out ‘undesirable enrolments’; narrowing of the curriculum; NAPLAN cramming taking up valuable learning time; Omega-3 supplements being marketed as helping students perform better at NAPLAN time; and NAPLAN test booklets hitting the bestseller lists.

Here is what ACARA’s submission to the Senate Inquiry has to say about this issue:

To date there has been no research published that can demonstrate endemic negative impacts in Australian schools due to NAPLAN.  While allegations have been made that NAPLAN has had unintended consequences on the quality of education in Australian schools there is little evidence that this is the case. 

The submission goes on to refer to two independent studies that investigated the unintended consequences of NAPLAN.

ACARA dismisses a Murdoch University research project[1] led by Dr Greg Thompson as flawed because its focus is on changes to pedagogical practices resulting from the existence and uses made of NAPLAN. The basis of the dismissal is that if teaching practices change, then that is all about teachers and nothing to do with NAPLAN. Yet the report makes clear that teachers feel under pressure to make these changes – changes they don’t agree with – because of the pressures created by the use of NAPLAN as a school accountability measure. In other words, in one clever turn of phrase, ACARA rules out of court any unintended consequences of NAPLAN that relate to changes in teachers’ practice.

ACARA also dismisses a survey undertaken by the Whitlam Institute because it “suffers from technical and methodological limitations, especially in relation to representativeness of the samples used”, and rejects its conclusions without detailing its findings. Yet this survey was completed by nearly 8,500 teachers throughout Australia and was representative in every way (year level taught, sector, gender, years of experience) except for oversampling in Queensland and Tasmania – and the writers reported that they weighted the responses to compensate for this sampling differential. The report documents many unintended consequences that ACARA is now saying are unsubstantiated, on the basis of a spurious sampling critique. This is intellectually dishonest at best.

ACARA’s dismissal on spurious grounds of all the findings of these two research projects, while refraining from stating what those findings are, stands in stark contrast to its treatment of unsupported statements about enrolment-selection concerns made by stakeholders who are far from impartial.

In response to claims that some schools are using NAPLAN results to screen out ‘undesirable students’, ACARA states that it is aware of these claims, but appears willing to take at face value comments from stakeholders who represent the very schools accused of unethical enrolment screening:

It is ACARA’s understanding that these results are generally requested as one of a number of reports, including student reports written by teachers, as a means to inform the enrolling school on the strengths and weaknesses of the student. The purpose for doing so is to ensure sufficient support for the student upon enrolment, rather than for use as a selection tool. This understanding is supported by media reporting of comments made by peak bodies on the subject (my emphasis).

ACARA’s approach to this whole matter comes across as most unprofessional. But, unfortunately for ACARA, this is not the whole story. There is a history to this issue that began in 2008, almost as soon as the decision to set up MySchool was announced by the then PM, Kevin Rudd, as part of his school transparency agenda.

Three years ago this month I wrote an article[2] about the importance of evaluating the impact of the MySchool website and the emergence, under FOI, of an agreement in September 2008 by State and Commonwealth Ministers of Education to:

…commence work on a comprehensive evaluation strategy for implementation, at the outset, that will allow early identification and management of any unintended and adverse consequences that result from the introduction of new national reporting arrangements for schools(my emphasis).

It was clear from the outset that this evaluation should have been managed by ACARA, the organisation established to manage the school transparency agenda. In 2010, in response to my inquiry to ACARA about this Ministerial agreement, the CEO of ACARA stated that it was not being implemented at that point in time because early reactions to ‘hot button’ issues are not useful, and because the website did not yet include the full range of data planned.

This was a poor response for three reasons.

Firstly, well-designed evaluations are built in as part of the development process, not as afterthoughts. One of the vital elements of any useful evaluation is the collection of baseline data to enable valid comparisons of any changes over time. For example, information could have been collected prior to the MySchool rollout on matters such as:

  • Has time allocated to non-test based subjects reduced over time?
  • Has teaching become more fact based?
  • Has the parent demographic for different schools changed as a result of NAPLAN data or student demographic data?
  • Are more resources allocated to remedial support for students who fail to reach benchmarks?
  • Are the impacts different for different types of schools?

Secondly, the commitment to evaluate was driven by concerns about the possibility of schools being shamed through media responses to NAPLAN results, narrowing of curriculum and teaching, further residualisation of public schools, test cheating, and possible negative effects on students and teachers. Identifying these problems early would have allowed the design elements of MySchool to be revised to mitigate the impacts in a timely fashion. There is no real value in waiting years before deciding that corrections are needed.

Thirdly, anyone who seriously believed that the data elements agreed as possibly in scope for MySchool constituted a complete list, able to be developed quickly, was dreaming. Waiting for the full range of data meant, in reality, an indefinite delay. There are still data items in development today.

So now, five years on from the Ministerial directive that any unintended consequences be actively investigated, there is still no comprehensive evaluation in sight. One suspects that ACARA finds this quite convenient and hopes that its failure to act on the directive stays buried.

However, Ministers of Education still had concerns. The MCEECDYA Communiqué of April 2012 reports the following:

Ministers discussed concerns over practices such as excessive test preparation and the potential narrowing of the curriculum as a result of the publication of NAPLAN data on My School.  Ministers requested that ACARA provide the Standing Council with an assessment of these matters.

On the basis of this statement, I wrote to ACARA on 27 April 2013 requesting information on action in response to this directive – by then over 12 months old. To date I have received no reply.

So what sense can be made of this?

If one takes at face value ACARA’s statements that it knows of no information regarding the extent of unintended consequences, one can only conclude that ACARA has twice failed to act on a Ministerial directive.

Here we have a government body: aware of Ministers’ concerns about unintended negative consequences of a program it manages; aware of widespread anecdotal concerns, some of them quite serious; dismissing without any proper argument the few pieces of evidence that do exist; and refusing to undertake any investigation into the matter despite two Ministerial directives to do so.

Wilful ignorance about the potential unintended and harmful impacts of a program for which an agency is responsible, while all the while professing a strong interest in the matter, is highly irresponsible and unprofessional.

It is also quite astonishing given the Government’s commitment to the principle of transparency and the fact that ACARA was established specifically to bring that transparency and reporting to Australia’s schools.

But to then write a submission that almost boasts about the lack of information on this issue, while dismissing with poor arguments the evidence that is growing, is outrageous. It also gives new meaning to a throwaway line in its submission about the negative findings of the Whitlam Institute survey: “Further research with parents, students and other stakeholders is needed to confirm the survey results on well-being.”

Further research is indeed needed, and it should have been initiated by ACARA quite some time ago – five years ago, to be precise. It is convenient for ACARA that such research is not available. It is intellectually dishonest and misleading for ACARA to now state that it “takes seriously reports of unintended adverse consequences in schools. It actively monitors reports and research for indications of new issues or trends.”

Of course, there is another, more alarming, possibility: that this work has been undertaken but is not being made public, and that ACARA is misleading the Parliamentary Inquiry and the public by denying that any such information exists.

In either case, I am forced to conclude that, in spite of its ‘interest in this issue’, ACARA does not want any unintended consequences of a program for which it is responsible to be known, and is persisting in its position of wilful ignorance.

In an effort to restore public confidence in its work, ACARA should commit to undertaking this research at arm’s length, using independent researchers, and to reporting the findings to Parliament without delay. Perhaps this Inquiry could recommend this.

Evaluating MySchool – We are still waiting, ACARA

Note: Three years ago this month, in June 2010, I wrote about the education Ministers’ agreement to evaluate MySchool in order to identify and mitigate unintended consequences. There is still no such evaluation, nor any commitment by ACARA to undertake one. The article is republished below as a backgrounder to my next post.

At last it is official.  Well before the launch of the MySchool website, state and federal education ministers agreed to task an expert working group to ‘commence work on a comprehensive evaluation strategy for implementation at the outset, that will allow early identification and management of any unintended and adverse consequences that result from the introduction of new national reporting arrangements for schools’, according to minutes of the ministers’ meeting held in Melbourne on 12 September 2008.

It appears, however, that the decision was never implemented and responsibility for it has transferred to the Australian Curriculum, Assessment and Reporting Authority (ACARA).

Now that the initial hype surrounding the launch of the MySchool website is over and the education community is settling down for a more considered debate on the opportunities and challenges of MySchool, I thought it might be useful to put forward some thoughts about what should be in scope for this MySchool evaluation.


My understanding of the ACARA position is that it is too early to make an evaluation, because early reactions to a ‘hot button issue’ are not an accurate reflection of the longer term impacts, and because the MySchool website does not yet have the full range of information that is planned.

I have to agree that at this stage in the rollout of the full MySchool concept, we have an unbalanced set of data.  As data on parent perceptions, school level funding inputs and hopefully data on the quality of school service inputs (like the average teaching experience of the staff and their yearly uptake of professional learning) are added to MySchool, the focus of schools and systems might well shift.  I, for one, hope so.  If extra information like this doesn’t shine a bright light on the comparative level of the quality of the schooling experience for high-needs students relative to advantaged students, this will be a lost opportunity.

So yes, the effects of MySchool might change over time.  But this doesn’t mean we should wait to evaluate what is happening.  In fact, I think we have already missed the best starting point.  A good evaluation would have included a baseline assessment report, something that would have told us what sort of accountability pressures schools were already experiencing for good or ill prior to the introduction of MySchool.

For several years, education systems had already had access to data from the National Assessment Program – Literacy and Numeracy (NAPLAN) down to the school, class and even student level.  In most government systems, NAPLAN data was but one small component of a rich set of data used by schools to review their strategic plans, identify improvement priorities, set improvement targets and develop whole-school improvement plans.  They were already using parent and student surveys; student data on learning progress, attendance, post-school destinations, expulsion and discipline; and teacher data on absenteeism, turnover, professional learning and performance reviews.  At the same time, most education systems had access to this same information and were using it to prioritise schools in need of external reviews or other forms of intervention as part of system-level school-improvement strategies.

Contrary to popular belief, the introduction of MySchool did not commence a new regime of school accountability but it did both broaden it out and narrow it down.  It broadened it out to include parents and the community as arbiters of a school’s performance and applied this public accountability to independent as well as systemic schools. It narrowed it down by using just a tiny amount of data, and that has had the effect of privileging the NAPLAN test results.


Even before the launch of MySchool, systemic schools such as government and some Catholic schools were already feeling some degree of pressure from their systems about student performance in NAPLAN.  For example, in the Northern Territory, this had already given rise to a strong push from the education department to turn around the high level of non-participation of students in NAPLAN testing – an initiative that was highly successful.

A baseline study could have taken a reading on the extent that schools and systems were responding in educationally positive ways in the previously established accountability regimes and whether there were already unintended impacts.

Without a baseline assessment we are left with only anecdotal evidence of the effect of and reaction to MySchool, and responses vary widely.  I recall hearing a more outspoken principal in a very remote school saying something along these lines:

If I have to go to one more workshop about how to drill down into the NAPLAN data to identify areas of weakness and how to find which students need withdrawal based special support, I will scream.  All areas require focused intervention at my school and all our students require specialised withdrawal support.  We don’t need NAPLAN to tell us that.  What we need is …

I suspect that for schools where the NAPLAN outcomes were poorest, there was already a degree of being ‘over it’, even before the advent of MySchool.  The challenge for the most struggling schools is not about ‘not knowing’ that student progress is comparatively poor.  It is about knowing what can be done.  For years, struggling schools have been bombarded with a steady stream of great new ideas, magic bullets and poorly funded quick fixes that have all been tried before.  Their challenge is about getting a consistent level of the right sort of support over a sustained period of time to ‘crash through’.  What has MySchool done for schools with this profile?  And more significantly, what has it done for their school communities?

On the other hand, I have heard anecdotally from teachers in the Australian Capital Territory that the launch of the site has drawn some comments to the effect that ‘we probably needed a bit of a shake up’ or ‘perhaps we have been a bit too complacent’.  Will the effect of MySchool be different depending on school demography and schools’ previously experienced accountability pressures?

There is evidence from the United States that this is likely to be the case.  Linda Darling-Hammond’s new book The Flat World and Education makes the point that the negative impacts of accountability driven by multiple-choice tests are greater when the stakes are higher but are also greater in schools serving high-needs communities.  For schools in high-needs communities, the stakes are higher than for comparatively advantaged schools precisely because their results are lower and their options fewer.

Darling-Hammond’s research suggests that this unequal impact results in more negative consequences for schools serving high-needs communities on three fronts.  The first relates to the impact on student school engagement and participation.  The more pressure on a school to perform well in national testing, the more likely it will be that subtle and not so subtle pressures flow to the highest-needs students not to participate in schooling or testing.  She documents unequal patterns with high-needs schools experiencing very marked increases in student suspensions, drop-outs, grade repeating and non-test attendance as a result of accountability measures driven largely by tests.

The second relates to the stability, quality and consistency of teaching.  Darling-Hammond notes that the higher the stakes, the more the negative impact on the stability and quality of teaching.  Her book documents decreases in the average experience level of the teachers in high-needs schools after the introduction of the US No Child Left Behind program of accountability.  Ironically, some of the systemic responses to failing schools exacerbate this, as systems frequently respond by funding additional program support for at-risk students, English language learners and special-needs students.  These programs almost always engage non-teaching support staff.  The practical impact of this is that the students with the highest needs get even less access to a qualified teacher during their classroom learning time.

The third relates to the quality of the teaching and learning itself.  This is the topic that has most occupied the Australian debate on MySchool.  There has been no shortage of articles in the media predicting or reporting that MySchool will result, or has resulted, in a narrowing of the curriculum, an increase in the teaching of disconnected facts or rote learning, and classroom time being devoted to test preparation at the expense of rich learning.  In addition, there are websites surveying changes at the school level in terms of what gets taught and how it gets taught.  Less commonly acknowledged is the overseas experience that suggests the quality of teaching and learning is most negatively affected in schools serving high-needs communities.

These differentiated impacts need to be identified and addressed because, if they are not, differences in educational outcomes between high-needs schools and advantaged schools are likely to increase.  The impact of parents choosing to change schools, sometimes on the basis of quite spurious information, will also be felt more in lower socioeconomic communities.  Increased residualisation of government schools leads to higher concentrations of students of lower socioeconomic status, and this has a negative effect on all the students of those residualised schools.  Of course, the schools that struggle the most may ironically be exempt from this kind of adverse pressure, because many families in these communities do not really have the choice of sending children anywhere other than the local government school.

Are there other unintended consequences worth investigating?  The potential impact of the Index of Community Socio-Educational Advantage (ICSEA) values being used to set up a notion of ‘like schools’, for example, deserves further attention.

The ICSEA assigns a value to each school based on the socioeconomic characteristics of the areas where its students live, as well as whether the school is in a regional or remote area and the proportion of Indigenous students enrolled at the school.  The development and use of ICSEA is a complex matter that deserves a separate article.  Here I am just looking at what an evaluation should focus on around ICSEA being used as part of the MySchool website.  For all its faults, in very broad terms it tells us how advantaged a school is (a high score) or disadvantaged it is (a low score).

I must admit that when the concept of reporting by school demographic type was first mooted, I got a little excited.  I took for granted that this would lead to a focus on the fact that, in Australia, a student’s chance of success at school is still fundamentally influenced by the student’s level of disadvantage and the relative disadvantage of the school she or he attends. I thought, at last, we had a chance to expose the fact that Australia has not managed to break the ‘demography as destiny’ cycle.  I took for granted that this would lead to a focus on this outrage and lead to a renewed commitment to addressing it.  I also thought that the ICSEA had the potential to become a framework for a more focused needs-based funding approach than the current state approaches to this.

I was wrong.  Since the launch of MySchool in January,[1] I have googled and googled and found a lot of reporting about the way in which the tool, in its first iteration, has led to some very strange school groupings, but, with the exception of an excellent article by Justine Ferrari in The Australian, almost nothing about the link between poverty, disadvantage and NAPLAN results.

This initially surprised me – but then I was reminded that John Hattie, in his recent book, Visible Learning, makes the point, almost in passing, that the differences between student learning outcomes within schools are greater than the differences in learning outcomes between schools in developed countries.  He refers to interschool differences as trivial at best.  Maybe we in Australia assume that this is so for us.

The MySchool data tells a very different story, and anybody can verify this for themselves.  It shows that schools in the lowest ICSEA grouping (those with ICSEA values in the low 500s) that are assessed as doing better than their like-school peers (a dark green rating) have NAPLAN scores for their Year 9 students that are below the NAPLAN scores of Year 3 students in schools serving advantaged communities (those with ICSEA values around 1,100).  In other words, schools where Year 9 students are achieving average Year 3 results are rated as doing well.

I had assumed the ICSEA would be a tool to shine a light on the issue of systemic educational disadvantage, but I have come to understand that it may well be inadvertently taking attention away from this issue.  The structure of the website precludes easy comparisons (for good reasons) but also draws web users’ attention to comparisons between like schools, not unlike schools.

If the combination of the limited searchability of the MySchool website and the complexity of the ICSEA concept has unintentionally led to an unspoken legitimising of ‘demography as destiny’, this is a serious concern that an evaluation would pick up.  Will schools start to say, ‘we are doing pretty well considering the community we serve’?  Have we inadvertently naturalised different outcomes for different groups of students?

In our efforts to minimise the shaming of high-needs schools and their communities, we need to ensure that we have not taken away the shame that all Australians ought to feel about our failure to break the vicious cycle of family poverty, and about the failure of the institution of education to systemically and sustainably disrupt this link.

This is not a failure that can be laid at the feet of the individual classroom teacher or school principal.  It is a systemic failure.  If we insist on putting pressure solely on individual schools to do the ‘educational heavy lifting’, our revolution will fail.  If we believe that encouraging parents to vote with their feet for the best school is enough, without first guaranteeing that every choice can be a high-quality choice with equal opportunity to learn, we will also fail.  Ideally, the evaluation should include in scope not just unintended impacts but the explicitly intended effects too.

While there is no space to embark on this discussion here, I would like to end with a quote from Linda Darling-Hammond’s The Flat World and Education that sums up most eloquently the kinds of changes that education transparency and accountability frameworks ought to drive:

Creating schools that enable all children to learn requires the creation of systems that enable all educators and schools to learn.  At heart, this is a capacity-building enterprise leveraged by clear, meaningful learning goals and intelligent, reciprocal accountability systems that guarantee skilful teaching in well-designed schools for all learners.


Publication details for the original article: Margaret Clark, ‘Evaluating MySchool’, Professional Educator, Volume 9, Number 2, June 2010, pp 7–11.

Darling-Hammond, L  (2010) The Flat World and Education: How America’s commitment to equity will determine our future.  New York: Teachers College Press.

Ferrari, J. (2010, May 1) On the Honour Roll: the nation’s top schools.  The Australian Inquirer, p5.

Hattie, J. (2009) Visible Learning: A synthesis of over 800 meta-analyses relating to achievement.  London, New York: Routledge.

Patty, A. (2010, 30 March). Evaluation of MySchool pushed aside say critics.  Sydney Morning Herald, p2.

[1] Of course, the research undertaken by Richard Teese, and also by Chris Bonnor and Bernie Shepherd, since this date has achieved this.  We no longer hear the line that the biggest differences are between classes in the same schools in Australia.  Thanks to the work of Teese, Bonnor and Shepherd, this myth has been busted.

A Small Step for Government But a Giant Step for Remote Indigenous Children: NAPLAN and Indigenous Learners

The current review of NAPLAN is the second Parliamentary Inquiry into the use of NAPLAN data in Australia.  If it goes the way of the first inquiry then little change can be expected.

When the previous Inquiry was initiated, I was working for the Australian College of Educators.  We put a huge amount of effort into our submission.  It went through a member consultation process, was submitted and then sank like a stone.  Indeed, even the website that hosted all the submissions appears to have disappeared.  Nothing much came of it, as can be expected when an issue has become politicised.

I am much less optimistic about what can be achieved this time round.  My ideal outcome is unrealistic: the Inquiry will not lead to a change in emphasis from testing and measuring to supporting and building capability, no matter how much the evidence supports such a change.

However we can and should advocate to address the most egregious problems and unintended consequences associated with NAPLAN.  This is our chance to highlight them.

For this reason I was very excited to see that Submission No. 71 to the Inquiry comes from Leonard Freeman, the Principal of Yirrkala School in the remote northeast of the Northern Territory.

Yirrkala School is a K–12 very remote school quite near the mining township of Nhulunbuy on the Gove Peninsula.  It serves a discrete, remote Indigenous community on Aboriginal land, and 100% of the students who attend are Indigenous.  According to MySchool, 97% of the students at the school have a Language Background Other Than English (known as LBOTE).

Now, it is easy to underestimate the significance of this language background issue in Indigenous contexts.  Children who grow up in a remote Indigenous community where their mother tongue is still alive and thriving are, in every sense of the word, still residing in a non-English-speaking country.  They arrive at school with almost no experience of hearing English spoken.  They don’t hear it at home, around town, on their radio station, in local stores, at the health centre, or at social and cultural events.

LBOTE is a broad category and very unhelpful for understanding language challenges and issues.  Children and their families can be quite fluent in English, but if they speak a language other than English in their home they are still classified as LBOTE.  Most LBOTE children who have very little or no English are recent arrivals from a non-English-speaking country.  They might reside in suburbs where English is not the dominant home language, and for the first school year attend an intensive English language unit, but English is still heard around them – in playgrounds, health centres, playgroups, libraries, on radio and TV, and in the school playground and classrooms.  They are, at some level, immersed in a culture where English is heard.

Children at Yirrkala can grow up hearing almost no English spoken.  When they get to school, their classes are in their first language for the first few years – in fact right up to Year 3, when teaching in English is gradually introduced.  (In spite of the NT Government’s poor decision to change this, Yirrkala maintained this policy.)

So what does Leonard Freeman have to say about NAPLAN?

He argues that while there is a perception that NAPLAN is a fair test, it is anything but.

[NAPLAN] is a testing scheme that seems as fair as it could possibly be – all students sit the same tests and the marking is conducted by an independent central body. However, this perception of fairness is a thin veil that covers a system that disadvantages students who speak English as a Second Language.

There are a number of issues wrapped up in this notion of unfairness.

Firstly, the NAPLAN exemption criteria do not give adequate consideration to the English language development of Indigenous children living in non-English-speaking discrete Indigenous communities.

Most Australian educators assume that students who speak little or no English can be identified by the category “newly arrived from a non-English background country”.  In fact, when I worked in education in the NT I found that I had to constantly remind education administrators at national meetings that their proxy use of newly arrived non-English speaking migrants leaves out Indigenous children with identical or even greater challenges.

Nationally two per cent of Australian children are exempt from sitting the NAPLAN test. Students can be exempted from one or more NAPLAN tests if they have significant or complex disability, or if they are from a non-English-speaking background and arrived in Australia less than one year before the tests. 

So, in practice, almost all other children who have as little English language competence as Year 3 and even Year 5 remote Indigenous children from communities like Yirrkala are exempt from NAPLAN.  Yet no children at Yirrkala were identified as exempt from NAPLAN testing.

This leads to the ridiculous situation where remote Indigenous children with almost no exposure to the English language, especially in its written form, “must take the test under the same conditions as students who speak English as their first language and have their results evaluated in terms of the ‘typical achievement’ standards of mother tongue English speakers.”

Now, one of the reasons why education institutions and administrators resort to the category ‘recently arrived migrant from a non-English-speaking country’ as a proxy for children who do not yet have a sufficient grasp of Standard Australian English is that we don’t have sensible data on this matter.  We have data on children who have a language background other than English, but this tells us nothing about their level of competence with written English.

This Inquiry could secure bipartisan support to fix this matter – it is not a politicised, hot issue.  It is about applying definitions of technically relevant matters in an inclusive and fair manner.  Children in Years 3 and 5 who reside in communities where Standard Australian English is not spoken could be exempted from NAPLAN until their English language learning enables them to read English to a defined level.

Secondly, NAPLAN is not a culturally fair test and this further discriminates against remote Indigenous children.

Back again to Leonard Freeman:

….NAPLAN reading tests assess students’ reading ability by judging their answers to multiple choice questions which ask often complex questions about the reading material.

He provides the following example of a multiple-choice item in a Year 3 reading test:

‘But I feel funny about saying I own him’. What reason does Tim give for feeling this way?

a) Elvis is really Malcolm’s dog.

b) Tim thinks dogs cannot be owned.

c) All the family helps to look after Elvis.

d) Elvis is much older than Tim in dog years. 

It is pretty obvious that there is a great deal of non-accessible cultural knowledge required to eliminate supposedly irrelevant answers for this item.

These sorts of questions do not simply assess whether the student can read the reading material and basically comprehend the story; they go well beyond that.  A Year 3 student from a remote Indigenous community who is still trying to master spoken English and western cultural norms would find a question like this very difficult to answer.  The assumed knowledge – about dogs’ ages being measured in ‘dog years’, the use of the word ‘funny’ to mean uncomfortable rather than humorous, and the concept of questioning the definition of ownership – would all be unfamiliar to a child growing up in a remote Indigenous setting.

The NAPLAN reading test actually tests analytical skills which are coated heavily in western cultural norms. 

Another example provided by Freeman of an item around the delivery of newspapers provides further insights into cultural inaccessibility.

The story begins with the householder complaining to the newspaper boy ‘you left the paper jutting out of the back of my box’ and we also learn the owner had previously complained the paper needs to be left ‘in line with the fence’. This question was designed to test whether students could infer the meaning of new words and constructions. Yet to do so the students need to be familiar with the cultural context, in this case the students need to know that houses have a box on their fence line where mail and newspaper deliveries are left.  If the student has grown up in a remote community or refugee camp where there are no letter boxes and few houses have fences they will not be able to access the meaning of the text. 

Thirdly, the lack of fit between the NAPLAN tests and the kinds of assessments needed to effectively support teachers in these challenging contexts leaves teachers unsupported and undermined.

Now, it would be reasonable to expect that the NT Department of Education would be fully cognizant of these circumstances and would make it its business to ensure that the unintended consequences of this situation were addressed, or at least mitigated.  Sadly, when I worked in the NT I found that this was not the case.  And scrolling through the department’s website today, I found that nothing much had changed.  There is now an acknowledgement that Indigenous children are English language learners, but what this means in terms of resourcing is minimal, and what it means for teachers across remote schools appears to be completely ignored.

The absurdity of this is best illustrated by the following personal experience of a professional development event.

This event took place at a beautiful new resort in the remote mining community on Groote Eylandt.  The attendees at this session were principals and a group of their nominated teachers from schools in remote Indigenous communities.

The aim of the three-day session was to ‘teach’ the attendees – all from remote Indigenous schools – how to drill down into their schools’ test results and develop not just a strategic but a tactical response to what they found.  It was a highly structured event.  First of all, the groups were given a spreadsheet showing their NAPLAN results for all year levels and all tested areas, and a detailed worksheet to work through.

I sat next to a school team from a large school in Eastern Arnhem, similar in key features to Yirrkala.  It was also a school that ran a bilingual program, which meant that all students in Year 3 and almost all in Year 5 could not yet read in English – even at a basic level.

This school had NAPLAN results that were marginally worse than those of the other schools represented.  At this school, in almost every subject and almost every year level, 80–100% of the students scored zero – that is, they did not get one answer right, not one.  Some classes in some schools had a small minority of students who did receive a higher score – a few even approaching the relevant band for their year – but they were a tiny minority.

The professional development session required the teachers to group their students by what they did not know.  For example: how many students did not understand the convention of the full stop?  Put a ring around those students.  The teachers next to me sighed and ringed the whole class.  And it went on like this for three whole days.  It was idiotic and devastating.

These teachers went back to their school not just demoralised but with decontextualised lesson plans on full stops, the sound ‘ch’, prime numbers and so on.

I tell this story because it is an extreme example of just how stupid it is for people to invent prescriptive solutions that must be rolled out across all schools, with no exception.

There is no doubt that this is damaging for teachers in remote schools.  It was political exposure of the poor NAPLAN results that forced Marion Scrymgour to preemptively abolish the struggling, underfunded bilingual program – something she later came to regret, for good reason.

Leonard Freeman sees the NT Department’s priorities and the experiences and struggles of remote teachers as being on a collision course:

The NT government made a commitment to having 75% of NT students meet the Year 3 NAPLAN benchmarks and teaching programs are aimed at achieving this. The amount of English instruction is being increased under the misguided belief that elevating the focus and importance of English will yield better English results. 

The inclusion of ESL students in NAPLAN testing places ESL researchers, specialist ESL teachers and classroom teachers in a conflict between the principles of effective ESL teaching and assessment practices and the requirements of governments and education departments. Instead of working together to attain the best educational outcomes for students, researchers, policy makers, teachers and governments are locked in a fundamental disagreement between meeting the needs of ESL students and the administrative and political advantages of a uniform testing regime.

 One of the perverse consequences of this is that programs which claim to accelerate English literacy or which are aimed at native English speakers are now favoured ahead of academically sound ESL programs which demonstrate the genuine progression of ESL students.

It has also led to effective and evidence-based programs, such as the Step Model bilingual program, being shut down, to the detriment of Indigenous ESL students.

Now, some readers may be thinking that I am arguing for lower expectations for remote Indigenous children.  This is not my message.  These children are exposed to English in school for the first time, and it is often their third or fourth language.

We exempt newly arrived LBOTE children ‘down south’ not because we expect less of them but because we recognize that their learning journey has to include an additional learning pathway.  But we do not expect less of them in the long run.

Back to Freeman again:

… an ESL approach is not a lesser approach. It is aimed at getting students who are speakers of other languages to a point where they can access mainstream education. A program may be deemed ineffective if ESL students never reach age-grade performance, but ESL programs that successfully move students along the continuum at a rate that is acceptable based on the research should be regarded as valid and ideal for ESL learners.

It is important to recognise the research which shows that it takes a minimum of 4 years, and more like 7 years, to achieve proficiency in a second language to a level that is sufficient for academic pursuits. The efficacy of ESL programs should be judged against the backdrop of this research.

So what are the small steps governments could take in order to stop getting in the way of effective education for remote Indigenous children?

  1. Stop NAPLAN testing for remote Indigenous children until Year 7.
  2. In the meantime, agree on an alternative form of testing[1] that is more appropriate for ESL students in terms of cultural content and recognition of ESL learning stages.
  3. Address cultural bias in NAPLAN testing so that when remote Indigenous students are linguistically ready to sit the tests they can understand what is being asked of them.
  4. Develop a nationally agreed English Language Learner Scale (ELLS) to replace LBOTE as a student category, so there is a fair and consistent way to measure disadvantage based on English language learning needs.

[1] The ACER Longitudinal Literacy and Numeracy Study for Indigenous Students (LLANS) test has been trialled in all states and territories with both Indigenous and non-Indigenous students. Researchers have now aligned the LLANS test results to the NAPLAN score scales. So it would be possible for ESL students in the primary years to be given an appropriate test which can give a much clearer indication of their actual literacy and numeracy skills.