Anti-ed-reformers make the case for PISA

In this article from the tireless Valerie Strauss of the Washington Post, Pasi Sahlberg and Andy Hargreaves respond to the open letter, signed by dozens of researchers and academics from around the world, to Andreas Schleicher, director of the Programme for International Student Assessment, urging him to suspend administration of PISA until a new exam can be created.

The Sahlberg and Hargreaves argument rests on the following:

1. Ignorance is not great and would have given corporate ed reformers an even easier ride

Just think for a moment: what would global education look like if PISA had never been launched? There would be, as there was in the 1990s, a number of countries that mistakenly believed their education systems were the best in the world and should set the direction for other nations. Were it not for the fact that these weaker performing countries, which include the United States and England, have not been successful in PISA, the worldwide pressures for more market competition between schools, less university-based training for teachers, and more standardization of the curriculum would have had a far easier ride.

The poor performance of Sweden after the implementation of its radical market-choice program of for-profit free schools would never have been outed. Likewise, the high performance of public-education-focussed Finland and, to a lesser extent, Canada would not have provided a very strong counter-narrative.

2. PISA has enabled the OECD to shine a bright light on equity and to argue that equity and quality are not at odds:

It has put equity high up on the reform agenda. Without the data that PISA has generated over the years, calls for enhanced equity would not be part of the education policy conversation in the countries that have suffered from inequitable education systems, including the U.S. [and Australia].

However, the authors do not let PISA off the hook on the many other issues raised by the group of academics and researchers. In particular, they raise serious concerns about the recent steps to put the tests into the hands of the global corporate ed reformer Pearson.

They conclude that a) the evidence provided by PISA is overwhelming and clear on the negatives of neoliberal education policies, and b) the negatives of PISA can be addressed by dealing with its problems, not by “knocking the PISA tower over”:

What PISA shows to the United States is that its current course of education policies that rely on competition, standardization, testing and privatization of public education is a wrong way. Our goal should not be to take PISA down, but to get it or something like it upright again, so that by using a range of criteria, and by using them in a fair and transparent way, we can identify and learn from the true high performers who are strong on equity as well as excellence, and on human development as well as tested achievement.

What do readers think?

What would your school do with Gonski money?

Throwing more money at schools isn’t the answer, yells Dr Scott Prasser. This is the man who has defended every red cent that the Government has allocated to Catholic schools – even for the 50% of them that were overfunded after the SES model was introduced and their funding levels were grandfathered.

The Gonski Review Panel did not address the question of how the additional funds so sorely needed by public and needy schools in Australia should be used, because this was outside its terms of reference.

But it is an important question. Glen Fowler, in an article in the Canberra Times, How Money Makes a Difference, tells the story of Richardson Primary School – one of a very small number of disadvantaged ACT schools – and how it managed its Low SES National Partnership funds to improve learning outcomes for its children. Here is what they did:

Richardson Primary started by enhancing its capacity to gather and analyse data about how their students were performing. They purchased licences from the Australian Council for Educational Research (ACER) to administer annual internal tests in literacy and numeracy at all year levels. That way, they didn’t have to wait for NAPLAN results. They had up-to-date information about where students were falling behind and needed extra support.

Drawing on hard data that indicated students were struggling with vocabulary development and reading comprehension, the school set about enhancing teacher capacity to address these issues. Every staff member attended a five-day intensive course in Dr Spencer Kagan’s high-impact collaborative learning strategy. Kagan’s approach aims to engage every student, especially those who are struggling, by structuring activities so that students feel individual and collective responsibility for their learning.

Additionally, every teacher attended a two-day seminar with educational expert, Dr Dylan Wiliam, on using his formative assessment strategies to enrich each student’s learning journey.

The school also purchased teacher and classroom resources to complement structured and supported teacher-learning teams that ensure effective school-wide implementation of these key strategies. This razor-sharp focus on improving instructional practice through collaboration and reflection has led to more confident and skilful educators, adept at engaging every learner every moment of the learning process.

The school’s final strategy was to build community partnerships. Working with the YWCA of Canberra, the school established an Intel Computer Clubhouse for 10-18-year-olds in the area. The Clubhouse is an out-of-school-hours high-tech digital studio where young people can work with industry-standard hardware and software and collaborate with mentors on passion projects.

This is an interesting set of initiatives for a number of reasons.

Firstly, this school understands that, particularly for students who are not achieving agreed benchmarks in reading, NAPLAN test results come too late. Fowler doesn’t state it, but I am sure the school also understands that NAPLAN does not provide useful information about this group of learners: it is too narrow and not diagnostic in design. It is interesting to note that, after using these more diagnostic assessments, the school found that the real barriers to reading development were vocabulary and reading comprehension. These are of course closely linked, but neither is well tested by NAPLAN.

Secondly, the professional development focus was cooperative learning in mixed-ability groups, using a well-researched, evidence-based approach. Cooperative learning has a long history in education, but there is a big difference between a few teachers across a school taking this approach and a well-prepared, well-trained school adopting it en masse. It’s worth noting that recent research has identified a growing trend for schools to adopt streaming in their classrooms – not because it is well researched, but because it makes teaching easier when NAPLAN content is the key organiser.

Thirdly, basing classroom learning experiences on information from formative assessment allows learning to be personalised and ensures that the time spent on learning is both accessible and challenging.

Finally, there are things about Richardson Primary School that are not mentioned in this report but that matter a lot. First of all, the principal is Jason Borton, who is known to many Twitter-active educators as a wise, brave and outspoken leader on key education issues. High-quality leadership for low-SES schools is critical, and systems should be investing in strategies to ensure it. Secondly, I don’t know how, but Richardson Primary has managed to keep relatively small class sizes – 19 at most in all years except kindergarten, where the ratio is 16:1. Don’t let anyone kid you that the size of the class does not matter.

So there you have it: this school has not wasted a cent on extra resources to drill down on NAPLAN, or on fancy new learning packages aligned with it. In fact they appear to have completely ignored it – and rightly so, in my view.

Instead, Richardson Primary is well placed to support all its children through high-quality leadership, a whole-school focus on well-evidenced pedagogical strategies, intelligent and focussed use of formative and diagnostic assessments across the school, and a workable student–teacher ratio in the classroom. I don’t know how this school will adapt to the highly financially constrained environment it will find itself in if the full six years of Gonski are not agreed to, but it won’t be good, and students will be negatively affected.

It would be interesting to collect accounts of what other schools are currently doing that will need to stop. I do hope someone is doing this.

The NAPLAN Parliamentary Review’s ‘do nothing’ recommendations: We can do better

Many of us waited with a degree of eagerness – even excitement – for the release of the Parliamentary Inquiry Report into NAPLAN (Effectiveness of the National Assessment Program – Literacy and Numeracy Report). But what a disappointment!

While it makes a passable fist of identifying many, though by no means all, of the significant issues with how NAPLAN is currently administered and reported, it misses some important details. This could be forgiven if the recommendations showed any evidence of careful thinking, vision or courage. But in my assessment they are trivial and essentially meaningless.

We know a lot about the problems with our current approach to standardised testing and reporting. This Report, with the help of over 95 submissions from a wide range of sources, manages to acknowledge many of them. The key problems include:

  • it is not valid and reliable at the school level
  • it is not diagnostic
  • the test results take 5 months to be reported
  • it is totally unsuitable for remote Indigenous students – our most disadvantaged students – because it is not multilevel, not in the language they speak, and not culturally accessible (Freeman)
  • now that it has become a high stakes test it is having perverse impacts on teaching and learning
  • some of our most important teaching and learning goals are not reducible to multiple choice tests
  • there is a very real danger that it will be used to assess teacher performance – a task it is not at all suited to
  • some students are being harmed by this exercise
  • a few schools are using it to weed out ‘unsuitable enrolments’
  • school comparisons exacerbate the neoliberal choice narrative that has been so destructive to fair funding, desegregated schools and classrooms and equitable education outcomes
  • there will always be a risk of league tables
  • they have an unequal impact on high-needs schools
  • they do not feed into base funding formulas – in spite of the rhetoric about equity and funding prioritisation being a key driver for NAPLAN, it is not clear that any state uses NAPLAN to inform its base funding allocations to schools[1]

However, the ‘solutions’ put forward by the report are limited to the following recommendations:

  1. develop online testing to improve test-result turnaround – something that is happening anyway
  2. take into account the needs of students with a disability and of English language learners – a recommendation so vague as to be meaningless
  3. have ACARA closely monitor events to ensure that league tables are not developed and that the results feed into funding considerations – another vague do-nothing recommendation, and I am certain ACARA will say it is already doing this

This is a recommendation to do nothing – nothing that is not already being done, and nothing of meaningful substance.

As an example of the paucity of its analysis, I offer the following. The report notes the lack of diagnostic power of the NAPLAN tests and then says that, even if they were diagnostic, the results come too late to be useful. The report then argues, as its first and only strong recommendation, that there needs to be a quicker timeframe for making the results available. Did the writers not realise that this would still not make the tests useful as a diagnostic tool?

This Report, while noting the many problems, assumes they can be addressed through minor re-emphasis and adjustment – a steady-as-she-goes refresh. However, the problems identified in the Report suggest that tiny adjustments won’t address them. A paradigm change is required here.

We are so accustomed now to national standardised testing based on multiple-choice questions in a narrow band of subjects being ‘the way we do things’ that it seems our deliberations are simply incapable of imagining that there might be a better way.

To illustrate what I mean I would like to take you back to the 1990s in Australia – to the days when national standardised testing was first foisted on a very wary education community.

How many of us can remember the pre-national-testing days? Just in case, I will try to refresh your memory on some key elements and also provide a little of the ‘insider debate’ from before we adopted the NAPLAN tests.

1989 was the historic year when all Education Ministers signed up to a shared set of goals under the now defunct Hobart Declaration. Australia was also in the process of finalising its first ever national curriculum – a set of Profiles and Statements about what all Australian children should learn. This was an extensive process driven by an interstate committee headed by the then Director of School Education in NSW, Dr Ken Boston.

During this time, I worked in the mega agency created by John Dawkins, the Department of Employment, Education, Training and Youth Affairs, initially in the Secretariat for the Education Ministerial Council (then called the AEC) and a few years later heading up the Curriculum and Gender Equity Policy Unit.

The Education Division at that time was heavily engaged in discussions with ACER and the OECD about the development of global tests – the outcomes of which were PISA and a whole swag of other tests.

This was also when standardised testing was being talked about for Australian schools. Professor Cummings reminds us of this early period in her submission to the Parliamentary Inquiry when she says that

…through the Hobart Declaration in 1989 … ‘Ministers of Education agreed to a plan to map appropriate knowledge and skills for English literacy. These literacy goals included listening, speaking, reading and writing….

National literacy goals and sub-goals were also developed in the National Literacy (and Numeracy) Plan during the 1990s, including: …comprehensive assessment of all students by teachers as early as possible in the first years of schooling…to ensure that…literacy needs of all students are adequately addressed and to intervene as early as possible to address the needs of those students identified as at risk of not making adequate progress towards the national…literacy goals….use [of] rigorous State-based assessment procedures to assess students against the Year 3 benchmark for…reading, writing and spelling for 1998 onward.

It is interesting to note that the commitment to on-entry assessment of children by teachers referred to by Cummings did result in some work in each state, but it never received the policy focus, funding or attention it deserved – which I regard as a pity. The rigorous assessments at Year 3, however, grew in importance and momentum; the key consequence of this commitment was the Year 3 state-based assessment. Professor Cummings goes on to say that in order to achieve this goal – a laudable goal – the NAP was born: state-based at first, with very strict provisions against providing school-based data, and then eventually what we have today.

Professor Cummings may not have known that many of us working in education at the time did not believe that national multiple-choice standardised tests were the best or only answer, and that considerable work was undertaken to convince the then Education Minister, Kim Beazley, that there was a better way.

During this period, while national standardised literacy tests were being discussed in the media and behind closed doors at the education Ministers’ Council, the US-based Coalition of Essential Schools was developing authentic classroom teaching and learning activities that were also powerful diagnostic assessment exercises. Its stated goal was to build a data bank of these authentic assessment activities and to benchmark student progress against them across the US. Its long-term goal was to make available to schools across the US a database of benchmarked (that is, standardised) assessments, with support materials about how to use them as classroom lessons and how to use the results to a) diagnose a student’s learning, b) plan future learning experiences and c) compare their development against a US-wide standard of literacy development.

As the manager of the curriculum policy area, I followed these developments with great interest, as did a number of my colleagues inside and outside the Department. We saw the potential of these assessments to provide a much less controversial and less damaging way of meeting the Ministers’ need to show leadership in this area.

Our initiatives resulted in DEETYA agreeing to fund a trial to develop similar diagnostic, classroom-friendly literacy assessment units as the first part of this process. We planned to use these to demonstrate to decision makers that there was a better solution than standardised multiple-choice tests.

As a consequence I commenced working with Geoff Masters (then an assessment expert at ACER) and Sharan Burrow (who headed the Australian Education Union at the time), exploring the potential power of well-designed formative assessments, based on authentic classroom teaching formats, to identify those at risk of not successfully developing literacy skills.

Unfortunately we failed to head off a decision to opt for standardised tests. We failed for a number of reasons:

  • the issue moved too quickly,
  • the OECD testing process had created a degree of enthusiasm amongst the data crunchers, who had louder voices,
  • our proposal was more difficult to translate into three-word slogans or easy jargon,
  • multiple choice tests were cheaper.

At the time I thought these were the most important reasons. But looking back now, I can also see that our alternative proposal never had a chance because it relied on trusting teachers. Teachers had to teach the units and assess the students’ work. What was to stop them cheating and altering the results? Solutions could have been developed, but without the ICT developments we have access to today, they would have been cumbersome.

I often wonder what would have happened if we had initiated this project earlier and been more convincing. Could we have been ‘the Finland’ of education, proving that we can monitor children’s learning progress, identify students at risk early in their school lives, and prioritise funding based on need – all without the distorting effects of NAPLAN and MySchool?

We can’t go back in time but we can advocate for a bigger, bolder approach to addressing the significant problems associated with our current NAPLAN architecture. The parliamentary report failed us here but this should not stop us.

I have written this piece because I wanted us to imagine, for a moment, that it is possible to have more courageous, bold and educationally justifiable policy solutions around assessment than what we have now. The pedestrian “rearrange the deck-chairs” approach of this Report is just not good enough.

So here is my recommendation, and I put it out as a challenge to the many professional education bodies and teacher education institutions out there.

Set up a project as follows:

Identify a group of our most inspiring education leaders through a collaborative peer-nomination process. Ensure the group includes young and old teachers and principals, and teachers with significant experience in our most challenging schools, especially our remote Indigenous schools. Provide them with a group of expert critical friends – policy experts, testing and data experts, assessment and literacy experts – and ask them to:

  • Imagine there is no current assessment infrastructure
  • Devise an educationally defensible assessment architecture – taking a greenfields approach

I can almost guarantee that this working group would not invent NAPLAN and MySchool or anything like it, and we would be significantly better off.

We have dug ourselves into an educationally indefensible policy hole because we have allowed politicians and the media to drive key decisions. To my knowledge we have never established an expert group of educational practitioners, with access to specialist expertise, to develop better policy solutions in education. Why don’t we give it a try?

Any takers?

[1] I understand that NSW does use NAPLAN results to channel some additional funds to low-performing schools, but these are above-the-line payments.

A Small Step for Government But a Giant Step for Remote Indigenous Children: NAPLAN and Indigenous Learners

The current review of NAPLAN is the second Parliamentary Inquiry into the use of NAPLAN data in Australia.  If it goes the way of the first inquiry then little change can be expected.

When the previous Inquiry was initiated, I was working for the Australian College of Educators. We put a huge amount of effort into our submission. It went through a member consultation process, was submitted, and then sank like a stone. Indeed, even the website that hosted all the submissions appears to have disappeared. Nothing much came of it, as is to be expected when an issue has become politicised.

I am much less optimistic about what can be achieved this time round. My ideal outcome is unrealistic: the Inquiry will not lead to a change in emphasis from testing and measuring to supporting and building capability – no matter how much the evidence supports such a change.

However we can and should advocate to address the most egregious problems and unintended consequences associated with NAPLAN.  This is our chance to highlight them.

For this reason I was very excited to see that Submission No. 71 to the Inquiry comes from Leonard Freeman, the Principal of Yirrkala School in the remote northeast of the Northern Territory.

Yirrkala School is a K–12 very remote school quite near the mining township of Nhulunbuy on the Gove Peninsula. It serves a discrete, remote Indigenous community on Aboriginal land, and 100% of the students who attend are Indigenous. According to MySchool, 97% of the students at the school have a ‘Language Background Other Than English’ (LBOTE).

Now, it is easy to underestimate the significance of this language-background issue in Indigenous contexts. Children who grow up in a remote Indigenous community where their mother tongue is still alive and thriving are, in every sense of the word, still residing in a non-English-speaking country. They arrive at school with almost no experience of hearing English spoken. They don’t hear it at home, around town, on their radio station, in local stores, at the health centre, or at social and cultural events.

LBOTE is a broad category and very unhelpful for understanding language challenges and issues. Children and their families can be quite fluent in English, but if they speak a language other than English at home they are still classified as LBOTE. Most LBOTE children who have very little or no English are recent arrivals from a non-English-speaking country. They might reside in suburbs where English is not the dominant home language and, for their first school year, attend an intensive English language unit, but English is still heard around them – in playgrounds, health centres, playgroups, libraries, on radio and TV, and in the school playground and classrooms. They are, at some level, immersed in a culture where English is heard.

Children at Yirrkala can grow up hearing almost no English spoken. When they get to school, their classes are in their own language for the first few years – in fact right up to Year 3, when teaching in English is gradually introduced (in spite of the NT Government’s poor decision to change this, Yirrkala maintained the policy).

So what does Leonard Freeman have to say about NAPLAN?

He argues that while there is a perception that NAPLAN is a fair test, it is anything but.

[NAPLAN] is a testing scheme that seems as fair as it could possibly be – all students sit the same tests and the marking is conducted by an independent central body. However, this perception of fairness is a thin veil that covers a system that disadvantages students who speak English as a Second Language.

There are a number of issues wrapped up in this notion of unfairness.

Firstly, the NAPLAN exemption criteria do not give adequate consideration to the English language development of Indigenous children living in non-English-speaking discrete Indigenous communities.

Most Australian educators assume that students who speak little or no English can be identified by the category “newly arrived from a non-English-speaking country”. In fact, when I worked in education in the NT, I found that I had to constantly remind education administrators at national meetings that their proxy – newly arrived non-English-speaking migrants – leaves out Indigenous children with identical or even greater challenges.

Nationally two per cent of Australian children are exempt from sitting the NAPLAN test. Students can be exempted from one or more NAPLAN tests if they have significant or complex disability, or if they are from a non-English-speaking background and arrived in Australia less than one year before the tests. 

So, in fact, almost all other children who have as little English language competence as Year 3 and even Year 5 remote Indigenous children from communities like Yirrkala are exempt from NAPLAN. Yet no children at Yirrkala were identified as exempt from NAPLAN testing.

This leads to the ridiculous situation where remote Indigenous children with almost no exposure to the English language, especially in written form, “must take the test under the same conditions as students who speak English as their first language and have their results evaluated in terms of the ‘typical achievement’ standards of mother tongue English speakers.”

Now, one of the reasons why education institutions and administrators resort to the category ‘recently arrived migrant from a non-English-speaking country’ as a proxy for children who do not yet have a sufficient grasp of Standard Australian English is that we don’t have sensible data on this matter. We have data on children who have a language background other than English, but this tells us nothing about their level of competence with written English.

This Inquiry could secure bipartisan support to fix this matter – it is not a politicised, hot issue. It is about applying definitions of technically relevant matters in an inclusive and fair manner. Children in Years 3 and 5 who reside in communities where Standard Australian English is not spoken could simply be exempted from NAPLAN until their English language learning enables them to read English to a defined level.

Secondly, NAPLAN is not a culturally fair test and this further discriminates against remote Indigenous children.

Back again to Leonard Freeman:

…NAPLAN reading tests assess students’ reading ability by judging their answers to multiple choice questions which often ask complex questions about the reading material.

He provides the following example of a multiple-choice item in a Year 3 reading test:

‘But I feel funny about saying I own him’. What reason does Tim give for feeling this way?

a) Elvis is really Malcolm’s dog.

b) Tim thinks dogs cannot be owned.

c) All the family helps to look after Elvis.

d) Elvis is much older than Tim in dog years. 

It is pretty obvious that a great deal of culturally specific knowledge, inaccessible to many students, is required to eliminate the supposedly irrelevant answers to this item.

These sorts of questions do not simply assess whether the student can read the material and basically comprehend the story; they go well beyond that. A Year 3 student from a remote Indigenous community who is still trying to master spoken English and western cultural norms would find a question like this very difficult to answer. The assumed knowledge – that dogs’ ages are measured in ‘dog years’, the use of the word ‘funny’ to mean uncomfortable rather than humorous, and the concept of questioning the definition of ownership – would all be unfamiliar to a child growing up in a remote Indigenous setting.

The NAPLAN reading test actually tests analytical skills which are coated heavily in western cultural norms. 

Another example provided by Freeman of an item around the delivery of newspapers provides further insights into cultural inaccessibility.

The story begins with the householder complaining to the newspaper boy ‘you left the paper jutting out of the back of my box’ and we also learn the owner had previously complained the paper needs to be left ‘in line with the fence’. This question was designed to test whether students could infer the meaning of new words and constructions. Yet to do so the students need to be familiar with the cultural context, in this case the students need to know that houses have a box on their fence line where mail and newspaper deliveries are left.  If the student has grown up in a remote community or refugee camp where there are no letter boxes and few houses have fences they will not be able to access the meaning of the text. 

Thirdly, the lack of fit between the NAPLAN tests and the kinds of assessment needed to effectively support teachers in these challenging contexts leaves teachers unsupported and undermined.

Now it would be reasonable to expect the NT Department of Education to be fully cognisant of these circumstances and to make it its business to ensure that the unintended consequences of this situation are addressed, or at least mitigated. Sadly, when I worked in the NT I found that this was not the case. And scrolling through the department’s website today, I found that nothing much has changed. There is now an acknowledgement that Indigenous children are English language learners, but what this means in terms of resourcing is minimal, and what it means for teachers across remote schools appears to be completely ignored.

The absurdity of this is best illustrated by the following personal experience of what can only be described as a farcical professional development event.

This event took place at a beautiful new resort in the remote mining community on Groote Eylandt. The attendees were principals and a group of their nominated teachers from schools in remote Indigenous communities.

The aim of the three-day session was to ‘teach’ the attendees – all from remote Indigenous schools – how to drill down into their school’s test results and develop not just a strategic but a tactical response to what they found. It was a highly structured event. First of all, the groups were given a spreadsheet showing their NAPLAN results for all year levels and all tested areas, and a detailed worksheet to work through.

I sat next to a school team from a large school in Eastern Arnhem, similar in key features to Yirrkala. It was also a school that ran a bilingual program, which meant that all students in Year 3, and almost all in Year 5, could not yet read in English – even at a basic level.

This school had NAPLAN results that were marginally worse than those of the other schools represented. At this school, in almost every subject and at almost every year level, 80–100% of the students scored zero – that is, they did not get one answer right. Not one. Some classes in some schools had a small minority of students who did receive a higher score – a few even approached the relevant band for their year – but they were a tiny, tiny minority.

The professional development session required the teachers to group their students by what they did not know. For example: how many students did not understand the convention of the full stop? Put a ring around these students. The teachers next to me sighed and ringed the whole class. And it went on like this for three whole days. It was idiotic and devastating.

These teachers went back to their school not just demoralised but with decontextualised lesson plans on full stops, the sound ‘ch’, prime numbers and so on.

I tell this story because it is an extreme example of just how stupid it is for people to invent prescriptive solutions that must be rolled out across all schools, without exception.

There is no doubt that this is damaging for teachers in remote schools. It was political exposure over poor NAPLAN results that forced Marion Scrymgour to pre-emptively abolish the struggling, underfunded bilingual program – something she later came to regret, and for good reason.

Leonard Freeman sees the NT Department’s priorities and the experiences and struggles of remote teachers as being on a collision course:

The NT government made a commitment to having 75% of NT students meet the Year 3 NAPLAN benchmarks and teaching programs are aimed at achieving this. The amount of English instruction is being increased under the misguided belief that elevating the focus and importance of English will yield better English results. 

The inclusion of ESL students in NAPLAN testing places ESL researchers, specialist ESL teachers and classroom teachers in a conflict between the principles of effective ESL teaching and assessment practices and the requirements of governments and education departments. Instead of working together to attain the best educational outcomes for students, researchers, policy makers, teachers and governments are locked in a fundamental disagreement between meeting the needs of ESL students and the administrative and political advantages of a uniform testing regime.

 One of the perverse consequences of this is that programs which claim to accelerate English literacy or which are aimed at native English speakers are now favoured ahead of academically sound ESL programs which demonstrate the genuine progression of ESL students.

It has also led to effective and evidence-based programs, such as the Step Model bilingual program, being shut down, to the detriment of Indigenous ESL students.

Now, some readers may be thinking that I am arguing for lower expectations for remote Indigenous children. That is not my message. These children are exposed to English for the first time at school, and it is often their third or fourth language.

We exempt newly arrived LBOTE children ‘down south’ not because we expect less of them but because we recognise that their learning journey has to include an additional learning pathway. We do not expect less of them in the long run.

Back to Freeman again:

… an ESL approach is not a lesser approach. It is aimed at getting students who are speakers of other languages to a point where they can access mainstream education. A program may be deemed ineffective if ESL students never reach age-grade performance, but ESL programs that successfully move students along the continuum at a rate that is acceptable based on the research should be regarded as valid and ideal for ESL learners.

It is important to recognise the research which shows that it takes a minimum of 4 years, and more like 7 years, to achieve proficiency in a second language to a level that is sufficient for academic pursuits. The efficacy of ESL programs should be judged against the backdrop of this research.

So what are the small steps governments could take in order to stop getting in the way of effective education for remote Indigenous children?

  1. Stop NAPLAN testing for remote Indigenous children until Year 7.
  2. In the meantime, agree on an alternative form of testing[1] that is more appropriate for ESL students in terms of cultural content and recognition of ESL learning stages.
  3. Address cultural bias in NAPLAN testing so that when remote Indigenous students are linguistically ready to sit the tests they can understand what is being asked of them.
  4. Develop a nationally agreed English Language Learner Scale (ELLS) to replace LBOTE as a student category, so there is a fair and consistent way to measure disadvantage based on English language learning needs.

[1] The ACER Longitudinal Literacy and Numeracy Study for Indigenous Students (LLANS) test has been trialled in all states and territories with both Indigenous and non-Indigenous students. Researchers have now aligned LLANS test results to NAPLAN scores, so it would be possible for ESL students in the primary years to be given an appropriate test that gives a much clearer indication of their actual literacy and numeracy skills.

Winning the PISA Race – how hard can it be?

The Australian Government has justified expenditure on school funding reforms on the basis of our falling relative performance in the OECD’s international tests – the best known of which is PISA.

I appreciate the point that investing in equity in education will – if well invested – also improve outcomes in terms of educational excellence. This is obvious, because the biggest ‘performance gap’, in terms of the distance between performance outcomes and performance potential, is not among those students who are already financially, socially and intellectually indulged. They already achieve at levels relatively aligned with their potential.

No, the biggest performance gap is among those students who start school behind their peers; have less access to early learning experiences through quality childcare and preschool programs; experience less family stability; have parent(s) who are struggling to support their children in terms of quality time, financial and home stability, and exposure to quality educational experiences; go to schools with a concentration of students in similar circumstances; have less than their fair share of highly experienced teachers; experience higher levels of staff turnover and more greenhorn principals; and have poorer school facilities in educationally relevant terms.

It’s great that we have at last acknowledged that ‘their loss’ is our loss – the loss of so many potentially creative and successful citizens, employees, managers and leaders.

However, I am not a fan of using PISA as a proxy for measuring this return on investment. It is misleading. It could distort our investment priorities and our school and classroom priorities. Many others have written about this. It is also a moving target: results could improve in absolute terms but still slide down the list in comparative terms. So in this sense it is also risky.

We know, according to Campbell’s Law, that as soon as you make a god of a particular metric, it becomes distorted as everyone games the system.

So here is my ‘realpolitik’ set of options for cleverly gaming the system to achieve this goal painlessly, while minimising unintended consequences and taking no risk. Choose your game plan, Australia.

Suggestion No. One:  Just test the ACT. It has the lowest proportion of low-ICSEA schools, no schools with a high Indigenous population, and no remote or very remote schools; many in its ESL population are foreign dignitaries; it has a high level of preschool attendance; and it is the national capital.

Shanghai is held up as a PISA star, but it is important to note that, as large as this city might be, it is still only a city. It is not China, and it is obviously a key centre for politics, business and industry. Its results are unlikely to be reflective of China as a country.

Suggestion No. Two:  Establish a group of specific-purpose ‘benchmark’ test schools across Australia. This proposal is based on the logic of charter schools in the US. Any student ‘above a minimum competency standard’ would be eligible to apply – after all, populating them with geniuses would be too obvious. Selection could be based on an application and a ballot that ensures a spread of SES and other student demographic features, so the schools could be seen as representative of the Australian population. They would be well funded, using the Gonski parameters of course. In applying to these schools, parents would need to understand that a) commitment is required of them and their children, and b) teaching would focus on the PISA testing areas as a priority. For some parents this could be seen as a ‘free private school’ – a good deal. Students who pull the results down could be gently, informally counselled out – all off the record, of course.

Suggestion No. Three:  Pay schools by results. Schools could apply to be test sites and be paid reward funds for high PISA performance. How they achieved this would not be questioned, and neither would their decisions about spending their reward funds.

Suggestion No. Four:  Pay teachers by results. Teachers could apply to run test classrooms and be paid reward funds for high PISA performance. How they achieved this would not be questioned, and neither would their decisions about spending their reward funds.

Suggestion No. Five:  Pay parents/students by results. Parents/students could apply to be test subjects – quite outside the schooling process – and be paid reward funds for high PISA performance. How they achieved this would not be questioned, and neither would their decisions about spending their reward funds.

Suggestion No. Six:  Exempt all schools with an ICSEA below 850. I have heard an unverifiable rumour that Canada exempts its ‘reservation’ schools from the PISA sampling, whereas in Australia we over-sample such schools for our own data collection and policy purposes. We should cease this immediately and select our sample schools from those that will not pull our results down.

Alternatively, we could drop this PISA goal altogether and instead put our full backing into supporting teacher capacity development; building quality support tools for teacher feedback and self-reflection based on classroom practice; reducing face-to-face teaching time in order to increase teacher planning and collaboration time; restructuring teacher career pathways around the teacher standards; developing comprehensive strategies for improving the equitable distribution of highly experienced teachers across schools; and implementing Gonski.

We are already doing a lot to support this better pathway. All we really need to do is change our goals and reconsider high-stakes testing. I know which way gives us the best return on investment.

Please, Julia Gillard, Don’t Let Bill Gates Undermine the Work of AITSL

Subtitle:  We must not sacrifice teacher self-reflection and ‘safe’ learning to the god of performativity

In an article on this blog a few weeks ago, I warned about the important difference between the work that the Australian Institute for Teaching and School Leadership (AITSL) is doing to develop high-quality, useful tools to support teacher-initiated professional learning, development, peer mentoring and coaching, and what Bill Gates would like to do with such tools.

Bill Gates met with the PM yesterday and will be watched by millions on QandA tonight. If he repeats his TED talk message about the value of videos of teachers in classrooms, student feedback instruments, portfolios of teachers’ work, walkthroughs and other tools for ‘measuring’ or ‘judging’ teacher performance for rewards or for compulsory performance review processes, think about what he is actually saying.

He is saying that the best way to improve teacher quality and drive improved teacher performance is to test it, assess it, judge it, weigh it. Does this ring any bells?

Now I ask everyone to think about this sort of policy approach from the point of view of a newish teacher. Would you improve more in a) a system that encourages a proactive, teacher-initiated approach to professional development, with high levels of peer collaboration and opportunities for self-reflection and peer discussion of problems and areas for development, using the latest high-quality support tools, or b) a system that used all these same tools to measure you, with every measurement recorded in a performance grading process? Would you be enthusiastic about using video of your teaching, or a student feedback survey on your semester project, to reflect on and hone your professional craft if you knew it could then be taken and used in a formal performance assessment process that goes into your record for all time?

It’s a no-brainer. If you want to build the professional knowledge and skills of teachers, then work with them, support them, and give them a ‘safe place’ where development needs can be acknowledged, along with high-quality frameworks to support this.

There will always be a small proportion who will not rise to the challenge – who are probably in the wrong profession – but let’s not design a performance improvement framework around ‘weeding out the bad’. This lowest-common-denominator approach sabotages the very goals of improvement. The best way to manage this problem is to focus on school leadership.

Tony Mackay, Chair of AITSL, wrote about this here – rather more tactfully – only recently:

Australia is not a basket case in school reform. We have achieved something no other nation has so comprehensively managed: Australia is one of the first countries in the world to have a national set of professional standards to improve teaching in schools.

 Others have tried to develop national standards and failed. We have done it, getting the education sector – federal, state and territory governments, universities, non-government schools, employer groups and unions – to reach agreement on an end-to-end system for teacher quality.

 No other country possesses an exactly equivalent body to AITSL. Every few weeks the institute receives inquiries from overseas governments and education authorities wanting to know how Australia managed to get agreement on national standards from so many disparate groups involved in schooling. They have come from as far afield as the New York City school system, the Canadian province of British Columbia, Scotland, the Middle East and elsewhere.

So how did AITSL achieve what has eluded our overseas colleagues? We … learnt from [others’] mistakes …

Mandated standards will never work unless you get school systems and teachers on board to make them work. So we listened to teachers and school leaders. We set up a comprehensive national network of advisory groups, public seminars, forums and focus groups. We involved 6000 teachers and school principals in helping us shape the standards.

Undermine this at your peril.