Inside Teaching : October 2010
the student achievement-level report, we constructed an Excel spreadsheet listing student names down the left-hand column, then their actual marks on each of the 10 NAPLAN writing criteria shown in Figure 1, their total marks, attributed VELS levels, and the VELS-level results from the other NAPLAN tests of reading, language conventions and numeracy. Each student's scores were thus itemised and could be examined in detail. We also added data such as VELS levels for class assignments, essays and the like, as well as administrative details such as home group and class group.

We then manipulated the spreadsheet in various ways, using the 'Sort' command under 'Data' on the menu bar, to examine the spread of results. It was useful to compare different class groupings, gender differences and so on, although the real purpose was to find the things we could really focus on in the final term.

More than half of the cohort scored lower than we expected. We identified the criteria on which they'd scored poorly, discussed them as a team, and constructed personalised English 'toolbox' classes for the final term. This meant running several English classes simultaneously, each focused on a different aspect of writing. For example, one class looked at audience and vocabulary, another concentrated on punctuation, a third on text structure, and the fourth – for students with high test scores – on extended narrative writing. Students were grouped according to their test results from five months earlier in the year.

Problems, as they emerged

Predictably, there were problems:
• Not all students had sat the test, so which toolbox class should they attend?
• In the five months since the test, many students had progressed with written literacy.
• The test itself didn't accurately measure the competence and capacity of some students, so they were placed in the wrong class.
• Final term is crowded and short – not the best time to engage Year 9 students in innovative learning.
• We didn't have time to fully prepare our teaching team, and two of the four class groups struggled to engage with the purpose of the exercise.

So a good idea – using the data to personalise the teaching – only partly succeeded in practice. In a perfect world this approach could have been carried over to the same cohort in Year 10, where the insights gleaned from the Year 9 data could have been systematically applied to English teaching from first term onwards. Unfortunately, though, changes in personnel, team structures, leadership and so on – as so often happens in schools – led to the whole project being temporarily shelved.

The feedback I've had from teachers in my workshops has been similarly mixed – a good idea with a useful purpose is hobbled by time constraints, established class arrangements, inflexible timetables and, most crucially, the huge gap between

Figure 1. NAPLAN writing criteria
• Audience – the writer's capacity to orient, engage and affect the reader (6 marks)
• Text structure – organising the story according to standard narrative structure (4 marks)
• Ideas – covering both creativity and crafting of ideas (5 marks)
• Character and setting – portrayal of the character, and establishment of a setting (4 marks)
• Vocabulary – the range and precision of the language (5 marks)
• Cohesion – controlling the text technically (4 marks)
• Paragraphing (2 marks)
• Sentence structure – grammatically correct, sound and meaningful sentences (6 marks)
• Punctuation – use of correct punctuation (5 marks)
• Spelling – accuracy and range of spelling (6 marks)
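For teachers who prefer a script to a spreadsheet, the workflow described above – itemised criterion marks, a Sort to examine the spread, then grouping students into toolbox classes – can be sketched in a few lines of pandas. This is a minimal sketch only: the student names, sample marks and the weakest-criterion grouping rule are illustrative assumptions, not the school's actual method.

```python
import pandas as pd

# Hypothetical sample data: marks on three of the ten NAPLAN writing
# criteria (names and values are invented for illustration).
students = pd.DataFrame({
    "name": ["Alex", "Bree", "Chao", "Dana"],
    "audience": [4, 2, 5, 1],        # out of 6
    "punctuation": [3, 1, 4, 2],     # out of 5
    "text_structure": [3, 2, 4, 1],  # out of 4
})
criteria = ["audience", "punctuation", "text_structure"]
students["total"] = students[criteria].sum(axis=1)

# Equivalent of Excel's Data > Sort: order the cohort by total mark.
ranked = students.sort_values("total", ascending=False)

# One possible grouping rule: normalise each mark by its maximum, then
# assign each student to the 'toolbox' class of their weakest criterion.
max_marks = pd.Series({"audience": 6, "punctuation": 5, "text_structure": 4})
students["toolbox_class"] = (students[criteria] / max_marks).idxmin(axis=1)

print(ranked[["name", "total"]])
print(students[["name", "toolbox_class"]])
```

The same frame could carry the extra columns the article mentions – VELS levels for the other NAPLAN tests, class assignments, home group – and be re-sorted or filtered on any of them.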