
Wednesday, December 31, 2014

What matters most?

What sort of society do we want to live in? Regardless of your particular persuasion - Christian or Atheist, Muslim or Hindu, this is a question you have either thought about, or ought to have thought about, at some time. And at Christmas time, as stories come out of children who have no Christmas, we see the national past-time of welfare bashing develop momentum ready for the post Christmas frenzy.

Like it or not, we have vulnerable people all around us. I subscribe to the view that you can judge the quality of a society by the way it treats its most vulnerable members.

Apparently it is difficult to attribute the sentiment to any one person. For example:
"A nation's greatness is measured by how it treats its weakest members." ~ Mahatma Ghandi
"A society will be judged on the basis of how it treats its weakest members and among the most vulnerable are surely the unborn and the dying," ~Pope John Paul II
A decent provision for the poor is the true test of civilization.~Samuel Johnson, Boswell: Life of Johnson

I found all of these quotes with a simple Google search, via http://askville.amazon.com/measure-civilization-treats-weakest-members-accurate-quote/AnswerViewer.do?requestId=4718239

It is with a paradoxical degree of irritation, then, that I read and hear some incredibly intolerant views expressed by others in society. News stories about those unable to help themselves regularly evoke responses that range from intolerance to outright belligerence, with labels that include 'idiots', 'bludgers', and others I'd rather not print. So how come we have vulnerable people in our society?

So, let's try to put some thinking straight. Here's one perspective.

A common view is that people at the bottom of the 'social heap' simply don't want to help themselves. That's interesting. It assumes that we are all the same. It assumes for example that we all have the same abilities, or the same personality profiles. It assumes that we have the same risk profiles, or the same 'intelligence' (whichever model you prefer to use to identify those measurable attributes that we call 'intelligence'). That's not just tricky, it's downright untrue. At the risk of lecturing, any attribute of a naturally occurring population is spread across that population in a normal distribution.



Without that we wouldn't have this:




or this:


But just as there are those at the top of the pile, so there are those at the bottom. That's why we see this:


and this:


I am not suggesting anything here other than that we don't all have the same capabilities. Some of us are born with more fast-twitch muscle fibres than slow-twitch, making us better suited to one type of running than another. No amount of training is going to make a sprinter out of someone with the wrong mix of muscle fibre types, for example. (See the work of Stephen Jay Gould in his book 'The Mismeasure of Man' for an alternative view about the normal distribution, though.)

There are also arguments abounding about 'nature vs nurture' and the impact of each on our development as adults. And of course there is the argument that birth doesn't have to define destiny, that any of us is capable of changing our position in life with the right application.

However, laying blame is the easiest response. You might even label it the 'lazy' response, but it doesn't help the situation in any way, shape, or form.

Those in wheelchairs don't choose to be there. I'd also venture the generalisation that those living in poverty don't choose to live that way either. Don't get me wrong. I am not stupid enough to suggest that every person who lives in poverty is incapable of helping themselves. There are always those who choose not to help themselves.

Consider the accusation that those at the bottom of our supposed social heap should get off their bums and work. Let's get them all to start their own businesses. Well, starting your own business requires a good idea, a huge work ethic, and an appetite for risk.

How does that look? It depends on the personality type model that you use. Here's one:


There are many different models of personality type, and argument amongst psychologists about the validity of any or all of them too. The point is that we are different. Not all of us have a risk profile that lends itself to new ventures. There are those amongst us who are risk seekers. There are those amongst us who are more risk averse. It seems reasonable to assume that the willingness to accept risk will be distributed normally across the population, like any other attribute. Why have I never started up a business? I am relatively risk averse, simple!! For some of us simply changing jobs, or even accepting a job, is a 'risk' that pushes us too far.
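To make the point concrete, here is a minimal sketch of what a normally distributed attribute implies. The numbers are entirely invented for illustration (a made-up 'risk tolerance' score with mean 50 and a hypothetical threshold for being comfortable starting a business); the point is simply that a normal distribution guarantees tails at both ends.

# Illustrative only: simulate a normally distributed "risk tolerance" score.
# The mean, spread, and threshold are invented for the sake of the example.
import random
import statistics

random.seed(1)
population = [random.gauss(50, 15) for _ in range(100_000)]  # mean 50, sd 15

threshold = 75  # hypothetical score needed to be comfortable starting a business
comfortable = sum(score >= threshold for score in population)

print(f"mean tolerance: {statistics.mean(population):.1f}")
print(f"share comfortable with start-up risk: {comfortable / len(population):.1%}")

On those made-up numbers only a few percent of the population clears the bar, and an equally small group sits at the other extreme; that is all a normal distribution means here.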

If we talk about accepting employment from others, the economist might argue that there are no jobs for them, or that there is a mismatch between the skill sets of job seekers and the job market. We could equally argue that the drive, the 'get up and go' required to search out a job isn't equally distributed amongst us all.

However my argument is that we mustn't generalise across everyone in any of our artificial categories. As I said earlier, laying blame is the 'lazy' response, but it is not a helpful response. A more productive response might be to ask: what can I do to help? How could I change the situation? Maybe I 'pay it forward' with a coffee for someone else in the coffee queue. Maybe I donate some long unused clothing to the City Mission. What matters, in my opinion, is that we generate a more caring society.

Every human being has value, every human being has talents and skills, every human being has something to offer to society. The bigger and more challenging question, the tough question, is how we get more of those at the bottom of our social pile to see their own talents and skills. How do we encourage more and more people to be the best they can be, and to make the best contribution they can to the society in which they live? And how do we care for those most vulnerable in our society?

"A nation's greatness is measured by how it treats its weakest members." ~ Mahatma Ghandi


Tuesday, December 23, 2014

One year of BYO laptops

There it is - the end of our first year with BYOD .. well BYO laptop to be precise. What do I think I might have learned?
  1. Laptops make a difference.
  2. Staff will shift their practice when they are good and ready, and not before.
  3. Laptops make a difference.
  4. Staff will shift their practice when they can see some advantage to themselves - reduced workload, more efficient work flow and maybe, just maybe, improved learning for their students.
  5. Laptops make a difference.
  6. Many people (staff and students) believe that 'elearning' is delivery using electronic media.
  7. Laptops make a difference.
  8. Many people (staff and students) believe that staff delivering content is true learning.
  9. Laptops make a difference.
  10. Laptops enable engagement in ways not possible without them. Teachers can engage with students in ways that are difficult to achieve without laptops.
  11. Laptops make a difference.
  12. Laptops enable feedback in detail and volume that is very difficult to achieve without laptops.
  13. Laptops make a difference.
  14. Life would be incredibly tedious if not boring for any student if all teachers in their lives did the same things with laptops every day.
  15. Laptops make a difference.
Hmmm... is there a theme there? Of course I say all of this with no quantitative data at all to support my arguments. I did say this is what I THINK I have learned. Let's look for the data.

Wednesday, October 8, 2014

Student engagement with Twitter Take 2

In earlier posts I have detailed how I have made use of Twitter in classes, asking students to tweet responses to questions that I have posed in and out of class (mostly in class), or to tweet questions that occur to them during the more didactic episodes in the lesson. I was in a planning meeting with friend and colleague Pauline Henderson (@paulinehendog) and we were discussing a possible research question on the impact of Twitter in classes.

In order to do this I had to explain more fully to Pauline how I use Twitter. One of my uses was inspired by an example related to me in a Masterclass run by Alan November last year. He related the example of a Maths teacher who tweeted to her students examples of Maths in real life. As time went by they began to tweet back to her and the rest of the class examples that they had seen too.

Engagement? I think so.

This inspired me to start to tweet articles of interest in my own subject area of Economics. Often I would hold a brief class discussion on the articles, working hard to 'decode' the economists' language to make the material more accessible to my students.

The interesting behaviour (and the one I had hoped for, even mentioned to the students themselves) is that they began to tweet interesting articles that they had found. Out of 30 boys in my classes, four boys have taken the trouble to tweet articles to me and the rest of the class, in all cases more than once. I know that that's not too many, but it's four more than none.

The articles have all been relevant to the content of the time. 

Our arguments (that is, Pauline's and mine):
  1. To bother to do this is a sign of engagement.
  2. To be able to do this those boys must have engaged in some pretty serious thinking. The focus of our course is on deep thinking, and on that score this was pretty deep.
In each case I made sure I took the time and trouble to then discuss the relevant articles with the whole class. I felt that this was important in order to legitimise their posting behaviour. The articles themselves were also genuinely interesting and relevant, and in most cases they were articles that I hadn't found myself.

We talked these articles through together, the posters often taking the lead in the discussions. We were all learners together.


Thursday, October 2, 2014

Improving the learning with Web2.0 tools



The teaching profession is filled with good people, well intentioned people, people who want the best for the students that sit in front of them. These are people who would often almost literally give the shirts off their backs for their students.

Yet these are also all too often people who fill their daily working lives with practice based upon little more than mythology. Challenge it and the response is all too often that wonderful professional 'put down' - 'well of course it works, and you'd need to be pretty stupid to think otherwise.'

That observation leads to more than a touch of nervousness when I start to describe the impact of what we might call 'Web2.0' tools in the classroom. In between those bouts of Senior Management 'stuff' that fill my day, I teach two Level 2 Economics classes, and over the past 5 years I have progressively brought several new tools into the class repertoire. The focus of the work has been on developing students' ability to think critically, and to express that thinking in writing. It is no coincidence that the three external Level 2 Economics standards now count towards UE literacy (writing) in NCEA terms,  a commentary on the reasoning and writing expectations of these standards.

I now have a well established pattern of development through the year. Running in amongst the more traditional 'direct instruction' I begin by developing students' ability to reason and write with the use of online discussion forums. I moderate these carefully, giving every student detailed feedback on their posts (perhaps one of the more powerful differentiation tools?) before setting them to the process of constructing and deconstructing argument with each other.

As students develop confidence in their ability to argue, and to record those arguments in writing, I then transition them to more formal writing using GoogleDocs. I create a writing structure using the SOLO framework, ensuring that their written work closely matches the expectations of the NCEA standards and assessments. Each piece of work receives detailed feedback from me in writing.

While content tends to be developed through fairly traditional direct instruction, I often use GoogleForms to elicit responses to content-related questions. I gather this data under 'test' conditions. This provides a rich array of thinking, not all of which is correct. Students then take this response data (which I have shared back with them) and in groups co-construct the correct answers to a range of questions that help them to develop their understanding of subject content. Throughout all of this, classes are engaged with the use of Twitter both inside and outside the class.
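As an aside, the sharing-back step needs nothing fancy. Here is a minimal sketch of how the spread of answers to the shorter, closed questions could be tallied for the class, assuming the Form responses have been downloaded as a CSV; the file name and the 'Timestamp' column are assumptions based on a typical GoogleForms export, not part of my actual workflow.

# Minimal sketch: tally GoogleForms responses from a CSV export so the
# spread of answers can be shared back with the class.
# Assumes a file "form_responses.csv" with one column per question, plus
# the usual "Timestamp" column that a GoogleForms export includes.
import csv
from collections import Counter, defaultdict

tallies = defaultdict(Counter)

with open("form_responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        for question, answer in row.items():
            if question != "Timestamp" and answer.strip():
                tallies[question][answer.strip()] += 1

for question, counts in tallies.items():
    print(question)
    for answer, n in counts.most_common():
        print(f"  {n:3d}  {answer}")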

I am also teaching specific writing techniques (in particular the Statement, Explanation, eXample/evidence structure with which our English colleagues are enamoured).

Finally, because our external assessment system still requires students to sit examinations in which they write their answers using traditional pen and paper, I then start on the final transition to answering questions on paper.

What have I found? I can hear the voices (quite rightly) calling 'where's your data?' I have no reliable, authentic, replicable data on the outcomes, but I do have some observations to make.

Consistent with international research (Goldberg et al.), students in my classes write more and write better than they ever did before. Whereas 10 years ago on paper they might have written a paragraph of 5 lines, now they will write anything between 15 and 30 lines in explanation (mostly - there are ALWAYS exceptions). The reasoning is clearer (in general) and the examination results have improved. The students now more often write responses that I would classify as Relational or Extended Abstract. A class taught without these approaches over the same period saw reasonably static results.

This year for the first time I ran an NCEA internal assessment using GoogleDocs. The results were dramatically different. The overall pass rate went from 55/75% to 94%. There is almost no evidence of collaboration (the task had to be completed independently). The answers were fuller in nature, the reasoning clearer and more accurate for most students. I have clear evidence of metacognitive thinking. As I supervised one of the 'in class' sessions of this assessment I watched over the shoulders of students as they used the comments function to pose questions of themselves on what they had written, ready for review before they submitted their work. Interestingly (perhaps??) this was not something I'd 'taught' them to do.

I see great engagement amongst students with both the class work, and the subject. The use of Twitter has been enlightening as students pose questions, offer answers to questions (both in and out of class), and tweet interesting resource material to me and the class. I run a unique class hashtag for each class.

Is any of this conclusive? NO!!! Is this any better than the mythologising of which I accused too many colleagues at the start of this post? Probably not.

It is quite possible that the results have nothing whatsoever to do with the technology. What is needed is some quality research. I'm a numbers man, I need quantitative data. I have little other than probably meaningless word and line counts (there are the NCEA achievement stats for both internals and externals), but I am not bold enough to try to suggest cause and effect yet. That's next year's Teaching as Inquiry project.
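For what it's worth, those word and line counts are trivial to produce. A minimal sketch, assuming each student's response has been exported as a plain-text file into a folder (the folder and file naming here are my invention for illustration):

# Minimal sketch: word and line counts for student responses that have been
# exported as plain-text files (one file per student) into a "responses" folder.
from pathlib import Path

for path in sorted(Path("responses").glob("*.txt")):
    text = path.read_text(encoding="utf-8")
    lines = [line for line in text.splitlines() if line.strip()]
    words = text.split()
    print(f"{path.stem}: {len(words)} words over {len(lines)} non-blank lines")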

References

1. Goldberg, A., Russell, M., & Cook, A. (2002). Meta-Analysis: Writing with Computers 1992-2002. Technology and Assessment Study Collaborative, Boston College.
2. Hattie, J. (2012). Visible Learning for Teachers: Maximizing Impact on Learning. Routledge.

Friday, September 19, 2014

First reflections - online assessment

My first online assessment has been handed in by students. I would be lying if I said I was entirely relaxed about this, as it is the first online assessment of this sort that I have written and administered. I've used online testing models in the past (collections of multiple choice or True/False questions as a source of formative assessment and student practice) but never summative written assessments with such high stakes.

The assessment was entirely open book, and I took 2 weeks to prepare the boys for the material, in particular to make sure that they were prepared for the depth of thinking that was required.

The assessment tool was GoogleDocs.

For me it was one heck of a risk. Had I written an assessment that required sufficiently deep thinking to mean that copy and paste would not help? Had I briefed the boys sufficiently on the lack of advantage and the ethical issues with copy and paste anyway?

I haven't marked any of the work yet, so I can't comment on much, but I have 'sampled' the boys' responses to the task. I worried that they would find it easy. No: they all responded without a moment's hesitation. It was hard, they said. They had to think, they said. They found that they got things muddled in their thinking, and at times had to work hard to find clarity in their thinking and therefore in their responses.

The big tests will be:

  1. The NZQA moderation - did I get the task right, and will I get the marking right?
  2. Did the boys genuinely produce their own work in response to the tasks?

So far so good, now for the week of marking.




Wednesday, September 17, 2014

Online assessment behaviour & metacognition

I am fascinated watching the online behaviour of boys during their assessment.

The assessment is being completed using the GoogleDocs platform. The task is being completed in an 'open book' environment because it is about the thinking and synthesis, not about knowledge recall (although it obviously requires that knowledge in order to be able to synthesise).

Here's an interesting behaviour: a boy writes a piece of his response, and then uses the 'Comment' function to add a comment or question to the side on what he has just written. When I check, sometimes this is a 'note to self', other times it is a question as yet unresolved in his thinking.

This is the most visible illustration of 'metacognition' that I have observed.

'The Innovator's Mindset'

I thought this was an appropriate follow up to yesterday's blog post about assessment and risk taking:


http://georgecouros.ca/blog/archives/4783


Thanks, Pauline, a nice find!!

Tuesday, September 16, 2014

GoogleDocs assessment and risktaking

This year it fell to me to set and mark our large Level 2 Economics internal assessment. Normally I've had a colleague do this while I set and mark all of the formative assessment for external standards covered by the course. It seemed like a fair split of workload with which we were both comfortable.

The Level 2 Economics course is arguably a course in logic and philosophy as much as it is about economics. The external standards all count for UE literacy credits in writing which signals very clearly the writing expectations that students must meet.

In writing this year's assessment I decided to change things significantly by assessing online using Google Docs and an assignment format, rather than an in-class, paper-based written test. I was prompted in part by observing colleagues' evolving assessment formats, and also by our previous Moderation report, which suggested that an assignment format might produce better outcomes for the students.

So here I am, sitting supervising the boys in class as they work on their assessments. I was very nervous going into the exercise - this is a brand new experience for me, and quite a risk, but risk taking is something that we need much more of in education. My perception of risk is heightened by the fact that I lose some control over the process, that I hand control of more of the process over to the students. That's hard - as teachers most of us are at the very least closet control freaks at heart. However, I have put in place a strong, positive culture of working and thinking, and I have emphasised the need for each boy to produce his own work. We have discussed the implications of using other people's work, both acknowledged and unacknowledged. And I have technical processes at hand to help where I suspect that collusion etc. may have occurred. There does come a point at which we need to set our learners free and let them fly. They need to 'show their mettle'.

The intensity, the concentration, the output, are all truly prodigious. As with all of the formative writing that the classes had completed previously, the volume of writing appears to be significantly more than we would have seen in the past on paper. The quality will be judged when I mark the work next week of course.

Wednesday, September 10, 2014

Twitter discussions

Thanks to good colleague Pauline who found this video for me on one American Professor's use of Twitter in the classroom.




Thursday, August 14, 2014

More Twitter variation ...

Today's lesson made what I thought was a now familiar use of Twitter, but the evidence of outcome was fascinating. Our current topic is the price taker model of trade, a cool variation on the old supply and demand market model. I had taught the model the old fashioned way in the previous class. Hattie calls it direct instruction, and apparently it works (who knew??? .. <engage sarcasm circuits>).

The lesson went like this. The boys were handed a small square of blank scrap paper on which they first drew the basic model. I then gave them a scenario and required them to draw its impact on the market, and then tweet the outcomes for three variables. The diagrams were reasonably well done (but they were, after all, rough sketch diagrams), and about half the boys got the answers correct. We went over the analysis (this wasn't a 'closed question' situation; slightly deeper thinking was required).

I then set the boys a more complex exercise requiring deeper thinking, but following the same format. Now most boys got their analysis correct.

Finally I set them a third problem that included a cunningly set trap. Every boy (in both classes) got the answers correct. How do I know? They were tweeting their answers. Did any of them copy the answers of others? Possibly, that's something I couldn't control for in this setting.

Finally they had to pair and share something new that each had learned, and then tweet their learning.

Here was one tweet:





This was the consequence of an increase in productivity for a small price-taking nation. This boy had completed an important piece of relational thinking (I think) within the context of the SOLO framework.

On reflection I should have pushed for an additional deeper question that looked at wider connections and flow on effects, but given the lack of 'mastery' of the model at the start of the class, I was happy with the progress that we'd made.

The use of Twitter appeared to engage the boys, and support and affirm their thinking. Of course I can't prove any of that, but the Twitter responses gave me some data on the levels of competence that we had achieved by the end of the lesson.
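If I wanted to keep that Twitter data rather than just watch it scroll past, something like the sketch below could archive the class hashtag for later tallying. This is an illustration only: it assumes the Tweepy library (version 4 or later), valid Twitter API credentials, and a made-up #econ201 hashtag, none of which come from the lesson itself.

# Sketch only: archive tweets for a class hashtag so answers can be tallied later.
# Assumes Tweepy 4.x, real API credentials, and a hypothetical #econ201 hashtag.
import csv
import tweepy

auth = tweepy.OAuth1UserHandler(
    "CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET"
)
api = tweepy.API(auth)

with open("econ201_tweets.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "student", "tweet"])
    for tweet in tweepy.Cursor(
        api.search_tweets, q="#econ201", tweet_mode="extended"
    ).items(200):
        writer.writerow([tweet.created_at, tweet.user.screen_name, tweet.full_text])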

Monday, August 11, 2014

Twitter and that pedagogy again ..

I reprised last week's Twitter lesson today. The question was a little more challenging than last time - and the boys' economics was well and truly tested.

The immediacy of the individual feedback was effective, and the learning that occurred for them as each boy heard my feedback to others in the class seemed to be significant. I have no data to prove the efficacy of this approach, even though it sits well alongside the Hattie data about feedback. I was however able to watch them modify their answers and tweet new solutions.

What was also interesting was the evolution of willingness to take risks with possible answers. I hadn't quite registered that in my thinking before.

Friday, August 8, 2014

Twitter, student engagement, and feedback

It started out as a lesson on interest rates and exchange rates. It ended up as a lesson on interest rates and exchange rates, but the path through the lesson wasn't what I'd planned.

I posed two questions to the class, one a closed question, and the other a question demanding an explanation. It was a spur-of-the-moment decision, as you do sometimes in teaching when your instinct says that you should try something, and I asked the boys to open up their Twitter app (either the Twitter web page or Tweetdeck, as it turns out). They answered the closed question with a tweet, a warm up for Twitter as much as anything else, and then they had to tweet the explanation of what they had just tweeted. The tweets were streamed live onto the whiteboard at the front of the class (yes, I still have a whiteboard at the front of the class, and yes, the boys still sit at desks, in chairs - I'm so old fashioned..).

Here are a couple of tweets. I haven't shown the whole image as I wanted to make sure that I anonymised the tweets, so I took screenshots that avoid each boy's name.




As the individual tweets appeared on the screen I gave each boy feedback on his response. The feedback was in the form of additional questions that might prompt them to edit and improve their tweets, which many did (just a touch of the old Socratic questioning here, based largely on pushing boys through the SOLO thinking framework). We then repeated the exercise with an additional question.

Normally I'd have run a class discussion. Despite my practised skills in running class discussion, I would not have managed to get an individual explanation/answer from every boy. I asked them to put their hands up if they would have worked to stay under my radar in a class discussion - over half the hands went up.

What happened here? I managed to engage every boy in the class. What's more, I'd managed to give every boy individual feedback on his answer.

Hattie says:
"The aim is to provide feedback that is 'just in time', just for me', just for where I am in my learning process"
(Hattie, J 'Visible learning for teachers, maximising impact on learning', Page 122)

The feedback I'd given related to exactly where each boy's tweet suggested he was in his understanding of the issue at hand.

This felt like a 'pretty good day at the office'.

Tuesday, June 24, 2014

GoogleDocs and the development of thinking and writing


One of our school curriculum goals over the past three years has been the development of staff skills in the use of critical literacies (including deeper thinking and analysis, critical questioning, and writing) with the aim of improving NCEA performances. My subject area is economics. In my classes I have been working to develop tools that promote a better standard of analysis and writing from students.

My approach has encompassed four strategies:

  1. Creating a writing scaffold based upon best advice from my English teaching colleagues on how to structure a paragraph, using the S.E.X.  (Statement, Explanation, eXample) framework
  2. Applying the SOLO framework to enhance the quality of student thinking and analysis.
  3. The use of GoogleDocs as the writing tool.
  4. Providing improved feedback to students (according to Hattie, feedback has a high effect size in terms of its impact on learning ("Visible learning for teachers: Maximising impact on learning", John Hattie, p. 255)).


As the three external economics standards at Level 2 now count for UE literacy, it seemed natural that good writing should be an imperative of the development of thinking in economics.

Philosophically, I believe that NCEA as an assessment framework is fundamentally about thinking (which is not to say that every subject area has got that right with every standard). I also believe that thinking cannot take place in the absence of knowledge (although I was challenged on this idea at the recent Edutech conference. Is it 'knowledge' or 'knowing' that is the new imperative?). So the development of students' ability to reason within the knowledge framework of economics seems to me to be my 'main game'.

Research suggests that we write more when we write electronically, and we write better (for an example of the research see "Meta-analysis: Writing with computers 1992-2002", Goldberg, Russell and Cook, December 2002).

Many of my colleagues and I had noted the minimalist imperative that has pervaded teenage boys' writing. In addition my own handwriting has always been at best deplorable, and so when marking their work my feedback was both minimal in volume and at best difficult to read. I therefore felt that if I could get boys writing electronically I was likely to see better writing from them. I also believed that I was more likely to give them more effective feedback, feedback that they could read and act upon.

My first step was to set up an electronic task structure that used the SOLO framework. I created a series of appropriate questions that evolved through the year as students built their base knowledge. These were physically structured into a table format.


These questions were set up in a master document that I created in GoogleDocs. The table allowed students to structure their answer using our paragraph structure, and also provided a dedicated space in which I could write feedback.

As the subject of economics is a 'high user' of graphical models (and the GoogleDocs draw tools are not yet as sophisticated as I would like) I provided a series of diagram/model templates that students could use. They are required to copy/paste the appropriate template into their answer, and then reference it in their writing.

The tasks are then arranged in course order, with headings that are set into a table of contents at the start of the document. The document also starts off with a simple reminder of how the SOLO framework works, and an exemplar on how it is used.



Finally I shared the document with all students in my classes using the Hapara 'Teacher Dashboard'.

The results have been very positive. Boys write more, and they write more effectively/coherently.

Here is a snapshot of some writing:



I also give more and better feedback, much more akin to 'coaching' (in the spirit of best practice with formative assessment there are no grades allocated for this work):


Conclusions:

  1. Student thinking and writing has improved.
  2. The quality and quantity of my feedback has improved.
  3. Overall NCEA grades have improved.
  4. Student engagement in their writing seems to have improved (I have NO empirical evidence to support this by the way, simply that age old, but much overrated, teacher 'feeling').
  5. I have no means of determining whether the use of GoogleDocs has been the major contributor or not.
Overall however, in the absence of valid replicable research data, I would still find it hard to abandon this tool as an effective means of developing improved student thinking and writing.

Thursday, June 19, 2014

Google Add-ons .. wow!!!

Today a good colleague introduced me to a new Google Add-on - 'Texthelp - Study Skills'. I showed the tool to two classes today, and frankly they were wowed.



Here's the problem that has been looking for this solution. As a laptop school our students now mostly take notes most of the time mostly on their laptops.. you get the idea. Our concerns have included their ability to keep their work organised, and to retrieve that work and use it purposefully when revising.

So they have notes that they may have taken in GoogleDocs, or Word (mostly), and then there are those articles that we have distributed via our learning management system Moodle as pdfs.

I showed the boys how to:


  • Import a pdf into their Google Drive and convert it into a Google Doc. Of course the tool can be used directly with their own notes if they have used Google Docs to take notes in the first place, no conversion required.
  • Highlight using the Add-on "Texthelp - Study Skills"



  • Collate their highlights into a summary document using the Add-on
  • Finally (and this isn't the Add-on) to write their own summary/synthesis of their highlights in order to cement their learning.
What a great way to engage students with text. Interestingly, I was attending some staff PD this afternoon, and grabbed an online journal article to support the topic we were discussing. My first response was to import it into my Google Drive, convert it, and start annotating using this tool. Great.


Wednesday, June 18, 2014

Magic can be taught ... yep


'Anticipatory reading guides' and literacy development

Twenty-first century technology is making communication ever easier as it connects us around the globe. The fact that we are capable of communicating more easily does not however mean that communication itself becomes easier. Good communication has always demanded (and most probably always will demand) the skills to decode for meaning, and to express meaning and deep thinking.

The NZ national curriculum identifies five key competencies:
·       thinking
·       using language, symbols, and texts
·       managing self
·       relating to others
·       participating and contributing
‘Using language, symbols, and texts’ is a nice ‘eduspeak’ way of talking about communication, and we probably most commonly associate the first two with our academic work, although all are important as the bases for those behaviours that generate success.

The KCs are intended to underpin our work in all areas of the curriculum, and this in itself makes us all teachers of literacy in some way shape or form, albeit that individual subject literacies might vary across learning areas.

In those subjects that are text rich, the traditional skills of reading and writing are brought into relief; coupled with the emphasis on critical thinking, the conversation easily transitions to the term critical literacy (although the exponents of ‘critical literacy’ would no doubt take me to task for too loose a use of the term).

My basic ‘thesis’ when it comes to literacy concepts is that secondary teachers in New Zealand have not traditionally been trained as teachers of their subject literacies. It is in my opinion one of many fundamental flaws in our pre-service teacher education. I am no exception to this claim, and so have had to work hard to upskill myself. I’ve had to become the learner; that’s refreshing.

My first attempts at improving student literacy came from the development of more effective writing scaffolds and tools.

More recently I have shifted my focus to promoting reading for meaning. While completing a ‘Secondary literacy’ course run by UC Ed+ several years ago I became acquainted with a nice tool called the ‘Before and after’ grid that promotes reading for meaning.

The grid looks like this:

Before reading
Place an X under either Agree or Disagree, reflecting your pre-reading opinion on each statement.

After reading
Place an X under either Agree or Disagree, reflecting your post-reading opinion on each statement. Find the evidence from the reading that supports your view, and copy and paste it into the right-hand column.

The grid itself has six columns: the first Agree/Disagree pair is completed before reading, the second pair after reading.

Agree | Disagree | Statement | Agree | Disagree | Supporting evidence
------+----------+-----------+-------+----------+--------------------
      |          | 1.        |       |          |
      |          | 2.        |       |          |
      |          | 3.        |       |          |
      |          | 4.        |       |          |
      |          | 5.        |       |          |
      |          | 6.        |       |          |
(Blogger formatting limitations have confounded me here, but you can find the grid on the TKI page).

The process is as follows:
  1. Select the reading that you intend to use.
  2. Identify a series of statements that relate to the reading. These are pasted into the ‘Statement’ column. Where appropriate (and it may not always be so) use the SOLO framework to ensure that you cover both the surface and deeper thinking that comes from the article. Don’t just seek regurgitation of fact.
  3. Before allowing the students to read the material that you have selected, ask the students to read the statements and decide whether they agree or disagree with each one, checking the appropriate column on the left hand side of the table. When I do this in class time (usually it is done at home) I ask them to complete this part in silence so that their responses represent what THEY currently think as individuals.
  4. Now have the students read the resource material. They must then decide whether they still agree or disagree with the statements, and mark their new thinking in one of the two columns on the right hand side. They must also identify the evidence in the article that supports their view. This evidence is written into the ‘Supporting evidence’ column on the right.
  5. Ask students to highlight those rows where they have changed their minds as a result of reading the resource material. These rows essentially represent ‘new learning’ for the students.

Now this is NOT an original resource. I have blatantly copied it from the one supplied at the Secondary Literacy course. I have found the grid presented as an ‘Anticipatory reading guide', on TKI.

I have now added an additional element. In a box below the reading I pose a question based on the reading, and ask the students to answer that question. They are encouraged to use evidence from the article to support their answer. This tests their learning/understanding from the reading.

My early work with this tool was undertaken on paper. However it was a very simple exercise to adapt this to a Google Docs format. The tables were easily drawn into Google Docs, and it has been a simple matter to paste suitably attributed reading material into the same doc. It became a matter of doing a ‘copy and paste’ when finding the supporting evidence, but that is of little import as they must have read the evidence in order to understand whether or not it supports their decision. The Google tool gave me the opportunity to give more detailed feedback to students on the Google Doc.

I developed a document that contained a series of these tasks, with a ‘Table of Contents’ at the beginning, so students can easily go to the task I want them to complete. Topics are now introduced using the 'Before and after reading' grid, and I use the approach to help students to dig more deeply into readings on more complex topics, especially what I would call 'real world' readings written by experts in the chosen topic.

The use of the Google Docs tool means that I get better written responses to the final question I pose to test understanding. I also seem to get better reading responses as students read the material more carefully and thoughtfully. Obviously I can't claim that the use of eLearning tools has done this, but it's interesting speculation, isn't it?

Wednesday, March 5, 2014

1:1 - the early days.

It has been four weeks since we embarked upon our own 'brave new world' with the implementation of our 1:1 laptop programme. We invested a lot of time in staff development and in trialling and discussion in the two to three years prior to this. We undertook extensive preparation trying to contemplate things that might go wrong, knowing that we could never anticipate everything.

What have I seen?

On the positive side I don't think we've seen major problems. We anticipated lots of potential problems and I think we've been successful in being prepared to meet those problems. Things like preventing boys leaving laptops lying around as they attended school assemblies or chapel services.

However what I don't believe we've seen so far is radical transformation of learning. There has been no revolution in the classroom.

What I think we have seen is the beginnings of a gradual transformation. Some staff have shown a willingness to look at new ways of doing things, a willingness to take a look at interesting looking apps in order to consider how they might improve learning.

I don't think that that approach is any different to anything we have ever seen before. When overhead projectors were first introduced I don't recall teachers having a clear idea of how they would use them (I do remember their early days). Instead teachers used them to do what they'd always done, and then began to see new ways of using the technology.

I suspect that when chalk boards were first introduced it took a while to see how they might be used in a way that was different to a slate.

I think that at a professional level teachers are no different to any other work group. There are those who are innovators and early adopters, there are those who will adopt practices established by those early adopters, and those who will resist change. We are however seeing more conversations amongst staff about teaching and learning.

Deep down I reckon that most teachers are fundamentally creative. They want to do cool things, they want to use cool stuff. They just need to align those things with their personal paradigms of teaching and learning. There are (as there have always been) those teachers whose paradigm is more constricted than others. Consequently their uses of laptops in classes are less imaginative than for others. Fortunately I don't see those teachers standing in the way of the others who are keen to get on with the job.