
Lazy writing … or pressures of deadlines? May 3, 2015

Posted by IaninSheffield in Musings, research.

flickr photo by Patty Marvel http://flickr.com/photos/pattymarvel/16304315951 shared under a Creative Commons (BY-NC-ND) license

I’m not sure when I started becoming a much more critical reader/consumer of information delivered through the Internet, or whether being ‘picky’ is just a personality trait, but recently one particular stream of information has become an itch I need to scratch.

For some while I’ve found Edudemic to be a helpful source of inspiration and information; its RSS feed has always had a regular place in my reader. From interesting and helpful ideas, through emerging educational technologies, to provocative articles, I’ve regularly found something to stimulate my thinking. Recently though, my spidey-sense has started to tingle when reading some of the articles. If someone writes

Although 80% of K-12 teachers do have social media accounts, such as Twitter for personal or professional use, most of them don’t integrate them into classroom lessons.

that figure makes me sit up and take notice. 80%? As much as that?! I don’t get the impression from colleagues that it’s as high as that, but maybe I’m missing something. First thing then is to check the source of that figure; the link was, as you can see, helpfully provided.

The survey from which the data arose was undertaken by the University of Phoenix, and the article helpfully provided a brief overview of its survey methodology. As an ‘online survey’, wouldn’t it be fair to say that it drew from a skewed population? So rather than 80% of teachers, we’re already at 80% of teachers who would complete an online survey. What we don’t know is how the survey respondents were recruited; it could have been through social media sites, and if it was, then how is the 80% figure now looking in terms of being reflective of the whole teaching profession? (The University of Phoenix article did provide contact details for anyone who wanted more details on the methodology, so we could doubtless find our answers.)

Having spotted one instance of data perhaps not telling the whole story, other examples started to become apparent. In “Teachers Guide to Polling in the Classroom” we’re told that

Research has shown that students absorb new information into chunks, with 20 minutes being the limit for that information to go from short-term to long-term memory.

Which seems somewhat familiar, but rather than take it at face value, we can follow the helpful link. This takes us to a brief article on “Use It, or Lose It! Retaining New Knowledge with E-Polling” on the Colorado State University website. This actually cites what sounds like an academic article (Orlando, 2010) and in the references provides a link … to another brief article on polling technologies. This casually mentions

… it’s been proven that most people can only retain about 20 minutes of content in our short-term memory before we have to reflect on it in order to move it to our long-term memory …

yet fails to tell us where the evidence for this claim can be found. So research may indeed have shown that ‘students absorb new information into chunks,’ but at least do us the courtesy of providing a specific citation for the actual source, rather than bouncing us around a couple of other articles, neither of which provides a foundation for the claim.

A final example (The Four Negative Sides of Technology) offered up plenty of threads at which to pick, including

More than a third of children under the age of two use mobile media.

Whereas what the report actually said was “38% of all children under 2 have ever used a smartphone, tablet, or similar device.” ‘Use’ as opposed to ‘have ever used’ might seem to matter little, but I’d argue there is quite a significant shift in emphasis by changing the phrasing slightly. In the same article there’s

A report from the United Kingdom revealed that kids who use computer games …

where the link doesn’t actually go to a report, but to a Telegraph article about the report, which inevitably includes journalistic licence. Why not simply link directly to, and quote from, the report itself?

I suppose I’m simply being pedantic, expecting authors who are trying to convey a particular message and who might be on tight deadlines, to be completely rigorous and accurate in their referencing. Or with the impending election here in the UK, perhaps my cynicism filter needs recalibrating. And yet with the ease with which headlines can be quickly bounced around the Internet these days, if people don’t take the time to verify for themselves the claims that are being made within an article and simply take a headline Tweeted out at face value, groupspeak and the echo chamber become the norm. That’s beginning to bother me.

The Interconnected Model – Part 2 March 28, 2015

Posted by IaninSheffield in CPD, research.

As I mentioned in the preceding post, I wanted a way to explore the RiskIT Week programme we recently undertook in school. This is our third year of RiskIT and I felt it was time to focus in a little more closely on how it works, so wondered whether the Interconnected Model might provide a useful lens. Let’s consider how the Model might look for a particular individual then.

During the preliminary week of RiskIT, colleagues offer brief sessions sharing interesting practice where they enjoyed some measure of success. Let’s imagine Sarah attended a session where Paul was showing how he’d used Google Slides on a collaborative group project with his Y10 class.

Sarah was sufficiently inspired to try it out in one of her Y9 lessons (1) and could then add that technique into her professional repertoire (2), becoming slightly more capable as a user of learning technology and having a new way through which to undertake collaborative work. Subsequently whilst reviewing a project she had done with the same Y9 group, she found that several of the students had transferred what they had learned in the RiskIT lesson to help them complete their project (3). This caused Sarah to reflect on the consequences of the lesson in a deeper way and helped to further embed what she had learned about collaborative work and Google Slides (4).
Of course different participants might have completely different models.

James was recently at a subject co-ordinators’ meeting where someone had demonstrated using Socrative as a lesson exit ticket system (1). Having wanted for a while a quicker way of scanning his classes for how much they had understood during lessons, he decided to try it with his Y11s to establish how well the group had understood the introduction of a difficult concept (2). The intention was to use the feedback from the class to prepare the follow-up lesson. Unfortunately, he hadn’t allowed sufficient time at the end of the lesson for the students to power up the laptops, log on, access his exit ticket, then log off and put the laptops away. He got very little usable information. Following a rethink (3), in preparation for a repeat with his next group, he asked if those who had them would use their smartphones (4). This time everything was completed in a few short moments (5) and he had the feedback he needed (6).

Although reflecting on the activities in this way is useful for me, it would be so much more powerful for colleagues to reflect on their own undertakings with a view to exploring what went well and what might need further attention (and how to go about that).

As I’ve started to look at our RiskIT in this way, I can see where our emphasis might need to shift for next year. Although the project closes when we share our Risks amongst each other, what we don’t do so well is to share our reflections on the outcomes. But then again, would people be able to find the time to read or hear about their colleagues’ experiences? Perhaps the most important bit of all?

In the concluding post of this series, I’ll consider some of the implications that taking a perspective using the Interconnected Model has revealed.

The Interconnected Model – Part 1 March 24, 2015

Posted by IaninSheffield in CPD, research.

I recently came across an interesting paper1 on teacher professional growth in which the authors propose a model to examine and explain teacher change as a complex, interwoven learning process. Where professional development programs based on a deficit-training-mastery model have largely failed to effect teacher change, those initiatives which enable greater agency and which allow (encourage?) teachers to become active learners who reflect and act on their learning have proven more effective.

Clarke and Hollingsworth developed the Interconnected Model (IM) to describe how this might happen, locating change in any of four connected domains:

  • Personal domain (teacher knowledge, beliefs and attitudes),
  • Domain of practice (professional experimentation),
  • Domain of consequence (salient outcomes),
  • External domain (sources of information, stimulus or support).
the interconnected model of professional growth

creative commons licensed (BY-NC-SA) flickr photo by ianguest: http://flickr.com/photos/ianinsheffield/16710551680

The mediating processes of ‘reflection’ and ‘enactment’ can translate changes from one domain into another, so for example, undertaking a new practice in the classroom might cause one to reflect in such a way as to change one’s attitude to a particular approach. By using the IM as a lens through which to view different professional learning experiences, we can perhaps gain insights to help inform our approach to professional development or professional learning.

Let’s explore this further with a simple example from personal experience, but perhaps common across teaching practice:


In developing a new activity, or simply modifying an old one, we might use our current knowledge to plan then try out the activity with a class. Reflecting on how effective the activity was, we might then readjust our knowledge-base to accommodate that new learning for future use. Or if the outcomes were not quite as we might have hoped, we might draw further on our knowledge-base to readjust the activity to undertake a further iteration of the loop. Don’t we regularly do this if we have two or more classes running parallel through the same scheme of work? Whether the activity works first time or not, we often take another lap or two around the loop to accommodate the different learning needs of subsequent classes.

Try as we might, sometimes the activity just doesn’t seem to be successful, so here we might draw on the external domain, by perhaps discussing things with a colleague or searching for potential solutions on the Web. You might like to consider how we should adapt the diagram to reflect that.

In the next post, I’ll attempt to use the Interconnected Model to explore a recent initiative in school – RiskIT.

1Clarke, D., Hollingsworth, H., 2002. Elaborating a model of teacher professional growth. Teaching and Teacher Education 18, 947–967.

Definitely HandsOn … December 2, 2014

Posted by IaninSheffield in CPD, research.
hands on

creative commons licensed (BY-NC-SA) flickr photo by Building Unity 1000 Families: http://flickr.com/photos/buildingunity/303497031

This post might go some way towards explaining why (once again!) posts have lost their regularity recently. For the last five weeks I’ve been participating in the 3rd edition of the HandsOnICT MOOC and it’s rather sucked up my time. I’m not a ‘serial MOOC dropout‘ who visits to get a flavour of the content, the practice or the community; if the topic being covered will address a need for me, then I’m in and will do my utmost to see it through. And so it proved with HandsOn – Design Studio for ICT-based Learning Activities (DS4ICTL); I committed to the full five weeks … and full-on it proved!

This was no gentle stroll through a few interesting creative exercises or discursive mental conundrums. No watching a few talking heads, then answering a few auto-marked questions or writing a reflective post or two. DS4ICTL is delivered through a Moodle implementation (supported by ILDE), consists of five modules of study, each with several activities including peer mentoring, is facilitated by a group of experienced online tutors, runs in seven language streams and uses Open Badges to credential the learning. Phew! I was attracted to learning about the design-based approach when creating online/elearning activities. There seemed to be plenty in there that might prove both fresh and useful in supporting me in my role in school. Additionally I’d be working on a project I needed to undertake as part of my work schedule. Good authentic, grounded learning then.

During the first week, the activities sought to familiarise us with the work environments, discussion and reflection areas and introduce us to our peers. Then over subsequent weeks we chose a project, explored the context within which it would be developed and brought some of the principles of design into realising our resource. Many of these principles were new to me and required some degree of persistence to become more comfortable with them. Perhaps that’s what contributed to the time it required each week to work through the activities? I’d decided I was prepared to allow five-ish hours a week, but actually it often turned out to be more. This was a MOOC; there was no compulsion for me to do that, but somehow this was different. It mattered. It felt … professional. (And I mean that in several ways)

Given the amount of time it required, one would hope I gained something from the experience and of that, I have no doubt:

  • It extended my learning – I became more familiar with how to use design principles in creating learning activities; about using personas, scenarios and prototyping; heuristic evaluation; andragogy and heutagogy.
  • It extended my personal learning network – despite the large numbers in the MOOC, there were fewer in the English language stream and only a handful who were clearly out to complete in the scheduled time. Since we were often exchanging views and ideas with the same people, it allowed a greater degree of familiarity than we might usually expect in a MOOC.
  • It developed my skills – we worked in several environments for different aspects of the course, thereby gaining a breadth, if not depth, of experience in new workspaces.

I was impressed by how quickly issues were resolved, either by the tutors who were clearly committed to the course, or by peers, who were clearly switched on. As a result, I now have the framework within which to build a resource I’ve been meaning to produce for some while. It’s sufficiently developed (and hopefully robustly designed!) and ready to deploy, so that colleagues will hopefully be enjoying the benefits in the very near future.

In addition to the demanding time commitment, there were other aspects of the course I found tough:

  • Maintaining station within the course timeline. I found that when I slipped slightly behind, despite the notion that participants could work at their own pace, I floundered. This was because I felt out of place; uncomfortable commenting on the posts of those further forward and less in touch with those following behind. Furthermore, committing to supporting and learning from those at the same point in the course as you meant you had less time to devote to those further back on the timeline; those who might in fact benefit from a little extra encouragement.
  • Peer mentoring. Commenting on people’s posts in discussions is fine; I’m used to that, but providing the formal feedback using a scoring rubric was much harder. Applying the rubrics was fine, but trying to offer supportive feedback when criteria hadn’t been met, especially when you’re dealing with fellow professionals whom you don’t know, isn’t easy. There’s the temptation to be more lenient than perhaps we might with our students; after all it’s only a MOOC that someone’s taking part in out of interest. It’s hardly a high-stakes environment. On one shoulder I had the hard-nut angel that was my professional integrity and on the other the sweet angel who sees no value in upsetting someone for no reason. Who won? Well, you’ll have to ask those whose contributions I evaluated. I’d also add here the frustration I’d sometimes feel if an assessment had asked the learner to provide links to ‘a’ and ‘b,’ but the learner only provided ‘b’ with no explanation why ‘a’ was missing. Obviously there’s no compulsion to complete everything or even anything within the MOOC, but when a peer is relying on you being clear in order to fulfil their own obligations … well, like I said, frustrating.
  • Pitching responses appropriately. Linked with feedback I also found it harder than usual knowing how to pitch responses to people’s comments. When someone participates in a course in a language which is not their first, I have nothing but admiration, though that naturally demands more thought when responding to their contributions, so as not to offend. (Good experience and useful practice though, given the increasing number of students we’re welcoming from overseas).
  • Navigating the different environments. It wasn’t that I couldn’t cope with this, so much as finding it frustrating flipping from one back to the other … especially when the navigation didn’t ease those transfers (due to technical constraints caused by having to support different language streams). Although I managed, I suspect a MOOC novice, or someone less confident with online learning, could find it rather overwhelming or intimidating.

In summary then, DS4ICTL proved to be a valuable experience; perhaps the most useful MOOC I’ve had the pleasure of participating in. It was well designed, well organised and well supported. All credit to the designers and facilitators; it must have been a mammoth undertaking. I’d suggest either reducing the content slightly, or spreading it out over an extra week, just to reduce the weekly demand. If the demographic of potential participants is those who are reasonably well along the digital literacy continuum, then it’s probably pitched well, but it’s a little too complex for novice learners I’d argue. If there was another HandsOn MOOC on a different topic, I wouldn’t hesitate to sign up.

The badges earned through the course can be viewed here. As with all digital badges, they have metadata attached enabling a viewer to establish who the issuer was and under what circumstances. Might have been helpful if the learning outcomes for each award could also be listed and even some of the evidence? Most of the badges also transferred across to my Backpack.
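To give a flavour of the metadata a viewer can inspect, an Open Badges assertion is just a small JSON document. The sketch below follows the field names of the public Open Badges 1.x specification, but the identifiers, URLs and dates are entirely hypothetical placeholders, not the actual DS4ICTL badge data:

```python
import json

# A sketch of an Open Badges (v1.x) assertion: the metadata attached to an
# earned badge. Field names follow the specification; the uid, URLs, hash
# and date below are hypothetical placeholders.
assertion = {
    "uid": "ds4ictl-2014-001",  # hypothetical badge identifier
    "recipient": {
        "type": "email",
        "hashed": True,  # the earner's email is stored as a salted hash
        "salt": "deadsea",
        "identity": "sha256$c7ef86405ba71b85acd8e2e95166c4b1"
                    "11448089f2e1599f42fe1bba46e865c5",
    },
    # URL of the BadgeClass, which describes the name, criteria and issuer
    "badge": "https://example.org/badges/ds4ictl.json",
    "verify": {
        "type": "hosted",  # verified by fetching the hosted assertion
        "url": "https://example.org/assertions/ds4ictl-2014-001.json",
    },
    "issuedOn": "2014-12-01",
}

print(json.dumps(assertion, indent=2))
```

Following the `badge` URL is how a viewer establishes who the issuer was and under what circumstances the badge was awarded; the learning outcomes and evidence I mention would naturally sit in that BadgeClass or an `evidence` field on the assertion.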

“Storm” … or just blustery conditions? October 29, 2014

Posted by IaninSheffield in Musings, research.
blustery conditions

creative commons licensed (BY) flickr photo by ell brown: http://flickr.com/photos/ell-r-brown/5946492853

My sense is that ICT, and the ICT community of which we are all a part, is at a crucial time in its evolution, as is the role of ICT in the education system

I was reminded of the above when reading Nick‘s recent post on Learn eNabling, in which he draws attention to the rising ‘tide of opinion and commentary’ asserting that technologies in schools have failed to make a positive impact on student learning or achievement. But those people are right. The evidence that technology has a significant impact on achievement or learning is notable by its absence. Or perhaps more accurately, by its lack of consensus.

Here’s why I think that might be. As Nick also suggested

…it is often difficult to establish hard evidence of improved pupil attainment as a result of using ICT. Isolating the impact of ICT from all other factors that can affect achievement can be problematic.

Balanskat et al, 2006

So the research is actually quite hard, or in some cases, even flawed.

The question of whether or not ICT has made significant impacts on a wide variety of student learning outcomes is still in doubt because of the variety of assumptions made in many research studies and the limited reliability of some research methods.

Cox and Marshall, 2007

For example

The connection between the use of ICT and the achievement of students is only valid when the means of measurement is congruent with the means of teaching and learning. In some studies there is a mismatch between the methods used to assess the effects of ICT on student achievement and on how ICT is actually used in the classroom.

Trucano, 2005

And how many studies go this far?

In order to understand the impact of ICT on learning, a holistic approach is needed that takes into account the socio-economic context, the learning environment, and teacher training

Punie et al, 2006

and I’d also add to that institutional strategies, goals and norms; external assessment regimes linked with school and teacher accountability; and of course ongoing political agendas.

If we take a step back for a moment, are we really saying that the increased levels of technology in schools have resulted in “no significant difference?” If so then the massive levels of investment have indeed been for naught. (Here I should point out that there is no question that substantial sums of money have been and are being spent unwisely by faculties, schools, local authorities and central government, for a whole host of reasons … but that’s another post) However I think that technologies have indeed made a difference by ‘adding value’ to the learning experience: they have done so by smoothing communications and improving the connections our students can make; they have provided easy access to vast repositories of data and information; they have provided channels through which students can ‘publish’ evidence of their learning to an authentic audience; and they have given learners the tools to take control of their learning. They have made what would previously have been impossible or very difficult, achievable, manageable and (relatively) easy. I readily accept that not all students, teachers or schools are doing all of the aforementioned, but many are well on the way, so perhaps this is where the impact should be sought? If seeking a difference in students’ attainment isn’t a realistic endeavour, then maybe we should be looking for where learning technologies can actually make a difference?

The naysayers and righteous sceptics may indeed have a point … or perhaps they’re missing it?

Ironically, it may be that the poorly resourced, inadequately trained, poorly conceptualised and inadequately operationalised forms of ICT usage so far on offer have sapped teachers’ interests, yet it is because the ICT has not been utilised as an integral part of a transformed classroom learning experience that it has failed.

Perhaps as eNOOBs, we really do have our work cut out? Or perhaps these are the challenges to which we need to rise?

I’ll close by providing attribution for the quotes which bookended this post – Professor David Reynolds, Building an ICT Research Network Conference, 2001!

Balanskat, A., Blamire, R., Kefala, S., 2006. The ICT Impact Report: A Review of Studies of ICT Impact on Schools in Europe. European Schoolnet.
Cox, M.J., Marshall, G., 2007. Effects of ICT: Do we know what we should know? Education and Information Technologies 12, 59–70.
Punie, Y., Zinnbauer, D., Cabrera, M., 2006. A Review of the Impact of ICT on Learning. Working paper prepared for DG EAC. Seville: JRC-IPTS (Joint Research Centre – Institute for Prospective Technological Studies).
Trucano, M., 2005. Knowledge Maps: Impact of ICTs on Learning & Achievement. Washington, DC: infoDev / World Bank.