The Interconnected Model – Part 3 April 1, 2015 Posted by IaninSheffield in CPD.
Tags: CPD, interconnected
In preceding posts, I introduced the Interconnected Model as a mechanism through which to explore teacher growth, then discussed how that might provide one way by which to consider our recent RiskIT project. Here I’ll briefly reflect on the ways the Interconnected Model has informed my thinking and perhaps more importantly, what it has to say for our professional development agenda.
It’s perhaps important to state from the outset that this has been no more than a mental exercise on my part; the examples referred to in the previous post used nothing more than imaginary personas. As I mentioned, it would be far more powerful and informative if we could gather the experiences of all our RiskIT participants and analyse the growth sequences they felt they undertook. Clarke and Hollingsworth1 observed:
The non-linear structure of the model provides recognition of the situated and personal nature, not just of teacher practice, but of teacher growth: an individual amalgam of practice, meanings, and context.
By synthesising the messages in the models contributed by the whole community, we might be better placed to adjust our provision to meet the needs of our participants and to allow for varied and individual growth.
Although the two examples in the previous post are fictional, they mirror reality in one sense; neither has any arrows which indicate flow to the external domain. Although we gather and share the outcomes of our Risks, the successes and failures (yes, I dare use the word!) within our community, it goes no further than that. We don’t formally reflect on the process, as we might if we were using the Interconnected Model as an interpretative tool. This means that our individual learning experiences can’t help inform each other’s knowledge and practice in anything other than a surface way. The brief information we share is decontextualised, making it harder for colleagues to see whether someone else’s journey through a particular Risk can help guide their own development. With adequate preparation, we could doubtless include a reflective element using the IM, but even if we did share those reflections within our community, would colleagues have the time (or inclination) to learn from the experiences of others? If we are to move beyond a simple surface approach, perhaps we need to commit more deeply? But to do that, colleagues will need the time and space. As Dylan Wiliam2 exhorts:
(School leaders) …should create a culture for the continuous improvement of practice, and to keep the focus on a small number of things that are likely to improve outcomes for students. In addition, they need to create the time within the existing teachers’ contracts to do this, and to encourage the taking of sensible risks.
1Clarke, D., Hollingsworth, H., 2002. Elaborating a model of teacher professional growth. Teaching and teacher education 18, 947–967.
2Wiliam, D., 2010. Teacher quality: why it matters, and how to get more of it. Paper given at Spectator ‘Schools Revolution’ conference http://www.dylanwiliam.org/Dylan_Wiliams_website/Papers.html
The Interconnected Model – Part 2 March 28, 2015 Posted by IaninSheffield in CPD, research.
Tags: CPD, professional development, riskit
As I mentioned in the preceding post, I wanted a way to explore the RiskIT Week programme we recently undertook in school. This is our third year of RiskIT and I felt it was time to focus in a little more closely on how it works, so wondered whether the Interconnected Model might provide a useful lens. Let’s consider how the Model might look for a particular individual then.
During the preliminary week of RiskIT, colleagues offer brief sessions sharing interesting practice where they enjoyed some measure of success. Let’s imagine Sarah attended a session where Paul was showing how he’d used Google Slides for a collaborative group project with his Y10 class.
Sarah was sufficiently inspired to try it out in one of her Y9 lessons (1) and could then add that technique into her professional repertoire (2), becoming slightly more capable as a user of learning technology and having a new way through which to undertake collaborative work. Subsequently whilst reviewing a project she had done with the same Y9 group, she found that several of the students had transferred what they had learned in the RiskIT lesson to help them complete their project (3). This caused Sarah to reflect on the consequences of the lesson in a deeper way and helped to further embed what she had learned about collaborative work and Google Slides (4).
Of course different participants might have completely different models.
James was recently at a subject co-ordinators’ meeting where someone had demonstrated using Socrative as a lesson exit ticket system (1). Having wanted for a while a quicker way of scanning his classes for how much they had understood during lessons, he decided to try it with his Y11s to establish how well the group had understood the introduction of a difficult concept (2). The intention was to use the feedback from the class to prepare the follow-up lesson. Unfortunately, he hadn’t allowed sufficient time at the end of the lesson for the students to power up the laptops, log on, access his exit ticket, then log off and put the laptops away. He got very little usable information. Following a rethink (3), in preparation for a repeat with his next group, he asked those who had them to use their smartphones (4). This time everything was completed in a few short moments (5) and he had the feedback he needed (6).
Although reflecting on the activities in this way is useful for me, it would be so much more powerful for colleagues to reflect on their own undertakings with a view to exploring what went well and what might need further attention (and how to go about that).
As I’ve started to look at our RiskIT in this way, I can see where our emphasis might need to shift for next year. Although the project closes when we share our Risks amongst each other, what we don’t do so well is to share our reflections on the outcomes. But then again, would people be able to find the time to read or hear about their colleagues’ experiences? Perhaps the most important bit of all?
In the concluding post of this series, I’ll consider some of the implications that taking a perspective using the Interconnected Model has revealed.
Definitely HandsOn … December 2, 2014 Posted by IaninSheffield in CPD, research.
Tags: academic, badges, course, CPD, MOOC
This post might go some way towards explaining why (once again!) posts have lost their regularity recently. For the last five weeks I’ve been participating in the 3rd edition of the HandsOnICT MOOC and it’s rather sucked up my time. I’m not a ‘serial MOOC dropout’ who visits to get a flavour of the content, the practice or the community; if the topic being covered will address a need for me, then I’m in and will do my utmost to see it through. And so it proved with HandsOn – Design Studio for ICT-based Learning Activities (DS4ICTL); I committed to the full five weeks … and full-on it proved!
This was no gentle stroll through a few interesting creative exercises or discursive mental conundrums. No watching a few talking heads, then answering a few auto-marked questions or writing a reflective post or two. DS4ICTL is delivered through a Moodle implementation (supported by ILDE), consists of five modules of study, each with several activities including peer mentoring, is facilitated by a group of experienced online tutors, runs in seven language streams and uses Open Badges to credential the learning. Phew! I was attracted to learning about the design-based approach when creating online/elearning activities. There seemed to be plenty in there that might prove both fresh and useful in supporting me in my role in school. Additionally I’d be working on a project I needed to undertake as part of my work schedule. Good authentic, grounded learning then.
During the first week, the activities sought to familiarise us with the work environments, discussion and reflection areas, and introduce us to our peers. Then over subsequent weeks we chose a project, explored the context within which it would be developed and brought some of the principles of design into realising our resource. Many of these principles were new to me and required some degree of persistence to become more comfortable with them. Perhaps that’s what contributed to the time it required each week to work through the activities? I’d decided I was prepared to allow five-ish hours a week, but it often turned out to be more. This was a MOOC; there was no compulsion for me to do that, but somehow this was different. It mattered. It felt … professional. (And I mean that in several ways)
Given the amount of time it required, one would hope I gained something from the experience and of that, I have no doubt:
- It extended my learning – I became more familiar with how to use design principles in creating learning activities; about using personas, scenarios and prototyping; heuristic evaluation; andragogy and heutagogy.
- It extended my personal learning network – despite the large numbers in the MOOC, there were fewer in the English language stream and only a handful who were clearly out to complete in the scheduled time. Since we were often exchanging views and ideas with the same people, it allowed a greater degree of familiarity than we might usually expect in a MOOC.
- It developed my skills – we worked in several environments for different aspects of the course, thereby gaining a breadth, if not depth, of experience in new workspaces.
I was impressed by how quickly issues were resolved, either by the tutors who were clearly committed to the course, or by peers, who were clearly switched on. As a result, I now have the framework within which to build a resource I’ve been meaning to produce for some while. It’s sufficiently developed (and hopefully robustly designed!) and ready to deploy, so that colleagues will hopefully be enjoying the benefits in the very near future.
In addition to the demanding time commitment, there were other aspects of the course I found tough:
- Maintaining station within the course timeline. I found that when I slipped slightly behind, despite the notion that participants could work at their own pace, I floundered. This was because I felt out of place: uncomfortable commenting on the posts of those further forward and less in touch with those following behind. Furthermore, committing to supporting and learning from those at the same point in the course as you meant having less time to devote to those further back on the timeline; those who might in fact benefit from a little extra encouragement.
- Peer mentoring. Commenting on people’s posts in discussions is fine; I’m used to that, but providing formal feedback using a scoring rubric was much harder. Applying the rubrics was fine, but trying to offer supportive feedback when criteria hadn’t been met, especially when you’re dealing with fellow professionals whom you don’t know, isn’t easy. There’s the temptation to be more lenient than perhaps we might be with our students; after all, it’s only a MOOC that someone’s taking part in out of interest. It’s hardly a high-stakes environment. On one shoulder I had the hard-nut angel that was my professional integrity and on the other the sweet angel who sees no value in upsetting someone for no reason. Who won? Well, you’ll have to ask those whose contributions I evaluated. I’d also add here the frustration I’d sometimes feel if an assessment had asked the learner to provide links to ‘a’ and ‘b,’ but the learner only provided ‘b’ with no explanation why ‘a’ was missing. Obviously there’s no compulsion to complete everything, or even anything, within the MOOC, but when a peer is relying on you being clear in order to fulfil their own obligations … well, like I said, frustrating.
- Pitching responses appropriately. Linked with feedback I also found it harder than usual knowing how to pitch responses to people’s comments. When someone participates in a course in a language which is not their first, I have nothing but admiration, though that naturally demands more thought when responding to their contributions, so as not to offend. (Good experience and useful practice though, given the increasing number of students we’re welcoming from overseas).
- Navigating the different environments. It wasn’t that I couldn’t cope with this, so much as finding it frustrating flipping from one back to the other … especially when the navigation didn’t ease those transfers (due to technical constraints arising from the separate language streams). Although I managed, I suspect a MOOC novice, or someone less confident with online learning, could find it rather overwhelming or intimidating.
In summary then, DS4ICTL proved to be a valuable experience; perhaps the most useful MOOC I’ve had the pleasure of participating in. It was well designed, well organised and well supported. All credit to the designers and facilitators; it must have been a mammoth undertaking. I’d suggest either reducing the content slightly, or spreading it out over an extra week, just to reduce the weekly demand. If the demographic of potential participants is those who are reasonably well along the digital literacy continuum, then it’s probably pitched well, but it’s a little too complex for novice learners I’d argue. If there was another HandsOn MOOC on a different topic, I wouldn’t hesitate to sign up.
The badges earned through the course can be viewed here. As with all digital badges, they have metadata attached, enabling a viewer to establish who the issuer was and under what circumstances the badge was earned. It might have been helpful if the learning outcomes for each award could also be listed, and even some of the evidence. Most of the badges also transferred across to my Backpack.
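For anyone curious about what that attached metadata actually looks like, here’s a minimal sketch of the kind of structure an Open Badges badge class carries. The field names follow the Open Badges specification, but the values (badge name, URLs, issuer) are invented purely for illustration:

```python
# A sketch of the metadata an Open Badges badge class carries.
# Field names follow the Open Badges specification; the values are
# invented placeholders, not real badges from the course.
badge_class = {
    "name": "DS4ICTL Module 1",
    "description": "Completed the orientation activities of the MOOC.",
    "image": "https://example.org/badges/module1.png",
    "criteria": "https://example.org/badges/module1-criteria",
    "issuer": "https://example.org/issuer",
}

def describe(badge):
    """Return the one-line summary a viewer might see on inspecting a badge."""
    return f"{badge['name']}: {badge['description']} (issued by {badge['issuer']})"

print(describe(badge_class))
```

The criteria URL is the natural home for the learning outcomes (and perhaps the evidence) whose absence I noted above.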
Thinking about teacher attitudes to technology May 12, 2014 Posted by IaninSheffield in CPD.
Tags: CPD, elearning, SAMR, teaching
If we weren’t able to help our students appreciate their current capabilities, how they might improve and how to set about that, we’d be failing in our duties as teachers. But how do we know our own level of capabilities, at least in regard to the use of learning technologies? By what yardstick can we measure our own progress? Without that, how can we even begin to see a path forward?
In the quest to find answers to these questions, I’ve come across a whole raft of contenders:
1. SAMR – Proposed by Ruben Puentedura; you can find a helpful set of resources which delve into the topic in more detail here. The model is incredibly useful for reflecting on the role of technologies in activities developed for use with our learners. Its simple four-level scale, divided into the two domains of enhancement and transformation, is accessible, understandable and enables teachers to quickly consider the impact that technology might have on the learning process. However, measurement against the SAMR model needs to be undertaken on an activity-by-activity basis; in one lesson with one group of students, you might be undertaking an activity at the Modification level, whilst during the very next lesson with a different group (or even the same one) technology might simply be used at the Substitution level. That’s absolutely fine. Technology isn’t always used to take us to new places; sometimes it simply helps make a task that little bit more manageable. Some people see the levels as a ladder and believe we should aspire to climb the rungs to Transformational enlightenment: by recording all the activities we undertake using technology, progress could be measured as the overall level moves towards Redefinition. I don’t subscribe to that. If someone understands how to use technology at the higher levels and does so within their practice at appropriate times, whilst at others using technology at the Substitution level, then that to me is acceptable. If they’re not in a position to do that, then perhaps remedial action does need to be taken.
2. To get a better overview of how technology is being used across a teacher’s practice, across the curriculum or across a school, the Technology Integration Matrix (TIM) offers itself up. Both that used by Florida and the one in Arizona have the same underpinnings and enable cross-referencing of five characteristics of meaningful technology integration at five different levels. Support for TIMs is extensive (lesson plans and video exemplars) and they offer useful lenses through which to view your own practice or that of others. The five characteristics quite rightly focus on the activities of the students and how they have been enabled or empowered to use technology … but I feel there are consequently areas within our own practice which are to some extent neglected.
3. One powerful lens through which to view the use of technology in learning is the TPACK framework (Technological Pedagogical Content Knowledge) proposed by Mishra & Koehler1. This requires teachers to consider three different components of their practice. Any particular teaching situation or activity involving the use of technology will involve expertise across the three domains and require an appreciation of the roles of the technology, the subject or content, and the pedagogy which enables the learning. Each teacher with each activity will encounter a unique context. In some circumstances where their content knowledge is well-grounded, they may wish to use a new technological tool and therefore need to reconsider their pedagogy, yet in another they may be teaching something for the first time and want to explore how to make the most of a tool they’re already adept with. The more often a teacher finds themselves at the heart of the diagram where all three domains intersect, or the degree to which they can see how to quickly navigate there, the more developed their practice is becoming. Powerful though TPACK may be, it is a framework more suited to deep reflection and to devising curricula and lessons which incorporate technology appropriately.
There are plenty of other frameworks cited in Knezek & Arrowood’s2 “Instruments for assessing educator progress in technology integration,” which can be divided into the three areas of attitudes, skill/competency and level of proficiency. Dating back to the turn of the millennium, some aspects of these instruments are now slightly dated, but could nevertheless be updated.
In the past we’ve asked colleagues to report on their skill levels with technology and subsequently put in place a programme to provide support. More recently we shifted the emphasis of our self-reporting process towards capability, rather than plain skills. Now however I’m wondering whether we need to dig a little deeper and explore some of the underlying attitudes which determine teachers’ beliefs towards eLearning and technology use.
Never one to shirk a challenge then I’ve drafted a framework which draws inspiration from SAMR, TIM and to some extent CBAM (Concerns Based Adoption Model, mentioned in Knezek & Arrowood). The matrix suggests teacher attitudes at four possible levels, across ten aspects of technology integration, the idea being that colleagues would choose statements that best reflect their attitude. This would generate a profile (a radar chart might be useful here), hopefully indicating areas in which they might be open to change. If nothing else, it should provide a starting point for discussion.
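To make the idea concrete (purely as a sketch: the aspect names and chosen levels below are invented placeholders, not the actual draft framework), turning a colleague’s chosen statements into a profile might work something like this:

```python
# A minimal sketch of generating a profile from the self-assessment matrix.
# The aspects and responses are invented examples; the real draft has
# ten aspects with statements at four levels.
ASPECTS = ["Confidence", "Openness to change", "Perceived value",
           "Collaboration", "Student use"]

# Each colleague picks the one statement (level 1-4) per aspect that
# best reflects their attitude.
responses = {"Confidence": 2, "Openness to change": 4, "Perceived value": 3,
             "Collaboration": 1, "Student use": 3}

def profile(responses, aspects=ASPECTS):
    """Return the chosen level per aspect in a fixed order, ready to plot
    as the radial values of a radar chart (one axis per aspect)."""
    return [responses[a] for a in aspects]

print(profile(responses))  # → [2, 4, 3, 1, 3]
```

Plotting those values radially, one axis per aspect, gives the radar chart suggested above, with low-scoring axes flagging areas in which a colleague might be open to change.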
The big BUT though is whether these criteria and the statements at each level are valid. What do you think? What might you add, leave out or amend? Feel free to add your observations below, or do please add comments to the draft document.
As Christensen et al. (2001)3 observed:
…not every educator is best served by training aimed at some arbitrary level, and that different levels of integration may require different techniques.
Before we decide on a professional development strategy, we clearly need to know the levels.
1Mishra, P., Koehler, M., 2006. Technological pedagogical content knowledge: A framework for teacher knowledge. The Teachers College Record 108, 1017–1054.
2Knezek, G.A., Arrowood, D.R., 2000. Instruments for assessing educator progress in technology integration. Institute for the Integration of Technology into Teaching and Learning, University of North Texas Denton. [online at http://www.iittl.unt.edu/pt3II/book1.htm, last accessed 12/05/2014]
3Christensen, R., Griffin, D., Knezek, G., 2001. Measures of Teacher Stages of Technology Integration and Their Correlates with Student Achievement.
You better, you better, you BETT February 3, 2013 Posted by IaninSheffield in CPD, Inspiration.
Tags: #bett2013, BETT, conference, CPD
So the BETT Show shifted lock, stock and no smoking barrels from Olympia across to the Excel Exhibition Centre. How was it for you? On balance I have to say I preferred the new venue for a bunch of reasons which can be found here – Tweets about “#thingsipreferredaboutexcel”, but as for the show itself, well it was a bit of a mixed bag. Somewhat unusually I gained more from the exhibitors I visited than from the presentations I attended.
As you become a more seasoned BETTer, you develop strategies for getting the most from your day. For me there are four aspects:
- attending some of the presentations which chime with either personal interests or link with plans we have in school
- visiting the exhibitors showcasing products which either we need or are considering back in school
- wandering around and benefitting from those serendipitous moments where you might catch a product you’d not even thought about, but which might offer new possibilities.
- catching up with friends both old and new.
There were three observations that particularly stuck in my mind as I travelled home. The first was how disappointed I felt having attended the four presentations I did. This wasn’t because they were poor, in fact quite the contrary – they were interesting, well delivered and contained useful pointers to resources and ideas. My disappointment stemmed from the fact that I didn’t actually learn anything new; these were all areas in which I currently have an interest, so I’ve already made it my business to find out what the current state of knowledge is and what the issues are. So maybe next year I need to seek out themes with which I’m less familiar (makes note to self). The second thing was just a wonderfully pleasant little moment as I was walking past the ‘Learning Together – heppell.net’ stand and a young chap of about 10 stopped me and boldly asked if I’d like to see the game they’d created. With that he sat me down next to his partner working at a computer, a Year 5 girl who then took me through how she’d created a simple little controllable animation in Scratch. She’d never used it before, hadn’t been shown what to do, but just followed some of the inbuilt help, experimented a little and in an hour produced a ‘game’ with which she was justifiably delighted. She could also tell me that she thought any of my Year 5 students back at school would be able to pick it up as easily and Year 6s would find it a doddle.
It was whilst I was here that the third thing caught my eye: the worksurfaces were writeable and had been written on using dry-wipe markers. Jottings, notes and ideas of people as they’d been exploring some of the exhibits. Yes, you’d have to be brave in certain circumstances to treat the desks or walls with this paint, but what a great idea! Brainstorming, group work and capturing discussions could all be done on work desks or walls and be available for classmates to ponder – learning made visible?
Who’d’ve thought my biggest takeaway from a technology show would be something as low-tech as a new paint?! It just goes to show what a show can show you.