What if the World goes “Meh?” November 3, 2014. Posted by IaninSheffield in Musings.
Tags: #ds106, #edutalk, feedback
Was catching up on a few podcasts over the weekend and stumbled upon one from the DS106 series on Radio Edutalk. Until now I’d not picked up any episodes in the series, since I’ve not been involved in DS106. The episode I caught was the Good Spell Episode 16 with Mariana Funes and John Johnston, who were discussing the effects of audience, or lack thereof, when you’re producing online artefacts. The hosts were talking about how it sometimes feels to post creations online for review, and then get no feedback. As Mariana put it:
Sometimes you might invest a huge amount of time on something and the World goes “Meh.”
As a blogger this is certainly an issue you have to come to terms with; if a tree falls in the forest and no-one is there to hear it, yada yada. Perhaps it’s simply an aspect of our web literacy we need to develop: how to cope with criticism, praise, constructive feedback … or even nothing at all.
It prompted me to think about the work our students produce and what the effects might be on less mature learners if we don’t respond adequately to the effort they’ve invested. It clearly has the potential to do far more harm than simply failing to help them make progress with that particular task. Their whole outlook on learning could be affected. I wonder if we take that into account when we’re worrying about the ‘marking’ load?
After this I’ll certainly be picking up a few more DS106 episodes.
VLE … not just a distribution tool? December 2, 2011. Posted by IaninSheffield in Management, Resources, Tools.
Tags: feedback, history, reflection, self-evaluation, vle
A colleague came to me recently asking if there was a way our VLE might be able to help with an initiative he wanted to undertake within his department and with their Year 8 students. They’d always solicited feedback from students, but this was usually ad hoc and with the intention of informing the course and lesson structures. Their wish was to make the feedback process more structured and more useful to the students, enabling them to monitor and reflect on their progress.
Although we have a learning platform built around a SharePoint implementation, we decided to use the Feedback tool in our Moodle VLE, since the resultant data could be viewed, extracted and manipulated a little more easily. Referring back to the feedback they provided would also be slightly easier for the students.
We chose a traffic-light system (Red, Amber & Green) to give a three-point scale, which the students would set against different aspects of their course: what they learned, what skills they developed, and so on. In addition, each section concluded with a free-text response into which they would add action plan points.
Once all the responses are submitted, the teacher can see an overview which immediately highlights the topic or skill areas with which the students felt less confident … which of course enables remedial action to be taken where necessary. S/he can also see the action points the students want to address, again making the choice of an appropriate course of action much easier and, hopefully, more effective.
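For the curious, the teacher’s overview essentially amounts to tallying the traffic-light responses per topic and flagging those where concern is widespread. This is purely an illustrative sketch (the topic names, data shape and threshold are hypothetical, and this is not Moodle’s actual API):

```python
# Hypothetical sketch of aggregating traffic-light feedback,
# NOT Moodle's Feedback API. Each student submits a dict
# mapping topic -> rating ("Red", "Amber" or "Green").
from collections import Counter

responses = [
    {"Forces": "Green", "Energy": "Amber", "Waves": "Red"},
    {"Forces": "Green", "Energy": "Red",   "Waves": "Red"},
    {"Forces": "Amber", "Energy": "Amber", "Waves": "Red"},
]

def flag_topics(responses, threshold=0.5):
    """Return (topic, concern) pairs where at least `threshold`
    of students chose Red or Amber, most concerning first."""
    flagged = []
    for topic in responses[0]:
        ratings = Counter(r[topic] for r in responses)
        concern = (ratings["Red"] + ratings["Amber"]) / len(responses)
        if concern >= threshold:
            flagged.append((topic, round(concern, 2)))
    return sorted(flagged, key=lambda pair: -pair[1])

print(flag_topics(responses))
```

With the sample data above, ‘Energy’ and ‘Waves’ surface as the areas needing remedial attention, while ‘Forces’ (only one Amber in three) does not.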
Each individual student can see a summary of their responses and print it out for future reference if appropriate. More importantly, they will be able to refer back to their responses later in the year when they repeat the process and thereby be able to see whether their action points have had the intended effects.
Here we can see the free-text responses showing the action points students had for one of the sections.
Formulating targets for self-improvement is never easy and as we can see, some of the responses perhaps need teasing out a little more. Part of the process of moving the students forward will be in helping them develop the more reflective aspects of their approach, so their action points become increasingly SMART.
Wouldn’t it be great if all subjects required their students to undertake self-reflection like this on a regular basis, so that it simply became a natural part of learning? And how about if that data was fed into a central system so a student could see their progress profile across their subject range? And if their pastoral tutors (mentors) had access to that data too, so that students got timely and appropriate guidance on addressing areas needing further development … and got praised for areas in which they’re improving?
OK I know. Small steps.
Deep Impact? October 18, 2010. Posted by IaninSheffield in Management, Musings, research, Teaching Idea.
Tags: feedback, impact, iPod Touch, metrics, research
At a recent meeting of ICT leaders from our partner schools, my colleagues from our iPod Touch Project gave a short presentation outlining progress in the first couple of weeks. It seemed to stimulate a fair amount of interest, judging by the number of questions which followed. One colleague asked how we were going to measure the impact of the project and whether any of the apps might help in this respect; he wondered whether performance in some of the maths skill-and-drill type apps might be monitored for improvement. A reasonable question, and one to which I suspect we already know the answer, but it got me thinking more seriously about measuring impact and whether crude ‘test-like’ performance indicators were actually what we ought to be using as impact metrics.
A good starting point might be to say what we mean by ‘impact.’ In the major ‘ImpaCT2’ study undertaken in 2002, impact is the result of an intervention intended to achieve an underlying policy goal; in this case clearly focused on the effects on pupil attainment as measured by national tests. In a similarly wide-ranging study undertaken in Nordic countries, the focus was more on pupil learning and explored pupil performance, the teaching and learning processes and communication and co-operation. Arguably less quantifiable than ImpaCT2, the Nordic study explored the perceptions of pupils, teachers and parents. Given the depth and rigour of these (and other) studies, perhaps it is disappointing that the degree of impact is lower than we might hope, given the level of investment in ICT.
At present the evidence on attainment is somewhat inconsistent … The literature is very positive about some aspects of ICT use, rarely negative, but mainly incomplete or inconsistent. (Condie & Munro, 2007)
There are two ways to look at this: either ICT isn’t solely about raising pupil attainment, or maybe we’re looking for impact in the wrong way.
The ICT Impact Report from European SchoolNet cautions us that
… measuring ICT impact on learning outcomes is only one area of potential impact. Much depends on how ICT is used and so it is important to consider the factors that prepare the ground for improved learning and consequently lead to better learning outcomes. A second crucial area of ICT impact is therefore the underlying teaching conditions that promote ICT enhanced learning.
Within the four key objectives in the DfES Harnessing Technology Strategy from 2005 we find ‘sharing ideas,’ ‘providing motivating learning experiences,’ ‘building open and accessible systems’ and ‘providing online resources.’ Maybe these are rich areas for mining examples of impact? In fact we perhaps need to heed Trucano (2005):
It may be that more useful analyses of the impact of ICT can only emerge when the methods used to measure achievement and outcomes are more closely related to the learning activities and processes promoted by the use of ICTs
So maybe on our list of metrics ought to be:
- Motivation and engagement (though this has been covered in other studies like Passey et al, 2004)
- Presenting and representing information in different ways (perhaps involving multimedia)
- Classroom talk and pupil interaction & collaboration
- Personalisation of learning, targeted to the needs of each learner
- Facilitation and enabling of creativity
- Involvement, inclusion and engagement of parents in the learning process
- Level of access to ICT (anywhere/anywhen)
Though we need to keep in mind the cautionary note that
ICTs are used most by teachers to fit with traditional pedagogies, but the greatest impact is with teachers who are already experienced edtech integrators.
It’s perhaps a wise move to return to the original premise on which our project came into being. It was intended to be a proof of concept: an exploratory study seeking to surface the issues with mobile devices in school (technical, pedagogical, behavioural, safety, security) so that we are better placed to plan our next steps.
How do you think we ought to measure impact?
Condie, R. et al., 2007. The impact of ICT in schools: a landscape review, Becta.
DCSF, Harnessing Technology: Transforming Learning and Children’s Services. Available at: http://publications.education.gov.uk/default.aspx?PageFunction=productdetails&PageMode=publications&ProductId=DFES-1296-2005 [Accessed October 17, 2010].
Harrison, C., 2002. ImpaCT2 The Impact of information and communication technologies on pupil learning and attainment. Available at: http://research.becta.org.uk/index.php?section=rh&rid=13606 [Accessed October 15, 2010].
Passey, D. et al., 2004. The Motivational Effect of ICT on Pupils, DfES Research Report RR523. Available at: http://www.education.gov.uk/research/data/uploadfiles/rr523new.pdf [Accessed October 18, 2010].
Pedersen, S. et al., 2006. E-Learning Nordic 2006: Impact of ICT on Education, Ramboll Management. Available at: http://www.oph.fi/download/47637_eLearning_Nordic_English.pdf [Accessed October 15, 2010].
Trucano, M., 2005. Knowledge Maps: ICTs in Education. What Do We Know about the Effective Uses of Information and Communication Technologies in Education in Developing Countries? Online Submission, p.77.