A couple of things happened this week as I reviewed data collection methods and the role of key stakeholders. The first was watching Stephen Fry in America, principally because he was in San Francisco (I so love that city!). Fry was talking with the man who practically reinvented Apple with the iPod, who said that one of the reasons he loved being in America was that people listened to the ideas of others. Take-home message number 1 from the week. The other was being verbally abused in the warehouse by a woman whose kid was standing up in a supermarket trolley. I dared to share my experience of the time one of mine stood on the side of a trolley and bowled the whole thing over, eight-month-old baby strapped in the baby seat as well; to say I get a little freaked when I see unsteady babies standing in trolleys now is an understatement. However, she was not in a receptive mood, and I was listening to what she said (some of the time… the rest of the time I was trying to escape!). Anyway, she said that she had had four kids and none of them had ever had anything more serious than tonsillitis, so she wasn't about to spoil her record. Hmmm, I thought: success by another man's measure. Take-home point number 2 for the week: shut your mouth in the warehouse, and be sensitive to others' parameters for success.
So, how does this fit with evaluation? Well, interestingly, I was teaching (F2F) about clinical assessment this week, and the subject of how we measure 'expertness' came up. As a result of listening carefully to the ideas presented, we are now considering another, clinically based measure for evaluating it: if you like, a 'showcase' or 'master class' given by those who do not enjoy writing, or who fear it, but who can 'show and tell'. There are follow-throughs here to the Waterlow Scale e-learning unit. Rather than, say, questionnaires (which would reflect my questions) or focus groups (which could be difficult to arrange because of shift patterns), it may be better to look at the place of the 'showcase' approach for determining how the Waterlow risk assessment is applied in practice. The demons and dragons may be the time needed to gather the evidence, but the quality of that evidence is likely to be enriched. Coupled with other measures such as completion rates (quantitative rather than qualitative), this should also allow triangulation and make it possible to describe the 'success' of the work in others' paradigms.
I am also mindful that Bronwyn's advice has been to keep the evaluation small. To this end, my decisions this week have been around tools that could be used in conjunction with my original choices about guidelines, which were about defining a process to identify feasible and appropriate delivery methods, and about whether staff and students can easily use learning technologies and online resources. The material I have located from Sage directs consideration of the factors important in accessibility, and would also be a good process to follow to ensure that best practices in development have been met (I will attempt to post the document, or the URL, to the group later today). This document will help to gather everyone's perspective and to 'listen' to the good ideas that sometimes come very quietly!
Sam