Saturday, March 28, 2009

Apple Mac leadership and supermarket trolleys

A couple of things happened this week as I reviewed data collection methods and the role of key stakeholders. The first was watching Stephen Fry in America, principally because he was in San Francisco (I soooo love that city!). Fry was talking with the designer who practically re-invented Apple with the iPod, who said that one of the reasons he loved being in America was that people listened to the ideas of others. Take-home message number one for the week. The other was when I was verbally abused in the warehouse by a woman whose kid was standing up in a supermarket trolley. I dared to share my experience of the time one of mine stood on the side of a trolley and bowled the whole thing over, eight-month-old baby strapped in the baby seat and all. To say I get a little freaked when I see unsteady babies standing in trolleys now is an understatement. However, she was not in a receptive mood, and I was listening to what she said (some of the time...the rest of the time I was trying to escape!). Anyway, she said that she had had four kids and none of them had suffered anything more serious than tonsillitis, so she wasn't about to spoil her record. Hmmm, I thought: success by another's measure. Take-home point number two for the week (keep your mouth shut in the warehouse), and sensitivity to others' parameters for success.


So, how does this fit with evaluation? Well, interestingly, I was teaching (F2F) about clinical assessment this week, and the subject of how we measure 'expertness' came up. As a result of listening carefully to the ideas presented, we are now considering another, clinically based measure for evaluating it: if you like, a 'showcase' or 'master class' given by those who do not enjoy writing, or fear it, but can 'show and tell'. There is a follow-through here to the Waterlow Scale e-learning unit. Rather than, say, questionnaires (which would be my questions) or focus groups (which could be difficult to arrange because of shift patterns), it may be better to look at the place of the 'showcase' approach for determining how the Waterlow risk assessment is applied in practice. The demons and dragons may be the time needed to gather the evidence, but the quality of that evidence is likely to be enriched. Coupled with other measures like completion rates (quantitative rather than qualitative), there is also likely to be the ability to triangulate, and to describe the 'success' of the work in others' paradigms.
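For readers who haven't met the tool: the Waterlow Scale totals points across assessment items and maps the total to a risk band, which is the logic the 'showcase' would ask staff to demonstrate. A minimal sketch of that scoring logic follows, purely for illustration. The item names and point values are made-up placeholders, not the published scale; only the risk bands (10+ at risk, 15+ high risk, 20+ very high risk) follow the commonly cited Waterlow thresholds.

```python
def waterlow_band(total: int) -> str:
    """Map a total risk score to the commonly cited Waterlow risk bands."""
    if total >= 20:
        return "very high risk"
    if total >= 15:
        return "high risk"
    if total >= 10:
        return "at risk"
    return "low risk"


def total_score(item_scores: dict) -> int:
    """Sum the points assigned to each assessment item."""
    return sum(item_scores.values())


# Illustrative assessment only: item names and points are invented
# placeholders, not taken from the published Waterlow Scale.
patient = {"mobility": 3, "skin_type": 2, "continence": 1,
           "build_weight": 2, "age_sex": 3, "appetite": 1}
score = total_score(patient)
print(score, waterlow_band(score))  # 12 at risk
```

The point of showing it this way is only that the tool is mechanical to score but judgement-laden to apply, which is exactly what a 'show and tell' evaluation would surface and a completion-rate count would not.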


I am also mindful of Bronwyn's advice to keep the evaluation small. To this end, my decisions this week have been around tools that could be used in conjunction with my original choice of guidelines, which were about defining a process to identify feasibility and appropriate delivery methods, and whether staff and students can easily use learning technologies and on-line resources. The material I have located from Sage directs consideration of the factors important in accessibility, and would also be a good process to follow to ensure that best practices in development have been met (I will attempt to post the document, or its URL, to the group later today). This document will help to gather everyone's perspectives and to 'listen' to the good ideas that sometimes come very quietly!



Tuesday, March 17, 2009

Epiphanies and other things

I have been reading the paradigm material over the last couple of days and trying to get to grips with evaluation studies done in my own sphere of expertise. It has been interesting reading, especially when I coupled it with marketing material from Articulate software.

My first encounter with e-learning evaluation paradigms left me wondering if I am indeed as critically social as I think I am...because actually, in an e-learning context, I find my persuasion is toward the eclectic-mixed methods-pragmatic paradigm. No surprise there, I guess, as a nurse who is dominated by the pragmatic in work, learning and teaching style! The bit that got me was from the Reeves (2006) paper. Here, he suggests that what makes good instruction can be ignored: models of instructional design can be followed and yet the goals still not achieved. Oh yeah, been there, I thought!! Cut to a couple of days later and an email from Articulate marketing. Here, Kuhlman (2009) talks about taking the perspective of the learner and avoiding being stuck in quiz hell, limited navigation and over-informing (as opposed to providing opportunities for learning). Pullen (2006) also adds to the debate, arguing that health professionals learn better when they are able to apply their learning to their practice using clinical tools.

So where's the universe collision? Well, you may have seen my post to Michelle's blog...I was interested to praise the way in which she was being responsive to learners' needs. My quick trip into pedagogy vs andragogy also reminded me that, as a precursor to evaluation, I need to be clear about my intentions: am I engaging adults, and moreover, am I engaging qualified health professionals who want to improve their work by applying clinically relevant tools and information? It is easy to slip into providing information rather than providing opportunities from which to learn. So when Reeves (2006), Kuhlman (2009) and Pullen (2006) make their respective points, the truth I find there is that I need to reflect, from a learner's perspective, on what the applicability of the learning material actually is, rather than as the F2F teacher who gives great background, empirical evidence and a whole lot more introduction (because face to face we have the time to reflect, ponder, discuss etc.). My e-learning context requires that I treat previous professional experience as a done deal, or at least offer a chance to revisit relevant sources IF REQUIRED, not presume that everyone needs everything. They are adult learners and can decide that for themselves, or go back to supplementary material if need be. The outcome of this has been to re-do my storyboard for the evaluation project, and to remember that I am intending to achieve adult learner engagement with healthcare professionals who like to apply things to practice quickly and easily, while at the same time attending to Reeves' (2006) warning about following the model yet still missing the goals....

I'd like to count this as the first of my formative evaluations of the project, because from it has come a different way of viewing the presentation of the material, with key stakeholders in mind, a relevant philosophical perspective, and the potential to actually answer the evaluation questions in a more meaningful way.
Happy to hear your thoughts!


Tuesday, March 10, 2009

E-learning Guidelines

I thought I would start this post by giving you some context and explaining why I chose my quality issues and, subsequently, the particular e-learning guidelines.

I work in an environment where the patients are having elective or planned surgery. Usually, the patients are really well, have uneventful procedures and go home quickly to recuperate. Whilst we have a very high standard of nursing care and our nurses almost always get things just right, there are unfortunately occasions when a problem develops, as was the case when a patient made a complaint to the Health and Disability Commissioner about an aspect of care. Such action results in an investigation, and the outcome often indicates that a change in practice is desirable or, at least, a way of ensuring that the loop is closed for the future. The problem in my scenario related to the ongoing assessment of pressure areas. In general terms, you probably feel the need to wriggle every now and then, or your bottom goes numb when you have been at the computer too long: that's your body telling you to move. However, in some circumstances the normal pathways telling you to move aren't working, and it's our job to make sure you move, one way or the other!!!

Feedback from the HDC led us down the path of auditing current practice, recommending some changes and implementing them, and here's the biggie: we changed the tool we use to assess pressure areas. We implemented this change mainly by F2F sessions and did really well. However, we are a 24/7 service, and there are a number of staff who haven't yet managed to get to a F2F session and therefore haven't been taught how to use the new tool. So (and here come the quality issues):

1. Is the topic suitable for an e-learning context?
2. Will that learning be applied in a practice environment in order to improve patient pressure area assessments?

We are also at the very beginning of our organisation's journey through the development of e-learning as an additional strategy for offering teaching and learning, so mindful of that, I am looking for a topic rather than a course that will act as the demonstration site for 'showing' how e-learning can be useful for us.......

How the guidelines may help:

3.1 Students/Learning Design SD7
Is there a defined process to follow that identifies the feasibility of, and appropriate delivery modes for, the course?
2.3 Managers/Other support MO1
Can staff and students easily use the learning technologies and online resources?

The short answer is 'no'.
So in this case, I would have to say that these guidelines remind me that I should go back to the drawing board and look at all of the delivery options available, because whilst I might be a little gung-ho to get going on a project, this could well not be the right one. If I look at the characteristics of my remaining population, for example, they are night staff, and I would be asking them to use a new technology to learn a new assessment tool. Am I being realistic? There would be little or no helpdesk support overnight. Is this fair, appropriate...sensible? However, rather than rule the project out at this stage, I also know that night staff have been great early adopters of our on-line library system. The way we rolled that out was by training the trainers: the duty managers learned how, then supervised the others overnight. This could also work here.

Furthermore, I am reminded of previous papers we have completed, and of the fact that Salmon's (2002) model of on-line learning could be used to guide the implementation of the e-version, which would ensure that the ability to use the technologies is addressed. Certainly, the topic would lend itself well to interactivity (believe me!), and this is supported by Mason and Rennie (2006), who suggest that interactivity has the potential to motivate and engage: the reason, I suspect, why the on-line library was so well received.

These guidelines are going to be very helpful in actually articulating the process that should, or could, be followed, as well as indicating the kinds of questions I would like to ask during the formative evaluation stages.


Friday, March 6, 2009

Importance of quality in e-learning

I have been doing a little surfing this week, trying to find perspectives on quality in e-learning, and there is certainly a lot of discussion out there. But even after trawling and reviewing, I think I am more in a position to tell you what it is not than what it is, and thereby to articulate a position on its importance.

Let me give you a comparison...

try this:

You should find nine characteristics of quality e-learning from the Concord Consortium, who have been researching, teaching and experiencing e-learning for 15 years and are certainly in a position to suggest elements of quality in this context. I like their thoughts. Some of these characteristics I identify with from the perspective of an e-learner, and would agree that yes, they do make the experience better: facilitation, explicit schedules, high-quality materials. From my (brief) e-learning teaching experience I would also agree with other aspects, like ongoing assessment, limited enrolment etc.

Compare that, then, with the application Articulate (a 30-day free trial is available), whose makers say you can create e-learning in 10 minutes. Well, it took me 5 hours (please don't laugh!) and it's still not right, but I do like to think I aspired to creating a quality product for subsequent evaluation. I even found myself a willing volunteer for that part when I get there (my office mate)...anyway...

I just really got the feeling that the goals of these two were different, yet strangely somehow similar. Articulate do aspire to quality in e-learning in many ways: they want your product to look good (and at US$1,800 for the software, it ought to!!) and individuals to engage; they want your company to be able to deliver on-time, in-time education. They offer a means to have ongoing assessment, explicit schedules and high-quality(-looking) materials, and they will even give you some webspace to start your own (VERY basic) LMS. So what's wrong?

We are careering through an e-learning revolution where people are making a great deal of money selling (heck, I really want to say exploiting) on-line learning. Using software like Articulate you can set yourself up in a heartbeat. Take nursing, for example: Nursing Council requires that all nurses demonstrate 60 hours of professional development every three years, and it doesn't seem to matter much how you achieve that, so on-line material is fair game. If I'd had a spare few grand I could have been in business by now, potentially earning well off the back of professional knowledge and a bit of IT know-how. So why wouldn't you? ...back in a minute, just off to the bank...

Nah, only kidding!!!

I have come across the term 'rapid e-learning', and I can only think that being able to throw up on-line material quickly is what it means (I am happy to be disabused of this notion, however!). But what's missing is the pedagogy to start and run with, and the evaluation showing you have actually achieved what Gagné begged of us all those years ago, 'a change in behaviour', and done it, as Ausubel would entreat, in a meaningful way. So for me, quality in e-learning is about the strength of your systems and processes: yes, the Dunkin and Biddle process-product, but taking account of all of those really important things that contribute to the whole meaningful learning experience and, ultimately, positively impacting where it counts for you. This is the difference and, I guess, the point that the Concord Consortium are making. Quality e-learning is a thoughtful, proactive process, leading to learners who reap the benefits developmentally and intellectually. That way they will keep coming back, not desert you for the next cool thing that superficially meets a need. I hope...


Wednesday, March 4, 2009

why evaluation is important to me

Let me say at the outset that it is...but also let me say (because I have to get this off my chest), evaluation is the beggar of reactive education. I've read the opinion that 'we' the teachers don't really pay attention to evaluation, but I think we do, in terms of wishing we could do more!

So that said. Evaluation is important to me, because I want to know that the teaching and learning encounters in which the nurses engage, directly impact on the quality of care being delivered to the patients. I want to know that the knowledge they have is incorporated into the decisions that they make and is useful in helping them get to the source of the problem quicker or helps them to prevent or head off potential problems at the pass.....

I read with interest the link to the posting on the project evaluation toolkit (thank-you), and was really reassured by the statement on page 4 that the evaluator's perspective influences the questions asked and the strategies used. In a meeting this week for L&D, I was forced to talk evaluation in numbers to the CFO, because that's what he understands, but it is alien to me to try to talk ROI in numbers. Somehow, it feels 'soft' (in a qualitative sense) to say, "well, we saved lives today, you know". So evaluation is also important to me because I want to learn how to focus on the right questions rather than take the scatter-gun approach and look at everything (because it IS ALL important, you know!). I am interested to look at different approaches to evaluation because it may be a strategic win to evaluate in someone else's frame of reference: numbers rather than 'care'. At the end of the day, evaluation is a tool for me not only to check whether I am getting the right things right; my learning here could also help me to talk to those other perspectives in ways that mean something to them. If that assists the ongoing development of appropriate patient care, it's all good!
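For what "talking ROI in numbers" actually amounts to, the standard formula is just (benefit minus cost) divided by cost. A minimal sketch, with every figure below a made-up illustration rather than data from my project:

```python
def training_roi(benefit: float, cost: float) -> float:
    """Standard ROI formula: (benefit - cost) / cost, as a percentage."""
    return (benefit - cost) / cost * 100


# Hypothetical illustration: suppose the e-learning unit helped avoid
# two pressure-injury treatment episodes a year at $5,000 each, against
# $2,000 of development and delivery cost. These numbers are invented.
benefit = 2 * 5000.0
cost = 2000.0
print(f"ROI: {training_roi(benefit, cost):.0f}%")  # ROI: 400%
```

The hard part, of course, is not the arithmetic but defending the benefit figure, which is exactly where the qualitative "we saved lives today" evidence has to be translated into something a CFO will accept.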