Hi all - just a quickie to say "I'M STILL HERE"! Thanks Bronwyn for your feedback. I hope I haven't missed anything - the only comment I could find was one on evaluation models??? So I have gone hunting and am currently looking at Stake's Responsive Evaluation Model. I'm blown away by the presentations people have posted and had really wanted to explore making something with audio-visuals. BUT.... Due to the setback in having to change my plan, my current overload of work in the "work-life" balance (like what life?) and the way that this month is disappearing - I've decided my presentation is going to be an exec summary. Will post this sometime this week.
Was interested to read the posting on Bronwyn's site re deadlines, and agreed with the comments, which seem to sum up to: deadlines have their place in some circumstances and in others are not appropriate.
Speaking of deadlines - I'm aiming for the 18th July as my deadline to complete this (hope this is ok with you Bronwyn) :]
All the best, Rika :]
Sunday, June 15, 2008
Tuesday, May 27, 2008
Week 10 & 11
Hi everyone, thanks for all the words of encouragement! I did end up having to change my plan and here it is: Version 2.
http://docs.google.com/Doc?docid=ddc7p856_2dm8jnzcm&hl=en
Thanks for the advice on the presentation Gordon - yours looks great. Am still having microphone difficulties so mine might have to be sound free! Either that or maybe you could all lip-read it? Rika :]
Friday, May 16, 2008
Plan and Presentation Woes! Week 10
Hello you lovely people!
First up - thanks HEAPZ Gordon for the advice on how to publish my plan! Really appreciated.
Second up - have a few problems to sort out. It looks like my plan may not be able to happen. It was dependent on the willingness of the institution to participate, and they may not be ready to do this. Will find out for sure early next week. I have been frantically trying to come up with another cunning plan - which I believe I have! All a bit frustrating.
Also - I have been having major problems trying to get a newly purchased webcam to work so I can do my presentation. My computer doesn't seem to like it and I can't get the microphone to work. I've just bought a new laptop and am now trying to set this up as it has an inbuilt webcam. All very time consuming when I have stacks of work and a wee child to hang out with! Am still smiling though!
Was stoked to see a few other presentations up and running - good on you guys! You are legends!
All the best, Rika :]
Monday, May 5, 2008
Week 9 Evaluation plan
I sent a copy of my evaluation plan to Bronwyn before the holidays and received some feedback - basically, that I should focus on one of my objectives and delete the other! Which I have done. So here it is again for some more feedback: http://docs.google.com/Doc?id=ddc7p856_0g9j38ddr I may still have to adjust the timeline a bit as I have to work in with the TEI that I am working with - and am meeting with them on Thursday :] All the best, Rika :]
Thursday, April 17, 2008
Week 8
Well - it's been a pretty mad couple of weeks and I'm feeling a bit behind. Am offline for a week (biking around Gt Barrier Island with some mad gel friends of mine) so am hoping a late presentation from me will be ok, as I've run right out of time this week!
My evaluation plan, however, is coming together quite nicely. Just needs to be negotiated and tidied up and added to here and there.
Essentially it is a front-end needs analysis - looking at:
a. is there a need to redevelop the existing marine programme into an electronically delivered programme?
b. if so - how would the design and delivery of this best meet the needs of key stakeholders and the target audience?
I'm using the first two steps of the model that I talked about last week (good suggestion, thanks Bronwyn!)
OK - it's getting late and it's been a very long day. Talk to you all in a couple of weeks!
All the best, R :}
Wednesday, April 2, 2008
Week 5&6 Evaluation Methods
Ok - here goes!
1. What type of evaluation do I think will best suit my project?
A few years ago I did a job for a TEI (tertiary education institute) creating a marine studies programme for school (year 13) students that would then feed into a science degree. At the time we discussed putting the programme online. I am interested in doing a front-end needs analysis to evaluate the need for this and how it might best be designed. The front-end or needs analysis is the most appropriate type of evaluation, as the e-learning programme has not been developed yet!
2. Articles providing theory and models to assist with my front end needs analysis
I went hunting for articles that would be helpful in this project. I have to say I was amazed at how few articles I found (after A LOT of searching) that actually address evaluations at the front end. Most seem to look at evaluating existing programmes. It was interesting that many of the relevant articles that I did find were focused on or written by/for business organisations. Which left me pondering the paradigms of the business and education worlds - where businesses are more likely to ascertain needs prior to development than the tertiary sector!?
Anyway - here is a brief review of two articles that I found useful in the context of my proposed project:
1. Borotis, S. A., & Poulymenakou, A. (2004). E-learning readiness components: Key issues to consider before adopting e-learning interventions. In eLearn 2004 Conference Proceedings (pp. 1622-1629), November 2004.
Borotis and Poulymenakou (2004) reviewed other studies and then defined seven components of e-learning readiness. These are:
1. Business Readiness: the link between organisational business priorities and characteristics, and e-learning efforts.
2. Technology Readiness: primarily the technical infrastructure.
3. Content Readiness: concerning e-learning content material.
4. Training Process Readiness: the ability of organisations to organise, analyse, design, develop, implement and evaluate a concrete training programme.
5. Culture Readiness: an organisation's perceptions and cultural parameters concerning e-learning adoption and use.
6. Human Resources Readiness: the availability and set-up of the human support system, including the receptivity and prerequisites of humans to learn successfully in the new environment and to facilitate its operation.
7. Financial Readiness: in other words, the budget allocation. Although e-learning helps to decrease training costs, it requires a significant investment to initialise and maintain.
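Just to get my head around how the seven components might be used in practice, here is a rough sketch of them as a readiness checklist. The component names come from Borotis and Poulymenakou; the 0-5 rating scale and the simple averaging are entirely my own invention, not something the paper prescribes:

```python
# Illustrative sketch only: the seven Borotis & Poulymenakou (2004)
# readiness components as a checklist. The 0-5 scale and the average
# as an overall score are my own assumptions, not from the paper.

READINESS_COMPONENTS = [
    "Business", "Technology", "Content", "Training Process",
    "Culture", "Human Resources", "Financial",
]

def overall_readiness(ratings):
    """Average a 0-5 rating across all seven components.

    Raises ValueError if any component is missing a rating.
    """
    missing = [c for c in READINESS_COMPONENTS if c not in ratings]
    if missing:
        raise ValueError(f"No rating for: {', '.join(missing)}")
    return sum(ratings[c] for c in READINESS_COMPONENTS) / len(READINESS_COMPONENTS)

# Hypothetical ratings for the TEI I am working with
ratings = {
    "Business": 4, "Technology": 3, "Content": 2, "Training Process": 3,
    "Culture": 4, "Human Resources": 3, "Financial": 2,
}
print(overall_readiness(ratings))  # 3.0
```

Even in this toy form it makes the point that a low score on a single component (say, Content) can be hidden by the average - so in a real evaluation each component would need to be reported separately, not just rolled up.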
Each of these components is then explored in more depth. In terms of paradigms... ummm.... I guess I can see strands of the Analytic Empirical Positivist Quantitative Paradigm - the model is pretty straightforward and looks at individual elements in isolation from their environment, and in this respect reflects the rational scientific approach.
Overall I found the article helpful in developing a front-end evaluation model - even though it has been written from the perspective of business rather than education institution e-learning development. Increasingly TEIs are taking a business case / business model approach to new developments, which makes many of the findings of the paper relevant to the tertiary sector. I can see a place for at least an element of all seven components in the evaluation that I am designing for my project.
2. Cook, D. A., & Dupras, D. M. (2004). A practical guide to developing effective web-based learning. Journal of General Internal Medicine (JGIM), 19, June 2004.
This article aims to assist in the development of effective educational websites. The authors offer a ten-step model for the effective development of web-based learning.
The Ten Steps to Effective Web-based Learning are:
1. Perform a needs analysis and specify goals and objectives
2. Determine your technical resources and needs
3. Evaluate preexisting software and use it if it fully meets your needs
4. Secure commitment from all participants and identify and address potential barriers to implementation
5. Develop content in close coordination with website design
6. Encourage active learning - self-assessment, reflection, self-directed learning, problem-based learning, learner interaction, and feedback
7. Facilitate and plan to encourage use by the learner
8. Evaluate - both learners and course
9. Pilot the website before full implementation
10. Plan to monitor online communication and maintain the site by resolving technical problems periodically
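As a quick sanity check on which parts of the model apply to me, the ten steps can be sketched as an ordered checklist with the preparatory (pre-development) subset pulled out. The step wording is from Cook and Dupras; treating exactly steps 1-4 as "preparatory" is my own reading:

```python
# The ten Cook & Dupras (2004) steps as an ordered list. Which steps
# count as pre-development (the first four) is my own interpretation.

TEN_STEPS = [
    "Perform a needs analysis and specify goals and objectives",
    "Determine your technical resources and needs",
    "Evaluate preexisting software and use it if it fully meets your needs",
    "Secure commitment from all participants and address barriers",
    "Develop content in close coordination with website design",
    "Encourage active learning",
    "Facilitate and plan to encourage use by the learner",
    "Evaluate both learners and course",
    "Pilot the website before full implementation",
    "Plan to monitor online communication and maintain the site",
]

def preparatory_steps(steps, cutoff=4):
    """Return the numbered pre-development steps (1..cutoff)."""
    return [f"{i}. {s}" for i, s in enumerate(steps[:cutoff], start=1)]

for line in preparatory_steps(TEN_STEPS):
    print(line)
```

The point of the cutoff is exactly what I say below: steps 1-4 happen before any development starts, which is where a front-end needs analysis sits.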
Whilst there are elements of the other steps that I can adapt for use in my evaluation plan, the preparatory (pre-development) steps are most relevant to what I am looking at doing. These include:
Step 1. Perform a needs analysis and specify goals and objectives
This step includes a needs analysis: problem identification, assessment of learners' needs, and assessment of the teaching environment. The authors discuss the importance of determining learners' perceived educational needs and preferences, as well as evaluating resources and barriers in the teaching environment. The paper suggests using the needs analysis to "develop goals and objectives that address the gap between current and ideal performance, taking into account available resources and learner perceptions" (Cook and Dupras, 2004: 699).
Step 2. Determine your technical resources and needs
Cook and Dupras (2004) suggest that effective course design requires an understanding of both the subject and the instructional medium, and as such a multidisciplinary approach is desirable, with at least one team member having an in-depth understanding of Internet operations. They also point out the importance of determining the technical resources and needs of learners.
Step 3. Evaluate commercial software and use it if it fully meets your needs
This step involves a critical evaluation of software choice, purchase or development. Cook and Dupras (2004) suggest that, most importantly, we need to evaluate how well the software promotes active learning and provides for evaluation.
Step 4. Secure commitment from all participants and identify and address potential barriers to implementation
The authors state that whilst "technical needs are important in an online course… the human element is critical. Secure acceptance and commitment from all involved—administrators and faculty in addition to learners. In our experience, potential barriers include resistance to online learning, inadequate computer skills, insufficient time, or perception that the curriculum is a low priority. Identifying barriers early allows you to address them in a timely manner—before implementation begins" (Cook and Dupras, 2004: 700).
In terms of paradigm - once again I guess the rational scientific approach is evident, in the sense that individual elements are looked at in isolation from the whole. The authors' emphasis on determining what learners see as their needs provides an element of the Constructivist Hermeneutic Interpretivist Qualitative Paradigm - where the evaluator is expected to immerse themselves in the world of the learner.
Overall - I found this article refreshing in its simplicity and its easy-to-follow practical suggestions. It provides a good overview of one approach to development that includes evaluation in at least two stages of the process.
Wednesday, March 26, 2008
Week 4 - Models
Ummm.... Here's a summary of the different paradigms and how I see them relating to my practice and to evaluation methods in general.
1. The Analytic Empirical Positivist Quantitative Paradigm (could they come up with a longer title???) reflects the traditional rational scientific type approach, where we use experiment and experience to explain things. This is a fairly objective (rather than subjective) view of the world! Evaluation methods that fall under this paradigm would be fairly rational - scientific and classically experimental in their approach. A lot of science-type education has traditionally been consistent with this paradigm. In my practice I would employ evaluation methods consistent with this paradigm if I wanted to test a specific element of a programme using an experimental-type approach. Whilst it can be handy, from a big-picture, how-you-view-the-world perspective this seems a fairly dry and unimaginative way to view your world! Not a very empowering view either, in the sense that it seems we are at the mercy of external stimuli and environments.
2. The Constructivist Hermeneutic Interpretivist Qualitative Paradigm (another succinct title!) is a slightly more empowering view of the world: we humans actually construct our own reality! More subjective than (1). Interestingly, the researcher is more likely to be immersed in the subject's world rather than detached from the subject as in (1). Evaluating from this perspective I would immerse myself in the world of the subject and try to understand how they had learnt and how the course had affected their construction and interpretation of reality! In my practice I think I use this informally a lot of the time, when I try to get inside the head and the world of a client or institution or target group of students.
3. The Critical Theory Neomarxist Postmodern Praxis Paradigm (ummmm) seems to me to add a political / social justice type bent to (2) - being concerned with who will benefit! And doing lots of deconstruction to determine who will benefit. This paradigm does make you consider underlying and sometimes hidden assumptions that might exist. It could become a bit depressing and 'conspiracy theory'-like if you viewed your world too much from this perspective, I suspect! Ummm - I have used this approach when studying and reflecting on things, but can't say I have employed it in evaluating any of my own education creations. I have used it when looking critically at education resources from a Treaty / bicultural perspective though.
4. The Eclectic Mixed Methods Pragmatic Paradigm (another great title) is concerned with practical problems and has a fairly realistic view of the world as complex! Evaluating things from this perspective can get fairly complex, as unlike (1) there will be multiple variables acting all at once - and how do you isolate them? Or do you even try? I have used this perspective to evaluate environmental education or education for sustainability programmes - I would be looking to see that they reflected thinking consistent with this paradigm. In environmental education / education for sustainability we tend to view the world as interconnected and complex. I guess creating an evaluation model from this perspective would be complicated, but at least it would be reasonably realistic!!!!
Ummmm - well that's it for this week! R :]