EDID6504 - Evaluation Report

Evaluation Report for the Career Advancement Programme (CAP) Online
 
A Paper in Partial Fulfillment of the Requirements of
EDID 6504 – Programme Evaluation and Course Assessment Methods
Semester I, 2016-2017
 
TEAM EYEQUAD
Makeisha C. Bahadur - 00706847
Jabel E. Bercasio - 315104634
Kamar K. Maxwell - 03015654
Marvin Thompson - 309101649
 
 
 
University: University of the West Indies Open Campus
Course Coordinator: Dr. Camille Dickson-Deane
Project Submitted on: November 28th, 2016
 
 

CONTENTS
Executive Summary
Introduction
    Purpose of Evaluation
Background
    Objectives of CAP Online
    Intended Outcomes of CAP Online
Log Frame
Evaluation Methodology and Approach
    Evaluation Design
    Outcome Evaluation Questions
    Selection of Evaluation Team
    Methods of Data Collection
    Data Analysis
Reporting Format
Ethical Issues
References
Appendix 1 - Sample Test
Group Members' Contribution
 
 
Executive Summary
The Career Advancement Programme (CAP) Online is designed to eliminate the face-to-face delivery of CAP courses, which are pursued by sixth formers, by offering them online.
The continuation, sustainability and replicability of CAP Online will be determined by the outcome of a mixed-method evaluation. This approach combines a control group, pre- and post-testing, and randomised trials.
The evaluation sought to answer the following questions:
● Were the participants able to complete Entrepreneurship online?
● What are the differences in test scores between the control group and the participants?
● Were the participants more willing to register for other online courses after their experience?
● What are the changes to the number of sessions in the teachers' timetable after implementation?
Surveys, meetings, interviews, differences in test scores between the participants and the control group, and LCMS-generated data were the instruments used to collect data. These instruments yielded both quantitative and qualitative data that answered the evaluation questions.
The qualitative data obtained from the surveys, meetings and interviews were collated, and the main themes were identified and summarised. The quantitative data, generated from submissions, collaboration in the LCMS and various survey questions, were cleaned, coded and represented in charts.
All findings were compiled and disseminated to the stakeholders using the reporting formats described in this document.
Introduction
Purpose of Evaluation
1. Document the accomplishments of the programme.
2. Obtain feedback on the quality of the activities and how they affected the results.
3. Determine the effectiveness of CAP Online in delivering Entrepreneurship.
4. Determine and explain unintended results.
 
Background
CAP Online will offer courses such as Entrepreneurship via a learning and content management system (LCMS), coupled with a few face-to-face seminars. The LCMS will include weekly activities and tasks, a progress tracker, and a social media component for collaborating with other students across the country.
Students will be able to use the LCMS to complete assignments in the comfort of their homes on any device, such as a phone or tablet. Less fortunate students can use the school's computer lab as they see fit.
Objectives of CAP Online
1. Increase collaboration among sixth form students in the CAP Programme island-wide.
2. Deliver CAP courses such as Entrepreneurship via an online or blended medium.
3. Ensure that all students successfully complete the Ministry's CAP programme.
4. Reduce the number of sessions teachers are expected to teach.
5. Create a programme that is sustainable and can be replicated.
 

 
Intended Outcomes
1. Collaboration among students that leads to meaningful discourse and the creation of new ideas.
2. Course content is delivered effectively through the online medium.
3. The Ministry of Education will use the online method for years to come.
4. Students will appreciate and enjoy the new mode of delivery.


LOGICAL FRAMEWORK / LOG FRAME
PROJECT TITLE: CAP delivered through an online medium

GOAL
To introduce online learning into the Jamaican secondary school system.

PURPOSE
Objective: The CAP course, Entrepreneurship, is completed by sixth form students through an online medium by the end of the 2017-2018 school year.
Objectively verifiable indicators:
● By the end of the 2017/18 school year, uptake of the CAP Entrepreneurship online course within Jamaican schools will have increased by 10%.
● 90% of participants will successfully complete the weekly tasks given in the online CAP Entrepreneurship course.
● 100% of participants will collaborate with fellow participants from a different school.
Sources of verification: Ministry of Education CAP statistics; surveys; weekly task submission data pulled from the LCMS; interviews with participants.
Important assumptions: CAP Entrepreneurship through an online medium will be encouraged; participants will have access to the internet and computers within schools.

OUTPUTS
Objectives: brochures on CAP Online; engagement in and confidence in the CAP programme; well-trained in-school support staff; a fully operational LCMS.
Objectively verifiable indicators: 200 brochures produced and distributed to schools in 5 parishes by June 2017; number of visits to the LCMS as well as completion of tasks; a decrease in the number of calls to the help desk.

ACTIVITIES
● Develop and disseminate brochures about CAP Online.
● Select participants for the pilot.
● Plan and conduct sessions on the CAP Online Entrepreneurship course with the participants.
● Select liaison teachers.
● Plan and hold training sessions for the liaison teachers covering navigation of the LCMS, assessment, and monitoring of students.
Sources of verification: interviews with the school liaison teachers; project documents; submission data from the LCMS; evaluation of the IT and liaison teachers; Ministry of Education monthly CAP reports.
Important assumptions: an expert graphic artist can be hired to develop the brochures; schools along with the MOE will embrace and prioritise online learning; the most appropriate LCMS will be chosen based on research and budget.

INPUTS
Budget, training space, project evaluators, additional computers.

 

Evaluation Methodology and Approach
 
Evaluation Design
The outcome evaluation for CAP Online is based on a mixed-method approach. It makes use of the following designs: randomised controlled trial, comparison group, and pre-post comparison.
The pre-post comparison design is ideal for the CAP Online programme because it quantifies the knowledge of the group before and after participation in the programme. Hort (2010) states that this design provides an opportunity for a test or other assessment to be administered at the start of the programme and again at the end, to see whether the programme's objectives and outcomes were achieved.
The pretest-posttest design is simpler than other experimental designs, but it supports accountability, and the data collected can be used to improve the programme's activities.
Pre-post studies will allow the knowledge of CAP Online participants to be compared in an identical context over the duration of the course, using the data collected at the start to establish a baseline of the learners' knowledge. The design is ideally suited to assessing the outcomes of varied instructional interventions. These comparisons will be used to assess changes in knowledge, skills and attitudes as a direct result of the CAP programme.
According to the World Health Organization (2000), this is a common evaluative tool in training and education, and analysing the data does not require special expertise.
After implementation, any programme stakeholder wants to know whether the investment was worth it and whether the programme did what it intended. The pretest-posttest evaluation design is particularly suited to answering this question, as measuring the value added by the programme is another of this design's strengths.
Pretest data, when compared against posttest data, identifies the participants who improved the most and the least. Pre-testing also alerts instructors to topics that can be exempted from the course because participants already have a sufficient understanding and mastery of the subject matter.
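To make the pre-post comparison concrete, here is a minimal Python sketch, with hypothetical scores rather than actual CAP Online data, of how baseline and end-of-course results for the same participants could be compared:

# Minimal sketch of a pre-post comparison; all scores are hypothetical
# placeholders, not actual CAP Online data.
from scipy import stats

pre_scores = [54, 61, 48, 70, 65, 58, 62, 49]    # pretest (baseline)
post_scores = [68, 72, 60, 81, 70, 66, 75, 63]   # posttest (end of course)

# Gain per participant: positive values indicate improvement.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
print("Mean gain:", sum(gains) / len(gains))

# Paired t-test: were posttest scores significantly higher than
# pretest scores for the same learners?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")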
Randomised Comparison Group
According to the New Philanthropy Company (n.d.), the randomised group design is the most powerful approach to creating comparison groups. At its core, it simply means selecting a group for the programme and determining randomly who will receive the intervention and who will not. Randomised comparison groups eliminate the possibility of external factors confounding the results, so that variances between the groups can be attributed to the intervention.
Additionally, the New Philanthropy Company (n.d.) suggests that randomisation can be embedded in the monitoring activities of the programme by initiating the monitoring at different intervals for the groups. If any alteration is to be made to the programme, a randomly selected group can receive it while all else remains unchanged. This design can be used to draw comparisons between several variations. The World Health Organization (2000) notes that a fundamental advantage of this design is that it rules out the greatest number of alternative explanations for a programme's results.
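As an illustration of the random assignment described above, the following sketch, assuming a hypothetical pool of 40 student IDs, splits participants at random into an intervention group and a control group:

import random

# Hypothetical pool of 40 student IDs; not the actual participant list.
students = [f"S{n:03d}" for n in range(1, 41)]
random.shuffle(students)  # randomise the order of the pool

midpoint = len(students) // 2
online_group = students[:midpoint]    # receive CAP Online (intervention)
control_group = students[midpoint:]   # receive face-to-face CAP (control)

print("Online group:", online_group)
print("Control group:", control_group)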

 
Outcome Evaluation Questions
1. Were the participants able to complete Entrepreneurship online?
2. Did the participants improve their computer skills after the programme?
3. What are the differences in test scores between the control group and the participants?
4. Were the participants more willing to register for other online courses after their experience?
5. What are the changes to the number of sessions in the teachers’ timetable after implementation?
6. Do gender and location factor in course completion?
7. Did the participants engage in active collaboration via the online tools?
 
Criteria for Selection of the Evaluation Team
Members of the evaluation team:
1. Evaluation Manager
2. Evaluators

Choosing the Evaluators
On the ethical side, the members of the evaluation team must not be related to, or in close familial relationships with, any of the stakeholders. In addition:
  • They should be subject matter experts, or very well versed in the field being evaluated. For example, someone evaluating an online university should not only be part of academia but must also have experience in studying or investigating academic formats or the evolution of education.
  • They should be able to see the bigger picture, and judge the programme not just against the local environment but also against the international scheme of things.
  • They should be familiar with how similar programmes are handled or managed.
  • They must have skills in information technology, as many evaluation and data analysis tasks can be performed quickly using software.
  • They must have specific skills in evaluation, such as "design, data collection, data analysis and report preparation."

Choosing the Evaluation Manager
  • The evaluation manager must possess all the evaluators' characteristics, but must also be able to lead a team.
  • He or she must be able to identify any skill gaps within the team and provide a remedy for them.
  • He or she must have people skills, as the manager meets with the stakeholders regularly.
 
 

DATA COLLECTION MATRIX
Data Collection Matrix for: CAP Online          General Evaluation Approach: Goal-Oriented

For each evaluation question, the matrix lists the information needed, data sources, data collection methods, data collectors, data analysis method, and the time of data collection.

1. Were the participants able to complete Entrepreneurship online?
   Information needed: Weekly submission information
   Data sources: Records from the LCMS
   Data collection methods: Record analysis
   Data collectors: Ministry of Education; liaison teachers
   Data analysis method: Quantitative
   Time of data collection: End of course, April 2017

2. What are the differences in test scores between the control group and the participants?
   Information needed: Pretest and posttest scores of the control group and participants
   Data sources: Teachers' gradebooks
   Data collection methods: Collection of pretest and posttest results
   Data collectors: Evaluators; teachers
   Data analysis method: Quantitative
   Time of data collection: Pretest September 2016; posttest April 2017

3. Did the participants improve their computer skills after the programme?
   Information needed: Weekly submission information
   Data sources: Participants; records from the LCMS
   Data collection methods: Record analysis; interview (meeting)
   Data collectors: Evaluators
   Data analysis method: Quantitative and qualitative
   Time of data collection: Pretest September 2016; posttest April 2017

4. Are the participants willing to register for other online courses after their experience?
   Information needed: Registration per term
   Data sources: Registration records; survey; meeting
   Data collection methods: Record analysis; survey responses; meeting responses
   Data collectors: Ministry of Education; evaluators
   Data analysis method: Quantitative
   Time of data collection: End of term, April 2017

5. What are the changes to the number of sessions in the teachers' timetable after implementation?
   Information needed: Allotment of hours of work
   Data sources: Interviews; timetables; LCMS data
   Data collection methods: Record analysis; interview; meeting
   Data collectors: Teachers
   Data analysis method: Quantitative and qualitative
   Time of data collection: Pre-term September 2016; post-term April 2017

6. Do gender and location factor in course completion?
   Information needed: Gender and schools of participants
   Data sources: Registration records; survey
   Data collection methods: Record analysis; survey responses
   Data collectors: Ministry of Education; liaison teacher
   Data analysis method: Quantitative
   Time of data collection: End of term, April 2017

7. Did the participants engage in active collaboration via the online tools?
   Information needed: Collaborative tools usage data
   Data sources: LCMS; survey; meeting
   Data collection methods: Record analysis; survey responses
   Data collectors: Liaison teacher; Ministry of Education; evaluator
   Data analysis method: Quantitative
   Time of data collection: End of term, April 2017

 
Selection of the Sample
The participants were selected using purposive sampling. Both the participants and the control group will:
● comprise equal numbers of males and females
● attend high schools in different parishes in Jamaica
● have obtained a minimum passing grade of 2 in CSEC IT
● be between the ages of 16 and 19
● be sixth form students
However, the control group will pursue the CAP Entrepreneurship course in a face-to-face format whilst the participants are engaged in the online programme.
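As a sketch of how these selection criteria could be applied to a candidate list, the following snippet filters hypothetical records; the fields and values are illustrative, not actual registration data:

# Hypothetical candidate records:
# (student_id, gender, parish, csec_it_grade, age, form)
candidates = [
    ("S001", "F", "Kingston", 1, 17, "sixth"),
    ("S002", "M", "St. James", 2, 18, "sixth"),
    ("S003", "F", "Manchester", 3, 16, "sixth"),  # grade 3: excluded
    ("S004", "M", "Portland", 2, 20, "sixth"),    # age 20: excluded
]

eligible = [
    c for c in candidates
    if c[3] <= 2             # minimum passing grade of 2 in CSEC IT (grade 1 is highest)
    and 16 <= c[4] <= 19     # between the ages of 16 and 19
    and c[5] == "sixth"      # sixth form students only
]

# Balance the sexes by taking equal numbers of males and females.
males = [c for c in eligible if c[1] == "M"]
females = [c for c in eligible if c[1] == "F"]
size = min(len(males), len(females))
sample = males[:size] + females[:size]
print([s[0] for s in sample])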
 
DATA COLLECTION INSTRUMENTS
 
Tool 1 - Pretest & Posttest Sample
See Appendix 1- Sample entrepreneurship test paper
Tool 2 - Survey
A. Personal Information
Gender:             M          F
Age:                16        17        18        19
Main area of study: Sciences    Business    Arts    Linguistics
Name of school:     X           Y
 
B. Online weekly activities
1. Were the instructions clear?
● Never
● A few times
● Sometimes
● Most of the time
● All the time
2. The course material helped me understand the course better.
● Never
● A few times
● Sometimes
● Most of the time
● All the time
3. Were the activities interesting?
● Never
● A few times
● Sometimes
● Most of the time
● All the time
4. Were you able to complete the activities in the allotted time?
● Never
● A few times
● Sometimes
● Most of the time
● All the time
5. Was the workload manageable?
● Never
● A few times
● Sometimes
● Most of the time
● All the time
6. How many hours [weekly] do you spend on the activities?
a. Less than 5 hours        b. 5-10 hours        c. More than 10 hours
 
 
7. Which weekly activities did you complete? Place a tick.
Week 1 ___    Week 2 ___    Week 3 ___    Week 4 ___
Week 5 ___    Week 6 ___    Week 7 ___    Week 8 ___
 
8. State the reasons you were unable to complete the weekly activities:
● N/A - all completed
● Too difficult
● Didn't understand
● Workload unmanageable
● No internet access
● Other ____________________________________
 
C. LCMS (Learning and Content Management System)
1. What do you think is the biggest advantage of CAP Online?
● Convenience
● Collaboration with peers from other schools
● Meeting individual learning needs
● Other _________________________
2. What do you think is the biggest disadvantage of CAP Online?
● Feedback not instant
● Technological problems
● Poor time management skills lead to procrastination
● Other ____________________
3. Would you be inclined to pursue other courses online?
● Never
● Maybe
● Definitely
4. How many times did you log into the LCMS weekly?
● Never
● 1-5 times
● 6-10 times
● 10-20 times
● More than 20 times
5. Rank the LCMS features in order of importance:
● Email
● Discussion boards
● Live chat
● Calendar
● Blog
6. Describe your use of the online collaborative tools:
(All the time / Most of the time / Some of the time / A few times / Never)
● Email
● Live Chat
● Blog
● Discussion Board
7. How difficult was it to navigate the LCMS?
● Very easy
● Easy
● Manageable
● Difficult
● Very difficult
Some of the LCMS survey questions were adapted from
http://oie.gsu.edu/files/2014/04/ISAT-Supporting-Document-10-Learning-Management-System-Student-Survey.pdf

 
Tool 3 - LCMS Submission Data Log
The LCMS will generate reminders that are sent to students with outstanding work. Ministry officials will retrieve the data monthly. Two logs are kept; a sketch of deriving the second from the first follows below.

Log 1: Student ID | Number of logins | Minimum expected forum posts | Number of forum posts | Number of assignments | Number of assignments submitted

Log 2: Student ID | % of posts completed | % of assignments completed
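As a sketch of how the percentages in Log 2 could be derived from Log 1, the following snippet uses hypothetical rows, not the actual LCMS export format:

# Hypothetical Log 1 rows:
# (student_id, logins, min_expected_posts, posts, assignments, submitted)
log1 = [
    ("S001", 24, 8, 6, 10, 9),
    ("S002", 11, 8, 8, 10, 10),
    ("S003", 3, 8, 2, 10, 4),
]

# Derive Log 2: percentage of expected posts and of assignments completed.
for student_id, logins, min_posts, posts, assignments, submitted in log1:
    pct_posts = 100 * min(posts, min_posts) / min_posts  # capped at 100%
    pct_assignments = 100 * submitted / assignments
    print(f"{student_id}: {pct_posts:.0f}% posts, {pct_assignments:.0f}% assignments")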

 
 
 

 
 
 

 
Tool 4 - Meeting
The evaluators conducted a semi-structured discussion. Sample questions: How do you feel about CAP Online? Would you be willing to complete another course online? What were some of the challenges? What were some of the advantages? How do you think doing a course online impacted your computer skills? Did you use any of the online collaborative tools? If so, which?
The evaluators asked probing questions to engage the participants in a discussion.
Tool 5 - Teacher Interview
The liaison teachers had an informal meeting with the evaluators, who asked the following questions: How many sessions of Entrepreneurship were you teaching last year? How many hours [weekly] do you spend retrieving data from the LCMS? How many hours [weekly] do you spend marking papers, teaching, planning lessons and sourcing resources? How do you feel about CAP Online?
Each teacher underwent two interviews with different evaluators, but the content of the questions remained the same.
 
DATA ANALYSIS
Qualitative Analysis
For this outcome evaluation, both qualitative and quantitative data were analysed in various forms. The qualitative data were used to investigate knowledge and understanding of the effectiveness of the CAP programme in achieving its objectives, and to understand students' experiences, relationships, social interactions and contextual factors. These data provided a background on the impact of the programme on the participating individuals.
The qualitative methods used to collect data were:
-       Interviews
-       Meetings
-       Analysis of records
Triangulation will be used with these three sources of data to verify and substantiate assessments by cross-checking results. The combination of findings from the three sources provides evidence of a pattern.
Inductive analysis was used: as data are collected and analysed, themes, issues and trends become evident (Patton, 2002). The process of gathering and analysing qualitative data seeks to reduce the volume of data and the large number of data sources. As this type of data is gathered, descriptive information will be used to provide an explanation or interpretation in response to the evaluation questions. The data were then manually analysed and coded into categories.
Morra Imas & Rist (2009) outline a method of analysing qualitative data (p. 384); a small tally sketch follows the list:
- Set up a table with three columns: topics, quotations and findings
- Read all transcripts and notes at one time
- Organise your notes by using different coloured markers or highlighters
- Carefully reread all data from the first evaluation
- Enter all ideas and concepts related to expectations for the evaluation question
- Tally the times those ideas and concepts (including feelings and opinions) are mentioned across all the prepared notes
- Find the quotations best associated with each topic and place them in the appropriately named column
- Reflect on what you have so far and write an initial summary of existing points
- Write your initial conclusions under the Findings column
- Categorise your findings into simple types
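As an illustration of the tallying step above, here is a minimal Python sketch; the candidate topics, keywords and note excerpts are hypothetical, not taken from the actual transcripts:

from collections import Counter

# Hypothetical candidate topics and the keywords that signal them.
topics = {
    "convenience": ["convenient", "own pace", "at home"],
    "collaboration": ["discussion", "peers", "group"],
    "technical problems": ["internet", "crash", "slow"],
}

# Hypothetical excerpts from prepared meeting and interview notes.
notes = [
    "I liked working at home at my own pace",
    "the internet was slow during the live chat",
    "discussion boards helped me share ideas with peers",
]

# Tally how many notes mention each topic.
tally = Counter()
for note in notes:
    for topic, keywords in topics.items():
        if any(keyword in note.lower() for keyword in keywords):
            tally[topic] += 1

for topic, count in tally.most_common():
    print(f"{topic}: mentioned in {count} note(s)")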
Analysis of Quantitative Data
Quantitative data focus on responses to specific, well-constructed questions that examine "cause and effect" or "correlation" relationships between two events. To test the link between the CAP programme and students' performance in Entrepreneurship, the quantitative approach seeks to control the different variables that may influence the relationship between events and to recruit respondents randomly.
In this case, a quasi-experimental design was used with a control group and pre- and post-testing. The use of surveys and questionnaires provided the evaluators with numerical data that can be statistically examined to yield results that give information about the larger population.
According to Lind, Marchal and Mason (2001), "statistics is the science of collecting, organising, presenting, analysing, and interpreting data to assist in making more effective decisions". Newcomer and Conger (2010) note that the level of measurement is a key factor in choosing among statistical methods. Statistical operations were used to process the quantitative data collected from surveys, questionnaires, interviews, group meetings, LCMS data and Ministry records.
Quantitative data were analysed using three levels of measurement:
1. Nominal data identify a category, such as when numbers are coded to gender, e.g. 1 = Male and 2 = Female.
2. Ordinal data can be placed on a ranked scale, e.g. frequency of logins, posts and submitted assignments.
3. Interval data indicate a numerical arrangement where the distance between each value is the same, e.g. Very difficult to Very easy, or Never to Always.
Descriptive statistics were also used to determine:
• measures of central tendency: a way of describing a group of data by indicating its central point
• measures of dispersion: a way of describing a group of data by indicating how spread out the data are
Statistical Analysis of Quantitative Data
Measures of central tendency were assessed to determine variance from the mean scores between the control group and the students completing the CAP Online subject. The type of data collected determined, in part, the type of statistical analysis used. Because the scores of the control group and the participating group are expected to be fairly close, it is best to use the mean for the interval/ratio data. Login frequency, by contrast, is expected to vary, with more frequent logins by the online students in the programme.
Another measure used for the interval data was the variance of scores from the mean, expressed as a standard deviation. Standard deviation takes into consideration the spread of scores on either side of the mean: the more the scores vary from the mean, the bigger the standard deviation. In a normal distribution, most of the data fall in the middle, with fewer data at the extreme ends.
Not all data have normal distributions. Some distributions have flatter curves; others have steeper curves, or curves that rise at one end or the other. In every case, the standard deviation measures how closely the data cluster around the mean.
When inspecting the differences between the control group's scores and the participants' scores, the evaluators check the standard deviation for both groups. A t-test then determines whether one group has produced statistically higher scores than the other. This is a good means of comparing two groups, since the evaluators are zeroing in on the mean scores of each group (the control group and the participants' group).
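As a minimal sketch of this comparison, the following snippet, using hypothetical posttest scores rather than actual results, reports each group's mean and standard deviation and then runs an independent-samples t-test:

from statistics import mean, stdev
from scipy import stats

# Hypothetical posttest scores, not actual CAP Online results.
control_scores = [62, 58, 71, 66, 60, 64, 69, 57]       # face-to-face group
participant_scores = [70, 65, 74, 68, 72, 66, 77, 63]   # CAP Online group

for name, scores in [("Control", control_scores), ("Online", participant_scores)]:
    print(f"{name}: mean = {mean(scores):.1f}, sd = {stdev(scores):.1f}")

# Independent-samples t-test: is the difference between the group means
# statistically significant?
t_stat, p_value = stats.ttest_ind(participant_scores, control_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")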
Linking Qualitative and Quantitative Data
For this study, data analysis methods are limited to those identified here, as these are deemed appropriate for the study's small sample size. When the data are collected and the results are presented, the findings of the qualitative and quantitative analyses are linked to show the meaning of the relationships that are found. Quantitative data are analysed using descriptive and inferential statistics, summarising the data and describing the central value, the mean. The inferential statistics will enable the evaluators to make estimates about the population, while the standard deviation will provide information about the dispersion of responses. Finally, the statistical analysis, using the t-test, will detail the comparison between the scores of the two groups and what it means for the study.
LaPolt (1997) suggested that the methodology used in an outcome evaluation should have the statistical power to justify that the programme was the sole cause of the outcomes. Factors to consider include sample size, random selection and assignment to groups, and fair selection of respondents; these are critical to establishing statistical power. Evaluators should also seek out other possible contributing factors in order to rule them out. Both qualitative and quantitative methods are used for the analysis of the data in this case, and since the surveys administered are short and relatively easy to answer, a simpler approach to analysis is adopted.
 

 
REPORT FORMAT

The evaluators are ethically responsible for reporting the evaluation results to all the stakeholders: the Ministry of Education, teachers, principals, students and the Jamaican public. The reporting format for each stakeholder will vary.

1. The Ministry of Education
The Ministry of Education will receive an official document as well as an oral presentation.
● The Official Document
This document will be emailed and sent via personal courier to the officials at the Ministry of Education prior to the oral presentation. It will have the following headings:
Title of the Report - Evaluation of CAP Online
Introduction - the purpose, objectives and scope of the evaluation
Background - an overview of the programme and its challenges
Evaluation Methodology and Approach - the outcome evaluation questions, methods of data collection and analysis, and selection of the sample
Findings - major findings listed first, then minor details
Recommendations
Conclusion
Appendix - data collection instruments and samples of data collected

● Oral Presentation
The oral presentation, aided by a PowerPoint, will last approximately 20 minutes. The slides will focus on an overview of the programme, the data collection methods, major findings and recommendations.

2. The Jamaican Public
The official document will be available for download on the Ministry of Education's website.

3. The Teachers & School Principals
An email will be sent to the liaison teachers and their respective principals with the major findings and recommendations. The teachers will be directed to the official document on the Ministry of Education's website.

4. The Students
The students will be given a less formal document, such as a brochure or newsletter, highlighting students' work in the programme, a programme overview, major findings and recommendations.
 
                 
 
 
 

 
ETHICAL ISSUES

Control Groups
There are two major ethical concerns regarding the use of control or comparison groups. The first is whether creating the group deprives participants of the benefits of the CAP Online programme; the second is whether the participants' consent should be obtained, especially since some of the students would be 16 years old.

Consent (Informed Consent)
Informed consent is required where persons are participants in a study or programme. Wilder Research (2009) lists two forms of consent, namely passive and active. Passive consent presupposes that children can participate in the programme/evaluation barring dissent by the parent or guardian. Active consent, on the other hand, requires a parent's explicit agreement prior to any interaction with the minor. The CAP programme will employ active consent as a matter of principle.

Comparison Groups
New Philanthropy Company (n.d.) suggests that participants must be told about the purpose and design of the evaluation. Participants should be told what is involved, and care should be taken to avoid using deceptive words to entice participation.
New Philanthropy Company (n.d.) also proposes a waiting-list design as a way to overcome the ethical concerns around forming the initial group by selecting some persons and refusing others. Although not everyone will benefit from selection and participation in the intervention at the same time, persons will be selected randomly over a period. This is possible when the intervention is delivered in phases.
 
 
 
 
 

 
References
Anney, N. V., & Mosha, A. M. (2015). Journal of Education Practice, 6(13). Retrieved from http://files.eric.ed.gov/fulltext/EJ1080502.pdf
Boston University School of Public Health. (n.d.). Retrieved from http://sphweb.bumc.bu.edu/otlt/MPH-Modules/EP/EP713_Bias/EP713_Bias_print.html
LaPolt, E. K. (1997). Ethical dilemmas in program evaluation and research design. Retrieved from http://www.socialreasearchmethods.net/tutorial/Lapolt/lizhtm.htm
Marsh, J. (1978). The goal-oriented approach to evaluation: Critique and case study from drug abuse treatment. Journal of Education and Planning, 1, 41-49.
Morra Imas, L. G., & Rist, R. C. (2009). The road to results: Designing and conducting effective development evaluations. Washington, D.C.: The World Bank.
Munoz, L. (2012). Bias in programme evaluation. Retrieved from https://lynnmunoz.wordpress.com/2012/02/26/bias-in-program-evaluation
New Philanthropy Company. (n.d.). Using comparison groups to understand impact. Retrieved from http://www.clinks.org/sites/default/files/UsingControlGroupApproachesToIdentifyImpact.pdf
Wilder Research. (2009). Obtaining informed consent. Retrieved from http://www.ruthmottfoundation.org/wp-content/uploads/2015/08/informedconsent-8-09.pdf
World Health Organization. (2000). Evaluation of psychoactive substance use disorder treatment. Retrieved from http://apps.who.int/iris/bitstream/10665/66584/8/WHO_MSD_MSB_00.2h.pdf
 
 
 
 

APPENDIX 1 - PRE- & POSTTEST
[Sample Entrepreneurship test paper, reproduced as images in the original submission.]
Retrieved from http://ministry-education.govmu.org/English/educationsector/seceducation/Documents/Specimen%20Paper%20Entrepreneurship%20Education%202016.pdf
 
TABLE OF TEAM MEMBERS' CONTRIBUTIONS TO THE PROJECT

Member | Contributions
Bahadur, Makeisha | Analysis of data, research, proofreading and revisions
Bercasio, Jabel Erica | Evaluation questions, evaluation team selection, formatting, proofreading, re-writing and revisions
Maxwell, Kamar | Project direction, research, executive summary, background objectives, data collection, report format, proofreading and structure
Thompson, Marvin | Evaluation design, ethical issues
 