Module D: Data Sources
How can you measure that?
Consider a cluster of questions when deciding how to measure an indicator.
Data sources
Data sources are tools, documents, and locations for information that can be designed to show what happened in your target audience. Common types include:
- Anecdotes
- Surveys or feedback forms
- Behavior observation or assessment
- Participant projects
- Status observation or assessment
Coach
You’ll learn a lot by doing your Logical Planning Model. Remember that you have not only the instructional modules but also the examples (available in the Cases section) to give you ideas and show you how to apply them. When you submit your Logical Planning Model to the Wiki discussion page, you will get feedback and suggestions. Remember, too, that your evaluation should be logical and honest, but it will not be held to the standards of publishable research. For in-depth guidance on different forms of evaluation, see the references for Module D by clicking the Resources tab at the top of the screen.
For each outcome type, can you rate the usefulness of each type of data source? (For each type of data source, choose good, fair, or weak.)
Skills/Knowledge/Behavior (short-term to medium-term):
Data: What makes sense to use?
Your common sense and knowledge of the situation will suggest what data to use. The kinds of data sources explained above are the most common types. Remember that Outcomes Based Planning and Evaluation programs do not require formal research, although some programs may wish to get a consultant involved in evaluation. (References to further information can be found by clicking the Resources tab at the top of the screen.) No kind of data source is better than another; the data source chosen should depend on what is being evaluated.

For each of the following, try choosing a data source to measure the indicator:
- To indicate that children are developing a habit of reading: “# and % of Springfield students in the summer library reading program who spend at least an hour per day on independent reading for fun.”
- To indicate that West Dakota residents use public library databases as a preferred source of information: “# and % of WD residents who say that they are likely or very likely to use the public library databases as one of their first 3 sources of health information.”
- To indicate that Student Bird Watchers learn bird-identification skills: “# and % of Student Bird Watchers who correctly identify five birds common to the area on a field trip.”
- To indicate that 4th–8th grade teacher-participants demonstrate the ability to teach biodiversity with inquiry-based methods: “# and % of teachers who implement a completed science curriculum unit in the classroom.”
Dig Deeper
Matrix of common needs analysis and evaluation data sources
Questionnaires and Surveys
- Purpose: To collect standardized data from a large number of participants.
- Requirements: Construction of survey.
- Advantages: Paper, scannable forms, CBT embedded surveys, e-mail, intranet, or Internet can be used.
- Disadvantages: If not proctored, participants cannot ask for clarification or instructions.

Knowledge Assessments
- Purpose: To assess participants' knowledge acquired through training or in the workplace or other environment.
- Requirements: Construction of assessment test.
- Advantages: Paper, scannable forms, CBT embedded tests, e-mail, intranet, or Internet can be used.
- Disadvantages: If a pretest is used, participants may score higher on the posttest due to familiarity.

Performance Assessments
- Purpose: To assess participants' application of skills acquired through training or in the workplace or other environment.
- Requirements: Construction of assessment (checklist, rating form).
- Advantages: Paper, scannable forms, and interactive multimedia embedded tests can be used.
- Disadvantages: Potential rater bias.

Structured Observation
- Purpose: To watch an activity and record what is seen.
- Requirements: Construction of checklist or rating form.
- Advantages: Objective of interest (for example, learner, designer, instructor, work sample) can be observed by a senior trainer, subject matter expert, designer/developer, evaluation specialist, supervisor, or manager.
- Disadvantages: Potential observer bias.

Focus Groups
- Purpose: To explore a topic in depth with a small number of participants.
- Requirements: Development of session questions.
- Advantages: Depth of inquiry possible.
- Disadvantages: Potential group bias.

Telephone Interviews
- Purpose: To collect standardized reporting data over the telephone.
- Requirements: Creation of interview transcript and recording form.
- Advantages: Probing of incomplete answers is possible.
- Disadvantages: Potential interviewer bias.

Based on the National Leadership Grant tutorial by the Institute of Museum and Library Services, found at http://www.imls.gov/Project_Planning/index.asp
Choosing data sources and the people they’re applied to
If your indicators are specific, they will usually suggest the data source to measure them, as well as the people in the target audience they should be applied to. What people or program data will the indicator be applied to? Consider the following issues. Examine the visuals below to explore issues related to each program. (For background on the programs, click Cases.)
- Some or all of the target audience? (West Dakota Library Rx)
- Some or all of the participants? (Springfield Library Summer Reading Program)
- People other than the participants? (Springfield Library Summer Reading Program)

Remember: Collecting data costs time and money. Collect only enough information to figure out if your program is successful, so be specific and concrete. Consider the difference in costs between collecting information about “children” and “children who need after-school tutoring.” Or do you mean “children who participate in at least five tutoring sessions”? If you expect 100 children to meet this criterion, should you say instead “a random sample of children who participate in at least five tutoring sessions”?

Library example: Riverton Memoirs program indicators
For the following outcomes and indicators for the Memoirs program, try choosing data sources and how they should be applied.

Outcome 1: Participants show improvement in their writing.
Indicator: # and % of participants who revise five pieces, commenting on what they tried to improve in each revision.
Applied To: All workshop participants
Data Source: Writers’ portfolios of participants’ work

Indicator: # and % of participants whose revised pieces (two before-and-after versions) are judged better than the originals in a blind (no writer or dates given) overall grading by a creative writing specialist.
Applied To: All workshop participants
Data Source: Expert evaluation of participants’ work

Indicator: # and % of participants whose revised pieces are judged better than the originals (for two sets) by a creative writing specialist, judged against the writer’s goals in the revision.
Applied To: All workshop participants
Data Source: Expert evaluation of participants’ work

Outcome 2: Participants demonstrate that they feel themselves to be part of a community of writers.
Indicator: # and % of participants who can name three ways they feel a part of the community of writers.
Applied To: Program participants
Data Source: Exit survey

Indicator: # and % of participants who act as part of a community of writers after the program (produce writing, continue the library group or join another, attend readings, read memoirs, read regularly about authors’ concerns).
Applied To: Program participants
Data Source: Phone interview with checklist of behaviors
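The “# and %” indicators above are ultimately just counts and percentages, and the “random sample” idea is ordinary sampling without replacement. As a minimal sketch only (the names, numbers, and threshold are hypothetical, not part of the course), here is how those two calculations might look in Python:

```python
import random

# Hypothetical workshop records: participant name -> number of revised pieces.
revisions = {
    "Alice": 5, "Ben": 3, "Carla": 6, "Dev": 5,
    "Elena": 2, "Frank": 5, "Grace": 4, "Hui": 7,
}

# Indicator criterion (assumed): at least five revised pieces.
THRESHOLD = 5
meeting = [name for name, n in revisions.items() if n >= THRESHOLD]
count = len(meeting)
percent = 100 * count / len(revisions)
print(f"{count} of {len(revisions)} participants ({percent:.1f}%) met the indicator")

# If the eligible group were large, you might evaluate a random sample
# instead of everyone, as the text suggests.
random.seed(1)  # fixed seed so the illustration is reproducible
eligible = [f"child_{i:03d}" for i in range(1, 101)]  # 100 hypothetical children
sample = random.sample(eligible, k=25)  # survey only 25 of the 100
```

The point of the sketch is simply that once the indicator is stated concretely (“at least five tutoring sessions,” “at least five revised pieces”), both the count and the percentage follow mechanically from the data you collect.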