
The Logic Model for Program Planning and Evaluation

Paul F. McCawley, Associate Director, University of Idaho Extension

CIS 1097

What is the Logic Model?

The Logic Model process is a tool that has been used for more than 20 years by program managers and evaluators to describe the effectiveness of their programs. The model describes logical linkages among program resources, activities, outputs, audiences, and short-, intermediate-, and long-term outcomes related to a specific problem or situation. Once a program has been described in terms of the logic model, critical measures of performance can be identified [1].

Logic models are narrative or graphical depictions of processes in real life that communicate the underlying assumptions upon which an activity is expected to lead to a specific result. Logic models illustrate a sequence of cause-and-effect relationships, a systems approach to communicating the path toward a desired result [2].

A common concern of impact measurement is limited control over complex outcomes. Establishing desired long-term outcomes, such as improved financial security or reduced teenage violence, is tenuous because of the limited influence we may have over the target audience and the complex, uncontrolled environmental variables. Logic models address this issue because they describe the concepts that need to be considered when we seek such outcomes. Logic models link the problem (situation) to the intervention (our inputs and outputs) and the impact (outcome). Further, the model helps to identify partnerships critical to enhancing our performance.

The logic model was characterized initially by program evaluators as a tool for identifying performance measures. Since that time, the tool has been adapted to program planning as well. The application of the logic model as a planning tool allows precise communication about the purposes of a project, the components of a project, and the sequence of activities and accomplishments. Further, a project originally designed with assessment in mind is much more likely to yield beneficial data, should evaluation be desired.

Figure 1. The basic logic model. [Situation; Inputs (what we invest: time, money, partners, equipment, facilities); Outputs (what we do: workshops, publications, field days, equipment demonstrations; who we reach: customers, participants); Outcomes (short-term changes in knowledge, skills, attitude, motivation, and awareness; medium-term changes in behaviors, practices, policies, and procedures; long-term changes in environmental, social, economic, and political conditions); all subject to external influences, the environment, and related programs.]

In the past, our strategy to justify a particular program often has been to explain what we are doing from the perspective of an insider, beginning with why we invest allocated resources. Our traditional justification includes the following sequence:

1) We invest this time/money so that we can generate this activity/product.

2) The activity/product is needed so people will learn how to do this.

3) People need to learn that so they can apply their knowledge to this practice.

4) When that practice is applied, the effect will be to change this condition.

5) When that condition changes, we will no longer be in this situation.

The logic model process has been used successfully following the above sequence. However, according to Millar et al. [2], logic models that begin with the inputs and work through to the desired outcomes may reflect a natural tendency to limit one's thinking to existing activities, programs, and research questions. Starting with the inputs tends to foster a defense of the status quo rather than create a forum for new ideas or concepts.
To help us think "outside the box," Millar suggests that the planning sequence be inverted, thereby focusing on the outcomes to be achieved. In such a reversed process, we ask ourselves "what needs to be done?" rather than "what is being done?" Following the advice of the authors, we might begin building our logic model by asking questions in the following sequence.

1) What is the current situation that we intend to impact?

2) What will it look like when we achieve the desired situation or outcome?

3) What behaviors need to change for that outcome to be achieved?

4) What knowledge or skills do people need before the behavior will change?

5) What activities need to be performed to cause the necessary learning?

6) What resources will be required to achieve the desired outcome?

One more point before we begin planning a program using the logic model: it is recognized that we are using a linear model to simulate a multi-dimensional process. Often, learning is sequential and teaching must reflect that, but the model becomes too complicated if we try to communicate that reality (figure 2). Similarly, the output from one effort becomes the input for the next effort, as building a coalition may be required before the "group" can sponsor a needed workshop. Keep in mind that the logic model is a simple communication device. We should avoid complications by choosing a single category in which to enter each item (i.e., inputs, outputs, or outcomes). Details of order and timing then need to be addressed within the framework of the model, just as with other action planning processes.
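For teams that keep program plans in a structured, machine-readable form, the sketch below shows one way the logic model categories and the outcome-first question sequence could be captured as a simple data structure. This is an illustration only, not part of the original bulletin; the class, field names, and example entries are assumptions made for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One category per item: situation, inputs, outputs, and outcomes."""
    situation: str = ""
    inputs: list = field(default_factory=list)                 # what we invest
    outputs_activities: list = field(default_factory=list)     # what we do
    outputs_participation: list = field(default_factory=list)  # who we reach
    outcomes_short: list = field(default_factory=list)         # knowledge, skills, attitudes
    outcomes_medium: list = field(default_factory=list)        # behaviors, practices, policies
    outcomes_long: list = field(default_factory=list)          # changed conditions
    external_influences: list = field(default_factory=list)

# Build the model in the reversed, outcome-first order suggested by Millar et al.:
# questions 1-6 above, starting from the situation and the desired change and
# working back to the resources required. All entries are hypothetical.
model = LogicModel(situation="Hypothetical: low product yield among small producers")   # Q1
model.outcomes_long.append("Producers' yields and incomes improve")                     # Q2
model.outcomes_medium.append("Producers adopt improved production practices")           # Q3
model.outcomes_short.append("Producers know and can apply the improved practices")      # Q4
model.outputs_activities.append("Hands-on workshop with follow-up session")             # Q5
model.inputs.extend(["Staff time", "Curriculum", "Partner facilities", "Grant funds"])   # Q6
```

Each item is entered under exactly one category, which keeps the structure as simple as the communication device the model is meant to be.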

Planning Elements

Using the logic model as a planning tool is most valuable when we focus on what it is that we want to communicate to others. Figure 3 illustrates the building blocks of accountability that we can incorporate into our program plans (adapted from Ladewig, 1998). According to Howard Ladewig, there are certain characteristics of programs that inspire others to value and support what we do. By describing the characteristics of our programs that communicate relevance, quality, and impact, we foster buy-in from our stakeholders and audience.

Figure 2. Over-complicated, multi-dimensional planning model. [The figure traces a single example program (a research base, four weeks of staff time, editing and printing funds, a 42-page curriculum, classroom space, and teaching partners supporting a 3-day workshop for 20 participants and a 1-day follow-up workshop for 12) through many branching outputs and outcomes, ending with 60% of participants increasing product yield by 15%.]

Figure 3. Structure of accountability. [Relevance, quality, and impact are the building blocks that produce stakeholder buy-in.]

By including these characteristics within the various elements of the logic model, we communicate to others why our programs are important to them. The elements of accountability are further described in the context of the logic model, below.

Inputs

Inputs include those things that we invest in a program or that we bring to bear on a program, such as knowledge, skills, or expertise. Describing the inputs needed for a program provides an opportunity to communicate the quality of the program. Inputs that communicate to others that the program is of high quality include:

• human resources, such as time invested by faculty, staff, volunteers, partners, and local people;
• fiscal resources, including appropriated funds, special grants, donations, and user fees;
• other inputs required to support the program, such as facilities and equipment;
• the knowledge base for the program, including teaching materials, curriculum, research results, and certification or learning standards;
• involvement of collaborators: local, state, and national agencies and organizations involved in planning, delivery, and evaluation.

Projects involving credible partners, built on knowledge gained from research, and delivered via tested and proven curricula are readily communicated as quality programs. Assessing the effectiveness of a program also is made easier when planned inputs are adequately described. By comparing actual investments with planned investments, evaluation can be used to improve future programs, justify budgets, and establish priorities.

Situation

The situation statement provides an opportunity to communicate the relevance of the project. Characteristics that illustrate the relevance to others include:

• a statement of the problem (What are the causes? What are the social, economic, and/or environmental symptoms of the problem? What are the likely consequences if nothing is done to resolve the problem? What are the actual or projected costs?);
• a description of who is affected by the problem (Where do they live, work, and shop? How are they important to the community? Who depends on them: families, employees, organizations?);
• who else is interested in the problem (Who are the stakeholders? What other projects address this problem?).

The situation statement establishes a baseline for comparison at the close of a program. A description of the problem and its symptoms provides a way to determine whether change has occurred. Describing who is affected by the problem allows assessment of who has benefited. Identifying other stakeholders and programs builds a platform to measure our overall contribution, including increased awareness and activity, or reduced concern and cost.

Outputs

Outputs are those things that we do (providing products, goods, and services to program customers) and the people we reach (informed consumers, knowledgeable decision makers).

Describing our outputs allows us to establish linkages between the problem (situation) and the impact of the program (intended outcomes). Outputs that help link what we do with program impact include:

• publications such as articles, bulletins, fact sheets, CISs, handbooks, and web pages;
• decision aids such as software, worksheets, and models;
• teaching events such as workshops, field days, tours, and short courses;
• discovery and application activities, such as research plots, demonstration plots, and product trials.

The people we reach also are outputs of the program and need to be the center of our model. They constitute a bridge between the problem and the impact. Information about the people who participated and what they were taught can include:

• their characteristics or behaviors;
• the proportion or number of people in the target group that were reached;
• learner objectives for program participants;
• number of sessions or activities attended by participants;
• level of satisfaction participants express for the program.

Outcomes

Program outcomes can be short-term, intermediate-term, or long-term. Outcomes answer the question "What happened as a result of the program?" and are useful to communicate the impacts of our investment.

Short-term outcomes of educational programs may include changes in:

• awareness: customers recognize the problem or issue;
• knowledge: customers understand the causes and potential solutions;
• skills: customers possess the skills needed to resolve the situation;
• motivation: customers have the desire to effect change;
• attitude: customers believe their actions can make a difference.

Intermediate-term outcomes include changes that follow the short-term outcomes, such as changes in:

• practices used by participants;
• behaviors exhibited by people or organizations;
• policies adopted by businesses, governments, or organizations;
• technologies employed by end users;
• management strategies implemented by individuals or groups.

Long-term outcomes follow intermediate-term outcomes when changed behaviors result in changed conditions, such as:

• improved economic conditions: increased income or financial stability;
• improved social conditions: reduced violence or improved cooperation;
• improved environmental conditions: improved air quality or reduced runoff;
• improved political conditions: improved participation or opportunity.

External Influences

Institutional, community, and public policies may have either supporting or antagonistic effects on many of our programs. At the institutional level, schools may influence healthy eating habits in ways that are beyond our control but that may lead to social change [5]. Classes in health education may introduce children to the food pyramid and to the concept of proportional intake, while the cafeteria may serve pizza on Wednesdays and steak fingers on Thursdays. The community also can influence eating habits through the availability of fast-food restaurants or produce markets. Even public policies that provide support (food bank, food stamps) to acquire some items but not others might impact healthy eating habits.

Documenting the social, physical, political, and institutional environments that can influence outcomes helps to improve the program planning process by answering the following:

• Who are important partners/collaborators for the program?
• Which part(s) of the issue can this project realistically influence?
• What evaluation measures will accurately reflect project outcomes?
• What other needs must be met in order to address this issue?

Evaluation Planning

Development of an evaluation plan to assess the program can be superimposed, using the logic model format. The evaluation plan should include alternatives to assess the processes used in planning the program. Process indicators should be designed to provide a measurable response to questions such as:

• Were specific inputs made as planned, in terms of the amount of input, timing, and quality of input?
• Were specific activities conducted as planned, in terms of content, timing, location, format, and quality?
• Was the desired level of participation achieved, in terms of numbers and characteristics of participants?
• Did customers express the degree of customer satisfaction expected?

The evaluation plan also should identify indicators appropriate to the desired outcomes, including short-, medium-, and long-term outcomes. Outcome indicators also should be measurable, and should be designed to answer questions such as:

• Did participants demonstrate the desired level of knowledge increase, enhanced awareness, or motivation?
• Were improved management practices adopted, behaviors modified, or policies altered to the extent expected for the program?
• To what extent were social, economic, political, or environmental conditions affected by the program?

Conclusion

Developing appropriate and measurable indicators during the planning phase is the key to a sound evaluation. Early identification of indicators allows the program manager/team to learn what baseline data already may be available to help evaluate the project, or to design a process to collect baseline data before the program is initiated. The logic model is useful for identifying elements of the program that are most likely to yield useful evaluation data, and for identifying an appropriate sequence for collecting data and measuring progress. In most cases, however, more work on a project will be required before indicators are finalized. Outcome indicators to measure learning should be based on specific learner objectives that are described as part of the curriculum. Indicators to measure behavioral change should specify which behaviors are targeted by the program. Conditional indicators may require a significant investment of time to link medium-term outcomes to expected long-term outcomes through the application of a targeted study or relevant research base.

Figure 4. Insertion of the evaluation plan into the logic model. [The basic logic model of figure 1 with an evaluation study spanning the sequence: measurement of process indicators followed by measurement of outcome indicators.]
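As a companion illustration (again not part of the original bulletin), the sketch below shows one way to superimpose an evaluation plan on such a structure: each process or outcome indicator is attached to a logic model element with a planned target, and actual values are compared against the plan once they are measured. The Indicator class, its fields, and the example values are assumptions made for the sketch.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Indicator:
    """A measurable indicator tied to one logic model element."""
    element: str          # e.g., "output: participation" or "short-term outcome"
    question: str         # the evaluation question the indicator answers
    planned: float        # target set during planning
    actual: float | None = None  # observed value; None until measured

    def met(self) -> bool | None:
        """True if the observed value reached the target, None if not yet measured."""
        return None if self.actual is None else self.actual >= self.planned

# Hypothetical process and outcome indicators for a workshop program.
indicators = [
    Indicator("output: participation", "Was the desired level of participation achieved?", 20, 18),
    Indicator("short-term outcome", "Did participants demonstrate increased knowledge?", 0.80, 0.75),
    Indicator("long-term outcome", "Did product yield improve?", 0.15),  # needs baseline data first
]

for ind in indicators:
    print(f"{ind.element}: planned {ind.planned}, actual {ind.actual}, met: {ind.met()}")
```

Recording planned values at planning time, as the conclusion above recommends, makes it straightforward to see which indicators still need baseline data before the program begins.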
References

1. McLaughlin, J.A., and G.B. Jordan. 1999. Logic models: a tool for telling your program's performance story. Evaluation and Program Planning 22:65-72.
2. Millar, A., R.S. Simeone, and J.T. Carnevale. 2001. Logic models: a systems tool for performance management. Evaluation and Program Planning 24:73-81.
3. Adapted from Taylor-Powell, E. 1999. Providing leadership for program evaluation. University of Wisconsin Extension, Madison.
4. Ladewig, Howard. 1998-1999. Personal communication during sessions on "building a framework for accountability" with the ECOP Program Leadership Committee (Tannersville, PA, 1998) and the Association of Extension Directors/ECOP (New Orleans, LA, 2000). Dr. Ladewig was a professor at Texas A&M University at the time of communication; he now is at the University of Florida.
5. Glanz, K., and B.K. Rimer. 1995. Theory at a glance: a guide for health promotion practice. NIH pub. 95-3896. National Institutes of Health-National Cancer Institute, Bethesda, MD.

Issued in furtherance of cooperative extension work in agriculture and home economics, Acts of May 8 and June 30, 1914, in cooperation with the U.S. Department of Agriculture, A. Larry Branen, Acting Director of Cooperative Extension, University of Idaho, Moscow, Idaho 83844. The University of Idaho provides equal opportunity in education and employment on the basis of race, color, religion, national origin, age, gender, disability, or status as a Vietnam-era veteran, as required by state and federal laws.

400 10-01 © University of Idaho
