The UNDP is an international agency that has developed indicators of development. An example is a comparison between the baseline information collected in the initial situation and needs assessment and analyses done two and five years later.
To determine what the effects of the program are:
- Assess skills development by program participants
- Compare changes in behavior over time
- Decide where to allocate new resources
- Demonstrate that accountability requirements are fulfilled
- Use information from multiple evaluations to predict the likely effects of similar programs

To affect participants:
- Reinforce messages of the program
- Stimulate dialogue and raise awareness about community issues
- Broaden consensus among partners about program goals
- Teach evaluation skills to staff and other stakeholders
- Gather success stories
- Support organizational change and improvement

Questions

The evaluation needs to answer specific questions.
Drafting questions encourages stakeholders to reveal what they believe the evaluation should answer. In other words, it surfaces which questions matter most to stakeholders.
The process of developing evaluation questions further refines the focus of the evaluation.

Methods

The methods available for an evaluation are drawn from behavioral and social science research and development.
Three types of methods are commonly recognized: experimental, quasi-experimental, and observational or case study designs. Experimental designs use random assignment to compare the effect of an intervention between otherwise equivalent groups (for example, comparing a randomly assigned group of students who took part in an after-school reading program with those who didn't).
Quasi-experimental methods make comparisons between groups that are not equivalent. Observational or case study methods use comparisons within a group to describe and explain what happens.
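The logic of an experimental design can be illustrated with a small simulation. The sketch below uses entirely hypothetical data: it randomly assigns simulated students to an after-school reading program or a control group, then compares the average gain in reading scores between the two groups. It is a minimal illustration of random-assignment comparison, not an analysis of any real program.

```python
import random
import statistics

random.seed(42)

# Hypothetical pool of 200 students with simulated baseline reading scores.
students = [{"id": i, "baseline": random.gauss(60, 10)} for i in range(200)]

# Experimental design: random assignment splits the pool into two
# otherwise equivalent groups.
random.shuffle(students)
program, control = students[:100], students[100:]

# Simulate follow-up scores, assuming (for illustration only) that the
# program adds about 5 points on average.
for s in program:
    s["followup"] = s["baseline"] + random.gauss(5, 4)
for s in control:
    s["followup"] = s["baseline"] + random.gauss(0, 4)

def mean_gain(group):
    """Average change from baseline to follow-up for a group."""
    return statistics.mean(s["followup"] - s["baseline"] for s in group)

# Because assignment was random, the difference in mean gains estimates
# the program's effect.
effect = mean_gain(program) - mean_gain(control)
print(f"Estimated program effect: {effect:.1f} points")
```

Because assignment is random, any pre-existing differences between the groups average out, which is what licenses interpreting the difference in gains as the program's effect.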
No design is necessarily better than another. Evaluation methods should be selected because they provide the appropriate information to answer stakeholders' questions, not because they are familiar, easy, or popular.
The choice of methods has implications for what will count as evidence, how that evidence will be gathered, and what kind of claims can be made.
Because each method option has its own biases and limitations, evaluations that mix methods are generally more robust. Over the course of an evaluation, methods may need to be revised or modified.
Circumstances that make a particular approach useful can change. For example, the intended use of the evaluation could shift from discovering how to improve the program to helping decide whether the program should continue.
Thus, methods may need to be adapted or redesigned to keep the evaluation on track.

Agreements

Agreements summarize the evaluation procedures and clarify everyone's roles and responsibilities.
An agreement describes how the evaluation activities will be implemented. Elements of an agreement include statements about the intended purpose, users, uses, and methods, as well as a summary of the deliverables, those responsible, a timeline, and budget.
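The elements just listed can be treated as a checklist. The sketch below (the field names are illustrative, not a standard template) models an evaluation agreement as a simple data structure and flags any elements that remain unspecified before work begins.

```python
from dataclasses import dataclass, fields

@dataclass
class EvaluationAgreement:
    """Illustrative container for the elements of an evaluation agreement."""
    purpose: str = ""
    users: str = ""
    uses: str = ""
    methods: str = ""
    deliverables: str = ""
    responsible_parties: str = ""
    timeline: str = ""
    budget: str = ""

    def missing_elements(self):
        """Return the names of any elements still left blank."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

# A hypothetical draft agreement, partially filled in.
draft = EvaluationAgreement(
    purpose="Assess whether the reading program improves test scores",
    users="Program staff and the school board",
    methods="Quasi-experimental comparison with a waiting-list group",
)
print("Still to negotiate:", draft.missing_elements())
```

A checklist like this is one way to make sure every element of the agreement is explicitly negotiated rather than assumed.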
The formality of the agreement depends upon the relationships that exist between those involved.
For example, it may take the form of a legal contract, a detailed protocol, or a simple memorandum of understanding. Regardless of its formality, creating an explicit agreement provides an opportunity to verify the mutual understanding needed for a successful evaluation.
It also provides a basis for modifying procedures if that turns out to be necessary. As you can see, focusing the evaluation design may involve many activities. For instance, both supporters and skeptics of the program could be consulted to ensure that the proposed evaluation questions are politically viable.
A menu of potential evaluation uses appropriate for the program's stage of development could be circulated among stakeholders to determine which is most compelling. Interviews could be held with specific intended users to better understand their information needs and timeline for action.
Resource requirements could be reduced when users are willing to employ more timely but less precise evaluation methods.

Gather Credible Evidence

Credible evidence is the raw material of a good evaluation.
The information learned should be seen by stakeholders as believable, trustworthy, and relevant to answer their questions.
This requires thinking broadly about what counts as "evidence." For some questions, a stakeholder's standard for credibility could demand the results of a randomized experiment. For another question, a set of well-done, systematic observations, such as interactions between an outreach worker and community residents, will have high credibility.
The difference depends on what kind of information the stakeholders want and the situation in which it is gathered. In some situations, it may be necessary to consult evaluation specialists.
This may be especially true if concern for data quality is particularly high. In other circumstances, local people may offer the deepest insights. Regardless of their expertise, however, those involved in an evaluation should strive to collect information that will convey a credible, well-rounded picture of the program and its efforts.