

Text-Only Version: Shaping Outcomes: Module E: report

[Start Screen E-1 of 19/Module E>Introduction (1)]

Introduction

You should now know how to plan a program (Module B: Plan), build its services and activities (Module C: Build), and plan measurable outcomes to show success (Module D: Evaluate).

In this module, you will learn helpful aspects of reporting on your program by answering the following questions:

[End Screen E-1 of 19]

 

[Start Screen E-2 of 19/Logic model (2)]

Planning with the end in mind

Logic models not only help you plan programs for desired outcomes, they also help you write reports. By organizing your inputs, activities, and services to achieve outcomes, you’ve got a ready-made structure for communicating your program to others.

With OBPE, you’re preparing to write the final report from the very beginning. Let’s see how the textual Logic Model you’ve been filling in for your program will help.

What kinds of records will you need to keep?

Inputs

Activities & Services

Outcomes

[End Screen E-2 of 19]

 

[Start Screen E-3 of 19/Module E>Gathering data (3)]

Gathering data for reporting

Where do you get the data for reporting Inputs and Services/Activities?

You may already have procedures in place that will show how you used Money, Staff, and Time.

[Graphic of man holding money]
Money: Have you created separate accounts for tracking program expenses? A simple spreadsheet may be all you need to record program-specific spending. (A minimal sketch of such a log follows this list.)

[Graphic of man smiling]
Staff: How do you normally track the work of staff and volunteers? Can you separate out the work for this program?

[Graphic of woman with clipboard]
Time: Have you created a timeline for the program? Can it be turned into a checklist for work accomplished? Is there a reporting structure (weekly or monthly meetings? quarterly reports?) from team leaders, as a record of what’s been done?
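
For the Money question above, here is a minimal sketch of a spreadsheet-style expense log kept as a plain CSV file. The file name, categories, and entries are hypothetical illustrations, not part of Shaping Outcomes, and the same one-row-per-expense pattern works just as well in an actual spreadsheet program.

    # Minimal program-expense log: append entries, then total by category.
    # File name, categories, and amounts are hypothetical.
    import csv
    from collections import defaultdict

    LOG = "program_expenses.csv"

    def add_expense(date, category, description, amount):
        """Append one program-specific expense to the CSV log."""
        with open(LOG, "a", newline="") as f:
            csv.writer(f).writerow([date, category, description, f"{amount:.2f}"])

    def totals_by_category():
        """Sum spending per category for the Inputs section of a report."""
        totals = defaultdict(float)
        with open(LOG, newline="") as f:
            for date, category, description, amount in csv.reader(f):
                totals[category] += float(amount)
        return dict(totals)

    add_expense("2006-05-08", "materials", "art supplies for institute", 125.00)
    add_expense("2006-05-10", "honoraria", "visiting author fee", 300.00)
    print(totals_by_category())  # e.g. {'materials': 125.0, 'honoraria': 300.0}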

[End Screen E-3 of 19]

 

[Start Screen E-4 of 19/Module E >Gathering data (4)]

Gathering data to report on participants

Data about participants can provide useful information about your program's target audience, but be careful to focus on characteristics of the participants that are relevant to your outcomes.

The kinds of characteristics often recorded are:

[Graphic of candy people in a row]

Age, gender, income, education, referred by or how did you hear about us?, number in household, living situation, marital status, address, health, employment.
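
One way to keep that focus is to fix the participant record up front and leave out any characteristic no outcome depends on. Below is a minimal sketch in Python; the field names are hypothetical, loosely modeled on a teacher-training program, and are not prescribed by Shaping Outcomes.

    # Record only participant characteristics tied to the program's outcomes.
    from dataclasses import dataclass

    @dataclass
    class ParticipantRecord:
        participant_id: str     # anonymized ID rather than name or address
        role: str               # member of the target audience, e.g. "teacher"
        district: str           # relevant when funding is geographically bound
        sessions_attended: int  # engagement indicator
        students_reached: int   # output: students attending program visits

    # Marital status, income, and the like stay out of the record unless an
    # outcome or a funder requirement actually depends on them.
    print(ParticipantRecord("T-017", "teacher", "Skagit County", 5, 26))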

Let’s see how to choose participant characteristics that provide data relevant to your program.

[End Screen E-4 of 19]

 

[Start Screen E-4_M of 19/Module E >Gathering data >Museum example (4-M)]

Museum example: characteristics of participants

The MoNA program will provide a 5-day, 3-credit summer institute on Northwest art history, visual art concepts, visual thinking strategies, and critical thinking; an in-depth study of 2 exhibitions; and 3 docent-guided Museum visits for the teachers’ classes with support from staff.

Which of the following characteristics of the program’s participants (members of the target audience) do you think are important to track for reporting?

[Clicking a graphic of a teacher and student with the statement "Age and gender of teachers" presents the text: No, these characteristics don’t matter to the project outcomes.]

[Clicking a graphic of a family with the statement "Income level of student’s parents" presents the text: No, there’s no focus on economic level in the program’s purpose.]

[Clicking a graphic of a student with the statement "Number of students attending MoNA in school visits" presents the text: Yes, school visits to MoNA are part of services and should be documented by number of students attending.]

[Clicking a graphic of a docent with the statement "Educational level achieved of docents" presents the text: No, the docents are project service providers, not participants (part of the target audience).]

[Clicking a graphic of a child with the statement "Educational level of students" presents the text: Yes, the children should be in elementary school, according to the purpose statement.]

[End Screen E-4_M of 19]

 

[Start Screen E-4_L of 19/Module E >Participant characteristics>Library example (4-L)]

Library example: characteristics of participants

The Riverton (Kentucky) Memoirs program will help participants improve their writing and demonstrate that they feel themselves to be part of a community of writers.

Which of the following characteristics of the program’s participants (members of the target audience) do you think are important to track for reporting?

[Clicking a graphic of a family with the statement "Marital status and number of children of participants" presents the text: No, this data is not relevant to the writing behavior and achievement of participants.]

[Clicking a graphic of a man with the statement "Age of participants" presents the text: Yes. The program was designed for adults and was scheduled so that all adults could attend, whether they were working or retired.]

[Clicking a graphic of an older couple with the statement "Residence" presents the text: Yes, since funding is provided by Kentucky but Riverton borders on Indiana, the number and percent of non-Kentuckians taking the course is relevant.]

[Clicking a graphic of a man with computer with the statement "Number of books sold by each participant" presents the text: No, publication costs are included in grant funds. Sales may be interesting but how many each participant has contributed to selling is not relevant.]

[Clicking a graphic of a woman with the statement "Number and % of participants who attend most of the meetings" presents the text: Yes, this data shows whether participants find the group interesting and helpful.]

[End Screen E-4_L of 19]

 

[Start Screen E-5 of 19/Module E>Gathering data (5)]

Gathering data to report on outcomes

Where do you get the data to report on outcomes? While you may already have procedures to document Money, Staff, and Time, data to measure outcomes comes from your evaluation plan in your Logic Model.

Examine the following information about gathering data for reporting outcomes.

[Graphic of woman writing]
Small Program
Does a memoir-writing program change the participants so they feel and act like writers? [Rollover picture of woman to present the following text: A past workshop questionnaire indicates 63% of the memoir writers attended and 22% read at community readings, representing an 83% increase in participation.]

[Graphic of children working on project]
Medium Program
Can a museum exhibit on maps improve visitors’ knowledge of geography and understanding of its importance? [Rollover picture of children to present the following text: Exit survey results indicate that 72% of children in grades 3-5 were able to identify at least three features of maps and list three main points from the exhibit on map making.]

[Start COACH text]

Coach

A reminder about the differences between outputs and outcomes:

Outputs are measures of volume: products created or delivered, people served, activities and services carried out.  Think of outputs as the “things” piece of evaluation: products, deliverables, counts.  Outputs are almost always numbers:  the number of Interlibrary Loans, the number of attendees, the number of publications, the number of grants made, or the number of times a workshop was presented. 

Outcomes are the “people” or the “so what” piece – what changed in the participants because of the outputs.

Outputs

Outcomes

[End COACH text]
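
To make the Coach's distinction concrete, here is a minimal sketch that records an output as a simple count and an outcome as measured change in participants. The numbers are invented for illustration and do not come from either case study.

    # Output vs. outcome: a count of things delivered vs. change in people.
    attendees = 44                     # output: people served (a count)
    pre_ratings = [3, 2, 4, 2, 1, 3]   # 4-pt Likert ratings before the program
    post_ratings = [4, 3, 4, 3, 3, 4]  # same participants afterward

    def percent_rating_high(ratings, threshold=3):
        """Percent of participants rating at or above the threshold."""
        return 100 * sum(r >= threshold for r in ratings) / len(ratings)

    print(f"Output: {attendees} attendees")
    print(f"Outcome: {percent_rating_high(pre_ratings):.0f}% rated 3+ before, "
          f"{percent_rating_high(post_ratings):.0f}% after")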

[End Screen E-5 of 19]

 

[Start Screen E-6 of 19/Module E>Gathering data (6)]

Assessing Outcomes

You know what success will look like because you’ve already planned:

Indicators

[End Screen E-6 of 19]

 

[Start Screen E-7 of 19/Module E>Gathering data (7)]

Gathering data for improvement throughout

In Module D, we concentrated on evaluating outcomes—the focus of Shaping Outcomes. Evaluation can also provide useful information as you develop your program.

Evaluations at different phases of program development are called front-end, formative and summative. They can evaluate the process and the quality of products. This evaluation data is vital for interim and final reports.

[Graphic of man seated on edge of desk with blackboard behind him. Text reads: Will it Work? (front end) Pilot testing materials, such as the feedback form used after author visits to the Riverton group or lesson plans for elementary school teachers in the MoNA program.]

[Graphic of two men and one woman working at a laptop computer. Text reads: Is it Working? (formative) Mid-course corrections: compare self-assessments of revised work by the Riverton group to independent review. Identify additional participants if participant numbers are too low. Modify how you assess outcomes with more cost-effective measures.]

[Graphic of young woman smiling. Text reads: Has it Worked? (summative) Should this program be continued, expanded, replicated?]

[Start COACH text]

Coach

Why haven’t we mentioned process and product evaluation before? Because Shaping Outcomes focuses on evaluation of outcomes. Being able to report on changes in your audience is only one part of evaluation, although an important one!  The resources at the end of modules D and E will provide further information, and colleagues in your professional associations and publications can guide you in finding answers related to your own program.

[End COACH text]

[End Screen E-7 of 19]

 

[Start Screen E-8 of 19/Module E>Reporting the program (8)]

Reporting the program

Programs usually last for a specific time period and require a Final Report at the end, whether a formal report required by funders or a report circulated within your organization or to stakeholders. Programs that last a year or more usually require Interim Reports as well. Where needs assessment or prototyping is important, you may create preliminary or front-end reports.

Reports are structured to answer three basic questions:

Wanted to do what? Purpose statement giving need, target audience, activities/services, and outcomes

We did what? Report on activities, services, outputs, participant characteristics

So what? Report on outcomes, assessing whether targets were met. Is it worth continuing, expanding, replicating by others?

Let’s look at the most complex report—the final report. See how a Logic Model, like the one you’ve been preparing, helps organize and write the final report.

[Start DIG DEEPER text]

Dig Deeper

For large programs or for readers who may wish to replicate your work, you may want to include some of the additional information suggested below. However, notice that all the items still fit into the three main sections of the report: Wanted to do what? We did what? So what?

[End DIG DEEPER text]

[End Screen E-8 of 19]

[Start Screen E-8_M of 19/Module E> Reporting the program>Museum example (8-M)]

Museum example: MoNA Link final report

The MoNA program’s final report to the grant funder gets most of its text from its Logic Model. If you want to see the complete Logic Model, click on Cases and choose MoNA Link.

Section 1: What we wanted to do
Section 2: What we did
Section 3: So what?

Final Report on Link Program of the Museum of Northwest Art
To the Institute for Museum and Library Services

Section 1: What we wanted to do
The Link program of the Museum of Northwest Art (MoNA) has provided intensive training and other art museum services for Skagit County elementary school teachers in order to enhance teaching with art and related concepts such as visual thinking. Immediate goals included strengthened teacher skills for leading discussions about art, using museum visits and online resources to enhance teaching, and integrating art into classroom activities, for the ultimate purpose of helping students develop critical thinking skills. <from the “description” section of the MoNA Logic Model>

Needs met:

The program grows out of a perceived need for training, especially in light of new state-mandated standards and learning goals. Elementary-school teachers often feel inadequately prepared to teach hands-on art-making, do not know how to effectively talk with students about art, and are uncomfortable visiting a museum with or without their students. Many teachers will need training to effectively meet the anticipated formal assessment in the arts mandated by the Washington State Essential Academic Learning Requirements in 2008.

Additionally, most teachers have had little pre-service training in teaching critical thinking, one of Washington State’s Four Learning Goals, infused in all curricular areas of the state’s learning standards. In the Information Age when more than enough knowledge and data is easily accessible to everyone, education’s task is no longer to provide students with facts, but to help students locate and analyze information, make reasoned judgments, think creatively, communicate clearly and solve problems. These are the skills employers seek. <from the “needs” section of the Logic Model>

Section 2: What we did
Program Activities <What we did>:

This pilot program involved volunteer elementary school teachers in Skagit County <from “target audience” in Logic Model> who completed the MoNA Link program by participating in:

5-day, 3-credit summer institute on Northwest art history, visual art concepts, Visual Thinking Strategies, and critical thinking

2 training days: in-depth study of 2 exhibitions; meeting the artist; responding to the exhibition using art, writing, reflection

monthly meetings with other teachers and Museum Art Educator to develop curriculum linking exhibitions and classroom teaching

3 docent-guided Museum visits for the teacher’s class, with pre- and post-visit lessons taught in the classroom by the Museum Art Educator, culminating in a student demonstration of learning presented to other students, family, and community in the Museum or posted on the Museum website if desired <from “solutions” in Logic Model>

Accomplishments: 

In two summer institutes, we enrolled 25 elementary teachers from Skagit County in each institute, advertising the course through the Skagit County Board of Education; 21 completed the course in the first year and 23 in the second. After each institute, teachers participated monthly in an educators’ website to generate and share curriculum units, with two teacher-training days provided in connection with two different exhibits (one early in the year, one later). Each teacher’s visit with a school group was preceded and followed by a visit from the Art Educator, with docents leading the groups on their actual bus trips to MoNA.

Output                                      Y1    Y2            Total          Goal
Teachers completing summer institute        21    23            44             40
School groups visiting in first year        29    30            59             40
School groups making 3 visits in 2 years    10    9*            19*            20
Curriculum units                            61    20 to date    81 to date*    80

* report issued before completion of the second year for teachers taking the second summer institute.

Section 3: So what?

Indicators of Success: 

Our goal was to strengthen

teacher skills for leading discussions about art,
teachers’ use of museum visits and online resources to enhance teaching,
teachers’ ability to integrate art into classroom activities, for the ultimate purpose of helping students develop critical thinking skills. <from “description” in the Logic Model>

To reach this goal, we wanted to change the knowledge, skills and behavior of the teachers so that they would see the Museum as an effective resource and would in fact use a curriculum unit linking an exhibition to their classroom teaching. <from “outcomes” in the Logic Model>

Our indicators of success were as follows:

Outcome: Teachers will see the Museum as an effective resource for instructional support

Indicator: Number and percent of teachers who rate the Museum as an effective resource for instructional support
Data Source: 4-pt. Likert scale before summer institute and at end of training year, where 1 = not effective and 4 = very effective
Results, with target of 90%: Before summer institute, 25 teachers out of 44 (57%) rated the Museum at 3 or 4; after training year, 42 teachers (95%) rated it at 3 or 4. 

Outcome: Teachers will use a curriculum unit linking a Museum exhibition to their classroom teaching.

Indicator: Number and percent of teachers who develop at least one curriculum unit linking an exhibition to their classroom teaching during their training year.
Data Source: Teacher portfolios (at end of 2nd and 3rd year of MoNA Link)
Results, with a target of 100%: 42 teachers out of 44 (95%)

AND

Indicator: Number and percent of teachers who teach at least two lessons in their classroom based on a Museum exhibition during their training year.
Data Source: Teacher reports, observations and documentation by Museum art educator (throughout 2nd and 3rd year of MoNA Link)
Results, with a target of 100%: 42 teachers out of 44 (95%)

Conclusion:

The MoNA Link program has shown a high rate of success in leading and supporting elementary school teachers in linking their curriculum units to MoNA exhibits, helping them to develop and share curriculum units, providing support for them and their classes, and changing attitudes from feelings of intimidation to confidence and comfort. The student programs, described in Appendix C, show a range of ideas (including designing modern totem poles) as well as interpreting the stories of existing Northwest art objects. Although this report is being written before we have finished tabulating results for teachers attending the second summer institute, the results so far suggest we have exceeded or closely approached our ambitious goals.

The Skagit County Board of Education and the Board of the Museum of Northwest Art have reviewed the outcomes and expressed their satisfaction with the success of the program.  By posting our program materials on the web at http://www.mona.org/link we have tried to make it possible for other sites to follow our model.  We thank the Institute for Museum and Library Services for their support in this program.

[End Screen E-8_M of 19]

 

[Start Screen E-8_L of 19/Module E> Reporting the program>Library example (8-L)]

Library example: Riverton Memoirs final report

The Riverton program’s final report to the grant funder gets most of its text from its Logic Model. If you want to see the complete Logic Model, click on Cases and choose Riverton Memoirs.

Section 1: What we wanted to do
Section 2: What we did
Section 3: So what?

Final Report on the “Filling in the Dashes” Program
Riverton Public Library
Kentucky LSTA grant

To the Kentucky Department of Libraries and Archives

Section 1: What we wanted to do
With the aid of a $7650 LSTA grant from the Kentucky Department of Libraries and Archives (KDLA), the Riverton Public Library offered a creative writing program focusing on autobiographical pieces. County participants attended bi-monthly meetings, guided by a facilitator, on Wednesdays from 6:30-8:30 p.m. for a year. Meetings included visits from three published Kentucky authors of biography and memoirs as well as sharing and critiquing participants’ autobiographical writing. A small book containing the best samples of each group member's writings was published at the end of the program, and group members presented selections from their work at a community meeting. Participants improved their writing and demonstrated they felt themselves to be part of a community of writers.

Section 2: What we did

Needs met: 

The program grew out of requests by library patrons, confirmed by responses to locally published articles requesting feedback from potential participants. Library patrons wanted a writing program for adults in Riverton, Kentucky, a town of 3800 on the Ohio River, across from Indiana. Identified needs included: an organized "group" meeting regularly to produce some genre of writing, feedback from others on how to grow as a writer, and, most importantly, the organizational help of a facilitator.

Program Activities:

In response to advertising for the program by Library Director Gerry Bard, twenty-one adults started out in the program. Rod Blackmur, an instructor at a Louisville Community College, facilitated the group. The group completed the following activities:

20 two-hour meetings (Wed., 6:30-8:30 p.m.) were held (with only one meeting in December, January, July and August).  Eighteen of the original twenty-one participants continued with the group regularly.

Group members read from a list of five published Kentucky authors who had published memoirs or biographies.

Three Kentucky authors visited, discussing writing techniques and style.

Most meetings included critiques of autobiographical writings by group members with discussion led by the facilitator. Some meetings included in-meeting writing.

Each continuing participant had at least two completed pieces written, critiqued, revised and published in a small book at the end of the program.

Group members gave a reading from their works at the end of the year for the Wednesdays @ One Group that meets at the Library.

Director Gerry Bard publicized the program at county post offices, with handouts to the Rotary Club, Lions Club, and Women’s Club meetings. She worked with the Extension Agent to publicize the program with the Homemakers Groups. Registration was also publicized in church bulletins throughout the county.

The Library’s cataloger, Naomi Strang, ordered and processed the multiple copies of books included in meetings. Circulation Manager Tara Baker kept records to track and assess participation and outcomes. Mr. Blackmur handled editing duties involved in publishing the book, Kentucky Lives, by Wasteland Press of Louisville, Kentucky.

Director Bard arranged for visits by Kentucky authors Ed McCahan, Randy Black, and Bobbie Ann Johnson. Throughout the course of the year, visiting authors were photographed as they spoke to the group. Bard wrote two articles about the progress of the group for the local paper. In addition to the final presentation to the Wednesdays @ One Group at the library, Bard arranged for readings at a local coffee house (Coffee Klatch).

After eight months, each participant assembled an evaluation portfolio with two pieces, providing the original version, a description of what they tried to improve, and the revised essay.

All eighteen participants were published in the book Kentucky Lives, with three to five pieces by each author. All participants read either at Wednesdays @ One or at the Coffee Klatch.

Section 3: So what?

Indicators of Success <Outcomes>:

The goals of the program were that participants would improve their writing and demonstrate they felt they were part of a community of writers.

Improved writing: An analysis of the work of the eighteen writers was carried out in two ways by an independent expert, creative writing instructor Marcia Reed. She graded the essays for the 18 participants using her overall impression, without knowing the names of the writers or which of each pair of essays (2 originals, 2 revisions) was written first. She found that 12 writers (67%) improved their score on the revision. She then read the packets, evaluating them for improvement against the writer’s intent, with 17 writers (94%) showing improvement.

Community of writers: In an exit survey, all of the writers (100%) could name three ways they felt part of a community of writers, including such examples as critiquing their peers, revising, being published (in Kentucky Lives), discussing writing techniques with published authors, and presenting their work publicly at a library book club and at a local coffee shop. In addition, a phone survey three months after the close of the group showed that 15 (83%) acted in ways (predetermined by a checklist) that showed self-identification as a writer: discussing their published work with others, attending readings of other writers’ work, signing up for another writing activity, producing more writing, reading additional memoirs.

Participants indicated the importance of talking with published Kentucky authors in their sense of belonging to a community of authors.  They discussed techniques of writing and style with published writers in the context of their own experience as memoir writers.  All the participants met with at least two of the authors, and 16 of 18 (89%) met with all visiting writers. 

Analysis:

Everyone in Riverton is proud of the work done by program participants and pleased with the results. The participants showed they knew how to produce autobiographical writing by drafting, revising and publishing it, as well as presenting it to the community. A survey of participants showed that 100% of them rated “shaping stories of lived experience” as important or very important (using a 5-point Likert scale). Reading Kentucky Lives supports this view. Participants, ranging in age from 25 to 83, covered subjects such as experiences in the Great Depression, service in World War II, getting the family’s first television set, attending the Kentucky Derby in 1975, the first day of attending an integrated high school, and the terrorist attacks of September 11, 2001. The reception of the pieces by friends and families left no doubt of the writing’s authenticity and power to affect its readers. One middle-aged child of a participant said he had never known before of his father’s service in Europe during World War II.

The participants’ sense of improved writing has been confirmed overwhelmingly by an independent writing judge. Our Riverton writers feel pride and achievement; they have discovered a new avenue for communication with family and friends. The community has been enriched by the shared, crafted experience of its citizens. Writing is no longer a spectator sport in Riverton. The Library is proud to be not only the source of books, but also the cradle of a book.

We are grateful for the support of an LSTA grant that has made this pilot program possible. We expect to be able to sustain and continue the program, with the Library Director identifying appropriate local volunteers.

[End Screen E-8_L of 19]

 

[Start Screen E-9 of 19/Module E> Reporting the program (9)]

Front-end reports: Will it work?

Final reports are summative. They judge by the outcomes: has this program worked?

At the other extreme are front-end reports.  They deal with threshold problems. That is, they ask the questions, “Will it work?” and “Is it needed?” before you commit lots of time and money to a particular product or program element.

For example,
Does the software address issues the target audience wants?
Which brochure is most effective in attracting participants?
Will the software work on public machines as well as private or work machines? With modem access as well as faster DSL lines?

Such reports are usually short and can keep all members of the program team informed.

[End screen E-9 of 19]

 

[Start Screen E-10 of 19/Module E> Reporting the program (10)]

Interim reports: Is the program working?

A formative evaluation occurs in the middle of a program and allows you to ask the question: Is this working?

Use your interim report with your program team to adjust resource allocation or modify what isn’t working. Use your interim report to negotiate with partners and funders about any important changes that will increase the likelihood of success with the program.

Enough participants? Why not? What change would help?

Quality of the product? Is it good enough: software, instructional materials, themed activities in a summer library program?

Are you meeting your outcome targets? With multi-year programs, you may want to evaluate the effectiveness of the outcomes expected: Are outcomes clearly written? Are outcomes sufficient to describe what you hope will happen? Are data collection methods cost-efficient?
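
A minimal sketch of such an interim target check follows; the measures and numbers echo the MoNA first-year outputs purely for illustration, and the 50% rule of thumb is an assumption, not a Shaping Outcomes standard.

    # Interim check: compare first-year results against program targets.
    targets = {"school groups visiting": 40, "curriculum units": 80}
    year_one = {"school groups visiting": 29, "curriculum units": 61}

    for measure, goal in targets.items():
        done = year_one[measure]
        pct = 100 * done / goal
        # Halfway through a two-year program, under 50% may need attention.
        status = "on track" if pct >= 50 else "needs attention"
        print(f"{measure}: {done}/{goal} ({pct:.0f}%) - {status}")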

[End Screen E-10 of 19]

 

[Start Screen E-11 of 19/Module E> Reporting the program (11)]

Consider your stakeholders

For all reports, whether front-end, interim, or final, always consider what your most important stakeholders will want to know.

What will your stakeholders want to know?

[Clicking the graphic of a business meeting labeled "Organization Board" presents the following text: How does this further our mission? Is our reputation enhanced? How can we get the word out?]

[Clicking the graphic of a woman labeled "Funder" presents the following text: Did you accomplish what you said you’d do? Did you spend the money as promised? Is this a replicable model?]

[Clicking the graphic of a doctor labeled "Partners" presents the following text: What did we accomplish? Does it make sense to continue working together? Is this a good partner for different programs in the future?]

[Clicking the graphic of a couple labeled "Target Audience" presents the following text: How do the results measure up with my experience? Can I be identified? Does the program seem successful and would I participate in another?]

[Begin DIG DEEPER text]

Dig Deeper

Your target audience may never see the formal final report, but consider sharing the results broadly in formats such as annual reports and member newsletters. It is a great way to celebrate your success, and recruit future participants. Having full reports available upon request is a good way to demonstrate public accountability and transparency.

[End DIG DEEPER]

[End Screen E-11 of 19]

 

[Start Screen E-12 of 19/Module E>Improvement (12)]

Program improvement through reporting

Reports improve communication about the program for many audiences:

For each of the activities listed below, what audience(s) are affected?

[Clicking the words "Grant announcement: You announce the award of a competitive grant by sending a press release in a letter of congratulations to your board, members, friends, and all in the community who support your institution." presents the following text:Funders are glad you’re spreading the word about their award. Team members, your institution and partners are glad for the publicity. Stakeholders think well of you and are alerted to watch for future services. To see an actual example, click on the Cases tab and choose The Children’s Museum press release (under Forms and Examples).]

[Clicking the words "Monthly report: You send a weekly report about creation of a software prototype to all program team members, even those not on the software team." presents the following text: Probably only the program team will read this report; it keeps people in different departments and with different functions aware of overall progress. To see an actual example of using monthly reports for outsiders as well as team members, click on the Cases tab and choose the blog for Bridging the Gap for Hispanic Newcomers (under Forms and Examples).]

[Clicking the words "Year One report: You send a Year One report to the granting institution.Probably only the program team will read this report; it keeps people in different departments The granting agency probably requires this. But it’s useful to share the report or an executive summary with your institution, your partners and the program team. To see an actual example of an interim report and compare it to the grant proposal and final report, click on the Cases tab and choose the reports for Teaching Colorado’s Heritage with Digital Resources (under Forms and Examples).]

[Clicking the words "Feature story: You write a feature story about the program for your newsletter, annual report, or your Web site. Be sure to add a link or web address to any granting agency for users to learn more about the award.Probably only the program team will read this report; it keeps people in different departments Any one of your audiences may see this, but the main purpose is to attract attention among stakeholders in the community, especially possible participants. To see an actual example, click on the Cases tab and choose the article for the Colorado Digitization Project (under Forms and Examples).

[Clicking the words "Expanded service: You distribute a press release to news media about expanded service offered as a result of your program.Probably only the program team will read this report; it keeps people in different departments Your main audience will be stakeholders in the community, but your announcement might be a part of a report to any of your audiences. To see an actual example, click on the Cases tab and choose the Walker Museum’s Art on Call press release (under Forms and Examples).]

[End Screen E-12 of 19]

 

[Start Screen E-13 of 19/Module E>Improvements (13)]

Press Releases

A special kind of report that can be useful is a press release. When you highlight outcomes, you are telling the story in terms your community cares about: how you improve their lives. You can get publicity and attract supporters throughout your program by preparing press releases:

To celebrate a grant starting your program.
To announce the rollout of services.
To report the success of your program.

Write the press release: Indicate who, what, when, where and why. Mine your Logic Model for content.

Distribute your release: Develop relationships with local publications and media that carry stories about programs in the community. Clip or save good examples for a file you keep. Develop a distribution list, including newspapers, television stations, news and wire services, and fax your press release. Follow up with a phone call to confirm receipt and to “pitch” the story. Include a picture if possible.

Print Sources

For Immediate Release
Contact: (insert your institution’s contact name, telephone number and email address)
IMLS contact: Eileen Maxwell, 202/606-8339 or emaxwell@imls.gov

(insert your institution’s name) Awarded Prestigious IMLS Grant
(insert your institution’s city) Thanks to a grant from the federal Institute of Museum and Library Services (IMLS), (describe how your institution will use the grant money for the benefit of your community).
(insert a quote from your institution’s director)
(insert the quote from the IMLS director provided for you in the Fast Facts on IMLS Award Programs)
(use the “Program Statement” and “Vital Statistics” provided for you in the Fast Facts on IMLS Award Programs)

The Institute of Museum and Library Services is an independent Federal grant-making agency dedicated to creating and sustaining a nation of learners by helping libraries and museums serve their communities.
(Example courtesy of the Institute for Museum and Library Services)

Public Service Announcements

Radio Public Service Announcement
Script for a Public Service Announcement (PSA) (10-30 seconds). Courtesy of the Institute for Museum and Library Services.
PUBLIC SERVICE ANNOUNCEMENT
USE THROUGH NOVEMBER 30, 2006
:30 [Indicates that the announcement runs for thirty seconds]

ONE OF THE MOST IMPORTANT ACTIVITIES TO SHARE WITH YOUR BABY IS READING. THAT’S WHY THE BROWN COUNTY PUBLIC LIBRARY IS OFFERING THE “BORN TO READ” PROGRAM FOR TEEN PARENTS. TUTORS PROVIDE COACHING, A VIDEO OFFERS TIPS, AND LIBRARIANS DISTRIBUTE BOOKS AND OTHER MATERIALS. THE PROGRAM IS MADE POSSIBLE THROUGH A GRANT FROM THE FEDERAL INSTITUTE OF MUSEUM AND LIBRARY SERVICES. FOR MORE INFORMATION OR TO BE A VOLUNTEER CALL 505/555-1234.

[End Screen E-13 of 19]

 

[Start Screen E-14 of 19/Module E>Improvements (14)]

Credibility through reporting outcomes

Another reason to report outcomes is that your report gains credibility when you include them. Below you can read the original report and an enhanced version of the same report. The additions are highlighted in italics.

Notice what kinds of things have been added.

[Graphic of a woman reading a report]

Original Report

Washington Public Library is pleased to celebrate the end of our summer “Read Along with Me” program. The RAM program brought together children and their parents for weekly book discussions during July and August.

We all know how special the parent-child bond is. Children love doing things with their parents. Reading is important too! It is fundamental to how well children do in school.

WPL is happy to have brought this opportunity to the parents and children of our community.

[Graphic of a man and woman discussing a report]

Enhanced Report

Washington Public Library is pleased to celebrate the end of our summer “Read Along with Me” program. The RAM program brought together over 20 third and fourth graders, and 30 fifth and sixth graders, along with their parents, for weekly book discussions during July and August.
           
We all know how special the parent-child bond is. Children love doing things with their parents. Reading is important too! It is fundamental to how well children do in school.

At the end of August, most of our parents reported that their children had read even more than the four books that each group discussed over the summer. Last year, when we contacted parents in the fall after the family had participated in the program, they said their children’s teachers had noticed improved reading skills, and for a few children, their grades had gone up. The children themselves, too, said they had fun both reading, and spending time with their fathers or mothers.
           
WPL is happy to have brought this opportunity to the parents and children of our community.

[Start DIG DEEPER text]

Dig Deeper

The table below shows the kinds of information (listed from most important to least) that decision-makers outside the library and museum communities most want. It lists the four most common categories of messages about libraries or museums, with some ways to collect supporting evidence.

Notice that outcomes assessment is the most common way to support the most sought-after information.

[Table with two columns: Message, and Information Strategy for Understanding Museum and Library Performance]

Message: What Good We Do/Why We Matter
Strategy: Outcomes measurement

Message: How Much We Cost/What We’re Worth
Strategy: Return on investment; cost-benefit calculations

Message: How Well We Do It
Strategy: Customer satisfaction, quality benchmarks, rankings

Message: How Much We Do
Strategy: Inputs and outputs: statistics, gate counts, Web use logs, and other measures of quantity and productivity

[End DIG DEEPER text]

[End Screen E-14 of 19]

 

[Start Screen E-15 of 19/Module E>Improvements (15)]

What is success?

Both the Museum and Library final reports show their programs generally meeting or exceeding the targets set for success. But what if your program doesn’t? Can the report still be a success?

A good report includes an honest accounting of what happened plus an analysis of why you succeeded, as well as where you fell short. It allows others to learn from the experience.

Most important, a good report allows YOU to learn from the experience. Examine any lack of success so you can learn from it.

[Start COACH text]

Coach

Resist the impulse to hide problems in your report. Most readers will see the truth and will then distrust you because you camouflaged the problem.

[End COACH text]

[End Screen E-15 of 19]

[Start Screen E-16 of 19/Module E>Improvements (16)]

Mid-course corrections

Consider the following comments in a final report. As you develop your Logic Model and plan your reports, consider how early reports could help you make mid-course corrections.

What didn’t happen? Fixable? We didn’t reach our goal for participants. How could we have fixed this? Bigger groups to protect against dropout?

What did happen? Avoidable? A bus strike reduced access for distant families. Could we have avoided this? No, but …could we have suspended the program for a week? Added on an additional week as a repeat?

How close did you get? We realized we hadn’t planned for attrition, so we increased class size, and improved our numbers, but still missed our target by 2.

Did you achieve something valuable anyway? We didn’t achieve the goal, but the manual we developed will be a valuable starting place for the future.

What will you do differently in the future? In retrospect, how would you have adjusted your budget, your staffing, your timing? Add cookies to events for young children. Be sure storyteller comes from the neighborhood. Start the institute between 7 and 14 days after the end of the semester. Adjust targets and future program based on this experience.

[End Screen E-16 of 19]

 

[Start Screen E-17 of 19/Module E>Check understanding]

Check your understanding

Your supervisor has asked you to write the final report, being sure to include the five items of information listed below. Which part of the report should each item go in: What we wanted to do? What we did? So what?

Item 1: Notice of funding by a local foundation

[Clicking on “wanted to do” presents the following text: This could go in “What we wanted to do” as the “punch line” and transition to What we did, OR as the happy news that begins What We Did: “With the help of a generous grant from…”]

[Clicking on “we did” presents the following text: This could go in “What we wanted to do” as the “punch line” and transition to What we did, OR as the happy news that begins What We Did: “With the help of a generous grant from…” But it doesn’t belong in “So what?”]

[Clicking on “so what” presents the following text: No, this could go in “What we wanted to do” as the “punch line” and transition to What we did, OR as the happy news that begins What We Did: “With the help of a generous grant from…” But it doesn’t belong in “So what?”]

Item 2: # of participants attending programs offered

[Clicking on “wanted to do” presents the following text: No, this reports on an output, clearly part of “What we did.”]

[Clicking on “we did” presents the following text: Yes, this reports on an output, clearly part of “What we did.”]

[Clicking on “so what” presents the following text: No, this reports on an output, clearly part of “What we did.”]

Item 3: The mission of the institution

[Clicking on “wanted to do” presents the following text: Yes, you can use the institutional mission in explaining “What we wanted to do” OR in showing the importance of your achievement in “So what?” (IF you have achieved success!)]

[Clicking on “we did” presents the following text: No, you can use the institutional mission in explaining “What we wanted to do” OR in showing the importance of your achievement in “So what?” (IF you have achieved success!). But your mission isn’t part of “What we did.”]

[Clicking on “so what” presents the following text: Yes, you can use the institutional mission in explaining “What we wanted to do” OR in showing the importance of your achievement in “So what?” (IF you have achieved success!).]

Item 4: Exit survey of participants’ attitudes

[Clicking on “wanted to do” presents the following text: No, as an indicator of an outcome, this is clearly part of the “So what?” story.]

[Clicking on “we did” presents the following text: No, as an indicator of an outcome, this is clearly part of the “So what?” story.]

[Clicking on “so what” presents the following text: Yes, as an indicator of an outcome, the results of the survey are clearly part of the “So what?” story.]

Item 5: Staff hours and responsibility

[Clicking on “wanted to do” presents the following text: No, staff hours and responsibilities are part of “What we did,” but you may want to put this information in an appendix!]

[Clicking on “we did” presents the following text: Yes, this input is part of “What we did,” but you may want to put this information in an appendix!]

[Clicking on “so what” presents the following text: No, staff hours and responsibilities are part of “What we did,” but you may want to put this information in an appendix!]

[End Screen E-17 of 19]

 

[Start Screen E-18 of 19/Module E>Apply understanding (17)]

Apply your understanding

Follow your instructor’s directions for any assignments.

You have reached the end of the last module.

[End Screen E-18 of 19]

 

[Start Screen E-19 of 19/Module E>Resources (19)]

Resources

Readings

Aten, Luan. How to write a press release. Retrieved May 8, 2006, from http://www.lunareclipse.net/pressrelease.htm
Basics of writing and formatting a press release.
 
Frechtling, J., Sharp, L. & Westat, Inc. (1997, August). User-friendly handbook for mixed method evaluations, [website of Directorate for Education and Human Resources & Division of Research, Evaluation and Communication]. Retrieved August 10, 2005, from http://www.ehr.nsf.gov/EHR/REC/pubs/NSF97-153/pdf/mm_eval.pdf
This thorough 131-page handbook for the National Science Foundation provides excellent guidelines for a variety of institutional program reports. The document’s nine chapters are divided into Introduction to Mixed Method Evaluations, Overview of Qualitative Methods and Analytic Techniques, Designing and Reporting Mixed Method Evaluations, and Supplemental Materials (that is, an annotated bibliography and glossary).

Pearson Education, Inc. (2004). Analyzing quantitative data. Retrieved on August 11, 2005, from http://wps.prenhall.com/chet_airasian_edresearch_7/0%2C6488%2C382261-%2c00.html
Pearson Education, Inc., publishing as Pearson Prentice Hall, established this electronic resource for educational research. This module is designed to build statistical analysis skills, consisting of an overview of how descriptive statistics are used to describe demographic characteristics of a sample, performance on tests, data analysis, and research questions.

United States General Accounting Office. (1992, May). Quantitative data analysis: an introduction. Retrieved August 11, 2005, from http://www.gao.gov/policy/10_1_11.pdf
The United States General Accounting Office produced this comprehensive document for its Program Evaluation and Methodology Division. The first chapter is an introduction to guiding principles, quantitative questions, attributes, analysis, variables and measurement for report writing.

United Way of America. (2005). Achieving and measuring community outcomes: challenges, issues, some approaches. Retrieved August 11, 2005, from http://national.unitedway.org/outcomes/library/cmtyres.cfm
United Way shares its experience of the implementation of Outcomes Based Evaluation in over 280 agencies. The report outlines selecting appropriate outcomes to achieve, developing a strategy for achieving the intended outcomes, creating and implementing an action plan, identifying indicators of success, measuring the selected indicators, and linking program outcomes to community outcomes in the reporting process.
 
Xpress Press. How to write and format a press release for e-mail distribution. Retrieved May 8, 2006, from http://www.xpresspress.com/PRnotes.html
Instructions, a sample, and advice on when to send a press release.

[End Screen E-19 of 19]

[End of Module E]

