Plain answers to the questions readers ask after the article. The
structured versions of these answers also appear in this page's
schema, so the same content shows up in search-result rich snippets
and AI Overview citations.
01
What is a program report?
A program report is the structured artifact a program team
produces about a single program: who participated, what
activities ran, what outcomes those activities produced, what
participants said about the experience, and what the team
learned that changes the next cycle. Every other report
a nonprofit publishes (grant report, donor report, board report,
annual report) is a filtered view of one or more program
reports.
02
What is the difference between a program report and an impact report?
A program report is scoped to one program: one cohort, one site,
one funded activity. An impact report is organization-wide,
covering all programs across an annual cycle. The impact report
is built by aggregating multiple program reports. If
your impact report numbers do not match your program report
numbers, the architecture underneath is broken.
03
What are the five sections of a program report?
Section one is the headline outcome: one number plus the
population it applies to. Section two is who showed up:
demographic breakdown captured at intake. Section three is what
changed: pre-post movement on the outcomes the program theory
predicted. Section four is what participants said: themed
open-ended responses with citations. Section five is what was
learned and what is next: methodology and the forward-looking
note.
04
How long should a program report be?
Long enough that a sophisticated reader can verify the claims
and short enough that a busy program officer reads to the end.
In practice that is six to twelve pages, whether delivered as an
interactive report or as a live URL. Most teams over-produce:
the report grows because each audience requested an addition.
The fix is one report, multiple filtered views, not one report
with every section every audience ever asked for.
05
How long does it take to produce a program report?
Days, not weeks, when the data architecture is right. The
traditional path takes four to six weeks because every cycle
reconstructs the dataset from scratch. With persistent
participant IDs assigned at intake, qualitative coding running
as responses arrive, and demographics tagged as structured fields,
the report is ready when the program closes. The first
cycle takes a day or two of configuration; subsequent cycles
take minutes.
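As a rough sketch of what persistent IDs buy you: if intake and follow-up records share a participant ID, the pre-post dataset becomes a join rather than a manual reconstruction. The field names below (`participant_id`, `score`) are illustrative, not a real schema.

```python
# Sketch: pre-post matching via persistent participant IDs.
# Field names (participant_id, score) are illustrative only.

def match_pre_post(intake_records, followup_records):
    """Join intake and follow-up rows on participant_id."""
    baseline = {r["participant_id"]: r for r in intake_records}
    matched = []
    for f in followup_records:
        pre = baseline.get(f["participant_id"])
        if pre is not None:  # only participants seen at intake count
            matched.append({
                "participant_id": f["participant_id"],
                "pre_score": pre["score"],
                "post_score": f["score"],
                "change": f["score"] - pre["score"],
            })
    return matched

intake = [
    {"participant_id": "p1", "score": 40},
    {"participant_id": "p2", "score": 55},
]
followup = [
    {"participant_id": "p1", "score": 62},
    {"participant_id": "p3", "score": 70},  # no intake record: excluded
]

matched = match_pre_post(intake, followup)
```

Without the shared ID, this join is the four-to-six-week reconstruction step; with it, the matched dataset exists the moment follow-up data lands.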
06
What is a program report template?
A program report template is the reusable five-section structure
described in this article. The template stays stable across
program types because the questions every audience asks are
stable: who, how many, what changed, what did they say, what did
we learn. Sector-specific metrics fit inside the
template; the template does not change to accommodate sector.
A workforce program and an environmental program use the same
five sections with different numbers.
07
Who reads a program report?
The program team reads it first, to learn what worked. The board
reads a filtered view to decide whether to continue the program.
The funding foundation reads a filtered view scoped to the
activities they funded. Major donors read a filtered view scoped
to their gift area. The public reads a summarized view in the
annual report. Five audiences, one source, no parallel
authoring cycles.
08
Can a program report be a live URL instead of a PDF?
Yes, and live URLs outperform PDFs on every dimension that
matters: the funder revisits across the year rather than reading
once and filing it; the data refreshes as the program continues;
the qualitative section drills back to the source response; the
audit trail is visible. PDFs still have a place for
board books and printed donor packets, but the canonical
artifact is the live URL the report is generated from.
09
What metrics belong on a program report?
The outcomes the program theory predicts will move, plus the
inputs needed to interpret them: sample size, response rate,
demographic disaggregation, and the methodology used to match
baseline to follow-up. Vanity metrics like attendance counts
belong in the appendix or not at all. The rule across
every sector: a few outcome metrics with baselines and
disaggregation, plus participant voice that explains the
numbers.
10
How does a program report support the grant report or impact report?
It supplies the data. A grant report is one program report
filtered to the activities the grant funded and rendered into
the funder's required template. An impact report is multiple
program reports aggregated across the organization. A donor
report is one program report filtered to the gift area.
The program report is the source; downstream views are
queries against it. Build the program report well and
the rest become exports rather than authoring projects.
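One way to picture "downstream views are queries against it": if each record in the program report carries tags for its funder and gift area (hypothetical fields below), a grant report or donor report is a filter over one dataset, not a second authoring project.

```python
# Sketch: downstream reports as filtered views of one program report.
# Record fields (program, funder, gift_area, outcome) are illustrative.

records = [
    {"program": "workforce", "funder": "acme_fdn", "gift_area": "jobs", "outcome": 1},
    {"program": "workforce", "funder": "acme_fdn", "gift_area": "jobs", "outcome": 0},
    {"program": "workforce", "funder": None, "gift_area": "jobs", "outcome": 1},
]

def view(records, **criteria):
    """Return the subset of records matching every key=value criterion."""
    return [r for r in records if all(r.get(k) == v for k, v in criteria.items())]

grant_view = view(records, funder="acme_fdn")   # scoped to funded activities
donor_view = view(records, gift_area="jobs")    # scoped to the gift area
```

Because every view reads the same `records`, the grant report and the impact report cannot disagree with the program report they are drawn from.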