
Impact Report Template: Free Examples for Every Sector

Download free impact report templates with section structure, real examples, and AI-powered reporting — built for nonprofits, CSR, and foundations.

Updated May 1, 2026
Use Case: Impact report template

A template gives you the headers. A working report fills them with evidence that holds. This page does both.

Eight sections, the question each one answers, length guidance per section, and a working example you can copy section by section. Adapt for nonprofit annual reports, donor updates, quarterly board pre-reads, or one-page summaries. The methodology behind the template lives on the impact reporting page; this one is for writing the document.

On this page
The eight-section anatomy
Definitions and distinctions
Six writing principles
Section format and length
A worked example, annotated
Three template variants

Live samples · 4 reports · no login

Impact report examples your funders will actually read.

Four real Sopact reports, four different donor audiences. Each opens in a browser without a login. Adapt any one to your annual report, your foundation grantee submission, your scholarship donor packet, or your corporate sponsor brief.

Each report came out of program data in minutes, not assembled in six weeks from three disconnected exports. The architecture underneath, not the styling, is what makes them defensible to a sophisticated donor.

Open any one. No login. Real program data, anonymized.
The anatomy

Eight sections, one question per section

Most working impact reports follow the same eight sections in the same order, scaled to the audience. Each section answers one question. The job of the report is to answer the eight questions in plain language with evidence that holds together. The job of the template is to keep you from forgetting any of them.

01

Cover and headline finding

Did the program work? One sentence the reader absorbs in three seconds, plus the most representative number.

1 page
02

The strategic question

What was the program trying to change? The outcome the program was designed to produce, named in the words the team uses internally.

Half a page
03

Methodology in brief

How was the data collected? Who counts as a participant, when measurement happens, what the comparison is. Plain language, not academic.

Half to 1 page
04

Outcomes

What changed? Three to five outcome metrics with baseline and follow-up, charted next to the prose that interprets them.

2 to 3 pages
05

Stories

What did the change look like for one person? Two or three participant stories tied to the outcome numbers, not decorative.

1 to 2 pages
06

Honesty section

What underperformed and why? The section that builds trust. The reader knows nothing works perfectly. Naming what did not work signals the rest is honest too.

Half a page
07

Comparison and context

Compared to what? Prior period, prior cohort, regional benchmark, or program target. The number alone does not mean anything without the comparison.

Half a page
08

Forward look

What changes next cycle? What the team learned from this report and what will be done differently. Closes the loop between reporting and program design.

Half a page
Working report

8 to 12 pages total, plus a one-page summary at the front. Larger organizations sometimes run 20 to 30 pages with appendices. Length tracks audience attention more than program complexity. A short report that answers one strategic question outperforms a long report that lists every activity.

A note on section nine. Acknowledgments and governance can sit at the end as a ninth section if the audience is funders, regulators, or a board that expects them. Most public-facing reports leave acknowledgments to a footer.

Definitions

The terms a first-time reader runs into

Five questions, five plain-language answers. Starting here lets the rest of the page read cleanly without you having to hold several definitions in working memory.

What is an impact report template?

An impact report template is a reusable structure for the document a program produces to show what changed because of its work. A working template names four things: the eight sections, the question each section answers, the length each section should run, and the format choices that make the report readable. A weak template gives only headers and leaves the rest of the writing to the team.

The template you adopt is less important than whether you actually fill in each section against a real strategic question. Most teams that struggle with impact reports do not need a better template. They need to answer the eight questions in plain language before they format anything.

What is an impact report?

An impact report is the document a program produces to answer a single strategic question with evidence. It pairs outcome numbers with participant stories that describe the same population. The defining feature is that the headline finding on the cover, the outcome chart on page four, and the story on page six all describe the same people.

The companion page on impact reporting covers the methodology behind the document. This page covers the document itself.

Impact report meaning, in plain terms

In plain terms, an impact report shows what changed because of an organization's work, with evidence. The phrase usually appears when someone is encountering the term for the first time and wants the difference between an impact report and other reports they already know about, like an annual report or a financial statement. The defining feature is the focus on change for the people the program serves, not on the activities the program ran.

What is impact report format?

Impact report format refers to two things at once. The file format the report ships in, which is usually a PDF, sometimes a web page, sometimes a slide deck, and often a one-page summary alongside one of the others. And the visual format the content takes, meaning text-heavy versus chart-heavy, single-column versus magazine layout, photo-rich versus minimalist.

Most programs publish in two or three formats from the same content. A short PDF for download, a web version for sharing on social and embedding in funder portals, and a one-page summary for board meetings or donor mailings.

What is the purpose of creating an impact report?

An impact report serves three audiences at once. It accounts to funders, regulators, or LPs for resources spent. It informs the program team about what is working and what is not. It builds external trust by showing the work to a wider audience in a form they can understand.

A report that serves only the first purpose tends to read as a compliance artifact. A report that serves all three becomes a planning instrument the team uses to decide what to do next. The template choices on this page are aimed at the third version.

Often confused with

Related templates and what they do differently

Impact report vs annual report

Different organizing question. An annual report is organized around the organization itself. An impact report is organized around the change the organization is trying to create. Many groups publish both, with the impact report leading on outcomes and the annual report carrying audited financials.

Impact report vs impact assessment

Different audience. An impact assessment is the internal analysis asking how much change the program caused versus what would have happened anyway. An impact report is the published document built from the findings. Reports do not carry counterfactuals; assessments do.

Annual impact report vs donor impact report

Different scope. The annual report covers all programs together, year over year, public-facing. The donor report is one or two pages tied to a specific donor's gift, sent quarterly or after a campaign. Donor reports are derivatives of the annual report, not separate work.

Impact report vs closeout report

Different purpose. A closeout report is the contractual document a grantee sends a funder when a grant ends. It is private and structured to the funder's terms. An impact report is public and strategic. The same outcome data feeds both.

Six writing principles

What makes the template work when you fill it in

A template is structure, nothing more. The principles below are what separate a polished blank from a report a funder reads past page two. Each one targets a specific failure mode programs fall into across nonprofit, CSR, and impact-fund reporting.

01 · COVER

Lead with the finding, not the logo

The cover answers the reader's first question, in one sentence.

A reader picking up the report wants to know whether the program worked. The cover should answer that in one sentence with one number, before any branding, photo, or table of contents. The most important page in the report is the first one.


Why it matters. A cover that states the finding survives skim-reading. A cover that states the organization name does not.

02 · STRUCTURE

One question per section

Each of the eight sections answers exactly one question, no more.

Section drift is the most common template failure. The methodology section starts to argue for the program. The outcomes section starts to apologize for the data. Naming the question each section answers and writing only to that question keeps the report short and the chain readable.


Why it matters. Eight sections that each answer one question is twelve pages. Eight sections that each answer three is forty.

03 · BINDING

Pair every chart with a story

The story comes from one of the people inside the chart.

A 71% completion rate next to a quote from someone in the 71% binds the quantitative to the qualitative. A 71% completion rate next to a generic testimonial does not. Most programs already collect both and then present them as separate exhibits.


Why it matters. Reviewers trust paired evidence. Decorative quotes near unrelated charts read as marketing.

04 · METHOD

Methodology on page two, not in an appendix

Half a page in plain language, near the front.

Skeptical readers check the method before they trust the numbers. If the method is buried at the back, the skeptical reader has already disengaged. A short methodology note placed early signals confidence and pre-empts the questions that would otherwise crowd a Q-and-A.


Why it matters. Plain-language method on page two earns the rest of the report the benefit of the doubt.

05 · HONESTY

Include the section about what underperformed

Half a page that names what did not work and why.

Readers know nothing works perfectly. A report that names what underperformed signals that the rest of the report is honest too. The honesty section is the smallest section by length and the largest by trust impact. Most templates drop it. The ones that keep it consistently outperform.


Why it matters. A report without an honesty section reads like marketing, no matter how careful the rest of the writing is.

06 · CLOSE

End on a forward look, not a celebration

Half a page on what changes next cycle.

Reports that end on a celebratory note close the file. Reports that end on what the team is changing for the next cycle invite the reader back. The forward look turns the report from an artifact into a conversation. It is also where the program team writes for itself, not for the funder.


Why it matters. The forward look closes the loop between reporting and program design. Without it, reports do not feed planning.

Format choices

Six choices that decide whether the report works

Six format-and-presentation decisions every report writer makes, often without naming them. The first column names the choice. The middle two show what most teams do versus what works. The last names the consequence.

The choice
Broken way
Working way
What this decides

Length

How many pages the body of the report runs.

Broken way

Sprawling 40-to-60 pages because the team treated the report as a chance to list every program activity from the year.

Working way

8 to 12 pages, plus a one-page summary at the front. Each section answers one question and stops. Activities live in an appendix or a separate annual report.

What this decides

Whether anyone reads past page two. Long reports get filed. Short reports get read and shared.

Cover

What appears on the front page.

Broken way

Logo, tagline, and a stock photo of smiling participants. Reader has no idea what the report says without opening it.

Working way

Headline finding stated in one sentence, with the most representative number. Organization name and reporting period in small type.

What this decides

Whether a busy reader knows the answer in three seconds. The cover is the most-read page in the document.

Methodology placement

Where the data-collection note sits.

Broken way

Buried in an appendix at page 38, in academic language a non-specialist cannot parse. The skeptical reader disengages before reaching it.

Working way

Half a page in plain language on page two. Names who counted, when measurement happened, and what the comparison is.

What this decides

Whether a skeptical reader trusts the rest of the report. Method placement is a signal, not a logistics choice.

Stories

How quotes and case stories appear.

Broken way

Decorative pull-quotes scattered through the report, sourced from people the program team picked because the quotes were flattering. Disconnected from the numbers nearby.

Working way

One story per outcome, sourced from someone whose number is in the chart on the same page. Story attribution names the cohort or program stage, not the individual.

What this decides

Whether the qualitative reinforces the quantitative or contradicts it. Mismatched stories signal cherry-picking.

Format variants

How many versions of the report ship.

Broken way

PDF only. Board members get the same 25-page document as the foundation program officer and the donor base. Each audience gets the wrong amount.

Working way

Three formats from the same content. Full PDF for download. One-page summary for board pre-reads and donor mailings. Web version with anchor links for sharing and embedding.

What this decides

Whether each audience gets the version they will actually read. The same content, three packaging choices.

Tone

What voice the report uses end to end.

Broken way

Celebratory throughout. Every chart is a win. No section names what underperformed. The report reads like fundraising copy and is treated that way.

Working way

Honest about what worked and what underperformed. The honesty section is half a page and earns the rest of the report the benefit of the doubt.

What this decides

Whether the report builds trust or feels like marketing. Tone is not a writing style choice. It is a credibility choice.

Compounding effect

The first choice controls all the others. A team that commits to 8 to 12 pages writes a different cover, a tighter methodology note, and a more honest tone, because there is no room to hide. A team that drifts to 40 pages spends most of the cycle filling pages and ends with a document no audience reads cover to cover.

A worked example

Three sections, side by side

An anonymized workforce training program, mid-cohort cycle, eight weeks before the foundation reporting deadline. The team has the data. They are deciding how to write three of the eight sections. Each block below shows the weak draft and the strong rewrite, with a note explaining the move.

Setup

We had 247 enrolled, 184 completed, 155 placed within 90 days, and a $14,200 average wage gain at the twelve-month follow-up. Our soft-skills module was rated lowest in the post-survey by a wide margin. The first draft of the report celebrated everything. Our program officer asked us to be honest about what underperformed. So we rewrote three sections.

Workforce training program lead, eight weeks before report deadline

Section 01

The cover

Weak draft

Workforce Bridge 2025 Annual Impact Report

Empowering futures, transforming communities

What it does. Logo, organization name, and a tagline. The reader has no idea whether the program worked without opening the document.

Strong rewrite

$14,200
Average wage gain twelve months after placement, 2025 cohort

Workforce Bridge · 2025 Annual Impact Report

What changed. The cover now answers the reader's first question in three seconds. The organization name moves to small type. The tagline is gone.

Section 04

Outcomes paired with stories

Weak draft

Outcomes

155 graduates placed within 90 days. Average wage gain of $14,200.

"This program changed my life." Sarah J.

What it does. Two numbers and a generic quote from someone whose connection to the numbers is unclear. Reader cannot tell if Sarah was placed, completed, or dropped out.

Strong rewrite

Placement and wage gain

155 of 184 program completers placed within 90 days, 84%. Average wage at placement was $19,400 against a regional baseline of $11,800 for similar entry-level roles, a $7,600 gap at placement that grew to a $14,200 gain by the twelve-month follow-up.

"Six months in, I'm running a service crew. Before the program I was waiting tables." Cohort 14 graduate, twelve-month follow-up.

What changed. The number now has a comparison (regional baseline) and a durability check (twelve months). The quote attributes to the cohort the number describes, not a name. The story and the chart point to the same population.
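The placement rate in the strong rewrite is checkable from the counts on the page. A quick arithmetic check, in plain Python with nothing program-specific:

```python
# Sanity check on the rewrite's placement rate: 155 of 184 completers.
completers = 184
placed = 155
rate = round(placed / completers * 100)
print(f"{rate}% placed within 90 days")  # 84% placed within 90 days
```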

Section 06

The honesty section

Weak draft

Lessons learned

We continue to learn from our cohort and refine our curriculum based on participant feedback. We are excited about the journey ahead.

What it does. Avoids naming anything that underperformed. Reads like marketing copy. The reader assumes the team is hiding something, even when they are not.

Strong rewrite

What underperformed

Our soft-skills module rated lowest in the cohort 14 post-survey, 2.8 out of 5 against an average of 4.1 across the other modules. Participant comments named the role-play exercises as feeling staged. The module is being rebuilt for cohort 15 with input from two employer partners, and the redesigned module will be measured the same way.

What changed. The section names what underperformed, gives the number, attributes the issue, and explains what is being done about it. Half a page. The trust gain across the rest of the report is large.

What this required upstream

The rewrite was a 90-minute job, not a six-week one.

All three rewrites needed the same thing: data already bound to the participants who produced it. The strong outcomes section needed a quote from someone in the placement cohort, attributable to the right cohort. The strong honesty section needed module-level survey scores tied back to cohort 14 specifically.

Programs whose intake, follow-up, and qualitative coding share an identity layer can do this rewrite in a working session. Programs running a survey tool, a CRM, and three spreadsheets in parallel cannot, because the chain from quote to participant to outcome was broken months earlier and would need to be rebuilt from scratch first.
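What a shared identity layer means in practice can be sketched in a few lines. The schema below is hypothetical; table and column names like `participant_id`, `placed_90d`, and `quote` are illustrative, not a real Sopact schema. The point is that "a quote from someone in the placed cohort" becomes a join, not a search through three folders:

```python
# Minimal sketch: intake, follow-up, and quotes share one identity key.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE intake   (participant_id TEXT PRIMARY KEY, cohort TEXT);
CREATE TABLE followup (participant_id TEXT, placed_90d INTEGER, wage INTEGER);
CREATE TABLE quotes   (participant_id TEXT, quote TEXT);
INSERT INTO intake   VALUES ('p1', 'cohort-14'), ('p2', 'cohort-14');
INSERT INTO followup VALUES ('p1', 1, 19400), ('p2', 0, 0);
INSERT INTO quotes   VALUES ('p1', 'Six months in, I am running a service crew.');
""")

# Pull a quote only from participants whose placement number is in the chart.
row = conn.execute("""
    SELECT q.quote, i.cohort
    FROM quotes q
    JOIN followup f ON f.participant_id = q.participant_id AND f.placed_90d = 1
    JOIN intake i   ON i.participant_id = q.participant_id
""").fetchone()
print(row)  # ('Six months in, I am running a service crew.', 'cohort-14')
```

When a survey tool, a CRM, and three spreadsheets each hold a fragment with no shared key, this join is exactly the thing that cannot be written, which is why the chain has to be rebuilt by hand.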

Three template variants

The same anatomy, three lengths

Most teams produce two or three of these from the same underlying data each year. The eight-section anatomy is the same; the length per section and the audience differ. Pick the variant that matches the audience the report is for, and produce it as a derivative of the others where you can.

01 · One page

One-page impact report

For event recaps, donor mailings, board pre-reads, and quick funder updates.

The shape. Headline finding at the top. Three or four outcome metrics. One participant story. A two-sentence forward look. The methodology note collapses to a single sentence ("Outcomes measured at intake and at the end of the program for the 2025 cohort"). The honesty section becomes one line that names the largest underperforming area. The comparison sits inside the outcomes themselves rather than as a separate section.

What breaks. Teams treat the one-pager as a separate writing project rather than a derivative of the longer report, and end up with two documents that contradict each other on minor numbers. The one-pager is also frequently missing the honesty line, because at one page the temptation to cut it is strong.

What works. Write the longer report first. Cut to one page by collapsing each section to its tightest form, not by removing sections. The one-pager that survives this process always retains the honesty line, even if it is one sentence, because removing it changes the credibility of the rest.

A specific shape

A workforce program quarterly update. Top of page: "84% placement, $14,200 wage gain at 12 months." Three stat blocks. One quote. Two sentences on the soft-skills module redesign. Two sentences on cohort 15. PDF and emailed HTML version, same content.

02 · Annual

Annual impact report

The full eight-section structure, scaled to a year of program work and a public audience.

The shape. 8 to 12 pages for most organizations, 20 to 30 for larger ones with multiple programs and an appendix. All eight sections present at full length. Multi-year comparison in the comparison section if the program has run more than one cycle. Acknowledgments to funders, partners, and board sit at the end. Photos of program participants, with consent and attribution, appear next to the stories, not on the cover.

What breaks. Annual reports drift to 30 or 40 pages because the team treats the report as the place to record everything that happened in the year. The honesty section gets cut for length. Photos and pull-quotes accumulate without binding to the outcome data. The result is a polished document that no audience reads cover to cover.

What works. Cap length at 12 pages for most organizations. Move activity-by-activity detail to a separate operations report or a website appendix. Keep the impact report focused on the strategic question and the eight sections that answer it. Treat the cover and the honesty section as the two places length cannot be cut.

A specific shape

A foundation grantee, 2025 annual report. 11 pages plus a one-page summary at the front. Four programs covered, each with one outcome chart and one paired story. Half-page honesty section naming two underperforming programs. Forward look on cohort 15 redesign. PDF, web version with anchor links, and the one-page summary as a derivative.

03 · Donor and quarterly

Donor impact report or quarterly update

One or two pages tied to a specific donor's gift or a single quarter of activity.

The shape. Headline finding, scoped to what this donor funded or what this quarter measured. One outcome metric. One story from a participant whose path was funded by this donor or who was active in this quarter. A short forward look pointing to the next reporting moment. Donor-specific reports go out after a campaign or as part of the stewardship cycle. Quarterly reports go out on a fixed cadence for board and major-donor pre-reads.

What breaks. Donor reports written from scratch each cycle because the team has not built the reporting pipeline to extract donor-scoped views from the same data the annual report uses. The donor report ends up generic ("here is what we did in 2025") rather than specific ("here is what your gift funded"). Quarterly reports drift in cadence and quality because no one owns the production pipeline.

What works. Donor and quarterly reports should be filtered views of the same data layer the annual report uses. The structural fix is upstream: tag participants with the funding source they were enrolled under, so a donor-specific outcome cut is a query rather than a writing project. Quarterly cadence holds when the report is a derivative of the same data, produced in two hours rather than two weeks.

A specific shape

A major donor stewardship report after a $250,000 gift. Two pages. Headline scoped to the 35 participants this gift funded. One outcome chart, one story, one forward-look paragraph. Personalized cover with the donor name and gift amount. Generated quarterly from the same source data the annual report uses.
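The upstream fix described above, tagging participants with a funding source so a donor-scoped cut is a query rather than a writing project, can be sketched minimally. The field names (`funding_source`, `placed_90d`) are hypothetical, not a real schema:

```python
# Sketch: a donor-scoped outcome cut as a filter over tagged records.
participants = [
    {"id": "p1", "funding_source": "donor-A", "placed_90d": True},
    {"id": "p2", "funding_source": "donor-A", "placed_90d": False},
    {"id": "p3", "funding_source": "general", "placed_90d": True},
]

def donor_cut(rows, source):
    """Return placement stats scoped to one funding source."""
    scoped = [r for r in rows if r["funding_source"] == source]
    placed = sum(r["placed_90d"] for r in scoped)
    return {"funded": len(scoped), "placed": placed}

print(donor_cut(participants, "donor-A"))  # {'funded': 2, 'placed': 1}
```

With the tag applied at intake, the quarterly donor report is the same data layer the annual report reads, filtered, which is what lets the cadence hold.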

Tools to write it in

The tool you write the report in matters less than the data behind it

The most common question after picking a template is which tool to write the report in. The honest answer: most tools work. The one that matters is the one feeding you the numbers. Short tour of the landscape below.

Document: Microsoft Word, Google Docs, Notion · Design: Canva, Figma, Adobe InDesign · Web: Webflow, WordPress · Data layer: Sopact Sense

Picking the document tool

Word and Google Docs work for plain text-and-chart reports. Most foundation grantees produce strong reports this way. Canva, Figma, and Adobe InDesign work for design-heavy reports that need a magazine feel. Use these when the team has design capacity and the audience expects a printed artifact. Webflow, WordPress, and Notion work for interactive web versions, which let readers jump to the section they care about and let the report stay live as data updates.

The choice rarely affects whether the report works. A plain Word document with a clear cover, eight named sections, and an honesty paragraph outperforms a glossy Canva report that buries the methodology in an appendix. Pick the tool the team can actually produce in.

The data layer is the part that matters

What makes the rewrite in the worked example above a 90-minute job rather than a six-week one is whether participant outcomes, follow-up data, and qualitative quotes share an identity layer. With that layer in place, the writer queries "show me a quote from someone in the cohort that hit 84% placement" and gets the answer. Without it, the writer asks the program team for the quote, who searches through three folders, who finds something that may or may not be from the right person.

Sopact Sense is the system that holds the chain end to end. Survey collection, intake records, follow-up, and qualitative coding share an identity layer so any number on the cover can be traced back to the specific participants who produced it. The document layer above sits on top of whatever data system the team is already using.

A practical test for any impact report template tool. Open a draft, point to a number on the cover, and ask: can I trace this back to the survey question, the participant cohort, and the qualitative quote in under three clicks? If not, the chain is broken before the document layer ever opens.

Frequently asked

Impact report template questions, answered directly

Fifteen questions covering format, length, audience variants, and the most common confusions about what an impact report template should contain.

01

What is an impact report template?

An impact report template is a reusable structure for the document a program produces to show what changed because of its work. A working template names eight sections, the question each section answers, the length each section should run, and the format choices that make the report readable. A weak template gives only headers and leaves the rest of the writing to the team.

02

What sections go in an impact report?

Eight sections cover most working impact reports. Cover and headline finding. The strategic question the program is trying to answer. A short methodology note. Outcome metrics. Participant stories tied to those metrics. An honesty section about what underperformed. Comparison with the prior period or a benchmark. Forward look at the next reporting period. Acknowledgments and governance can sit at the end as a ninth section if the audience needs them.

03

How long should an impact report be?

Most working impact reports are eight to twelve pages, plus a one-page summary at the front. One-page reports work for quarterly updates and donor mailings. Annual reports for larger organizations sometimes run twenty to thirty pages, but the report length tracks the audience attention span more than the program complexity. A short report that answers one strategic question well outperforms a long report that lists every activity.

04

What is an impact report format?

Impact report format refers to both the file format the report ships in (PDF, web page, slide deck, one-page summary) and the visual format the content takes (text-heavy versus chart-heavy, single-column versus magazine layout). Most programs publish in two or three formats from the same content. A short PDF for download, a web version for sharing, and a one-page summary for board meetings or donor mailings.

05

What is the difference between a one-page impact report and an annual impact report?

A one-page impact report carries the headline finding, three or four outcome metrics, one participant story, and a forward look. It works for quarterly updates, event recaps, donor mailings, and board pre-reads. An annual impact report carries all of that plus a methodology note, a fuller honesty section, comparison with the prior period, and acknowledgments. Most teams produce both from the same underlying data, with the one-pager built as a summary of the annual rather than a separate report.

06

How do you design an impact report so people read it?

Lead with the headline finding on the cover, not with a logo and a tagline. Pair every chart with a story from a participant whose number is in the chart. Keep the methodology note in plain language and put it on page two, not in an appendix. Include an honesty section about what underperformed. End on a forward look. Length, font, and layout matter less than the question to evidence chain. A plain-formatted twelve-page report that holds the chain outperforms a glossy magazine that does not.

07

What is an impact report layout that works?

A working impact report layout opens with a cover that states the finding, follows with a one-paragraph executive summary, then walks the reader through the eight sections in order. Single-column body text reads faster than multi-column for reports under twenty pages. Charts sit next to the prose that interprets them, not on a separate page. White space between sections matters more than decorative graphics. Photos, when used, show the people the program serves rather than the staff or the office.

08

What goes on the cover of an impact report?

The cover should state the headline finding in one sentence the reader can absorb in three seconds. The organization name, the reporting period, and a single representative number belong on the cover. The cover does not need a stock photo. It does need to answer the question a busy reader carries to it: did the program work, and what changed.

09

Can I write an impact report in Word, Google Docs, or Canva?

Yes. Word and Google Docs work for plain text-and-chart reports. Canva, Figma, and InDesign work for design-heavy reports that need a magazine feel. Webflow or Notion work for interactive web versions. The choice depends on whether the team has design capacity and whether the audience expects a printed artifact. The document tool rarely affects whether the report works. The data pipeline that feeds the document does.

10

What is a nonprofit impact report template?

A nonprofit impact report template is the same eight-section structure adapted to nonprofit context. Cover with headline finding. Mission and the strategic question for the year. Methodology note covering reach, demographics, and outcome measurement. Outcomes against goals set at the start of the year. Stories from program participants. What underperformed. Comparison with the prior year. Forward look. Acknowledgments to funders and partners. Most foundation grantees produce a version of this annually.

11

What is a donor impact report template?

A donor impact report template is a one-page or two-page summary written for a specific donor or donor segment, showing what their contribution funded and the outcome it produced. Headline finding tied to the donor's gift. One participant story. One chart. A short forward look. Donor reports go out quarterly or after a campaign and feed the annual stewardship cycle. They are not the same as a closeout report to a foundation, which is contractual and formal.

12

What is an annual impact report template?

An annual impact report template is the full eight-section structure scaled to a year of program work. Most run eight to twelve pages, with larger organizations running twenty to thirty pages with appendices. The annual report covers all programs together, comparing year over year, naming what was funded, what was achieved, what was attempted and underperformed, and what the next year intends to do differently. The annual report often produces a one-page summary as a derivative.

13

What is a quarterly impact report?

A quarterly impact report is a short update covering one quarter of program activity, typically one to three pages. It carries the headline finding for the quarter, an updated outcome metric or two, a single participant story, and a forward look at the next quarter. Quarterly reports work as donor and board pre-reads and as the building blocks the annual report later combines into a fuller account.

14

How do you write an impact report from a template?

Start by answering the question each section poses, in plain language, before formatting anything. Write the headline finding first because every other section refers back to it. Then draft the methodology note, the outcomes, the stories, and the honesty section. Do the cover and executive summary last, after the body is written, so the cover reflects what the report actually says. Format the document last, not first. Most teams that struggle with the report struggle because they formatted before they wrote.

15

What is the difference between an impact report template and an impact assessment template?

An impact report template is for the published document an organization produces to communicate outcomes. An impact assessment template is for the internal analysis that asks how much change a program caused versus what would have happened without it. Assessment templates carry counterfactuals, statistical methods, and confidence intervals. Report templates carry the findings from those assessments, written for a non-technical audience. Many teams confuse the two and end up either writing a report that reads like an academic paper or an assessment that reads like marketing.

Keep reading

Where this template fits in the wider work

The two highlighted pages below are the direct companions to this template. The first explains the methodology behind the eight sections; the second shows examples of the template in production. The four below them carry the upstream and downstream pieces of the reporting pipeline.

In closing

A working template lives or dies on what fills it

The eight sections, the length guidance, and the format choices on this page are the visible half. The harder half is whether the data behind each section can be traced back to the participants who produced it. See the examples for what this looks like in production. Read the methodology page for the principles behind the template.

Working on a report on a deadline and the data is not where it should be? Book a 60-minute working session to map the chain from your survey instrument to the cover number.