The Feedback Gap
Sample Chapter from
A Practical Guide to the Data That Actually Drives School Improvement
A note on this excerpt
This is a lightly adapted standalone version of Chapter Two of The Feedback Gap. It stands on its own — no prior reading required. Chapters are short and practical, designed to fit the limited time principals have each day. The full book walks through survey design, analysis, reporting, and building a sustainable feedback system. You can find it at EvaluationToolkits.com.
Chapter Two
Start With a Decision, Not a Survey
The most common reason surveys fail has nothing to do with the questions.
Here’s a pattern that plays out in schools and programs everywhere. Someone decides it’s time to survey. They open a survey platform, start writing questions, and end up with a 25-item instrument covering everything from professional development to parking. The survey goes out. Forty percent of staff complete it. The results come back, and there are so many responses to so many questions that no one knows what to do with them. The report gets filed. Nothing changes.
The problem wasn’t the intention. It wasn’t even the questions. The problem was the sequence.
A survey is only as useful as the decision it informs. Before you write a single question, you need to know what you’re going to do with the answer.
Identify the Decision First
This sounds obvious. It almost never happens. In practice, surveys tend to be driven by one of three things: a requirement from a supervisor or accreditor, a vague sense that “we should check in,” or anxiety about something the leader has been hearing informally. All three are legitimate starting points — but none of them is a decision.
A decision sounds like this:
“We’re trying to determine whether to expand our tutoring program to eighth grade. We want to know if sixth and seventh grade families find the current program valuable enough to justify the investment.”
“We’ve been implementing a new reading block structure for one semester. Before we commit to it for next year, we want to know whether teachers feel it’s working and what they need to make it more effective.”
“We’ve heard from a few parents that communication about homework expectations is unclear. We want to know how widespread that perception is and what kind of support families would find most helpful.”
In each case, the decision is named before the survey is designed. That single discipline — deciding before designing — is what separates surveys that produce action from surveys that produce binders.
IN PRACTICE
A principal wanted to evaluate professional development but wasn’t sure what to ask. After narrowing the decision to a single question — “Are teachers applying what they’re learning in their classrooms?” — the survey shrank from 22 items to five. Response rates doubled, and the findings led directly to a restructured PD schedule.
TRY THIS
Write it out right now. Name one decision you are actively facing in your school — a program choice, a scheduling question, a staffing concern, a communication gap. Then write the single survey question that would give you the most useful feedback on that decision. Finally: what would you do differently if 80 percent of respondents agreed? What if only 40 percent did? If you can answer those questions, you have a survey worth running.
Match Questions to Audience
Once you know the decision, you can identify the right audience. Not all surveys belong with all groups. Staff surveys work when you need insight about working conditions, program implementation, or professional culture. Family surveys work when you need to understand trust, communication, or program satisfaction. Student surveys — particularly for middle and high school students — can provide essential data on classroom climate, instructional clarity, and sense of belonging.
Resist the temptation to survey everyone about everything at once. A well-targeted survey sent to the right audience produces data you can act on. A comprehensive survey sent to everyone produces data that explains everything and informs nothing.
Keep Scope Narrow
Narrow scope is the single most important design principle in practical survey work. One focused survey on one decision outperforms a comprehensive survey covering every dimension of school life — every time. This is not a compromise. It's a recognition that the value of survey data lies not in its breadth but in its usefulness.
A survey with five focused questions can be completed in three minutes. Response rates go up. The data is cleaner. Analysis is faster. The connection between what you learned and what you should do next is clear.
A Simple Planning Framework
Before you open any survey tool, write down answers to three questions:
- What decision am I trying to make?
- Who has the most relevant perspective on this decision?
- What will I actually do differently depending on what I learn?
The third question is the most important — and the one most often skipped. If you can’t name two or three plausible actions that might follow from different results, your survey purpose isn’t clear enough yet. Keep refining until you can.
Notice also that your starting point is often outcome data. Flat scores in a particular grade, rising referrals in a specific cohort, declining family event attendance — these signals tell you where a perception survey is worth running. Outcome data points you toward the right question. Perception data helps you surface the experience underneath it.
When Not to Survey
Not every question should be put to a survey. Feedback isn’t always the right tool. If a decision is already made and won’t change regardless of results, don’t survey — it wastes people’s time and erodes trust when they see that their input had no effect. If you’re dealing with a sensitive personnel or legal matter, a survey is the wrong vehicle. If what you actually need is a conversation — a focus group, a team meeting, a one-on-one — a survey can produce data that feels definitive but misses the nuance entirely.
Knowing when not to survey is as important as knowing how to design one. The goal is not to survey more. The goal is to survey purposefully.
The Ethics of Asking
Three things are worth naming before you survey any group:
Survey fatigue is real. In many schools, staff and families receive survey requests from the district, the state, accreditation bodies, program vendors, and the school itself — sometimes within the same month. When you add to that load, the request is only legitimate if you intend to act on what you learn and tell people what you did. A school that surveys and goes silent has forfeited the right to ask again.
Power dynamics shape honesty. Even when you guarantee anonymity, people you survey may not believe you — and in some contexts, their skepticism is reasonable. Staff who have seen colleagues face consequences for honest feedback will protect themselves regardless of what your survey header says. If the conditions in your school don’t genuinely support candor, improving those conditions is more important than running another survey.
Your data can travel. Survey results collected for internal improvement can be requested by district leadership, referenced in evaluations, or quoted out of context. Before you survey, think about what would happen if your results — including low scores — traveled beyond the audience you designed them for.
REFLECT
What decision will your next survey inform? Write it out in one or two sentences. Then answer: what will you do differently depending on what you learn?
Be the First to Know When the Book Is Published
The full book is in the works and covers how to design surveys that produce usable data, how to interpret and report findings honestly, how to communicate results to staff, families, and leadership, and how to build a feedback system that sustains itself over time.