Innovative Survey Design for Social Impact
Surveys are a powerful tool for gathering valuable data, but their effectiveness depends on the quality of the survey questions. Well-designed surveys can provide insightful and actionable information, while poor design can introduce bias, confusion, and low response rates. In this article, we will explore best practices for creating practical survey questions and aligning them with a theory of change.
We will use the example of a nonprofit organization conducting
live coding boot camps to empower young girls and lift them out
of poverty. The theory of change focuses on providing education
and job opportunities to mitigate the risk of human trafficking.
Survey Design Best Practices
In today's data-driven age, surveys are integral to any organization's decision-making process. They are an effective way to gather feedback from stakeholders, understand their needs and expectations, and identify areas for improvement. Designing effective surveys can be challenging, however, especially when you want deep stakeholder insight and meaningful feedback. The best practices below will help you create surveys that align with your organization's goals and objectives, yield analytics you can share with internal stakeholders, and provide the insights needed to drive positive change. So, let's dive in and explore the world of survey design!
Alignment of Output and Outcome with Survey Questions:

Aligning survey questions with the theory of change ensures that the data collected is relevant to the organization's goals and objectives. This approach helps organizations measure their programs' effectiveness, highlights areas for improvement, and keeps the data actionable by tying each question to the impact the organization expects to have on its target beneficiaries. For impact-driven organizations, this alignment is crucial for making informed decisions and achieving desired outcomes.
Example: "On a scale of 1-10, please rate your confidence in
technical concepts after attending the coding boot camp."
Understand the Different Types of Questions:
When designing a survey, it is crucial to understand the difference between open-ended and closed-ended questions. Open-ended questions allow respondents to provide detailed and personalized comments, while closed-ended questions offer a fixed set of options. A balanced mix of these question types ensures a comprehensive understanding of the collected data, providing qualitative and quantitative insights.
Open-ended questions are particularly useful for capturing nuanced feedback and identifying unexpected insights. They allow participants to express their thoughts and feelings in their own words, providing valuable context and depth to the data collected. However, open-ended questions can be time-consuming and challenging to analyze, so it is important to use them sparingly.
Closed-ended questions, on the other hand, offer a straightforward and efficient way to collect quantitative data. They are easy to analyze and can provide numerical insights useful for making data-driven decisions. Closed-ended questions are particularly useful for assessing the effectiveness of specific initiatives or measuring the impact of different aspects of a program.
Balancing open-ended and closed-ended questions in a survey yields a mix of qualitative and quantitative data that together provide a comprehensive understanding of the initiative's impact. By carefully selecting the right question types, organizations can gather meaningful insights that inform their decision-making and drive positive change.
An example of an open-ended question for measuring the long-term impact of a coding boot camp for young girls is: "How has attending the coding boot camp impacted your career aspirations and opportunities?" This question invites a detailed, personalized response, yielding valuable qualitative insight into the initiative's impact. By analyzing these responses, the organization can better understand the initiative's long-term outcomes and make informed decisions to improve its effectiveness. As noted above, open-ended questions should be used sparingly to avoid overwhelming participants with too many inquiries, which could lead to data fatigue and poor-quality responses.
Example: Closed-ended question - "Did the coding boot camp improve
your interviewing skills? Yes/No."
Design Neutral Questions:
Example: "How would you rate the quality of the coding boot camp
materials?"
Provide Balanced Answer Choices:
Example: "On a scale of 1-5, please rate the helpfulness of
the coding boot camp materials:
1 - Not helpful at all,
2 - Not very helpful,
3 - Neutral,
4 - Somewhat helpful,
5 - Very helpful."
Avoid Double-Barreled Questions:
Double-barreled questions are the nemesis of accurate data collection. They confuse survey respondents and can produce misleading results that undermine the organization's decision-making. Instead of asking a question that combines two or more inquiries, split it into separate, focused questions. By doing so, organizations gain a more accurate picture of respondents' opinions and experiences, and avoid overwhelming participants, which can lead to data fatigue and poor-quality responses. Ultimately, focused and precise questions yield reliable data that can be used to make informed decisions and improve the effectiveness of programs.
Example: Instead of asking, "How satisfied are you with the coding
boot camp and the instructor?"
ask two separate questions:
"How satisfied are you with the coding boot camp?" and
"How satisfied are you with the instructor?"
Use Clear and Concise Answer Options:
Example: "How would you rate the overall organization of the coding
boot camp? Excellent, Good, Fair, Poor."
Use Skip Logic or Branching:
Skip logic, or branching, allows survey designers to create a personalized experience for respondents. By directing participants to relevant questions based on their previous answers, skip logic keeps the survey focused and improves completion rates. Respondents only see questions relevant to their experiences, which increases their engagement and willingness to participate. Skip logic can also tailor the survey experience for different groups of participants, allowing for more nuanced data analysis: for example, a survey conducted across several age groups can ask different questions based on the respondent's age. Used well, skip logic improves both the quality of the data collected and the overall survey experience for participants.
Example: If a participant answers "No" to attending the coding
boot camp, they can be routed to a different set of questions
about their reasons for not attending.
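The routing in the example above can be sketched as a small branching structure. This is a minimal illustration of the idea, not tied to any particular survey platform; the question ids (`q_attend`, `q_why_not`, `q_confidence`) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Question:
    """A survey question with optional skip logic based on the answer."""
    qid: str
    text: str
    # Maps a specific answer to the id of the next question to show.
    branches: Dict[str, str] = field(default_factory=dict)
    # Fallback question id when the answer has no branch of its own.
    default_next: Optional[str] = None

def next_question(question: Question, answer: str) -> Optional[str]:
    """Return the id of the next question, applying skip logic."""
    return question.branches.get(answer, question.default_next)

# Hypothetical flow: "No" to attendance routes to the non-attendee questions.
attendance = Question(
    qid="q_attend",
    text="Did you attend the coding boot camp for young girls?",
    branches={"No": "q_why_not"},
    default_next="q_confidence",
)

next_question(attendance, "Yes")  # "q_confidence"
next_question(attendance, "No")   # "q_why_not"
```

Real survey tools express this declaratively, but the core idea is the same: each answer either follows a branch or falls through to the default next question.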
Consider Question Design and Reporting Techniques for Different Question Types:
When designing a survey, it is important to consider the design and reporting techniques for different question types. For open-ended questions, it is essential to set character limits for text/comment fields to ensure that respondents provide concise and relevant feedback. Clear instructions should also be provided for matrix questions to avoid confusion and ensure respondents understand the question format. Additionally, visualizations can represent different data types, such as bar charts for rating scales and word clouds for open-ended responses. By tailoring the design and reporting techniques to each question type, organizations can effectively analyze the data collected and gain valuable insights that inform their decision-making and drive positive change.
Example: Use a bar chart to represent the rating of different aspects of
the coding boot camp, such as instructor effectiveness, course materials,
and overall satisfaction.
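Before a bar chart can be drawn, the raw closed-ended responses need to be tallied into a frequency table. A minimal sketch of that tallying step, using a hypothetical set of responses on the Excellent-to-Poor scale:

```python
from collections import Counter

def rating_frequencies(responses, scale):
    """Count responses for each point on a rating scale, keeping zero counts
    so every option still appears on the chart axis."""
    counts = Counter(responses)
    return {point: counts.get(point, 0) for point in scale}

# Hypothetical ratings of the boot camp materials.
scale = ["Excellent", "Good", "Fair", "Poor"]
responses = ["Good", "Excellent", "Good", "Fair", "Good"]
freq = rating_frequencies(responses, scale)
# freq == {"Excellent": 1, "Good": 3, "Fair": 1, "Poor": 0}
```

The resulting dictionary maps directly onto a bar chart's categories and heights, and keeping zero-count options visible makes skewed distributions easier to spot.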
Structured Surveys for In-depth Data Analysis:
Example: Collect demographic information such as age, educational
background, and geographic location to analyze the impact of
the coding boot camp on specific demographics or regions.
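The demographic analysis suggested above amounts to grouping responses by a demographic field and averaging an outcome measure per group. A rough sketch, where the field names (`region`, `confidence`) and the sample records are hypothetical:

```python
from collections import defaultdict

def mean_rating_by_group(records, group_key, rating_key):
    """Average a numeric rating per demographic group (e.g. per region)."""
    totals = defaultdict(lambda: [0, 0])  # group -> [running sum, count]
    for rec in records:
        agg = totals[rec[group_key]]
        agg[0] += rec[rating_key]
        agg[1] += 1
    return {group: total / count for group, (total, count) in totals.items()}

# Hypothetical survey responses: a 1-10 confidence rating plus region.
records = [
    {"region": "North", "confidence": 8},
    {"region": "North", "confidence": 6},
    {"region": "South", "confidence": 9},
]
mean_rating_by_group(records, "region", "confidence")
# {"North": 7.0, "South": 9.0}
```

The same pattern extends to any demographic cut (age, educational background) and any numeric outcome collected by the survey.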
Balancing Qualitative and Quantitative Questions:
Example: Ask closed-ended questions to gather quantitative data
on participant satisfaction levels, and include open-ended questions
to collect qualitative feedback on specific aspects of the
coding boot camp that were particularly beneficial or areas
for improvement.
Survey Questions Aligned with Impact Management Project (IMP) Dimensions:
The Impact Management Project (IMP) offers a framework for measuring and managing impact, and aligning survey questions with its five dimensions can provide a more comprehensive analysis of an initiative's impact and risk factors. The "What" dimension focuses on the outcomes and changes an initiative aims to achieve, while the "Who" dimension looks at the beneficiaries of the initiative. The "How Much" dimension measures the scale and intensity of the initiative's impact, while the "Contribution" dimension assesses its contribution towards achieving its intended outcomes. Finally, the "Risk" dimension considers the potential negative effects or uncertainties associated with the initiative. By incorporating these dimensions into survey questions, organizations can gain a more complete understanding of their initiative's impact, identify areas for improvement, manage risks, and ensure that their initiatives benefit their intended beneficiaries.
Example: Ask questions related to the "Who" dimension to understand
the demographic profile of participants and assess whether
the coding boot camp is reaching the target population effectively.
Mitigating Data Fatigue for Quality Responses:
Example: Limit the length of the survey and only include questions
that are essential for evaluating the impact of the coding
boot camp.
Avoid duplicating questions or asking for the same information in
multiple ways.
Survey Design for Impact-Driven Organizations:
Example: Frame survey questions to assess how the coding boot camp
is contributing to lifting young girls out of poverty and providing
them with improved life opportunities, as outlined in the
theory of change.
Survey Design Example
Now that we have explored these best practices in survey design, it's time to put them to work in a comprehensive example. Using these strategies, organizations can design surveys that generate high-quality data and deep insight into their stakeholders, and an impact-driven approach ensures the data collected can drive positive change and help scale programs or products. While every organization's needs are unique, these best practices provide a solid foundation: incorporating skip logic, balancing qualitative and quantitative questions, and aligning survey questions with the IMP dimensions give a comprehensive view of an initiative's impact and risk factors, while mitigating data fatigue creates a positive survey experience that encourages meaningful responses and yields actionable data for decision-making. Surveys designed this way are tailored to the organization's specific goals and deliver the stakeholder insight needed to drive positive change and scale impact.
Survey: Impact Evaluation of Coding Boot Camp for Young Girls
[Best Practices: 1, 2, 3, 4, 5, 6, 7]
Section 1: Participant Information
What is your age? [Closed-ended question]
Options: a) 15 b) 16 c) 17
What is your highest level of education completed? [Closed-ended question]
Options:
a) Elementary school
b) Middle school
c) High school
Which region are you from? [Closed-ended question]
Options:
a) North
b) South
c) East
d) West
Section 2: Coding Bootcamp Experience
Did you attend the coding boot camp for young girls? [Closed-ended question]
Options:
a) Yes
b) No
On a scale of 1-10, please rate your confidence in technical concepts after attending the coding boot camp. [Closed-ended question aligned with Outcome measurement]
Options:
a) 1 - Very low
b) 2
c) 3
d) 4
e) 5
f) 6
g) 7
h) 8
i) 9
j) 10 - Very high
How would you rate the quality of the coding boot camp materials?
[Neutral closed-ended question]
Options:
a) Excellent
b) Good
c) Fair
d) Poor
Did the coding boot camp improve your interviewing skills? [Closed-ended question]
Options:
a) Yes
b) No
How satisfied are you with the overall organization of the coding boot camp? [Neutral closed-ended question]
Options:
a) Very satisfied
b) Satisfied
c) Neutral
d) Dissatisfied
e) Very dissatisfied
Section 3: Impact Evaluation
Has the coding boot camp provided you with opportunities for internships or job placements? [Closed-ended question aligned with Outcome measurement]
Options:
a) Yes
b) No
On a scale of 1-5, please rate the impact of the coding boot camp on your career prospects. [Closed-ended question aligned with Outcome measurement]
Options:
a) 1 - Very low impact
b) 2
c) 3
d) 4
e) 5 - Very high impact
How has the coding boot camp contributed to your personal growth and development? [Open-ended question] [Best Practice: 2 - Limited number of open-ended questions]
How would you rate the support provided by the coding boot camp instructors? [Neutral closed-ended question]
Options:
a) Excellent
b) Good
c) Fair
d) Poor
Section 4: Feedback and Suggestions
What aspects of the coding boot camp did you find most valuable? [Open-ended question] [Best Practice: 2 - Limited number of open-ended questions]
Do you have any suggestions for improving the coding boot camp? [Open-ended question] [Best Practice: 2 - Limited number of open-ended questions]
Thank you for participating in this survey. Your feedback is invaluable in helping us evaluate and improve our coding boot camp for young girls.
Conclusion:
Designing effective surveys requires careful consideration of question types, neutrality, answer choices, skip logic, and reporting techniques. By aligning survey questions with a theory of change and best practices, organizations can collect valuable data that provides insights into the impact of their initiatives. Following these best practices will enhance survey response rates, minimize bias, and ensure that the survey data collected is reliable, actionable, and aligned with the desired outcomes of the organization's programs.
