Creating Custom Questions: Strategies for Success (Snapshot 2 - Terms 3 and 4)

In Snapshot 2 (Terms 3 and 4) you have the option to add up to two Open-Ended Questions (OEQs) and two Multiple-Choice Questions (MCQs).

Custom questions allow you to tailor the survey to your local needs and provide a powerful tool for deepening the interpretation of your overall survey results. To maximise the impact of these questions, we recommend the following 5-step approach.

  1. Identify area of inquiry
  2. Determine purpose of information
  3. Ensure participants can respond from experience
  4. Choose an appropriate response format
  5. Use words that lead to actionable results and avoid common pitfalls

Case Study: How often are you hungry at school because there is not enough food at home for breakfast or lunch?

  1. Identify an area of inquiry
    It is important to decide on an area that will assist with school improvement planning or provide further insight into an ongoing research initiative.
  2. Determine the purpose of the additional information being sought
    Be clear on why you want to gain insight into that particular area and how the information will help with the school planning process. Custom questions can be used to:

    a. Collect information or feedback pertaining to a specific focus area or initiative

    E.g. You may have a working group on student voice and would like to find out the most effective way to engage all students in the decision-making process.

    b. Collect further information or follow up on the results of a previous survey

    E.g. Last year, you may have noticed a decline in the results of a particular measure (Participation in Extra Curricular Activities) and would like to collect more detailed information to inform strategic planning (how the school can make participation in clubs more accessible to students).

     

  3. Ensure that the content allows participants to respond from their experience
    To maximise the reliability of information collected through custom questions, participants must be able to speak from their personal experience rather than guess about the experiences of others. Consider using an illustrative scenario that prompts participants to reflect on a situation that could easily apply to them.

    E.g. “If your friend told you they were having trouble with their homework, what would you tell them to do?”


  4. Choose an appropriate response format
    Consider the purpose for collecting the data and how you plan to analyse the responses, including how they will be reported and whether the chosen format provides actionable feedback. Some questions are better suited as multiple choice, some as multiple answer, and some as open-ended.

    Note: Open-Ended Questions are only available at the school level.
    • Open-Ended Questions (OEQs) are best utilised for gathering a wide range of responses, collecting a variety of suggestions, and encouraging further dialogue. They can be used to collect qualitative data and general feedback, follow up on quantitative findings, generate ideas, and find evidence of successes or areas in which the school could be doing better.

      E.g. “Please list three subjects you would like to see offered.”
    • Multiple-Choice Questions (MCQs) are used to collect numerical data that can demonstrate facts and uncover patterns in thought, opinion, attitudes, behaviours, and other specific variables. MCQs are reported as the total percentage of respondents who selected each response option, and can be used to drill down in the Interactive Charts. You can review the article How to Use Interactive Charts to learn more about this feature.

      E.g. “Which of the following subjects would you like to see added?”
    • Multiple-Answer Questions (MAQs) also collect quantifiable data that can demonstrate facts and reveal patterns in thought, opinion, attitudes, behaviours, and other contextually relevant variables. Unlike MCQs, MAQs are reported as the total number of respondents who selected each response option. Because each respondent can select multiple answers, MAQs cannot be used as drill-downs in the Interactive Charts.

      E.g. “Please select any of the following subjects you would like to see added.”
Did you know? For more examples of each type of custom question, see our sample custom questions for the Student, Parent, and Teacher Surveys.

 

  5. Use words that will yield actionable results and avoid common pitfalls
    Pay careful attention to the wording of questions and be specific. Avoid questions that are:

    • Too lengthy or too short
    • Double-barreled/too complicated
    • Redundant (they are already covered in survey content)
    • Inviting the identification of students or staff
    • Likely to cause divisions in your school community
    • Difficult to act upon
    • Misspelled or grammatically incorrect

 

CASE STUDY
See how one school created actionable results from a contextually relevant custom question

1. Identified area of inquiry:
Teachers reported that students were frequently distracted and complained of hunger throughout the day. To find out why students were hungry, the region decided to include a custom question on food availability in the Tell Them From Me survey.


2. Determined the purpose of the additional information being sought:
The region suspected that a lack of food at home might be contributing to students’ hunger. If the results confirmed this, the data collected would be used in a grant proposal to start breakfast and lunch programs throughout the region.


3. Ensured that the content allowed participants to respond from their experience:
Students were asked to reflect on their home-life experience as it related to the availability of food.


4. Chose an appropriate response format:
They chose a Multiple-Choice Question to collect quantifiable data that could be applied as a drill-down in the Interactive Charts to identify which groups of students were most vulnerable.

5. Used words that would yield actionable results and avoid common pitfalls:
“How often are you hungry at school because there is not enough food at home for breakfast or lunch?” Response options: Never / Once or twice a month / Once or twice a week / Every day.

 

A final review confirmed that the question was clearly written and allowed students to speak from their experience. The results were used in a grant proposal to start breakfast and lunch programs throughout the region.