Sustainability Attitudes Scale

Outcome

Sustainability attitudes

Citation

Zwickle, A., & Jones, K. (2018). Knowledge and Attitudes—Assessing Latent Constructs (pp. 435–451). https://doi.org/10.1007/978-3-319-67122-2_25

Background 

This tool was developed by Keith Jones and Brian Campbell at Central College in Iowa in collaboration with Adam Zwickle and Joseph Hamm at Michigan State University. It has been calibrated with undergraduate students. It was not initially designed for program evaluators, but it can be used in this context.

Format 

This survey consists of 11 statements to which people respond on a six-point scale, with 1 being “strongly disagree” and 6 being “strongly agree.”

Audience 

Adult

When and how to use the tool

If you are designing a program that addresses sustainability, it could be helpful to determine what your audience initially thinks about the topic. In that case, the tool could be used as a one-time measure of sustainability attitudes. If you are evaluating a long-term effort to explore sustainability, perhaps through an action project, this tool could form a component of your evaluation to measure changes in sustainability attitudes. Because attitudes may not change quickly, it would not be appropriate as a pre/post tool for a short program.

How to analyze

We recommend entering survey responses into a spreadsheet using a program such as Microsoft Excel. Create a spreadsheet with 11 columns for the 11 statements and a row for each individual. Using the 1–6 point scale, enter the equivalent value for each response (1 for “strongly disagree” through 6 for “strongly agree”). Assign each survey a number, and enter each individual’s responses (ranging from 1 to 6) across the corresponding row. Enter a dot if a response was skipped.
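
If you prefer to check the data entry programmatically rather than by eye, the same layout can be read with a short script. Below is a minimal sketch in Python using the pandas library, assuming the spreadsheet has been saved as a hypothetical file named responses.csv with an "id" column for the survey number and columns q1 through q11 for the statements (these names are assumptions, not part of the tool itself):

    # Read the response spreadsheet; dots entered for skipped items become missing values.
    import pandas as pd

    item_cols = [f"q{i}" for i in range(1, 12)]                 # q1 ... q11
    responses = pd.read_csv("responses.csv", na_values=["."])

    # Sanity check: every entered response should be a whole number from 1 to 6.
    answered = responses[item_cols].stack()                     # stacking drops the skipped (dot) cells
    out_of_range = answered[~answered.isin(range(1, 7))]
    if not out_of_range.empty:
        print("Re-check these entries:")
        print(out_of_range)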

Create an average score for each individual by adding all of their responses and dividing by the number of questions answered. Do not include skipped questions for which you entered a dot. The average will fall between 1 and 6. Scores of 1–3 indicate less positive attitudes toward sustainability, and scores of 4–6 indicate more positive attitudes toward sustainability.
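
The averaging step can also be scripted. A sketch under the same assumptions as above (the responses.csv layout is hypothetical, and grouping in-between averages at a cutoff of 4 is an assumption, since the tool only describes the 1–3 and 4–6 ranges):

    import pandas as pd

    item_cols = [f"q{i}" for i in range(1, 12)]
    responses = pd.read_csv("responses.csv", na_values=["."])

    # mean() skips missing values by default, so dotted (skipped) items are not counted.
    responses["avg_score"] = responses[item_cols].mean(axis=1)
    responses["attitude"] = responses["avg_score"].apply(
        lambda s: "more positive" if s >= 4 else "less positive"   # assumed cutoff for in-between averages
    )
    print(responses[["id", "avg_score", "attitude"]])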

If you administer both a pre-experience survey and a post-experience survey, you can conduct higher-level statistics on your data to determine whether participants had significant changes in the outcome areas after participating in the program.
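
The tool does not prescribe a particular statistical test, but one common choice for matched pre/post scores is a paired t-test. A sketch, assuming each participant’s average score has been saved to hypothetical pre.csv and post.csv files that share an "id" column:

    import pandas as pd
    from scipy.stats import ttest_rel

    pre = pd.read_csv("pre.csv")    # hypothetical file: columns "id" and "avg_score"
    post = pd.read_csv("post.csv")  # hypothetical file: columns "id" and "avg_score"

    # Keep only participants who completed both surveys, matched on id.
    paired = pre.merge(post, on="id", suffixes=("_pre", "_post"))

    result = ttest_rel(paired["avg_score_pre"], paired["avg_score_post"])
    print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")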

What to do next 

Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next: 

  • If a baseline measurement suggests that your audience holds less positive attitudes toward sustainability, you might want to design a program that can help foster more positive attitudes. 
  • You could compare populations to determine whether your members have a different outlook on sustainability than the general public, or whether one geographic area of your community differs from another. This could also provide justification for program development, marketing, or funding proposals.
  • If you used this survey to measure changes in sustainability attitudes, do you see a change in scores between the pre- and post-surveys? Keep in mind that attitudes are slow to change, and you may not see a change, particularly if your program is short in duration. 
  • Invite program staff or other partners to look over the data. Together you might also consider:
    • What do these results tell us about our programming? Why do we think we got these results?
    • What did we think we would see with respect to sustainability attitudes? And did the data support our goals?
    • If our results did not support our goals, can we brainstorm areas within the programming or delivery to influence sustainability attitudes? What changes should be made to programming, or how should new programs be designed?
    • Which stakeholders should we reach out to so we can collaboratively discuss program design?
    • Who or what organizations can we share our learning with?

How to see if this tool would work with your program 

To assess whether the tool is appropriate for your audience, please review the survey carefully and pilot test the tool with a small group that represents your population. To pilot test, ask a small group of willing participants who are part of your target audience to talk to you as they complete the tool. What are they thinking when they read each item? What experiences come to mind when they respond? As long as this is what you expect and you will gain relevant information from your evaluation, you are on the right track! If the answers are different for each person when they should be more similar given their experiences, you may need to look at other tools. 

Tool Tips 

  • It is suggested that the survey be used in its entirety.
  • The survey will take about 5 to 7 minutes to complete. Be sure to allow ample program time for participants to complete it.
  • The survey can be administered on paper or formatted for online completion using Google Forms or SurveyMonkey.