Environmental Education Teacher Efficacy Belief Instrument (EETEBI)
Outcome
Teacher efficacy for teaching environmental education (EE)
Audience
Pre-service and in-service teachers
Method
Survey
Citation
Moseley, C., Utley, J., Angle, J., & Mwavita, M. (2016). Development of the Environmental Education Teaching Efficacy Belief Instrument. School Science and Mathematics, 116(7), 389–398. https://doi.org/10.1111/ssm.12189
Background
This tool was developed by Christine Moseley at The University of Texas at San Antonio and by Juliana Utley, Julie Angle, and Mwarumba Mwavita at Oklahoma State University. The survey was developed through a series of steps: a literature review; review by experts in content, pedagogy, and language; focus group review and discussion; and administration of the survey to a larger sample of pre-service teachers in the Southwest and Midwest.
Format
This survey consists of 20 statements to which respondents reply on a six-point scale: (1) strongly disagree, (2) disagree, (3) somewhat disagree, (4) somewhat agree, (5) agree, and (6) strongly agree.
Audience
Pre-service and in-service teachers
When and how to use the tool
This survey can be used as a pre-teaching assessment of pre-service and in-service teachers. Give this survey to teachers before a program begins and let the results inform program planning. Additionally, this tool can be used to measure changes in teachers' EE teaching efficacy by also administering a posttest after a training or program. Be sure to consider whether your program targets this outcome and whether it is of adequate length to produce measurable changes. Lastly, it can be used as a one-time measure of teacher EE efficacy.
How to analyze
We recommend entering survey responses into a spreadsheet using a program such as Microsoft Excel. Create a spreadsheet with 20 columns for the 20 statements and a row for each participant. To ensure confidentiality, assign a matching ID number to each participant and their survey, and enter survey responses across the corresponding row. Using the 1–6 point scale, enter the number that matches each response. Enter a dot if the response was skipped.
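If you prefer to work outside of a spreadsheet program, the same layout can be read directly into an analysis script. Below is a minimal sketch in Python using pandas, assuming the spreadsheet was exported to a CSV file (the file name, the participant_id column, and the q1–q20 column names are hypothetical):

```python
import pandas as pd

# Read the exported spreadsheet: one row per participant, columns q1..q20
# for the 20 statements. The dot entered for skipped responses is parsed
# as missing data via na_values.
df = pd.read_csv("eetebi_responses.csv", index_col="participant_id", na_values=".")
```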
This survey consists of two subscales, which means two different sets of questions measure two slightly different constructs. One subscale measures personal environmental education teaching efficacy (PEETE), meaning it measures teachers’ “perceptions and beliefs toward their ability to teach environmental education” (p. 394). The other subscale measures environmental education outcome expectancy (EEOE), meaning it measures teachers’ perceptions and beliefs toward their ability to “influence student understanding about the environment” (p. 394). These two constructs should be reviewed separately, as many studies have shown them to be uncorrelated, or unrelated.
To analyze these data, find the average score for each individual on each subscale. The PEETE subscale consists of thirteen items (2, 3, 5, 6, 8, 10, 14, 15, 16, 17, 18, 19, and 20), while the EEOE subscale consists of seven items (1, 4, 7, 9, 11, 12, and 13). To find the PEETE average score, add up the responses to those thirteen items and divide by the number of items answered; do not count skipped items for which you entered a dot. Do the same for the seven EEOE items. Each average will fall between 1 and 6. (If you instead sum the item responses, total scores range from 13 to 78 on the PEETE scale and from 7 to 42 on the EEOE scale.)
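Continuing the sketch above, the per-participant subscale averages can be computed directly; pandas' mean() divides by the number of answered items only, which matches the instruction to exclude skipped items (column names remain the hypothetical q1–q20):

```python
# Item numbers for each subscale, taken from the text above.
peete_items = [f"q{i}" for i in [2, 3, 5, 6, 8, 10, 14, 15, 16, 17, 18, 19, 20]]
eeoe_items = [f"q{i}" for i in [1, 4, 7, 9, 11, 12, 13]]

# mean() skips missing values (the dots), so skipped items are excluded
# from the denominator automatically.
df["peete_mean"] = df[peete_items].mean(axis=1)
df["eeoe_mean"] = df[eeoe_items].mean(axis=1)

# Optional total scores: 13-78 for PEETE and 7-42 for EEOE when no items
# are skipped.
df["peete_sum"] = df[peete_items].sum(axis=1)
df["eeoe_sum"] = df[eeoe_items].sum(axis=1)
```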
Average scores of 1–2 indicate a lower level of efficacy; scores of 3–4 indicate a medium level of efficacy; and scores of 5–6 indicate a higher level of efficacy.
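To label each participant with one of these bands, you could bin the averages. Note that the text leaves averages that fall between bands (for example, 2.5) unassigned, so the cut points in this sketch are an assumption your team should confirm:

```python
import pandas as pd

# Band edges are an assumption: averages up to 2.5 count as "lower",
# up to 4.5 as "medium", and above that as "higher".
bins = [1, 2.5, 4.5, 6]
labels = ["lower", "medium", "higher"]
df["peete_band"] = pd.cut(df["peete_mean"], bins=bins, labels=labels, include_lowest=True)
df["eeoe_band"] = pd.cut(df["eeoe_mean"], bins=bins, labels=labels, include_lowest=True)
```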
When administering pre-experience and post-experience surveys, you can conduct inferential statistics on your data to understand whether participants had significant changes in the outcome areas after their participation in the program. Demographic data such as gender, age, and number of years teaching can also be used to compare scores across subgroups.
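One common choice for the pre/post comparison is a paired t-test on the matched subscale means. Below is a minimal sketch using SciPy, assuming the averages computed above were saved separately for the pre- and post-surveys (the file and column names are hypothetical):

```python
import pandas as pd
from scipy import stats

# Load pre- and post-program results, each with a peete_mean column
# computed as in the earlier sketch.
pre = pd.read_csv("eetebi_pre_scores.csv", index_col="participant_id")
post = pd.read_csv("eetebi_post_scores.csv", index_col="participant_id")

# Match participants by ID and keep only those who completed both surveys.
matched = pre[["peete_mean"]].join(
    post[["peete_mean"]], lsuffix="_pre", rsuffix="_post"
).dropna()

t_stat, p_value = stats.ttest_rel(
    matched["peete_mean_pre"], matched["peete_mean_post"]
)
print(f"Paired t = {t_stat:.2f}, p = {p_value:.3f}")
```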
What to do next
Once you’ve administered your survey and analyzed the data, consider the following suggestions about what to do next:
- If a baseline measurement suggests that your audience has low efficacy scores, you might want to design a program that can help increase EE efficacy. For instance, if some educators have comparatively high efficacy results, you could have them work with the others to share how they build confidence teaching EE, successfully engage students, and effectively teach EE topics. Your program may encourage educators to form partnerships and work together to meet their EE teaching goals.
- If you used this tool to measure changes in teacher efficacy, do you see a change in scores between the pretest and the posttest? Keep in mind that you may not see a change, particularly if your program is short in duration or is not designed to influence teachers' EE efficacy.
- Invite program staff or other partners to look over the data. Consider questions together, like:
- What do these results tell us about our programming? Why do we think we got these results?
- What did we think we would see with respect to teacher EE efficacy? And did these data support our goals?
- If our results did not support our goals, can we brainstorm areas within the program design or delivery that could influence teacher EE efficacy? What changes should be made to programming, or how should new programs be designed?
- Whom should we reach out to for a collaborative discussion of program design?
- With whom, or with what organizations, can we share our learning?
How to see if this tool would work with your program
To assess whether the tool is appropriate for your educators, please review the items carefully and pilot test the tool with a small group that represents your audience. To pilot test, ask a small group of willing participants who are part of your target audience to talk to you as they complete the tool. What are they thinking when they read each item? What experiences come to mind when they respond? As long as this is what you expect and you will gain relevant information from your audience, you are on the right track! If the answers differ for each person when, given their experiences, they should be more similar, you may need to look at other tools.
Tool tips
- We recommend using this survey as is, in its entirety.