Statistics Teaching Inventory

The Statistics Teaching Inventory (STI) is an instrument designed to assess the instructional practices and beliefs of instructors of introductory statistics courses. Initially designed as part of the NSF-funded project Evaluation and Assessment of Teaching and Learning about Statistics (e-ATLAS; NSF DUE-1044812 & 1043141), the current version of the STI (v.2) consists of approximately 87 items formatted into seven sections:

  1. Pedagogy;
  2. Curricular emphasis;
  3. Technology;
  4. Assessment;
  5. Beliefs;
  6. Course characteristics; and
  7. Instructor/institutional characteristics.

Currently there are four different forms of the STI for varying instructional settings:

  • Face-to-face course (no lab/recitation session led by a teaching assistant);
  • Face-to-face course (with lab/recitation session led by a teaching assistant);
  • Completely online course; and
  • Hybrid course (mixture of face-to-face and online).

Development Process

The STI (v.1), which initially included 102 items, went through a rigorous development and evaluation process, including an online pilot administration with 101 voluntary USCOTS participants during the late spring and early summer of 2009. Cognitive interview data from 16 of the pilot respondents were also collected and analyzed as part of the validation process. For more detail regarding the development of the STI instrument, as well as analyses of the pilot administration data, see Zieffler et al. (2012).

Based on the pilot and interview data, the STI was revised to include 87 items, becoming STI (v.2). At this time, items corresponding to the same content were grouped into seven sections, namely: (1) Pedagogy; (2) Curricular emphasis; (3) Technology; (4) Assessment; (5) Beliefs; (6) Course characteristics; and (7) Instructor/institutional characteristics.

In 2019, the instrument was again revised to create the STI (v.3), which focuses only on the teaching practices of statistics instructors; all beliefs items were dropped from the instrument. Items related to computation and modern data practices were added, as were items tapping the recommendations in the 2016 GAISE report.

STI (v.2)

Initially, the STI was developed for instructors teaching in a face-to-face format. Given the popularity of online and hybrid courses, restricting the STI to face-to-face instructors would have severely limited its use in the type of large, national study proposed in the e-ATLAS project. In addition, items on the STI were written for instructors of courses that did not have a recitation section led by a teaching assistant, which further limited the potential sample. To overcome these limitations, four different forms of the STI were developed for varying instructional settings:

  • Face-to-face course (no lab/recitation session led by a teaching assistant);
  • Face-to-face course (with lab/recitation session led by a teaching assistant);
  • Completely online course; and
  • Hybrid course (mixture of face-to-face and online).

The first form developed was for instructors of face-to-face courses with no lab/recitation sessions (Form 1). This form was sent to an expert reviewer, a statistics education researcher with extensive experience in assessment research, and was revised several times based on the feedback. The form was then adapted for courses that included lab/recitation sessions (Form 2). The adaptations for this form primarily involved small changes to item stems; for example, items that asked about “the instructor” were changed to “the instructor or TA”. Four additional items were added to the Pedagogy section of this form asking about the time spent on certain teaching methods during the recitation sessions. Lastly, the response options for one item from the face-to-face form were expanded to allow for a broader TA role in these courses.

To create the forms for completely online courses (Form 3) and hybrid courses (Form 4), the initial face-to-face form was again modified. Most of the adaptations for these forms, again, involved changes to item stems. The largest modification was made to an item asking about the frequency with which course content was presented primarily via lecture. For Form 3, this item was changed from “lecture” to “audio or video lectures” to better reflect practices in the online environment. An item asking whether course content was presented primarily via readings was also added to Form 3. Form 4 retained the lecture item from the face-to-face form and also included both the audio/video lecture item and the readings item from Form 3.

Number of Items Included in Each Section of the STI (v.2) for the Four Forms.
Section                                    Form 1   Form 2   Form 3   Form 4
Pedagogy                                       10       14       11       12
Curricular emphasis                            10       10       10       10
Technology                                     10       10       10       10
Assessment                                      7        7        7        7
Beliefs                                        17       17       17       17
Course characteristics                         12       13       12       12
Instructor/Institutional characteristics       21       21       21       21
Total                                          87       92       88       89
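As a quick cross-check of the table above, the per-section counts can be tallied programmatically. This is an illustrative sketch only; the counts are hard-coded from the table, not pulled from the instrument itself:

```python
# Item counts per section of the STI (v.2), one entry per form (Forms 1-4),
# hard-coded from the table above.
section_counts = {
    "Pedagogy":                                 [10, 14, 11, 12],
    "Curricular emphasis":                      [10, 10, 10, 10],
    "Technology":                               [10, 10, 10, 10],
    "Assessment":                               [7, 7, 7, 7],
    "Beliefs":                                  [17, 17, 17, 17],
    "Course characteristics":                   [12, 13, 12, 12],
    "Instructor/Institutional characteristics": [21, 21, 21, 21],
}

# Sum each form's column; the results match the "Total" row of the table.
totals = [sum(col) for col in zip(*section_counts.values())]
print(totals)  # [87, 92, 88, 89]
```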

In the summer of 2012, each of the four forms was administered in an online pilot study. A total of nine instructors (at three different institutions) piloted the instrument: three took the face-to-face form (with no lab/recitation), one took the face-to-face form with a lab/recitation, three took the completely online form, and two took the hybrid course form. Each respondent also provided detailed comments and feedback on specific questions and on the instrument as a whole. Based on this feedback, each form was revised to shorten the instrument and to clarify some of the items.

After these revisions were made, a statistics education expert who had also taken part in the piloting process reviewed the four forms. Feedback from this review led to a few more minor revisions of each form. All four forms were finalized in August 2012 and formatted online using the survey platform Qualtrics. An initial item about the format of the course was also added; after the instructor indicated the course format, the online instrument routed the instructor to the appropriate form.

Data from the administration of the STI (v.2) are presented in the Executive Report.

STI (v.3)

The third version of the instrument (STI v.3) returned to a single form given to all instructors, regardless of course modality. Items asking about course modality were included to capture differences between settings.

Using the STI

The four STI (v.2) forms are available as both .DOCX and PDF files from our GitHub site.

The STI (v.3) form is also available as a PDF file from our GitHub site.

Or click here to download all the files directly.

Licensing


The STI instrument is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License, which means that you are free to use, adapt, and share those adaptations so long as the use is not commercial and you give attribution.

Attribution

To cite the STI, use:

Zieffler, A., Park, J., Garfield, J., Delmas, R., & Bjornsdottir, A. (2012). The Statistics Teaching Inventory: A survey on statistics teachers’ classroom practices and beliefs. Journal of Statistics Education, 20(1). https://doi.org/10.1080/10691898.2012.11889632

Graduate Student Research Day 2020


Click here to download a PDF copy of the poster!



Abstract

The Statistics Teaching Inventory (STI) is designed to gain insight into pedagogical practices and content in introductory statistics courses. Recent guidelines in statistics education argue for teaching more computing and data-structure skills; to determine the extent to which these goals are put into practice, the STI was updated to reflect these current trends and recommendations. The purpose of this poster is to describe the modified survey, which captures the current state of introductory statistics courses, and to present initial results from its administration in Fall 2019.

To update the survey, items from the previous version of the STI were modified or eliminated, and new items were added in response to recent statistics education and survey research. After these revisions, think-aloud interviews were conducted with experts in statistics education and data science. Qualitative analysis of the interview data provided insight into the clarity of the items and how adequately the survey aligned with recommended practices. These analyses informed additional modifications to the survey and provided validity evidence for future research with the STI.

Results of the data collected in Fall 2019 indicated that most courses are not giving students experience with data structures or computing. Though students are being exposed to real data, the data often contain fewer than three variables and only a small number of cases. Students are not typically exposed to methods of getting data into usable form. The use of syntax-driven software is common only in four-year colleges and universities. Courses that do focus on code do not emphasize essential skills such as creating, modifying, reading, and debugging code. The results indicate that, despite increasing recommendations for their inclusion, computing skills are not yet widely taught.



References

American Statistical Association. (2014). Curriculum guidelines for undergraduate programs in statistical science. Author. https://www.amstat.org/asa/files/pdfs/EDU-guidelines2014-11-15.pdf

American Statistical Association. (2016). Guidelines for assessment and instruction in statistics education: College report. Author. http://www.amstat.org/education/gaise/

DeLiema, D., Dahn, M., Flood, V. J., Asuncion, A., Abrahamson, D., Enyedy, N., & Steen, F. F. (2020). Debugging as a context for collaborative reflection on problem-solving processes. In E. Manolo (Ed.), Deeper learning, communicative competence, and critical thinking: Innovative, research-based strategies for development in 21st century classrooms (pp. 209–228). Routledge.

Horton, N. (2015). Challenges and opportunities for statistics and statistical education: Looking back, looking forward. The American Statistician, 69(2), 138–145. https://doi.org/10.1080/00031305.2015.1032435

National Academies of Sciences, Engineering, and Medicine. (2018). Data science for undergraduates: Opportunities and options. The National Academies Press. https://doi.org/10.17226/25104

Nolan, D., & Temple Lang, D. (2010). Computing in the statistics curricula. The American Statistician, 64(2), 97–107. https://doi.org/10.1198/tast.2010.09132

Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining computational thinking for mathematics and science classrooms. Journal of Science Education and Technology, 25(1), 127–147. https://doi.org/10.1007/s10956-015-9581-5

Zieffler, A., Park, J., Garfield, J., Delmas, R., & Bjornsdottir, A. (2012). The Statistics Teaching Inventory: A survey on statistics teachers’ classroom practices and beliefs. Journal of Statistics Education, 20(1). https://doi.org/10.1080/10691898.2012.11889632

USCOTS 2023


Click here to download a PDF copy of the poster!



Abstract

The poster reports on some of the major findings from the data collected in 2019 using the Statistics Teaching Inventory (STI v.3). This instrument measures the teaching practices of instructors of introductory statistics in a variety of institutions and departments. Participants were recruited using email invitations sent to members of five statistics education mailing lists: Isostat, CAUSE, the ASA Section on Statistics and Data Science Education, AMATYC, and the MAA Section on Statistics Education. A total of 228 usable responses were obtained: 54 (23.7%) from two-year colleges, 87 (38.2%) from four-year colleges, and 87 (38.2%) from universities, where “university” was defined as an institution that grants advanced degrees. The findings allow us to gain insight into the extent to which current pedagogical practices align with the GAISE recommendations. They also help to inform the broader statistics education community about current pedagogical, assessment, and curricular trends.