
Analyzing Your Online Teaching Experiences


I enjoyed several conversations with faculty this week about how we can analyze our online teaching experiences this term and use what we learn to inform our next courses, whether online or face-to-face (F2F). So I would like to share several instruments for your consideration to assist in gathering, analyzing, and making sense of the student assessments and instructional methods we offered this term. Ideally, through this process, we can identify several strategies to enhance our F2F courses in the fall, and/or to integrate more intentionally if we are teaching in an online, hybrid, and/or HyFlex mode.

First, I want to remind us of the two types of measurements we can collect: direct and indirect. Direct measurements include artifacts and work examples (syllabus, lessons, teaching materials, etc.), portfolios (a combination of items, which could include instructional videos), and direct observations by colleagues and/or the CTL. Indirect measurements include surveys (which are difficult to create well and prone to bias; recall that standard evaluations of teaching have been shown not to correlate with effective teaching (Uttl, White, & Gonzalez, 2016; Setari, Lee, & Bradley, 2016; Linse, 2016; Stark & Freishtat, 2014; Braga, Paccagnella, & Pellizzari, 2014; Mitchell & Martin, 2018)) and focus groups. Ideally, we should include both types and triangulate the data to produce valid and reliable conclusions.
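
If you keep these measurements in a simple spreadsheet, even a few lines of code can give triangulation a quick numerical check. Below is a minimal sketch in Python (3.10+ is assumed for statistics.correlation); the rubric scores and survey ratings are hypothetical placeholders, not data from any actual course.

```python
# Minimal triangulation sketch: does a direct measure (rubric scores) agree
# with an indirect measure (student survey ratings)? All values are
# hypothetical placeholders.
from statistics import correlation, mean

rubric_scores = [3.5, 2.0, 4.0, 3.0, 4.5, 2.5]  # direct: analytical rubric, 1-5 scale
survey_ratings = [4, 2, 5, 3, 5, 3]             # indirect: end-of-term survey, 1-5 scale

print(f"Mean rubric score:  {mean(rubric_scores):.2f}")
print(f"Mean survey rating: {mean(survey_ratings):.2f}")

# Pearson correlation (statistics.correlation requires Python 3.10+).
# A strong positive value suggests the two measures tell a consistent story;
# a weak or negative value flags a divergence worth a closer look.
r = correlation(rubric_scores, survey_ratings)
print(f"Correlation between measures: {r:.2f}")
```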

The following are a few ways you could collect data to analyze your online teaching (as always, I am happy to discuss further and/or assist with the collection or analysis):

  1. Direct Measure: Syllabus Comparison Chart, which could include the three major components of course design: Learning Outcomes (LOs), Assessment, and Methods (Wiggins & McTighe, 2011). Ideally, your LOs should remain the same, as they were aligned with the course catalog. Analyze the teaching approaches, resources, tools, and interactions that you believe supported student learning. Your assessment and teaching methods may have been modified for online teaching, and these modifications might enhance subsequent courses.

  • Creating a sub-chart comparing concepts to teaching methods might showcase which methods could be effective when teaching these concepts again, either online or F2F.

  • Listing all of the online tools you used this term, and ranking them and/or writing qualitative comments on their relative effectiveness, could help you analyze which you may use in the future and in what capacity (see the tally sketch after this list). Examples include synchronous discussions; shared lecture presentations (Google Slides, PPT, etc.); virtual office hours; online writing assignments; project-based collaboration; video files of course content created with screencasting tools; student-created video or audio products; asynchronous lecture videos; between-class (asynchronous) discussions; shared lecture notes; audio files of course content; and online exams and quizzes.

  2. Direct Measure: Collaboration Tools can show how often and in what ways you used collaboration tools with students, and how students used them with one another. Analyzing these data could help you identify communication venues that were effective and that you could use in your fall courses (Davis, 1993).

  3. Direct Measure: Analytical Rubrics that you used to measure and evaluate student work, especially if you used the same or a similar rubric when the course was previously taught F2F (McKeachie, 2005).

  4. Direct Measure: Instructor Reflection and Documentation can be used to analyze instructional approaches that were effective (or perhaps less than effective), especially when the documentation was collected during the term (Benton, 2012).

  5. Indirect Measure: Pre/Post Self-Efficacy Surveys. For those of you who completed these, analyzing your perceptions of your online teaching abilities before and after the term could showcase key areas to either enhance or minimize (Pintrich & DeGroot, 1990); see the comparison sketch after this list.

  6. Indirect Measure: Student Feedback throughout the course via emails, questions during virtual office hours, or chats during synchronous videoconference sessions. This could include mid-term student perception surveys collected through your CMS and/or your CTL, as well as any surveys you asked students to complete during the course on how they were feeling about the course, its format, or the overall course experience. These data can support and guide your assessment and teaching methods (Hargis, 2014; Hargis, 2000; Iwamoto & Hargis, 2017).
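
As mentioned under item 1, here is a minimal tally sketch for the online tool list. The tools, ratings, and comments are hypothetical placeholders; the point is simply that sorting your own effectiveness ratings, alongside your qualitative notes, surfaces candidates worth keeping in the fall.

```python
# Minimal tool-ranking sketch: tally your own effectiveness ratings and
# qualitative notes per online tool. All entries are hypothetical placeholders.
tool_log = [
    ("synchronous discussions", 5, "high engagement in small groups"),
    ("asynchronous lecture videos", 4, "rewatched often before exams"),
    ("online quizzes", 3, "useful, but pacing needs work"),
    ("virtual office hours", 2, "low attendance; try scheduled slots"),
]

# Sort highest-rated first to surface candidates for fall courses.
for tool, rating, note in sorted(tool_log, key=lambda entry: entry[1], reverse=True):
    print(f"{rating}/5  {tool:30s} {note}")
```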

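And here is the pre/post comparison sketch mentioned under item 5. The survey items and ratings are hypothetical; computing the item-by-item difference makes gains (and any slippage) easy to see at a glance.

```python
# Minimal pre/post self-efficacy sketch: item-by-item change in self-rated
# ability (1-5 scale). All items and values are hypothetical placeholders.
from statistics import mean

pre = {"facilitating online discussion": 2, "screencasting": 1, "online assessment": 3}
post = {"facilitating online discussion": 4, "screencasting": 4, "online assessment": 3}

gains = {item: post[item] - pre[item] for item in pre}
for item, gain in sorted(gains.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{gain:+d}  {item}")

print(f"Mean change: {mean(gains.values()):+.2f}")
```
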
References

Benton, B. K. (2012). The iPad as an instructional tool: An examination of teacher implementation experiences. Unpublished doctoral dissertation, University of Arkansas, Fayetteville, AR.

Davis, B. (1993). Tools for Teaching. San Francisco: Jossey-Bass Publishers.

Hargis, J. (2014). Can students learn science using the Internet? ISTE Journal of Research on Computing in Education, 33(4), 475-487.

Hargis, J. (2000). The Self-regulated learner advantage: Learning science on the Internet. Electronic Journal of Science Education, 4(4).

Iwamoto, D., & Hargis, J. (2017). Self-Regulated learning as a critical attribute for successful teaching and learning. International Journal of the Scholarship of Teaching and Learning, 11(2).

McKeachie, W. (2005). McKeachie’s Teaching Tips: Strategies, Research, and Theory for College and University Teachers (12th ed.). Boston: Houghton Mifflin.

Pintrich, P. R., & DeGroot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33-40.

Wiggins, G., & McTighe, J. (2011). Understanding by Design Guide. Alexandria, VA: ASCD.
