Updated: Jul 28, 2021
As a graduate student in chemistry, I have done my share of data crunching. Usually, though, that data reflected instrumental measurements; there was an explanation for the results. Even if I didn't know how to interpret them, there was a reference book somewhere for me to study and learn.
Education data is so different. We're not studying object behavior; we're studying human behavior. That's an altogether different kind of science! Even with a heavy background in data collection and analysis, I'm easily overwhelmed by all of the moving parts and factors that contribute to the data. In the end, I only trust the overwhelmingly consistent trends.
When I made the switch to student-centered learning and began incorporating a Lab in Every Lesson, I focused on tracking participation because one of my biggest motivations was to increase overall engagement in my online, distance learning classroom.
For each lesson where new content or skills were presented, participation was tracked and scored from 0 to 3 for each student. The 3-point scale was designed to reflect student activity at the beginning (Review & Preview participation), middle (Learning Experience participation), and end of class (Data-Dependent Analysis and/or Skill Practice). Students earned one point for completing each student-centered learning activity, so their score for the period reflected their degree of engagement each day. Asynchronous students (those who did not attend the live session but may have watched a recording of it) completed and submitted the corresponding digital interactive notebook to similarly earn up to 3 points per lesson. The synchronous and asynchronous scores earned in this category did not reflect an evaluation of performance or mastery; rather, they measured demonstrated effort.
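For readers who want to try a similar tally, the arithmetic above can be sketched in a few lines of Python. This is only an illustration; the function name and the example scores are invented, and the only details taken from my system are the one-point-per-activity structure and the 3-point-per-lesson maximum.

```python
# Hypothetical sketch of the 0-3 per-lesson participation tally described above.
# The scores list is made-up example data, not real student records.

def participation_percent(lesson_scores, max_per_lesson=3):
    """Convert a list of per-lesson scores (0-3 each) into a percentage."""
    if not lesson_scores:
        return 0.0
    return 100 * sum(lesson_scores) / (max_per_lesson * len(lesson_scores))

# One point each for Review & Preview, the Learning Experience,
# and Data-Dependent Analysis / Skill Practice.
scores = [3, 2, 3, 1, 3, 3, 0, 2]  # eight example lessons
print(f"Participation: {participation_percent(scores):.1f}%")  # prints "Participation: 70.8%"
```

A subtotal like this, computed per quarter, is what gets plotted against the earned quarter grade in the discussion that follows.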
When each student's participation subtotal was plotted against their recorded quarter grade, the data supported what educators know to be true: when students engage in the learning process, they can more easily achieve mastery. For students with high final Quarter 1 scores (>80%), participation scores nearly matched earned final scores. It was during this quarter that we observed the highest engagement of the year; nearly half of the class list earned high (>75%) participation scores. Even students earning average (60%-75%) final quarter scores demonstrated high classwork participation. Only one student managed to earn a passing final quarter score without earning a passing participation score.
Similar Quarter 2 data reveals less consistency in participation. Many failing students earned high participation scores. I can attest that these students weren't just demonstrating effort during class time; they showed mastery of concepts and skills in their classwork. They simply never completed the required, standardized, common assessments in the LMS. In our cyber school, students have the flexibility of completing assessments outside of live class meetings. Unfortunately, many of those who choose not to complete assessments during live class meetings end up never completing them at all. More generally, I observed high participation scores relative to the earned final scores. This was expected and is explained by the high-level, relational concepts and skills introduced during the second quarter. Here, every student whose earned final score exceeded their participation score was either asynchronous or routinely present-not-participating (PNP).
The disparity between classwork participation and earned final scores is even greater in Quarter 3. With the exception of five students, participation scores remained high. In fact, average participation scores increased as the school year progressed! Yet these earned final progress scores are among the lowest of the year. Again, a closer look at individual gradebooks reveals that this is not in any way a reflection of learning but, rather, of students' failure to submit the required, standardized, common assessments in the LMS, despite multiple reminders to do so within the time provided (one week from the date assigned).
The fine dotted lines in each of the following three graphs show participation trends among various groups within the class list over three quarters. Within the "Present & Active" group, average participation was between 70% and 80%, increasing slightly from Quarter 1 through Quarter 3. Participation was comparable for the "PNP" (present-not-participating) students during Quarter 1, but I observed a dramatic decline in Quarter 2 followed by virtually no change in Quarter 3. The asynchronous group of students, who earned participation credit by completing classwork independently and submitting it through the LMS, showed the least consistency: their Quarter 1 participation was nearly equal to that of the "Present & Active" group, then it increased in Quarter 2 and decreased significantly in Quarter 3.
In distance learning models, engagement can be the single most challenging obstacle to overcome. As talking heads on a computer screen, we don't stand a chance in the competition for student attention in their home environments. In most other virtual classrooms, teachers rely on communication-based strategies like cold-calling for answers or requesting volunteers. In those classrooms, participation might begin high, but it undoubtedly wanes over the course of the school year.
This data demonstrates how incorporating a Lab in Every Lesson captures and retains student attention throughout each instructional period and, perhaps more importantly, throughout the school year.