REVISED
Students targeted in closing the gap activities were African American students who had received two or more Office Discipline Referrals (ODRs) in the 2015-2016 school year. African American students comprise 21% of total student enrollment, yet they accounted for nearly 60% of total ODRs as reflected in school-wide SWIS behavior data. This data was used to establish program goal #1, and the closing the gap goal was established as a subset of that program goal. It was chosen to eliminate the overrepresentation of African American students receiving ODRs.
The closing the gap goal is not written in SMART format; it was written as a total percentage reduction based on the average number of ODRs students received. The average for this group in the 2015-2016 school year was 3.55 referrals, and the goal was to reduce that number by 20%, to 2.84 (3.55 × 0.80 = 2.84). In reflection, I will rewrite this goal in SMART format to simplify and strengthen it, tie it more closely to the program goal, and make it easier for stakeholders to read.
All 9 participating students, from grades 2-5, were in the "tier 3" category in school-wide SWIS behavior data reports. Several factors made this a small group of students: several other students in the tier 3 category were special education students who already receive social work support, and, simply put, some teachers write more referrals than others. While additional students likely could have benefitted, the data was not available to identify who they were or what their behavior struggles involved.
The primary intervention used in the closing the gap activities was the "Check In, Check Out" (CICO) program (see link attachment). A student goal sheet is attached for reference. The initial plan was for all students to also participate in the same small group services; however, after reviewing individual student data, students were placed in differentiated small group topics to best meet their needs. Students participated in one of the following small groups: impulse control, anger management, or relational aggression. Approximately bi-monthly, students participated in individual counseling for additional skills practice and received in-classroom support for skill implementation. These interventions were chosen because they combined an intense, evidence-based behavior monitoring intervention with several layers of skill-building services. A specific example is a 5th grade girl whose individual CICO goals were controlling her body and her anger; she also participated in an impulse control group. In individual sessions, we worked further on impulse control strategies. For classroom support, I provided coaching and feedback on skill implementation during partner math work time, a subject in which she had historically struggled.
While students participated in differentiated small groups, they all completed the same student assessment. The student assessment was intentionally written to reflect the targeted Mindsets and Behaviors. For example, the first and third student assessment questions focus on student beliefs about controlling their behavior, developed through participation in CICO. The second question asks about strategies to control their behavior and is intentionally open ended to include skills acquired through the differentiated small group services. In reflection, as an area for growth, I want to analyze the balance between differentiating student services to best meet individual needs and the potential consequences of fragmenting the intervention. I question the sustainability of this approach, and its room for growth, if a gap chosen in the future involves a larger group of students. I also question whether the student assessment, given such a large scope of interventions, sufficiently captured growth in student perceptions. I also, embarrassingly, see in reflection that I missed one of the targeted Mindsets and Behaviors, B-SS 3, about creating positive relationships with adults. While the counselor-student relationship is a cornerstone of the CICO intervention, I missed the opportunity to capture it through student perception data, and this will need to be added.
As outlined in the data table, the implemented activities and interventions resulted in improved individual student outcomes, mindsets, and behaviors. Originally, I tied ODR outcomes to social emotional learning targets from student report cards. In reflection, this was incorrect, and I will change to academic learning targets. While report card data is numerical, which is convenient for quickly referencing student progress, it is tricky because it reflects teacher perception of student progress, which is subjective. I believe this change will strengthen closing the gap services and the outcomes shared with stakeholders, as the increased classroom seat time that accompanies a reduction in ODRs is predicted to lead to better academic outcomes.