We began the 2016-17 school year disappointed with our results on the state's standardized tests, the Ohio AIR tests, from the previous school year. While the state report card surfaced many issues to address, the School Counseling Department chose to focus on closing the gap for students identified with low socioeconomic status. For the 2015-16 school year, we fell far below the state benchmarks for these students of a 74.2% passing rate in ELA and 68.5% in math; our rates were 41.4% and 49.7%, respectively.
The school counselors identified the students with low socioeconomic status who had failed, or come close to failing, one or more of the previous year's Ohio AIR tests. Counselors met individually with each identified student to explain their test scores and set a goal for the year. Students identified as needing more intervention were placed in a small group that met twice weekly for the eight weeks leading up to the AIR tests. Counselors collaborated with content-area teachers to apply best practices in creating core curriculum lessons for all students. School counselors also attended a professional development session offered by INFOhio on online test-taking skills and incorporated these research-informed techniques into our group meetings. The strategies we found most useful in our groups were incorporated into the core curriculum lesson as well.
We were happily surprised by how much students' testing improved this year. In our small groups, 85% of students raised one or more of their test scores by ten points or more, which is significant given these standardized tests' scoring range. Our state report card also showed large gains: the passing rate for students with low socioeconomic status rose by 25.3 percentage points in ELA (to 66.7%) and by 13.7 percentage points in math (to 63.4%). While these rates still fell below the state's benchmarks for the year of 77.1% and 77.2%, respectively, we feel this was an immense step toward reaching that goal.
Our perception data for the core curriculum lesson on AIR testing showed only a small increase in content knowledge, but the pretests indicated that students already understood the material well before the lesson began, as this content has been taught many times over the years. We could ask more specific pretest and posttest questions to more accurately assess the content knowledge gained from the lesson, and we still feel it is important to review this material each year before testing. For our small groups, meeting as frequently as we did was critical to developing both skills and a relationship with each student that set them up for success. On the posttests, many students reported that knowing someone was watching how they individually performed on the standardized tests motivated them to try their best and maintain stamina throughout the lengthy tests.
The results data show that many students improved their test scores through the use of identified ASCA Mindsets & Behaviors standards, such as Learning Strategy 7, "Identify long- and short-term academic, career and social/emotional goals." We plan to continue using these standards to guide our core curriculum and groups.
Next year, we are changing how we distribute test scores to students schoolwide. In the past, students picked up their scores during schedule pick-up day, and the scores were not discussed. Because of the large improvement among students in small groups, next year we will distribute scores to subject-area teachers so they can meet individually with students to explain the scores and set goals with them; the data showed us that the ASCA Learning Strategy of goal-setting was effective in helping students succeed on the AIR standardized tests. As a school counseling department, we will also create an additional core curriculum lesson for all students on understanding test scores and accomplishing their goals. For next year's small groups, we noted that we saw a bigger improvement in language arts scores than in math scores, and we reflected that we spent more time on extended-response preparation for language arts than for math. We therefore plan to spend more time preparing students for the math test, shifting time away from the computer test-taking skills lessons, since students entered our small groups with much stronger skills in that area than we anticipated.