Logan County school superintendents are questioning the results of the 2014-15 report cards, stating that changes in testing and delays in the report card release invalidate the data.
The report cards, released Feb. 25, showed generally lower grades in districts statewide. Some districts did have improvements in specific areas.
Ohio Department of Education officials cautioned that grades generally would be lower on this report card because of higher standards and new tests. Interim Superintendent of Public Instruction Dr. Lonny J. Rivera said as much in a press release.
“We’ve long expected that grades might decline as we began to raise the bar for our students and schools,” he said.
School report cards were delayed for 2014-15 due to new tests that year. The data from those tests, created by the Partnership for Assessment of Readiness for College and Careers (PARCC), was originally supposed to come out in December 2015, but the release was pushed back further by late delivery of assessment results from PARCC, the education department said in a press release. Traditionally, report card data is released in August of the year the tests were taken.
The state decided to not use PARCC tests for the current school year due to pushback from a variety of sources.
Delayed data, questions of accuracy
Benjamin Logan Local Schools Superintendent David Harmon said questions surrounding accuracy and effectiveness of the tests stemmed from how this first year of PARCC tests was handled. He said the state could not set passing scores on the tests until it had data from all the tests.
“Essentially, they couldn’t tell staff, students and families what a passing score would be on several of the assessments…until they had all of the completed tests with which to compile results,” he said.
Since the tests were new, and the materials were new, there is no way to compare the 2014-15 year report card to prior years, Harmon said. And this will continue to be a problem for the next set of report cards, because the state removed the PARCC tests after one year of use.
“Our staff, our students, our families, and our school communities will have had three different assessment systems in three years’ time, and this does not include the continued expansion of additional End of Course Assessments before we get ‘fully’ settled on this new testing format,” Harmon said. “It is truly unfair to judge teacher effectiveness from year to year until we have a few years of consistency in both the academic standards and in the assessment systems that helps to judge teachers and schools’ ability to teach students to those standards.”
West Liberty-Salem Local Schools Superintendent Kraig Hissong said district staff are concerned about some of the tested areas and the number of students considered proficient.
“We have questions about value added and the accuracy of this data, and if it should have been included at all on the report card, especially when they did not feel it was solid enough to use for teacher evaluations as it has been in the past,” he said.
Hissong was referring to a “safe harbor” provision put in place by Ohio legislators to protect teachers and administrators from negative impacts of the 2014-15 testing data. Ohio’s teacher evaluation system places a high value on student testing data when determining teacher effectiveness, which can affect decisions on compensation.
Bellefontaine City Schools Superintendent Brad Hall agreed this report card may not be reliable or helpful in determining school success: “The data released by the Ohio Department of Education is one snapshot in time, using an assessment that has never been used before, and will not be repeated in the future. To compare it to previously released district report cards would not provide an accurate analysis of our growth.”
Hall also questioned the accuracy of the data, and said the district has been working with the education department to correct errors.
Indian Lake Local Schools Superintendent Patrick O’Donnell said he and his staff have released their own version of the 2014-15 state report card for the district due to the new tests and uncertain data.
“Due to PARCC’s technical inefficiencies, our students experienced dozens of disruptive irregularities while testing,” he said.
O’Donnell said the tests pose a problem because two-thirds of the district’s teacher evaluations are based on their results. So the district created a separate report card using data collected from the Ohio Graduation Tests, the Ohio Achievement Assessment, and the American Institutes for Research (AIR) tests, which were used last year for other subjects and will be used this year to replace the PARCC tests.
That report card can be viewed on the district’s website, ils-k12.org.
Districts in the region have instituted their own informal assessments in their classrooms to keep track of student progress. Many have been doing this for years.
Acknowledging strengths, addressing weaknesses
Hissong said he is pleased overall with student achievement.
“We were warned last year about the increased rigor of these tests, yet we were very happy with the number of indicators we met and the margin of passage for students in most of the tested areas,” he said. “Our PI (performance index) dropped this year due to less students achieving an advanced score, but it seems to be in-line with the performance of most districts we were similar with in the past.”
Hissong said he was proud of student achievement scores in general, which “exceeded our expectations.”
Hall said he was pleased with the work of his staff and teachers: “I am proud of the professionalism and tenacity with which the Bellefontaine staff approaches our responsibility to provide the quality education that is present in the district.”
Harmon said he is proud that the number of indicators met for the district appears to be improving, and that growth rates for the district’s gifted students and its students in the lowest 20 percent of achievement are improving. Also, the four-year graduation rates for all area schools, not just Benjamin Logan, are “A” grades.
Hissong said the district has been working on improving instruction for gifted students, but that likely would not show up until the 2015-16 report cards.
Harmon said his concerns from the report card stem from the annual measurable objectives, which compare overall results to those of subgroups such as “students with disabilities” and “students who are economically disadvantaged.” The district has been working, and continues to work, to improve student growth in those areas.
Overall, superintendents expressed a desire to see stability in testing.
“Until we have stability in the assessment systems and reporting systems, it is going to be very hard to make conclusive statements about a whole lot of what is being reported on the state report cards,” Harmon said. “Don’t get me wrong, it is an important indicator. But it isn’t by any stretch of the imagination the only indicator of student successes.”
Casey S. Elliott may be reached at 937-652-1331 ext. 1772 or on Twitter @UDCElliott.