Sunday, October 27, 2013

Brilliantly Stupid Rankings that have Brilliantly Stupid Effects




 Accountability dictates that you are held responsible for your shortcomings, and for your successes. Under this logic, schools that achieve great marks are ranked higher and thus receive a higher government income, while those at the lower end of the spectrum are given a bad reputation and a shitty income to boot. Fantastic, maybe that will improve their scores. Right? Or am I missing something here? John Tierney, in an article titled "Why High-School Rankings Are Meaningless - and Harmful", discusses a number of standardized ranking systems such as the "challenge index", and notes that under this index "a given school can rise on the list by increasing the number of its students who take 'advanced' classes". This is absurd: you could have a school where everyone takes such 'advanced' classes yet nobody achieves anything above a 30. In this sense, some ranking systems are brilliantly stupid in their attempts to rank educational institutions. So why do we continue to pursue such folly, as seen with the My School website and similar sites from across the globe? The answer is simple: so that governments can reward quality, and so that parents can choose quality. The funny thing is, "'What the hell is quality?'" (Bogue & Hall, 2003, p. 1). Well, apparently quality can be derived from "school rankings that are based solely on observed outcomes such as average test scores or the college-bound rates of students", in which case "schools located in districts with low SES" are penalized (Toutkoushian & Curtis, 2005, p. 260). Tierney, in response to the question of quality, states that:

Quality is a very subjective matter, especially in something as intangible as education. And using a simple measure to rank thousands of schools certainly cannot capture the relative quality of schools or indicate which are better than others.

This begs the question: can't a school located in an area of low SES have a teaching quality greater than its competitors? Indeed it can. However, as a result of its location the school gets below-average results and is punished for them, and while the intellectual might point out that many factors could produce such scores, the parent may simply treat those scores as the measure of whether the school is good or bad. Thus, these standardized tests do not take into account the countless other factors that can cause poor results, chief among them a low SES. Such factors are not properly represented in ranking systems like the "challenge index" that Tierney explored in his article, resulting in an accountability system that rewards those schools fortunate enough to have a student body from a higher SES and demeans those that do not. In the end, the idea of making results public to differentiate the 'good' and 'bad' schools is actually making public an analysis of how an individual's SES can have detrimental, or extremely beneficial, impacts on his or her academic life.



References:

Bogue, E. G., & Hall, K. B. (2003). Quality and accountability in higher education: Improving policy, enhancing performance. Greenwood Publishing Group.

Toutkoushian, R. K., & Curtis, T. (2005). Effects of socioeconomic factors on public high school outcomes and rankings. The Journal of Educational Research, 98(5), 259-271.


