PISA’s Weakness: Why Teacher Experience matters

Journal title: EDUCATIONAL REFLECTIVE PRACTICES
Author(s): Vasco d'Agnese
Publishing year: 2017
Issue: 2017/1
Language: English
Pages: 16 (pp. 80-95)
File size: 210 KB
DOI: 10.3280/ERP2017-001006

Abstract

The aim of this study is to deepen the understanding of the ways in which PISA (the OECD's Programme for International Student Assessment) affects learning and education by analysing teacher feedback. Central to the study is the belief that teachers' knowledge and experience are legitimate resources for assessing, correcting, and updating a) educational policies and b) curriculum and educational practices. The research involved 39 secondary school teachers of 15-year-olds, each with between 8 and 35 years of service, working in three urban high schools in southern Italy chosen from the PISA sample. The research design combines semi-structured interviews, group interviews, and group discussions conducted with teachers involved in PISA, either as school referents or as teachers of classes included in the PISA survey. Based on the data, PISA raises five problems: a) a learning problem: students learn to respond to pre-specified problems rather than to develop their own inquiries, and they are not stimulated to develop meta-awareness of their own learning processes; b) a relational problem: the relationship between teachers and students is established externally and framed by the test; c) an educational problem: PISA denies education's role in helping students face the unforeseen; d) an assessment problem: a single tool is not sufficient for a task as complex as student assessment; and e) a problem concerning the role of schools, teachers, and democracy.

Vasco d'Agnese, "PISA's Weakness: Why Teacher Experience matters", in EDUCATIONAL REFLECTIVE PRACTICES, 1/2017, pp. 80-95, DOI: 10.3280/ERP2017-001006