Come valutare un intervento di contrasto alla povertà educativa con il metodo sperimentale? Alcune lezioni dalla valutazione di WILL - Educare al Futuro [How to evaluate an intervention against educational poverty with the experimental method? Some lessons from the evaluation of WILL - Educare al Futuro]

Journal: RIV Rassegna Italiana di Valutazione
Authors: Davide Azzolini, Loris Vergolini
Publishing year: 2022
Issue: 2021/80-81
Language: Italian
Pages: 24 (pp. 58-81)
File size: 332 KB
DOI: 10.3280/RIV2021-080004





