Programme for International Student Assessment

“PISA” redirects here. For other uses, see Pisa (disambiguation).

 

The Programme for International Student Assessment (PISA) is a worldwide evaluation of 15-year-old school pupils’ scholastic performance, first performed in 2000 and repeated every three years. It is coordinated by the Organisation for Economic Co-operation and Development (OECD), with a view to improving educational policies and outcomes. A similar study is the Trends in International Mathematics and Science Study (TIMSS), which focuses on mathematics and science but not reading.

 


Framework

PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA’s methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA’s Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics, and science.

The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education’s application to real-life problems and life-long learning (workforce knowledge).

In the reading test, “OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling”. Instead, students are expected to be able to “construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts”.[2]

 

Development and implementation

Development began in 1997, and the first PISA assessment was carried out in 2000. The results of each assessment period take about a year and a half to be analysed: the first results were published in November 2001, and the raw data, technical report and data handbook were released only in spring 2002. The triennial repeats follow a similar schedule; seeing a single PISA cycle through, start to finish, always takes over four years.

Each assessment period focuses on one of the three competence fields (reading, mathematics, science), but the other two are tested as well. A full cycle is thus completed every nine years: reading, the main domain of 2000, was again the main domain in 2009.

 

Period | Main focus  | OECD countries | Other countries | Students | Notes
2000   | Reading     | 28             | 4               | 265,000  | The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003   | Mathematics | 30             | 11              | 275,000  | UK disqualified from data analysis. Also included a test in problem solving.
2006   | Science     | 30             | 27              |          |
2009   | Reading     | 30             | 33              | ?        | Results made available on 7 December 2010.[3]

PISA is sponsored, governed, and coordinated by the OECD. The test design, implementation, and data analysis are delegated to an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER leads the development and implementation of sampling procedures and helps monitor sampling outcomes across participating countries. It also constructs and refines the assessment instruments underlying PISA’s reading, mathematics, science, problem-solving and computer-based tests as well as the background and contextual questionnaires, develops purpose-built software to assist in sampling and data capture, and analyses all data. The source code of the data analysis software is not made public.

 

Method of testing

 

Sampling

The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students, which made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required, to allow comparisons between regions.

 

The test

 

 

PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

 

 

Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. In total there are six and a half hours of assessment material, but no student is tested on all of it. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background, including learning habits, motivation and family. School directors also fill in a questionnaire describing school demographics, funding, etc.

In selected countries, PISA has also begun experimenting with computer-adaptive testing.

 

National add-ons

Countries are allowed to combine PISA with complementary national tests.

Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E (E = Ergänzung, i.e. complement). The test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in both the international and the national test, another 45,000 take only the latter. This larger sample is needed to allow analyses at the level of the federal states. Following a clash over the interpretation of the 2006 results, the OECD warned Germany that it might withdraw the right to use the “PISA” label for national tests.[4]

 

Data scaling

From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be scaled to allow meaningful comparisons. This scaling is done using the Rasch model of item response theory (IRT). Under this model, students who solved none or all of the test items cannot be assigned a finite ability estimate; this problem is circumvented by imposing a Gaussian prior probability distribution on competences.[5]

A single scale is used to express both item difficulties and student competences. The scaling procedure is tuned such that the a posteriori distribution of student competences, with equal weight given to all OECD countries, has mean 500 and standard deviation 100.
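The two ingredients named above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the operational PISA software (whose source code, as noted above, is not public); the function names and all numbers are illustrative.

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch model: probability that a student with ability theta
    (on the logit scale) answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def to_pisa_scale(theta, oecd_mean, oecd_sd):
    """Linear map from the logit scale to the PISA reporting scale,
    tuned so the equally weighted OECD pool has mean 500 and SD 100."""
    return 500.0 + 100.0 * (theta - oecd_mean) / oecd_sd

# Toy example: four student abilities and one item of difficulty 0.5.
theta = np.array([-1.2, 0.0, 0.8, 1.5])
print(rasch_prob(theta, b=0.5))                         # per-student success probabilities
print(to_pisa_scale(theta, theta.mean(), theta.std()))  # scores on the 500/100 metric
```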

 

Results

 

Historical league tables

All PISA results are broken down by country. Public attention concentrates on just one outcome: mean achievement values by country. These data are regularly published in the form of “league tables”.

The following table gives the mean achievements of OECD member countries in the principal testing domain of each period:[6]

In the official reports, country rankings are communicated in a more elaborate form: not as lists, but as cross tables indicating for each pair of countries whether or not the mean score difference is statistically significant, i.e. unlikely to be due to random fluctuations in student sampling or in item functioning. In favorable cases, a difference of 9 points is sufficient to be considered significant.
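The core of such a pairwise comparison is a two-sample z-test on the difference of country means, using the standard errors of those means. The sketch below simplifies away the complex sampling and item-functioning corrections used operationally; all numbers are illustrative.

```python
from math import sqrt, erfc

def mean_diff_significant(m1, se1, m2, se2, alpha=0.05):
    """Two-sided z-test for the difference between two country means,
    given the standard errors of those means."""
    z = (m1 - m2) / sqrt(se1 ** 2 + se2 ** 2)
    p = erfc(abs(z) / sqrt(2.0))   # two-sided p-value under normality
    return p < alpha, p

# Toy example: a 9-point gap with standard errors of about 3 points
# on each mean is significant at the 5% level (p ~ 0.034).
print(mean_diff_significant(505.0, 3.0, 496.0, 3.0))
```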

In some popular media, test results from all three literacy domains have been consolidated into an overall country ranking. Such meta-analysis is not endorsed by the OECD, and the official reports contain only domain-specific country scores. In parts of the official reports, however, scores from a period’s principal testing domain are used as a proxy for overall student ability.[7]

 

2000–2006

Mean results of OECD member countries in the main area of investigation of each period: reading (2000), mathematics (2003) and science (2006).

 

2000: Reading literacy
1. Finland 546
2. Canada 534
3. New Zealand 529
4. Australia 528
5. Ireland 527
6. South Korea 525
7. United Kingdom 523
8. Japan 522
9. Sweden 516
10. Austria 507
11. Belgium 507
12. Iceland 507
13. Norway 505
14. France 505
15. United States 504
16. Denmark 497
17. Switzerland 494
18. Spain 493
19. Czech Republic 492
20. Italy 487
21. Germany 484
22. Hungary 480
23. Poland 479
24. Greece 474
25. Portugal 470
26. Luxembourg 441
27. Mexico 422

2003: Mathematics
1. Finland 544
2. South Korea 542
3. Netherlands 538
4. Japan 534
5. Canada 532
6. Belgium 529
7. Switzerland 527
8. Australia 524
9. New Zealand 523
10. Czech Republic 516
11. Iceland 515
12. Denmark 514
13. France 511
14. Austria 506
15. Sweden 503
16. Germany 503
17. Ireland 503
18. Slovakia 498
19. Norway 495
20. Luxembourg 493
21. Poland 490
22. Hungary 490
23. Spain 485
24. United States 483
25. Italy 466
26. Portugal 466
27. Greece 445
28. Turkey 423
29. Mexico 385

2006: Science
1. Finland 563
2. Canada 534
3. Japan 531
4. New Zealand 530
5. Australia 527
6. Netherlands 525
7. South Korea 522
8. Germany 516
9. United Kingdom 515
10. Czech Republic 513
11. Switzerland 512
12. Austria 511
13. Belgium 510
14. Ireland 508
15. Hungary 504
16. Sweden 503
17. Poland 498
18. Denmark 496
19. France 495
20. Iceland 491
21. United States 489
22. Slovakia 488
23. Spain 488
24. Norway 487
25. Luxembourg 486
26. Italy 475
27. Portugal 474
28. Greece 473
29. Turkey 424
30. Mexico 410

 

 

 

2006

 
Programme for International Student Assessment (2006)

Maths
1. Taiwan 549
2. Finland 548
3. Hong Kong 547
3. South Korea 547
5. Netherlands 531
6. Switzerland 530
7. Canada 527
8. Macau 525
8. Liechtenstein 525
10. Japan 523

Sciences
1. Finland 563
2. Hong Kong 542
3. Canada 534
4. Taiwan 532
5. Estonia 531
5. Japan 531
7. New Zealand 530
8. Australia 527
9. Netherlands 525
10. Liechtenstein 522

Reading
1. South Korea 556
2. Finland 547
3. Hong Kong 536
4. Canada 527
5. New Zealand 521
6. Ireland 517
7. Australia 513
8. Liechtenstein 510
9. Poland 508
10. Sweden 507

 

Top 10 countries in the PISA 2006 results for maths, sciences and reading.

 

2009

 
Programme for International Student Assessment (2009)[8]

Maths
1. Shanghai, China 600
2. Singapore 562
3. Hong Kong, China 555
4. South Korea 546
5. Taiwan 543
6. Finland 541
7. Liechtenstein 536
8. Switzerland 534
9. Japan 529
10. Canada 527
11. Netherlands 526
12. Macau, China 525
13. New Zealand 519
14. Belgium 515
15. Australia 514
16. Germany 513
17. Estonia 512
18. Iceland 507
19. Denmark 503
20. Slovenia 501
21. Norway 498
22. France 497
23. Slovakia 497
24. Austria 496
25. Poland 495
26. Sweden 494
27. Czech Republic 493
28. United Kingdom 492
29. Hungary 490
30. United States 487
…
65. Kyrgyzstan 331

Sciences
1. Shanghai, China 575
2. Finland 554
3. Hong Kong, China 549
4. Singapore 542
5. Japan 539
6. South Korea 538
7. New Zealand 532
8. Canada 529
9. Estonia 528
10. Australia 527
11. Netherlands 522
12. Liechtenstein 520
13. Germany 520
14. Taiwan 520
15. Switzerland 517
16. United Kingdom 514
17. Slovenia 512
18. Macau, China 511
19. Poland 508
20. Ireland 508
21. Belgium 507
22. Hungary 503
23. United States 502
24. Norway 500
25. Czech Republic 500
26. Denmark 499
27. France 498
28. Iceland 496
29. Sweden 495
30. Latvia 494
…
65. Kyrgyzstan 330

Reading
1. Shanghai, China 556
2. South Korea 539
3. Finland 536
4. Hong Kong, China 533
5. Singapore 526
6. Canada 524
7. New Zealand 521
8. Japan 520
9. Australia 515
10. Netherlands 508
11. Belgium 506
12. Norway 503
13. Estonia 501
14. Switzerland 501
15. Poland 500
16. Iceland 500
17. United States 500
18. Liechtenstein 499
19. Sweden 497
20. Germany 497
21. Ireland 496
22. France 496
23. Taiwan 495
24. Denmark 495
25. United Kingdom 494
26. Hungary 494
27. Portugal 489
28. Macau, China 487
29. Italy 486
30. Latvia 484
…
65. Kyrgyzstan 314

 

Top 30 countries in the PISA 2009 results for maths, sciences and reading. For a complete list, see the reference.

 

Comparison with other studies

The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics and 0.95 in science. The values drop to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. Such high correlations indicate common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogeneous underlying factor of cognitive competence. Western countries perform slightly better in PISA, Eastern European and Asian countries in TIMSS. Content balance and years of schooling explain most of the variation.[9]
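Country-level correlations of this kind are ordinary Pearson correlations of mean scores. A minimal sketch with made-up country means (none of the numbers below are from the cited study):

```python
import numpy as np

# Hypothetical mean scores of eight countries on two assessments.
pisa  = np.array([546.0, 534.0, 529.0, 516.0, 503.0, 485.0, 466.0, 423.0])
timss = np.array([552.0, 529.0, 521.0, 524.0, 508.0, 479.0, 470.0, 441.0])

r = np.corrcoef(pisa, timss)[0, 1]   # Pearson correlation of country means
print(round(r, 2))
```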

 

Topical studies

An evaluation of the 2003 results showed that countries that spent more on education did not necessarily do better. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, South Korea, New Zealand and the Netherlands spent less but did relatively well, whereas the United States spent much more yet scored below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States, which came 24th out of the 29 countries compared.

Another point made in the evaluation was that students with higher-earning parents are better-educated and tend to achieve higher results. This was true in all the countries tested, although more obvious in certain countries, such as Germany.

It has been suggested that the Finnish language plays an important part in Finland’s PISA success.[10]

International testing, including both PISA and TIMSS, has been a central part of many recent analyses of how cognitive skills relate to economic outcomes. These studies consider both individual earnings and aggregate growth differences of nations.[11]

In 2010, the PISA 2009 results revealed that Shanghai students had scored highest in the world in every category (maths, reading and science), stunning educators. The OECD described Shanghai as a pioneer of educational reform, noting that “there has been a sea change in pedagogy” and that the authorities had “abandoned their focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving.”[12]

 

Reception

For many countries, the first PISA results were surprising; in Germany and the United States, for example, the comparatively low scores brought on heated debate about how the school system should be changed.[citation needed] Some headlines in national newspapers were:

• “La France, élève moyen de la classe OCDE” (France, average student of the OECD class), Le Monde, December 5, 2001
• “Are we not such dunces after all?”, The Times, United Kingdom, December 6, 2001
• “Economic Time Bomb: U.S. Teens Are Among Worst at Math”, Wall Street Journal, December 7, 2004

The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for “Superman”.[13]

 

Research on causes of country differences

PISA, TIMSS and PIRLS, their organizers and researchers, are restrained in giving reasons for the large and stable country differences; they cautiously leave this task to other researchers, especially from economics and psychology. Economic researchers have studied single educational policy factors such as central exams (John Bishop)[14] and private schools or between-school streaming at a later age (Hanushek/Woessmann).[15] An extensive literature on cross-country differences in scores has developed since 2000.[16]

The stably good results of Finland have attracted a lot of attention. According to Hannu Simola,[17] the remarkable results are explained less by attributes of the educational system than by a combination of factors: high student discipline; high teacher status, which attracts good students to the teaching profession; high teacher quality due to professional teacher education; conservative direct instruction (“teaching ex cathedra”, “pedagogical conservatism”); low rates of immigration; fast diagnosis and treatment of learning problems, including special schools; and the culture of a small border country (as in Singapore and Taiwan) whose people know they can survive only with effort.

Systematic analyses across different paradigms (culture, genes, wealth, educational policies) for 78 countries were presented by Heiner Rindermann and Stephen Ceci.[18] They report positive relationships between student ability and the educational levels of adults, the amount and rate of preschool education, discipline, the quantity of institutionalized education, attendance at additional schools, early tracking, and the use of central exams and tests. Rather negative relationships were found with high repetition rates, late school enrollment and large class sizes. In their opinion, the results suggest that international differences in cognitive competence could be narrowed by reforms in educational policy.

 

Criticism

Results from the three domains are closely correlated. By performing a factor analysis or a principal components analysis, one could easily construct an overall achievement scale and deduce a domain-independent country ranking. Such an analysis, however, is not undertaken in the official reports, most likely to avoid PISA being interpreted as an intelligence test, which some claim it actually is.[19]
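A minimal sketch of such an analysis, using a principal components decomposition of made-up domain means (rows are hypothetical countries; columns are reading, mathematics and science):

```python
import numpy as np

scores = np.array([
    [539.0, 546.0, 538.0],   # hypothetical country means:
    [536.0, 541.0, 554.0],   # columns = reading, maths, science
    [524.0, 527.0, 529.0],
    [500.0, 487.0, 502.0],
    [425.0, 419.0, 416.0],
])

centered = scores - scores.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = vt[0]
if pc1.sum() < 0:             # fix the arbitrary sign of the SVD
    pc1 = -pc1

overall = centered @ pc1                  # overall achievement scale (PC1)
explained = s[0] ** 2 / (s ** 2).sum()    # share of variance PC1 explains
ranking = np.argsort(-overall)            # domain-independent country ranking
print(round(explained, 3), ranking)
```

Because the three domain scores are so closely correlated, the first component captures the bulk of the between-country variance, which is what makes a single overall ranking statistically easy to construct.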

Lynn and Meisenberg (2010) found very high correlations (r > 0.90) at the country level between mean student assessment results from PISA, TIMSS, PIRLS and other studies and IQ measurements.[20]

 

Luxembourg

Criticism has ensued in Luxembourg, which scored quite low, over the method used in its PISA test. Although Luxembourg is a trilingual country, the test was not allowed to be taken in Luxembourgish, the mother tongue of a majority of students.

 

Portugal

According to PISA, the average Portuguese 15-year-old student was for many years underrated and underachieving in reading literacy, mathematics and science knowledge within the OECD, nearly tied with Italian students and just above those from countries like Greece, Turkey and Mexico. Since 2010, however, PISA results for Portuguese students have improved dramatically. The Portuguese Ministry of Education announced a 2010 report by its office for educational evaluation, GAVE (Gabinete de Avaliação do Ministério da Educação), which criticized the PISA 2009 report and claimed that the average Portuguese teenage student had profound handicaps in expression, communication and logic, as well as low performance when asked to solve problems. It also claimed that these shortcomings are not exclusive to Portugal but occur in other countries as well, due to the way PISA was designed.[21]

 

United States

Critics say that low performance in the United States is closely related to American poverty.[22][23] It has also been shown that, when adjusted for poverty, the richest areas in the US outperform every other country’s average scores, especially areas with less than 10% poverty (and even areas with 10% to 25% poverty outperform countries with similar rates).[23] In essence, the criticism is not so much directed against the Programme for International Student Assessment itself as against people who use PISA data uncritically to justify measures such as charter schools.[24]

The adjustment for poverty levels in the US made in “PISA: It’s Poverty Not Stupid” was not, however, applied to the other countries in the comparison. The adjusted figures therefore compare the overall averages of other countries with the averages of US areas with 10% to 25% poverty.
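The arithmetic behind this objection is simple: a country’s reported mean is a student-weighted average over all its strata, so a low-poverty subgroup mean is not comparable to another country’s overall mean. A toy illustration with invented numbers:

```python
# Hypothetical US strata: (share of students, mean score) by school-poverty band.
us_strata = {
    "<10% poverty": (0.30, 551.0),
    "10-25%":       (0.30, 527.0),
    "25-75%":       (0.30, 502.0),
    ">75%":         (0.10, 446.0),
}

# The only figure comparable to another country's national mean is the
# weighted average over all strata, not any single subgroup's mean.
overall_us = sum(share * mean for share, mean in us_strata.values())
print(round(overall_us))                # ~519: nationally comparable figure
print(us_strata["<10% poverty"][1])     # 551: a subgroup mean only
```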

 


References

• Official PISA site data. For the list, see “Executive Summary”.
• C. Füller: Pisa hat einen kleinen, fröhlichen Bruder. taz, 5 December 2007.
• The scaling procedure is described in nearly identical terms in the Technical Reports of PISA 2000, 2003 and 2006. It is similar to procedures employed in NAEP and TIMSS. According to J. Wuttke, Die Insignifikanz signifikanter Unterschiede (2007, in German), the description in the Technical Reports is incomplete and plagued by notational errors.
• OECD (2001), p. 53; OECD (2004a), p. 92; OECD (2007), p. 56.
• E.g. OECD (2001), chapters 7 and 8: influence of school organization and socio-economic background on performance in the reading test. Reading was the main domain of PISA 2000.
• M. L. Wu: A Comparison of PISA and TIMSS 2003 Achievement Results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008.
• Hanushek, Eric A., and Ludger Woessmann (2008). “The role of cognitive skills in economic development.” Journal of Economic Literature 46 (3): 607–668.
• Bishop, J. H. (1997). “The effect of national standards and curriculum-based exams on achievement.” American Economic Review 87: 260–264.
• Hanushek, E. A. & Woessmann, L. (2006). “Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries.” Economic Journal 116: C63–C76.
• Hanushek, Eric A., and Ludger Woessmann (2011). “The economics of international differences in educational achievement.” In Handbook of the Economics of Education, Vol. 3, edited by Eric A. Hanushek, Stephen Machin, and Ludger Woessmann. Amsterdam: North Holland, 89–200.
• Simola, H. (2005). “The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education.” Comparative Education 41: 455–470.
• Rindermann, H. & Ceci, S. J. (2009). “Educational policy and country outcomes in international cognitive competence studies.” Perspectives on Psychological Science 4: 551–577.
• Rindermann, H. (2007). “The g-factor of international cognitive ability comparisons: The homogeneity of results in PISA, TIMSS, PIRLS and IQ-tests across nations.” European Journal of Personality 21: 667–706.
• Lynn, R. & Meisenberg, G. (2010). “National IQs calculated and validated for 108 nations.” Intelligence 38: 353–360.

Further reading

 

Official websites and reports

• OECD/PISA website
• OECD (2001): Knowledge and Skills for Life. First Results from the OECD Programme for International Student Assessment (PISA) 2000.
• OECD (2003a): The PISA 2003 Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD. ISBN 978-92-64-10172-2
• OECD (2004a): Learning for Tomorrow’s World. First Results from PISA 2003. Paris: OECD. ISBN 978-92-64-00724-6
• OECD (2004b): Problem Solving for Tomorrow’s World. First Measures of Cross-Curricular Competencies from PISA 2003. Paris: OECD. ISBN 978-92-64-00642-3
• OECD (2005): PISA 2003 Technical Report. Paris: OECD. ISBN 978-92-64-01053-6
• OECD (2007): Science Competencies for Tomorrow’s World: Results from PISA 2006.

 

 

About reception and political consequences

• General:
  • A. P. Jakobi, K. Martens: Diffusion durch internationale Organisationen: Die Bildungspolitik der OECD. In: K. Holzinger, H. Jörgens, C. Knill: Transfer, Diffusion und Konvergenz von Politiken. VS Verlag für Sozialwissenschaften, 2007 (in German).
• France:
  • N. Mons, X. Pons: The reception and use of Pisa in France.
• Germany:
  • E. Bulmahn [then federal secretary of education]: “PISA: the consequences for Germany.” OECD Observer, no. 231/232, May 2002, pp. 33–34.
  • H. Ertl: “Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany.” Oxford Review of Education, vol. 32, no. 5, pp. 619–634, November 2006.
• United Kingdom:
  • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland.
 

 

 

Criticism

• Books:
  • S. Hopmann, G. Brinek, M. Retzl (eds.): PISA zufolge PISA – PISA According to PISA. LIT-Verlag, Wien 2007. ISBN 3-8258-0946-3 (partly in German, partly in English)
  • T. Jahnke, W. Meyerhöfer (eds.): PISA & Co – Kritik eines Programms. Franzbecker, Hildesheim 2007 (2nd edn.). ISBN 978-3-88120-464-4 (in German)
  • R. Münch: Globale Eliten, lokale Autoritäten: Bildung und Wissenschaft unter dem Regime von PISA, McKinsey & Co. Frankfurt am Main: Suhrkamp, 2009. ISBN 978-3-518-12560-1 (in German)