Author: gordon

  • The Future According to Schmidt: "Augmented Humanity," Integrated Into Google | Fast Company

    Google Man

In the future we will be Google, and Google will be us–the online giant will make us better humans. That’s according to soon-to-be ex-CEO Eric Schmidt, speaking today at the DLD 11 conference in Germany.

    Schmidt announced this week that he’s departing his role as Google’s CEO, and then it emerged
he was receiving a $100 million golden parachute as a thank-you for his
    time in charge of one of the world’s most important tech companies.
    Perhaps that’s why he felt he could reveal some of his forecasts for the
    future at Google. A future he was careful to note he’ll be involved in
    (promising he’ll be at Google for at least “the next decade”).

    After some preamble about his time at Google, Schmidt got to the good stuff: The future is mobile, he thinks, driven by the “device of our time,” the smartphone–and the tablet. Children now have two states of existence, aided by this trend–“asleep or online.” Within two years Schmidt sees smartphone sales outstripping desktop PC sales, and the mobile sector is growing eight times faster than traditional PCs did at the same stage in their evolution. Soon the majority of online action will happen from mobile devices, and landlines will effectively be dead for phone-call purposes.

Then Schmidt got brave: Unconnected devices today are “no longer interesting,” he thinks. Soon everything, but everything, will be hooked up to everything over wireless nets. A good example of the phenomenon this enables, Schmidt said, is Google’s recent effort at a spoken-word universal speech translator app–which connects to many, many servers over the Net to give you access to near real-time voice recognition and translation.

So far so good, but then came the most fascinating bit of the whole speech. Computers, Schmidt thinks, when used ubiquitously and interactively and with cloud-like access to remote supercomputing power, can give us “senses” we didn’t know were possible. “Think of it as augmented humanity,” he suggested.

That’s augmented humanity. Not augmented reality, an idea we’ve been getting used to. It’s a subtle twist but an important one. Many tools may be trusted to “augment reality” to appeal to a more digitally interactive culture–you could be happy using Layar on your phone to find a nearby beach, then Around.Me to find a good bar, Yelp to find a restaurant, and Word Lens to live-translate the foreign language menu for you. It’s an interactive process, driven by consumer choice.

But augmented humanity implies inserting tech deliberately in the way of normal life, to better it. And Schmidt’s unspoken line is that this job should fall to Google–it has diverse tools that operate in all of these “augmented info” spaces and beyond, and if they were all centralized and presented to you seamlessly via Android smartphones, then it could improve the human race. After all, thanks to its vast user-info databases, Google already knows pretty much everything about you, and can almost guess what you’re thinking and where you’re going next (as Schmidt has previously noted). He caveated his argument with lots of references to the phrase “with your permission,” obviously concerned he was overstepping the user-privacy boundary. But do we trust Google with the future of 21st century humankind? There’s a big assumption here that Google will always keep its promise not to be evil.

    Or is Schmidt revealing that it really is time for him to move on?

    To read more news on this, and similar stuff, keep up with my updates by following me, Kit Eaton, on Twitter.

  • Evan Williams | evhead: An Obvious Next Step

    March 29, 2011

    An Obvious Next Step


    • Posted at 4:29 PM

    I’m a very lucky guy. Over the past twelve years, I’ve had the good fortune to work on two huge projects that happened to be the right idea at the right time. These ideas attracted brilliant, idealistic people to do the incredibly hard work of making them work for millions of other people across the globe. And through each, I learned a tremendous amount about business, products, and people.

I spent about five and a half years of my professional life on Blogger. Though not a headline-grabber these days, it has continued to evolve and grow to be one of the top 10 web services on the planet. It’s the place where tens of millions of people turn to freely share their thoughts with the world—and where hundreds of millions more go to read those thoughts. It was a tiny fraction of that size when I left over six years ago, so I really must thank the awesome team of Googlers who have been shepherding it since. I’m also extremely grateful to the leaders of Google for giving Blogger the room to thrive. I’m still proud to have been among them for a short time.

    I’ve now spent about five years on Twitter, as well—in a variety of different capacities. Twitter has a funny history. It spent its first six months as a side project of Odeo, a company I was running that didn’t have a lot of traction. Twitter didn’t have much traction either, so we shed Odeo, Inc. and pulled them both into Obvious Corp. When Twitter started to really take off, a year after it started, we spun it into its own corporation and made Jack the CEO. In theory, Obvious could then pursue new projects, but I spent more of my time as active Twitter chairman, which included everything from helping raise funds to coding. In spring of 2008, I was fully sucked in by the Twitter tornado, serving full-time as chief product officer at first and then CEO, which I did for two years.

After stepping down as CEO six months ago, my mind started to wander. The reason I left Blogger/Google when I did is that I felt it had reached a place where it was on solid ground and in capable hands (at the time, Jason Goldman’s as product manager). Though still an independent company, I realized Twitter may be at a similar point today. So, as was reported in various places yesterday, I’ve decided to scale back my role at the company. (I’m still involved, but it’s no longer my full-time job.)

    I’m not ready to talk about what I have planned next, but I will venture a prediction about what’s next for Twitter: It will be bigger and better.

    When I took the CEO job, there were many who didn’t think Twitter would last this long. Today, even the naysayers have begrudgingly accepted it’s not disappearing anytime soon. I have the utmost confidence that, like Blogger, Twitter will grow an order of magnitude more (even though that’s a much taller order, given its size already). The momentum is just incredibly strong, critical mass has been reached, and the dark days of imminent technical meltdown are over.

    It’s not that momentum and critical mass haven’t been lost before in this industry. And there is still a massive amount of work to do—to build a business, but also to simply complete the vision we’ve had for the product for a long time.

    There are many people in the company who share that vision, and I have the utmost confidence in them. Founders, in general, get an out-sized share of the credit for any successful company. There are hundreds of people at Twitter now, some of whom have been there for years and played critical roles. There are those whom you know by name and others you may never have heard of individually, but they have all contributed to the company’s success. I’d venture to say it’s one of the finest teams ever assembled in the Internet industry, and it’s the accomplishment of which I’m most proud. Not just because they are people who are good at their jobs, but because they’re good people.

When I was running the company, I felt very privileged that this amazing group had granted me leadership. (It practically brought me to tears on multiple occasions, during our all-hands meetings, when someone demonstrated their unique and heartfelt awesomeness.) It was they who collectively helped Twitter mature from a quirky, wobbly toddler of a service with great potential but way too much attention for its own good to an operation that is becoming—if not already has become in some areas—world class. And it is they who will take it to the next level, which will surprise us all.

    So, really, what’s next?
    First of all, I’m not disappearing from Twitter. I remain on the board of directors and will frequently meet with many folks there to help in any way I can.

    However, now that Twitter is in capable hands that aren’t mine, it’s time to pick up a whiteboard marker and think fresh. There are other problems/opportunities in the world that need attention, and there are other individuals I’d love to get the opportunity to work with and learn from. (Details to come.)

    While I doubt I’ll get so lucky a third time, as my good friend Biz Stone likes to say, “Creativity is a renewable resource.” Let’s see what happens.

• Q&A: Culture Shock, How Social Media is Changing the Culture of Business | Brian Solis

Good friend JD Lasica asked me to answer some fantastic questions for a post he published in celebration of Engage. I poured so much of myself into the responses that I felt it was worth sharing here with you as well.

Many of the lessons and observations below are important for you as a champion, decision maker, entrepreneur, or executive. Social Media is not only changing how we communicate; we are also changing the culture of business from the outside in and from the bottom up. In doing so, businesses of all shapes and sizes will magnetize communities. As such, the intentional creation and crafting of a useful and meaningful culture in business will create a competitive advantage, giving people a reason to align with and ultimately embody and extend your purpose and mission.

  • Social Media World Forum – virtual.




    Day 1  (29 March 2011)

    Day 2 (30 March 2011)

0930 Event Chair Welcome & Introduction
Robin Grant, Managing Director, We Are Social

SESSION ONE: SOCIAL MEDIA FOR BRAND MANAGEMENT

0945 Keynote Panel:
    Sit back and tweet: Interactive discussion to examine how some of the leading brands harness and embrace social media marketing
    • Is message penetration at the heart of the conversation
    • How can you build brand trust – given it is the most important thing in creating brand equity
    • Sentiment: the complexity of brand perception and positioning
Babs Rangaiah, Vice President of Global Communications Planning, Unilever
Ben Padley, VP, Global Head of Digital Engagement and CRM, Sony Ericsson
Martine Edgell, CRM Specialist, Mercedes Benz UK
Sonia Carter, Head of Digital Strategy, Cadbury UK

1045 Case Study – Dell’s Evolving Use of Twitter
Kerry Bridge, Global Digital Media Communications Manager, Dell Public Sector

     
1120 Networking Break

SESSION TWO: SOCIAL CRM & COMMUNITY BUILDING

1150 Discussion: Strategies for developing online communities for all levels
    • Characteristics of successful communities – why do some communities flourish while others wither?
    • Contrast and compare key strategies for developing online communities
• What can having an online community achieve? Objectives and how to measure success.
    • Is moderation important? How should a group be monitored and managed?
• Case studies of successful companies that have implemented communities for their customers, prospects and employees – investment and ROI
    • Brand benefits of building relationships with existing online communities
Barry Rutter, Web Services Manager, British Dental Association
Rob Howard, Founder & CTO, Telligent
Michael Gether, VP, Media & Content, MyCube
Alan Wolk, Managing Director of Social Media Strategy, KickApps
Graeme Harvey, MD, Huzu Tech

1230 Discussion: Customer Insight and Co-Creation Embracing Social Media
• How are brands’ customer insight managers using social media? What are the best approaches to market research via social media?
    • Engaging customers in decision making processes through social media: Are brands listening? How far can customer views be used in practice?
    • Could using social media for insight and co-creation be the future for all FMCG?  What limits does this approach have?
    • Impact of treating customers as partners rather than just respondents… engaging the audience. 
    • Is co-creation a no-go area for some brands?
Moderator: Justin Pearse, Editor, New Media Age
Oliver Lucas, Head of Customer Insight & CRM, New Look
Kerry Bridge, Global Digital Media Communications Manager, Dell Public Sector
Phil Guest, EVP Global Ad Sales, Sulake (Habbo Hotel)

1300 Networking Break

Afternoon chair: Anthony Edwards, Director of channel strategy, Euro RSCG London

1400 Digital engagement: Meaningful crowdsourcing via social media
    • How can crowdsourcing build meaningful interaction with your audience?
    • What should the aims of crowdsourcing be? What motivations exist for active involvement / can these relationships be leveraged for future engagement?
    • Are some problems simply too complex for crowdsourcing to be appropriate?
    • Effective social media resources to aid crowdsourcing, and tips for using these to enhance engagement.
    Nick Jones, Director, Interactive Services, COI
     
1430 Enterprise social media for CRM

Building Consumer Communities: What Does Success Look Like?
Looking closely into how this has worked for different organisations

    Sean Greenan, Senior Sales Engineer, Lithium International

1500 Networking Break

1530 The power of social media for customer service & support
• What are the key success factors in making social media part of your customer service strategy?
    • How do you identify the best social media channels to focus on for your customer base?
    • How should social media become integrated into a wider customer service strategy from an operational perspective? Resources, policies, staff training, IT/infrastructure?
    • Case studies – who is using social media effectively, how much investment was required and what is the return on this investment?
    Bian Salins, Head of Social Media Innovation, BT Customer Service
    1610 Close of conference

    0930 Event Chair Welcome & Introduction
    Bill Scott, COO, easel TV
SESSION ONE: SOCIAL TV

0945 Interactive Panel: What impact will Social TV have on the media industry as a whole?
    • The impact of social networks moving into the TV business
    • How Pay TV operators can harness Social Networks and web 2.0 within a multiplatform strategy?
    • Will Social Networks take on the role of content aggregators or producers?
    • How production companies can use Social Networks as a way to develop new ideas and new talent
    • Which aspects of social networks are appropriate for TV?
    • Can social networks enhance content discovery?
    • Build your own community or use others’?

Moderator: Bill Scott, COO, easeltv
John Denton, Managing Editor of TV Platforms, BBC
Andy Gower, Research Group Leader, BT
Dick Rempt, CEO, Talents Media
Tom Smith, Founder and Director, GlobalWebIndex

1045 Why is social TV good for consumers?
Jose Alvear, Senior IPTV Analyst, Multimedia Research Group, MRG

1120 Networking Break
     

1200 Top Tips for Running Awesome Social Video Campaigns
    • Social media for distributing & spreading online video content
    • Audience demographics reachable with online video
    • Defining aims of a campaign; measurable ROI
    • Case studies: brand use of online video content
      
Scott Button, CEO, Unruly Media

1230 Are you there, Producer? It’s me, the audience
    • The changing dialogue between the Television producer and the audience.  How is social media changing how audiences view and interact with TV programmes?
    • Maximising audience engagement and encouraging participation in order to build a ‘brand’ around a show.
    • Using social media tools to build commitment and help drive stories.
    Claire Tavernier, Senior Executive Vice President, FMX and WWD, FremantleMedia

1300 Networking Break

Afternoon chair: Claire Adams, Head of Social Media, Euro RSCG London

SESSION TWO: SOCIAL GAMING & VIRTUAL CURRENCIES

1400 Bringing Brands Into Play: Social Games
    • What are the opportunities for brands to make an impact with social networkers that play games?
    – An overview of the scale of social games
    • How should brands approach advertising in Social game apps?
    – Integration considerations
    – Ad funded gameplay experiences
    – What would the player find rewarding?
    • How can a successful brand campaign in a Social game work within a brand’s Social Media strategy?
    – What are the types of messages that will work?
    – Which assets can be utilised in advertising touch points in Social Game Apps?
    • Examples of delivering branding campaigns into facebook game apps
    – Some WildTangent case studies
    Bill Clifford, Vice President, Global Ad Sales, WildTangent
    Adam Yates, Director, Advertising Sales EMEA, WildTangent

1430 Virtual currency markets and the opportunity they present to marketers
    • What is the market for virtual goods and services?
    • Challenges of monetizing games – delivering revenues from virtual currency
    • Facebook credits and brand integration within virtual currency offers
    • Monetizing the social media audience, and using social commerce tools to engage with consumers
    Manny Anekal, Global Director of Brand Advertising, Zynga
    Gilles Storme, Head of EMEA Advertising Sales, RockYou
Andreas Bodczek, CEO, SponsorPay

1500 Networking Break

SESSION THREE: MOBILE SOCIAL MEDIA

1530 Mobile social media as a marketing tool
    • How is mobile social media evolving?
    • Mobile usage and patterns
    • What audience is most engaged with mobile social media
    • How can marketers incorporate mobile elements into their marketing strategy?
    • Practical application to marketing campaigns

    Cristian Cussen, Director of Business Development, Ning
    Mark Watts-Jones, Head of Product Marketing, Everything Everywhere / Orange UK
Alex Musil, EVP of Product Marketing, Shazam

    1610  Close of conference

0930 Event Chair Welcome & Introduction
Toby Beresford, Chair – DMA Social Media Council, & Commercial Director, Syncapse

0940 Keynote: POWNAR: “The Power of News and Recommendation”
Didier Mormesse, Senior VP of Advertising Sales Research, Development and Audience Insight at CNN International

SESSION ONE: SOCIAL MEDIA ADVERTISING

1010 Integrated ad campaigns on social media platforms
    • How do paid ads fit within social media?
    • Targeting and functionality
    • Billing terms and structures
    • Legal & ethical implications
    • What is the role of the media planning agency in SM advertising
    • How does this fit with traditional advertising
    • Is this an effective use of social media
    Moderator: Jack Wallington, Head of Industry Programmes, Internet Advertising Bureau
Paul Armstrong, Head of Social, Mindshare
Maz Nadj, Head of Social Media, Ogilvy
Malcolm Phillips, Code Policy Manager, ASA
Adam Yates, Director, Advertising Sales EMEA, WildTangent

1055 What the ASA remit means to social media
Malcolm Phillips, Code Policy Manager, ASA

SESSION TWO: SOCIAL START-UPS

1115 Social Media Start-ups
In an interactive session, hear from the latest start-ups making waves in the social media space.
• Latest new tools and platforms available
• What these new tools will bring to the social media landscape
• How useful are they? How are they different from what else is available?
• How can they be integrated into your marketing strategy?

    Moderator: Dan Martin, Editor, BusinessZone.co.uk

    Nick Martin, CEO and Founder, Planely
    Darren Patterson, COO, Media Sift (DataSift & TweetMeme)
    Pardeep Kullar, Co-founder, LikeOurselves.com
    Eric Lagier, CEO & Co-Founder, Memolane

    SESSION THREE: SOCIAL SOAPBOX
1215 Social Soapbox Debate: Building your Social Media Plan
Shared collective learning – pooling the audience to discuss the experiences they have had in building their social media strategy: what’s worked and what hasn’t.

Open floor for anyone in the audience to participate, come up on stage to share their views, and debate what they have learned in building and executing their social media plan.

Led by: Toby Beresford, Chair – DMA Social Media Council, & Commercial Director, Syncapse

SESSION FOUR: SOCIAL MEDIA TOOLBOX

1300 Your social media toolbox
As an extension of the exhibitor area, the social media hub includes interactive workshops from our leading sponsors and exhibitors. Learn about the latest tools available and how you can implement them in your marketing plans.

    • Overview of latest social media marketing tools – what’s new

    • What’s free – cheap and easy ways to get started? How can you use these tools?

1300 Analysis of advertising opportunities within social gaming
    Gilles Storme, Head of EMEA Advertising Sales, RockYou
    1330 Accor hotels: Delighting customers by discovering their true voice
    Catriona Oldershaw, Managing Director, Synthesio UK
    Senior representative, Accor
    1400 Live Demo of the Syncapse Platform: Leading Social Media Management Software to Help Marketers Build, Manage and Measure Global Social Presence
    Michael Wilson, Senior Manager Business Development, Syncapse
    Pete Simmons, European Web Producer, EA (Electronic Arts)
1430 Case Study: Talk Talk – CRM and social strategy
    Giles Palmer, Founder & CEO, Brandwatch
    Phil Szomszor, Director, Corporate Practice, Citigate Dewe Rogerson

1500 A practical guide to reputation monitoring – when to listen, what to ignore and what to do about what you hear
Steve Richards, MD, Yomego, The Social Media Agency
    1530 How can Meltwater Buzz help monitor your brand?
    Managing your online social brand, including a client case study.
    Sara Davar, Area Director, Meltwater Buzz
1600 Effective use of social media for the teen market
    • Who are today’s teen market? How important is SM to them?  What are the usage patterns of teens in SM?
    • How safe is SM for the teen market, and what ethical concerns does this raise for SM providers and those using the platforms?
    • How important is gaming within social media for teens?
    • The future – what are the growth areas?
    Chris Seth, General Manager, Stardoll Media

    1620 Close of conference


  • Programme for International Student Assessment – Wikipedia, the free encyclopedia

    From Wikipedia, the free encyclopedia

     


     
Programme for International Student Assessment (2009)[1]
(Top 10; OECD members as of the time of the study in boldface)

Maths
1. Shanghai, China 600
2. Singapore 562
3. Hong Kong, China 555
4. South Korea 546
5. Taiwan 543
6. Finland 541
7. Liechtenstein 536
8. Switzerland 534
9. Japan 529
10. Canada 527

Science
1. Shanghai, China 575
2. Finland 554
3. Hong Kong, China 549
4. Singapore 542
5. Japan 539
6. South Korea 538
7. New Zealand 532
8. Canada 529
9. Estonia 528
10. Australia 527

Reading
1. Shanghai, China 556
2. South Korea 539
3. Finland 536
4. Hong Kong, China 533
5. Singapore 526
6. Canada 524
7. New Zealand 521
8. Japan 520
9. Australia 515
10. Netherlands 508

     

The Programme for International Student Assessment (PISA) is a worldwide evaluation of 15-year-old school pupils’ scholastic performance, first performed in 2000 and repeated every three years. It is coordinated by the Organisation for Economic Co-operation and Development (OECD), with a view to improving educational policies and outcomes. A similar study is the Trends in International Mathematics and Science Study (TIMSS), which focuses on mathematics and science but not reading.

     


     

     

Framework

    PISA stands in a tradition of international school studies, undertaken since the late 1950s by the International Association for the Evaluation of Educational Achievement (IEA). Much of PISA’s methodology follows the example of the Trends in International Mathematics and Science Study (TIMSS, started in 1995), which in turn was much influenced by the U.S. National Assessment of Educational Progress (NAEP). The reading component of PISA is inspired by the IEA’s Progress in International Reading Literacy Study (PIRLS).

PISA aims to test literacy in three competence fields: reading, mathematics and science.

    The PISA mathematics literacy test asks students to apply their mathematical knowledge to solve problems set in various real-world contexts. To solve the problems students must activate a number of mathematical competencies as well as a broad range of mathematical content knowledge. TIMSS, on the other hand, measures more traditional classroom content such as an understanding of fractions and decimals and the relationship between them (curriculum attainment). PISA claims to measure education’s application to real-life problems and life-long learning (workforce knowledge).

In the reading test, “OECD/PISA does not measure the extent to which 15-year-old students are fluent readers or how competent they are at word recognition tasks or spelling”. Instead, test-takers should be able to “construct, extend and reflect on the meaning of what they have read across a wide range of continuous and non-continuous texts”.[2]

     

Development and implementation

Developed from 1997, the first PISA assessment was carried out in 2000. The results of each assessment round take about a year and a half to be analysed. The first results were published in November 2001. The release of raw data and the publication of the technical report and data handbook took place only in spring 2002. The triennial repeats follow a similar schedule; seeing a single PISA cycle through, start to finish, always takes over four years.

Each assessment round focuses on one of the three competence fields of reading, math and science, but the other two are tested as well. After nine years, a full cycle is completed: after 2000, reading was again the main domain in 2009.

     

Period   Main focus     # OECD countries   # other countries   # students   Notes
2000     Reading        28                 4                   265,000      The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003     Mathematics    30                 11                  275,000      UK disqualified from data analysis. Also included a test in problem solving.
2006     Science        30                 27
2009     Reading        30                 33                  ?            Results made available on 7 December 2010.[3]

PISA is sponsored, governed, and coordinated by the OECD. The test design, implementation, and data analysis are delegated to an international consortium of research and educational institutions led by the Australian Council for Educational Research (ACER). ACER leads in developing and implementing sampling procedures and assists with monitoring sampling outcomes across participating countries. The assessment instruments fundamental to PISA’s reading, mathematics, science, problem-solving and computer-based testing, along with the background and contextual questionnaires, are likewise constructed and refined by ACER. ACER also develops purpose-built software to assist in sampling and data capture, and analyses all data. The source code of the data analysis software is not made public.

     

Method of testing

     

Sampling

The students tested by PISA are aged between 15 years and 3 months and 16 years and 2 months at the beginning of the assessment period. The school year pupils are in is not taken into consideration. Only students at school are tested, not home-schoolers. In PISA 2006, however, several countries also used a grade-based sample of students. This also made it possible to study how age and school year interact.

To fulfill OECD requirements, each country must draw a sample of at least 5,000 students. In small countries like Iceland and Luxembourg, where there are fewer than 5,000 students per year, an entire age cohort is tested. Some countries used much larger samples than required to allow comparisons between regions.

     

The test

     

     

    PISA test documents on a school table (Neues Gymnasium, Oldenburg, Germany, 2006)

     

     

    Each student takes a two-hour handwritten test. Part of the test is multiple-choice and part involves fuller answers. In total there are six and a half hours of assessment material, but each student is not tested on all the parts. Following the cognitive test, participating students spend nearly one more hour answering a questionnaire on their background including learning habits, motivation and family. School directors also fill in a questionnaire describing school demographics, funding etc.

In selected countries, PISA has also started experimenting with computer-adaptive testing.

     

National add-ons

    Countries are allowed to combine PISA with complementary national tests.

    Germany does this in a very extensive way: on the day following the international test, students take a national test called PISA-E (E=Ergänzung=complement). Test items of PISA-E are closer to TIMSS than to PISA. While only about 5,000 German students participate in both the international and the national test, another 45,000 take only the latter. This large sample is needed to allow an analysis by federal states. Following a clash about the interpretation of 2006 results, the OECD warned Germany that it might withdraw the right to use the “PISA” label for national tests.[4]

     

Data scaling

    From the beginning, PISA has been designed with one particular method of data analysis in mind. Since students work on different test booklets, raw scores must be scaled to allow meaningful comparisons. This scaling is done using the Rasch model of item response theory (IRT). According to IRT, it is not possible to assess the competence of students who solved none or all of the test items. This problem is circumvented by imposing a Gaussian prior probability distribution of competences.[5]

    One and the same scale is used to express item difficulties and student competences. The scaling procedure is tuned such that the a posteriori distribution of student competences, with equal weight given to all OECD countries, has mean 500 and standard deviation 100.
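A minimal sketch of the two steps just described, assuming a simplified one-parameter Rasch model and synthetic ability values (the function names and numbers are illustrative; the real PISA procedure, with plausible values, student weights and booklet designs, is considerably more involved):

    import numpy as np

    def rasch_probability(ability, difficulty):
        # One-parameter Rasch model: P(correct) is logistic in (ability - difficulty).
        return 1.0 / (1.0 + np.exp(-(ability - difficulty)))

    rng = np.random.default_rng(0)

    # Hypothetical latent abilities on the logit scale (e.g. as estimated by an IRT fit).
    abilities = rng.normal(loc=0.0, scale=1.0, size=10_000)

    # Rescale to the PISA-style reporting scale: mean 500, standard deviation 100.
    scaled = 500 + 100 * (abilities - abilities.mean()) / abilities.std()

    print(rasch_probability(ability=1.0, difficulty=0.0))  # about 0.73
    print(round(scaled.mean()), round(scaled.std()))       # 500, 100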

     

Results

     

Historical league tables

All PISA results are broken down by country. Public attention concentrates on just one outcome: mean achievement values by country. These data are regularly published in the form of “league tables”.

    The following table gives the mean achievements of OECD member countries in the principal testing domain of each period:[6]

    In the official reports, country rankings are communicated in a more elaborate form: not as lists, but as cross tables, indicating for each pair of countries whether or not mean score differences are statistically significant (unlikely to be due to random fluctuations in student sampling or in item functioning). In favorable cases, a difference of 9 points is sufficient to be considered significant.
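A back-of-envelope illustration of why roughly 9 points can clear that bar, assuming (hypothetically; these are not official PISA figures) that each country mean carries a standard error of about 3.2 points:

    import math

    def significant_difference(mean_a, mean_b, se_a, se_b, z_crit=1.96):
        # Two-sided z-test on the difference of two independent country means.
        se_diff = math.sqrt(se_a ** 2 + se_b ** 2)
        return abs(mean_a - mean_b) > z_crit * se_diff, se_diff

    is_significant, se_diff = significant_difference(527, 518, se_a=3.2, se_b=3.2)
    print(round(1.96 * se_diff, 1))  # about 8.9 points needed for significance
    print(is_significant)            # True: a 9-point gap just clears the threshold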

    In some popular media, test results from all three literacy domains have been consolidated in an overall country ranking. Such meta-analysis is not endorsed by the OECD. The official reports only contain domain-specific country scores. In part of the official reports, however, scores from a period’s principal testing domain are used as proxy for overall student ability.[7]

     

2000–2006

    Top results for the main areas of investigation of PISA, in 2000, 2003 and 2006.

     

2000 – Reading literacy
1. Finland 546
2. Canada 534
3. New Zealand 529
4. Australia 528
5. Ireland 527
6. South Korea 525
7. United Kingdom 523
8. Japan 522
9. Sweden 516
10. Austria 507
11. Belgium 507
12. Iceland 507
13. Norway 505
14. France 505
15. United States 504
16. Denmark 497
17. Switzerland 494
18. Spain 493
19. Czech Republic 492
20. Italy 487
21. Germany 484
22. Hungary 480
23. Poland 479
24. Greece 474
25. Portugal 470
26. Luxembourg 441
27. Mexico 422

2003 – Mathematics
1. Finland 544
2. South Korea 542
3. Netherlands 538
4. Japan 534
5. Canada 532
6. Belgium 529
7. Switzerland 527
8. Australia 524
9. New Zealand 523
10. Czech Republic 516
11. Iceland 515
12. Denmark 514
13. France 511
14. Sweden 503
15. Austria 506
16. Germany 503
17. Ireland 503
18. Slovakia 498
19. Norway 495
20. Luxembourg 493
21. Poland 490
22. Hungary 490
23. Spain 485
24. United States 483
25. Italy 466
26. Portugal 466
27. Greece 445
28. Turkey 423
29. Mexico 385

2006 – Science
1. Finland 563
2. Canada 534
3. Japan 531
4. New Zealand 530
5. Australia 527
6. Netherlands 525
7. South Korea 522
8. Germany 516
9. United Kingdom 515
10. Czech Republic 513
11. Switzerland 512
12. Austria 511
13. Belgium 510
14. Ireland 508
15. Hungary 504
16. Sweden 503
17. Poland 498
18. Denmark 496
19. France 495
20. Iceland 491
21. United States 489
22. Slovakia 488
23. Spain 488
24. Norway 487
25. Luxembourg 486
26. Italy 475
27. Portugal 474
28. Greece 473
29. Turkey 424
30. Mexico 410

     

     

     

2006

     
Programme for International Student Assessment (2006)
(OECD member countries in boldface)

Maths
1. Taiwan 549
2. Finland 548
3. Hong Kong 547
3. South Korea 547
5. Netherlands 531
6. Switzerland 530
7. Canada 527
8. Macau 525
8. Liechtenstein 525
10. Japan 523

Science
1. Finland 563
2. Hong Kong 542
3. Canada 534
4. Taiwan 532
5. Estonia 531
5. Japan 531
7. New Zealand 530
8. Australia 527
9. Netherlands 525
10. Liechtenstein 522

Reading
1. South Korea 556
2. Finland 547
3. Hong Kong 536
4. Canada 527
5. New Zealand 521
6. Ireland 517
7. Australia 513
8. Liechtenstein 510
9. Poland 508
10. Sweden 507

     

Top 10 countries for PISA 2006 results in Maths, Science and Reading.

     

2009

     
Programme for International Student Assessment (2009)[8]
(OECD members as of the time of the study in boldface)

Maths
1. Shanghai, China 600
2. Singapore 562
3. Hong Kong, China 555
4. South Korea 546
5. Taiwan 543
6. Finland 541
7. Liechtenstein 536
8. Switzerland 534
9. Japan 529
10. Canada 527
11. Netherlands 526
12. Macau, China 525
13. New Zealand 519
14. Belgium 515
15. Australia 514
16. Germany 513
17. Estonia 512
18. Iceland 507
19. Denmark 503
20. Slovenia 501
21. Norway 498
22. France 497
23. Slovakia 497
24. Austria 496
25. Poland 495
26. Sweden 494
27. Czech Republic 493
28. United Kingdom 492
29. Hungary 490
30. United States 487
65. Kyrgyzstan 331

Science
1. Shanghai, China 575
2. Finland 554
3. Hong Kong, China 549
4. Singapore 542
5. Japan 539
6. South Korea 538
7. New Zealand 532
8. Canada 529
9. Estonia 528
10. Australia 527
11. Netherlands 522
12. Liechtenstein 520
13. Germany 520
14. Taiwan 520
15. Switzerland 517
16. United Kingdom 514
17. Slovenia 512
18. Macau, China 511
19. Poland 508
20. Ireland 508
21. Belgium 507
22. Hungary 503
23. United States 502
24. Norway 500
25. Czech Republic 500
26. Denmark 499
27. France 498
28. Iceland 496
29. Sweden 495
30. Latvia 494
65. Kyrgyzstan 330

Reading
1. Shanghai, China 556
2. South Korea 539
3. Finland 536
4. Hong Kong, China 533
5. Singapore 526
6. Canada 524
7. New Zealand 521
8. Japan 520
9. Australia 515
10. Netherlands 508
11. Belgium 506
12. Norway 503
13. Estonia 501
14. Switzerland 501
15. Poland 500
16. Iceland 500
17. United States 500
18. Liechtenstein 499
19. Sweden 497
20. Germany 497
21. Ireland 496
22. France 496
23. Taiwan 495
24. Denmark 495
25. United Kingdom 494
26. Hungary 494
27. Portugal 489
28. Macau, China 487
29. Italy 486
30. Latvia 484
65. Kyrgyzstan 314

     

Top 30 countries for PISA 2009 results in Maths, Science and Reading. For a complete list, see reference.

     

Comparison with other studies

    The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics, 0.95 in science. The values go down to 0.66 and 0.79 if the two worst performing developing countries are excluded. Correlations between different scales and studies are around 0.80. The high correlations between different scales and studies indicate common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogenous underlying factor of cognitive competence. Western countries perform slightly better in PISA; Eastern European and Asian countries in TIMSS. Content balance and years of schooling explain most of the variation.[9]
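The country-level comparison behind these figures is simply a Pearson correlation between the two studies’ national mean scores; the sketch below uses invented numbers purely to show the calculation, not actual PISA or TIMSS results.

    import numpy as np

    # Invented national mean scores for the same set of countries in two studies.
    pisa_means = np.array([544, 532, 527, 516, 503, 483, 466, 423])
    timss_means = np.array([570, 541, 537, 530, 512, 504, 480, 429])

    # Pearson correlation of country means across the two studies.
    r = np.corrcoef(pisa_means, timss_means)[0, 1]
    print(round(r, 2))  # high, of the same order as the values reported above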

     

Topical studies

    An evaluation of the 2003 results showed that countries that spent more on education did not necessarily do better. Australia, Belgium, Canada, the Czech Republic, Finland, Japan, South Korea, New Zealand and the Netherlands spent less but did relatively well, whereas the United States spent much more but was below the OECD average. The Czech Republic, in the top ten, spent only one third as much per student as the United States did, for example, but the USA came 24th out of 29 countries compared.

    Another point made in the evaluation was that students with higher-earning parents are better-educated and tend to achieve higher results. This was true in all the countries tested, although more obvious in certain countries, such as Germany.

    It has been suggested that the Finnish language plays an important part in Finland’s PISA success.[10]

    International testing, including both PISA and TIMSS, has been a central part of many recent analyses of how cognitive skills relate to economic outcomes. These studies consider both individual earnings and aggregate growth differences of nations.[11]

In 2010, the 2009 Programme for International Student Assessment (PISA) results revealed that Shanghai students scored the highest in the world in every category (maths, reading and science), stunning educators. The OECD described Shanghai as a pioneer of educational reform, noting that “there has been a sea change in pedagogy”. The OECD also points out that Shanghai’s schools “abandoned their focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving.”[12]

     

Reception

    For many countries, the first PISA results were surprising; in Germany and the United States, for example, the comparatively low scores brought on heated debate about how the school system should be changed.[citation needed] Some headlines in national newspapers, for example, were:

• “La France, élève moyen de la classe OCDE” (France, average student of the OECD class), Le Monde, December 5, 2001

• “Are we not such dunces after all?”, The Times, United Kingdom, December 6, 2001

• “Economic Time Bomb: U.S. Teens Are Among Worst at Math”, Wall Street Journal, December 7, 2004

    The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for “Superman”.[13]

     

Research on causes of country differences

The organizers and researchers of PISA, TIMSS and PIRLS are restrained in giving reasons for the large and stable country differences; they cautiously leave this task to other researchers, especially from economics and psychology. Economic researchers have studied single educational policy factors such as central exams (John Bishop),[14] private schools, or streaming between schools at a later age (Hanushek/Woessmann).[15] An extensive literature on cross-country differences in scores has developed since 2000.[16]

The consistently good results of Finland have attracted a lot of attention. According to Hannu Simola,[17] it is not attributes of the educational system that account for these remarkable results, but rather the high discipline of students, the high status of teachers (which attracts good students to the teaching profession), the high quality of teachers due to professional teacher education, conservative direct instruction (“teaching ex cathedra”, “pedagogical conservatism”), low rates of immigration, fast diagnosis and treatment of learning problems (including special schools), and the culture of a small border country (as in Singapore and Taiwan) whose people know they can survive only with effort.

Systematic analyses across different paradigms (culture, genes, wealth, educational policies) for 78 countries were presented by Heiner Rindermann and Stephen Ceci.[18] They report positive relationships between student ability and the educational levels of adults, the amount and rate of preschool education, discipline, the quantity of institutionalized education, attendance at additional schools, early tracking, and the use of central exams and tests. Negative relationships were found with high repetition rates, late school enrollment and large class sizes. In their opinion, the results suggest that international differences in cognitive competence could be narrowed by reforms in educational policy.

     

Criticism

    Results from the three domains are closely correlated. Performing a factor analysis or a principal components analysis, one could easily construct an overall achievement scale and deduce a domain-independent country ranking. Such analysis, however, is not undertaken in the official reports—most likely to avoid PISA being interpreted as an intelligence test, which some claim it actually is.[19]
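A minimal sketch of that kind of analysis, on an invented score matrix rather than real PISA data: the first principal component of the three domain scores gives a single composite scale and hence a domain-independent ranking.

    import numpy as np

    # Rows are countries, columns are (reading, maths, science); values are invented.
    scores = np.array([
        [536, 541, 554],
        [524, 527, 529],
        [520, 529, 539],
        [500, 487, 502],
        [496, 497, 498],
        [486, 483, 489],
    ])

    centered = scores - scores.mean(axis=0)
    # First right-singular vector = loadings of the first principal component.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc1 = vt[0] * np.sign(vt[0].sum())   # fix the arbitrary SVD sign so higher = better
    overall = centered @ pc1             # one composite value per country
    ranking = np.argsort(-overall)       # country indices, best first
    print(overall.round(1))
    print(ranking)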

    Lynn and Meisenberg (2010) found very high correlations (r>0.90) between mean student assessment results from PISA, TIMSS, PIRLS and others and IQ measurements at the country data level.[20]

     

Luxembourg

Criticism has ensued in Luxembourg, which scored quite low, over the method used in its PISA test. Although Luxembourg is a trilingual country, the test was not allowed to be taken in Luxembourgish, the mother tongue of the majority of students.

     

Portugal

According to the OECD’s Programme for International Student Assessment (PISA), the average Portuguese 15-year-old student was for many years underrated and underachieving in terms of reading literacy, mathematics and science knowledge in the OECD, nearly tied with the Italian and just above those from countries like Greece, Turkey and Mexico. Since 2010, however, PISA results for Portuguese students have improved dramatically. The Portuguese Ministry of Education announced a 2010 report published by its office for educational evaluation, GAVE (Gabinete de Avaliação do Ministério da Educação), which criticized the PISA 2009 report and claimed that the average Portuguese teenage student had profound handicaps in terms of expression, communication and logic, as well as low performance when asked to solve problems. It also claimed that these shortcomings are not exclusive to Portugal but occur in other countries as well, due to the way PISA was designed.[21]

     

United States

Critics say that low performance in the United States is closely related to American poverty.[22][23] It has also been shown that, when adjusted for poverty, the richest areas in the US outperform every other country’s average scores, especially areas with less than 10% poverty (and even areas with 10% to 25% poverty outperform countries with similar rates).[23] In essence, the criticism is not so much directed against the Programme for International Student Assessment itself as against people who use PISA data uncritically to justify measures such as charter schools.[24]

It should be noted that the adjustment for poverty levels in the US done by “PISA: It’s Poverty Not Stupid” was not similarly done for the other countries being compared. This adjustment therefore compares the averages of the other countries with US areas at 10% to 25% poverty levels.

     


References

• ^ Official PISA site data. For the list see “Executive Summary”.
• ^ C. Füller: Pisa hat einen kleinen, fröhlichen Bruder. taz, 5 December 2007. [1]
• ^ The scaling procedure is described in nearly identical terms in the Technical Reports of PISA 2000, 2003 and 2006. It is similar to procedures employed in NAEP and TIMSS. According to J. Wuttke, Die Insignifikanz signifikanter Unterschiede (2007, in German), the description in the Technical Reports is incomplete and plagued by notational errors.
• ^ OECD (2001) p. 53; OECD (2004a) p. 92; OECD (2007) p. 56.
• ^ E.g. OECD (2001), chapters 7 and 8: influence of school organization and socio-economic background upon performance in the reading test. Reading was the main domain of PISA 2000.
• ^ M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008. [2]
• ^ Hanushek, Eric A. & Woessmann, Ludger (2008). The role of cognitive skills in economic development. Journal of Economic Literature, 46(3), 607–668.
• ^ Bishop, J. H. (1997). The effect of national standards and curriculum-based exams on achievement. American Economic Review, 87, 260–264.
• ^ Hanushek, E. A. & Woessmann, L. (2006). Does educational tracking affect performance and inequality? Differences-in-differences evidence across countries. Economic Journal, 116, C63–C76.
• ^ Hanushek, Eric A. & Woessmann, Ludger (2011). The economics of international differences in educational achievement. In E. A. Hanushek, S. Machin & L. Woessmann (eds.), Handbook of the Economics of Education, Vol. 3. Amsterdam: North Holland, 89–200.
• ^ Simola, H. (2005). The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education. Comparative Education, 41, 455–470.
• ^ Rindermann, H. & Ceci, S. J. (2009). Educational policy and country outcomes in international cognitive competence studies. Perspectives on Psychological Science, 4, 551–577.
• ^ Rindermann, H. (2007). The g-factor of international cognitive ability comparisons: The homogeneity of results in PISA, TIMSS, PIRLS and IQ-tests across nations. European Journal of Personality, 21, 667–706. [3]
• ^ Lynn, R. & Meisenberg, G. (2010). National IQs calculated and validated for 108 nations. Intelligence, 38, 353–360.

Further reading

     

Official websites and reports

      • OECD/PISA website (Javascript required)

         

      • OECD (2001): Knowledge and Skills for Life. First Results from the OECD Programme for International Student Assessment (PISA) 2000.
      • OECD (2003a): The PISA 2003 Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD, ISBN 978-92-64-10172-2 [5]
      • OECD (2004a): Learning for Tomorrow’s World. First Results from PISA 2003. Paris: OECD, ISBN 978-92-64-00724-6 [6]
      • OECD (2004b): Problem Solving for Tomorrow’s World. First Measures of Cross-Curricular Competencies from PISA 2003. Paris: OECD, ISBN 978-92-64-00642-3
      • OECD (2005): PISA 2003 Technical Report. Paris: OECD, ISBN 978-92-64-01053-6
      • OECD (2007): Science Competencies for Tomorrow’s World: Results from PISA 2006 [7]

     

     

About reception and political consequences

• General:
    • A. P. Jakobi, K. Martens: Diffusion durch internationale Organisationen: Die Bildungspolitik der OECD. In: K. Holzinger, H. Jörgens, C. Knill: Transfer, Diffusion und Konvergenz von Politiken. VS Verlag für Sozialwissenschaften, 2007.
• France:
    • N. Mons, X. Pons: The reception and use of Pisa in France.
• Germany:
    • E. Bulmahn [then federal secretary of education]: PISA: the consequences for Germany. OECD Observer, no. 231/232, May 2002, pp. 33–34.
    • H. Ertl: Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, vol. 32, no. 5, pp. 619–634, November 2006.
• United Kingdom:
    • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland. [8]

Criticism

• Books:
    • S. Hopmann, G. Brinek, M. Retzl (eds.): PISA zufolge PISA. PISA According to PISA. LIT-Verlag, Wien 2007, ISBN 3-8258-0946-3 (partly in German, partly in English)
    • T. Jahnke, W. Meyerhöfer (eds.): PISA & Co – Kritik eines Programms. Franzbecker, Hildesheim 2007 (2nd edn.), ISBN 978-3-88120-464-4 (in German)
    • R. Münch: Globale Eliten, lokale Autoritäten: Bildung und Wissenschaft unter dem Regime von PISA, McKinsey & Co. Frankfurt am Main: Suhrkamp, 2009, ISBN 9783518125601 (in German)

     

     

     

     

     

     
