Tuesday, December 11, 2012

5 ways to improve your career in 2013


Five ways to take control of your career in 2013

By Guest Contributor | December 11, 2012, 6:42 AM PST
Employees, particularly IT pros, can no longer afford to rely 100% on their employers to help develop their skills, talents, and competencies. But continuous development is critical if you want to get ahead in your career, let alone keep current with technology. Here are five practical ways you can take control of your career.

1. Start a performance journal

Do you regularly keep notes on the highs and lows of your performance? Now I don’t mean that weekly or monthly report you’re required to write for your manager. I mean making notes on things you observe about your day-to-day performance:
  • the accomplishments you’re most proud of
  • the things you consider failures or sub-par performance, and what you learned from them
  • the thank-yous and other forms of acknowledgement you receive
  • the criticisms you receive, justified or not
Collected, this information helps to give you perspective on your performance and potential, uncovering areas of strength and weakness. This kind of performance journaling will make your weekly/monthly reports, as well as your annual performance appraisal, easier to write. But more importantly, it will help you see your strengths, the things you’re passionate about, the areas where you need to develop, the stuff you hate, etc. Then you can use that information to drive your career development and progression.

2. Solicit feedback

Research shows that regular feedback improves performance. So if you want to take control of your career and improve performance, solicit feedback from others. Ask your manager, peers, and internal and external customers how you’re doing and what you could do to improve. Feedback expert Jamie Resker suggests you ask two key questions:
  • What’s one thing that I’m doing that’s working or that I should keep doing?
  • What’s one thing I could do to be more effective in my role?
The key here is to get information on one strength and one weakness and then tackle them. If you’re gathering feedback from multiple sources, identify the most commonly mentioned “thing” and then focus your efforts on leveraging your strength and addressing your weakness.

3. Complete a self-appraisal

Don’t treat your performance appraisal like a spectator sport! Your performance appraisal should be something you actively participate in. One of the best ways to do that, and to start a dialogue with your manager, is to complete a self-appraisal, whether your performance management process requires you to or not. A self-appraisal is a great way to take charge of your performance, progress and development. It gets you thinking about your performance, accomplishments, learning needs and career aspirations. You should not only complete a self-appraisal, you should share it with your manager before they complete their appraisal of you, to give them your perspective and open the conversation.

4. Draft your goals

Drafting your goals is another way you can easily take control of your career. Start by looking at your organization’s high-level goals and any department level goals. Then think about how you in your job, with your role and responsibilities, can contribute to achieving those. Draft goals that capture your contributions. If there are any projects on the go that you’re passionate about or would love to work on, draft a goal to do so. And think too about your own development and career progression. Are there stretch goals you’d like to take on that will help you acquire new skills, or broaden or deepen existing ones? You’ll of course need to negotiate all this with your manager, but taking the initiative to draft your goals first will help make sure you tackle work that will benefit your career.

5. Own your development

Lots of us still fall into the trap of waiting for our managers to assign us development activities. But your career and development are ultimately your responsibility, not your manager’s. So think about the knowledge, skills and/or experience you need to do your job better today. Then identify what areas you need development in to support your career progression (your performance journal notes can help you with this). Next, think about how you learn best: by seeing, hearing or doing? Do you learn better on your own, or do you need social stimuli? Once you’ve figured that out, it’s time to start identifying learning activities and opportunities. What can your company reasonably support you in doing, and what can you tackle on your own, outside of work? Then start learning, taking advantage of all the opportunities at your disposal: courses, conferences, e-learning, podcasts, blogs, newsletters, work assignments, job shadowing, mentors, volunteer work, cross functional teams, etc.
It’s your career. Make the most of it by being proactive about your performance, development and career progression. No one else will do it for you.
Sean Conrad has spent his career in IT, and now helps end-user organizations to successfully implement talent management software solutions. He writes about career development and management best practices for the Halogen Software blog.

Sunday, December 2, 2012

LESSONS IN COUNTRY PERFORMANCE IN EDUCATION (4)


Towards an index of education outputs

In addition to the Data Bank, an important goal of the Learning Curve project has been to create a comparative index of educational performance – the Global Index of Cognitive Skills and Educational Attainment. The results are meant not only to be interesting in themselves, but to help identify likely sources of good practice.
First, a caveat
The exercise has not been simple. One hurdle was determining how to measure performance. While it would have been desirable to include broader labour market and social outcomes on which education arguably has an impact, this proved impossible. Even were it demonstrably clear that education played a definite role in these areas, it is impossible to determine a way – consistent across time and geography – to isolate and measure that effect.
While more direct measures of educational results abound, robust, internationally comparative ones are rare. PISA, TIMSS and PIRLS testing has had such an impact in part because of the void it helped to fill. The Index therefore, through necessity, takes a view of educational performance based on where reasonably good data exist. The first such area, drawing on the results of the aforementioned tests, is the inculcation of cognitive skills. The second is a broader measure of educational attainment, which relies on literacy levels and graduation rates.
This focus does not eliminate data issues. Education systems are local: international comparability will never be perfect. Canada’s tertiary graduation rate, for example, is modest in the calculations for this Index because they draw only on university results. If one includes graduates from Canada’s community colleges – tertiary type-B institutions, to use the international classification – the graduation rate becomes one of the highest in the OECD. A lack of comparable data on type-B colleges, however, makes it impossible to include them generally. Moreover, the metrics selected for the Index suffer from data lacunae. Singapore’s low educational attainment score in the Index – 33rd out of 40 – arises largely from a complete lack of available data on graduation rates.[28] Finally, combining results from different tests in a meaningful way required rebalancing of the existing data.
Ultimately, these data are inevitably proxies for broader results, and far from perfect ones. As Dr Finn points out of graduation rates, "they are complicated. You can raise your graduation rate by lowering academic expectations.” On the other hand, such rates, like literacy levels, do indicate in a rough way the breadth of education in a country. Similarly, Professor Hanushek notes that “countries that do well on PISA do well on tests of deeper knowledge.”
The methodology appendix describes in more detail the Index’s construction and relevant data issues. The broader message of this lengthy disclaimer is that the Index is very much a first step. We hope that, as understanding of the outcomes of education grows, the Index will become more complex and nuanced as well as be populated with more robust and varied data. For now, however, it is better to light a candle than curse the statistical darkness.
What the leaders have – and don't have – in common
Given the attention paid to the results of international education tests, the leading countries in the cognitive skills category of the Index come as no surprise. The top five – Finland, Singapore, Hong Kong, South Korea and Japan – all score more than one standard deviation above the norm in this part of the Index. The educational attainment category, based on literacy and graduation rates, tells a slightly different story. Here South Korea leads, followed by the UK, Finland, Poland and Ireland, with Japan, Hong Kong and Singapore further down the table. Because of their strength in both measures, then, Finland and South Korea are the clear overall leaders of the Index.
Chart 9: Global Index of Cognitive Skills and Educational Attainment – overall results
Note: The Index scores are represented as z-scores. The process of normalising all values in the Index into z-scores enables a direct comparison of country performance across all the indicators. A z-score indicates how many standard deviations an observation is above or below the mean of the countries in the Index.
Source: Economist Intelligence Unit.
These results mirror the conventional wisdom: already in 2007, the BBC referred to the two countries as “among the superpowers of education.”[29] But what do these have in common that might help to identify the keys to educational success? On the face of it, there is remarkably little.
In many ways, it is hard to find two education systems more different. South Korea’s schools are frequently described as test-driven, with a rigid curriculum and an emphasis on rote learning. Most striking is the amount of time spent in study. Once the formal school day is over, the majority of students go to private crammer schools, or hagwons. According to OECD data, of 15-year-old students for whom data was available in 2009, 68% engaged in private study of the Korean language, 77% in mathematics, 57% in science and 67% in other subjects. In later school years, students typically do far more private study. The government has become so worried about the extent of these studies that it has banned hagwons from being open after 10pm, but it still needs to send out patrols to shut down those that mask illegal after-hours teaching by posing as self-study libraries.
On the other hand Finland, in the words of Professor Schwartz, “is a wonderful case study. Kids start school later; school hours are shorter than most others; they don’t assign homework; their teachers are in front of kids less. By one estimate, Italians go to school three years longer.” The PISA data shows that very few Finns take out-of-school lessons either, and those who do typically do worse on standardised tests, suggesting that this is largely remedial help. Finally, the system has a reputation for being focussed on helping children understand and apply knowledge, not merely repeat it.
The existing data also paint a picture of two distinct approaches. In some cases, the systems are widely different: average teacher salaries in South Korea are over twice the national average, while those in Finland are almost exactly average; pupil-teacher ratios, on the other hand, are much higher in South Korea. Where the two systems are similar, they are usually near the average for all countries in the Index. The only exception is school choice, where both are highly restrictive. That said, the vast amount of after-school private education in South Korea brings into question the relevance of that metric.
The two systems, though, do share some important aspects when examined closely. “When you look at both, you find nothing in common at first,” says Professor Schleicher, “but then find they are very similar in outlook.” One element of this is the importance assigned to teaching and the efforts put into teacher recruitment and training. As discussed above, the practices of the two countries differ markedly, but the status which teaching achieves and the resultant high quality of instruction are similar. Professor Schleicher adds that both systems also have a high level of ambition for students and a strong sense of accountability, but again these are “articulated differently. In South Korea, accountability is exam driven; in Finland, it is peer accountability, but the impact is very similar.”
Finally, there are cultural parallels. The two societies are highly supportive of both the school system itself and of education in general. Of course, other countries are also highly supportive of education, but what may set Finland and South Korea apart is that in both, ideas about education have also been shaped by a significant underlying moral purpose.
Although discussions of Korean attitudes to education frequently reference Confucian ideals, under a quarter of South Koreans were even literate by the end of the Korean War. In the decades that followed, education was not just about self-improvement: it was a way to build the country, especially as the Japanese colonial power had restricted the access of ethnic Koreans to schooling. The immediate cause of this drive has disappeared, but it has helped inculcate a lasting ethic of education which only strengthened the more widespread attitude in Asia that learning is a moral duty to the family and society as well as a necessary means of individual advancement.
In Finland, the ethos is different but no less powerful. As Mr Mackay explains, that country has made “a commitment as a nation to invest in learning as a way of lifting its commitment to equity. They wish to lift the learning of all people: it is about a moral purpose that comes from both a deeper cultural level and a commitment at a political-social level.” In other words, education is seen as an act of social justice.
Both of these moral purposes can cause difficulties in different ways. The high expectations and pressure mean that studies regularly find South Korean teenagers to be the least happy in the OECD. In Finland, the egalitarian system seems less effective at helping highly talented students to perform to the best of their ability than at making sure average results are high. Nevertheless, the power of these attitudes in shaping cultural norms and political decisions in ways that help educational attainment overall is undeniable. Mr Angula, after many years as a teacher, Minister of Education, and Prime Minister, believes that “the key ingredient [in creating a successful education system] is for everybody to be committed and to understand that they are doing a public good.”


[28] Singapore is one of 14 countries in the Index for which internationally comparable graduation data are lacking. (The countries were nonetheless included in the Index because they met all the other data inclusion criteria.) They were thus assigned the mean z-score of the entire country sample for the given graduation rate indicators. This represents an opportunity for further and improved data collection that will be reflected in later versions of the Learning Curve.
[29] “Finland stays top of global class”, 4 December 2007, http://news.bbc.co.uk/1/hi/7126562.stm

Conclusion and recommendations for further study

The lessons of the Index broadly reflect much which comes out of this study. The understanding of what inputs lead to the best educational outcomes is still basic, which is not surprising given that robust international benchmarking figures are few and often of recent date. Moreover, education remains an art, and much of what engenders quality is difficult to quantify.
General lessons to be drawn, then, are often still basic as well. Dr Finn says of studies looking at high-performing school systems, “I don’t detect many similarities other than high standards, solid curriculum, competent teachers and a supportive culture that is education-minded.” Other research might point to the importance of school choice and school autonomy.
These insights are valuable, but only up to a point. Education systems are local; so too are their problems and solutions. What Professor Hanushek says of improving autonomy and choice applies generally: “Local countries and institutions are extraordinarily important. Each country has its own system. It is difficult to take any of the specifics and apply them elsewhere.” In seeking those solutions, officials also need a dose of humility, remembering that formal education can do only so much. As Professor Woessmann notes, “a lot of these things [determinants of academic success] are not amenable to government action. They are really within families and how society operates.” Moreover, as the differing approaches of Finland and South Korea show, there are diverse paths to success.
While the local matters greatly, the universal still has an important contribution to make. This study, like others, ends with an appeal for more research. Both relatively straightforward work and more complex tasks lie ahead. The former includes the generation of basic information on inputs and outcomes in a number of countries; the assessment of a wider range of skills using standardised tests; and finding appropriate ways to compare dissimilar educational systems in various countries. The more complex challenges involve assessing the impact of culture on education and the value of different means of changing cultures; determining the attributes of those teachers that add the most value; and understanding in more detail how accountability and choice can interact in positive ways. Such studies might involve innovative new metrics, new approaches or both.
The other important plea is that what is known not be ignored. Too often, the world’s innumerable education reforms draw on assumptions and ideology rather than solid information. International comparisons of educational inputs and outputs have already awakened countries to their own strengths and deficiencies, as well as pointing toward possibly fruitful sources of solutions. The LCDB and Index are offered as tools toward furthering this understanding. It is hoped that they will be useful as researchers and analysts seek deeper and more nuanced insight in the years to come.

Appendix 1: methodology for the quantitative component of The Learning Curve

As part of the Learning Curve programme, the Economist Intelligence Unit (EIU) undertook a substantial quantitative exercise to analyse nations' educational systems’ performance in a global context. The EIU set two main objectives for this work: to collate and compare international data on national school systems’ outputs in a comprehensive and accessible way, and for the results to help set the editorial agenda for the Learning Curve programme.
The EIU was aided by an Advisory Panel of education experts from around the world. The Panel provided advice on the aims, approach, methodology and outputs of the Learning Curve’s quantitative component. The Panel’s feedback was incorporated into the research to ensure the highest level of quality.
The EIU developed three outputs as part of the quantitative component of the Learning Curve. These are an exhaustive data bank of high quality national education statistics, an index measuring national cognitive skills and educational attainment, and research on correlations between educational inputs, outputs and wider society. Each is described in more detail below.
Learning Curve Data Bank
The Learning Curve Data Bank (LCDB) provides a large, transparent and easily accessible database of annual education inputs and outputs and socio-economic indicators on 50 countries (and one region – Hong Kong) going back to 1990 when possible. It is unique in that its aim is to include data that are internationally comparable. The user can sort and display the data in various ways via the website that accompanies this report.
Country selection
Country selection to the Data Bank was on the basis of available education input, output and socio-economic data at an internationally comparable level. A particularly important criterion was participation in the international PISA and/or TIMSS tests. Forty countries (and Hong Kong) were included as 'comprehensive-data' countries within the Data Bank, and ten countries as 'partial-data' countries, according to availability of data.
Indicator selection
The EIU's aim was to include only internationally comparable data. Wherever possible, OECD data or data from international organisations was used to ensure comparability. For the vast majority of indicators, the EIU refrained from using national data sources, and when possible, used inter- and extrapolations in order to fill missing data points. Different methods for estimations were used, including regression when found to be statistically significant, linear estimation, averages between regions, and deductions based on other research. The source for each and every data point is cited in the Data Bank. The data were last collected and/or calculated in September 2012.
Over 60 indicators are included, structured in three sections: inputs to education (such as education spending, school entrance age, pupil teacher ratio, school life expectancy, teacher salaries, among others), outputs of education (such as cognitive skills measured by international tests such as PISA, literacy rates, graduation rates, unemployment by educational attainment, labour market productivity, among others) and socio-economic environment indicators (social inequality, crime rates, GDP per capita, unemployment, among others). The Data Bank’s indicators were used to create the Index and conduct a correlations exercise.
Global Index of Cognitive Skills and Educational Attainment
The Global Index of Cognitive Skills and Educational Attainment compares the performance of 39 countries and one region (Hong Kong is used as a proxy for China due to the lack of test results at a national level) on two categories of education, cognitive skills and educational attainment. The index provides a snapshot of the relative performance of countries based on their education outputs.
Country and indicator selection
For data availability purposes, country selection to the Index was based on whether a country was a 'comprehensive-data' country within the Data Bank. Guided by the Advisory Panel, the EIU’s goal in selecting indicators for the Index was to establish criteria by which to measure countries’ output performance in education. Initial questions included: What level of cognitive skills are national education systems equipping students with, and how are students performing on internationally comparable tests at different ages? What are levels of reading, maths and science in these countries? How successful are national education systems at attaining a high level of literacy in the population? How successful are national education systems at educating students to secondary and tertiary degree level?
Based on this set of questions, the EIU chose objective quantitative indicators, grouping them into two groups: cognitive skills and educational attainment. For cognitive skills, the Index uses the latest reading, maths and science scores from PISA (Grade 8 level), TIMSS (Grade 4 and 8) and PIRLS (Grade 4). For educational attainment, the Index uses the latest literacy rate and graduation rates at the upper secondary and tertiary level. Data for some countries were more recent than for others; when a country’s latest available data point was five years older than the most recent round, the EIU chose not to include it, although this was very rarely found to be an issue.
The EIU made estimations when no internationally comparable data were available. For example, a number of countries’ Grade 8 TIMSS Science scores were estimated by regression with PISA Science scores, when the regression was found to be statistically significant. In addition, when OECD data were not available for graduation rates, national ministry or statistics bureau data were sanity-checked and then used if deemed internationally comparable.
Calculating scores and weightings
In order to make indicators directly comparable across all countries in the Index, all values were normalised into z-scores. This process enables the comparison and aggregation of different data sets (on different scales), and also the scoring of countries on the basis of their comparative performance. A z-score indicates how many standard deviations an observation is above or below the mean. To compute the z-score, the EIU first calculated each indicator’s mean and standard deviation using the data for the countries in the Index, and then the distance of the observation from the mean in terms of standard deviations.
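As a rough illustration of this normalisation step, the sketch below (Python, with invented country names and scores; the report itself publishes no code) converts raw indicator values into z-scores in the way described above.

```python
# Illustrative sketch only: the z-score normalisation described in the text.
# Country names and scores are invented; the EIU's real inputs come from the
# Learning Curve Data Bank.
import statistics

def z_scores(values):
    """Express each value as its distance from the sample mean,
    measured in standard deviations."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)  # sample SD; the report does not say which variant was used
    return [(v - mean) / sd for v in values]

raw = {"Country A": 540, "Country B": 510, "Country C": 495,
       "Country D": 520, "Country E": 470}
normalised = dict(zip(raw, z_scores(list(raw.values()))))
print(normalised)  # Country A ends up roughly 1.25 standard deviations above the sample mean
```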
The overall index score is the weighted sum of the underlying two category scores. Likewise, the category scores are the weighted sum of the underlying indicator scores. As recommended by the Advisory Panel, the default weight for the Index is two-thirds to cognitive skills and one-third to educational attainment. Within the cognitive skills category, the Grade 8 tests’ score accounts for 60% while the Grade 4 tests’ score accounts for 40% (Reading, Maths and Science all account for equal weights). Within the educational attainment category, the literacy rate and graduation rates account for equal weights. The user can, however, change the weightings and recalculate scores according to personal preference via the website that accompanies this report.
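To make the weighting scheme concrete, here is a minimal sketch of the default aggregation described above. It assumes indicator scores are already z-scores, glosses over how scores from different tests at the same grade level are combined, and assumes the three educational attainment indicators are weighted equally (the report's phrasing could also be read as weighting literacy against the graduation rates as a pair); all figures are invented.

```python
# Minimal sketch of the default Index weighting described in the text.
# Inputs are assumed to be z-scores; all numbers below are invented.

def cognitive_score(grade8_z, grade4_z):
    # Grade 8 tests carry 60%, Grade 4 tests 40%; reading, maths and
    # science are weighted equally within each grade level.
    return 0.6 * sum(grade8_z) / len(grade8_z) + 0.4 * sum(grade4_z) / len(grade4_z)

def attainment_score(literacy_z, upper_secondary_grad_z, tertiary_grad_z):
    # Assumption: literacy and the two graduation rates weighted equally.
    return (literacy_z + upper_secondary_grad_z + tertiary_grad_z) / 3

def overall_index(cognitive, attainment, w_cognitive=2/3, w_attainment=1/3):
    # Default weights: two-thirds cognitive skills, one-third attainment.
    # The accompanying website lets users vary these and recalculate.
    return w_cognitive * cognitive + w_attainment * attainment

cog = cognitive_score(grade8_z=[1.1, 0.9, 1.0], grade4_z=[0.7, 0.8, 0.6])
att = attainment_score(literacy_z=0.5, upper_secondary_grad_z=0.3, tertiary_grad_z=0.4)
print(round(overall_index(cog, att), 2))  # 0.72 for this invented country
```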
____________________________________________________________________________
Areas for caution
Because indexes aggregate different data sets on different scales from different sources, building them invariably requires making a number of subjective decisions. This index is no different. Each 'area for caution' is described below.
Z-scores for PISA, TIMSS and PIRLS
It is important to note that, strictly speaking, the z-scores for PISA, TIMSS and PIRLS are not directly comparable. The methodology applied both by the OECD and the International Association for the Evaluation of Educational Achievement (IEA) to calculate the performance of the participating countries consists of comparing the performance of the participating countries to the respective mean performance. (The countries’ ‘raw’ test scores before normalisation are not published; just their scores in comparison to the other participants.) Thus, which countries participate in each test and how well they perform in comparison to the other participants has a direct impact on the resulting final scores. Given that the sample of countries that take the PISA, TIMSS and PIRLS tests are not exactly the same, there are limitations to the comparability of their scores.
The EIU has chosen not to change these scores to account for this lack of direct comparability; however, it did consider other options along the way. The main alternative suggestion from the Advisory Panel was to use a pivot country in order to transform the z-scores of other countries in comparison to that pivot country’s z-score. Although this method is used in some studies, after substantial consideration, the EIU decided not to employ this method for the purpose of an index. The resulting z-scores after transformation depend heavily on the choice of pivot country; choosing one country as a pivot over another affects countries’ z-scores quite substantially. The EIU did not feel it was in a position to make such a choice. Despite these limitations to test scores’ direct comparability, the EIU believes that the applied methodology is the least invasive and most appropriate to aggregate these scores.
Graduation rate data
Some members of the Advisory Panel questioned the use of graduation rates in the Index, in that it is not clear whether they add value as a comparative indicator of education performance. Unlike test results and literacy rates, standards for gaining an upper secondary or tertiary degree do differ across countries. Nonetheless, the EIU believes that graduation rates do add value in evaluating a national educational system's performance, as there is common acceptance that national education systems should aim for their citizens to gain educational qualifications, especially at the secondary level. Including graduation rate data in the Index therefore rewards countries that have put this aim into practice, albeit at varying levels of quality.
Because of the variation in how countries measure graduation rates, the EIU followed the Panel's suggestion in using OECD graduation rate data, which use one main definition. When OECD data were not available, national ministry or statistics bureau data were sanity-checked and then used if deemed comparable. In some cases, no data on graduation rates were available. In this case, the EIU awarded the country the mean score for this indicator. One disadvantage of giving a country the mean score is that if in reality it performs worse than the average in this indicator, the Index boosts its score, and vice versa.
The EIU used the most recent data available. Because graduation rates are based on the pattern of graduation existing at the time, they are sensitive to changes in the educational system, such as the addition of new programmes or a change in programme duration. As an extreme example, Portugal’s upper secondary graduation rate, which ranged between 50% and 65% from the early 2000s to 2008, jumped to 104% in 2010 as a result of the government’s “New Opportunities” programme, launched to provide a second chance for individuals who left school early without a secondary diploma. In order to treat countries consistently, the Index takes the 2010 figure. Although this inflates Portugal’s score in this indicator, this inflation should eventually fall out of the Index should it be updated on an annual or bi-annual basis. Given the limitations of graduation rate data, the EIU followed the Panel's suggestion of giving a smaller weighting (one-third) to educational attainment.
It is also important to note that the tertiary graduation rate indicator covers only tertiary-type A programmes. Tertiary-type B programmes are not included. This methodology was chosen largely because not all countries collect data and organise their education systems along the lines of A and B. As per the OECD, tertiary-type A programmes are largely theory-based and are designed to provide qualifications for entry into advanced research programmes and professions with high requirements in knowledge and skills. These programmes are typically delivered by universities, and their duration ranges from three to five years, or more at times. Tertiary-type B programmes are classified at the same academic level as those of type A, but are often shorter in duration (usually two to three years). They are generally not intended to lead to further university-level degrees, but rather to lead directly to the labour market.
Although excluding tertiary-type B programmes makes for a more relevant comparison among countries, it also slightly disadvantages a number of countries that have particularly high type B graduation rates (as these rates are not included). These countries are Canada, Ireland, Japan and New Zealand. Nonetheless, this exclusion has a limited impact on these countries’ ranking in the Index.
Other indicators
The EIU had wanted to include other education performance indicators in the Index, such as how well national education systems prepare students for the labour market and the performance of vocational studies. However, data availability was a limiting factor. The EIU found that sufficient data were not available to isolate the contribution of educational attainment to labour market outcomes, and internationally comparable data on vocational studies covering all countries in the Index were not readily available either.
___________________________________________________________________________

Correlations
Using data for the ‘comprehensive-data’ countries from the Data Bank, a correlations exercise was undertaken in order to test relationships across countries between education inputs, outputs and wider society. The EIU tested for correlations between the inputs to and outputs of education, the inputs to education and socio-economic environment indicators (as a proxy for wider society), and the outputs of education and socio-economic environment indicators.
Definition of a correlation and thresholds used
The correlation coefficient is a measure of the degree of linear relationship between two variables. While in regression the emphasis is on predicting one variable from the other, in correlation the emphasis is on the degree to which a linear model may describe the relationship between two variables. Importantly, the presence of a correlation does not imply causality.
In order to ensure that relationships being found were indeed strong, the EIU looked for at least a 0.65 level of correlation (the higher it is, the stronger the relationship). It is important to acknowledge that some social science research uses a lower level of correlation, but the EIU wished to maintain a high level to avoid finding relationships between indicators that might not be significant.
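As a minimal sketch of this screen (with invented data; the actual indicator pairs come from the Data Bank), the Pearson coefficient and the 0.65 cut-off could be applied as follows. Treating the threshold as a test of magnitude is an assumption on our part.

```python
# Minimal sketch: Pearson's r and the 0.65 "strong relationship" cut-off.
# The data below are invented for illustration.
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

THRESHOLD = 0.65  # assumption: correlations judged by magnitude

# Hypothetical cross-country pairing of an input and an output indicator
school_years = [11.2, 12.5, 13.0, 14.1, 15.3, 16.0]
productivity = [38, 45, 52, 60, 71, 80]
r = pearson_r(school_years, productivity)
print(f"r = {r:.2f}; strong = {abs(r) >= THRESHOLD}")
```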
Calculating correlations
Correlation tests were conducted on an indicator-by-indicator basis, between two variables over time (on an annual basis) and at three-year growth rates (for example, the three-year growth rate of 1999 (1996-99) against the three-year growth rate of 2007 (2004-07)). For the latter tests, adjustments were made to include TIMSS and PIRLS tests even though these are not taken every 3 years (they are taken every four and five years respectively). The EIU used the same time lags across countries on the same indicator, as per the Panel’s suggestions.
When looking for evidence of a strong correlation, the EIU sought a strong relationship over time. For example, even if there were evidence of a strong correlation between one input variable in 1990 and an output variable in 2005, a strong level of correlation would also need to be found for 1991 and 2006, 1992 and 2007, and so on, for at least a number of years. In addition, correlation tests were only run if there were at least 15 countries with relevant data for both of the indicators being assessed.
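The sketch below illustrates that persistence requirement and the growth-rate variant of the tests. The helper names, the data layout (country mapped to a year-value series) and the fixed lag are assumptions for illustration only, not the EIU's actual implementation.

```python
# Illustrative sketch of the persistence check and three-year growth rates.
# Data layout (country -> {year: value}) and helper names are assumptions.
from statistics import correlation  # Pearson's r (Python 3.10+)

def three_year_growth(series, year):
    """Growth over the three years ending in `year`, e.g. 1996-99 for 1999.
    For the growth-rate tests, each series would be transformed with this first."""
    return series[year] / series[year - 3] - 1.0

def persistent_strong_correlation(inputs, outputs, start_year, lag,
                                  n_pairs, threshold=0.65):
    """A relationship only counts if the correlation stays strong across
    several consecutive year pairs (e.g. 1990 vs 2005, 1991 vs 2006, ...)."""
    countries = sorted(set(inputs) & set(outputs))
    if len(countries) < 15:  # minimum country coverage used by the EIU
        return False
    for i in range(n_pairs):
        xs = [inputs[c][start_year + i] for c in countries]
        ys = [outputs[c][start_year + i + lag] for c in countries]
        if abs(correlation(xs, ys)) < threshold:
            return False
    return True
```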
Factors affecting the correlations
The EIU did not find a great number of strong relationships. Given the complexity of education, this was not totally surprising. However, other factors may also account for the lack of correlations. For one, not all indicators were available going back 15-20 years in time. There was also a lack of data availability for some countries (some of this due to the Data Bank’s focus on ensuring that data being used were internationally comparable). Finally, other qualitative factors that are difficult to measure, such as culture and the quality of teaching, were not included in the Data Bank. These factors may have a significant impact on education outputs, but the EIU was not able to take these into account within the correlations exercise.

LESSONS IN COUNTRY PERFORMANCE IN EDUCATION (3)


School choice and accountability: caveat scholasticus

The choice debate
In the English city of Guildford in 2011, every final-year student in the Royal Grammar School earned at least three A-levels, the highest secondary-school subject qualification. The equivalent figure for the city’s Kings College for the Arts and Technology was just 69%. Neither figure was a surprise, nor is such variation exceptional. In most places, it is simply accepted that specific schools – like individual teachers – have different results which tend to persist over time. A natural conclusion is that giving parents, and through them students, the ability to choose better performing schools should lead to better outcomes.
Unfortunately, this issue is far more complex and not just because of the range of systems through which choice operates across the world – including both publicly and privately funded options. Whatever their specific strong and weak points, all these arrangements need accurate information. Getting it wrong can be harmful. A study of Beijing parental selection of primary schools found that excessive optimism about place availability at better schools led parents to use up application choices on schools that were already full. Less optimistic parents snapped up places at the next tier of schools, leaving only markedly worse ones for the children of those making the initial mistake.[10] On the other hand, researchers in North Carolina found that better, clearer information on local schools increased the number of low-income parents taking advantage of school choice, and that the children so placed performed better.[11] As in any quasi-market, for choice to work, schools have to reveal how well they are doing: choice and accountability must go hand in hand.
Chart 5: School choice in selected countries, aggregated score, 2009

Note: The score, which is on a scale of 0 to 1, is an aggregate of the following indicators: enrolment choices (freedom of enrolment choice at primary and lower secondary education), the level of school choice (% of pupils living in an area with more than two schools), parental expectations, and financial choice and information (availability of school vouchers and government responsibility for informing parents on school choices at primary and lower secondary level).
Sources: Economist Intelligence Unit and OECD.
Any accountability system, however, requires some decision on what should be measured. Demographic differences between the children in the two Guildford schools above might explain the gap in results far better than the education provided. Mr Cappon notes of Canada: “Social class and school choice tend to go together.” Indeed, much of the choice and accountability debate continues because such other issues cloud the picture.
Recent research suggests that, at the system-wide level, the potential for informed choice helps raise educational outcomes and reduces costs. In particular, a cross-country comparison of the number of private, often faith-based, schools – an indication of the degree of choice – with the 2003 PISA results found that, even after controlling for other factors, “the share of schools that are privately operated has an economically and statistically significant positive effect on student achievement in mathematics, science, and reading.”[12] The benefits were greater than average for students with a lower socio-economic status where such private schools were publicly funded, as in Belgium and the Netherlands. Professor Woessmann, one of the authors, explains: “If there is more choice for parents, and more non-governmental school operators so that schools are not managed by one big state monopoly, countries perform much better.”
How this choice drives the system to better results in practice, however, is a matter of no little debate. Indeed, any discussion involving market-like mechanisms and education inevitably leads to contentious, often politicised, debate. Unfortunately, the resultant heat has shed little consistent light.
Vouchers and charter schools
Some of the most investigated choice initiatives operate in the US. Voucher programmes provide funding – generally assigned by lottery as the programmes are almost invariably oversubscribed – that pays for the private education of underprivileged children. A 2008 review by Patrick Wolf, Professor of School Choice at the University of Arkansas, looked at the 10 best studies of these programmes and found widely varying results.[13] In general, all or some students who used vouchers did better academically in certain fields, especially maths. A more recent study by Mr Wolf of the long-standing Milwaukee voucher system brought further variability: voucher students there outdid peers in reading but underperformed in maths.[14]
The impact of such programmes on abilities tends to be unpredictable, but that may not be the point. Parents are almost invariably satisfied with them, although perhaps for reasons quite apart from grades. Given the public options available to some of these students, physical safety is an issue: one study found no academic differences for voucher users, but they did have lower arrest rates.[15]
Another possible impact of choice is to create competition so that all schools improve, especially where they are made to give data on results. Debate on the extent to which this has taken place and whether competition was the driver of perceived change is also on-going.[16] The one clear point is that vouchers, and choice, do not seem to hurt existing school systems.[17]
A more widespread US experiment in using choice and accountability to improve education has been the growth of charter schools. These autonomous, privately-run but publicly-funded schools open to all students – capacity permitting – exist in 41 states. In return for autonomy, these institutions are made accountable. Charters are granted with binding requirements to achieve certain levels of academic success among students.
As with vouchers, the success of charter schools as a whole is the focus of intense debate. The largest review to date of research presents a mixed picture. The Center for Research on Education Outcomes looked at research from 15 American states and the District of Columbia. It found that, on average, students in these schools tended to do slightly worse than those in nearby public schools. But the broader message was variety: 17% of charter schools do better, 46% are just as good, and 37% do worse. Moreover, the success of the schools depends on the way they are regulated. Roughly even numbers of states had schools where students on average did better than in traditional schools and schools where students did worse.[18]
Chart 6: School responsibility and autonomy, average score, 2009

Note: The score is the average of 'index of responsibility for resources allocation' and 'index of responsibility for curriculum and assessment'. These indexes have an OECD mean of 0 and a standard deviation of 1. Positive values on these indexes indicate relatively more responsibility for schools than the local, regional or national education authority.
Sources: Economist Intelligence Unit and OECD.
Dr Finn believes that greater autonomy and accountability are needed within US schools, but he also remarks that “one of the sobering lessons of the last 15 years is that hanging a sign with the word 'charter' in it on the front door does not make it a better school. In any state, some of the best and worst schools are charter schools, except perhaps in Massachusetts because it only gave charters to people who knew what they were doing.”
Indeed, the wide variety is probably a predictable result of how these schools provide value. According to Professor Stecher, “the strength of charter schools seems to be that they permit innovation outside of bureaucracy, for good or for ill. The movement needs to be accompanied by careful monitoring to protect the welfare of kids, but it is leading to some really interesting opportunities and models of reform.” He cites Aspire Public Schools, a California non-profit charter school system that, even though three-quarters of students come from impoverished families, had average scores that exceeded the state’s overall mean by more than 5%.
The broader lesson seems to be an obvious one. In the words of a study by Harvard academics, “school choice can improve students’ longer-term life chances when they can gain access to schools that are better....”[19] The key, as in any market situation, is deciding which ones are better: sometimes choice means opting for existing provision, but this does not negate its value.
School choice in developing countries
Where such provision is poor, however, choice and accountability can be essential. James Tooley, Professor of Education Policy at Newcastle University, has done extensive research into the huge number of unofficial private schools used by economically underprivileged students in developing countries. In many cases, rather than trusting state provision, families are willing to spend often a substantial part of their income to send children to these unregistered schools. The reason is simple: parents know that education is important but public provision is sub-standard or illusory. Professor Tooley ascribes parents' decisions in this area to their mistrust of state-school teachers, who are accused of absenteeism, poor teaching habits and poor attitudes toward students themselves.
As with any unofficial activity, it is hard to assess its full scope. Professor Tooley notes that the best data from India shows around a quarter attending private schools in rural areas, and other research indicates around 65-70% do so in urban areas. He therefore estimates the overall total at around 40% or more – a figure consistent with his own, less detailed research in communities in Ghana, Kenya, and Nigeria.[20]
These schools exist because they provide results: Professor Tooley’s research in a variety of locations has found significantly better reading, mathematics, and English skills. Similarly, World Bank-supported researchers from the Learning and Educational Achievement in Punjab Schools (LEAPS) project found that in that Pakistani province, students in such private schools were on average 1.5 to 2.5 years ahead of counterparts in government schools, even though the latter spent three times as much per pupil.[21]
What makes these private schools so much more effective is not immediately clear, says Professor Tooley. They typically have fewer resources, class sizes vary widely and often the teachers are not as well trained or do not have as much teaching experience. He concludes that “there is a missing ingredient [from public schools that exists] in private schools. It must be accountability. The teachers have to teach, otherwise they get removed; the schools need to please parents.”
The extreme situation faced by these parents gives the same message as the correlation between PISA outcomes and private-school numbers: choice and accountability can have an important impact on results. On the other hand, the experience of school choice in the US shows that the way these mechanisms work are complex, require parents to have as much information as possible and can penalise wrong choices as much as reward right ones. Rio de Janeiro’s Ms Costin points out, however, that the effort needed to bring in parents is worth it even in the poorest areas: “They are not second-class citizens. Their opinion is important. Parents know which school is a good school. Social pressure for quality can be exerted even by illiterate parents."


[10] Fang Lai, Elisabeth Sadoulet, Alain de Janvry, “The Adverse Effects of Parents' School Selection Errors on Academic Achievement: Evidence from the Beijing Open Enrollment Program”, Economics of Education Review (2009) v28 n4: 485-496.
[11] Justine S. Hastings and Jeffrey M. Weinstein, “Information, School Choice, and Academic Achievement: Evidence from Two Experiments”, The Quarterly Journal of Economics, (2008): 1373-1414.
[12] Ludger Woessmann and Martin West, “Competition from private schools boosts performance system-wide”, Vox, http://www.voxeu.org/article/competition-private-schools-boosts-performance-system-wide.
[13] “School Voucher Programs: What the Research Says About Parental School Choice”, Brigham Young University Law Review, (2008): 415-446.
[14] The Comprehensive Longitudinal Evaluation of the Milwaukee Parental Choice Program: Summary of Final Reports, February 2012, http://www.uark.edu/ua/der/SCDP/Research.html.
[15] Julie Berry Cullen, Brian A. Jacob, and Steven Levitt, “The Effect of School Choice on Participants: Evidence From Randomized Lotteries”, Econometrica, (2006), 74: 1191–1230.
[16] See: Caroline Hoxby, School Choice and School Productivity: (Or Could School Choice Be A Tide That Lifts All Boats?), 2002, NBER Working Paper 8873, an influential article advocating this argument, http://www.nber.org/papers/w8873; Greg Forster, A Win-Win Solution: The Empirical Evidence on School Vouchers, 2011; David N. Figlio and Cecilia Elena Rouse, Do Accountability and Voucher Threats Improve Low-performing Schools?, 2005, NBER Working Paper 11597.
[17] Research on voucher programmes in Chile has produced similarly contrasting results to those in America (Francisco Gallego, “School Choice, Incentives, and Academic Outcomes: Evidence for Chile”, paper 39, Econometric Society 2004 Latin American Meetings; Chang-Tai Hsieh and Miguel Urquiola, “The effects of generalized school choice on achievement and stratification: Evidence from Chile’s voucher program”, Journal of Public Economics (2006) 90: 1477–1503).
[18] Center for Research on Education Outcomes, Multiple Choice: Charter School Performance in 16 States, June 2009.
[19] David Deming, Justine Hastings, Thomas Kane, and Douglas Staiger, School Choice, School Quality and Postsecondary Attainment, 2011, NBER Working Paper 17438.
[20] It should be noted for disclosure purposes that Pearson, who commissioned this report from the Economist Intelligence Unit, is a minority investor in a chain of schools in Ghana co-founded by James Tooley.
[21] James Tooley, Yong Bao, Pauline Dixon, John Merrifield, “School Choice and Academic Performance: Some Evidence From Developing Countries,” Journal of School Choice, 2011, 5: 1–39; Baladevan Rangaraju, James Tooley, Pauline Dixon, The Private School Revolution in Bihar: Findings from a survey in Patna Urban, 2012; World Bank, Learning and Educational Achievement in Punjab Schools Report Summary, 2008.


Returns to schooling: education, labour and social outcomes

The individual benefits
On a personal level, education is good for you – literally. In most countries, levels of academic attainment correlate with life expectancy, and some research suggests that this link is causal rather than coincidental.[22] Other apparent personal benefits statistically related to time spent in education include, according to one extensive literature review, promoting better decisions on “marriage, and parenting. It also improves patience, making individuals more goal-oriented and less likely to engage in risky behaviour.”[23] For some, learning itself is fun.
The most researched aspect of personal gains from education is the economic one, referred to as the returns to schooling. Since Gary Becker published Human Capital in the mid-1960s, a host of studies have calculated the financial benefit in various countries of time spent in school. These typically reveal a gain in annual earnings of between 8% and 10% for every additional year of education.
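These per-year estimates typically come from some variant of the Mincer earnings equation, the workhorse specification of this literature (the equation itself is not given in the text and is added here for illustration):

\ln w_i = \alpha + \beta S_i + \gamma_1 X_i + \gamma_2 X_i^2 + \varepsilon_i

where w_i is an individual’s earnings, S_i years of schooling and X_i labour-market experience. An estimated \beta of roughly 0.08 to 0.10 corresponds to the 8-10% gain in annual earnings per additional year of education cited above.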
It is not, however, straightforward to use such insight in order to improve a country’s average earning potential. Education may not even be the cause of individual higher wages: instead it could be that educational success signals to employers the presence of other valuable qualities. Moreover, returns to education vary, on occasion widely, in a number of ways. For one thing, they tend to be higher in less developed countries. In wealthier nations benefits tend to accrue more at the tertiary level, while in poorer ones they have been shifting from the primary to the secondary level. Time in school beyond that required for the occupation which the student eventually takes up – known as “over-education” – yields substantially lower returns. Results also differ by geography, or even city, within countries, and often also between genders.
Just why these differences appear is not always clear, but simply keeping everyone in the black box of education a few years longer will not yield magic results. Above all, the quality of education matters: one World Bank study suggested that the apparent decline in the returns to primary education in developing countries may arise from the length of time it takes to teach even basic literacy and numeracy in a number of those countries.[24] 
Getting the best at national level
Good education may, in most cases, help the individuals being educated, but does it help their society as well? A substantial literature sees behavioural impacts on educated individuals that have positive societal impact – for liberal democracies at least – including, to name just a few, better health for the relatives of those educated, lower arrest rates, higher voter participation and even a greater tendency to support free speech.[25]
In considering country-level benefits, the more common area of study has also been economic. On a basic level, education helps. Our correlation analysis shows a strong link between average years in school – or school life expectancy – and labour productivity. This does not surprise Namibia's Mr Angula: “A well-educated nation is likely to be innovative. I don’t think that you have to go to the statistical evidence to find that. People are able to use knowledge for economic development.” It is not simply that better educated people themselves are more productive. Extensive research has found a spill-over effect from education, with benefits arising both from how the educated share their knowledge with others and how they are better able to pick up new skills themselves by building on their existing education.
Chart 7a: Relationship between school life expectancy and labour productivity, 1990-2011
Note: The scatter matrix shows the correlation of school life expectancy for all years against all possible future years for overall productivity of labour.  The correlation in each set of years is well above our threshold for "strong" correlations of 0.65.
Sources: Economist Intelligence Unit and UNESCO.

Chart 7b: Relationship between school life expectancy (in 1995) and labour productivity (in 2010)
Note: The scatter chart shows the correlation of school life expectancy in 1995 against overall productivity of labour in 2010 for 37 countries.  The correlation is 0.817, well above our threshold for "strong" correlations of 0.65.
Sources: Economist Intelligence Unit and UNESCO.
The difficulty for policymakers, though, is deciding what sort of education works best when so many factors affect the economy. Predictably, quality appears to be more important than duration. In one analysis, Professors Hanushek and Woessmann found that when cognitive skills, as measured by PISA scores, are correlated with GDP, then the impact of total years of schooling becomes irrelevant. In other words, how long it took to learn was less important than that learning had occurred.[26] This may seem obvious, but it is directly applicable to decisions such as starting primary education a year earlier or using the same resources for teacher training.
More complicated than quality is the question of what sort of content in an education system will yield the best labour market and economic outcomes. For example, some countries prize strong vocational school programmes while others prefer more unified systems. One advocate of vocational education is Professor Schwartz, who says of the US that “having a system focussed entirely on preparing students for four-year colleges and universities is a major problem. Only 30% of young Americans actually get a four-year degree by their mid-twenties, and many of those wind up in jobs that didn't require a degree. The consequence of not having a strong post-secondary vocational system is that most young Americans reach their mid-twenties without the skills and credentials needed for success in a technology-driven economy.”
Mr Angula, whose country is looking to bolster its vocational education system, adds that systems “need to create linkages between the school and the community, and the school and the economy, so that education should have a meaning in the context that it is practised. Sometimes it is hard for students to apply their knowledge or skills.” Without seeing any relevance, they might simply leave education.
Softer skills
The questions of the appropriate education content to best ensure future economic growth and how best to equip students to face an uncertain future are also at the core of reforms in some of the more successful school systems, particularly in Asia. Singapore’s Professor Lee explains that “of today’s job titles compared to those of 1995, many are very new; the skills are very new. We anticipate that evolution will be fast into the future.” For over a decade, his country’s Ministry of Education has engaged in future scanning to identify the likely skills needed in the coming years, and adjusted its offerings to students accordingly. More important, since 1997, says Professor Lee, Singapore has shifted away from teaching rote knowledge to a firm foundation in the basics of maths, science, and literacy combined with an inculcation of how to understand and apply information. “We feel it contributes toward the students acquiring knowledge and skills of cognition and creativity attributes which are very important in the 21st century landscape.”
Both of these developments reflect an attitude that education systems need to be prepared for ongoing change rather than seek a single, best end state. “No education system can remain static,” writes Singapore’s Prime Minister, Lee Hsien Loong, in the foreword to a recent report on education and geopolitics in the 21st century. “The world is changing rapidly. Technology is transforming our lives. The skills needed in the future will be very different from those needed today.”[27]
Singapore is not alone. Shanghai students finished first in the latest PISA tests, but China is also shifting toward a much greater emphasis on creativity. Professor Zhao explains that the country’s leadership believes “the economy is moving quickly from a labour-intensive one to a knowledge economy. It needs creative talent.” Indeed, he finds it ironic that China is moving more in the direction of Western models even while politicians in those countries sometimes praise that of traditional Asian education. South Korean schools, meanwhile, are now being encouraged to develop "creativity, character and collaboration".
Chart 8: Percentage of labour force reaching secondary and tertiary attainment, selected countries, 2008 (%)
 Source: International Labour Organization.
Teaching people how to work together is indeed of growing relevance to the economy. According to Ms Parthasarathi, “A lot of education in the second half of the 20th century has made children fiercely individualistic, not good in a team, but these team skills – an ability to interact with respect with people; to empathise; to be innovatively adventurous – are essential for certain types of creativity.” In order to drive the teaching of collaborative skills, the Assessment and Teaching of 21st-Century Skills project – a multi-stakeholder group that includes the education ministries of the US, Australia, Singapore, Finland, the Netherlands and Costa Rica – has been seeking to develop metrics to test such abilities. These will be integrated into the PISA 2015 tests – a sign, Professor Schleicher says, that “the kinds of skills that matter in life are changing.”
Education can clearly deliver substantial social and economic outcomes. Understanding how it does so, however, and maximising those results are still works in progress for educational leaders. Says Mr Mackay, Chair of the Australian Institute for Teaching and School Leadership: “None of the countries you might think would be complacent are complacent at all: they are investing in new metrics.”


[22] Hans van Kippersluis, Owen O’Donnell, and Eddy van Doorslaer, “Long Run Returns to Education: Does Schooling Lead to an Extended Old Age?”, Journal of Human Resources (2009): 1–33.
[23] Philip Oreopoulos, Kjell G. Salvanes, “How large are returns to schooling? Hint: Money isn’t everything”, National Bureau of Economic Research Working Paper 15339, September 2009.
[24] Tazeen Fasih, Linking Education Policy to Labor Market Outcomes, World Bank, 2008; see also Tazeen Fasih, et al, Heterogeneous Returns to Education in the Labor Market, World Bank Policy Research Working Paper 6170, August 2012.
[25] Thomas S. Dee, Are There Civic Returns to Education?, National Bureau of Economic Research Working Paper 9588, March 2003; Craig Riddell, “The Impact of Education on Economic and Social Outcomes: An Overview of Recent Advances in Economics”, Canadian Policy Research Network, 2006.
[26] Eric A. Hanushek and Ludger Woessmann, “Education and Economic Growth”, in Dominic J. Brewer and Patrick J. McEwan, eds. Economics of Education (2010).
[27] Foreword to Michael Barber, Katelyn Donnelly and Saad Rizvi, Oceans of innovation: The Atlantic, the Pacific, global leadership and the future of education, 2012.