Monday, January 30, 2017

Making the grade: Chinese school lets failing students borrow points from ‘mark bank’

Exams can be a major source of anxiety for students, especially if they receive a failing mark. But one school in China is now allowing its pupils to borrow necessary points from a ‘grade bank’ in order to pass any test.
The innovative scheme is the brainchild of the Nanjing No 1 High School in Nanjing, Jiangsu province, which is running a trial of the concept with its 10th grade Advanced Placement (AP) class, the South China Morning Post reported, citing the Yangtze Evening Post. 
The director of the school, Kan Huang, said the new scheme is an effort to place more emphasis on students' growth rather than on their exam performance, noting that the current exam-oriented culture can lead to a pupil's future being determined by a "single major exam."
Physics teacher Mei Hong also hailed the scheme, saying it is aimed at giving students a second chance.
“Fifty-nine points and 60 points are actually not that different,” she said, noting that one score is failing while the other is passing. The difference, however, "weighs heavily on students' psyches."
A total of 13 students out of 49 have so far borrowed points from the grade bank.
One of those students took points to pass a recent geography exam after falling ill.
"Because I was sick, I missed some classes. The mark bank saved my grade," he said.
But just as money borrowed from a real bank must be repaid with interest, so must points borrowed from the grade bank.
Students must repay the marks by scoring extra points on a future exam, though some teachers also let students pay off their debt by giving public speeches or conducting lab experiments.
And just like a regular bank, students are allowed to pay in installments – and those who do not pay back their debts will be prevented from future borrowing.
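The borrowing rules read like a tiny ledger system. Purely as an illustration (the reports describe the scheme only informally, so the flat 10 per cent interest rate, the method names and the sample figures below are all assumptions, not the school's actual rules), a minimal Python sketch of such a mark bank might look like this:

```python
class MarkBank:
    """Toy ledger for the 'mark bank' rules reported above.

    Assumptions for illustration only: a flat 10 per cent interest
    on borrowed points, with debts tracked per student.
    """

    def __init__(self, interest_rate=0.10):
        self.interest_rate = interest_rate
        self.debts = {}          # student -> points still owed
        self.defaulters = set()  # students barred from further borrowing

    def borrow(self, student, points):
        """Lend points to pass an exam; defaulters may not borrow again."""
        if student in self.defaulters:
            raise PermissionError(f"{student} is barred from borrowing")
        self.debts[student] = self.debts.get(student, 0) + points * (1 + self.interest_rate)
        return self.debts[student]

    def repay(self, student, extra_points):
        """Repay with surplus points from a later exam; installments allowed."""
        self.debts[student] = max(0.0, self.debts.get(student, 0) - extra_points)

    def mark_default(self, student):
        """Block a student who never repays from any future borrowing."""
        if self.debts.get(student, 0) > 0:
            self.defaulters.add(student)


bank = MarkBank()
bank.borrow("student_a", 2)     # 2 points to turn a 58 into a pass; owes 2.2
bank.repay("student_a", 1)      # first installment
print(bank.debts["student_a"])  # ~1.2 points still owed
```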
The school asked parents who work in the banking industry to help devise the scheme.
China's education system is extremely dependent on just a few major exams, with college entry almost entirely determined by one test in the final year of school. Students are therefore under intense pressure to perform, with a 2014 study finding that most student suicides could be attributed to pressure from school tests.

https://www.rt.com/news/373571-chinese-students-grade-bank/

Freemasonry in Our Recent History


Source: Tolga Gerger
hakikatitarih

Revolution!

Saturday, January 28, 2017

Early Modern Europe was the Last Place You Would Have Expected a Scientific Revolution


We often associate science with the values of secularism and tolerance. If so, early modern Europe is the last place you would have expected a scientific revolution. Europe in the days of Columbus, Copernicus and Newton had the highest concentration of religious fanatics in the world, and the lowest level of tolerance. The luminaries of the Scientific Revolution lived in a society that expelled Jews and Muslims, burned heretics wholesale, saw a witch in every cat-loving elderly lady and started a new religious war every full moon.

If you travelled to Cairo or Istanbul around 1600, you would find there a multicultural and tolerant metropolis, where Sunnis, Shiites, Orthodox Christians, Catholics, Armenians, Copts, Jews and even the occasional Hindu lived side by side in relative harmony. Though they had their share of disagreements and riots, and though the Ottoman Empire routinely discriminated against people on religious grounds, it was a liberal paradise compared with Europe. If you then travelled to contemporary Paris or London, you would find cities awash with religious extremism, in which only those belonging to the dominant sect could live. In London they killed Catholics, in Paris they killed Protestants, the Jews had long been driven out, and nobody in his right mind would dream of letting any Muslims in. And yet, the Scientific Revolution began in London and Paris rather than in Cairo and Istanbul. 

Borders Drawn with Rulers


Is it true that when text and reality collide, reality sometimes has to give way? Isn’t it just a common but exaggerated slander of bureaucratic systems? Most bureaucrats – whether serving pharaoh or Mao Zedong – were reasonable people, and surely would have made the following argument: ‘We use writing to describe the reality of fields, canals and granaries. If the description is accurate, we make realistic decisions. If the description is inaccurate, it causes famines and even rebellions. Then we, or the administrators of some future regime, learn from the mistake, and strive to produce more truthful descriptions. So over time, our documents are bound to become ever more precise.’

That’s true to some extent, but it ignores an opposite historical dynamic. As bureaucracies accumulate power, they become immune to their own mistakes. Instead of changing their stories to fit reality, they can change reality to fit their stories. In the end, external reality matches their bureaucratic fantasies, but only because they forced reality to do so. For example, the borders of many African countries disregard river lines, mountain ranges and trade routes, split historical and economic zones unnecessarily, and ignore local ethnic and religious identities. The same tribe may find itself riven between several countries, whereas one country may incorporate splinters of numerous rival clans. Such problems bedevil countries all over the world, but in Africa they are particularly acute because modern African borders don’t reflect the wishes and struggles of local nations. They were drawn by European bureaucrats who never set foot in Africa.

In the late nineteenth century, several European powers laid claim to African territories. Fearing that conflicting claims might lead to an all-out European war, the concerned parties got together in Berlin in 1884, and divided Africa as if it were a pie. Back then, much of the African interior was terra incognita to Europeans. The British, French and Germans had accurate maps of Africa’s coastal regions, and knew precisely where the Niger, the Congo and the Zambezi empty into the ocean. However, they knew little about the course these rivers took inland, about the kingdoms and tribes that lived along their banks, and about local religion, history and geography. This hardly mattered to the European diplomats. They took out an empty map of Africa, spread it over a well-polished Berlin table, sketched lines here and there, and divided the continent between them.

When the Europeans penetrated the African interior, armed with the agreed-upon map, they discovered that many of the borders drawn in Berlin hardly did justice to the geographic, economic and ethnic reality of Africa. However, to avoid renewed clashes, the invaders stuck to their agreements, and these imaginary lines became the actual borders of European colonies. During the second half of the twentieth century, as the European empires disintegrated and the colonies gained their independence, the new countries accepted the colonial borders, fearing that the alternative would be endless wars and conflicts. Many of the difficulties faced by present-day African countries stem from the fact that their borders make little sense. When the written fantasies of European bureaucracies encountered the African reality, reality was forced to surrender. 

Currency Revolution


“On 3 November 1985 the Myanmar government unexpectedly announced that bank-notes of twenty-five, fifty and a hundred kyats were no longer legal tender. People were given no opportunity to exchange the notes, and savings of a lifetime were instantaneously turned into heaps of worthless paper. To replace the defunct notes, the government introduced new seventy-five-kyat bills, allegedly in honour of the seventy-fifth birthday of Myanmar’s dictator, General Ne Win. In August 1986, banknotes of fifteen kyats and thirty-five kyats were issued. Rumour had it that the dictator, who had a strong faith in numerology, believed that fifteen and thirty-five are lucky numbers. They brought little luck to his subjects. On 5 September 1987 the government suddenly decreed that all thirty-five and seventy-five notes were no longer money.”

Long Live The Revolution!


History provides ample evidence for the crucial importance of large-scale cooperation. Victory almost invariably went to those who cooperated better – not only in struggles between Homo sapiens and other animals, but also in conflicts between different human groups. Thus Rome conquered Greece not because the Romans had larger brains or better toolmaking techniques, but because they were able to cooperate more effectively. Throughout history, disciplined armies easily routed disorganised hordes, and unified elites dominated the disorderly masses. In 1914, for example, 3 million Russian noblemen, officials and business people lorded it over 180 million peasants and workers. The Russian elite knew how to cooperate in defence of its common interests, whereas the 180 million commoners were incapable of effective mobilisation. Indeed, much of the elite’s efforts focused on ensuring that the 180 million people at the bottom would never learn to cooperate.

In order to mount a revolution, numbers are never enough. Revolutions are usually made by small networks of agitators rather than by the masses. If you want to launch a revolution, don’t ask yourself, ‘How many people support my ideas?’ Instead, ask yourself, ‘How many of my supporters are capable of effective collaboration?’ The Russian Revolution finally erupted not when 180 million peasants rose against the tsar, but rather when a handful of communists placed themselves at the right place at the right time. In 1917, at a time when the Russian upper and middle classes numbered at least 3 million people, the Communist Party had just 23,000 members. The communists nevertheless gained control of the vast Russian Empire because they organised themselves well. When authority in Russia slipped from the decrepit hands of the tsar and the equally shaky hands of Kerensky’s provisional government, the communists seized it with alacrity, gripping the reins of power like a bulldog locking its jaws on a bone. 

The communists didn’t release their grip until the late 1980s. Effective organisation kept them in power for eight long decades, and they eventually fell due to defective organisation. On 21 December 1989 Nicolae Ceauşescu, the communist dictator of Romania, organised a mass demonstration of support in the centre of Bucharest. Over the previous months the Soviet Union had withdrawn its support from the eastern European communist regimes, the Berlin Wall had fallen, and revolutions had swept Poland, East Germany, Hungary, Bulgaria and Czechoslovakia. Ceauşescu, who had ruled Romania since 1965, believed he could withstand the tsunami, even though riots against his rule had erupted in the Romanian city of Timişoara on 17 December. As one of his counter-measures, Ceauşescu arranged a massive rally in Bucharest to prove to Romanians and the rest of the world that the majority of the populace still loved him – or at least feared him. The creaking party apparatus mobilised 80,000 people to fill the city’s central square, and citizens throughout Romania were instructed to stop all their activities and tune in on their radios and televisions.

To the cheering of the seemingly enthusiastic crowd, Ceauşescu mounted the balcony overlooking the square, as he had done scores of times in previous decades. Flanked by his wife Elena, leading party officials and a bevy of bodyguards, Ceauşescu began delivering one of his trademark dreary speeches. For eight minutes he praised the glories of Romanian socialism, looking very pleased with himself as the crowd clapped mechanically. And then something went wrong. You can see it for yourself on YouTube. Just search for ‘Ceauşescu’s last speech’, and watch history in action.
The YouTube clip shows Ceauşescu starting another long sentence, saying, ‘I want to thank the initiators and organisers of this great event in Bucharest, considering it as a—’, and then he falls silent, his eyes open wide, and he freezes in disbelief. He never finished the sentence. You can see in that split second how an entire world collapses. Somebody in the audience booed. People still argue to this day about who was the first person who dared to boo. And then another person booed, and another, and another, and within a few seconds the masses began whistling, shouting abuse and calling out ‘Ti-mi-şoa-ra! Ti-mi-şoa-ra!’
 
All this happened live on Romanian television, as three-quarters of the populace sat glued to the screens, their hearts throbbing wildly. The notorious secret police – the Securitate – immediately ordered the broadcast to be stopped, but the television crews disobeyed. The cameraman pointed the camera towards the sky so that viewers couldn’t see the panic among the party leaders on the balcony, but the soundman kept recording, and the technicians continued the transmission. The whole of Romania heard the crowd booing, while Ceauşescu yelled, ‘Hello! Hello! Hello!’ as if the problem was with the microphone. His wife Elena began scolding the audience, ‘Be quiet! Be quiet!’ until Ceauşescu turned and yelled at her – still live on television – ‘You be quiet!’ Ceauşescu then appealed to the excited crowds in the square, imploring them, ‘Comrades! Comrades! Be quiet, comrades!’

But the comrades were unwilling to be quiet. Communist Romania crumbled when 80,000 people in the Bucharest central square realised they were much stronger than the old man in the fur hat on the balcony. What is truly astounding, however, is not the moment the system collapsed, but the fact that it managed to survive for decades. Why are revolutions so rare? Why do the masses sometimes clap and cheer for centuries on end, doing everything the man on the balcony commands them to do, even though they could in theory charge forward at any moment and tear him to pieces?

Ceauşescu and his cronies dominated 20 million Romanians for four decades because they ensured three vital conditions. First, they placed loyal communist apparatchiks in control of all networks of cooperation, such as the army, trade unions and even sports associations. Second, they prevented the creation of any rival organisations – whether political, economic or social – which might serve as a basis for anti-communist cooperation. Third, they relied on the support of sister communist parties in the Soviet Union and eastern Europe. Despite occasional tensions, these parties helped each other in times of need, or at least guaranteed that no outsider poked his nose into the socialist paradise. Under such conditions, despite all the hardship and suffering inflicted on them by the ruling elite, the 20 million Romanians were unable to organise any effective opposition.

Ceauşescu fell from power only once all three conditions no longer held. In the late 1980s the Soviet Union withdrew its protection and the communist regimes began falling like dominoes. By December 1989 Ceauşescu could not expect any outside assistance. Just the opposite – revolutions in nearby countries gave heart to the local opposition. The Communist Party itself began splitting into rival camps. The moderates wished to rid themselves of Ceauşescu and initiate reforms before it was too late. By organising the Bucharest demonstration and broadcasting it live on television, Ceauşescu himself provided the revolutionaries with the perfect opportunity to discover their power and rally against him. What quicker way to spread a revolution than by showing it on TV?

Yet when power slipped from the hands of the clumsy organiser on the balcony, it did not pass to the masses in the square. Though numerous and enthusiastic, the crowds did not know how to organise themselves. Hence just as in Russia in 1917, power passed to a small group of political players whose only asset was good organisation. The Romanian Revolution was hijacked by the self-proclaimed National Salvation Front, which was in fact a smokescreen for the moderate wing of the Communist Party. The Front had no real ties to the demonstrating crowds. It was manned by mid-ranking party officials, and led by Ion Iliescu, a former member of the Communist Party’s central committee and one-time head of the propaganda department. Iliescu and his comrades in the National Salvation Front reinvented themselves as democratic politicians, proclaimed to any available microphone that they were the leaders of the revolution, and then used their long experience and network of cronies to take control of the country and pocket its resources.

In communist Romania almost everything was owned by the state. Democratic Romania quickly privatised its assets, selling them at bargain prices to the ex-communists, who alone grasped what was happening and collaborated to feather each other’s nests. Government companies that controlled national infrastructure and natural resources were sold to former communist officials at end-of-season prices while the party’s foot soldiers bought houses and apartments for pennies.

Ion Iliescu was elected president of Romania, while his colleagues became ministers, parliament members, bank directors and multimillionaires. The new Romanian elite that controls the country to this day is composed mostly of former communists and their families. The masses who risked their necks in Timişoara and Bucharest settled for scraps, because they did not know how to cooperate and how to create an efficient organisation to look after their own interests. 

A similar fate befell the Egyptian Revolution of 2011. What television did in 1989, Facebook and Twitter did in 2011. The new media helped the masses coordinate their activities, so that thousands of people flooded the streets and squares at the right moment and toppled the Mubarak regime. However, it is one thing to bring 100,000 people to Tahrir Square, and quite another to get a grip on the political machinery, shake the right hands in the right back rooms and run a country effectively. Consequently, when Mubarak stepped down the demonstrators could not fill the vacuum. Egypt had only two institutions sufficiently organised to rule the country: the army and the Muslim Brotherhood. Hence the revolution was hijacked first by the Brotherhood, and eventually by the army.
The Romanian ex-communists and the Egyptian generals were not more intelligent or nimble-fingered than either the old dictators or the demonstrators in Bucharest and Cairo. Their advantage lay in flexible cooperation. They cooperated better than the crowds, and they were willing to show far more flexibility than the hidebound Ceauşescu and Mubarak. 

A Brief History of Lawns


If history doesn’t follow any stable rules, and if we cannot predict its future course, why study it? It often seems that the chief aim of science is to predict the future – meteorologists are expected to forecast whether tomorrow will bring rain or sunshine; economists should know whether devaluing the currency will avert or precipitate an economic crisis; good doctors foresee whether chemotherapy or radiation therapy will be more successful in curing lung cancer. Similarly, historians are asked to examine the actions of our ancestors so that we can repeat their wise decisions and avoid their mistakes. But it almost never works like that because the present is just too different from the past. It is a waste of time to study Hannibal’s tactics in the Second Punic War so as to copy them in the Third World War. What worked well in cavalry battles will not necessarily be of much benefit in cyber warfare.

Science is not just about predicting the future, though. Scholars in all fields often seek to broaden our horizons, thereby opening before us new and unknown futures. This is especially true of history. Though historians occasionally try their hand at prophecy (without notable success), the study of history aims above all to make us aware of possibilities we don’t normally consider. Historians study the past not in order to repeat it, but in order to be liberated from it.

**
What’s true of grand social revolutions is equally true at the micro level of everyday life. A young couple building a new home for themselves may ask the architect for a nice lawn in the front yard. Why a lawn? ‘Because lawns are beautiful,’ the couple might explain. But why do they think so? It has a history behind it.

Stone Age hunter-gatherers did not cultivate grass at the entrance to their caves. No green meadow welcomed the visitors to the Athenian Acropolis, the Roman Capitol, the Jewish Temple in Jerusalem or the Forbidden City in Beijing. The idea of nurturing a lawn at the entrance to private residences and public buildings was born in the castles of French and English aristocrats in the late Middle Ages. In the early modern age this habit struck deep roots, and became the trademark of nobility.
Well-kept lawns demanded land and a lot of work, particularly in the days before lawnmowers and automatic water sprinklers. In exchange, they produce nothing of value. You can’t even graze animals on them, because they would eat and trample the grass. Poor peasants could not afford wasting precious land or time on lawns. The neat turf at the entrance to chateaux was accordingly a status symbol nobody could fake. It boldly proclaimed to every passerby: ‘I am so rich and powerful, and I have so many acres and serfs, that I can afford this green extravaganza.’ The bigger and neater the lawn, the more powerful the dynasty. If you came to visit a duke and saw that his lawn was in bad shape, you knew he was in trouble.

The precious lawn was often the setting for important celebrations and social events, and at all other times was strictly off-limits. To this day, in countless palaces, government buildings and public venues a stern sign commands people to ‘Keep off the grass’. In my former Oxford college the entire quad was formed of a large, attractive lawn, on which we were allowed to walk or sit only one day a year. On any other day, woe to the poor student whose foot desecrated the holy turf.
Royal palaces and ducal chateaux turned the lawn into a symbol of authority. When in the late modern period kings were toppled and dukes were guillotined, the new presidents and prime ministers kept the lawns. Parliaments, supreme courts, presidential residences and other public buildings increasingly proclaimed their power in row upon row of neat green blades. Simultaneously, lawns conquered the world of sports. For thousands of years humans played on almost every conceivable kind of ground, from ice to desert. Yet in the last two centuries, the really important games – such as football and tennis – are played on lawns. Provided, of course, you have money. In the favelas of Rio de Janeiro the future generation of Brazilian football is kicking makeshift balls over sand and dirt. But in the wealthy suburbs, the sons of the rich are enjoying themselves over meticulously kept lawns.

Humans thereby came to identify lawns with political power, social status and economic wealth. No wonder that in the nineteenth century the rising bourgeoisie enthusiastically adopted the lawn. At first only bankers, lawyers and industrialists could afford such luxuries at their private residences. Yet when the Industrial Revolution broadened the middle class and gave rise to the lawnmower and then the automatic sprinkler, millions of families could suddenly afford a home turf. In American suburbia a spick-and-span lawn turned from a rich person’s luxury into a middle-class necessity.
This was when a new rite was added to the suburban liturgy. After Sunday morning service at church, many people devotedly mowed their lawns. Walking along the streets, you could quickly ascertain the wealth and position of every family by the size and quality of their turf. There is no surer sign that something is wrong at the Joneses’ than a neglected lawn in the front yard. Grass is nowadays the most widespread crop in the USA after maize and wheat, and the lawn industry (plants, manure, mowers, sprinklers, gardeners) accounts for billions of dollars every year.

The lawn did not remain solely a European or American craze. Even people who have never visited the Loire Valley see US presidents giving speeches on the White House lawn, important football games played out in green stadiums, and Homer and Bart Simpson quarrelling about whose turn it is to mow the grass. People all over the globe associate lawns with power, money and prestige. The lawn has therefore spread far and wide, and is now set to conquer even the heart of the Muslim world. Qatar’s newly built Museum of Islamic Art is flanked by magnificent lawns that hark back to Louis XIV’s Versailles much more than to Haroun al-Rashid’s Baghdad. They were designed and constructed by an American company, and their more than 100,000 square metres of grass – in the midst of the Arabian desert – require a stupendous amount of fresh water each day to stay green. Meanwhile, in the suburbs of Doha and Dubai, middle-class families pride themselves on their lawns. If it were not for the white robes and black hijabs, you could easily think you were in the Midwest rather than the Middle East.

Having read this short history of the lawn, when you now come to plan your dream house you might think twice about having a lawn in the front yard. You are of course still free to do it. But you are also free to shake off the cultural cargo bequeathed to you by European dukes, capitalist moguls and the Simpsons – and imagine for yourself a Japanese rock garden, or some altogether new creation. This is the best reason to learn history: not in order to predict the future, but to free yourself of the past and imagine alternative destinies. Of course this is not total freedom – we cannot avoid being shaped by the past. But some freedom is better than none.

Thursday, January 26, 2017

The Eternal Passing Of Time



The portrait shows this – his eyes fixed elsewhere,
Mehmet the Conqueror holds a rose
To the Turkic scimitar of his nose.
The engrossing necessities of money and war,
The wise politician’s precautionary
Fratricides, the apt play of power –
All proper activities in his sphere,
And he excelled at them all. So why the flower?
A nod, perhaps, to something less worldly;
Not beauty, I think, whatever that is,
Not love, not ‘nature’,
Not Allah, by that or any other name –
Just a moment’s immersion in the texture
Of existence, the eternal passing of time.

All That Man Is



“How little we understand about life as it is actually happening. The moments fly past, like trackside pylons seen from a train window.

The present, perpetually slipping away.”

Friday, January 20, 2017

Knowledge that does not change behaviour is useless


“This is the paradox of historical knowledge. Knowledge that does not change behaviour is useless. But knowledge that changes behaviour quickly loses its relevance. The more data we have and the better we understand history, the faster history alters its course, and the faster our knowledge becomes outdated.

Centuries ago human knowledge increased slowly, so politics and economics changed at a leisurely pace too. Today our knowledge is increasing at breakneck speed, and theoretically we should understand the world better and better. But the very opposite is happening. Our new-found knowledge leads to faster economic, social and political changes; in an attempt to understand what is happening, we accelerate the accumulation of knowledge, which leads only to faster and greater upheavals. Consequently we are less and less able to make sense of the present or forecast the future. In 1016 it was relatively easy to predict how Europe would look in 1050. Sure, dynasties might fall, unknown raiders might invade, and natural disasters might strike; yet it was clear that in 1050 Europe would still be ruled by kings and priests, that it would be an agricultural society, that most of its inhabitants would be peasants, and that it would continue to suffer greatly from famines, plagues and wars. In contrast, in 2016 we have no idea how Europe will look in 2050. We cannot say what kind of political system it will have, how its job market will be structured, or even what kind of bodies its inhabitants will possess.”

The Right to Happiness 2


“In the twentieth century per capita GDP was perhaps the supreme yardstick for evaluating national success. From this perspective, Singapore, each of whose citizens produces on average $56,000 worth of goods and services a year, is a more successful country than Costa Rica, whose citizens produce only $14,000 a year. But nowadays thinkers, politicians and even economists are calling to supplement or even replace GDP with GDH – gross domestic happiness. After all, what do people want? They don’t want to produce. They want to be happy. Production is important because it provides the material basis for happiness. But it is only the means, not the end. In one survey after another Costa Ricans report far higher levels of life satisfaction than Singaporeans. Would you rather be a highly productive but dissatisfied Singaporean, or a less productive but satisfied Costa Rican?
This kind of logic might drive humankind to make happiness its second main goal for the twenty-first century. At first glance this might seem a relatively easy project. If famine, plague and war are disappearing, if humankind experiences unprecedented peace and prosperity, and if life expectancy increases dramatically, surely all that will make humans happy, right?

Wrong. When Epicurus defined happiness as the supreme good, he warned his disciples that it is hard work to be happy. Material achievements alone will not satisfy us for long. Indeed, the blind pursuit of money, fame and pleasure will only make us miserable. Epicurus recommended, for example, to eat and drink in moderation, and to curb one’s sexual appetites. In the long run, a deep friendship will make us more content than a frenzied orgy. Epicurus outlined an entire ethic of dos and don’ts to guide people along the treacherous path to happiness.

Epicurus was apparently on to something. Being happy doesn’t come easy. Despite our unprecedented achievements in the last few decades, it is far from obvious that contemporary people are significantly more satisfied than their ancestors in bygone years. Indeed, it is an ominous sign that despite higher prosperity, comfort and security, the rate of suicide in the developed world is also much higher than in traditional societies.

In Peru, Guatemala, the Philippines and Albania – developing countries suffering from poverty and political instability – about one person in 100,000 commits suicide each year. In rich and peaceful countries such as Switzerland, France, Japan and New Zealand, twenty-five people per 100,000 take their own lives annually. In 1985 most South Koreans were poor, uneducated and tradition-bound, living under an authoritarian dictatorship. Today South Korea is a leading economic power, its citizens are among the best educated in the world, and it enjoys a stable and comparatively liberal democratic regime. Yet whereas in 1985 about nine South Koreans per 100,000 killed themselves, today the annual rate of suicide has more than tripled to thirty per 100,000.”
**
“And even if we have overcome many of yesterday’s miseries, attaining positive happiness may be far more difficult than abolishing downright suffering. It took just a piece of bread to make a starving medieval peasant joyful. How do you bring joy to a bored, overpaid and overweight engineer? The second half of the twentieth century was a golden age for the USA. Victory in the Second World War, followed by an even more decisive victory in the Cold War, turned it into the leading global superpower. Between 1950 and 2000 American GDP grew from $2 trillion to $12 trillion. Real per capita income doubled. The newly invented contraceptive pill made sex freer than ever. Women, gays, African Americans and other minorities finally got a bigger slice of the American pie. A flood of cheap cars, refrigerators, air conditioners, vacuum cleaners, dishwashers, laundry machines, telephones, televisions and computers changed daily life almost beyond recognition. Yet studies have shown that American subjective well-being levels in the 1990s remained roughly the same as they were in the 1950s.”


**
The exciting sensations of the race are as transient as the blissful sensations of victory. The Don Juan enjoying the thrill of a one-night stand, the businessman enjoying biting his fingernails watching the Dow Jones rise and fall, and the gamer enjoying killing monsters on the computer screen will find no satisfaction remembering yesterday’s adventures. Like the rats pressing the pedal again and again, the Don Juans, business tycoons and gamers need a new kick every day. Worse still, here too expectations adapt to conditions, and yesterday’s challenges all too quickly become today’s tedium. Perhaps the key to happiness is neither the race nor the gold medal, but rather combining the right doses of excitement and tranquillity; but most of us tend to jump all the way from stress to boredom and back, remaining as discontented with one as with the other.

If science is right and our happiness is determined by our biochemical system, then the only way to ensure lasting contentment is by rigging this system. Forget economic growth, social reforms and political revolutions: in order to raise global happiness levels, we need to manipulate human biochemistry. And this is exactly what we have begun doing over the last few decades. Fifty years ago psychiatric drugs carried a severe stigma. Today, that stigma has been broken. For better or worse, a growing percentage of the population is taking psychiatric medicines on a regular basis, not only to cure debilitating mental illnesses, but also to face more mundane depressions and the occasional blues.

For example, increasing numbers of schoolchildren take stimulants such as Ritalin. In 2011, 3.5 million American children were taking medications for ADHD (attention deficit hyperactivity disorder). In the UK the number rose from 92,000 in 1997 to 786,000 in 2012. The original aim had been to treat attention disorders, but today completely healthy kids take such medications to improve their performance and live up to the growing expectations of teachers and parents. Many object to this development and argue that the problem lies with the education system rather than with the children. If pupils suffer from attention disorders, stress and low grades, perhaps we ought to blame outdated teaching methods, overcrowded classrooms and an unnaturally fast tempo of life. Maybe we should modify the schools rather than the kids? It is interesting to see how the arguments have evolved. People have been quarrelling about education methods for thousands of years. Whether in ancient China or Victorian Britain, everybody had his or her pet method, and vehemently opposed all alternatives. Yet hitherto everybody still agreed on one thing: in order to improve education, we need to change the schools. Today, for the first time in history, at least some people think it would be more efficient to change the pupil's biochemistry.

Terrorism is a Strategy of Weakness

Terrorism is a strategy of weakness adopted by those who lack access to real power. At least in the past, terrorism worked by spreading fear rather than by causing significant material damage. Terrorists usually don’t have the strength to defeat an army, occupy a country or destroy entire cities. Whereas in 2010 obesity and related illnesses killed about 3 million people, terrorists killed a total of 7,697 people across the globe, most of them in developing countries. For the average American or European, Coca-Cola poses a far deadlier threat than al-Qaeda.

How, then, do terrorists manage to dominate the headlines and change the political situation throughout the world? By provoking their enemies to overreact. In essence, terrorism is a show. Terrorists stage a terrifying spectacle of violence that captures our imagination and makes us feel as if we are sliding back into medieval chaos. Consequently states often feel obliged to react to the theatre of terrorism with a show of security, orchestrating immense displays of force, such as the persecution of entire populations or the invasion of foreign countries. In most cases, this overreaction to terrorism poses a far greater threat to our security than the terrorists themselves.

Terrorists are like a fly that tries to destroy a china shop. The fly is so weak that it cannot budge even a single teacup. So it finds a bull, gets inside its ear and starts buzzing. The bull goes wild with fear and anger, and destroys the china shop. This is what happened in the Middle East in the last decade. Islamic fundamentalists could never have toppled Saddam Hussein by themselves. Instead they enraged the USA by the 9/11 attacks, and the USA destroyed the Middle Eastern china shop for them. Now they flourish in the wreckage. By themselves, terrorists are too weak to drag us back to the Middle Ages and re-establish the Jungle Law. They may provoke us, but in the end, it all depends on our reactions. If the Jungle Law comes back into force, it will not be the fault of terrorists. 

Today the main source of wealth is knowledge


Today the main source of wealth is knowledge. And whereas you can conquer oil fields through war, you cannot acquire knowledge that way. Hence as knowledge became the most important economic resource, the profitability of war declined and wars became increasingly restricted to those parts of the world – such as the Middle East and Central Africa – where the economies are still old-fashioned material-based economies.

In 1998 it made sense for Rwanda to seize and loot the rich coltan mines of neighbouring Congo, because this ore was in high demand for the manufacture of mobile phones and laptops, and Congo held 80 per cent of the world’s coltan reserves. Rwanda earned $240 million annually from the looted coltan. For poor Rwanda that was a lot of money.  In contrast, it would have made no sense for China to invade California and seize Silicon Valley, for even if the Chinese could somehow prevail on the battlefield, there were no silicon mines to loot in Silicon Valley. Instead, the Chinese have earned billions of dollars from cooperating with hi-tech giants such as Apple and Microsoft, buying their software and manufacturing their products. What Rwanda earned from an entire year of looting Congolese coltan, the Chinese earn in a single day of peaceful commerce.

In consequence, the word ‘peace’ has acquired a new meaning. Previous generations thought about peace as the temporary absence of war. Today we think about peace as the implausibility of war. When in 1913 people said that there was peace between France and Germany, they meant that ‘there is no war going on at present between France and Germany, but who knows what next year will bring’. When today we say that there is peace between France and Germany, we mean that it is inconceivable under any foreseeable circumstances that war might break out between them. Such peace prevails not only between France and Germany, but between most (though not all) countries. There is no scenario for a serious war breaking out next year between Germany and Poland, between Indonesia and the Philippines, or between Brazil and Uruguay.

  

Sunday, January 15, 2017

Feedback


Giving unclear, infrequent feedback has somewhat of the same effect — though slightly less violent. You end up hurting the person receiving the feedback more, even though you’re just doing what your parents always told you empathetic people do: If you don’t have anything nice to say, don’t say it at all.
This is where Scott says she’s seen managers make the most mistakes. “No one sets out to be unclear in their feedback, but somewhere along the line things change. You’re worried about hurting the person’s feelings so you hold back. Then, when they don’t improve because you haven’t told them they are doing something wrong, you wind up firing them. Not so nice after all…”
In order to give people the feedback they need to get better, you can’t give a damn about whether they like you or not. “Giving feedback is very emotional. Sometimes you get yelled at. Sometimes you get tears. These are hard, hard conversations.”
Scott breaks giving feedback into four quadrants. On the horizontal axis you have unclear to clear feedback, and on the vertical you have the spectrum of anticipated emotions from happy to unhappy. The gentler the feedback, the less clear it tends to be. That’s the cruel empathy quadrant. The one you want to be in is the top right — clear even if it’s bad news.
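To make the two axes concrete, here is a minimal sketch of the quadrants as this excerpt describes them (my own illustration, not Scott's published framework; only the 'cruel empathy' label and the clear-even-if-bad-news target quadrant come from the text, and the other two labels are placeholders):

```python
def feedback_quadrant(clear: bool, spares_feelings: bool) -> str:
    """Place a piece of feedback on the 2x2 described above.

    Axes follow the excerpt: horizontal runs from unclear to clear,
    vertical from happy to unhappy anticipated emotion. Quadrant
    names other than 'cruel empathy' and the target are my own.
    """
    if clear and not spares_feelings:
        return "target quadrant: clear, even though it is bad news"
    if clear and spares_feelings:
        return "clear praise"
    if not clear and spares_feelings:
        return "cruel empathy: so gentle the message is lost"
    return "vague and upsetting: worst of both"


# Softened, unclear criticism lands in the cruel empathy quadrant.
print(feedback_quadrant(clear=False, spares_feelings=True))
```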


Tough love is how you build trust the fastest.
“When you’re hard on someone but they really hear you, that’s when you build trust over time,” she says. “They’re going to react emotionally. All you can do is react empathetically. Don’t try to prevent or control someone’s feelings.”
The Tactics:
  • Just say it. “A lot of management training ties you in knots trying to say things just right. Just let it go. Just say it. It will probably be fine. Say it in private and say it right away. Criticism has a half-life. The longer you wait the worse the situation gets.”
  • Don’t be loose with praise. Managers tend to expect that praise is easier than criticism, but it can go awry. “If you’re wrong about what you’re praising someone for; if you don’t know the details; if you’re not sincere; it’s actually going to be worse for the person than saying nothing. Praise in public, but only if you know you’re absolutely right and you mean it. Otherwise people will see right through you.”

The New Human Agenda


“At the dawn of the third millennium, humanity wakes up, stretching its limbs and rubbing its eyes. Remnants of some awful nightmare are still drifting across its mind. ‘There was something with barbed wire, and huge mushroom clouds. Oh well, it was just a bad dream.’ Going to the bathroom, humanity washes its face, examines its wrinkles in the mirror, makes a cup of coffee and opens the diary. ‘Let’s see what’s on the agenda today.’

For thousands of years the answer to this question remained unchanged. The same three problems preoccupied the people of twentieth-century China, of medieval India and of ancient Egypt. Famine, plague and war were always at the top of the list. For generation after generation humans have prayed to every god, angel and saint, and have invented countless tools, institutions and social systems – but they continued to die in their millions from starvation, epidemics and violence. Many thinkers and prophets concluded that famine, plague and war must be an integral part of God’s cosmic plan or of our imperfect nature, and nothing short of the end of time would free us from them.

Yet at the dawn of the third millennium, humanity wakes up to an amazing realisation. Most people rarely think about it, but in the last few decades we have managed to rein in famine, plague and war. Of course, these problems have not been completely solved, but they have been transformed from incomprehensible and uncontrollable forces of nature into manageable challenges. We don’t need to pray to any god or saint to rescue us from them. We know quite well what needs to be done in order to prevent famine, plague and war – and we usually succeed in doing it.

True, there are still notable failures; but when faced with such failures we no longer shrug our shoulders and say, ‘Well, that’s the way things work in our imperfect world’ or ‘God’s will be done’. Rather, when famine, plague or war break out of our control, we feel that somebody must have screwed up, we set up a commission of inquiry, and promise ourselves that next time we’ll do better. And it actually works. Such calamities indeed happen less and less often. For the first time in history, more people die today from eating too much than from eating too little; more people die from old age than from infectious diseases; and more people commit suicide than are killed by soldiers, terrorists and criminals combined. In the early twenty-first century, the average human is far more likely to die from bingeing at McDonald’s than from drought, Ebola or an al-Qaeda attack.”

**
“Until recently most humans lived on the very edge of the biological poverty line, below which people succumb to malnutrition and hunger. A small mistake or a bit of bad luck could easily be a death sentence for an entire family or village. If heavy rains destroyed your wheat crop, or robbers carried off your goat herd, you and your loved ones may well have starved to death. Misfortune or stupidity on the collective level resulted in mass famines. When severe drought hit ancient Egypt or medieval India, it was not uncommon that 5 or 10 per cent of the population perished. Provisions became scarce; transport was too slow and expensive to import sufficient food; and governments were far too weak to save the day.”

**
“After famine, humanity’s second great enemy was plagues and infectious diseases. Bustling cities linked by a ceaseless stream of merchants, officials and pilgrims were both the bedrock of human civilisation and an ideal breeding ground for pathogens. People consequently lived their lives in ancient Athens or medieval Florence knowing that they might fall ill and die next week, or that an epidemic might suddenly erupt and destroy their entire family in one swoop.

The most famous such outbreak, the so-called Black Death, began in the 1330s, somewhere in east or central Asia, when the flea-dwelling bacterium Yersinia pestis started infecting humans bitten by the fleas. From there, riding on an army of rats and fleas, the plague quickly spread all over Asia, Europe and North Africa, taking less than twenty years to reach the shores of the Atlantic Ocean. Between 75 million and 200 million people died – more than a quarter of the population of Eurasia. In England, four out of ten people died, and the population dropped from a pre-plague high of 3.7 million people to a post-plague low of 2.2 million. The city of Florence lost 50,000 of its 100,000 inhabitants.”

**
“The Black Death was not a singular event, nor even the worst plague in history. More disastrous epidemics struck America, Australia and the Pacific Islands following the arrival of the first Europeans. Unbeknown to the explorers and settlers, they brought with them new infectious diseases against which the natives had no immunity. Up to 90 per cent of the local populations died as a result.”

**
“Incidentally, cancer and heart disease are of course not new illnesses – they go back to antiquity. In previous eras, however, relatively few people lived long enough to die from them.”

Monday, January 9, 2017

Why schools should not teach general critical-thinking skills


Being an air-traffic controller is not easy. At the heart of the job is a cognitive ability called ‘situational awareness’ that involves ‘the continuous extraction of environmental information [and the] integration of this information with prior knowledge to form a coherent mental picture’. Vast amounts of fluid information must be held in the mind and, under extreme pressure, life-or-death decisions are made across rotating 24-hour work schedules. So stressful and mentally demanding is the job that, in most countries, air-traffic controllers are eligible for early retirement. In the United States, they must retire at 56 without exception.

In the 1960s, an interesting series of experiments was done on air-traffic controllers’ mental capacities. Researchers wanted to explore if they had a general enhanced ability to ‘keep track of a number of things at once’ and whether that skill could be applied to other situations. After observing them at their work, researchers gave the air-traffic controllers a set of generic memory-based tasks with shapes and colours. The extraordinary thing was that, when tested on these skills outside their own area of expertise, the air-traffic controllers did no better than anyone else. Their remarkably sophisticated cognitive abilities did not translate beyond their professional area.

Since the early 1980s, however, schools have become ever more captivated by the idea that students must learn a set of generalised thinking skills to flourish in the contemporary world – and especially in the contemporary job market. Variously called ‘21st-century learning skills’ or ‘critical thinking’, the aim is to equip students with a set of general problem-solving approaches that can be applied to any given domain; these are lauded by business leaders as an essential set of dispositions for the 21st century. Naturally, we want children and graduates to have a set of all-purpose cognitive tools with which to navigate their way through the world. It’s a shame, then, that we’ve failed to apply any critical thinking to the question of whether any such thing can be taught.

As the 1960s studies on air-traffic controllers suggested, to be good in a specific domain you need to know a lot about it: it’s not easy to translate those skills to other areas. This is even more so with the kinds of complex and specialised knowledge that accompanies much professional expertise: as later studies found, the more complex the domain, the more important domain-specific knowledge. This non-translatability of cognitive skill is well-established in psychological research and has been replicated many times. Other studies, for example, have shown that the ability to remember long strings of digits doesn’t transfer to the ability to remember long strings of letters. Surely we’re not surprised to hear this, for we all know people who are ‘clever’ in their professional lives yet who often seem to make stupid decisions in their personal lives.

In almost every arena, the higher the skill level, the more specific the expertise is likely to become. In a football team, for example, there are different ‘domains’ or positions: goalkeeper, defender, attacker. Within those, there are further categories: centre-back, full-back, attacking midfielder, holding midfielder, attacking player. Now, it might be fine for a bunch of amateurs, playing a friendly game, to move positions. But, at a professional level, if you put a left-back in a striker’s position or a central midfielder in goal, the players would be lost. For them to make excellent, split-second decisions, and to enact robust and effective strategies, they need thousands of specific mental models – and thousands of hours of practice to create those models – all of which are specific and exclusive to a position.

Of course, critical thinking is an essential part of a student’s mental equipment. However, it cannot be detached from context. Teaching students generic ‘thinking skills’ separate from the rest of their curriculum is meaningless and ineffective. As the American educationalist Daniel Willingham puts it:
[I]f you remind a student to ‘look at an issue from multiple perspectives’ often enough, he will learn that he ought to do so, but if he doesn’t know much about an issue, he can’t think about it from multiple perspectives … critical thinking (as well as scientific thinking and other domain-based thinking) is not a skill. There is not a set of critical thinking skills that can be acquired and deployed regardless of context.
This detachment of cognitive ideals from contextual knowledge is not confined to the learning of critical thinking. Some schools laud themselves for placing ‘21st-century learning skills’ at the heart of their mission. It’s even been suggested that some of these nebulous skills are now as important as literacy and should be afforded the same status. An example of this is brain-training games that claim to help kids become smarter, more alert and able to learn faster. However, recent research has shown that brain-training games are really only good for one thing – getting good at brain-training games. The claim that they offer students a general set of problem-solving skills was recently debunked by a study that reviewed more than 130 papers, which concluded:
[W]e know of no evidence for broad-based improvement in cognition, academic achievement, professional performance, and/or social competencies that derives from decontextualised practice of cognitive skills devoid of domain-specific content.
The same goes for teaching ‘dispositions’ such as the ‘growth mindset’ (focusing on will and effort as opposed to inherent talent) or ‘grit’ (determination in the face of obstacles). It’s not clear that these dispositions can be taught, and there’s no evidence that teaching them outside a specific subject matter has any effect.

Instead of teaching generic critical-thinking skills, we ought to focus on subject-specific critical-thinking skills that seek to broaden a student’s individual subject knowledge and unlock the unique, intricate mysteries of each subject. For example, if a student of literature knows that Mary Shelley’s mother died shortly after Mary was born and that Shelley herself lost a number of children in infancy, that student’s appreciation of Victor Frankenstein’s obsession with creating life from death, and of the language used to describe it, is far deeper than it would be without this knowledge. A physics student investigating why two planes behave differently in flight might know how to ‘think critically’ through the scientific method but, without solid knowledge of contingent factors such as outside air temperature and a bank of previous case studies to draw upon, the student will struggle to know which hypothesis to focus on and which variables to discount.

As Willingham writes: ‘Thought processes are intertwined with what is being thought about.’ Students need to be given real and significant things from the world to think with and about, if teachers want to influence how they do that thinking.

Carl Hendrick
This article was originally published at Aeon and has been republished under Creative Commons.

Thursday, January 5, 2017

The Right to Happiness


“Throughout history numerous thinkers, prophets and ordinary people defined happiness rather than life itself as the supreme good. In ancient Greece the philosopher Epicurus explained that worshipping gods is a waste of time, that there is no existence after death, and that happiness is the sole purpose of life. Most people in ancient times rejected Epicureanism, but today it has become the default view. Scepticism about the afterlife drives humankind to seek not only immortality, but also earthly happiness. For who would like to live for ever in eternal misery?


For Epicurus the pursuit of happiness was a personal quest. Modern thinkers, in contrast, tend to see it as a collective project. Without government planning, economic resources and scientific research, individuals will not get far in their quest for happiness. If your country is torn apart by war, if the economy is in crisis and if health care is non-existent, you are likely to be miserable. At the end of the eighteenth century the British philosopher Jeremy Bentham declared that the supreme good is ‘the greatest happiness of the greatest number’, and concluded that the sole worthy aim of the state, the market and the scientific community is to increase global happiness. Politicians should make peace, business people should foster prosperity and scholars should study nature, not for the greater glory of king, country or God – but so that you and I could enjoy a happier life.

During the nineteenth and twentieth centuries, although many paid lip service to Bentham’s vision, governments, corporations and laboratories focused on more immediate and well-defined aims. Countries measured their success by the size of their territory, the increase in their population and the growth of their GDP – not by the happiness of their citizens. Industrialised nations such as Germany, France and Japan established gigantic systems of education, health and welfare, yet these systems were aimed at strengthening the nation rather than ensuring individual well-being.

Schools were founded to produce skilful and obedient citizens who would serve the nation loyally. At eighteen, youths needed to be not only patriotic but also literate, so that they could read the brigadier’s order of the day and draw up tomorrow’s battle plans. They had to know mathematics in order to calculate the shell’s trajectory or crack the enemy’s secret code. They needed a reasonable command of electrics, mechanics and medicine, in order to operate wireless sets, drive tanks and take care of wounded comrades. When they left the army they were expected to serve the nation as clerks, teachers and engineers, building a modern economy and paying lots of taxes.

The same went for the health system. At the end of the nineteenth century countries such as France, Germany and Japan began providing free health care for the masses. They financed vaccinations for infants, balanced diets for children and physical education for teenagers. They drained festering swamps, exterminated mosquitoes and built centralised sewage systems. The aim wasn’t to make people happy, but to make the nation stronger. The country needed sturdy soldiers and workers, healthy women who would give birth to more soldiers and workers, and bureaucrats who came to the office punctually at 8 a.m. instead of lying sick at home.

Even the welfare system was originally planned in the interest of the nation rather than of needy individuals. When Otto von Bismarck pioneered state pensions and social security in late nineteenth-century Germany, his chief aim was to ensure the loyalty of the citizens rather than to increase their well-being. You fought for your country when you were eighteen, and paid your taxes when you were forty, because you counted on the state to take care of you when you were seventy."