CCE in brief

Recovering backpacker, Cornwallite at heart, political enthusiast, catalyst, writer, husband, father, community volunteer, unabashedly proud Canadian. Every hyperlink connects to something related directly or thematically to that which is highlighted.

Friday, 17 August 2012

Hurry ruins saints as well as artists.

“There can be an intense egoism in following everyone else. People are in a HURRY to magnify themselves by IMITATING what is popular, and too lazy to think of anything better. Hurry ruins saints as well as artists. They want QUICK SUCCESS and they are in such a hurry to get it that they cannot take time to be TRUE TO THEMSELVES. And when the madness is upon them, they argue that their very haste is a species of integrity.”

  - Thomas Merton

On Organizational Culture (by Jason Lauritsen)

A worthwhile read.  Of course, a junior in an office shouldn't expect the world to bend to their will, either; that equally leads to productivity challenges.  The middle ground is, shockingly, found in communication - when you establish a relationship, both sides put effort into finding the sources of tension as well as the points of commonality.  When you build on the latter and take pains to understand the why of the former, you might just learn something.

Strength through diversity; it's as true of corporate structure as it is of social or biological genetics.

Is Organizational “Fit” Really a Good Thing?

Organizational fit (or cultural fit) is a topic that gets a lot of discussion, particularly in recruiting circles. We strive to find people who will be a good fit for the organization–people who will quickly and naturally fall into the rhythm of the organization and how things get done around here.
This is no small issue. As a person who has been in a couple of different organizations where I wasn’t a good “fit,” I can tell you that there is a high toll paid by the individual when working in a place they don’t fit. The organization also pays a price, and that price increases the larger the role where the misfit happens.

So, most organizations have decided that this is a recruiting issue. The best way to achieve a fit is to hire the right people in the first place. And I used to buy into this approach. Hiring people who fit the culture and organization right out of the chute definitely makes for happier hiring managers and less friction around new hires in the organization.

But here’s the problem. It is easy to think of and practice cultural fit as meaning “walks, talks, thinks and acts like us.” When we start selecting and hiring on this model, we are adding more of what we already have. What we aren’t adding is diversity. And the research of people like Scott Page and others is pretty convincing that if we care at all about innovation or problem solving, diversity is a critical component. Hiring and managing for organizational fit feels a lot at times like hiring for homogeneity and that could actually be harming our organization’s future prospects.

“If everyone is thinking alike, then no one is thinking.”
― Benjamin Franklin

What if, instead of working on the supply side of this equation, we started tackling the other side: the culture itself? Organizational cultures operate like an immune system when left unchecked. There are cultural antibodies that attack difference. If you don’t fit, you feel it, often painfully. Rather than trying to find more sameness, what if we conditioned our cultures instead to welcome variety and difference? What if we started a program to vaccinate our culture so that it wouldn’t attack someone for not fitting in?

When I think about my own experiences where I was not a cultural fit, the friction and tension I felt was rooted in the desire for acceptance and belonging. The problem was these cultures’ lack of interest in accepting me as I was. To belong, I was expected to change, to learn to fit in. I think this is the norm, based on the stories I hear from the people I meet. And generally, the more we try to change who we are as individuals to fit in, the less happy and successful we are. So, working to achieve cultural fit might actually be working directly against things we desire in our organizations, like optimum performance.

So, as leaders and designers of organizations, I’d challenge you to step back from the conversation about cultural fit for a moment to consider what it means when we say that. Granted, cultural fit achieves short term comfort, but I think it may be doing long term harm. Perhaps what we truly need is not more cultural fit, but instead a different kind of culture that is welcoming, accepting, and creates feelings of belonging and inclusion for everyone who has the talents to help our company grow. If you are ready to try on what this might mean in your organization, stop over to my colleague Joe Gerstandt’s blog and spend some time there reading about how to make this happen.

Matt Ridley on why It's The End of the World!!

... but only as we know it.  The world as it is just keeps on truckin', or, to quote Ian Malcolm, life finds a way.

Apocalypse Not: Here’s Why You Shouldn’t Worry About End Times

  • By Matt Ridley


This is the question posed by the website: “Super volcanoes? Pestilence and disease? Asteroids? Comets? Antichrist? Global warming? Nuclear war?” The site’s authors are impressively open-minded about the cause of the catastrophe that is coming at 11:11 pm on December 21 this year. But they have no doubt it will happen. After all, not only does the Mayan Long Count calendar end that day, but “the sun will be aligned with the center of the Milky Way for the first time in about 26,000 years.”
Case closed: Sell your possessions and live for today.
When the sun rises on December 22, as it surely will, do not expect apologies or even a rethink. No matter how often apocalyptic predictions fail to come true, another one soon arrives. And the prophets of apocalypse always draw a following—from the 100,000 Millerites who took to the hills in 1843, awaiting the end of the world, to the thousands who believed in Harold Camping, the Christian radio broadcaster who forecast the final rapture in both 1994 and 2011.
Religious zealots hardly have a monopoly on apocalyptic thinking. Consider some of the environmental cataclysms that so many experts promised were inevitable. Best-selling economist Robert Heilbroner in 1974: “The outlook for man, I believe, is painful, difficult, perhaps desperate, and the hope that can be held out for his future prospects seem to be very slim indeed.” Or best-selling ecologist Paul Ehrlich in 1968: “The battle to feed all of humanity is over. In the 1970s ["and 1980s" was added in a later edition] the world will undergo famines—hundreds of millions of people are going to starve to death in spite of any crash programs embarked on now … nothing can prevent a substantial increase in the world death rate.” Or Jimmy Carter in a televised speech in 1977: “We could use up all of the proven reserves of oil in the entire world by the end of the next decade.”

Predictions of global famine and the end of oil in the 1970s proved just as wrong as end-of-the-world forecasts from millennialist priests. Yet there is no sign that experts are becoming more cautious about apocalyptic promises. If anything, the rhetoric has ramped up in recent years. Echoing the Mayan calendar folk, the Bulletin of the Atomic Scientists moved its Doomsday Clock one minute closer to midnight at the start of 2012, commenting: “The global community may be near a point of no return in efforts to prevent catastrophe from changes in Earth’s atmosphere.”

Over the five decades since the success of Rachel Carson’s Silent Spring in 1962 and the four decades since the success of the Club of Rome’s The Limits to Growth in 1972, prophecies of doom on a colossal scale have become routine. Indeed, we seem to crave ever-more-frightening predictions—we are now, in writer Gary Alexander’s word, apocaholic. The past half century has brought us warnings of population explosions, global famines, plagues, water wars, oil exhaustion, mineral shortages, falling sperm counts, thinning ozone, acidifying rain, nuclear winters, Y2K bugs, mad cow epidemics, killer bees, sex-change fish, cell-phone-induced brain-cancer epidemics, and climate catastrophes.

So far all of these specters have turned out to be exaggerated. True, we have encountered obstacles, public-health emergencies, and even mass tragedies. But the promised Armageddons—the thresholds that cannot be uncrossed, the tipping points that cannot be untipped, the existential threats to Life as We Know It—have consistently failed to materialize. To see the full depth of our apocaholism, and to understand why we keep getting it so wrong, we need to consult the past 50 years of history.
The classic apocalypse has four horsemen, and our modern version follows that pattern, with the four riders being chemicals (DDT, CFCs, acid rain), diseases (bird flu, swine flu, SARS, AIDS, Ebola, mad cow disease), people (population, famine), and resources (oil, metals). Let’s visit them each in turn.

Silent Spring, published 50 years ago this year, was instrumental in the emergence of modern environmentalism. “Without this book, the environmental movement might have been long delayed or never have developed at all,” Al Gore wrote in his introduction to the 1994 edition. Carson’s main theme was that the use of synthetic pesticides—DDT in particular—was causing not only a massacre of wildlife but an epidemic of cancer in human beings. One of her chief inspirations and sources for the book was Wilhelm Hueper, the first director of the environmental arm of the National Cancer Institute. So obsessed was Hueper with his notion that pesticides and other synthetic chemicals were causing cancers (and that industry was covering this up) that he strenuously opposed the suggestion that tobacco-smoking take any blame. Hueper wrote in a 1955 paper called “Lung Cancers and Their Causes,” published in CA: A Cancer Journal for Clinicians, “Industrial or industry-related atmospheric pollutants are to a great part responsible for the causation of lung cancer … cigarette smoking is not a major factor in the causation of lung cancer.”

In fact, of course, the link between smoking and lung cancer was found to be ironclad. But the link between modern chemicals and cancer is sketchy at best. Even DDT, which clearly does pose health risks to those unsafely exposed, has never been definitively linked to cancer. In general, cancer incidence and death rates, when corrected for the average age of the population, have been falling now for 20 years.

By the 1970s the focus of chemical concern had shifted to air pollution. Life magazine set the scene in January 1970: “Scientists have solid experimental and theoretical evidence to support … the following predictions: In a decade, urban dwellers will have to wear gas masks to survive air pollution … by 1985 air pollution will have reduced the amount of sunlight reaching earth by one half.” Instead, driven partly by regulation and partly by innovation, both of which dramatically cut the pollution coming from car exhaust and smokestacks, ambient air quality improved dramatically in many cities in the developed world over the following few decades. Levels of carbon monoxide, sulphur dioxide, nitrogen oxides, lead, ozone, and volatile organic compounds fell and continue to fall.

In the 1980s it was acid rain’s turn to be the source of apocalyptic forecasts. In this case it was nature in the form of forests and lakes that would bear the brunt of human pollution. The issue caught fire in Germany, where a cover story in the news magazine Der Spiegel in November 1981 screamed: “the forest dies.” Not to be outdone, Stern magazine declared that a third of Germany’s forests were already dead or dying. Bernhard Ulrich, a soil scientist at the University of Göttingen, said it was already too late for the country’s forests: “They cannot be saved.” Forest death, or waldsterben, became a huge story across Europe. “The forests and lakes are dying. Already the damage may be irreversible,” journalist Fred Pearce wrote in New Scientist in 1982. It was much the same in North America: Half of all US lakes were said to be becoming dangerously acidified, and forests from Virginia to central Canada were thought to be suffering mass die-offs of trees.

Conventional wisdom has it that this fate was averted by prompt legislative action to reduce sulphur dioxide emissions from power plants. That account is largely false. There was no net loss of forest in the 1980s to reverse. In the US, a 10-year government-sponsored study involving some 700 scientists and costing about $500 million reported in 1990 that “there is no evidence of a general or unusual decline of forests in the United States and Canada due to acid rain” and “there is no case of forest decline in which acidic deposition is known to be a predominant cause.” In Germany, Heinrich Spiecker, director of the Institute for Forest Growth, was commissioned by a Finnish forestry organization to assess the health of European forests. He concluded that they were growing faster and healthier than ever and had been improving throughout the 1980s. “Since we began measuring the forest more than 100 years ago, there’s never been a higher volume of wood … than there is now,” Spiecker said. (Ironically, one of the chief ingredients of acid rain—nitrogen oxide—breaks down naturally to become nitrate, a fertilizer for trees.) As for lakes, it turned out that their rising acidity was likely caused more by reforestation than by acid rain; one study suggested that the correlation between acidity in rainwater and the pH in the lakes was very low. The story of acid rain is not of catastrophe averted but of a minor environmental nuisance somewhat abated.

The threat to the ozone layer came next. In the 1970s scientists discovered a decline in the concentration of ozone over Antarctica during several springs, and the Armageddon megaphone was dusted off yet again. The blame was pinned on chlorofluorocarbons, used in refrigerators and aerosol cans, reacting with sunlight. The disappearance of frogs and an alleged rise of melanoma in people were both attributed to ozone depletion. So too was a supposed rash of blindness in animals: Al Gore wrote in 1992 about blind salmon and rabbits, while The New York Times reported “an increase in Twilight Zone-type reports of sheep and rabbits with cataracts” in Patagonia. But all these accounts proved incorrect. The frogs were dying of a fungal disease spread by people; the sheep had viral pinkeye; the mortality rate from melanoma actually leveled off during the growth of the ozone hole; and as for the blind salmon and rabbits, they were never heard of again.

There was an international agreement to cease using CFCs by 1996. But the predicted recovery of the ozone layer never happened: The hole stopped growing before the ban took effect, then failed to shrink afterward. The ozone hole still grows every Antarctic spring, to roughly the same extent each year. Nobody quite knows why. Some scientists think it is simply taking longer than expected for the chemicals to disintegrate; a few believe that the cause of the hole was misdiagnosed in the first place. Either way, the ozone hole cannot yet be claimed as a looming catastrophe, let alone one averted by political action.

Repeatedly throughout the past five decades, the imminent advent of a new pandemic has been foretold. The 1976 swine flu panic was an early case. Following the death of a single recruit at Fort Dix, the Ford administration vaccinated more than 40 million Americans, but more people probably died from adverse reactions to the vaccine than died of swine flu.

A few years later, a fatal virus did begin to spread at an alarming rate, initially through the homosexual community. AIDS was soon, rightly, the focus of serious alarm. But not all the dire predictions proved correct. “Research studies now project that one in five—listen to me, hard to believe—one in five heterosexuals could be dead from AIDS at the end of the next three years. That’s by 1990. One in five,” Oprah Winfrey warned in 1987.

Bad as AIDS was, the broad-based epidemic in the Americas, Europe, and Asia never materialized as feared, though it did in Africa. In 2000 the US National Intelligence Council predicted that HIV/AIDS would worsen in the developing world for at least 10 years and was “likely to aggravate and, in some cases, may even provoke economic decay, social fragmentation and political destabilization in the hardest hit countries in the developing and former communist worlds.”

Yet the peak of the epidemic had already passed in the late 1990s, and today AIDS is in slow retreat throughout the world. New infections were 20 percent lower in 2010 than in 1997, and the lives of more than 2.5 million people have been saved since 1995 by antiretroviral treatment. “Just a few years ago, talking about ending the AIDS epidemic in the near term seemed impossible, but science, political support, and community responses are starting to deliver clear and tangible results,” UNAIDS executive director Michel Sidibé wrote last year.

The emergence of AIDS led to a theory that other viruses would spring from tropical rain forests to wreak revenge on humankind for its ecological sins. That, at least, was the implication of Laurie Garrett’s 1994 book, The Coming Plague: Newly Emerging Diseases in a World Out of Balance. The most prominent candidate was Ebola, the hemorrhagic fever that starred in Richard Preston’s The Hot Zone, published the same year. Writer Stephen King called the book “one of the most horrifying things I’ve ever read.” Right on cue, Ebola appeared again in the Congo in 1995, but it soon disappeared. Far from being a harbinger, HIV was the only new tropical virus to go pandemic in 50 years.

In the 1980s British cattle began dying from mad cow disease, caused by an infectious agent in feed that was derived from the remains of other cows. When people, too, began to catch this disease, predictions of the scale of the epidemic quickly turned terrifying: Up to 136,000 would die, according to one study. A pathologist warned that the British “have to prepare for perhaps thousands, tens of thousands, hundreds of thousands, of cases of vCJD [new variant Creutzfeldt-Jakob disease, the human manifestation of mad cow] coming down the line.” Yet the total number of deaths so far in the UK has been 176, with just five occurring in 2011 and none so far in 2012.

In 2003 it was SARS, a virus from civet cats, that ineffectively but inconveniently led to quarantines in Beijing and Toronto amid predictions of global Armageddon. SARS subsided within a year, after killing just 774 people. In 2005 it was bird flu, described at the time by a United Nations official as being “like a combination of global warming and HIV/AIDS 10 times faster than it’s running at the moment.” The World Health Organization’s official forecast was 2 million to 7.4 million dead. In fact, by late 2007, when the disease petered out, the death toll was roughly 200. In 2009 it was Mexican swine flu. WHO director general Margaret Chan said: “It really is all of humanity that is under threat during a pandemic.” The outbreak proved to be a normal flu episode.

The truth is, a new global pandemic is growing less likely, not more. Mass migration to cities means the opportunity for viruses to jump from wildlife to the human species has not risen and has possibly even declined, despite media hype to the contrary. Water- and insect-borne infections—generally the most lethal—are declining as living standards slowly improve. It’s true that casual-contact infections such as colds are thriving—but only by being mild enough that their victims can soldier on with work and social engagements, thereby allowing the virus to spread. Even if a lethal virus does go global, the ability of medical science to sequence its genome and devise a vaccine or cure is getting better all the time.

Of all the cataclysmic threats to human civilization envisaged in the past 50 years, none has drawn such hyperbolic language as people themselves. “Human beings are a disease, a cancer of this planet,” says Agent Smith in the film The Matrix. Such rhetoric echoes real-life activists like Paul Watson of the Sea Shepherd Conservation Society: “We need to radically and intelligently reduce human populations to fewer than one billion … Curing a body of cancer requires radical and invasive therapy, and therefore, curing the biosphere of the human virus will also require a radical and invasive approach.”

On a “stinking hot” evening in a taxi in Delhi in 1966, as Paul Ehrlich wrote in his best seller, The Population Bomb, “the streets seemed alive with people. People eating, people washing, people sleeping. People visiting, arguing, and screaming. People thrusting their hands through the taxi window, begging. People defecating and urinating. People clinging to buses. People herding animals. People, people, people, people.” Ehrlich’s conclusion was bleak: “The train of events leading to the dissolution of India as a viable nation” was already in progress. And other experts agreed. “It is already too late to avoid mass starvation,” said Denis Hayes, organizer of the first Earth Day in 1970. Sending food to India was a mistake and only postponed the inevitable, William and Paul Paddock wrote in their best seller, Famine—1975!

What actually happened was quite different. The death rate fell. Famine became rarer. The population growth rate was cut in half, thanks chiefly to the fact that as babies stop dying, people stop having so many of them. Over the past 50 years, worldwide food production per capita has risen, even as the global population has doubled. Indeed, so successful have farmers been at increasing production that food prices fell to record lows in the early 2000s and large parts of western Europe and North America have been reclaimed by forest. (A policy of turning some of the world’s grain into motor fuel has reversed some of that decline and driven prices back up.)

Meanwhile, family size continues to shrink on every continent. The world population will probably never double again, whereas it quadrupled in the 20th century. With improvements in seeds, fertilizers, pesticides, transport, and irrigation still spreading across Africa, the world may well feed 9 billion inhabitants in 2050—and from fewer acres than it now uses to feed 7 billion.

In 1977 President Jimmy Carter went on television and declared: “World oil production can probably keep going up for another six or eight years. But sometime in the 1980s, it can’t go up anymore. Demand will overtake production.” He was not alone in this view. The end of oil and gas had been predicted repeatedly throughout the 20th century. In 1922 President Warren Harding created the US Coal Commission, which undertook an 11-month survey that warned, “Already the output of [natural] gas has begun to wane. Production of oil cannot long maintain its present rate.” In 1956, M. King Hubbert, a Shell geophysicist, forecast that gas production in the US would peak at about 14 trillion cubic feet per year sometime around 1970.

All these predictions failed to come true. Oil and gas production have continued to rise during the past 50 years. Gas reserves took an enormous leap upward after 2007, as engineers learned how to exploit abundant shale gas. In 2011 the International Energy Agency estimated that global gas resources would last 250 years. Although it seems likely that cheap sources of oil may indeed start to peter out in coming decades, gigantic quantities of shale oil and oil sands will remain available, at least at a price. Once again, obstacles have materialized, but the apocalypse has not. Ever since Thomas Robert Malthus, doomsayers have tended to underestimate the power of innovation. In reality, driven by price increases, people simply developed new technologies, such as the horizontal drilling technique that has helped us extract more oil from shale.

It was not just energy but metals too that were supposed to run out. In 1970 Harrison Brown, a member of the National Academy of Sciences, forecast in Scientific American that lead, zinc, tin, gold, and silver would all be gone by 1990. The best-selling book The Limits to Growth was published 40 years ago by the Club of Rome, a committee of prominent environmentalists with a penchant for meeting in Italy. The book forecast that if use continued to accelerate exponentially, world reserves of several metals could run out by 1992 and help precipitate a collapse of civilization and population in the subsequent century, when people no longer had the raw materials to make machinery. These claims were soon being repeated in schoolbooks. “Some scientists estimate that the world’s known supplies of oil, tin, copper, and aluminum will be used up within your lifetime,” one read. In fact, as the results of a famous wager between Paul Ehrlich and economist Julian Simon later documented, the metals did not run out. Indeed, they grew cheaper. Ehrlich, who claimed he had been “goaded” into the bet, growled, “The one thing we’ll never run out of is imbeciles.”

Over the past half century, none of our threatened eco-pocalypses have played out as predicted. Some came partly true; some were averted by action; some were wholly chimerical. This raises a question that many find discomforting: With a track record like this, why should people accept the cataclysmic claims now being made about climate change? After all, 2012 marks the apocalyptic deadline of not just the Mayans but also a prominent figure in our own time: Rajendra Pachauri, head of the Intergovernmental Panel on Climate Change, who said in 2007 that “if there’s no action before 2012, that’s too late … This is the defining moment.”

So, should we worry or not about the warming climate? It is far too binary a question. The lesson of failed past predictions of ecological apocalypse is not that nothing was happening but that the middle-ground possibilities were too frequently excluded from consideration. In the climate debate, we hear a lot from those who think disaster is inexorable if not inevitable, and a lot from those who think it is all a hoax. We hardly ever allow the moderate “lukewarmers” a voice: those who suspect that the net positive feedbacks from water vapor in the atmosphere are low, so that we face only 1 to 2 degrees Celsius of warming this century; that the Greenland ice sheet may melt but no faster than its current rate of less than 1 percent per century; that net increases in rainfall (and carbon dioxide concentration) may improve agricultural productivity; that ecosystems have survived sudden temperature lurches before; and that adaptation to gradual change may be both cheaper and less ecologically damaging than a rapid and brutal decision to give up fossil fuels cold turkey.

We’ve already seen some evidence that humans can forestall warming-related catastrophes. A good example is malaria, which was once widely predicted to get worse as a result of climate change. Yet in the 20th century, malaria retreated from large parts of the world, including North America and Russia, even as the world warmed. Malaria-specific mortality plummeted in the first decade of the current century by an astonishing 25 percent. The weather may well have grown more hospitable to mosquitoes during that time. But any effects of warming were more than counteracted by pesticides, new antimalarial drugs, better drainage, and economic development. Experts such as Peter Gething at Oxford argue that these trends will continue, whatever the weather.

Just as policy can make the climate crisis worse—mandating biofuels has not only encouraged rain forest destruction, releasing carbon, but driven millions into poverty and hunger—technology can make it better. If plant breeders boost rice yields, then people may get richer and afford better protection against extreme weather. If nuclear engineers make fusion (or thorium fission) cost-effective, then carbon emissions may suddenly fall. If gas replaces coal because of horizontal drilling, then carbon emissions may rise more slowly. Humanity is a fast-moving target. We will combat our ecological threats in the future by innovating to meet them as they arise, not through the mass fear stoked by worst-case scenarios.

Matt Ridley is a columnist for The Wall Street Journal and the author, most recently, of The Rational Optimist: How Prosperity Evolves.

Wednesday, 15 August 2012

Pull Back the Veil to Reveal...

  -  Pauline Kael

  -  Mal, Inception

  - Inscription from a statue of Isis

Today, I read an article about how overconfident people succeed, frequently in spite of the limits of their talents.  Promotions, monetary success, etc. aren't about skills in many fields, ranging from sales to politics to polling - it's about confidence.  When these people land in positions beyond those their skills suit - say, management - they can actually impede the success of their direct reports.  Mistakes aren't owned; they trickle down.  Underlings get thrown under the bus, which continues to chug along the wrong trajectory.

I have also written about the cognition of confidence - people who are expert at a given task tend not to be hyper-confident.  There is always doubt that perhaps they've missed something in their analysis, that a better example could have been found, and so on.  This is how expertise is developed - uncertainty and the need to do better drive these individuals to push beyond the limits of what they know.

If you're overconfident, there's no need to pursue new facts that will cloud the issue - you have the answer, the ability, the skill without modification.  You just need to keep doing what you know.   Which is why overconfidence is the enemy of expertise.

Ignorance is a veil blocking off the dark corners of our mind.  When we pull back those veils, we find new things, new facets, new capacities to build and connect ideas - then communicate them.  When we're overconfident in the model of ourselves we've established, we're telling ourselves we've already pulled back all the curtains; there's nothing left to discover.  We're deluding ourselves that we're the best at what we do, that we know more than others, that the answers are ours - that if victory were solely up to us, it would be assured.

Yet nobody's perfect; there is always room for improvement.  The right push, the right crisis - the right motivation will shake our worlds enough to show the impermanence of the walls we've established as safe ground, forcing us to lift another veil.

Dreams make for another great metaphor.  By their nature, dreams are removed from reality and include impossible detail, but we don't question that while we're in the dream.  When we awaken, it becomes clear to us that the things we accepted could not be - nuance was missing, detail, texture.

Such is perception.  We imbue what we perceive with emotional value - it is that internally described value, not the nature of the thing perceived, that shapes our judgement.  On the surface, the employee who underperforms is incompetent and that's all there is to it, or the boss is a superficial egomaniac who cares less about results and more about flaunting their position.  One level deeper, the employee might have been sold that they are something they are not - and the same goes for the boss.  There might be family issues, health issues, all kinds of issues that shape who that person is and what lineage they bring to a given moment of contact.  The same basic principle applies to someone who looks in the mirror and "feels fat" despite what anyone tells them; it's the feeling of the thing, not the thing itself, that resonates.

You can say that's all irrelevant - all that matters is the bottom line or the pay cheque.  You can tell yourself to keep home and personal and work and social lives separate, but there's just one person in whom those worlds are equally grounded - you.  In practice, you find that it's hard to keep those worlds from collapsing in on each other. 

When you cling to your confidence and try to deny the possibility that reality has more depth than you've credited it with, confidence can quickly turn into fear of the unknown and, in response, anger.  When you come to accept that you are not perfect and have much to learn, you tear down another veil.

Maybe you don't like the metaphors; maybe allegory isn't your thing.  Try it this way:

Pay no attention to what's behind the curtain.

Tuesday, 14 August 2012

My Favourite Book

The ultimate journey - and I do love my road trips.

I only wish I understood Japanese so I could understand it more fully.

Focus: Waste, Society and Discipline

Philosophers like to ask what it is that separates people from animals; we have created civilization, after all, where lesser mortals have not.  Surely something fundamental sets us apart from and above the rest of the ecosystem.

Whatever it is, it's not much.  We have far more in common with our fellow animals than we would like to admit.  Genetically, there's only about a 1.3% divergence between human DNA and that of our closest relative, the bonobo.  There's even less of a difference between the bonobo and the chimpanzee - a divergence thought to have resulted from the formation of the Congo River, which separated the two populations.  It should come as no surprise that geography played a role in creating distinctions between species; within our own species, geography has played a huge role in influencing cultural development.

Is there a logical connection between rough, mountainous terrain and tribalism?  Of course there is; in tough terrain, you need to be tough to survive; the triggering of survival-of-the-fittest hard-wiring also fires up territorialism, single-minded focus and a certain degree of fatalism.  Is there a connection between arid landscapes and the deification of ancestors?  There is - dry terrains mummify bodies, helping them retain life-like qualities after death.  Mummification has been understood and practiced the world over for more-or-less the same reasons.  When your ancestors are always around, watching over your shoulder in supernatural form, you gain a new respect for your place in the continuum of history; personal motives naturally gravitate towards the notion of maintaining or expanding the family legacy.  Dan Gardner wrote an article on the relevance of culture to work ethic; take that a step further and you can legitimately say that geography informs culture as much as culture informs behaviour.

These are all factors that are beyond our control; we do not decide into which family in which country we are born; all we have to decide is what to do with the time that is given to us.  But even then, how many of our decisions fall under our conscious control?  We like to focus on teenagers as exemplars of the short-sighted, potentially destructive behaviour embodied by the notion of sex, drugs and rock and roll.  Yet they're hardly the only ones.  Adults get into sex scandals, say things without thinking through the consequences and then try to justify or deny them after the fact, drive unsafely and do all kinds of things that can be immediately and personally detrimental or, in aggregate, socially (and therefore personally) detrimental.

The European Debt Crisis is a recent example, but you can look to the Great Fire of London or the Bubonic Plague for others.  Littering would also fit into this category.  Indeed, the history of waste management is the story of urban development and the increased centralization of coordinated authority - government - to manage the collective risks of waste.  It makes sense to ban activities like throwing waste into the street if the collective risks of such behaviour include things like plague.  Sneezing without covering your mouth, sudden lane changes and not moving to the back of a bus that's quickly filling up are variations on the same theme.  While we frame issues like getting inoculations as matters of personal choice, each of these issues has consequences for everyone else.  We provide social programming like EI or public healthcare for the same reason; you can put as much emphasis on individual responsibility as you want, but the fact remains that when people don't act with society in mind, the consequences are faced by the group, not just the individual.

Of course, in sparse populations, these collective risks are minimal.  If a couple of farmers have a pistols-at-dawn duel, there's less chance of a stray bullet hitting someone else.  If a nomadic people dump their waste out their front doors, those doors will move; the risks of contagion are reduced.  So it is that our ape cousins and other species are less concerned about their waste; they produce far less and are less likely to sleep where they dump.  Their genetic programming (which we share so much of) doesn't account for things like planning waste disposal or being mindful of the two-step consequences of actions like public littering or passing on viruses by sneezing without covering.

People litter for the same reason that any species litters - the overwhelming drive of our genetic programming is to not waste energy on activities that aren't of personal benefit.  Call it the genetics of laziness; not something we develop, but something we overcome.  Litter has no relevance as a concept until the waste we produce grows to a certain quantity.  Humans have been urban for a scant speck of time; our genetic programming still lags behind the needs of communal living.  Cleaning up after ourselves isn't instinctive, as any parent can tell you - it's learned behaviour.  The same holds true of not engaging in risky behaviour that could bring discomfort (or dishonour) to one's family or tribe; in the absence of an internalized awareness of social consequence, bad behaviour is more likely to flourish.  This is as true of the human animal as it is of others.

There is a social tool that we have developed that helps us bridge the gap between biological drives and social needs - discipline.  Key to developing any craft, discipline is the ability to ignore discomfort, distraction or a lack of clarity on end product and work through a task regardless.  Discipline allows us to push beyond what we are inclined to do and explore what we are capable of, including considering eventual consequences. 

Of course, humans aren't the only animals capable of discipline. Dogs, birds, elephants, etc. can be trained to wait or perform tasks if the right carrots and sticks are employed. What perhaps makes us unique is the ability to discipline ourselves, internally, without need of external motivation.  Whereas trained animals perceive a path - a leads to b - at our best, people are able to perceive the bigger picture and be aware of content, context and consequence of actions yet to be performed.  Miyamoto Musashi, author of one of the world's favourite books on strategy, described the features of mindfulness thusly:

1. Do not think dishonestly.
2. The Way is in training.
3. Become acquainted with every art.
4. Know the Way of all professions.
5. Distinguish between gain and loss in worldly matters.
6. Develop intuitive judgement and understanding for everything.
7. Perceive those things which cannot be seen.
8. Pay attention even to trifles.
9. Do nothing which is no use.

Musashi dedicated his whole life to being the greatest warrior in Japan.  Refusing to accept any limitation, Musashi broke down one personal barrier after another.  By the end of his journey, Musashi had realized, like so many others, that the greatest enemy he would face wasn't another swordsman or even an army, but his own ignorance. 

When we become mindful of content, context and consequence, we gain insight into the actions of others as well as the broader ramifications of our own actions.  With this insight, we start to act in a more pro-social, strategic way. 

If there is one thing that separates us from them, it would be consciousness.

Monday, 13 August 2012

CFN - The Politics of Responsibility – From Alaskan Airlines to Adam Carroll

Politics, of course, isn’t about doing what’s right – it’s about doing what it takes to get ahead. The public might decry attack ads, but they will continue to be used because they’re effective. It might frustrate us when partisan groups get the gravy and friends of politicians get appointments, but if we’re not voting, not donating and not engaging in the political process, what else can we really expect? Sir John A. Macdonald used to say “Give me better wood and I will make you a better cabinet” – we can only expect our democracy to represent our interests when we actively participate in the process. But then again, when politicians of all stripes regularly ignore the public trust, how can we possibly expect contributions to have an impact? It’s a chicken and omelette kind of thing.

El Lechero

Do not merely pinch off the leaves
Or concern yourself only with the branches

Profit Isn't Everything

The West sees that prosperity is the answer to all problems.  Duh.  We just need to sell that idea to everyone else.

I hear this argument all the time from friends with a libertarian bent; money is everything.  If you have no unions or collective to back you, you're free to make your own choices.  If there's no social safety net or unions to fetter the market, people will be free to go out and get real jobs; the net restricts them from realizing their full potential.  Competition between employers and employees will provide the right tension that will see prosperity fill the pockets of everyone. 

No wait, scratch that - it's money and free choice, not competition, that is the root of all that is good.  If people were forced to think through every piece of minutiae themselves, they would not just earn money, but use it wisely.  Everyone would automatically know to get the right insurance in case of home fires or physical accidents, and if they should ever have children or parents with significant special needs, these employees will have become so valuable independently that employers will compete with each other over benefits packages to support those loved ones.  Or, the employees will have enough money to pay for whatever services are needed.  Funny how people who don't believe in handouts still line up for them when they're in need.  They're lucky there's a system in place to help them, but then again - that's how society works.

It's actually these libertarians who are the ones wearing rose-coloured glasses.

It seems to me that in their mind, sloth and crime are the products of socialism, and resources are unlimited.  Nature is an inconvenience that can be resolved through financial planning alone.

Droughts, famines, floods?  The damage doesn't matter, if you're covered; if anything, you can buy food  and such from somewhere else.  Simple.  If there are criminals, well, you get rid of them - survival of the fittest.  Illness?  We can all buy our way to health - not that we'd want to support each other in the case of disease; it's up to the sick to plan for their own concerns.  Unfettered free markets and complete individual independence will solve everything.

Only, resources aren't unlimited; if you cut down all the trees in one geographic location - say, Easter Island - then they don't grow back.  You can kill a civilization that way.  If you want to see what unfettered individual competition looks like, drive on the 401: people changing lanes without looking, riding bumpers, taking unnecessary risks without even realizing they're doing so because they're focused on getting themselves ahead and not minding the traffic.  Selfish driving behaviour is the number one cause of gridlock, but that behaviour is shaped by Western notions of time pressure.  Time is money, etc., so you need to rush to get things done, and with commute times, that becomes difficult.  How much productive time gets lost to commutes and gridlock?

Dan Gardner wrote a piece recently on culture where he referred to the success of Chinese immigrants and said there's a cultural element to that success.  He's not wrong on that - but does that culture focus on a profit motive?  A generalization, to be sure, but Chinese cultures place a greater emphasis on lineage, family and face - something we don't quite seem to get here in the West.  Face isn't about pride, because pride accepts no wrongs; if you can talk or buy your way out of a problem, then there's no problem.  Face includes shame - crippling shame that gets brought not only to oneself, but to one's family, school, etc. 

You could very easily argue, then, that the most "successful" cultures aren't focused on wealth-accumulation and therefore, aren't very free market.

In his piece, Solberg talks about the health crisis.  Huge problem, everyone talks about it, nobody knows how to solve it.  So let's just stop talking about it and move on?

The grand irony is, a lot of inspiration is being taken these days from the First Nations tradition of the medicine wheel; seeing the connections between body and mind (physical and mental health) and the need to look at problems holistically.  An emphasis is put on nurturing individual responsibility and collective support - so, what we have now, only more in-depth and less siloed.  In short, the opposite direction of where the free marketers think healthcare should go.

Which brings up another problem - what of those who don't want to choose unfettered capitalism?  What of those who freely choose a more holistic approach to society?  Should they be forced to adopt a Western model of prosperity?  Harper's already apologized for Residential Schools; surely he doesn't want to add his own chapter to the sad story of Western imperialism. 

The fact is, there are many ways to live and govern; free market capitalism is, much like communism, a system that can only work if everyone buys in and thinks along a certain line.  That will never happen.  It's a paper system that can never hold.