

Saturday, 30 June 2012

False Dichotomies and The Genetics of Politics




Actually, Lakoff oversimplifies at first, but then points out that a clear split in left/right politics is simply another false dichotomy.  There are "hard-core Conservatives" who believe in charity and also desperately want people to be able to succeed, independently, which is different from not caring about what happens to them.  There are liberals who believe in the principle of moving forward together, but don't practice what they preach.

In reality, there's a bit of each perspective - survival-of-the-fittest and success of the collective - in each of us.  Where does the balance lie?  Can it change?  It can change, because of a thing called neuro-plasticity.  Where it resides is in our noggins - hypothalamus, amygdala, neo-cortex.  A process called cognitive behavioural therapy - kind of a physiotherapy for your brain - allows you to consciously repattern your thoughts so that you are less reactive and have more active control over how you feel and how you respond to the world.

Of course, you don't want to become too analytical, or you lose your capacity for action.  But you don't want to be shooting-from-the-hip, either.  So where's the sweet spot?

It's in keeping both our biological evolutionary and social evolutionary drives in check.

Put in other words, it's where the wise folk of history have always said it was - balance lies in the centre.





Leadership: Do the Right Thing





Whether we like it or not, disasters are social affairs.  The people directly impacted feel the full brunt of an earthquake, a tornado, or a man-made disaster.  Lives are lost, livelihoods are destroyed, people are shaken to their very core.  In the face of disaster, one is justified in wondering how any individual can hope to carry the burden of loss moving forward.

The people directly impacted face the worst, but the rest of us are reminded that we are just as vulnerable.  When someone gets shot in a public setting, we realize it could easily have been us.  When a mall roof collapses, we think, "my kids shop there."  The illness of another reminds us that we are all mortal.

In times of tragedy, whether ours or someone else's, we all feel a little more vulnerable.  We tend to feel more giving, too - it's a genetic thing.  But how?  To whom?  For those who have fallen, who amongst us will be there to extend a hand of support?

That's where leaders come in.  True leaders bring people together, remind us we are not alone and help us rise up, together.  No man is an island - true leaders bridge the gap.

In our wired world, it can be hard to do the right thing without thinking of potential self-promotional opportunities.  Every negative or questionable thing leaders do gets dissected and broadcast on the big stage; it's hard to resist the temptation to cast light on the good stuff one does.  In fact, you might say it takes a certain degree of selflessness to do the right thing solely because it is the right thing.

Which brings me to the Highway of Heroes.  Does anyone remember the big, splashy event and after-shock releases that came with the announcement that Premier McGuinty was dedicating a section of the 401 in honour of deceased Canadian veterans?  I'd be surprised if you did - there weren't any.

Despite the media gains that could potentially have come from such an act, Premier McGuinty simply did it because it was the right thing to do.  While he doesn't always get this right, I'd have to say that Dalton McGuinty is the most consistent leader we have in terms of doing the right thing for no reason other than that it's the right thing to do.

For all the stuff he does publicly that people either admire or despise, McGuinty is constantly doing the little things that inspire.  If you look on any social media outlet, you'll see the odd mention or picture of him pop up in places that aren't splashy, don't gain him any credit publicly, but are indicative of a man who truly cares.

Which brings me back to Elliot Lake.  In the face of such a horrific tragedy, the Premier had two responsibilities - to the people who suffered, to remind them they are not alone, but also to the people of Ontario, to make it clear that in times of crisis, we are all in this together and that we will move forward, together.

In my opinion, McGuinty had to reach out to the families.  He had to speak to the community leaders and offer his support.  Those things didn't have to be publicized; we didn't necessarily need to know he had spoken to the families - only they needed to know their Premier cared.  When it comes to provincial solidarity and seeking ways to avoid such tragedies in the future, that does have to be public.  The whole province needs to know that someone is looking out for our long-term interests.

Sure, it's possible that best intentions will be misconstrued.  It's quite likely that opposing interests will hammer you in the worst manner possible, no matter what you do - though it doesn't have to be that way.

True leadership isn't about personal wins.  For a true leader, how they are viewed comes second; how their people fare always comes first.  Which is why real leadership is a lonely business.



Friday, 29 June 2012

When Innovation Fails (by John Kotter)






First, we lacked enthusiasm. None of us went into the process with a sense of passion that we could, or even should, find something new and different. We were appointed, and as we all know, when you are tapped on the shoulder, you don’t say no, lest you jeopardize your next promotion.

Second, we were all senior. Way too senior. Yes, a senior leader’s perspective is useful, but much more important is that of the many customer-facing people who are dealing with these issues every day, trying to fix what is broken, seeing what others are doing, and who know (almost without thinking about it) what is wrong with how the company does what it does. They have less to lose by challenging and questioning, and it doesn’t form the basis of what made them successful. Conversely, how eager are leaders to challenge that which made them successful in the first instance?

Third, no-one knew what we were trying to do. Who cared if we had spent 100 hours working on building something new and exciting? The other 14 people, maybe. And those who did know about our little initiative? They looked at us as though we had all the answers; by being placed on the “Innovation Board,” we were seen as the innovative thinkers — and, as a result, others were left feeling that they lacked what it took to generate fresh, game-changing ideas. How wrong.

The way we tried to create innovation — organizing, prescribing, and delegating it — didn’t work. Yes, these groups may improve things — maybe even a lot of things — but rarely is this “innovation.” Producing the next product (like the iPod) or creating a new market (Instagram, without a big team) that had never existed before — that is innovation. Microsoft creating the Surface — not so much innovation, and more catch up? Thinking once an innovator, always an innovator — not a path to success.


1) Use only volunteers. Who in your organization feels excited about trying to build that next product or market? Let THEM — the people on the front lines, who are doing their day jobs, who happen to know your business better than anyone else — into the process. Jim Whitehurst of Red Hat figured this out. Others have too.

2) Sponsor an environment that encourages risk-taking. It’s fantastic if an employee takes the initiative and risks trying to innovate. What do you do if they fail? Are they punished? You can be certain that if an employee is punished for taking a risk toward innovation, other employees will not be taking any risks.

3) Make time for brainstorming. It takes time, and sometimes it takes creating a particular environment to get employees into an innovative mindset.
This approach will foster some truly transformative ideas, and there will be an enthusiastic base of supporters ready to turn those ideas into action.

Now that’s a novel approach. Maybe, dare I say, an innovation.


Russell Raath is a senior engagement leader at Kotter International, a firm that helps leaders accelerate strategy implementation in their organizations.

You can't blame me for enjoying it when what I say fits in with what smarter people are suggesting.  But either way, there it is.

IDIC





Incremental change through recombinations.  Idea sex, when you get down to it, follows the same principles as the more physical kind.  What's key to this evolutionary process


X marks the spot.


The Sanity of Truth







And, as they say - the truth will set you free.


Thursday, 28 June 2012

Forget Edison: This is How History's Greatest Inventions Really Happened (by Derek Thompson)



The myth of the solitary inventor -- in 8 short stories

The world's most famous inventors are household names. As we all know, Thomas Edison invented the light bulb, Alexander Graham Bell invented the phone, and Eli Whitney invented the cotton gin.
Except they didn't. The ideas didn't spring, Athena-like, fully formed from their brains. In fact, they didn't spring fully formed from anybody's brains. That is the myth of the lonely inventor and the eureka moment.
"Simultaneous invention and incremental improvement are the way innovation works, even for radical inventions," Mark A. Lemley writes in his fascinating paper The Myth of the Sole Inventor. Lemley's paper concentrates on the history and problems of patents. But he also chronicles the history of the 19th and 20th century's most famous inventors -- with an emphasis on how their inventions were really neither theirs, nor inventions. Here is a super-quick summary of his wonderful distillation of the last 200 years in collaborative innovation.
COTTON GIN

The fabric cotton comes from cotton fibers that mix with seeds in the pods of cotton plants. To make the fabric, therefore, you have to separate the fibers from the seeds. For centuries this was done mostly by hand, until Eli Whitney "invented" the cotton gin in 1793. But various forms of roller gins (i.e. technologies for separating fibers from seeds) had been around for thousands of years. Five years earlier, in 1788, Joseph Eve developed his own mechanized self-feeding roller gin. Whitney's true innovation was to improve existing cotton gins by "replacing rollers with coarse wire teeth that rotated through slits to pull the fiber from the seed." If this insight was a breakthrough, the glory goes to Whitney only because he was faster than his competitors. In 1795, John Barcley filed a patent on a gin featuring circles of teeth -- awfully similar to Whitney's wire-tooth model. In short, the modern cotton gin was a eureka moment that multiple inventors experienced nearly simultaneously and was expedited by their competition.

TELEGRAPH

As the tale goes, Samuel Morse was having dinner with friends and debating electromagnetism (like you do) when he realized that if an electrical signal could travel instantly across a wire, why couldn't information do the same? Like most fun eureka stories, it's a fib. The telegraph was invented by not only Morse, but also Charles Wheatstone, Sir William Fothergill Cooke, Edward Davy, and Carl August von Steinhiel so near to each other that the British Supreme Court refused to issue one patent. It was Joseph Henry, not Morse, who discovered that coiling wire would strengthen electromagnetic induction. Of Morse's key contribution -- the application of Henry's electromagnets to boost signal strength -- Lemley writes that "it is not even clear that he fully understood how that contribution worked."

TELEPHONE

Like Morse, Alexander Graham Bell invented a technology that would later bear his name. But how much did he deserve it? The problem that Bell solved was to turn electrical signals into sounds. But this was such an obvious extension of the telegraph that there were many people working on it. Philip Reis had already designed a sound transmitter in 1860, and Hermann Ludwig Ferdinand von Helmholtz (one guy) had already built a receiver. Bell's real contribution was "to vary the strength of the current to capture variations in voice and sound," Lemley writes. In this tweak, he was racing against Thomas Edison. Even Bell's final product -- which combined transmitter, fluctuating current, and receiver -- had company. Elisha Gray filed a patent application on the exact same day as Bell, only to lose the patent claim in court. Lemley's conclusion: "Bell's iconic status owes as much to his victories in court and in the marketplace as at the lab bench."

LIGHT-BULB

As just about everyone is taught, Thomas Edison invented the light-bulb. And as just about everyone later learns, Thomas Edison in no way invented the light-bulb. Electric lighting existed before him, incandescent light bulbs existed before him, and when other inventors got wind of Edison's tinkerings, they roundly sued him for patent infringement. So what did Edison actually do? He discovered that a special species of bamboo had a higher resistance to electricity than carbonized paper, which means it could more efficiently produce light. Edison got rich off the bamboo, and filthy disgusting rich from superior manufacturing and marketing of his product. But within a generation other inventors had developed better filaments and today's light-bulbs

THE MOVIE PROJECTOR

Most of these stories here are about how we mistake incremental improvements for eureka moments. But the story of the movie projector is simpler. It's basically a story about theft. Francis Jenkins built what we consider the ur-instrument of the motion-picture industry with a projector that showed strips of films for 1/24th of a second, creating the illusion of moving pictures. But his financial backer stole the Jenkins prototype and sold it to a theater chain, which called it the "Edison Vitascope" for no better reason than the word Edison was familiar and useful for branding. That Edison was tinkering with his own movie projector is true, but besides the point. His legacy here was mostly the work of a thief.

THE AUTOMOBILE

Today's cars bear the names of their founders and innovators: Benz, Peugeot, Renault. But have you ever heard of a Dodge bicycle? Or a Mercedes tricycle? In fact, both companies specialized in bikes before moving to autos. The car industry represents the epitome of incremental innovation. Take a tricycle. Add an engine. You've got a car. (Just look at the original Benz Motorwagen from 1885.) Condensing the invention of cars to those six words leaves out a lot of detail and a few main characters. It was Gottlieb Daimler and Wilhelm Maybach who designed the first four-wheel car with a four-stroke engine and Henry Ford who perfected the assembly line. But the long story short is that the car was a typical "invention" that was far too complicated for one person to conceive on his own.

THE AIRPLANE

Speaking of building bikes, that's exactly what Orville and Wilbur Wright did before they became the first team to fly a heavier-than-air machine. But, as we've learned, every great inventor stands on the shoulders of giants. When the Wright brothers asked the Smithsonian for all available information on the history of flight in 1899, they opened a history that had begun with DaVinci's scribbling and continued all the way to the 19th century gliders of Otto Lilienthal. But the Wrights solved one of the most nagging problems facing airplane developers -- stability -- by having "a single cable warp the wing and turn the rudder at the same time." That was the tweak that put the first plane in the air.

TELEVISION

The "Farnsworth Invention" was named after Philo T. Farnsworth, the nominal father of television. But his invention was neither his nor an invention. Teams of scientists and tinkerers all around the world were working to build, essentially, a radio for images -- i.e.: to combine the technology of a wireless telegraph with the magic of a movie projector. One key was the cathode ray tube, a vacuum with an electron gun that beams images onto screen that can receive or transmit signals. But the cathode ray tube itself has so many fathers that it's difficult to say exactly who invented even the central organ of the television, much less the television itself. In 1927, Farnsworth projected a straight line on a machine he called the Image Dissector, which is truly the basis for the all-electronic television. But, unlike Edison, he was not as gifted at marketing, producing, and becoming a household name for his tweak. "It may be accurate to describe Farnsworth as an inventor of the television, but surely not as the inventor," Lemley writes.

***


At the end of this section, Lemley lists four inventors who, yeah, okay, really were alone. But the funny thing about the exceptions is that they're almost all accidents.

Alexander Fleming discovered the anti-bacterial properties of penicillin because a sample of bacteria had accidentally been contaminated with mold. No one is sure where the mold came from; Fleming's discovery was true serendipity. Even in that case, there is some evidence that others made the same accidental discovery.

The adhesive behind the Post-It note was developed in 1968, and languished in 3M for six years before a different 3M employee hit on the idea of putting it to use attaching a bookmark to a book.

Charles Goodyear discovered vulcanized rubber when a batch of rubber was accidentally left on a stove; Goodyear had previously thought that heat was a problem for rubber, not the solution.

Wilson Greatbatch developed the pacemaker when he accidentally grabbed the wrong resistor from a box when he was completing a circuit.

Louis Daguerre invented film when, having failed to produce an image on an iodized silver plate, he put the plate away in a cabinet filled with chemicals and the fumes from a spilled jar of mercury produced an image on the plate.
It would seem that eureka is Greek for "oops."

Images above from top: the cotton gin patent by Eli Whitney; a Morse key; Edison's patent; the Benz patent; the Wright brothers take off, 1905; All credit: Wikimedia Commons

How do you tell when the news is biased? It depends on how you see yourself (by Jonathan Stray)


Jonathan might be Stray, but he's certainly not LOST.


Does the quest for balance in news stories open journalists up to claims of bias? It’s all about the framing.


Take a moment with the headlines from this screenshot of The New York Times homepage from January. Really — it’s a little experiment. Click the image above for a larger view if you need to.

Ready?

How did you feel about these headlines? Does it matter to you to learn that they actually came from Fox News on the same day? (Screenshot for proof.) This faux home page was created by Dan Schultz, the MIT grad student also responsible for Truth Goggles, using his NewsJack point-and-click “remixer.”

Knowing what you know now, do these headlines seem different to you? If so, you’ve just proved that we detect and judge bias based on things other than what journalists actually write.

This effect has been noticed before. At the University of Michigan, William Youmans and Katie Brown showed the same Al Jazeera English news clip to American audiences, but with a catch: Half saw the news with its original Al Jazeera logo intact, and half saw the same video with a CNN logo instead. Viewers who saw the story with the original Al Jazeera logo rated Al Jazeera as more biased than before they had seen the clip. But people who watched the same footage with the fake CNN logo on it rated CNN as less biased than before!

Does this mean that we judge “bias” by brand, not content? Many people have tried to define what media bias is, and attempted to measure it, but I want to try to answer a different question here: not how we can decide if the news is biased, but how each of us actually does decide — and what it means for journalists.

The hostile media effect

During the Lebanese civil war in 1982, Christian militias in Beirut massacred thousands of Palestinian refugees while Israeli soldiers stood by. In 1985, researchers showed television news coverage of the event to pro-Israeli and pro-Arab viewers. Both sides thought the coverage was biased against them.
This effect — where both sides feel that a neutral story is biased against them — has been replicated so many times, in so many different cultural settings, with so many types of media and stories, that it has its own name: hostile media effect. The same story can make everyone on all sides think the media is attacking them.

Like a lot of experimental psychological research, the hostile media effect suggests we’re not as smart as we think we are. We might like to think of ourselves as impartial judges of credibility and fairness, but the evidence says otherwise. Liberals and conservatives can (and often do) believe the same news report is biased against both their views; they aren’t both right.

But why does this happen? Specifically, why does it happen for some stories and topics and not others? Discussion of climate change often provokes charges of bias, but discussion of other hugely significant science stories, such as the claimed link between vaccination and autism, usually produces a much smaller outcry.

You see bias when you see yourself as part of a group

Communications researcher Scott Reid has proposed that we can explain the hostile media effect through the psychological theory of self-categorization. This is a theory about personal identity and group identity, and it says that we “self-stereotype,” placing conceptual labels on ourselves just as we might make assumptions about other people. We all have multiple identities of this kind: gender, age, political preferences, race, nationality, subculture, and so on.

To test this, he performed a series of recently published experiments with American students. In the first, he used a survey to ask people whether they thought the media was biased, as well as their personal political orientations, both on a numerical scale from liberal to conservative. The catch was that different groups got different cover pages with different sets of instructions. The first set of instructions was neutral:
The purpose of this questionnaire is to get your views of the news media in general.
The second set of instructions was designed to play up feelings of partisanship:
In recent times the differences between Republicans and Democrats have become highly polarized. Many of the issues discussed in the media are seen very differently by Republicans and Democrats. In this context, it is important to gauge people’s views of the media.
The third set of instructions was also designed to reinforce an identity, but in this case an identity that might be common to both liberals and conservatives — that of being an “American” versus the rest of the world.
With increasing globalization, it has become apparent that the media differs across countries and cultures. Al Jazeera has become the voice for much of the Arab world, both within the United States and in the Middle East. Given these changes, it is important to gauge people’s views of the news media in the United States.
And, oddly enough, the same survey gave different results, depending on the instructions:


Each of the lines on this graph shows how people’s perception of bias varied with their political orientation. The downward slope means that the more conservative someone was — the farther to the right on the “political position” scale — the more they perceived the media as hostile to Republicans, just as expected.

The surprising thing is that the strength of this perception depended on the framing each group had been given. When people were prompted to think about Republicans and Democrats, they perceived more media bias against their views, as indicated by the steep dashed line. When they were instructed to think about America vs. the world, they perceived slightly less bias than in the neutral condition, as indicated by the shallow dotted line. Our perception of bias changes depending on the self-identity we currently have in mind.

This self-categorization explanation also predicts that people who are more partisan perceive greater bias, even when the news is in their favor. In Reid’s second experiment, people read an article about polling numbers for the 2008 presidential primaries, containing language like “among Republicans, former New York mayor Rudy Giuliani maintained a 14-point lead over Arizona Sen. John McCain for the Republican presidential nomination,” and similar statements about the Democratic candidates. This time, the source of the information was manipulated: One group saw the poll attributed to the “Economic Policy Institute, a Democrat think tank and polling agency,” while the other was told it came from the “American Enterprise Institute, a Republican think tank and polling agency.”

In this purely factual scenario — dry-as-toast poll numbers, no opinions, no editorializing — respondents still had completely different reactions depending on the source. As you might expect, people who believed that the poll numbers came from the American Enterprise Institute thought that the story was biased towards Giuliani (and vice versa), confirming the hostile media effect. But the perception of favoritism increased not according to whether the reader personally identified as Republican or Democratic, but on how strong this identification was. The implication is that if you feel strongly about your group, you’re likely to see all news as more biased — even when the bias favors you.

Reid’s final experiment tested perceptions of overt attacks. He used a scathing review of Michael Moore’s Fahrenheit 911, originally published on Slate, which begins:
One of the many problems with the American left, and indeed of the American left, has been its image and self-image as something rather too solemn, mirthless, herbivorous, dull, monochrome, righteous, and boring.
The copy given to subjects (falsely) claimed the author was a member of either a Democrat or a Republican think tank. (In reality, the author was the late Christopher Hitchens.) As you might expect, people who identified as Republicans saw the review as more neutral, regardless of who they thought wrote it. The strange thing is that strong Democrats actually saw the review as slightly in favor of Democrats when they believed it was written by a Democrat! We interpret criticism completely differently depending on how we see the relationship between ourselves and the author.

What’s a journalist to do?

The first defense against accusations of bias is to report fairly. But the hostile media effect pretty much guarantees that some stories are going to be hated by just about everyone, no matter how they’re written. I suppose this is no surprise for any journalist who reads the comments section, but it has implications for how news organizations might respond to such accusations.

This research also suggests that the longstanding practice of journalists hiding their personal affiliations might actually be effective at reducing perceived bias. But only up to a point: To avoid charges of bias, the audience needs to be able to see the journalist as fundamentally one of them. This might require getting closer to the audience, not hiding from them. If we each live inside of many identities, then there are many possible ways to connect; conversely, it would be helpful to know, empirically, under what conditions a journalist’s politics are actually going to be a problem for readers, and for which readers.

We might also want to consider our framing more carefully. Because perceptions of bias depend on how we are thinking about our identity in that moment, if we can find a way to tell our stories outside of partisan frames, we might also reduce feelings of unfairness. The trick would be to shy away from invoking divisive identities, preferring frames that allow members of a polarized audience to see themselves as part of the same group. (In this regard, the classic “balanced” article that quotes starkly opposing sides might be a particularly bad choice.)

Encouraging the audience to perceive itself as unified — this seems simplistic, or naïve. But the consideration of identity is foundational to fields like mediation and conflict resolution. Experimental evidence suggests that it might be important in journalism too.

Free Will in the Land of the Blind



Ready for it?  You don't really have free will.  At best you have semi-free will that is heavily influenced by factors beyond your control, whether you're conscious of it or not.

It's a topic that makes people squirm (or get angry); the idea that you don't have internal control can be a terrifying one.  How many people are more comfortable with the notion of a War of the Worlds than they are with the concept of Invasion of the Body Snatchers?  There's a reason why it's the Zombie Apocalypse we fear most.

The truly amusing part is that every hyper-confident element of society thinks they're exempt from this risk; they have it all figured out, they're the one-eyed kings, the Shauns of the Dead.  When disaster strikes, they're the ones who will escape - not be part of the zombie horde.  It rarely occurs to them that maybe, just maybe, there's another veil to be lifted, another connection they haven't quite made, but could.  Yet, almost cyclically, society becomes so fearful that these folk - the people in whom we have placed our trust - don't have the answers, that even they start to doubt that anyone knows what will happen next.

A bit of doubt is a positive thing; it keeps you questioning, challenging, pushing boundaries and peeking around corners.  The more conscious you are of the things that shape your perception, the more clearly you see; it's through that level of transparency that we truly understand who we are and what we're part of.  It's not the destination, but the journey, etc.

We don't live in a two-dimensional world of black and white; life is more like a sphere reflecting the full spectrum of light.  If you go inward deeply enough, you see where everything connects, at what Lao-tzu called Tao.  Science is still trying to quantify that space through the quest for a Theory of Everything.

In terms of cognition - it's there, at the centre, that you don't gain control but become control.  From the centre, all things are possible:


That's a path open to all of us - we just need to be conscious of the terrain.

UPDATE 15/11/14:  Fear the strigoi, fearsome not-quite-humans with a hive mind?  Exactly.



Creative Destruction: The Cognitive Dissonance of the Political Right



The Far Right advertise themselves as champions of free speech. They might even be inclined to say "I do not agree with what you have to say, but I'll defend to the death your right to say it" (although I don't know how many could tell you where the quote came from). It's funny, though - when others say things that these Libertarians of Speech don't agree with, it's not that uncommon for them to dismiss or insult those opinions. It's all competition, they'll say - fair game that results in stronger opinions.  In business, if you can't answer "so what," and do so in 30 seconds, your opinion has no value; you have no value.
But then, you have a Federal Conservative Government that is proactively stifling opinion (evidence-based opinion at that) that disagrees with their talking points. Of course, that's different - that's about austerity. There's no more money to be had, so we need to rein in spending.
But, if we're reining in spending, why the complete bungle that is the F35 scandal? Not only did the Harper government try to stifle the facts - they outright lied about them. That's not very free-speechy. Then you've got John Baird announcing increased funding to a global counter-terrorism fund - that doesn't seem very austerity-minded, either. Maybe it's all relative.
But again we have John Baird telling us we should stop pretending that everything is relative. Ah, but he was referencing terrorism, so maybe it's okay to be relative in, um, context.


But there is no such thing as context - Vic Toews made that clear to us when he said you're either "with us or with the child pornographers."

The CPC, you see, is all about protecting Canadians from external (like foreign money - not very free-trady of them) and internal (like socialists and separatists) threats in, I guess, the name of free, democratic, Canadian expression.  Except, John Baird proposed going over the heads of our democratic system and Stephen Harper has invited a well-recognized international threat onto our shores.


Is it just me, or is there a growing logic gap here?
The CPC and much of their base support are growing more frustrated by the hour with the way the world is playing out around them. They're externalizing this mounting anxiety onto others but really, they need to start looking inwards.

At the root of these confabulations of logic and the growing malaise within the CPC (and the country) is the notion of cognitive dissonance - what happens when two beliefs come into conflict. The Tory braintrust is very much in conflict between what they logically know is right and what they feel, emotionally, is right.

Fortunately for them (and all of us) there's a release mechanism out there that can ease the tension and help square the circle around this cognitive conundrum. It'll cost a bit, but less than some of their other recent spending sprees.  It'll also help the economy, improve security and start tackling the big challenges of today, many which are communication-based.  It would even provide a political win, which they sorely need.

We can only open the door for them, though - it's up to Harper and Co to walk through.


Obamacare, Justin Trudeau, Stability and Confidence




 What's more, there's another way of possibly interpreting the health-care decision, from a macroeconomic perspective, which is that maybe the Supreme Court ruling removes one very large uncertainty from the discussion.


This is one of the biggest, most head-smacking confabulations we come across in society.  People on the far right of the political spectrum are all about independent strength, competition, etc.  Direct, free-market competition is what drives economies forward, they say.  But then they talk about investor confidence and the need for measures of certainty.

Direct competition leads to nothing but uncertainty.  It's why campaigns matter.  Markets don't like uncertainty, challenges, etc; they want sure-thing investments.  Safety nets provide stability and stability is what feeds confidence.  In fact, the private sector is all about stability; that's why they plan, poll and invest in long-term benefits by doing such things as donating to politicians.  They want to know what's around the corner and, if they're smart, they try to stack the deck in their favour through outreach initiatives that, broken down, look an awful lot like altruism.

Despite what the Right likes to tell itself, politics isn't about strength of the fittest, nor are job competitions, modern warfare, etc.  Even in a one-on-one fight, strength isn't everything.  There's no question that Patrick Brazeau was physically stronger than Justin Trudeau, but he got his ass handed to him.  Why?  Easy; by focusing on strength, Brazeau discounted such things as endurance and Trudeau's tactical advantage of reach.

Trudeau won despite the physical odds because he planned ahead.  He trained against opponents of Brazeau's size, he focused on endurance, all the while knowing that his opponent wasn't engaging in any kind of strategic planning at all.  In short, Justin Trudeau studied the ground, understood himself and his opponent, and planned accordingly.  Stephen Harper did the exact same thing - he didn't win because of attack ads, he won because he courted demographic groups the Liberals were ignoring.  It was because the Harper Conservatives proactively invested in relationships with broader swaths of Canadians (while the Liberals were becoming "efficient" by neglecting increasing chunks of their base) that success was achieved.

There isn't a single example in history of one strong, stagnant, oppressive regime that has lasted forever.  It's simply not a possibility.  The more any government (of any stripe) tightens its grip, the more control slips through its fingers.  Reactive, individual competition is taxing.  Proactively, it's strategic collaboration that leads to growth. 

If you want to go fast, go alone - but if you want to go far, move forward together.



And yes, I did go out of my way to find a picture of Obama with the caduceus.  I do love me my metaphors...

Wednesday, 27 June 2012

The Global Mental Health Crisis: A Bubble Soon to Burst?



It has been recognized there is a global mental health crisis.  The OECD has mental health very much in mind.  Countries like Australia and the UK have recognized the fundamental truth that you can't have health without mental health; the mind, after all, is simply another system in the body.  Here in Canada, we have our first national mental health strategy - from a government that doesn't believe in nationalized anything.

That shows how seriously people at the highest levels take this emerging crisis.

Yet, economic times are tight and when there really is no more money to be spent, services like mental health are among the first to get trimmed.  Worse still; on the whole, people still don't get mental health.  The recent CAMH Defeat Denial campaign brilliantly sets out the common responses people with mental illnesses have.

Adding to the challenge - the way we look at work is completely counterintuitive to the way mental health functions.  Micro-managing bosses are convinced that their approach is both right and necessary; they have the gold, they make the rules, employees owe them for the right to work and therefore must be at their beck-and-call 24/7, since work cycles now run without end.  Have you ever told a micromanager that their "management style" is actually detrimental to business functioning and damaging your state of mental health?  If you don't get fired, you'll probably hear one of the defeat denial lines.

Which is why people suffer in silence.  Nobody wants to be judged, and everyone knows that if they speak out, that's exactly what will happen.  Heartbreakingly, that's often as true for the micromanagers (see micromanaging disorder) as it is for the micromanaged.  Because we're afraid of labels or are concerned about ourselves or the nation becoming overmedicated, we simply try to ignore the realities of mental health.  When we do that, we're missing the obvious, affordable but ultimately frightening solution that's really all that's left to us: culture change.

As a society, we need to rethink the way we view mental health - which will mean challenging some confabulated notions we have about identity and self-control.  Neuroscience already has the information - we just need to accept and internalize it.  We need to revamp the way we look at work and motivation.  We also need to understand cognitive development more closely and keep how individual brains work in mind as we develop our next models of education.

In short, we need to stop focusing on curing lead poisoning and start taking the lead out of our water.  This is all possible - we can make our systems work more efficiently, fostering better outcomes - if we do so consciously and proactively.

That's the challenge of the 21st Century - our cognitive equivalent to the labour movement that stemmed from the Industrial Revolution. 




Tuesday, 26 June 2012

Musings From the Subway: Was Ayn Rand a Prisoner of Her Own Limitation?

Because I've had Ayn Rand quoted at me a couple of times in the past few days, because I've got some time due to a canceled meeting and because I entertain easily, I am currently thinking about the concept of free speech. As I tend to think expansively, this also means considering freedom of expression and freedom of choice.

What is speech? Speech is a verbal form of communication. What is expression? Expression is the broader category of communicative tools, including body language, art, music, etc. What is choice? Choice is the ability to pick between diverse options to (attempt to) achieve a desired (conscious or subconscious) outcome.

But back to speech. For language to serve its purpose, it has to have structure. Sounds are ascribed to letters (or characters), concepts are defined by words, grammar puts the pieces together in comprehensible ways. There's only so much you can play with that structure if you want to maintain your intent. For instance, if I want to tell you "the dog bit the man" but choose to alter the word ordering and say "the man bit the dog" I'm not communicating my intent very well.

Now, the more words you know and the better you understand the nuances of grammar, the more freedom of expression you have. I could say "the big dog bit the frightened man" or "the man was bitten by the dog" - same intent, but expanded meaning or a different grammatical flavour. I could even go further - I could say it in a different language, say, Spanish.

A problem emerges there, though, because whereas "dog" implies no gender, I can't say the word in Spanish without gender clarification; dog will either be "perro" (masculine) or "perra" (feminine). Perra has broader meaning in Spanish, just as "bitch" does in English. If I say, "the bitch bit the man" I'd better have some context to be placing the phrase in, or I'm REALLY going off-message. Even within choice, there are certain constraints. And if I speak English, but NOT Spanish? Joder.

When you get down to it, we aren't really free to communicate our intent exactly - my ability to express myself is limited UNLESS I choose to learn Spanish. While that requires some discipline and training, learning a new language expands my ability to communicate - not only do I gain access to new audiences to speak freely with, but the variances of language will provide me with new ways of looking at concepts - again, expanding my options (choices) for expression.

But if I don't want to learn a new language? If my choice is to NOT expand my ability to express myself because I don't feel like I should have to learn a second language, what then? That's kinda limiting, but there are still options. Let's go back to the drawing board - literally. If I draw a picture of a dog biting a man, that solves a lot of my problems. It doesn't matter what language you speak, a picture is universal. A dog biting a man as a picture looks the same to a Masai tribesman as it does to the guy sitting next to me on the subway. So long as I have tools to draw, I can communicate with anyone.

That is, if I know how to draw. Problem is, art is like speech - it's not inherent, it's learned. I need exposure to materials, opportunity to practice and, of course, guidance helps - nobody learns their first language completely on their own. Without training, if I'm obstinate, I could probably come up with something serviceable. But - what if the person I'm trying to communicate with not only doesn't speak English, but is blind? I'm stuck again.

Unless, of course, I think three-dimensionally. If I can't speak to the person, I can't draw for the person, perhaps I can create a bas-relief (from the French) that allows the person I'm trying to communicate with to FEEL what I'm trying to communicate. If I'm not used to thinking outside the box, though - if I'm starting off with a limited number of communicative tools - that idea just might not occur to me.

There is always, always a way to communicate - it just takes a bit of experience, some creativity and a bit of training. Authors like James Joyce or Jack Kerouac broke literary convention and found ways to more fully express the textures of their intent. Filmmakers like Quentin Tarantino or Chris Nolan play around with the order of story within the three-act structure of a screenplay. Picasso broke ground by inventing several styles of art and even restructuring images like faces to capture more mood than literal representation. But each started within an established structure.

Of course, each of these individuals had specialized training, too; they learned their craft inside and out and voraciously consumed everything they could, both within their own field and beyond. They didn't ignore the existing structures; they mastered them, then added new facets. Along the way each also had relationships, partnerships, teachers and mentors. In one way or another, they were part of social institutions that allowed them access to the tools they needed to innovate and express.

These artists made their living off the sales of their works - but where did they sell them? To whom, through which processes? Again, they ascribed to a system that included supply-and-demand, demand-generation through networking, etc. None of their efforts were laissez-faire (there's that French again); each was proactive, driven. Each of the systems these artists ascribed to, including language, was governed by a set of rules that, while flexible, were necessary. Those systems provided options; options provide a greater variety of choices and choice, ultimately, is what allows for true freedom of expression.

Of course, you could very well decide you don't want to communicate with a Spanish-speaker (and instead ask if they speak American). You could choose to avoid deaf people, or people you don't understand or who don't agree with you (and call them dumb for not getting you in the first place). Really, you could choose to firewall yourself off from everyone that challenges you to think beyond your existing limitations, if that's your choice. But isn't that a choice that limits your options to choose and therefore express yourself?

If "freedom" implies the ability to express oneself freely, one would think that implies some sort of structure (family, community, society, language, commerce, the church, the state) and exposure (language, science, art, culture, math, etc) to nurture the tools of expression. It's the absence of supportive structure that's truly limiting.

I think the big mistake Rand made was in assuming that structures are imposed. She thought that people were independent units, completely separate from each other and that society was a ruse-word for domination. Yet, I don't think she grew her own food, knitted her own clothes or kept secure her own borders. She certainly didn't invent her own language.

Fundamentally, everything is structure. Ecosystems, solar systems, respiratory systems are all mini-societies that follow prescribed rules. They have to, otherwise they wouldn't function - just like language.

The route to true freedom of expression isn't through limitation, but access. The more you learn, the more you know. The more you collaborate, the greater your number of options. The more options you have, the greater becomes your suite of choices and THAT, truly, is what freedom (and society) is all about.

Perhaps if Rand had expanded her horizons a bit, she could have realized this. Alas, had anyone tried to explain this to her, she probably would have told them to fuck off.

Harper is No Lion



Funny how things connect.  I had just finished my last post about, essentially, the biology of behaviour and how aggressive behaviour requires broader energy conservation to be sustainable when I read the above piece on Harper's continued efforts to shrink government services yet expand personal control and institutions of security.

Neo-Cons want people to be tough or fade away; therefore, they reduce services expecting that the strong will survive.  Not much thought is given to those who are less able under whatever circumstances, but neo-Cons tend to be limited in scope like that.  Funny enough, neo-Cons see national defense and the aggressive positioning of Canada on the world stage as the only legitimate roles of government.

The more virulent these folk get, the more enslaved they become to their own genetics. 

Personally, I wouldn't call that freedom.

Rethinking Work: The Impact of Occupational Mental Health on Productivity

  








Of course, conventional wisdom tells us this is all nonsense. Depression and anxiety are excuses, not sicknesses - people just need to get over themselves. In fact, working through anxiety and depression makes you strong on the other end.

When it comes to micromanagement, hey; tough managers are effective managers; employees need close oversight, otherwise they'd be the bosses, wouldn't they? Life is tough; if you can't rise to the challenge then you're probably in the wrong environment. I went through the same thing - look how I turned out. Some folk are just in denial about the way the world really is.

And yet, study after study indicates we are facing a rise in work-related mental health concerns. You might even call it a global business mental health crisis. Countries around the world are trying to find reactive ways to tackle this elephant in the room; Canada has just released its own plan for a national mental healthcare strategy. The question largely being ignored is this: is there a correlation between the rise in workplace mental illness and the increasing amounts of cognitive work people are being given to do under increasingly tight deadlines? Is it the people or the nature of work that's the problem?

I once had a baffling conversation with a woman in business whose partner, also in business, was facing heart health concerns. This fellow works ridiculous hours and has an enormous set of expectations put on him. He also tends to drink excessively, but that's as much part of his work culture as anything else. This woman, wickedly smart, incredibly strategic and prone to making informed decisions, told me she'd done her homework and found there was no significant correlation between stress and heart health.  She was worried about her partner's well-being, yet determined the stress of his (and her) lifestyle was not a factor.  I had looked at the exact same data she had and come to a completely different conclusion; she was going to take her counsel over mine, though, because she had a track record of success to fall back on.  When you're used to being right, you're not likely to consider the advice of someone with less related experience - if anything, you'll seek the advice of consultants (but might not necessarily follow it). 
 
 
Since then, I have done even more research (largely listed in the E-Library column to the left of this blog), stretching beyond mental illness and including cognitive processes, how and why the body reacts to physical stimuli and even mental health in other animals.  I wonder what my friend would have said if I had told her that zebras don't get ulcers?  It's kind of a human thing.


This is the piece that gets left out of the conversation - life, in general, is meant to be stressful, but only in short bursts.  Predators spend most of their time resting, conserving energy for the physical toll hunting takes on them.  The same holds true with prey; stress is a short-term burst of hormones like cortisol and adrenaline that allows them to survive an encounter or not, after which they either go back to relaxing or are dead.

The human world of work is increasingly moving away from that model.  In the Industrial Age, workers would clock in, do their shifts, take their breaks and then go home.  There was a clear divide between work and home life; work itself was more about one-off activities rather than complex, long-term commitments.  Writing a note or building a widget is one thing; seeking out new business opportunities, creating long-term strategies and maintaining networks is another beast completely.  Add to this gridlock, picking up the kids, making it to the grocery store, etc, etc; we have developed a work/life system of non-stop stress.  It's clearly not a sustainable model.

So, why are some people - say, 1% of the population - so willing to cling to it?  To me, it seems to go back to biology.  In a survival-of-the-fittest model, we assume the environment isn't the problem; if you're tough, you'll manage and if not, you don't.  Until a tipping point gets reached where a sufficient proportion of the population falls below the cut, we can still keep telling ourselves that it's a success thing, not a culture thing.  Those at the top are comfortable in the model; like a lion resting after a full meal, they see no need to expend the necessary energy to change a system that is working for them just fine.  Sadly, it takes until the social unrest at the bottom is so severe that the people at the top face enough disruption (cortisol) to look at alternatives, while still trying to preserve their relative positions.  Don Tapscott calls this the burning platform - personally, I like the metaphor of the boiling frog.

 
The fact is, the platform is burning and unrest is bubbling to the surface.  Worse, if you're in business, your bottom line is being impacted.  There's only one way to get ahead of this problem: by rethinking work, revising the way we understand mental health and beginning to consciously consider the collective impact of our actions.

If you don't like it, too bad - you're just gonna have to get used to it.