Throughout most of the history of human civilization, humanity has understood that we have only limited insight into cause and effect in the world. People have mostly seen this in a religious context, where God or gods of various dispositions controlled aspects of the world, leaving mankind either to deal with minor matters or to accept that we have only limited control. With the rise of rationalism and the successes of science, mankind gradually came to believe that everything worth knowing could be known. Furthermore, the belief took hold that the world is ordered and can be controlled: goals can be set, plans made and, with proper execution, brought to fruition.
We see this in public life, where there is often a general expectation that the government can resolve deep issues simply by deciding, planning and allocating funds. In management and organization we see this belief in the pervasive fascination with neo-Taylorism, as described elsewhere.
However, the cracks in this world view are now becoming evident; in fact, they have been for a long time. After two world wars in the 20th century and the uncertainties revealed by modern science, it has dawned on people that perhaps our knowledge is not as conclusive as we thought, and that just because we make and proclaim a plan, it does not necessarily follow that things will go that way. Add to this the disturbing fact that our cognitive capabilities are unreliable. Human decision making is often flawed because of cognitive biases; Wikipedia lists 183 biases that we often suffer from.
“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance”
Daniel Kahneman in “Thinking, Fast and Slow”
Mark Twain put it in blunter, plainer words some 150 years ago:
It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.
Elsewhere, we have written about the rapid changes, uncertainties and challenges that face organizations today and how the frequency of such events has increased significantly; they are now the norm for many organizations.
All in all, we have to face complexity in decision making again.
One person who has contributed more to an understanding of complexity than anyone else I know is Dave Snowden. Snowden is a Welsh scholar (never refer to him as English) with one foot in theoretical physics and the other in philosophy. Having graduated and realized that nobody employs philosophers (it hasn’t happened since the time of the ancient Greeks), he ended up working for IBM, where he started developing Cynefin: a decision-making framework and a model for understanding context and choosing appropriate actions. Since then he has founded the company Cognitive Edge and has continued the development of Cynefin. A lot of material on Cynefin exists.
In 2007 an article was published in Harvard Business Review (HBR). It was a landmark in the application of Cynefin, as it gave a lot of practical advice on how leaders should behave in the different contexts, which Cynefin defines as domains. Snowden has since been a frequent speaker at Agile and Lean conferences and other events. In a way he has given us comprehensive, if not exhaustive, knowledge about how to make sense of things and why we need to work the Agile Lean way when we have a large dose of complexity in our portfolio of work. In the HBR article he says:
The Cynefin framework helps leaders determine the prevailing operative context so that they can make appropriate choices. Each domain requires different actions.
In a blog post from 2014-12-18, he elaborates:
The whole point of sense-making is to make sense of the world so we can act in it. It’s not about restricting everything to engineering diagrams. Coherence and consistency are key, but conformity? Consign that to the flames.
Cynefin basics
Cynefin is a multi-faceted concept which can be used to make sense of many different things, including politics and other life scenarios. For the current purpose we will focus on trying to understand how to work with items on a backlog: things we would like to have and values or goals we would like to achieve. These items need solutions; we need to try things out to see what works and what doesn’t. We also need to prioritize: what is the best possible way of spending the next periods of time? Cynefin can help clear our thinking here. Items can require very different ways of finding solutions.
Cynefin operates with five domains. These domains are not fixed; they emerge as an understanding within the group or team concerned. Cynefin is therefore highly contextual and, as a result, oversimplifying it will kill it. A different group or team would have a different set of knowledge constraints and consequently different domains.
The typical drawing of Cynefin is shown below with a couple of extra details. To the right of the drawing are the ordered domains, where we can see a cause-and-effect relationship. To the left are the unordered domains, where there is only fragmented knowledge, or close to no knowledge, about cause and effect. In the middle is the dreaded black hole of Disorder. Let us discuss the domains first:
The Domains
- The Obvious domain (previously known as Simple) is an ordered domain, where cause and effect exist and any reasonable person can see them and see what needs to be done. There are rigid constraints, rules and regulations. “Best Practice” can exist here and is legitimate. New items are attacked through “Sense, Categorize and Respond”: we see what is coming, decide which of the known categories it belongs to and act accordingly, because we already know what to do.
- The Complicated domain is also an ordered domain, but it is difficult to see what needs to be done. Analytical methods have to be applied, or experts found who have built up special knowledge (like doctors being able to interpret x-ray images). As a result, a solution can be found. There is a manageable set of “Governing constraints” that the right expertise can discern. In this domain the appropriate term is “Good practice”, depending on the expertise at hand. The line of attack is “Sense, Analyze and Respond”: we see what is coming, then analyze or find expertise before we act.
- The Complex domain is something quite different. There are typically many actors and systems that interact and change each other; only fragmented knowledge seems to be available and there are multiple “Enabling Constraints”. Responses are deeply non-linear; even small changes can have large impacts. Sometimes, having come through a situation, a pattern of cause and effect can be seen, but up front a clear picture is not available; knowledge has to be gathered en route. In the Complex domain there is emerging practice: a combination of something known before and something brand new. The way to find a solution is “Probe, Sense and Respond”. The right way is to think in terms of multiple, parallel experiments that will reveal which options work best. We perform “safe-to-fail” experiments; we do not produce fail-safe designs, which belong to the Complicated domain.
- The Chaos domain is a place where cause and effect cannot be seen. Nothing seems to relate to or indicate a solution; there is a total absence of relevant constraints. It is necessary to move very quickly to see if a bold act can establish a constraint that stabilizes matters and transforms the situation into one of the other domains, typically the Complex domain, so that sensible work can start. It has to be done very quickly because Chaos is a transient domain; if action is not taken to stabilize the chaos, the world will stabilize it in a way that will probably not be advantageous. “Act, Sense and Respond” is the modus operandi. Anything discovered will be completely novel.
- Disorder is the spot in the middle where the appropriate domain is not known. This domain consists of two sub-domains: the dark area in the center and the outer, lighter area. The dark area is a sort of organizational black hole that sucks everything into it and radiates randomly. Snowden calls this “Inauthentic Disorder”. It covers the situation where people think they know where they are, but they really don’t. Solutions are therefore attempted the wrong way, and people typically fall back on what they are accustomed to, such as the modus operandi of the domain they were in before. The lighter area is a place that can be entered legitimately in order to try to shift to another domain; more on this later. Inauthentic Disorder is a dangerous place to be, as people are lost and clueless but don’t know it. A quote similar to the one from Mark Twain above is appropriate here, albeit a bit more academic:
“The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge.”
Daniel J. Boorstin
The zone of complacency
There is another aspect of Cynefin which needs to be mentioned. It is not so much about the sense-making part of Cynefin, but more a part of what is typically called the Dynamics of Cynefin: the way to operate and move between the different domains. “The zone of complacency” is shown at the bottom of the drawing as a fold. It is the border between the Obvious and Chaos domains. Conceptually, it is like a cliff over which one can fall into Chaos.
The idea is that if an organization over-constrains situations by putting rules and regulations in place where they really are not legitimate, people start to believe that the rules and regulations are what matter most. It can also manifest itself when individuals or an entire organization become so convinced of a theory that nobody questions its validity any longer; it has become dogma.
People then stop listening for the weak signals of change and sometimes start thinking that past successes have made them immune to future failure. The organization is then said to be in “The zone of complacency”, where everybody is smug or even arrogant and typically inward-focused. Customers and suppliers are often the first to notice this condition.
In a crisis, the organization can then completely misread the signs, fail catastrophically and end up in Chaos. As Snowden puts it: “Recovery is very, very expensive”. Very rigid constraints are brittle and can suddenly break. Spectacular examples of catastrophic failures are available in abundance:
- Completely misreading the message sent by the introduction of the iPhone, Nokia was convinced it had “the” smartphone, with the consequence that it has now, for all practical purposes, disappeared.
- Kodak dominated the analog film market, invented the digital camera, sat on a mountain of patents and yet did not make the transition to the age of digital photography. Kodak went the way of the Dodo and only emerged from bankruptcy after divesting all of its intellectual property.
- The “unsinkable” Titanic probably doesn’t need more explanation.
- Chairman Mao’s “Great Leap Forward” in China from 1958 to 1962 took an estimated toll of 30-45 million lives and devastated the Chinese economy. It was a tragic and absurd application of central planning without understanding of the consequences.
- Perhaps less well known but also tragic was Premier Khrushchev’s campaign to farm the “Virgin Lands” of Kazakhstan, starting in 1953. Initially a success, it failed after 10 years, mostly because the monoculture of grain depleted the soil of necessary nutrients. The whole story is a vivid example of the impossibility of planning complex initiatives in Moscow without understanding the conditions in Kazakhstan or the basics of modern agriculture.
It will by now be obvious that the Zone of Complacency is to be avoided at all costs, but how can this be done? A couple of hints can be given here, and more will be added throughout this volume.
- If organizations foster a proper learning culture where people have psychological safety, there is a greater likelihood that dangers will be discovered and brought to attention.
- Retain some cynics in the organization. It is all too easy to fall into the trap of only having people who agree with the leadership, but cynics are very precious; they don’t trust anything a priori and will require a proper explanation and evidence. They also have something of a disregard for consequences and will therefore speak up even when it would be more comfortable not to. They are real nuisances, but necessary. In an Agile Lean organization there has to be a constant challenge to the status quo. There is, of course, a fine line between being a constructive cynic and just a grumpy, difficult curmudgeon.
Dynamics in Cynefin
The Dynamics in Cynefin are typically about what to do in different identifiable situations. Those really interested should read more about this from Snowden’s own pen in his blog posts and articles here… Interesting new concepts covering the borders between domains have been added lately. We will just cover the basic dynamics here:
- Oscillation between Complex and Complicated. This is the most important and most frequent movement when working with a high content of complexity. When there is a new challenge, parallel safe-to-fail experiments are carried out to determine what works. Knowledge is built up, and movement is made to the Complicated domain, where solutions can be delivered with the newly acquired expertise. Quite often another challenge appears, and movement back to the Complex domain is made. In the Complex domain there is “exploration” (where knowledge is built up through experiments); in the Complicated domain there is “exploitation” (where the knowledge that has been built up is utilized).
- Moving to Obvious. If we are convinced that something is really never going to change, we can move it down to Obvious, describe best practice, make checklists and so on. But we have to be careful; it is a standard cognitive bias to believe that we are more in charge and know more than we really do. Something should only be moved to Obvious when we can honestly say: “If this is ever going to be different, we will start from scratch; we are not going to change this anymore.”
- Managed innovation. Sometimes we may want to trigger innovation in an organization. This can be accomplished by relaxing constraints, taking what Snowden calls a “shallow dive into Chaos”. Effectively, we let people loose in some areas, nudging them outside the comfort zone whilst maintaining a certain “sense of urgency” and perhaps starving them of resources as well as time. It is shown in the drawing above as a transition through the outskirts of Disorder into Chaos, the purpose being to discover completely novel practices or solutions that lead back to the Complex domain in a managed way instead of getting lost in Chaos.
These basic ways of thinking and acting are relevant to most organizations.
Other takeaways from Cynefin
Cognitive insights
In the cognitive melting pot called Cynefin, some other insights worth remembering have emerged. Here are some of them, quoted from Snowden with some editorial liberties:
- Knowledge can only be volunteered; it cannot be conscripted. Nobody can make someone share the knowledge they have. It is impossible to look behind the curtain of the human mind and measure whether an individual has indeed shared their knowledge; not even information given under torture can be verified to be exhaustive. It follows that when dealing with knowledge work and innovation, which is the nature of the Complex domain, it does not work to use force, fear and intimidation; people have to be treated more or less as – yes – volunteers.
- We only know what we know when we need to know it. Human knowledge is contextual, sometimes in the extreme. External triggers can sometimes open a floodgate of memories and knowledge. Sometimes people need to let challenges simmer for a while until new patterns emerge: a combination of something we apparently knew all along and something we just discovered. It follows that it is often necessary to create situations where people’s knowledge is required, and then suddenly it is there. One such example is the use of Planning Poker for estimation (see the sketch after this list). Of course everyone is interested in the number, but the knowledge exchange that is a by-product of the process is even more important.
- In the context of real need, few people will withhold their knowledge. Generally people are willing to share, unless they have been abused or taken for a ride before. In many traditional organizations that is sadly often the case, and it therefore takes quite an effort to rebuild trust before real knowledge sharing will occur. This also means that it is no good just telling people to organize everything they know about a subject; there has to be a contextual need. This has implications for writing specifications for features and products: it cannot be done up front, only in iterative dialogue and context.
- Everything is fragmented. The human brain stores an astronomical number of pieces of information with fine granularity and retrieves them through a sort of pattern match, a fuzzy query. The brain gives an answer as soon as there is a plausible and coherent one based on snippets of previously acquired information; interestingly, the more recent snippets are apparently weighted higher. It follows that in many cases it is wasted effort to spend a lot of resources on producing highly structured documents presenting a situation (specifications, for example), because people rely and act on understanding through fragments anyway.
- Tolerated failure imprints learning better than success. People learn more from things that did not go the way they planned than from observing plans succeed. In a way this is an extension of the scientific method: when looking to substantiate a hypothesis, the scientist tries to come up with experiments that could disprove it. Success is fine, but a failure typically adds much more specific knowledge. It is therefore important to develop a culture in the organization that allows experimentation and, as a result, sometimes failure and then rapid learning.
- The way we know things is not the way we report we know things. There is solid evidence that people make decisions in an almost unknowable way: a fuzzy blend of heuristics, pattern matching and past experiences, all happening in a split second. However, we tend to present our decision making in a much more rational and structured way. We apparently like to think of ourselves as more organized than reality warrants. This is also a standard cognitive bias, and we often have to use very diverse methods to extract a picture of how people really arrived at a certain decision or point of view.
- We always know more than we can say, and we will always say more than we can write down. It is a disturbing fact that the bandwidth gets narrower as we move from what we know inside, to what we can talk about, to what we can write down. It follows that in dealing with the understanding of challenges and solutions, we have to hear many different people’s perspectives in verbal communication, not just in a written statement, to make sense of it all.
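As a side note on the Planning Poker example above, here is a minimal sketch of one such round. The deck values, the spread threshold and the names are illustrative assumptions of mine, not part of any official method; the point it demonstrates is that a wide spread of privately chosen estimates is exactly what triggers the conversation in which the hidden knowledge surfaces.

```python
# Sketch of one Planning Poker round (illustrative deck and threshold).
# Estimates are chosen privately and revealed at once; a wide spread flags
# the item for discussion -- the knowledge exchange is the real product.

FIBONACCI_DECK = [1, 2, 3, 5, 8, 13, 20, 40]

def poker_round(estimates: dict[str, int], max_spread: int = 2) -> str:
    """estimates maps a team member's name to the card they chose."""
    cards = sorted(estimates.values())
    low, high = cards[0], cards[-1]
    # Spread measured in deck positions between the lowest and highest card.
    spread = FIBONACCI_DECK.index(high) - FIBONACCI_DECK.index(low)
    if spread > max_spread:
        return f"Wide spread ({low} vs {high}): let the outliers explain, then re-estimate"
    return f"Close enough: agree on {high} and move on"

print(poker_round({"Ann": 3, "Ben": 5, "Cia": 20}))  # -> discussion needed
print(poker_round({"Ann": 5, "Ben": 5, "Cia": 8}))   # -> consensus
```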
These statements, along with a host of other observations about our cognitive capabilities, set the boundaries for how we choose to create organizations in the Complex domain. Among these boundaries are all the cognitive biases that we struggle with on a daily basis (see a full list here). Working in the Complex domain is a perpetual cognitive process of trying to make sense of things.
The use of narratives
When it comes to the practical application of Cynefin to actually make sense of things, it is usually done through narratives or storytelling. We let people tell stories and self-assess what those stories mean. Specifically, we do not let an expert evaluate them, as experts typically fail in the Complex domain.
There is a specific workshop method for nudging a group or team to its own understanding of Cynefin, called “The four points method”; we will talk about that in a later chapter. It uses narratives or stories from the participants’ everyday lives and gradually adjusts the understanding until there are sufficiently clear lines between the domains.
Any civilization has stories that convey and transmit knowledge within the group or from group to group; in many societies this specifically includes transmitting knowledge from one generation to another. Monkeys groom each other to connect socially while human beings tell stories.
In Agile and Lean we use narratives all over the place. We build our vision as a narrative, write specifications as narratives (often called user stories) and use stories when trying to learn from past experience through retrospectives.
Information transmitted as narratives has a much higher recall factor. I was told that if you tell a group a story with a reasonably large amount of information in it, 24 hours later the group retains about 80% to 85% of the information transmitted. If the same information is presented as a structure, like bullet points in multiple levels, the number is only 40% to 45%. The reference unfortunately escapes me at the moment, but the difference is significant.
Application of Cynefin in ALL
The understanding of complexity is foundational to everything in Agile Lean Leadership. In fact, we believe it is the most important trigger in being able to scale concepts like Scrum out into the whole organization. The fathers of Scrum (Jeff Sutherland and Ken Schwaber) did not specifically talk in terms of complexity, but there were hints in their choice of words. Basically they are ex-military people who said: “Scrum works, it is good for you! Do it!”. That is a good point, but later Dave Snowden came along and said: “Now let me tell you why it works, and why it is necessary”.
Here are a couple of important benefits of understanding complexity in Cynefin terms and applying its dynamics in an Agile Lean Organization. Throughout this volume we will keep referring back to the foundational material in Cynefin.
- The oscillation between Complex and Complicated is where Agile and Scrum live; it is the border country where small iterations (Sprints) make sense. Iteration after iteration, the whole Team makes an effort to refine the specifications and solutions of the work on the Backlog so that we can set a goal for the iteration with a reasonable probability of meeting it. Effectively we move a small portion into the Complicated domain, and in the next iteration we are back in Complex, picking up some more. Not all uncertainty is gone in an iteration, but we have predictability within a probabilistic range.
- There is nothing wrong with traditional management of projects and initiatives per se; it just has to match the reality in which we operate. Therefore, if most of what we have to do lies in the area between Complicated and Obvious, it is legitimate to have an upfront plan with everything broken nicely into tasks and allocated to people with the right skills. All we are saying is: use the right method for the right situation, and recognize that many initiatives in modern organizations have a high content of complexity.
- A typical situation to watch out for is someone from one domain being put in charge of an initiative that mostly lives in another; this person will typically apply the methodology from his original domain, which is a recipe for failure. The typical example is to take a person who has worked successfully for a number of years in finance or accounting and make him lead the development of a new product. The first thing he will do when confronted with the messiness of a development project is to demand more structure, more reporting, more checklists and more budget control. That kills innovation and slows everything down like molasses in January. Effectively, he is in Inauthentic Disorder.
- As mentioned above, the Zone of Complacency is to be avoided at all costs. Another of the cognitive biases is that most people prefer the status quo and clear rules; it is much easier than having to think carefully about things. Our brain is also apparently lazy; it wants to fall back on following rules and regulations because that requires less energy. If given the opportunity, the brain dozes off and we go on autopilot, following rules and regulations. Cynefin helps us identify early symptoms of this so we can intervene and avoid falling off the cliff.
- Finally, Cynefin can help in understanding each of the items on the Backlog. The team is asked to assess which of the Cynefin domains a certain item is in. That gives clues on how to choose an appropriate solution or line of attack, as sketched below. If, for example, the Team declares a certain item Complex, we know that we have to expect experimentation.
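To make the last point concrete, here is a minimal sketch of what such tagging could look like in practice. The item names and the domain-to-approach mapping are illustrative assumptions of mine; Cynefin prescribes the response patterns described earlier, not any particular data structure or tool.

```python
# Sketch: Backlog items tagged with the team's Cynefin assessment, used to
# suggest a line of attack per item (the mapping follows the domain
# descriptions earlier in this chapter; the items themselves are made up).

from dataclasses import dataclass

LINE_OF_ATTACK = {
    "Obvious":     "Sense, Categorize, Respond: apply best practice and checklists",
    "Complicated": "Sense, Analyze, Respond: involve experts or analysis first",
    "Complex":     "Probe, Sense, Respond: run parallel safe-to-fail experiments",
    "Chaos":       "Act, Sense, Respond: stabilize quickly, then move on to Complex",
}

@dataclass
class BacklogItem:
    title: str
    domain: str  # the team's own assessment, e.g. from a four points workshop

backlog = [
    BacklogItem("Renew the TLS certificates", "Obvious"),
    BacklogItem("Speed up the monthly reporting job", "Complicated"),
    BacklogItem("Find a pricing model customers accept", "Complex"),
]

for item in backlog:
    print(f"{item.title} -> {LINE_OF_ATTACK[item.domain]}")
    if item.domain == "Complex":
        print("   expect experimentation: several probes, not one fail-safe design")
```

If the Team’s assessment of an item changes over time, as it often does, the tag simply changes and with it the suggested line of attack; nothing about the item itself has to change.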
There are other coherent treatments of complexity besides Cynefin, but this one has practical applicability when developing organizations. If properly understood, it induces a healthy dose of humility into the leadership, resulting in the realization that they, too, have only fragmented knowledge and need the rest of the people in the organization to make proper sense of things.
Read also the follow-up article about the new development of “Liminal Cynefin” here…
Original illustration at top courtesy of the Atlas of Economic Complexity