INTRODUCTION
Analysis is a craft that management cannot easily do without. If plain, raw data collection – which in the quote below is summarized in the word intelligence – sufficed, why would we need analysis (Davis, 2008)?
“When the facts speak for themselves, intelligence has done its job and there is no need for analysis.”
Usually, however, it is not that simple. Data have a nasty habit of not speaking for themselves. Data can often be ambiguous. Different data may be mutually contradictory. Data may be missing. Most of all, data in many instances may not offer the answer that the strategic analysis requires (Johnston, 2005a):
“Since the facts do not speak for themselves but need to be interpreted, it is inevitable that the individual propensities of an intelligence officer will enter into the process of evaluation.”
The conclusion seems obvious. Data generally need interpretation. By implication, when management needs answers to strategic or tactical questions regarding the business environment, they will have to build or buy an analytical capability. Even when the choice is to outsource analysis, at least one individual in the in-house organization will need to know how to brief a third party. Too often, the best procurement officer in a particular line of business is a former salesperson from that very same line of business. That rule of thumb applies equally to analysis. Management thus has essentially two options when organizing business environment analysis: hire a strategic analyst and let them do the work, or hire at least one (former) analyst and let them handle the procurement of outsourced analysis. In this chapter, I will briefly touch on the discipline of strategic analysis itself.
DEFINITIONS OF ANALYSIS
The first question is: “What is analysis?” Definitions of the noun ‘analysis’ vary. The Free Dictionary provides this as one of its definitions (TheFreeDictionary, 2013):
“The study of such constituent parts and their interrelationships in making up a whole.”
Merriam-Webster (Merriam-Webster, 2013) provides two definitions. The meaning in their second definition comes closer to what I believe to be strategic analysis:
• A careful study of something to learn about its parts, what they do, and how they are related to each other.
• An explanation of the nature and meaning of something.
In sources on (military) intelligence the two definitions below were found (Bruce, 2008a), (Jones, 2007):
“Synthesized raw information collected from multiple sources, interpreting the meaning of such info in the context of the policymakers’ needs.”
“(analysis is:) a matter of somehow keeping one’s head above water in a tidal wave of documents, whose factual content must be ‘processed.’”
The second Merriam-Webster definition, as well as Bruce’s, probably best covers what analysis actually is in the context of strategic analysis of a company’s business environment. The latter definition, by Jones, will be recognized by many mature analysts. Making sense of data overload is definitely a part of analysis. This book aims to assist analysts in doing just that: making sense of data, even when the data keep coming. The following metaphor may help to elucidate the essence of analysis in synthesizing a strategic analysis deliverable.
GLASS PRODUCTION AS A METAPHOR FOR TURNING DATA INTO ACTIONABLE INTELLIGENCE
Common glass is produced by mixing sand (consisting mainly of silicon dioxide), soda ash and lime. Consider the raw materials required to produce glass as data. The raw materials may need to be purified prior to glass production: in strategic analysis, this means the data quality (i.e., reliability) has to be verified and thus assured. Like raw materials in glass production, data that do not meet the predefined minimum quality standards have to be rejected. The different data then need to be weighed in terms of relative importance; that is tantamount to the mixing step in glass production. To make glass, one has to follow the recipe: the right amounts of the right quality raw materials have to be mixed to get the right high-quality finished product. Yet mixing sand, lime and soda ash as powders in the right ratio at ambient temperature does not produce glass, no matter how intensively the mixture is stirred. Unless the other critical process factors are taken into account, even a mixture in the right ratio remains only a mix of powders.
The next step is therefore processing the data. In glass production, this happens in an oven. Depending on the required glass quality, process parameters need to be chosen. This has implications for the choice of oven temperature, the type of oven to be used and the length of time the mix of raw materials is left in the oven. In strategic analysis work, the processing happens in the mind of the analyst. Like ovens, some analysts are more suited than others to particular processes. The analyst may choose to be supported by computers and other tools, but the mind matters most. If everything is processed correctly, the output that leaves the oven is glass… or in strategic analysis work, a well-prepared analysis deliverable.
The analysis deliverable needs to be shaped to match the specific decision-maker’s need. Management will not appreciate getting a deliverable that resembles a malformed lump of glass. The deliverable may meet all the quality specifications, but it may not yet be useful. Like the glass, it will need to be cut to size, shaped, polished and faceted. Finally, the deliverable – like the glass vase that should be ready for Mother’s Day, not three days late – will need to reach management at the right time to ensure it is useful for decision-making support.
The analysis step in the analysis cycle is thus meant to process input data, turning it into meaningful output deliverables. The inputs of the process are referred to as data. The process itself is called analysis. The output is generally called intelligence.
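To make the metaphor concrete, below is a minimal sketch in Python of the data-to-intelligence pipeline just described: reject inputs below a quality threshold, weigh the remainder and synthesize an output. All names, fields and thresholds are illustrative assumptions, not a prescribed implementation – the real “oven” is the analyst’s mind.

    # Minimal, illustrative sketch of the data -> analysis -> intelligence
    # pipeline described above. All names and thresholds are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class DataPoint:
        content: str
        reliability: float  # 0.0 (unverified) to 1.0 (fully verified)
        relevance: float    # analyst-assigned weight for the question at hand

    MIN_RELIABILITY = 0.6   # predefined minimum quality standard ("purification")

    def analyse(raw_data):
        # Quality assurance: reject raw material that fails the specification.
        accepted = [d for d in raw_data if d.reliability >= MIN_RELIABILITY]
        # Weighing: order inputs by relative importance (the "mixing" step).
        accepted.sort(key=lambda d: d.relevance, reverse=True)
        # Processing (the "oven"): trivially reduced here to a ranked summary;
        # in reality this step happens in the analyst's mind.
        return [f"{d.content} (weight {d.relevance:.1f})" for d in accepted]

    data = [
        DataPoint("Competitor X doubles capacity", reliability=0.9, relevance=0.8),
        DataPoint("Rumour of plant closure", reliability=0.3, relevance=0.9),
    ]
    for line in analyse(data):
        print(line)

The low-reliability rumour is rejected at the quality gate, just as an impure shipment of sand would be rejected at the glass plant.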
ANATOMY OF ANALYSIS: THE WHERE, WHO, WHEN, WHY AND HOW
Five of Kipling’s six honest serving-men are again employed in this chapter to further flesh out and explain the concept of analysis. The serving-men in this chapter are where, who, when, why and how:
• Where?
Analysis is typically a desk research activity. In today’s connected world it doesn’t really matter where the desk is, as long as it is in a brightly lit and quiet room. The room should be equipped with the right set of tools, including (online) newspaper and other database subscriptions and high-speed internet access. The economic laws of increasing and diminishing returns apply to strategic analysis: an analyst who operates in an intellectual vacuum will generally be less effective than an analyst who operates in an organizational environment with peers of at least the same intellectual level. These peers will almost automatically act as a sounding board for the analyst, allowing the strategic analysis department to produce better quality output.
• Who?
An analyst usually relies on the human brain rather than on artificial intelligence, although the latter may serve as a helpful tool for the analyst.
• When?
The majority of the collection work in a strategic analysis project needs to be finished before meaningful analysis can start. One common analytical bias is premature closure: an analyst or a decision-maker (or both) may prematurely close their mind to new data. Once a person has seen a meaningful percentage of the facts and, based on those facts, has made up their mind, they may not like to see more facts. Needless to say, this phenomenon poses severe risks to the quality of the policy or strategy that, based on the incomplete data set, is subsequently adopted and pursued. In some projects, iterative loops will occur: some analysis outcomes will generate new leads for collection, leading to new analytic perspectives, and so on. Acknowledging this reality, analysts are still advised to collect as many of the facts as possible prior to concluding what it all means.
• Why?
Analysis is the critical step in processing data, information and possibly knowledge into intelligence. Similarly, glass is only formed by mixing the right raw materials in the right quantities at the right temperature in the right oven. Without the oven and sufficient temperature, no glass will be formed.
• How?
The how question of analysis relates to (see the sketch after this list):
• The input data and the related individual input data quality assurance – simply put: getting the right individual data inputs into the analysis.
• The processing of data and information that have individually passed the data quality check.
• The formulating of multiple hypotheses, the most plausible of which is chosen for the analysis deliverable.
• The potential use of tools to turn quality-controlled data sets into intelligence.
• The quality control check of the ultimate analysis deliverable – the objective being, of course, to get the output as close to perfect as possible.
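The hypothesis bullet above can be made concrete with a small sketch. The hypotheses, evidence items and the +1/0/−1 scoring scheme below are all assumptions, loosely inspired by analysis-of-competing-hypotheses practice; the point is only that competing hypotheses are scored against quality-controlled evidence and the most plausible one is selected for the deliverable.

    # Illustrative sketch: choose the most plausible of several hypotheses
    # by scoring each against quality-controlled evidence. The scoring
    # scheme (+1 consistent, -1 inconsistent, 0 neutral) is an assumption.
    evidence = ["new plant permit filed", "hiring freeze announced",
                "CEO hints at growth"]

    # Consistency of each hypothesis with each evidence item.
    consistency = {
        "Competitor expands capacity": {"new plant permit filed": +1,
                                        "hiring freeze announced": -1,
                                        "CEO hints at growth": +1},
        "Competitor exits the market": {"new plant permit filed": -1,
                                        "hiring freeze announced": +1,
                                        "CEO hints at growth": -1},
    }

    def score(hypothesis):
        return sum(consistency[hypothesis].get(e, 0) for e in evidence)

    best = max(consistency, key=score)
    print(f"Most plausible hypothesis: {best} (score {score(best)})")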
Diagram 9.1 schematically presents the steps of a common strategic analysis process.
In this chapter, section 9.3 will describe data and their corresponding metadata. In section 9.4, generic attributes of analysis will be briefly discussed.
DATA AND METADATA
Individual data points come in two dimensions: the actual measurement or data point, and the contextual information accompanying the data. The latter is usually referred to as ‘metadata’. In the glass production metaphor, data are equivalent to the quantity of sand or soda ash or lime. Metadata correspond to the date of the shipment, the truck in which the raw material was delivered, the supplier of the raw material and the silo in which the raw material was stored prior to its use. All metadata taken together make the shipment of the individual raw materials uniquely traceable.
For strategic analysis work, the following wartime government intelligence example will help illustrate the importance of connecting data and metadata. In spite of its somewhat unsavory nature, the example is used because it so nicely illustrates the underlying topic.
During the Second World War, human intelligence source Aiken Sneath informed the British domestic security agency MI5 of a gentleman called Ivor Montagu, a film maker. He stated, without further evidence, that Montagu was “an active fifth columnist” – a communist agent (Macintyre, 2010a).
The measurement or data point in this example is: “Ivor Montagu is an active fifth columnist.” The metadata here include (list not exhaustive; a minimal data-structure sketch follows the list):
• The source (i.e., the informer), of which Macintyre amusingly remarks that his name was so implausible that it had to be real.
• The timing of the source delivering his message.
• The form in which the source delivers his message (telephone call, formal written document, etc.).
• The recipient of the message at MI5.
• The state of mind of the informer at the time of delivering the data point (agitated, cool, upset, etc.).
• The relation of the source to Ivor Montagu.
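As a minimal illustration of keeping a data point and its metadata together, the sketch below records the MI5 example as one traceable unit. The field names and the date are hypothetical, chosen only to mirror the bullet list above.

    # Sketch of recording a data point together with its metadata, using
    # the MI5 example above. Field names and the date are illustrative.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class SourcedDataPoint:
        claim: str                  # the actual measurement or data point
        source: str                 # who delivered it
        received: date              # when it was delivered
        channel: str                # the form in which it was delivered
        recipient: str              # who received it
        source_state: str           # the source's state of mind at delivery
        relation_to_subject: str    # the source's relation to the subject

    report = SourcedDataPoint(
        claim="Ivor Montagu is an active fifth columnist",
        source="Aiken Sneath",
        received=date(1940, 1, 1),  # illustrative placeholder date
        channel="informer's report",
        recipient="MI5 case officer",
        source_state="unknown",
        relation_to_subject="unknown",
    )
    print(report.claim, "--", report.source)

An analyst who later has to assess the correctness of the claim can then always retrieve the context in which it was obtained.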
Another government-related example describes how deceptive data points were not sufficiently disputed because metadata were insufficiently valued. Prior to the invasion of Iraq in 2003, an exiled Iraqi source called ‘Curveball’ indirectly provided the US government with intentionally false – or at least heavily distorted – intelligence on Iraq’s program to develop and ultimately deploy biological weapons of mass destruction. The objective of this source was, through these false reports, to persuade the US government to take action against Saddam Hussein. The data points provided by this source were not corroborated by the UN weapons inspection teams’ findings. The US government, however, believing the source to be reliable, decided to act upon the deception anyway. And, of course, we all know the consequences.
The above is adequately summarized in a statement that strategic analysts may read as a warning regarding data correctness being influenced by metadata (Johnson, 2009a):
“Who said it is often more important than what was being said.”
This holds true for open source data and especially for human source data. The interim conclusion on data quality assurance is that data points and their metadata should be validated together, as a whole.
A final word for this section: about two decades after the human source informed MI5 about Ivor Montagu, signals intelligence sources confirmed that Montagu had indeed been working for the Soviet military intelligence service GRU (West, 1999). He operated under the code name ‘Nobility.’ The point is that, in military intelligence at least, the absence of evidence – which at the time negatively affected the credibility of Mr Sneath’s data point – is not the same as evidence of absence.
What is the lesson from this section? Most of all that metadata forms an indispensable context for a single data point. The analyst must know and record the metadata to assess the correctness of the actual data point. This is the case in the above examples just as much as in handling any data point that an analyst receives as input for her work.
The implication for organizing strategic analysis projects is obvious: in designing and running any analysis project, enough time should be made available to double-check sources and to corroborate findings. In journalism, one source is as good as no source at all. Strategic analysts, like good journalists, need to organize not only data collection but also data quality validation before formulating conclusions.
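A corroboration gate of the kind just described might, as a hedged sketch, look as follows. The two-source threshold mirrors the journalistic rule quoted above; the claims, source names and threshold are illustrative assumptions.

    # Sketch of a corroboration gate before conclusions are drawn: a claim
    # is only accepted once at least two independent sources support it.
    from collections import defaultdict

    reports = [
        ("plant closure in Q3", "trade journal"),
        ("plant closure in Q3", "supplier interview"),
        ("new CEO incoming", "anonymous forum post"),
    ]

    sources_per_claim = defaultdict(set)
    for claim, source in reports:
        sources_per_claim[claim].add(source)

    for claim, sources in sources_per_claim.items():
        # One source is as good as no source: require corroboration.
        status = "corroborated" if len(sources) >= 2 else "unverified - hold back"
        print(f"{claim}: {status} ({len(sources)} source(s))")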
ATTRIBUTES OF ANALYSIS
Business environment analysis is, and will remain, a craft in which experience more than formal learning is determinative. Strategic analysis by its nature has to cope with fundamentally unpredictable (human) behaviour, which will never be governed by the kind of universally applicable laws found in physics. Effective xenocentric strategic analysis predictions should reduce uncertainty in business decision-making. However, no matter how sophisticated the analysis is, it will never be able to fully remove uncertainty.
Strategic analysis will thus never be considered a science, as it will never meet the fundamental requirement of science: reliably predictable reproducibility. Still, scientific methods may serve strategic analysis well. The use of scientific methods in intelligence analysis has been strongly advocated (Bruce, 2008c). Bruce proposes that intelligence analysis should use:
• Hypotheses
• Objectivity
• Transparency
• Replicability
• Peer review
• Provisional results
In the remainder of this section, I will briefly touch upon each of these attributes. They are all fundamental elements in generally accepted scientific methodology. Apparently, in 2008 they were not yet commonly applied in the US intelligence community; otherwise Bruce’s plea would have been superfluous. As I do not believe that Bruce is the king of stating the obvious, the attributes deserve to be discussed in more detail.
Thus, the question is whether scientific methods, and by implication the attributes of accepted scientific methodology, are also relevant in business strategy consulting and in strategic analysis as a data provider to business strategy design. I believe they are, and in support of that I put forward two observations. The first is rooted in personal experience: in my work, I have benefitted tremendously from the scientific methodology toolkit acquired during my PhD research. The second is that business strategy consulting firms, as well as investment banks and private equity funds, recruit ‘quants’ (those who specialize in the application of mathematical and statistical methods) with a solid scientific background. Their recruits often hold PhDs in physics, mathematics or engineering; to such staff, scientific methodology is fully embedded in their thinking. I concur with strongly advocating the use of scientific methods, even when the ultimate goal of xenocentric strategic analysis is to think ‘like them’ and thus to predict what they are thinking. I therefore believe in organizing a strategic analysis function in a corporate environment in such a way that the scientific methodology attributes discussed below truly matter – for the better.
HYPOTHESES
Hypothesis-based strategic plans are becoming the rule rather than the exception in many firms. A strategic analysis function should thus provide strategy designers and other decision-makers with hypotheses.
OBJECTIVITY
Objective measurement methods are crucial. In physics, every freshman gets to know Schrödinger’s cat. This metaphorical feline is named after a thought experiment proposed by one of the iconic physicists of the first half of the twentieth century, Erwin Schrödinger (1887-1961). Without getting bogged down in the subtleties of quantum mechanics, the point is that a measurement method must not interfere with the object or phenomenon that is measured. In Schrödinger’s experiment, the poor cat is locked up for an hour in a room with a radioactive nucleus that may or may not decay during that hour. Upon decaying, the nucleus triggers the release of a poison gas that kills the cat. As the timing of radioactive decay of a single nucleus is not predictable, as soon as the cat is locked up in the room it is fundamentally impossible – because the cat cannot be physically seen – to objectively ascertain whether the cat is alive or dead. Unless, of course, the door to the room is opened… but that would represent undue interference.
This example may sound like a bizarre experiment of some mad scientist. If that were the case, Schrödinger’s cat wouldn’t feature in this text. Unfortunately, this hypothetical cat is a familiar presence in strategic analysis. Answers to questions or even people’s responses to probing may be highly dependent on the situation in which the answers are collected. The context of the measurement, to use scientific jargon, may in human source-based collection strongly affect the outcome of the experiment. Objectivity is the aim. Maintaining objectivity requires the collector or analyst to spot Schrödinger’s cat in the methodology and scare it away before flawed methodology affects the measurement’s accuracy.
TRANSPARENCY AND REPRODUCIBILITY OR REPLICABILITY
In science, all methodologies need to be documented in detail to allow for reproducibility. In strategic analysis, the protection of human sources may hinder this generally preferable approach. Deliverables may, upon request of the decision-makers, have to substantiate all sources behind an analysis, but generally there is no need to do so proactively. No reference to sources is necessary at all, unless the decision-makers need it to understand the background. After all, sources may be too precious to share unless absolutely necessary.
PEER REVIEW
In science, peers are invited by scientific journal boards to review manuscripts that are submitted for publication. In doing so, a methodology check is executed by experts prior to accepting a manuscript for publication. Peers are entitled to send the manuscript back to the author with questions that need to be answered prior to the journal accepting the paper. Peers may also reject the article altogether when they find it to not meet the required minimum scientific standards, and thus to be ‘beyond repair’.
In a strategic analysis function, the process of peer review is a good practice, with neutral yet constructive colleagues operating in peer roles.
PROVISIONAL RESULTS
In science, “if ugly facts challenge beautiful theory, facts win” (Bruce, 2008b). The beauty of science is that new facts, provided that they are measured objectively and correctly, may (and should!) supersede old hypotheses. This would be commendable in strategic analysis as well but is not so easily implemented. Once the strategic analysis function has developed an output, decisions are based upon the output. If later analysis shows the outputs to have been wrong, then the decisions made based on the deliverables – think large-scale market investments – may not be reversible. So, in strategic analysis the science metaphor is great, but it is not always applicable.
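One disciplined way to keep results provisional – offered here as a sketch, not as a method the cited authors prescribe – is Bayesian updating: each new, objectively measured fact revises the probability assigned to a hypothesis instead of being waved away. All probabilities below are invented for illustration.

    # Illustrative Bayesian update: a new fact revises, rather than merely
    # confronts, an existing hypothesis. All probabilities are assumptions.
    def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
        # Posterior P(H | E) via Bayes' rule.
        numerator = p_evidence_if_true * prior
        evidence = numerator + p_evidence_if_false * (1.0 - prior)
        return numerator / evidence

    # Hypothesis: "competitor will enter our market"; prior belief 30%.
    belief = 0.30
    # New fact: competitor registers a local subsidiary. Assume such a move
    # is seen in 80% of actual entries but in only 10% of non-entries.
    belief = bayes_update(belief, 0.80, 0.10)
    print(f"Revised belief after the new fact: {belief:.2f}")  # ~0.77

The decision that has meanwhile been taken may be irreversible, but at least the analysis deliverable itself remains honestly provisional.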
INTEGRITY
Complementary to the above thinking, the core attribute of analysis, as I see it is captured well in the following quote (Kerr, 2008a):
“Integrity is the single most important attribute of solid analysis.”
An analyst may lose a lot, but never – if they are really good at what they do – the perception of their integrity with the decision-makers they serve. McLaughlin mentions four additional attributes of (intelligence) analysis that equally apply to strategic analysis in a business context (McLaughlin, 2008a). The first has already been covered above, focusing on measurement methodologies and Schrödinger’s cat. The others speak for themselves:
• Objectivity.
• Civility.
• Balance.
• Thoroughness.
The attributes that management may demand from analysis can perhaps best be summarized as follows (Denrell, 2005):
“No managers should accept a theory about business unless they can be confident that the theory’s advocates are working off an unbiased data set.”
What matters is the relevance of both data correctness and data set completeness.
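Denrell’s point can be made concrete with a toy simulation (all numbers invented for illustration): when failed firms silently drop out of the data set, the surviving sample flatters the strategy under study.

    # Toy illustration of Denrell's warning: conclusions drawn from a data
    # set that silently excludes failures are biased. Numbers are invented.
    import random

    random.seed(7)

    # Simulate 10,000 firms pursuing a risky strategy: 20% survive and
    # gain, 80% fail and lose.
    outcomes = [3.0 if random.random() < 0.2 else -1.0 for _ in range(10_000)]

    survivors = [r for r in outcomes if r > 0]
    full_average = sum(outcomes) / len(outcomes)
    survivor_average = sum(survivors) / len(survivors)

    print(f"Average payoff, full (unbiased) data set: {full_average:+.2f}")
    print(f"Average payoff, survivors only:           {survivor_average:+.2f}")
    # The survivors-only view overstates the strategy's attractiveness.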
I will take this opportunity to repeat the core messages of this section. The key attributes of strategic analysis should be objectivity and integrity. It is critical for a strategic analysis function and its entire staff to be strictly non-partisan in office politics. Any strategic analysis function’s organizational design should facilitate operational objectivity and integrity. I had this pitfall in mind earlier, when I introduced Curveball: a flaw in the objectivity of a source (and perhaps a bit of political manipulation) may lead to highly undesirable outcomes.
COLLECTOR AND ANALYST
The relation between the collector and the analyst, assuming these are different persons, is a topic covered in the literature on military intelligence (Clark, 2007a). One question in this field is whether the collector should be briefed on the context of the collection request provided by the analyst. To avoid collector bias, it has been postulated that collectors should only be briefed on what to collect but not why (Jähne, 2009): the less context they have, the less bias they bring. However, this is not a universal view (Clark, 2007a) (Jones, 2007):
“The key to success with any collection strategy is a close and enduring relationship (of the analyst) with the collectors. Simply writing collection requirements and ‘throwing them over the wall’ doesn’t work. If collectors have access to and understand (…) the problem breakdown, they can respond much more effectively. This usually requires (the analyst) developing and maintaining personal contacts with collectors.”
“Intelligence information is typically very ambiguous, with several plausible interpretations. Understanding the context of information, therefore, is a fundamental tool of analysis. It is one reason for housing both collection and analysis capabilities in the CIA. Intelligence collectors are most familiar with the context of information gathered – its reliability, timeliness, relationship to other information and so on.”
In most strategic analysis departments, collectors and analysts work closely together – if indeed the roles have been separated in the first place. This issue is therefore less relevant in a business setting than in the military context. It does, however, play a role when the collector is an individual in the firm with a lower information security clearance than the analyst.
COLLECTION AND CONFIDENTIALITY
An example of a collection and confidentiality dilemma arises when, in contrast to the analyst, the collector is not supposed to be knowledgeable about senior management plans. This could be the case when an analysis for a planned acquisition is being prepared. In such cases, the size of the team in the know is minimized to avoid leakage, with all the legal and other risks that leaks may entail.
In such a case, the collector must execute his role without knowing the full context of the information requested. Apart from potential suspicions raised in the collector’s mind, this shouldn’t and generally doesn’t generate problems.
Diagram 9.2 gives a schematic, simplified overview along the dimensions of geo-location, timing and personality for the different roles and profiles of collectors and analysts. An ‘off-site geo-location’ is defined as a location that has no link with the location of the activities or presence of the competitor (or the customer or the supplier, etc.): the collector or analyst need not be physically on-site to execute their work. In diagram 9.2, I use the military intelligence abbreviations for different collection methodologies:
OSINT | Open source intelligence.
HUMINT | Human intelligence (from human sources).
IMINT | Imagery intelligence.
MASINT | Measurement and signature intelligence (e.g., reverse-engineering of products).
Acknowledging the risk of over-generalizing: good analysts are different characters from good collectors. This is especially true for collectors of human intelligence on the one hand and quantitative analysts on the other. In fact, this is not a problem at all: when both disciplines respect each other’s capabilities, cooperation can be, and usually is, seamless.