Our information age is also an era of unique risk. Dangerous events – large-scale terrorist attacks, wars, revolutions, financial crises, climate catastrophes – occur with little warning, leaving little time to prepare. Add to that the growing complexity of the business environment, rapid technological advances, and dramatic changes in industries, business models and the workforce, and the result is a veritable maelstrom of uncertainty.
Under these circumstances, the strategic environment is too complex and dynamic for a mere handful of people to conduct informed risk assessment and decision-making – the kind that accounts for a very large number of variables and produces a relevant, effective situation assessment.
A parallel (and related) revolution is taking place in how information is created and knowledge is developed. One could call this the “revolution of the many” – and it is relevant here in two senses.
The first sense refers to harnessing masses of people for analytic purposes – i.e., crowdsourcing. This represents the analytical utility inherent in a large group of people, informed by the assumption that the wisdom of the crowd is greater than the wisdom of any one individual.
The second sense refers to many pieces of information and many insights. Big data – the use of vast amounts of information from a range of sources, in different formats and of varying quality, to generate insights – is already common practice. However, the next big thing after the mastery of big data is big knowledge: the aggregation of insights generated by a constellation of analysts.
In recent years, crowdsourcing has become enormously popular. That a large group can reach more accurate conclusions than a small group (or any individual expert) has been demonstrated repeatedly. For instance, relying on the crowd to reach a statistical conclusion is an excellent idea when the question is “How many peas does this container hold?” or “Which hotel is best?”
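The pea-counting case can be made concrete with a few lines of code. This is a minimal sketch with made-up numbers (the guesses and true count are hypothetical), showing how a simple average of many independent guesses can land closer to the truth than any single guess:

```python
import statistics

# Hypothetical scenario: a crowd is asked "How many peas does this container hold?"
true_count = 1000
guesses = [850, 1200, 950, 1100, 700, 1300, 980, 1050, 900, 1150]

# The crowd's statistical conclusion is simply the mean of all guesses.
crowd_estimate = statistics.mean(guesses)
crowd_error = abs(crowd_estimate - true_count)

# Compare against how far off each individual guesser was.
individual_errors = [abs(g - true_count) for g in guesses]

# With these numbers, the aggregate beats even the best individual guess:
# crowd_error is smaller than min(individual_errors).
```

The effect depends on the guesses being independent and their errors roughly unbiased; when those conditions fail, averaging can simply reinforce a shared mistake – which is part of why the article turns to expert-sourcing below.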
Admittedly, crowdsourcing has its limits – and there are issues for which it is ineffective. In many cases where depth is required, crowdsourcing adds value only through the aggregate expertise of community members and their analytical interactions with one another.
It is thus crucial to rely on “expert-sourcing” – i.e., a community of experts or a selective crowd. In this case, the power of a professional crowd lies in its ability to develop deep knowledge. This power stems from the unique professional and personal perspectives of the members of that community and from their professional interactions.
Despite these developments, the central concepts of strategic planning – whether in government or the private sector – are still based on the “old world” that preceded the information revolution. Many organizations and firms are still built around the idea of the one rather than the idea of the many. They thus privilege the knowledge of the individual – usually an expert in a particular field.
This philosophy also stresses the importance of a single, high-quality bit of information whose elicitation may enable the “cracking” of a strategic mystery – i.e., understanding the environment in which an organization or company operates. Organizationally, this means placing the responsibility for analyzing, planning and executing strategic plans on individuals or small groups of senior managers or consultants.
Expert-sourcing can overcome the inherent difficulties of these various strategic processes. Organizations and companies often find it difficult to adapt to rapid change. Cognitively speaking, the complexity of our world requires professionalization and the ability to delve into detail. Under such circumstances, it is hard for individuals to combine the need for expertise in sub-topics with the need to present a holistic, multidisciplinary situation assessment. Moreover, individuals (just as groups) are susceptible to biases which impede the ability to think systematically in a way that matches reality – and consequently to make appropriate decisions.
The use of selective or focused crowdsourcing that relies on a community of experts helps to overcome this problem, allowing organizations and companies to combine the strengths of the one with those of the many. The establishment, management and use of expert-sourcing draws on the unique expertise of every community member while leveraging their cooperation to create value from their collective interactions. Such a community can consist of experts in a wide range of fields, some of whom deal with the core facets of an issue and some with the broader ecosystems of relevance. Such a community must include many members in order to increase the aggregate value of their interactions – both to generate a wealth of perspectives and to ensure that, at any given moment, a large enough crowd can be approached to create insights within the timeframe relevant to decision-makers.
More companies are starting to use expert-sourcing – usually online – to support decision-making processes. Some establish their own communities of experts and appoint community managers who continuously recruit members from inside and outside the organization. These managers maintain the community and put it to work when necessary. Others turn to companies operating communities of their own to carry out specific projects with speed – and without compromising depth or quality. It is important to note that this process is not meant to replace decision-makers and the circles supporting them. Expert-sourcing is simply an effective tool in the analytical toolbox of managers facing strategic challenges.
Establishing expert-sourcing mechanisms is only the first step. Once a community is formed, the challenge becomes utilizing it efficiently. This is where the idea of “big knowledge” comes into play – another expression of the idea of the “many” or the “masses”; this time, a multitude of insights.
Big knowledge sometimes starts with big data. There is no doubt that big data tools can sift through huge quantities of information that would otherwise go unexamined. But big data is merely one of many starting points for disparate groups or even masses of analysts to generate the insights which are later turned into big knowledge.
What separates big knowledge from big data is not just making sense of an influx of information, but a flood of insights coming from many different analytic sources. Organizations utilizing big knowledge are thus faced with the following questions: Are we fully utilizing the insights generated, or are some of them getting lost on the way? And can we even generate true knowledge out of this aggregation of insights?
As if these questions are not complicated enough, there is another set of variables to take into consideration: the identity of those creating the insights and the context of their generation. Unlike “ordinary” data (which deals with objective facts), insights are interpretations of reality and therefore are subjective in nature. It is thus important to contextualize the aggregation of preliminary insights so as to build new layers of analysis thereupon. Indeed, ignorance of the basic conditions in which insights are initially generated may replicate biases, mental pathologies or just simple mistakes.
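One way to act on this point is to store each insight together with the context of its generation, so that later layers of analysis can account for possible shared biases. The sketch below is illustrative only – the fields, names and sample insights are assumptions, not a prescribed schema:

```python
from collections import defaultdict
from dataclasses import dataclass
import datetime

@dataclass(frozen=True)
class Insight:
    # The insight itself: a subjective interpretation, not an objective fact.
    text: str
    # Context of generation, preserved so that downstream analysis can
    # detect clusters of insights that may share the same bias.
    author: str
    discipline: str
    region: str
    created: datetime.date

insights = [
    Insight("Supply risk rising in sector X", "analyst_a",
            "economics", "EU", datetime.date(2023, 5, 1)),
    Insight("Supply risk stable in sector X", "analyst_b",
            "economics", "EU", datetime.date(2023, 5, 2)),
]

# Group insights by (discipline, region): two contradictory assessments
# emerging from the same context are a flag for closer review.
by_context = defaultdict(list)
for i in insights:
    by_context[(i.discipline, i.region)].append(i.text)
```

The design choice here is simply that context travels with the insight rather than being reconstructed later, which is what makes the replication of biases detectable at all.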
Understanding the context of insight generation is especially important when dealing with analysts examining similar or interconnected issues while dispersed by location or field of study. Organizations facing this challenge include intelligence agencies, large consulting firms, research firms and the risk assessment departments of large multinational corporations. Such entities need to consistently generate new and impactful insights which transcend the mundane. They must also constantly examine the holistic relevance of knowledge generated for a specific purpose. Doing this manually (via human analytic capacity alone) is simply too labor-intensive – and constrained by cognitive limits.
So how does one effectively digest analytic products produced in multiple contexts? How does one identify blind spots, contradictions or patterns that may or may not emerge? In what structured way can one “connect the dots” between multitudes of analytic products to produce new sets of insights? And how can one assess whether the context in which an analysis was conducted (e.g., time, location, culture, professional background) influenced the assessment?
This is where big knowledge comes into play – an analytic toolkit that enhances cognitive capabilities rather than making them redundant. Even as artificial intelligence and big data tools mature, there will be an extended (and perhaps indefinite) period in which human-in-the-loop analytics are needed to pick up where software falls short. The most interesting future use cases lie in the illumination of unknown or hidden connections, which become a kind of “force multiplier” for the analytic community. Such a toolkit allows analytic communities to map knowledge, interconnect insights, uncover new relationships and ideas, and identify the relevant experts for the task at hand.
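A toy version of this “connect the dots” idea can be sketched in a few lines. Here each insight is reduced to a set of keyword tags (the insights and tags are invented for illustration), and two insights are linked whenever they share a tag; an insight with no links is a candidate blind spot:

```python
from itertools import combinations

# Hypothetical analytic products, reduced to keyword tags.
insights = {
    "A": {"energy", "sanctions", "europe"},
    "B": {"sanctions", "shipping"},
    "C": {"shipping", "insurance"},
    "D": {"agriculture", "drought"},
}

# "Connect the dots": link any two insights sharing at least one tag.
edges = [(a, b) for a, b in combinations(insights, 2)
         if insights[a] & insights[b]]

# Insights with no connections at all may be blind spots worth a
# second look by a human analyst.
isolated = [k for k in insights
            if not any(k in edge for edge in edges)]
```

In this toy data, A–B and B–C are linked through shared tags, while D stays isolated. A real implementation would use richer representations than keyword overlap, but the structure – insights as nodes, contextual relationships as edges – is the same.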
Those organizations wishing to be on the forefront of strategy formulation must move beyond mere data and bring forth the next step in knowledge management. Only the use of handpicked expert crowds and big knowledge can truly reveal “unknown unknowns” – i.e., real-time understanding of previously obscured connections and relationships.
This is essential in an era of chronic instability and complexity. We cannot be satisfied with an understanding of the surface (i.e., merely what “big data” can bring). Rather, we must dive deeper into the multidimensional – which only skilled use of the “many” can reveal.