Development of intelligence platforms

Towards the end of the 1990s, the Internet became a breeding ground for intelligence software: first because it offered countless open sources of information, and second because the birth of intranets made possible a different kind of strategic intelligence, one that could be used directly within departments and businesses, alongside the documentary resources that had until then been the sole means of accessing quality information. In some ways, the Internet made strategic intelligence accessible to all by putting information directly into the hands of end users. Intelligence platforms were developed at that time and were used to produce reports and alerts. Soon, the aim became to set up production lines of filtered, specialist document flows that drew on open data to provide information. This involved monitoring websites, transferring targeted content and issuing alerts and/or regular reports. This system is still widely used in many organisations.
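The monitor-filter-alert loop described above can be sketched in a few lines. This is purely illustrative (all names are hypothetical, and it is not any vendor's implementation): sources are fingerprinted to detect changes, and an alert is raised when changed content matches a watched keyword.

```python
# Minimal sketch of a monitoring pipeline: detect changed sources via
# content hashing, then alert on watched keywords. Illustrative only.
import hashlib

def fingerprint(content: str) -> str:
    """Stable fingerprint of a page's text, used to detect changes."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

def check_sources(snapshots: dict, previous: dict, keywords: list) -> list:
    """Compare fresh snapshots against stored fingerprints; return alerts
    for sources that changed AND mention a watched keyword."""
    alerts = []
    for url, text in snapshots.items():
        fp = fingerprint(text)
        if previous.get(url) != fp:
            previous[url] = fp  # archive the new fingerprint
            hits = [kw for kw in keywords if kw.lower() in text.lower()]
            if hits:
                alerts.append({"source": url, "keywords": hits})
    return alerts

# A first pass only stores fingerprints; a later pass with changed
# content that matches a keyword produces an alert.
store = {}
check_sources({"example.org/news": "quarterly report"}, store, ["merger"])
alerts = check_sources({"example.org/news": "merger announced"}, store, ["merger"])
print(alerts)
```

In a real deployment the snapshots would come from scheduled crawls and the alerts would feed the reports mentioned above; the change-detection-plus-keyword-filter core stays the same.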

Text Mining (the automatic analysis of free text) has been a major focus for potential improvement, with two aims: firstly, to categorise documents in order to send them to “the right people at the right time”; and secondly, to understand the meaning of documents in order to identify and highlight weak signals.

These initiatives have produced mixed results despite constant advances in text mining. Those advances have helped channel and accelerate the selective distribution of document flows under certain conditions, but we are not yet able to identify weak signals, threats or opportunities easily.

Does this mean that intelligence platforms have been reduced to “sorting documents”? Not at all. However, we do need to consider the resources and time required to bring technical capacity together with real business requirements and practices. In other words, what information, strictly necessary and genuinely useful to daily operations, should technology be capable of sharing with intelligence analysts in the future?

Identifying a trend, or highlighting a weak signal that potentially heralds a threat or opportunity, does not rest solely on a publisher’s ability to develop an inference engine or to “embed analytics”. There is no magic wand for analysing competitors, still less the wider environment. Nor can the transition from a document-focused approach to a data-focused approach be achieved by installing interactive dashboards, or by mapping jumbled information that proves complicated to implement and maintain day to day once the demonstration, or at best a prototype, has been completed. We need to step back and identify the limitations and weaknesses of automatically mapping corpora with the aim of identifying weak signals. Let’s avoid repeating the same mistakes: a fine understanding of the environment and the identification of factors of change are not based on Data Science and Big Data alone.

We believe that next-generation strategic intelligence solutions must include a cutting-edge intelligence platform based on several pillars:

  • The aggregation of diverse and big data flows, from both internal and external sources;
  • Compliance with company security restrictions for source data visibility and access;
  • Consideration of the time-sensitive nature of data;
  • The possibility of simply interacting with (querying) incoming and archived data flows;
  • The presentation of data using closed, business-focused visual systems;
  • And last but not least, user support, beyond the simple yet necessary hotlines and training, via customised professional services.
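Several of these pillars can be made concrete with a small data-model sketch. The following is illustrative only (all names are hypothetical, and this is not the KeyWatch data model): aggregated items carry their source, a collection timestamp for time-sensitivity, and per-group visibility restrictions, and a simple query filters incoming and archived flows on all three.

```python
# Illustrative record schema combining three pillars above: aggregation
# of internal/external items, per-source access restrictions, and
# time-sensitive querying. Hypothetical names throughout.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Item:
    source: str                  # internal or external origin
    text: str
    collected_at: datetime       # supports time-sensitivity
    allowed_groups: set = field(default_factory=set)  # security restriction

def query(items, user_groups, since, needle):
    """Return items the user may see, fresh enough, matching the query."""
    return [
        it for it in items
        if it.allowed_groups & user_groups       # visibility/access check
        and it.collected_at >= since             # time-sensitive filter
        and needle.lower() in it.text.lower()    # simple interaction
    ]

now = datetime(2015, 6, 1)
items = [
    Item("press", "competitor launches product", now, {"analysts", "board"}),
    Item("intranet", "internal memo on pricing",
         now - timedelta(days=400), {"board"}),
]
hits = query(items, {"analysts"}, now - timedelta(days=30), "competitor")
print([it.source for it in hits])
```

The stale, board-only intranet memo is filtered out twice over (age and visibility), leaving only the recent press item; the remaining pillars, business-focused visualisation and user support, sit on top of a store like this rather than inside it.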

Solutions publishers are confronted with a challenge that is not merely “software-based” in the sense of application development or their ability to create tools for the ideal intelligence platform. We believe that the challenge is two-fold.

On the one hand, it lies primarily in the perception and cultural practice of intelligence activities. Unfortunately, industrial-scale intelligence has become one of business’s “poor relations”. The budgets set aside for implementing intelligence and creating the conditions for it to operate effectively are well below the sums spent on external communication, for example. Another obstacle to its progress is turnover and social mobility within companies: how many “business intelligence managers” and analysts with real decision-making power and the resources for continuous improvement stay in that managerial position, where they can promote innovation and progress, for more than two or three years?

Software products are generally becoming more standardised, with varying degrees of modularity, and a sort of “arms race” has set in, one that often amounts to simply adding new functionalities, resulting in a more complex offer that is harder to use.

Users do not seem to expect a quick-fix tool whose capacities are independent of the organisation in which it operates, whatever the intelligence culture and practices and whoever is in charge. Instead, they need a solution that helps them understand the complexity and increasing turbulence of their environment effectively. Above all, any solution needs to comply with the constraints and practices of the companies in which it is used.

How can we achieve this vision? The time and conditions needed for innovation cannot be found in some sacrosanct customer relationship or by creating a kind of user club, but instead by bringing together technological potential (know-how and experience in software skills within a value chain) and the suspected or actual business requirements and constraints, in an experimental yet organised manner.

iScope has been modernising its offer and its KeyWatch platform in this direction since its creation. Recent examples include two major investments in 2015: an exclusive and important partnership with data visualisation specialist Atelier Iceberg (software and methodology), and the launch of KeyLab last summer.
