Daily Speculations The Web Site of Victor Niederhoffer and Laurel Kenner





The Importance of Network Analysis Systems to Security Issues and Speculation

Speculators increasingly need the capabilities of a network analysis inference system, the kind used in high-level intelligence applications, whenever large data sets must be analyzed and too few meaningful patterns emerge for prediction. A few commercial solutions exist to perform just that analysis, especially when an enterprise-level search engine quickly returns meaningful results; most available search modes, however, depend on Boolean logic. What is needed is a query data repository in which all content can be defined, organized, categorized and analyzed, something that goes beyond meta-information on the current trend or the verbal expression of idiographic phenomena. What is described below is an attempt to "capture" meaningful data sets when they are not known in advance and, more importantly, when the shaping patterns have not been discerned even though the data is available to the speculator.

In this day and age, the primary threat to the security of our communities is neither an army nor a single person. The pivotal threat comes from hostile networks. I have borrowed this and other conceptualizations below from commercial and non-commercial products now being applied to a challenge that is methodological as much as operational. "Hostile network" is the generic term for any organization that carries out terrorism, drug trafficking, money laundering, organized fraud and the like. The past decades have witnessed acts by lone wolves, but they pale in comparison to the damage caused by hostile networks. This concept has its uses in speculation as well, since we too deal with wolves that operate alone or in concert.

The objective of national security agencies is to expose hostile networks, thwart their plans and capture their leaders and members. When that is not possible, agencies strive to issue alerts about impending hostile operations, prevent the planned activity or at least minimize its effectiveness. Speculation runs into similar constraints.

Intelligence agencies concluded long ago that communication records and other event data, often with no direct relationship to the subject under investigation, are a crucial source of information for intelligence analysis. Their value becomes apparent only when huge volumes of data can be accessed freely. As a result, significant resources have been directed into collecting and storing tens of billions of data records in structured and unstructured formats: structured data as data records, unstructured data as transcripts of calls, field reports and similar documents. Conventional software has not succeeded in using this information to add significant value to the intelligence analysis of terrorist and criminal activities. The new methodology and technology that do so will be a must for advanced speculation as well.

The intelligence analysis of terrorist and criminal activities is very different from the data mining and statistical analysis used in business, economics or sociology. Terrorists are organized in cells and networks of people. They go to great lengths to compartmentalize and to distance themselves from one another behind multiple layers of separation. Intelligence agencies therefore need specialized methodology and technology to expose and analyze these networks. Speculators face the same, or conceptually cognate, challenges. Suppose, for instance, that information is available that a suspect corporation was observed communicating over the course of an hour on a particular date. The analyst will want to answer questions such as:

  1. Is the suspect corporation linked to a particular [hostile to x] network?
  2. Who else is involved in the network?
  3. How is the network structured?
  4. How does it operate?
  5. What are the network trends?
  6. Other relevant questions about the network

In order to understand these networks it is necessary to uncover what is included in the network and how the individual elements are linked to one another. Multiple sources of link support information (LSI) can be used for this purpose, including communication records, advertising, subscribers' or bloggers' messages, email messages (original or forwarded), Internet activity, discussion transcripts, event incidents (media, informal, mixed), financial transactions, relocation records, vehicle fleet developments, etc. For each pair of entity types linked in the network, and for each source of LSI, different criteria can be used to infer the links. For instance, it is possible to infer a link between two corporations or key people if the available records show that both spoke to a common third party, even if they never came into contact directly. In the same way, it is possible to infer a link between two groups owning different bank accounts or export contracts if both carried out transactions with a common third account. These are only indicative examples.
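The common-third-party criterion described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the record format and the entity names (CorpA, Broker1 and so on) are hypothetical.

```python
from collections import defaultdict
from itertools import combinations

def infer_links(records):
    """Infer entity-to-entity links from contact records.

    records: iterable of (entity, counterparty) pairs, e.g. call logs or
    transactions. Two entities are linked when they share at least one
    common counterparty, even if they never contacted each other.
    Returns {frozenset({a, b}): [shared counterparties]}.
    """
    contacts = defaultdict(set)            # counterparty -> entities seen with it
    for entity, counterparty in records:
        contacts[counterparty].add(entity)

    links = defaultdict(list)
    for counterparty, entities in contacts.items():
        for a, b in combinations(sorted(entities), 2):
            links[frozenset((a, b))].append(counterparty)
    return dict(links)

# Hypothetical call records: CorpA and CorpB never call each other,
# but both call Broker1, so a link between them is inferred.
records = [
    ("CorpA", "Broker1"),
    ("CorpB", "Broker1"),
    ("CorpA", "VendorX"),
    ("CorpC", "VendorY"),
]
print(infer_links(records))  # one inferred link: CorpA and CorpB via Broker1
```

The same function serves any LSI source that reduces to pairs, whether phone calls, transactions to a common account, or messages to a common forum.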

Analysts typically want to explore multiple suspects, varying the filters and link criteria in a process of trial and error. This can succeed only if they can work interactively with the data. On the one hand, the system must expose a network hidden in millions or billions of records; on the other, it must let the analyst view the individual records relevant to the analysis. For the process to be practical, the results of a single trial must be returned within a few minutes; when the challenge at hand is a ticking-bomb scenario, results must come even faster. Standard commercial technologies do not provide these capabilities or this performance.

The ideal network analysis inference system is fully capable of answering these kinds of questions. It can be built with software that is already available, applying this type of link analysis to billions of data records from various sources to expose and analyze hostile networks. Intelligence methodology can now evolve to a new level as innovative technology is applied to changing threats, economic or otherwise, that encroach on our normative or stressed systems, and make them more predictable.
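Once inferred links are represented as a graph, the first two analyst questions (is the suspect linked to a network, and who else belongs to it) reduce to reachability. A toy sketch under that assumption, with hypothetical entity names:

```python
from collections import deque

# Illustrative link graph: adjacency sets over inferred links.
# All names below are hypothetical.
graph = {
    "SuspectCorp": {"Broker1"},
    "Broker1": {"SuspectCorp", "ShellCo"},
    "ShellCo": {"Broker1"},
    "OtherCorp": set(),
}

def network_of(graph, start):
    """Breadth-first search: every entity reachable from start,
    i.e. the membership of the suspect's network."""
    seen, queue = {start}, deque([start])
    while queue:
        for neighbor in graph.get(queue.popleft(), ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(network_of(graph, "SuspectCorp"))  # members of the suspect's network
```

Questions about structure and trends would then build on the same graph, e.g. degree counts for structure and snapshots over time for trends.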

This type of analysis provides, or rather defines, a new paradigm to fulfil needs that have existed for as long as speculation has been systematic and data-based. The crucial advantages of such a system must include, depending on one's specific needs and ambitions:

  1. Network centric rather than case or target centric
  2. Sophisticated link analysis mechanisms designed specifically to expose and analyze hostile networks
  3. Data fusion of varied types and formats of data
  4. Fast response time even when using tens of billions of records
  5. Designed for teamwork within speculator groups, across working groups and between speculators working independently
  6. Data compression to minimize data storage requirements
  7. Flexibility and scalability of analysis methodology, quantities of data and hardware infrastructure facilitating long term evolution
  8. Integration of structured and unstructured data in contextual data management
  9. Easy to use by non-technical personnel

Software applications and technology that assist investigators and analysts can turn raw data into actionable intelligence by:

  1. Exposing and analyzing networks
  2. Enhancing the ability to make inferences
  3. Standardizing methodologies and communication across organizational lines
  4. Storing, sharing and reusing any type of information
  5. Improving the investigation work flow
  6. Supporting timely decision making
  7. Facilitating the information dissemination cycle
  8. Supporting fast changing needs
  9. Using manpower efficiently

Once an event or an event set has been detected, the system can send an alert using any of the standard means of communication.

The system should be able to manage a distribution list consisting of individuals and defined groups of individuals, allowing quick and flexible dissemination of alerts. Each type of event can be configured to deliver an alert to particular individuals or groups. In addition, the system should allow distribution to different people and groups based on the severity of the event.
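The routing scheme just described can be sketched as a small table of subscriptions. This is an illustrative sketch only; the event types, group names and severity scale are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Route:
    recipients: list         # individual or group names
    min_severity: int = 1    # deliver only at or above this level

@dataclass
class AlertRouter:
    routes: dict = field(default_factory=dict)   # event_type -> [Route]

    def subscribe(self, event_type, recipients, min_severity=1):
        self.routes.setdefault(event_type, []).append(
            Route(recipients, min_severity))

    def dispatch(self, event_type, severity):
        """Return the recipients whose route matches both the event
        type and the event's severity level."""
        out = []
        for route in self.routes.get(event_type, []):
            if severity >= route.min_severity:
                out.extend(route.recipients)
        return out

router = AlertRouter()
router.subscribe("volume-spike", ["desk-traders"], min_severity=1)
router.subscribe("volume-spike", ["risk-committee"], min_severity=3)
print(router.dispatch("volume-spike", 2))  # ['desk-traders']
```

A severity-3 event would reach both groups; an unknown event type reaches no one.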

Alert based systems often have a problem of producing too many alerts. There are several features in the desired system that should help control the "alert overflow":

  1. Option to set a time limit such that the detection of an event runs for a set period of time and must be renewed by the user
  2. Adjust the minimum level of severity for each event to a level where the number of alerts is acceptable
  3. Rank the importance of each type of event. The user can then define the maximum number of alerts and the system will send alerts starting with the most important in descending order until it reaches the set maximum
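Controls 2 and 3 above compose naturally: filter by a per-event severity floor, then rank by importance and cap the number of alerts. A minimal sketch, with invented event names, severity floors and importance ranks:

```python
def throttle_alerts(alerts, severity_floor, importance, max_alerts):
    """alerts: list of (event_type, severity) tuples.
    severity_floor: {event_type: minimum severity to alert on}.
    importance: {event_type: rank, higher = more important}.
    max_alerts: cap on alerts sent per cycle.
    """
    # Control 2: drop events below their severity floor.
    eligible = [
        (etype, sev) for etype, sev in alerts
        if sev >= severity_floor.get(etype, 1)
    ]
    # Control 3: most important first, then cut off at the cap.
    eligible.sort(key=lambda a: importance.get(a[0], 0), reverse=True)
    return eligible[:max_alerts]

alerts = [("spread-widening", 2), ("volume-spike", 5), ("news-burst", 1)]
floors = {"news-burst": 3}                 # suppress low-severity news
ranks = {"volume-spike": 10, "spread-widening": 5}
print(throttle_alerts(alerts, floors, ranks, max_alerts=2))
# [('volume-spike', 5), ('spread-widening', 2)]
```

Control 1, the time limit, would sit outside this function as an expiry date on each detection rule.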

Perhaps all of the above is a bit too general, but the challenge now lies more in converting these methodologies for use by speculators than in inventing them; hence this reflection on why, and in what context, the conversion is taking place. The encouraging part is that the methodologies speculators and traders need can be found in certain seemingly irrelevant circles of analytical activity, although "seemingly" may be a misleading adverb.


Prof. Joseph D. Ben-Dak is an expert in international security, responses to terror, technology and global politics. He has served in numerous high-level international posts at the United Nations and other organizations. He holds a doctorate in Organizational Sociology and Research Methods from the University of Michigan, Ann Arbor.