
Estimation for Individuality in Stable States

During the previous modeling, we found three stable states of the system. Do these stable states share the same characteristics? Are they the result of mutualism, or are they induced by dominance? These questions probe the deeper nature of the stable states we obtained. The information theory of individuality (ITI) established by David Krakauer provides a general framework for addressing such problems.

Part 1. Theory

Consider an object evolving with time \(t\); its individuality can be described as the information flow from its past to its future. In most cases the object is not isolated but is immersed in its local environment. Call the combination of the object \(O_{t}\) and its environment \(E_{t}\) the system \(S_{t}\), which evolves with time. Anything with structure carries information, and part of the system's information is inherited from the previous system. This information flow, together with environmental noise, determines the evolution of the system. Returning to the object, the information it receives from the previous system is composed of information flowing from the previous object and from the previous environment. The individuality of the object can then be described as how much more information the object receives from the previous object than from the previous environment.

Figure 1. Time Evolution of the System

The information flow between two signals \(S, R\) is captured by the mutual information \( I(S ; R)=H(S)+H(R)-H(S, R)\), where \(H(X)\) is the entropy of the signal \(X\). To calculate the individuality we further need the decomposition of mutual information. If a signal is a combination of two signals, we have \( I\left(S_{1}, S_{2} ; R\right)=I\left(R ; S_{1}\right)+I\left(R ; S_{2} \mid S_{1}\right) \), where \( I\left(R ; S_{2} \mid S_{1}\right):=H\left(S_{2} \mid S_{1}\right)-H\left(S_{2} \mid R, S_{1}\right) \) is the extra information about \(R\) obtained from \(S_{2}\) when \(S_{1}\) is already known. Note that the combined signal can carry more information about \(R\) than the sum of the information carried by the two signals separately.
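These quantities can be estimated directly from discretized signals by counting joint occurrences. The sketch below is a minimal illustration, assuming the signals have already been binned into integer indices; the function names are our own and not part of any library.

```python
import numpy as np

def entropy(*signals):
    """Shannon entropy (in bits) of the joint distribution of discrete signals."""
    joint = np.stack(signals, axis=1)                 # shape: (n_samples, n_signals)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(s, r):
    """I(S;R) = H(S) + H(R) - H(S,R)."""
    return entropy(s) + entropy(r) - entropy(s, r)

def conditional_mi(r, s2, s1):
    """I(R;S2|S1) = H(S2|S1) - H(S2|R,S1), expanded into joint entropies."""
    return entropy(s2, s1) - entropy(s1) + entropy(r, s1) - entropy(r, s2, s1)
```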

Figure 2. Information decomposition

Returning to the object-environment case, we have

$$ \begin{aligned} I(O_{t+1};S_{t}) &= I(O_{t+1};O_{t},E_{t})\\ &= I(O_{t+1};O_{t}) + I(O_{t+1};E_{t}\mid O_{t})\\ &= I(O_{t+1};E_{t}) + I(O_{t+1};O_{t}\mid E_{t}) \end{aligned} $$

And we define

(a) \(A^{*}= I(O_{t+1};O_{t})\), the information flow from the object itself, an index of the object's autonomy. If this is the leading component of the total mutual information, the object is mainly controlled by itself, and we say it has high Organismal Individuality.

(b) \(A=I(O_{t+1};O_{t}\mid E_{t})\), the information flow from the object itself given the environment signal, an index of the object's autonomy under environmental influence. Unlike \(A^{*}\), the information in \(A\) usually cannot be read directly from \( O_{t} \); it has to be interpreted together with \( E_{t} \). If this is the leading component, the object lives in an environment closely tied to it (a colony), and therefore has a high level of Colonial Individuality.

(c) \(nC=I(O_{t+1};E_{t}\mid O_{t})\), the information flow from the environment, an index of the environment's influence on the object. If it is low, the environmental factors are largely predictable for the object. If it is the leading component, the object shows high Environmental Determinism.

(d) \(NTIC=A^{*}-A\), the Environmental Coding, an index of how much the object's organismal individuality differs from its colonial individuality. The higher this index, the more likely information about the environment is encoded innately in the object, so the object is dominated by its nature. Conversely, if the index is negative, information has to be encoded through ongoing interaction, and nurture dominates the object.

(e) \(I(O_{t+1};E_{t})\), the component that contains no information from the object. If it is the primary component, the object has low autonomy and cannot be called an individual. A sketch of how these indices can be assembled from discretized trajectories is given below.
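Using the entropy and mutual-information helpers sketched earlier, the components (a)-(e) can be computed from a pair of aligned, binned trajectories. This is only an illustrative sketch of the bookkeeping implied by the definitions, not the exact pipeline used for the results below; the arrays O and E play the roles of \(O_{t}\) and \(E_{t}\).

```python
def iti_indices(O, E):
    """ITI indices from aligned, binned trajectories O and E (integer bin indices).
    Relies on entropy(), mutual_information() and conditional_mi() sketched above."""
    O_next, O_now, E_now = O[1:], O[:-1], E[:-1]
    A_star = mutual_information(O_next, O_now)        # A*  = I(O_{t+1}; O_t)
    A      = conditional_mi(O_next, O_now, E_now)     # A   = I(O_{t+1}; O_t | E_t)
    nC     = conditional_mi(O_next, E_now, O_now)     # nC  = I(O_{t+1}; E_t | O_t)
    total  = (entropy(O_next) + entropy(O_now, E_now)
              - entropy(O_next, O_now, E_now))        # I(O_{t+1}; S_t)
    return {"A*": A_star, "A": A, "nC": nC, "NTIC": A_star - A, "I(O_next;S)": total}
```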

Part 2. Applying ITI to our Project

To calculate the information, we first recast the master equation in stochastic form. The details of the transformation are routine and are not listed here.
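The general recipe is to add a noise term to each deterministic rate and integrate with a stochastic scheme. The following is a minimal Euler-Maruyama sketch under that assumption; the drift function f, the initial state x0, and the noise amplitude sigma are placeholders standing in for our actual equations and parameters.

```python
import numpy as np

def simulate_sde(f, x0, dt, n_steps, sigma, rng=None):
    """Euler-Maruyama integration of dX = f(X) dt + sigma dW (placeholder drift f)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.empty((n_steps + 1, len(x0)))
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=len(x0))   # Wiener increments
        x[k + 1] = x[k] + f(x[k]) * dt + sigma * dw
        x[k + 1] = np.maximum(x[k + 1], 0.0)               # keep populations non-negative
    return x
```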

We then run the simulation and obtain the stochastic signal around each stable state. An appropriate sampling interval must first be chosen. The nature of the dynamic equations implies that, as long as the time interval is small enough, the change in the variables' values remains small. Consequently, with a very small interval every object would show strong organismal individuality, making the result uninformative. A good criterion for choosing the interval is the autocorrelation function: when it is small, the interval is long enough.

Figure 3.a. Autocorrelation matrix for a half-day interval
Figure 3.b. Autocorrelation matrix for a one-day interval
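As a concrete illustration of this criterion, the sketch below computes the sample autocorrelation of a signal at a few candidate lags. The AR(1) toy signal only stands in for one of our simulated variables; the lag values are illustrative, not the step counts we actually used.

```python
import numpy as np

def autocorrelation(x, lag):
    """Sample autocorrelation of a 1-D signal at a given lag (in sampling steps)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

# Toy AR(1) signal standing in for one simulated variable around a stable state.
rng = np.random.default_rng(0)
x = np.zeros(10_000)
for t in range(1, len(x)):
    x[t] = 0.95 * x[t - 1] + rng.normal()

for lag in (1, 10, 100):   # candidate sampling intervals, in simulation steps
    print(lag, round(autocorrelation(x, lag), 3))
```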

The signal was then divided into bins. The bin division should account for both the characteristics of the data and the total amount of data. The number of bins directly influences the value of the entropy; for mutual information, however, the bin number is less important, because its contribution is neutralized in the calculation.
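A minimal equal-width binning sketch is shown below; the actual bin division also took the shape of each signal's distribution into account, and n_bins is an illustrative parameter.

```python
import numpy as np

def bin_signal(x, n_bins):
    """Discretize a continuous signal into equal-width bins; returns indices 0..n_bins-1."""
    x = np.asarray(x, dtype=float)
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    return np.digitize(x, edges[1:-1])   # inner edges only, so indices stay in range
```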

To reduce the computation time, we chose the bacteria living in the mucus to represent the respective bacterial signal. This is a reasonable choice since most of the bacteria live in the mucus; moreover, the correlation matrix showed significant correlation between the bacteria in the mucus and the respective adherent bacteria. Naturally, we took the immune response to represent the human signal. The remaining variables were treated as unknown intermediate products.

The calculated results and their analysis are listed below.

O      A*        A         NTIC
H      0.9874    0.6957    0.2917
I      0.6714    0.1572    0.5142

Table 1. ITI analysis for i-state

O      A*        A         NTIC
H      0.6742    0.4440    0.2302
I      0.8268    0.9944   -0.1676

Table 2. ITI analysis for p-state

O      A*        A         NTIC
L      0.6968    0.9595   -0.2627
H      0.6343    0.9499   -0.3155
I      0.3095    0.8468   -0.5372

Table 3. ITI analysis for h-state

i-state: Hp and the human are both of high organismal individuality and are independent of each other. Their environmental coding is significantly positive. This means they adapt well to the situation, but because they rely on innate coding, the i-state cannot keep its balance in the face of a pulse.

p-state: Hp and the human are both of high organismal individuality, but their environmental coding differs. The human has a negative environmental coding while Hp has a positive one, which implies that Hp dominates the p-state while the human merely manages to survive in it.

h-state: All three components are of high colonial individuality. Their environmental coding is negative and significantly nonzero, which makes the h-state able to withstand extreme fluctuations. The environmental determinism for the human is also high, suggesting that the human is strongly influenced by the bacteria.