Over the last few decades, the idea of the edge of chaos, i.e. the region between order and chaos -- a zone usually described with the mathematics of impending avalanches and crystallizing liquids -- has been a fruitful mathematical concept for finding and interpreting new rules in natural and social systems.
At the edge of chaos we find the dynamics of criticality, where a system transforms rapidly from one state into another. Scientists have studied such behavior in physical systems for decades; some have theorized that it might be found in living systems too, perhaps underlying some of biology’s fundamental and largely unexplained phenomena: how a few interacting genes shape an organism’s development, how networked neurons give rise to complex cognitive functions, or how a social system self-organizes.
In 1999, the Danish physicist Per Bak proclaimed to a group of neuroscientists that perhaps the brain was less complicated than they thought. Perhaps, he said, the brain worked on the same fundamental principles as a simple sand pile, in which avalanches of various sizes help keep the entire system stable overall — a process he dubbed “self-organized criticality.” Self-organized criticality (SOC) is a property of dynamical systems that have a critical point as an attractor. Their macroscopic behaviour thus displays the spatial and/or temporal scale-invariance characteristic of the critical point of a phase transition, but without the need to tune control parameters to a precise value, because the system, effectively, tunes itself as it evolves towards criticality.
In this workshop, we present some examples of how the concept of criticality can be used to understand collective behaviour (regardless of whether the units of the system are neurons in a brain or individuals in a society). To do so, we will analyse how simple toy models do or do not display self-organized criticality under different conditions.
The Abelian sandpile model, also known as the Bak-Tang-Wiesenfeld model, was the first discovered example of a dynamical system displaying self-organized criticality.
The model is a cellular automaton. In its original formulation, each cell on a finite grid has an associated value that corresponds to the slope of the pile. This slope builds up as "grains of sand" (or "chips") are randomly placed onto the pile, until the slope exceeds a specific threshold value at which time that site collapses transferring sand into the adjacent sites, increasing their slope. There is a successive random placement of sand grains on the grid; each such placement of sand at a particular site may have no effect, or it may cause a cascading reaction that will affect many sites.
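The toppling rule described above can be sketched in a few lines of Python (a minimal scalar illustration of the rule, not the workshop's own vectorized simulation; here grains pushed past the boundary simply fall off the grid):

```python
import numpy as np

def drop_and_topple(field, site, threshold=4):
    """Drop one grain at `site`, then topple every cell that reaches
    the threshold, passing one grain to each of its four neighbours.
    Grains pushed off the edge of the grid are lost. Returns the
    number of toppling events (the avalanche size)."""
    field[site] += 1
    avalanche = 0
    unstable = [site]
    while unstable:
        i, j = unstable.pop()
        if field[i, j] < threshold:
            continue  # already relaxed by an earlier topple
        field[i, j] -= threshold
        avalanche += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < field.shape[0] and 0 <= nj < field.shape[1]:
                field[ni, nj] += 1
                if field[ni, nj] >= threshold:
                    unstable.append((ni, nj))
    return avalanche

field = np.zeros((21, 21), dtype=int)
sizes = [drop_and_topple(field, (10, 10)) for _ in range(5000)]
```

Most drops cause no toppling at all, while some trigger cascades of many sites, which is exactly the unpredictable response described above.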
The original interest behind the model stemmed from the fact that in simulations on lattices, it is attracted to its critical state, at which point the correlation length of the system and the correlation time of the system go to infinity, without any fine tuning of a system parameter. This contrasts with earlier examples of critical phenomena, such as the phase transitions between solid and liquid, or liquid and gas, where the critical point can only be reached by precise tuning (e.g., of temperature). Hence, in the sandpile model we can say that the criticality is self-organized.
Once the sandpile model reaches its critical state there is no correlation between the system's response to a perturbation and the details of a perturbation. Generally this means that dropping another grain of sand onto the pile may cause nothing to happen, or it may cause the entire pile to collapse in a massive slide. The model also displays $1/f$ noise, a feature common to many complex systems in nature.
The example below displays a standard Abelian sandpile. You can see how the grains are distributed over time and how they expand from the point where they are dropped. Modify the parameters in the code (such as the board size, the maximum pile height, the fall probability or the number of grains dropped) to explore some variations.
In [4]:
%%HTML
<iframe src="https://www.openprocessing.org/sketch/434784/embed/" width="450" height="510"></iframe>
The Bak–Sneppen model is a simple model of co-evolution between interacting species. It was developed to show how self-organized criticality may explain key features of the fossil record, such as the distribution of sizes of extinction events and the phenomenon of punctuated equilibrium.
The model dynamics repeatedly eliminate the least adapted species and mutate it together with its neighbours, recreating the interaction between species. We consider N species, indexed by integers i around a ring, each associated with a fitness value f(i). The algorithm consists in choosing the least fit species and replacing it and its two closest neighbours (the previous and the next species) by new species with new random fitness values. After a long run a minimum required fitness emerges, below which species do not survive. Bursts of successive extinctions are referred to as avalanches, and the model proceeds through these avalanches until it reaches a state of relative stability where all species' fitness values are above a certain threshold.
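The emergence of a minimum fitness threshold can be checked with a few lines of Python (a stand-alone sketch of the standard update rule, in which both ring neighbours are always replaced; the workshop's own implementation appears further below):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
s = rng.random(N)   # initial random fitness values

# Bak-Sneppen update: replace the least fit species and its two
# ring neighbours with new random fitness values.
for _ in range(100_000):
    i = np.argmin(s)
    for j in (i - 1, i, i + 1):
        s[j % N] = rng.random()

# After a long run, almost all fitness values sit above a threshold
# (close to 0.667 in the one-dimensional model), except for the few
# species involved in the current avalanche.
print(np.median(s))
```

Running this, the initial uniform distribution of fitness values self-organizes so that most values lie above the critical threshold, without any parameter tuning.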
The Bak-Sneppen model has also been used to represent scientific progress. Accounts of scientific change roughly fall into two categories: one-process and two-process change. One-process views hold that scientific change is a single process of movement toward an independent point of reference, for example Popper's cycle of conjecture and refutation. One of the most influential criticisms of this view is by Thomas Kuhn, who held that standards for science are not independent, but coevolve with scientific activity itself. Science then does not evolve toward a goal but away from goals previously set, with no fixed point left to compare its progress against. Without such a stable "Archimedean platform," there is nothing against which science can meaningfully be said to accumulate. Moreover, changes of standard caused by scientific change can cause cascades of further scientific changes.
This example displays the Bak-Sneppen model. Every column represents one species. As species become older, their colour changes from light green to dark blue. When a species has the lowest fitness, it is eliminated along with its neighbours, and new species are created with their age reset to zero and their colour back to light green.
We can modify the original parameters to analyse their effect on the extinction avalanches. Go to the OpenProcessing code and modify the parameters in the "BS" tab.
In [4]:
%%HTML
<iframe src="https://www.openprocessing.org/sketch/434390/embed/" width="500" height="400"></iframe>
A popular tool to detect patterns typical of critical phenomena is fluctuation analysis, which measures the fractal dimension of a signal. A fractal dimension is a ratio providing a statistical index of complexity comparing how detail in a pattern (strictly speaking, a fractal pattern) changes with the scale at which it is measured. It has also been characterized as a measure of the space-filling capacity of a pattern that tells how a fractal scales differently from the space it is embedded in; a fractal dimension does not have to be an integer.
For instance, if we measure the length of the coast of England, we will find different values depending on the unit chosen to make the measurement.
We may apply these fractal ideas to time series. For instance, the box-counting method consists in covering the curve of the signal, plotted against time, with boxes of decreasing size and counting how many boxes the curve touches at each scale.
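A crude box-counting estimate for a time series can be sketched as follows (a toy implementation with parameter choices of our own, purely for illustration): rescale the graph of the signal into the unit square, count the boxes of side $1/k$ that the curve touches, and fit the slope of $\log N$ against $\log k$.

```python
import numpy as np

def box_count_dimension(x, scales=(4, 8, 16, 32, 64, 128)):
    """Estimate the box-counting dimension of the graph of a 1-D
    signal. The graph is rescaled to the unit square; for each box
    size 1/k we count the boxes the sampled curve passes through."""
    x = (x - x.min()) / (x.max() - x.min())   # rescale values to [0, 1]
    t = np.linspace(0, 1, len(x))
    counts = []
    for k in scales:
        boxes = set()
        for ti, xi in zip(t, x):
            boxes.add((min(int(ti * k), k - 1), min(int(xi * k), k - 1)))
        counts.append(len(boxes))
    # dimension = slope of log N versus log k (since eps = 1/k)
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return slope

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(20_000))  # Brownian-like signal
print(box_count_dimension(walk))  # strictly between 1 and 2 for a rough curve
```

A smooth signal such as a straight line gives a dimension close to 1, while a rough random walk fills more of the plane and gives a dimension between 1 and 2.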
There are several algorithms that measure the fractal index of time signals. We are going to use DFA (Detrended Fluctuation Analysis) because of its robustness. DFA provides the Hurst exponent $H$ as the slope of the function describing how the fluctuation size depends on the temporal scale. For self-similar time series, $H$ is directly related to the fractal dimension $D$, where $1 < D < 2$, such that $D = 2 - H$. Likewise, $H$ is related to the power-spectrum decay exponent $\beta$ (from $P(f)\sim f^{-\beta}$) by the equivalence $\beta = 2H - 1$.
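The notebook cells below use the `dfa` routine from the `pyeeg` package; as a cross-check, a minimal stand-alone DFA (order 1, our own sketch rather than `pyeeg`'s implementation) can be written in a few lines:

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128, 256)):
    """Minimal detrended fluctuation analysis (DFA-1).
    Integrate the signal, split it into windows of size n, remove a
    linear trend from each window, and measure the RMS fluctuation
    F(n). The scaling exponent is the slope of log F versus log n."""
    y = np.cumsum(x - np.mean(x))        # integrated profile
    F = []
    for n in scales:
        m = len(y) // n                  # number of full windows
        w = y[:m * n].reshape(m, n)
        t = np.arange(n)
        res = []
        for seg in w:
            a, b = np.polyfit(t, seg, 1)             # linear trend
            res.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(8192)
print(dfa_exponent(white))   # close to 0.5 for white noise
```

For uncorrelated white noise the exponent is close to $0.5$ (so $\beta = 2H - 1 \approx 0$), while for its integral, a Brownian random walk, it is close to $1.5$ ($\beta \approx 2$).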
On time series, there are three characteristic reference cases: white noise ($\beta = 0$), pink noise ($\beta = 1$) and brown noise ($\beta = 2$).
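The exponent $\beta$ of these reference noises can be estimated directly from the power spectrum. The sketch below (our own illustration; the slope fit over the raw periodogram is a deliberately simple choice) generates white noise and brown noise, the latter by integrating the former, and fits $\beta$ from $P(f)\sim f^{-\beta}$:

```python
import numpy as np

rng = np.random.default_rng(1)
white = rng.standard_normal(16_384)
brown = np.cumsum(white)        # integrating white noise gives brown noise

def spectral_exponent(x):
    """Fit beta in P(f) ~ f**(-beta) from the raw periodogram."""
    P = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x))
    keep = f > 0                 # drop the zero-frequency bin
    slope, _ = np.polyfit(np.log(f[keep]), np.log(P[keep]), 1)
    return -slope

print(spectral_exponent(white))   # near 0
print(spectral_exponent(brown))   # near 2
```

Pink noise sits between the two, with a spectrum decaying as $1/f$, the "$1/f$ noise" mentioned above for the sandpile model.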
Now, we will analyse the results of the sandpile model to check if they are fractal. To do so, we will obtain the fractal index of the avalanche size.
To analyse the results, we will code the sandpile algorithm. Every time a new grain is dropped at the center, this action may produce nothing, or may produce a displacement of several other grains. Sometimes, this displacement spreads all over the sandpile in a big avalanche that moves from the center to the border and backwards. The size of the avalanche for each new grain is the number of grains displaced by that grain.
An avalanche size is the number of grains displaced at every time step. To save computation time, one time step means dropping the maximum number of grains allowed for a pile, since dropping fewer grains produces no avalanches. If the grains are dropped near a wall or a corner, avalanches are contained by the boundary of the system (in our case extra grains are spilled over the edge, but the effect is equivalent).
Now we can run the simulation with different parameters and analyse the results. We will plot the resulting board, the size of the avalanches at every time step, and the fluctuations of those avalanches computed using the DFA function. The fluctuation curve lets us know how many avalanches there have been within a certain size range.
In [7]:
%matplotlib inline
from matplotlib import pyplot as plt
import numpy as np
from pyeeg import dfa
def plot_dfa(x, step, Nfit):
    # logarithmically spaced window sizes, from the series length down to 2^2
    ix = np.arange(np.log2(len(x)), 2, -step)
    n = np.round(2**ix)
    n, F = dfa(x, L=n)
    # fit the scaling exponent only up to windows of size ~Nfit
    Ncut = np.argmin(np.abs(n - Nfit))
    P = np.polyfit(np.log(n[0:Ncut]), np.log(F[0:Ncut]), 1)
    plt.loglog(n, F)
    plt.loglog(n[0:Ncut], np.power(n[0:Ncut], P[0]) * np.exp(P[1]), 'r')
    plt.xlabel('n')
    plt.ylabel('Fluctuations')
    plt.title('Beta = ' + str(2 * P[0] - 1))
# Calculate the simulation
L = 35       # Size of the board. An odd number maintains symmetry
T = 100000   # Number of iterations
avalancheSize = np.zeros(T)   # Series of avalanche sizes
# size of the pile-space (each pixel is a pile)
SIZE = (L, L)
# maximum number of stackable sand grains that will cause a topple
# (should be divisible by 4!)
MAXH = 4
# sand field
field = np.zeros((SIZE[0], SIZE[1]))
# run the simulation
for t in range(T):
    # choose where to add sand; uncomment one of the options
    #MPC = (int((L-1)/2), int((L-1)/2))   # Center of pile
    #MPC = (1, int((L-1)/2))              # Wall
    #MPC = (1, 1)                         # Corner
    MPC = (np.random.randint(L), np.random.randint(L))   # Random
    #MPC = (np.random.randint(5), np.random.randint(5))  # Random around corner
    # add grains if the chosen site is not saturated
    if field[MPC] < MAXH:
        field[MPC] += MAXH
    # find the sites that topple
    toohigh = field >= MAXH
    # decrease the toppling piles
    field[toohigh] -= MAXH
    # increase their neighbours
    field[1:, :][toohigh[:-1, :]] += 1
    field[:-1, :][toohigh[1:, :]] += 1
    field[:, 1:][toohigh[:, :-1]] += 1
    field[:, :-1][toohigh[:, 1:]] += 1
    # reset the overspill at the boundary
    field[0, :] = 0
    field[-1, :] = 0
    field[:, 0] = 0
    field[:, -1] = 0
    # record avalanche size (number of toppled sites)
    avalancheSize[t] += np.sum(toohigh)
# Figures
fig = plt.figure(figsize=(15,4))
# Sandpile board
plt.subplot(1,3,1)
plt.axis('off')
plt.imshow(field)
plt.colorbar()
#select signal from the system series in stationary mode
x=avalancheSize[int(T*3/4):]
# Chronogram of avalanches
ax = plt.subplot(1,3,2)
plt.xlabel('Time')
plt.ylabel('Avalanche size')
plt.plot(x[-1000:])
step = 0.1
ax = plt.subplot(1,3,3)
plot_dfa(x,step,100)
plt.show()
Now, we will analyse the results of the Bak-Sneppen model to check if they are fractal. We will analyse the mean fitness of the individuals in the system as the global variable driving the behaviour. At first, this number will grow over time because the least-fit species is eliminated each generation. As time goes by, an increasingly random growth is expected, because old species will have an almost constant high fitness while new species are introduced with random fitness. However, the interaction between species will provoke chains of extinctions that lower the global fitness. The equilibrium between these two processes drives the system towards self-organized criticality.
To do so, we will obtain the fractal index of the mean-fitness series.
In [12]:
%matplotlib inline
from matplotlib import pyplot as plt
import numpy as np
from pyeeg import dfa
def plot_dfa(x, step, Nfit):
    # logarithmically spaced window sizes, from the series length down to 2^2
    ix = np.arange(np.log2(len(x)), 2, -step)
    n = np.round(2**ix)
    n, F = dfa(x, L=n)
    # fit the scaling exponent only up to windows of size ~Nfit
    Ncut = np.argmin(np.abs(n - Nfit))
    P = np.polyfit(np.log(n[0:Ncut]), np.log(F[0:Ncut]), 1)
    plt.loglog(n, F)
    plt.loglog(n[0:Ncut], np.power(n[0:Ncut], P[0]) * np.exp(P[1]), 'r')
    plt.xlabel('n')
    plt.ylabel('Fluctuations')
    plt.title('Beta = ' + str(2 * P[0] - 1))
# Number of species (system size)
N = 64
# probability of propagating the extinction to each neighbour
p = 0.5
# Array of species fitness values
s = np.random.rand(N)
# Simulation time
T = 10000
# mean fitness series
fit = np.zeros(T)
def update(s, p):
    ind = np.argmin(s)                 # locate the least fit species
    s[ind] = np.random.rand(1)         # replace it with a new random species
    for d in [-1, 1]:                  # for each ring neighbour
        if np.random.rand(1) < p:      # with probability p
            s[(ind + d) % N] = np.random.rand(1)   # replace the neighbour too
    return s
for t in range(T):
    s = update(s, p)
    fit[t] = np.mean(s)
# Figures
fig = plt.figure(figsize=(15,4))
#select signal from the system series in stationary mode
x=fit[int(T*3/4):]
# Chronogram of avalanches
ax = plt.subplot(1,2,1)
plt.xlabel('Time')
plt.ylabel('Mean fitness')
plt.plot(x[-1000:])
step = 0.1
ax = plt.subplot(1,2,2)
plot_dfa(x,step,400)
plt.show()