Modelling foraminiferal test accumulation in Python by Andrew Berkeley

The Steady State Assumption

What if we return to the same spot time and time again and collect sediment samples which always show the same quantities of foraminifera: the same absolute concentrations and the same relative species abundances? We might interpret this as meaning that nothing is happening - no test production, no loss, no burial. Alternatively, it may be that the processes that are operating happen to be in some sort of equilibrium: the number of tests introduced between sampling times is equal to the number removed. In this case, the processes are said to be in dynamic equilibrium and the system as a whole is at a steady state.

For scenarios which are in steady state, the General Foraminiferal Equation is simplified a little since, if there is no change with time ($\frac {\partial C}{\partial t} = 0$), it becomes,

$$0 = D(x)\frac {\partial ^2C}{\partial x^2} - w\frac {\partial C}{\partial x} + a(x)R(x) - \lambda(x) C$$

What this equation now states is that, although several processes are still described (mixing, burial, test production and loss), the net change to the concentration of dead tests when all are considered is equal to zero. This must mean that the various processes are in some sort of balance - a steady state. Assuming a steady state is a useful device because it allows us to analyse and explain data from a single point in time without needing to explicitly resolve changes through time.

A simple example with surface assemblages

Reference other notebook...

Okay, so dead test concentrations start off at zero and increase through time. But significantly, the addition of tests slows down and eventually stops, and thereafter the test concentration remains stable. Test accumulation stabilizes at a concentration of about 1000 tests per unit sediment after about 100 time steps. It is not as though test production and taphonomic decay aren't continuing in subsequent timesteps; it is just that they have reached a balance. So it seems there is a kind of inherent stabilizing tendency to this simple model: given sufficient time it will reach a particular state and stay there. When we invoke an assumption or an interpretation of a steady state, we are usually implying that the system has reached some similar sort of inherent equilibrium phase.

And if we're only interested in the steady state rather than the complete time evolution, then we can take a short cut. Remember that if $C$ is stable, not changing, then mathematically we can say that $\frac {d C}{d t} = 0$ - the rate of change of $C$ is zero. A visual illustration of this is the flattening of the line in the plot after about $t = 100$. So the steady-state phase is characterised by - no, defined by - $\frac {d C}{d t} = 0$. This means we can use equation (1) in a different way, rather than formally solving it. Let's set the left-hand side to zero,

$$0 = aR - \lambda C^* $$

where we now adopt $C^*$ as the steady-state concentration. This can be easily solved for $C^*$,

$$C^* = a \frac {R}{\lambda} \hspace{1cm} (3) $$

So now we have an explicit statement of the steady state concentration without having to solve the entire history of the assemblage. Let's solve this using Python...


In [2]:
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline

In [3]:
a = 50         # standing crop size
R = 1.0        # reproduction rate
lmbda = 0.05   # taphonomic decay rate ("lambda" is a reserved word in Python)

In [4]:
# Equation (3), using the parameters defined above
C_star = a * R / lmbda

print(C_star)


1000.0

So equation (3) predicts a steady-state concentration of 1000 tests per unit of sediment. And what was the stable value generated from equation (2)? Let's get the last value from our C array.
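The $C$ array comes from the earlier time-stepping exercise and isn't rebuilt here, but the cell below is a minimal sketch of how it might be regenerated, assuming the accumulation was stepped forward explicitly from $C = 0$ using the balance $\frac {d C}{d t} = aR - \lambda C$ (the scheme and step size used originally may differ).

In [ ]:
# A sketch (under the assumptions above) of regenerating the time series:
# explicit (Euler) stepping of dC/dt = a*R - lambda*C from C = 0.
dt = 1.0                   # time step
n_steps = 200              # comfortably past the ~100 steps needed to stabilize
C = np.zeros(n_steps + 1)  # dead test concentration through time

for t in range(n_steps):
    C[t + 1] = C[t] + dt * (a * R - lmbda * C[t])

print(C[-1])               # the stable value reached by the time stepping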

Hmmm. Well, they're pretty close. So equation (3) is getting to the same steady-state answer as equation (2), and with much less effort! If we look a bit more closely at equation (2) we can see why. Equation (2) looks just like equation (3) but with an extra term in square brackets. The term in square brackets contains an exponential decay function which converges to zero as time increases, and therefore the term in square brackets itself converges to 1. So equation (2) converges towards equation (3), i.e. $a \frac {R}{\lambda}$, with increasing time.
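We can check that convergence numerically. Assuming the bracketed term has the form $1 - e^{-\lambda t}$, as described above, evaluating it at increasing times shows it approaching 1:

In [ ]:
# Evaluate the assumed bracketed term from equation (2), (1 - exp(-lambda*t)),
# at increasing times: it converges towards 1.
for t in [10, 50, 100, 200, 500]:
    print(t, 1 - np.exp(-lmbda * t))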

We can conclude that, for a single sediment layer (e.g. the surface layer) with no mixing or sedimentation and constant rates of test production and taphonomic decay, we should expect the dead test concentration ($C$) to reach a stable value and to exceed the size of the standing crop ($a$) by a factor of $\frac {R}{\lambda}$, i.e. the ratio of the reproduction rate to the taphonomic decay rate.
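As a quick check with the values used above, the ratio of the steady-state concentration to the standing crop should match $\frac {R}{\lambda}$:

In [ ]:
# The steady-state concentration exceeds the standing crop by a factor of R/lambda
print(C_star / a, R / lmbda)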

But why would this system be inherently stable? The answer is down to the way test production and taphonomic loss have been defined. Test production occurs at a constant rate ($aR$). By contrast, the rate of taphonomic loss changes depending on the number of tests already accumulated. More specifically, taphonomic loss is directly proportional to test concentration (defined as $\lambda C$) and therefore the rate of test loss increases as the test concentration increases. This provides the inherent stabilizing feature: when test concentrations grow large, the rate of further growth is checked by larger decay rates. At some point a dead test concentration is reached at which the rate of loss happens to equal the rate of addition. At that point there is no further net accumulation and the concentration stays the same.

In general, models with such decay terms might be expected to similarly curtail accumulation. Whatever the size of the decay constant ($\lambda$), there will always be a concentration at which loss is the same as input (assuming a simple input) and therefore no more accumulation occurs. The decay constant simply determines how long it will take to get to that point. This limiting feedback process is one reason why assumptions of steady-state may be perfectly appropriate.
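As an illustrative sketch (not part of the original analysis), stepping the same balance forward for a few different decay constants shows each run levelling off at its own $\frac {aR}{\lambda}$, with larger values of $\lambda$ getting there sooner:

In [ ]:
# Illustrative only: explicit time stepping for several decay constants.
# Each curve flattens out at a*R/lambda; larger lambda values flatten sooner.
dt = 1.0
n_steps = 300
for decay in [0.02, 0.05, 0.1]:
    C = np.zeros(n_steps + 1)
    for t in range(n_steps):
        C[t + 1] = C[t] + dt * (a * R - decay * C[t])
    plt.plot(C, lw=2, label='lambda = %.2f' % decay)

plt.xlabel('time step')
plt.ylabel('dead test concentration, C')
plt.legend()
plt.show()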

An example with cored data

The steady-state assumption may be particularly useful when looking at cored data which represents test accumulation over long timescales (e.g. decades, centuries). It is obviously difficult to collect repeated cores over such timescales. We can imagine that, as sediment accumulates over long time periods and the zones of foraminiferal test production and taphonomic loss shift up with it, the profile of accumulated dead tests does not change each time we resample: maybe it describes a "typical" pattern of taphonomic loss or infaunal species enrichment that persists. As a simple illustration, if we assume no mixing ($D(x) = 0$), no taphonomic loss ($\lambda = 0$) and a constant rate of test production at all depths ($aR$), the steady-state equation reduces to,

$$0 = - w\frac {\partial C}{\partial x} + aR$$

which we can rearrange to,

$$\frac {d C}{d x} = \frac {aR}{w}$$

At steady state the test concentration is a function of depth only, not time, and this equation describes explicitly how it changes with depth.

Integrating over depth, and taking $C = 0$ at the sediment surface ($x = 0$), we get

$$C^* = \frac {aRx}{w}$$

In [8]:
x_max = 10.0   # total depth into sediment to model
a = 25.0       # standing crop size
R = 2.0        # reproduction rate
w = 0.5        # sedimentation rate

x = np.arange(x_max + 1)    # depth points from the surface (x = 0) down to x_max
C_star = a * R * x / w      # steady-state concentration profile, C* = aRx/w

In [9]:
fig = plt.figure()

ax = fig.add_subplot(111, xlim=(0, x_max), ylim=(np.min(C_star)*1.1, np.max(C_star)*1.1))
plt.plot(x, C_star, lw=3)   # dead test concentration against depth
plt.grid()
plt.xlabel('depth, x')
plt.ylabel('dead test concentration, C')
plt.show()




Why assume a steady state?

Assuming a steady state may be necessary if we only have data from one point in time - we cannot resolve changes through time with such data, and so assuming that the data are representative of all times (because the system is in equilibrium) allows us to unpick the individual processes.

In many cases it will be a reasonable assumption, although this remains a judgement call and may depend on the scale of analysis. Foraminiferal populations and dead assemblages may fluctuate on timescales of weeks or months. But to the extent that these fluctuations average out over longer timescales, it may be appropriate to consider them as noise around a more stable, average signal. If an environment can be said to be represented by a "typical" surface assemblage (e.g. for palaeoenvironmental purposes) or a "typical" profile of dead test concentrations with depth (e.g. "well preserved", or "infaunally enriched"), then it can equally be considered to be at, or close to, a steady state.

