# Differential equations in Python

There is often no analytical solution to systems with nonlinear, interacting dynamics. We can, however, examine the dynamics using numerical methods. Consider the predator-prey system of equations, where there are fish ($x$) and fishing boats ($y$):

We use SciPy's built-in `odeint` function to solve the system of ordinary differential equations; under the hood it relies on `lsoda` from the FORTRAN library ODEPACK. First, we define a callable function that computes the time derivatives for a given state and time. We also load the libraries we'll use later to animate the results.
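The derivative callback takes the arguments in the order `odeint` expects: state first, then time. The post's actual equations and constants aren't reproduced in this copy, so the sketch below uses a generic predator-prey system with a logistic term on the fish stock (consistent with the inward spiral described later) and made-up parameter values:

```python
import numpy as np
from scipy.integrate import odeint

# Hypothetical constants -- placeholders, not the post's actual values.
a, b, c, d, K = 1.0, 0.1, 1.5, 0.075, 50.0

def deriv(state, t):
    """Time derivatives (dx/dt, dy/dt) for fish x and boats y."""
    x, y = state
    dxdt = a * x * (1 - x / K) - b * x * y  # logistic growth minus harvest
    dydt = -c * y + d * x * y               # boats enter when fishing pays
    return [dxdt, dydt]
```

With these placeholder constants the interior equilibrium sits at $(\bar{x}, \bar{y}) = (c/d,\; (a/b)(1 - \bar{x}/K)) = (20, 6)$, where both derivatives vanish.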

Then we define the state space and initial conditions so that we can solve the system of differential equations. The result is animated below. (The code for some of the graphical bells and whistles is omitted for the sake of exposition.)
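A minimal solve might look like the following; the initial stock and fleet values are assumptions, with only the 0.1 step size taken from the text:

```python
import numpy as np
from scipy.integrate import odeint

a, b, c, d, K = 1.0, 0.1, 1.5, 0.075, 50.0  # hypothetical constants

def deriv(state, t):
    x, y = state
    return [a * x * (1 - x / K) - b * x * y, -c * y + d * x * y]

t = np.arange(0.0, 100.0, 0.1)   # time grid with step size 0.1
state0 = [10.0, 5.0]             # hypothetical initial fish and boats
sol = odeint(deriv, state0, t)   # sol[:, 0] = fish, sol[:, 1] = boats
```

By the end of the run, the trajectory has spiraled in essentially all the way to the equilibrium.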

The red, dashed lines indicate the nullclines, found by setting each time derivative to zero. These lines partition the phase space in the top graph, and they intersect at the equilibrium levels of fish and boats.

It is easy to break this result by messing with the solver parameters or the size of the time steps (relative to the total time), demonstrating the fragility of the result for real-world applications. If, for example, we increase the step size from 0.1 to 5, we lose most of the dynamics that characterize the system. The same goes for fiddling with the iteration parameters of the ODE solver.
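One caveat worth hedging: `odeint` chooses its internal steps adaptively, so a coarse output grid mostly under-samples the result rather than degrading the integration itself; still, with a step of 5 against an oscillation period of a few time units, the plotted dynamics alias away. A sketch, using the same hypothetical system and constants as above:

```python
import numpy as np
from scipy.integrate import odeint

a, b, c, d, K = 1.0, 0.1, 1.5, 0.075, 50.0  # hypothetical constants

def deriv(state, t):
    x, y = state
    return [a * x * (1 - x / K) - b * x * y, -c * y + d * x * y]

state0 = [10.0, 5.0]
# Fine grid resolves the spiral; the coarse grid gives roughly one
# sample per oscillation cycle, so the dynamics vanish from the plot.
sol_fine = odeint(deriv, state0, np.arange(0.0, 100.0, 0.1))
sol_coarse = odeint(deriv, state0, np.arange(0.0, 100.0, 5.0))
```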

Suppose we wanted to figure out the behavior of this system near the equilibrium before running the numerical solver. First, we linearize the system near the equilibrium, which yields the Jacobian. Let $f(x, y) = dx/dt$ and $g(x, y) = dy/dt$; then the nonlinear system can be approximated by the following linear system near the equilibrium $(\bar{x}, \bar{y})$:
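The rendered equation doesn't survive in this copy; in standard form, the first-order Taylor expansion around $(\bar{x}, \bar{y})$ is:

```latex
\begin{aligned}
\frac{dx}{dt} &\approx f(\bar{x},\bar{y}) + f_x(\bar{x},\bar{y})\,(x-\bar{x}) + f_y(\bar{x},\bar{y})\,(y-\bar{y}), \\
\frac{dy}{dt} &\approx g(\bar{x},\bar{y}) + g_x(\bar{x},\bar{y})\,(x-\bar{x}) + g_y(\bar{x},\bar{y})\,(y-\bar{y}),
\end{aligned}
```

where subscripts denote partial derivatives evaluated at the equilibrium.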

Noting that $f(\bar{x}, \bar{y}) = g(\bar{x}, \bar{y}) = 0$ by definition of the equilibrium, the nonlinear system is approximated by the linear system defined by the Jacobian, evaluated at the equilibrium:
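That is, writing deviations from the equilibrium as a vector:

```latex
\frac{d}{dt}
\begin{bmatrix} x-\bar{x} \\ y-\bar{y} \end{bmatrix}
\approx
J\,
\begin{bmatrix} x-\bar{x} \\ y-\bar{y} \end{bmatrix},
\qquad
J =
\begin{bmatrix} f_x & f_y \\ g_x & g_y \end{bmatrix}_{(\bar{x},\,\bar{y})}.
```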

Then the eigenvalues $\lambda$ are given by:
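For a 2×2 Jacobian this is the usual trace and determinant formula:

```latex
\lambda = \frac{\operatorname{tr} J \pm \sqrt{(\operatorname{tr} J)^2 - 4\det J}}{2}.
```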

The real part is negative and the imaginary component is nonzero, so the system oscillates around the equilibrium while tending inward. This behavior is reflected in the animation. I guess we didn't really have to go through all this work; but whatever, it's useful for other problems.

## Schaefer model

Bjorndal and Conrad (1987) modelled the open-access exploitation of North Sea herring from 1963 to 1977. Their model is similar to the one above, but slightly more complicated. Let the fish stock ($x$) and fishing effort ($y$) be modelled by the following system:

where $k$ is a catchability constant, $g$ is the intrinsic growth rate of the fish stock, $K$ is the carrying capacity, $p$ is the fish price, and $c$ is the marginal cost of one unit of effort. Then, through the same process as above, we find that the equilibrium point (at the intersection of the nullclines) is:
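The rendered system and equilibrium don't survive in this copy. A standard continuous-time open-access Schaefer specification consistent with the parameters just listed (my reconstruction, not necessarily Bjorndal and Conrad's exact functional form) is:

```latex
\begin{aligned}
\frac{dx}{dt} &= g\,x\left(1-\frac{x}{K}\right) - k\,x\,y, \\
\frac{dy}{dt} &= \left(p\,k\,x - c\right) y,
\end{aligned}
\qquad\Longrightarrow\qquad
(\bar{x},\,\bar{y}) = \left(\frac{c}{p k},\; \frac{g}{k}\left(1 - \frac{c}{p k K}\right)\right),
```

where effort grows when revenue per unit of effort, $p k x$, exceeds its marginal cost $c$, and the equilibrium follows from setting both derivatives to zero.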

Using the constants in Bjorndal and Conrad (1987) we model the system similarly:

The Jacobian for this system evaluated at the equilibrium is:

I don’t want to solve this by hand, so I plug it into Python.
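A sketch of that computation; the matrix entries below are placeholders, not the actual evaluated Jacobian:

```python
import numpy as np

# Placeholder Jacobian entries -- substitute the values evaluated
# at the equilibrium before trusting the output.
J = np.array([[-0.5, -2.0],
              [ 3.0,  0.0]])

eigvals = np.linalg.eigvals(J)
print(eigvals)  # a complex pair with negative real part means an inward spiral
```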

The eigenvalues are $\lambda = -0.0031 \pm 41.0182i$. Once again, the math confirms the behavior seen in the numerical approximation: the system oscillates and tends inward.

Shit can get weird. The numerical approximations, while very good in Python, can misrepresent the system of equations under certain parameters. Specifically, the solver approximates the continuous system through an iterative process, computing the local change over each interval. Certain step sizes and tolerances will send Python or MATLAB into a tailspin. That said, the checks and balances within `odeint` are quite good, so it's much easier to break the numerical approximation by writing the iteration explicitly in a for-loop.
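To see how much heavier the lifting is inside `odeint`, here is a bare forward-Euler loop (my sketch, usable with the same hypothetical system as earlier); with a small step it tracks the dynamics, while a large step distorts or destroys them:

```python
import numpy as np

def euler_path(f, state0, t):
    """Fixed-step forward Euler over the time grid t -- no error control at all."""
    out = np.empty((len(t), len(state0)))
    out[0] = state0
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        # One linear step per interval: no adaptivity, no tolerance checks.
        out[i] = out[i - 1] + dt * np.asarray(f(out[i - 1], t[i - 1]))
    return out
```

With a step like 0.1 this roughly follows the spiral; push the step toward 5 and each update overshoots a whole turn of the cycle, which is exactly the tailspin described above.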