A nerve cell is an example of a system whose excitation encodes and transmits dynamic information. When a stimulus exceeds a given threshold, the neuron responds by generating an action potential, so that over time it fires irregular sequences of all-or-nothing spike responses. The ionic mechanisms that generate the spike are well understood, but there are many candidate mechanisms for coding information in the spike train, and our knowledge of them ranges from well established to speculative. In Physical Review Letters, Chacron et al.1 show, using a simple action-potential model, that correlations between successive interspike intervals can shape the noise spectrum of the spike train, and that this shaping can increase the transmission of information because it reduces the noise at low frequencies.

The usual treatment of information transmission is essentially linear2: the response is taken to be directly proportional to the stimulus that drives it. So it might be expected that the constraints imposed by interval correlations would reduce the transmission of information. However, spike generation is a strongly nonlinear, excitable process with a threshold, and such systems can behave counterintuitively. A good example is stochastic resonance3, in which additive noise can increase, rather than decrease, the efficiency of information transmission.

A spike train can be represented by the sequence of intervals between spikes, and is characterized by the interval statistics: in the time domain by probability distributions and correlations, and in the frequency domain by spectral densities. Chacron et al.1 consider two schemes of spike generation. The first produces a ‘renewal process’ that has no memory of its past activity, because the system resets itself completely each time a spike is generated; there is therefore no correlation between successive spike intervals. The probability distributions of higher-order intervals (say, from one spike to the third spike after it) and the ‘autocorrelation’ (the probability of a spike occurring at a given time after some other spike, irrespective of how many spikes intervene) can then be calculated directly from the interspike-interval probability density.
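To make this concrete, here is a minimal Python sketch; the gamma-like interval density and all numerical values are assumptions chosen purely for illustration, not taken from Chacron et al.1. Because the intervals of a renewal process are independent, the density of the nth-order interval is simply the n-fold convolution of the interspike-interval density, and the autocorrelation described above is the sum of those densities over all orders n.

```python
import numpy as np

# Illustrative interspike-interval (ISI) density for a renewal process;
# the gamma-like shape and the numbers are assumptions made for this sketch only.
dt = 0.001                                  # time step
t = np.arange(0.0, 2.0, dt)                 # time axis
isi_density = t * np.exp(-t / 0.05)
isi_density /= isi_density.sum() * dt       # normalize to unit area

def nth_order_interval_density(p1, n, dt):
    """Density of the interval from a spike to the nth spike after it.

    In a renewal process successive intervals are independent, so this is
    simply the n-fold convolution of the first-order (interspike) density.
    """
    p = p1.copy()
    for _ in range(n - 1):
        p = np.convolve(p, p1)[:len(p1)] * dt
    return p

# For example, the density of the interval to the third following spike:
p3 = nth_order_interval_density(isi_density, 3, dt)
```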

The second scheme generates a non-renewal spike train, with correlations between adjacent intervals. The spike trains of sensory neurons4 under constant stimulation often show such dependencies between neighbouring intervals, produced either by fluctuations in an oscillatory spike generator or by activity-dependent changes in threshold. When long intervals tend to be followed by short ones (and vice versa), the interval correlations are negative.
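The strength of such dependencies is usually summarized by a serial correlation coefficient between intervals a fixed number of spikes apart. A minimal check in Python, with an invented interval sequence used purely for illustration, shows that a train in which long and short intervals tend to alternate has a negative coefficient at lag 1.

```python
import numpy as np

def serial_correlation(intervals, lag=1):
    """Correlation coefficient between each interspike interval and the
    interval `lag` spikes later."""
    T = np.asarray(intervals, dtype=float)
    return np.corrcoef(T[:-lag], T[lag:])[0, 1]

# Invented example: intervals that alternate long/short (plus a little jitter),
# mimicking a non-renewal train with negative adjacent-interval correlation.
rng = np.random.default_rng(0)
n = 10_000
toy = 1.0 + 0.4 * (-1.0) ** np.arange(n) + 0.1 * rng.standard_normal(n)
print(serial_correlation(toy, lag=1))   # strongly negative (about -0.9 here)
```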

The essence of spike generation is that inputs too weak to trigger a response are summed, giving a potential; when the potential reaches a threshold, a spike is generated. Chacron et al.1 use a ‘perfect integrator model’5 for both spike-generation schemes, with a random threshold drawn from a uniform distribution. When the threshold is reached the potential is reset, either by an amount dependent on the magnitude of the threshold just reached (to produce serial correlation), or by a random amount (to generate a spike train with the same interspike-interval density but no correlation).
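As a rough illustration of how the two reset rules differ, here is a minimal Python sketch of a perfect integrator with a uniformly distributed random threshold. The drift, threshold range and reset offset are assumed values chosen for convenience, and the reset rules are one reading of the description above rather than the exact prescription of Chacron et al.1. Resetting relative to the threshold just reached lets each interval inherit memory of the previous one, giving a lag-1 interval correlation near -0.5; resetting by an independent random amount gives the same interval density with no correlation.

```python
import numpy as np

rng = np.random.default_rng(1)

MU = 1.0                 # constant drift of the perfect integrator (assumed)
LO, HI = 0.8, 1.2        # uniform threshold range (assumed)
OFFSET = HI - LO         # the potential is reset this far below a threshold value

def intervals(correlated, n=100_000):
    """Interspike intervals of a perfect integrator with random thresholds.

    With constant drift MU, the time to climb from the reset level v0 to the
    threshold theta is simply (theta - v0) / MU, so no time stepping is needed.
    correlated=True : reset relative to the threshold just reached, so adjacent
                      intervals share that threshold and become anticorrelated.
    correlated=False: reset relative to an independent random draw, giving the
                      same interval density but no memory (a renewal process).
    """
    theta = rng.uniform(LO, HI, n)
    prev = theta[:-1] if correlated else rng.uniform(LO, HI, n - 1)
    v0 = np.concatenate(([LO], prev)) - OFFSET
    return (theta - v0) / MU

for name, corr in [("non-renewal", True), ("renewal", False)]:
    T = intervals(corr)
    rho1 = np.corrcoef(T[:-1], T[1:])[0, 1]
    print(f"{name:12s}  mean ISI = {T.mean():.3f}   lag-1 correlation = {rho1:+.2f}")
```

In the correlated rule, a high threshold (a long interval) leaves the reset potential closer to the next threshold, so the next interval tends to be short; this is exactly the long-short alternation described above.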

The advantage of using this simple model is that the spectra, coherence and information transmission rates (mutual information rates) can all be calculated analytically, as well as estimated through computer simulation. Chacron et al. demonstrate that negative serial correlation between intervals reduces the spike-train spectrum and the coherence at very low frequencies, although at middle-range frequencies the coherence is increased. The serial correlation also enhances the ability of the model to transmit information, by reducing low-frequency noise compared with the renewal process. Such effects have been seen in more realistic neural models6, and can be quantified in real neural spike trains; but the complexity of both of those situations meant that the mechanism behind the improved information transfer remained unclear.
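The link from noise to information is that the coherence-based lower bound on the mutual information rate commonly used in such calculations, the integral over frequency of -log2[1 - C(f)], grows as the coherence C(f) approaches one; suppressing noise at frequencies where the signal resides therefore raises the bound. The noise-shaping part can be seen even in the toy integrator above. The following self-contained Python sketch (repeating the interval generator, with the same assumed parameter values) bins each spike train and estimates its power spectrum with a simple periodogram; the negatively correlated train shows substantially less power well below the mean firing rate.

```python
import numpy as np

rng = np.random.default_rng(1)
LO, HI, MU = 0.8, 1.2, 1.0        # assumed threshold range and drift, as above

def intervals(correlated, n=200_000):
    """Perfect integrator with uniform random thresholds (see sketch above)."""
    theta = rng.uniform(LO, HI, n)
    prev = theta[:-1] if correlated else rng.uniform(LO, HI, n - 1)
    v0 = np.concatenate(([LO], prev)) - (HI - LO)
    return (theta - v0) / MU

def spike_spectrum(isi, dt=0.01, n_bins=2**17):
    """Crude periodogram of the spike train built from the given intervals."""
    times = np.cumsum(isi)
    duration = n_bins * dt
    times = times[times < duration]
    counts, _ = np.histogram(times, bins=n_bins, range=(0.0, duration))
    x = counts / dt - times.size / duration        # remove the mean firing rate
    spectrum = np.abs(np.fft.rfft(x)) ** 2 * dt / n_bins
    return np.fft.rfftfreq(n_bins, dt), spectrum

for name, corr in [("non-renewal", True), ("renewal", False)]:
    f, S = spike_spectrum(intervals(corr))
    low = (f > 0) & (f < 0.25)                     # well below the ~2.5 spikes/unit-time rate
    print(f"{name:12s}  mean low-frequency spectral power = {S[low].mean():.3f}")
# The non-renewal (negatively correlated) train has much less low-frequency
# noise power: the shaping that underlies the improved information transfer.
```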

Whether or not this increase in the information transmission rate is exploited in neural systems is an open question, as biological evolution produces systems that work well enough and are robust, without necessarily being optimally efficient. The nervous system responds to a spike train in real time and does not process it as an indefinite sequence. But the effect of correlation on the transmission rate in a single spike train might transfer to correlations between multiple spike trains: variability between different neurons could be correlated, through common inputs and feedback7, and coupling within a population could lead to a similar reduction in variability by noise shaping8.