"""Variability metrics for spike trains.

Unless noted otherwise, functions operate on canonical `spiketimes` arrays with
times in row `0` and trial or unit indices in row `1`.

Examples
--------
Shared example setup used throughout the documentation:

```python
import numpy as np

from spiketools import gamma_spikes
from spiketools.variability import cv2, cv_two, ff

np.random.seed(0)
rates = np.array([6.0] * 10 + [5.6, 6.3, 5.9, 6.5, 5.8, 6.1, 5.7, 6.4, 6.0, 5.5], dtype=float)
orders = np.array([0.2] * 10 + [1.0, 2.0, 2.0, 3.0, 1.0, 2.0, 3.0, 2.0, 1.0, 3.0], dtype=float)
spiketimes = gamma_spikes(rates=rates, order=orders, tlim=[0.0, 5000.0], dt=1.0)
```

Use one trial from the shared dataset for interval variability. Here `trial_id = 13`
has gamma `order = 3`, so it should be more regular than a Poisson-like trial and
we therefore expect `cv2 < 1`.

```python
trial_id = 13
trial = spiketimes[:, spiketimes[1] == trial_id].copy()
trial[1] = 0

print(f"Trial {trial_id} uses gamma order {orders[trial_id]}; expect cv2 < 1.")
print(round(float(cv2(trial)), 3))
print(round(float(cv_two(trial, min_vals=2)), 3))
```

Example output:

```text
Trial 13 uses gamma order 3.0; expect cv2 < 1.
0.292
0.536
```
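
These interval statistics can also be cross-checked by hand with plain NumPy.
The spike train below is artificial (it is not drawn from the simulated dataset
above) and exists only to make the arithmetic transparent:

```python
import numpy as np

# Artificial spike train with inter-spike intervals of 2, 3, and 4 ms.
spikes = np.array([0.0, 2.0, 5.0, 9.0])
isis = np.diff(spikes)

# Local Cv2: mean of 2|I_{n+1} - I_n| / (I_{n+1} + I_n) over neighbouring pairs.
cv_two_manual = np.mean(2 * np.abs(isis[1:] - isis[:-1]) / (isis[1:] + isis[:-1]))

# Shinomoto Lv: mean of 3(I_{n+1} - I_n)^2 / (I_{n+1} + I_n)^2 over the same pairs.
lv_manual = np.mean(3 * (isis[1:] - isis[:-1]) ** 2 / (isis[1:] + isis[:-1]) ** 2)

print(round(float(cv_two_manual), 3))  # 0.343
print(round(float(lv_manual), 3))  # 0.091
```

These values match `cv_two(..., min_vals=2)` and `lv(..., min_vals=2)` applied
to the same four spikes.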

For the Fano factor, use only trials `0-9`. These have the same rate but a
bursty gamma `order = 0.2`, so we expect `ff(...) > 1`:

```python
ff_trials = spiketimes[:, spiketimes[1] < 10].copy()

print("Trials 0-9 use rate 6 spikes/s and gamma order 0.2; expect ff > 1.")
print(round(float(ff(ff_trials, tlim=[0.0, 5000.0])), 3))
```

Example output:

```text
Trials 0-9 use rate 6 spikes/s and gamma order 0.2; expect ff > 1.
2.891
```
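
The Fano factor itself is simple enough to verify by hand. The counts below are
hypothetical, chosen only for easy arithmetic; `ff` computes the same
variance-to-mean ratio (population variance, `ddof=0`) from binned spike counts:

```python
import numpy as np

# Hypothetical per-trial spike counts (not taken from the dataset above).
counts = np.array([4.0, 9.0, 1.0, 6.0])

# Fano factor: population variance over mean; equals 1 for a Poisson process.
fano = counts.var() / counts.mean()
print(round(float(fano), 2))  # 1.7
```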
"""

from __future__ import annotations

from bisect import bisect_right

import numpy as np
import pylab

try:
    from Cspiketools import time_resolved_cv_two as _time_resolved_cv_two  # type: ignore
except ModuleNotFoundError:  # pragma: no cover - optional acceleration
    def _time_resolved_cv_two(*args, **kwargs):
        raise ModuleNotFoundError(
            "The optional C extension 'Cspiketools' is not available. "
            "Compile it via 'python src/setupCspiketools.py build_ext --inplace'."
        )

from .conversion import (
    cut_spiketimes,
    get_time_limits,
    spiketimes_to_binary,
    spiketimes_to_list,
)
from .rate import gaussian_kernel, kernel_rate, rate_integral, sliding_counts
from .transforms import time_warp

__all__ = [
    "cv2",
    "cv_two",
    "ff",
    "lv",
    "kernel_fano",
    "time_resolved_cv2",
    "time_resolved_cv_two",
    "time_warped_cv2",
]


def ff(spiketimes, mintrials=None, tlim=None):
    r"""Compute the Fano factor across trials or units.

    Parameters
    ----------
    spiketimes:
        Canonical spike representation.
    mintrials:
        Optional minimum number of rows required for a valid estimate.
    tlim:
        Optional `[tmin, tmax]` counting window in ms.

    Definition
    ----------
    If `N` is the spike count in the analysis window, the Fano factor is

    $$
    \mathrm{FF} = \frac{\mathrm{Var}[N]}{\mathrm{E}[N]}
    $$

    Examples
    --------
    >>> float(ff(np.array([[0.0, 1.0, 0.0, 1.0], [0.0, 0.0, 1.0, 1.0]]), tlim=[0.0, 2.0]))
    0.0
    """
    if tlim is None:
        tlim = get_time_limits(spiketimes)
    dt = tlim[1] - tlim[0]
    binary, _ = spiketimes_to_binary(spiketimes, tlim=tlim, dt=dt)
    counts = binary.sum(axis=1)
    if mintrials is not None and len(counts) < mintrials:
        return pylab.nan
    if (counts == 0).all():
        return pylab.nan
    return counts.var() / counts.mean()


def kernel_fano(spiketimes, window, dt=1.0, tlim=None, components=False):
    """Estimate a sliding-window Fano factor from binned spike counts.

    Parameters
    ----------
    spiketimes:
        Canonical spike representation.
    window:
        Counting window in ms.
    dt:
        Bin width in ms.
    tlim:
        Optional analysis interval in ms.
    components:
        If `True`, return variance and mean separately instead of their ratio.

    Examples
    --------
    >>> fano, time = kernel_fano(
    ...     np.array([[0.0, 2.0, 0.0, 2.0], [0.0, 0.0, 1.0, 1.0]]),
    ...     window=2.0,
    ...     dt=1.0,
    ...     tlim=[0.0, 4.0],
    ... )
    >>> fano.shape, time.shape
    ((3,), (3,))
    """
    if tlim is None:
        tlim = get_time_limits(spiketimes)
    if tlim[1] - tlim[0] == window:
        binary, time = spiketimes_to_binary(spiketimes, tlim=tlim)
        counts = binary.sum(axis=1)
        # Wrap in a list so both branches return 1-d arrays.
        return pylab.array([counts.var() / counts.mean()]), pylab.array([time.mean()])

    counts, time = sliding_counts(spiketimes, window, dt, tlim)

    vars_ = counts.var(axis=0)
    means = counts.mean(axis=0)
    if components:
        return vars_, means, time
    fano = pylab.zeros_like(vars_) * pylab.nan
    fano[means > 0] = vars_[means > 0] / means[means > 0]
    return fano, time


def cv2(
    spiketimes,
    pool=True,
    return_all=False,
    bessel_correction=False,
    minvals=0,
):
    r"""Compute the coefficient-of-variation statistic from inter-spike intervals.

    Parameters
    ----------
    spiketimes:
        Canonical spike representation.
    pool:
        If `True`, pool intervals across trials or units before computing the
        statistic. Otherwise compute per row and average.
    return_all:
        If `True`, return per-row values instead of their mean.
    bessel_correction:
        Use `ddof=1` in the interval variance.
    minvals:
        Minimum number of finite intervals required for a per-row value.

    Definition
    ----------
    Despite the historical function name, this returns the squared coefficient
    of variation of the inter-spike intervals $I_n$:

    $$
    \mathrm{CV}^2 = \frac{\mathrm{Var}[I_n]}{\mathrm{E}[I_n]^2}
    $$

    Examples
    --------
    >>> round(float(cv2(np.array([[0.0, 2.0, 5.0], [0.0, 0.0, 0.0]]))), 3)
    0.04
    """
    if spiketimes.shape[1] < 3:
        if return_all:
            return pylab.array([pylab.nan])
        return pylab.nan
    ddof = 1 if bessel_correction else 0
    spikelist = spiketimes_to_list(spiketimes)
    maxlen = max(len(sl) for sl in spikelist)
    if maxlen < 3:
        if return_all:
            return pylab.array([pylab.nan])
        return pylab.nan
    spikearray = pylab.zeros((len(spikelist), maxlen)) * pylab.nan
    spike_counts = []
    for i, sl in enumerate(spikelist):
        spikearray[i, : len(sl)] = sl
        spike_counts.append(len(sl))
    spike_counts = pylab.array(spike_counts)
    spikearray = spikearray[spike_counts > 2]
    intervals = pylab.diff(spikearray, axis=1)

    if pool:
        var = pylab.nanvar(intervals, ddof=ddof)
        mean_squared = pylab.nanmean(intervals) ** 2
        return var / mean_squared

    intervals[pylab.isfinite(intervals).sum(axis=1) < minvals, :] = pylab.nan
    var = pylab.nanvar(intervals, axis=1, ddof=ddof)
    mean_squared = pylab.nanmean(intervals, axis=1) ** 2
    if return_all:
        return var / mean_squared
    return pylab.nanmean(var / mean_squared)


def _consecutive_intervals(spiketimes):
    """Return pairs of neighbouring inter-spike intervals as an `(n, 2)` array.

    Spikes are sorted by time within each trial; interval pairs that straddle
    a trial boundary or contain non-finite values are dropped.
    """
    order = pylab.argsort(spiketimes[0])
    sorted_spiketimes = spiketimes[:, order]
    order = pylab.argsort(sorted_spiketimes[1], kind="mergesort")
    sorted_spiketimes = sorted_spiketimes[:, order]
    isis = pylab.diff(sorted_spiketimes[0])
    trial = pylab.diff(sorted_spiketimes[1])
    isis[trial != 0] = pylab.nan
    if len(isis) < 2:
        return pylab.zeros((0, 2))
    consecutive_isis = pylab.column_stack((isis[:-1], isis[1:]))
    return consecutive_isis[pylab.isfinite(consecutive_isis.sum(axis=1))]


def cv_two(spiketimes, min_vals=20):
    r"""Compute the local Cv2 measure from consecutive inter-spike intervals.

    Definition
    ----------
    For consecutive inter-spike intervals $I_n$ and $I_{n+1}$, the local Cv2
    statistic is

    $$
    \mathrm{Cv2} =
    \left\langle
    \frac{2\,|I_{n+1} - I_n|}{I_{n+1} + I_n}
    \right\rangle_n
    $$

    where the average is taken across all finite neighboring interval pairs.

    Examples
    --------
    >>> round(float(cv_two(np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]), min_vals=2)), 3)
    0.343
    """
    consecutive_isis = _consecutive_intervals(spiketimes)
    ms = (
        2
        * pylab.absolute(consecutive_isis[:, 0] - consecutive_isis[:, 1])
        / (consecutive_isis[:, 0] + consecutive_isis[:, 1])
    )
    if len(ms) >= min_vals:
        return pylab.mean(ms)
    return pylab.nan


def lv(spiketimes, min_vals=20):
    r"""Compute the Shinomoto local variation metric.

    Definition
    ----------
    Using the same consecutive inter-spike interval notation as `cv_two(...)`,
    the local variation is

    $$
    \mathrm{LV} =
    \left\langle
    \frac{3\,(I_{n+1} - I_n)^2}{(I_{n+1} + I_n)^2}
    \right\rangle_n
    $$

    Unlike $\mathrm{CV}^2$, this statistic is local in time and is therefore
    less sensitive to slow rate drifts.

    Examples
    --------
    >>> round(float(lv(np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]), min_vals=2)), 3)
    0.091
    """
    consecutive_isis = _consecutive_intervals(spiketimes)
    ms = (
        3
        * (consecutive_isis[:, 0] - consecutive_isis[:, 1]) ** 2
        / (consecutive_isis[:, 0] + consecutive_isis[:, 1]) ** 2
    )
    if len(ms) >= min_vals:
        return pylab.mean(ms)
    return pylab.nan


def time_resolved_cv2(
    spiketimes,
    window=None,
    ot=5.0,
    tlim=None,
    pool=False,
    tstep=1.0,
    bessel_correction=False,
    minvals=0,
    return_all=False,
):
    r"""Estimate Cv2 in sliding windows over time.

    Parameters
    ----------
    spiketimes:
        Canonical spike representation.
    window:
        Window width in ms. If `None`, it is inferred from the mean ISI.
    ot:
        Multiplicative factor used when inferring `window`.
    tlim:
        Optional analysis interval in ms.
    pool:
        Whether to pool intervals across trials or units.
    tstep:
        Window step size in ms.
    bessel_correction:
        Use `ddof=1` in the interval variance.
    minvals:
        Minimum number of intervals per row.
    return_all:
        Return per-row values for each window instead of a single mean.

    Definition
    ----------
    For a window of width $W$ starting at $t$, this function computes

    $$
    \mathrm{CV}^2(t) =
    \mathrm{CV}^2\left(\{I_n : \text{both spikes of } I_n \in [t, t + W)\}\right)
    $$

    and reports it at the window center. If `window=None`, the code first
    estimates a characteristic window from the mean inter-spike interval and
    the factor `ot`.

    Examples
    --------
    >>> values, time = time_resolved_cv2(
    ...     np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]),
    ...     window=5.0,
    ...     tlim=[0.0, 10.0],
    ...     tstep=2.0,
    ... )
    >>> values.shape, time.shape
    ((3,), (3,))
    """
    if tlim is None:
        tlim = get_time_limits(spiketimes)

    if window is None:
        spikelist = spiketimes_to_list(spiketimes)
        isis = [pylab.diff(spikes) for spikes in spikelist]
        # Guard against empty rows, whose mean would be nan (with a warning).
        meanisi = [isi.mean() for isi in isis if len(isi) > 0 and not pylab.isnan(isi.mean())]
        if meanisi:
            window = sum(meanisi) / float(len(meanisi)) * ot
        else:
            window = tlim[1] - tlim[0]

    time = []
    cvs = []
    tmin = tlim[0]
    tmax = tlim[0] + window
    order = pylab.argsort(spiketimes[0])
    ordered_spiketimes = spiketimes[:, order]
    while tmax < tlim[1]:
        start_ind = bisect_right(ordered_spiketimes[0], tmin)
        end_ind = bisect_right(ordered_spiketimes[0], tmax)
        window_spikes = ordered_spiketimes[:, start_ind:end_ind]
        cvs.append(
            cv2(
                window_spikes,
                pool,
                bessel_correction=bessel_correction,
                minvals=minvals,
                return_all=return_all,
            )
        )
        time.append(0.5 * (tmin + tmax))
        tmin += tstep
        tmax += tstep
    if return_all:
        if len(cvs) > 0:
            maxlen = max(len(cv) for cv in cvs)
            cvs = [cv.tolist() + [pylab.nan] * (maxlen - len(cv)) for cv in cvs]
        else:
            cvs = [[]]
    return pylab.array(cvs), pylab.array(time)


def time_warped_cv2(
    spiketimes,
    window=None,
    ot=5.0,
    tstep=1.0,
    tlim=None,
    rate=None,
    kernel_width=50.0,
    nstd=3.0,
    pool=True,
    dt=1.0,
    inspection_plots=False,
    kernel_type="gaussian",
    bessel_correction=False,
    interpolate=False,
    minvals=0,
    return_all=False,
):
    """Estimate Cv2 after compensating for slow rate fluctuations by time warping.

    Parameters
    ----------
    spiketimes:
        Canonical spike representation.
    window:
        Window width in warped time. If `None`, infer from the mean ISI.
    ot:
        Multiplicative factor used for automatic window selection.
    tstep:
        Window step size in ms.
    tlim:
        Optional `[tmin, tmax]` analysis interval.
    rate:
        Optional precomputed `(rate, time)` tuple.
    kernel_width:
        Gaussian kernel width in ms when `rate` is not provided.
    nstd:
        Unused; kept for API compatibility.
    pool:
        Whether to pool intervals across rows in the Cv2 estimate.
    dt:
        Rate-estimation sampling interval in ms.
    inspection_plots:
        Plot the forward and backward time-warp mappings for debugging.
    kernel_type:
        Unused; kept for API compatibility.
    bessel_correction:
        Use `ddof=1` in the interval variance.
    interpolate:
        If `True`, interpolate the output back onto a regular time grid.
    minvals:
        Minimum number of intervals per row.
    return_all:
        Return per-row values for each window instead of a single mean.

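    Notes
    -----
    The core of the method is the warp itself: integrate the estimated rate,
    rescale the integral onto the original time span, and interpolate spike
    times along it. A minimal NumPy sketch of that step, using a hypothetical
    sinusoidal rate profile and approximating `rate_integral` and `time_warp`
    with `cumsum` and `interp`:

    ```python
    import numpy as np

    # Hypothetical rate profile sampled every 1 ms (illustration only).
    t = np.arange(0.0, 100.0, 1.0)
    rate = 5.0 + 4.0 * np.sin(2.0 * np.pi * t / 100.0)

    # Cumulative rate, rescaled onto the original time span: the warped axis.
    tdash = np.cumsum(rate)
    tdash = (tdash - tdash.min()) / (tdash.max() - tdash.min())
    tdash = tdash * (t.max() - t.min()) + t.min()

    # Map spikes into warped time: high-rate epochs are stretched and
    # low-rate epochs compressed, flattening slow rate fluctuations.
    spikes = np.array([10.0, 30.0, 55.0, 80.0])
    warped = np.interp(spikes, t, tdash)
    ```
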
    Examples
    --------
    >>> values, time = time_warped_cv2(
    ...     np.array([[0.0, 3.0, 7.0, 12.0], [0.0, 0.0, 0.0, 0.0]]),
    ...     window=3.0,
    ...     tlim=[0.0, 13.0],
    ...     pool=True,
    ...     dt=1.0,
    ... )
    >>> time.ndim
    1
    """
    del nstd  # Unused but kept for API compatibility.
    del kernel_type  # Unused but kept for API compatibility.

    if tlim is None:
        tlim = get_time_limits(spiketimes)

    if rate is None:
        kernel = gaussian_kernel(kernel_width, dt)
        rate, trate = kernel_rate(spiketimes, kernel, tlim=tlim, dt=dt)
    else:
        trate = rate[1]
        rate = rate[0]

    rate = rate.flatten()
    rate[rate == 0] = 1e-4
    notnaninds = ~pylab.isnan(rate)
    rate = rate[notnaninds]
    trate = trate[notnaninds]

    tdash = rate_integral(rate, dt)
    tdash -= tdash.min()
    tdash /= tdash.max()
    tdash *= trate.max() - trate.min()
    tdash += trate.min()

    transformed_spikes = spiketimes.copy().astype(float)
    transformed_spikes[0, :] = time_warp(transformed_spikes[0, :], trate, tdash)

    if inspection_plots:
        pylab.figure()
        pylab.subplot(1, 2, 1)
        stepsize = max(1, int(spiketimes.shape[1] / 100))
        for i in list(range(0, spiketimes.shape[1], stepsize)) + [-1]:
            if pylab.isnan(transformed_spikes[0, i]) or pylab.isnan(spiketimes[0, i]):
                continue
            idx_real = int(np.argmin(np.abs(trate - spiketimes[0, i])))
            idx_warp = int(np.argmin(np.abs(tdash - transformed_spikes[0, i])))
            pylab.plot(
                [spiketimes[0, i]] * 2,
                [0, tdash[idx_real]],
                "--k",
                linewidth=0.5,
            )
            pylab.plot(
                [0, trate[idx_warp]],
                [transformed_spikes[0, i]] * 2,
                "--k",
                linewidth=0.5,
            )
        pylab.plot(trate, tdash, linewidth=2.0)
        pylab.xlabel("real time")
        pylab.ylabel("transformed time")
        pylab.title("transformation of spiketimes")

    cv2_vals, tcv2 = time_resolved_cv2(
        transformed_spikes,
        window,
        ot,
        None,
        pool,
        tstep,
        bessel_correction=bessel_correction,
        minvals=minvals,
        return_all=return_all,
    )
    if inspection_plots:
        pylab.subplot(1, 2, 2)
        stepsize = max(1, int(len(tcv2) / 50))
        tcv22 = time_warp(tcv2, tdash, trate)
        for i in list(range(0, len(tcv2), stepsize)) + [-1]:
            idx_warp = int(np.argmin(np.abs(tdash - tcv2[i])))
            idx_real = int(np.argmin(np.abs(trate - tcv22[i])))
            pylab.plot(
                [tcv2[i]] * 2,
                [0, trate[idx_warp]],
                "--k",
                linewidth=0.5,
            )
            pylab.plot(
                [0, tdash[idx_real]],
                [tcv22[i]] * 2,
                "--k",
                linewidth=0.5,
            )
        pylab.plot(tdash, trate, linewidth=2)
        pylab.title("re-transformation of cv2")
        pylab.xlabel("transformed time")
        pylab.ylabel("real time")

    tcv2 = time_warp(tcv2, tdash, trate)

    if interpolate:
        time = pylab.arange(tlim[0], tlim[1], dt)
        if len(cv2_vals) == 0 or (return_all and cv2_vals.shape[1] == 0):
            cv2_vals = pylab.zeros_like(time) * pylab.nan
        else:
            if return_all:
                cv2_vals = pylab.array(
                    [
                        pylab.interp(time, tcv2, cv2_vals[:, i], left=pylab.nan, right=pylab.nan)
                        for i in range(cv2_vals.shape[1])
                    ]
                )
            else:
                cv2_vals = pylab.interp(time, tcv2, cv2_vals, left=pylab.nan, right=pylab.nan)
        return cv2_vals, time
    return cv2_vals, tcv2


def time_resolved_cv_two(spiketimes, window=400, tlim=None, min_vals=10, tstep=1):
    r"""Compute rolling Cv2 using the optional C extension.

    Definition
    ----------
    This is the compiled analogue of applying `cv_two(...)` in successive
    windows. For each window $[t, t + W)$ it collects all finite neighboring
    interval pairs $(I_n, I_{n+1})$ whose spikes lie inside the window and
    evaluates

    $$
    \mathrm{Cv2}(t) =
    \left\langle
    \frac{2\,|I_{n+1} - I_n|}{I_{n+1} + I_n}
    \right\rangle_n
    $$

    if at least `min_vals` pairs are available. The window is then shifted by
    `tstep`.

    Notes
    -----
    This function requires the optional `Cspiketools` extension module.

    Returns
    -------
    values, time:
        Same window semantics as `time_resolved_cv2(...)` when the compiled
        extension is available.
    """
    return _time_resolved_cv_two(spiketimes, window, tlim, min_vals, tstep)
def cv2( spiketimes, pool=True, return_all=False, bessel_correction=False, minvals=0):
181def cv2(
182    spiketimes,
183    pool=True,
184    return_all=False,
185    bessel_correction=False,
186    minvals=0,
187):
188    r"""Compute the coefficient-of-variation statistic from inter-spike intervals.
189
190    Parameters
191    ----------
192    spiketimes:
193        Canonical spike representation.
194    pool:
195        If `True`, pool intervals across trials or units before computing the
196        statistic. Otherwise compute per row and average.
197    return_all:
198        If `True`, return per-row values instead of their mean.
199    bessel_correction:
200        Use `ddof=1` in the interval variance.
201    minvals:
202        Minimum number of finite intervals required for a per-row value.
203
204    Definition
205    ----------
206    Despite the historical function name, this returns the squared coefficient
207    of variation of the inter-spike intervals $I_n$:
208
209    $$
210    \mathrm{CV}^2 = \frac{\mathrm{Var}[I_n]}{\mathrm{E}[I_n]^2}
211    $$
212
213    Examples
214    --------
215    >>> round(float(cv2(np.array([[0.0, 2.0, 5.0], [0.0, 0.0, 0.0]]))), 3)
216    0.04
217    """
218    if spiketimes.shape[1] < 3:
219        if return_all:
220            return pylab.array([pylab.nan])
221        return pylab.nan
222    ddof = 1 if bessel_correction else 0
223    spikelist = spiketimes_to_list(spiketimes)
224    maxlen = max(len(sl) for sl in spikelist)
225    if maxlen < 3:
226        return pylab.nan
227    spikearray = pylab.zeros((len(spikelist), maxlen)) * pylab.nan
228    spike_counts = []
229    for i, sl in enumerate(spikelist):
230        spikearray[i, : len(sl)] = sl
231        spike_counts.append(len(sl))
232    spike_counts = pylab.array(spike_counts)
233    spikearray = spikearray[spike_counts > 2]
234    intervals = pylab.diff(spikearray, axis=1)
235
236    if pool:
237        var = pylab.nanvar(intervals, ddof=ddof)
238        mean_squared = pylab.nanmean(intervals) ** 2
239        return var / mean_squared
240
241    intervals[pylab.isfinite(intervals).sum(axis=1) < minvals, :] = pylab.nan
242    var = pylab.nanvar(intervals, axis=1, ddof=ddof)
243    mean_squared = pylab.nanmean(intervals, axis=1) ** 2
244    if return_all:
245        return var / mean_squared
246    return pylab.nanmean(var / mean_squared)

Compute the coefficient-of-variation statistic from inter-spike intervals.

Parameters
  • spiketimes:: Canonical spike representation.
  • pool:: If True, pool intervals across trials or units before computing the statistic. Otherwise compute per row and average.
  • return_all:: If True, return per-row values instead of their mean.
  • bessel_correction:: Use ddof=1 in the interval variance.
  • minvals:: Minimum number of finite intervals required for a per-row value.
Definition

Despite the historical function name, this returns the squared coefficient of variation of the inter-spike intervals $I_n$:

$$ \mathrm{CV}^2 = \frac{\mathrm{Var}[I_n]}{\mathrm{E}[I_n]^2} $$

Examples
>>> round(float(cv2(np.array([[0.0, 2.0, 5.0], [0.0, 0.0, 0.0]]))), 3)
0.04
def cv_two(spiketimes, min_vals=20):
266def cv_two(spiketimes, min_vals=20):
267    r"""Compute the local Cv2 measure from consecutive inter-spike intervals.
268
269    Definition
270    ----------
271    For consecutive inter-spike intervals $I_n$ and $I_{n+1}$, the local Cv2
272    statistic is
273
274    $$
275    \mathrm{Cv2} =
276    \left\langle
277    \frac{2\,|I_{n+1} - I_n|}{I_{n+1} + I_n}
278    \right\rangle_n
279    $$
280
281    where the average is taken across all finite neighboring interval pairs.
282
283    Examples
284    --------
285    >>> round(float(cv_two(np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]), min_vals=2)), 3)
286    0.343
287    """
288    consecutive_isis = _consecutive_intervals(spiketimes)
289    ms = (
290        2
291        * pylab.absolute(consecutive_isis[:, 0] - consecutive_isis[:, 1])
292        / (consecutive_isis[:, 0] + consecutive_isis[:, 1])
293    )
294    if len(ms) >= min_vals:
295        return pylab.mean(ms)
296    return pylab.nan

Compute the local Cv2 measure from consecutive inter-spike intervals.

Definition

For consecutive inter-spike intervals $I_n$ and $I_{n+1}$, the local Cv2 statistic is

$$ \mathrm{Cv2} = \left\langle \frac{2\,|I_{n+1} - I_n|}{I_{n+1} + I_n} \right\rangle_n $$

where the average is taken across all finite neighboring interval pairs.

Examples
>>> round(float(cv_two(np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]), min_vals=2)), 3)
0.343
def ff(spiketimes, mintrials=None, tlim=None):
 99def ff(spiketimes, mintrials=None, tlim=None):
100    r"""Compute the Fano factor across trials or units.
101
102    Parameters
103    ----------
104    spiketimes:
105        Canonical spike representation.
106    mintrials:
107        Optional minimum number of rows required for a valid estimate.
108    tlim:
109        Optional `[tmin, tmax]` counting window in ms.
110
111    Definition
112    ----------
113    If `N` is the spike count in the analysis window, the Fano factor is
114
115    $$
116    \mathrm{FF} = \frac{\mathrm{Var}[N]}{\mathrm{E}[N]}
117    $$
118
119    Examples
120    --------
121    >>> ff(np.array([[0.0, 1.0, 0.0, 1.0], [0.0, 0.0, 1.0, 1.0]]), tlim=[0.0, 2.0])
122    0.0
123    """
124    if tlim is None:
125        tlim = get_time_limits(spiketimes)
126    dt = tlim[1] - tlim[0]
127    binary, _ = spiketimes_to_binary(spiketimes, tlim=tlim, dt=dt)
128    counts = binary.sum(axis=1)
129    if (counts == 0).all():
130        return pylab.nan
131    if mintrials is not None and len(counts) < mintrials:
132        return pylab.nan
133    return counts.var() / counts.mean()

Compute the Fano factor across trials or units.

Parameters
  • spiketimes:: Canonical spike representation.
  • mintrials:: Optional minimum number of rows required for a valid estimate.
  • tlim:: Optional [tmin, tmax] counting window in ms.
Definition

If N is the spike count in the analysis window, the Fano factor is

$$ \mathrm{FF} = \frac{\mathrm{Var}[N]}{\mathrm{E}[N]} $$

Examples
>>> ff(np.array([[0.0, 1.0, 0.0, 1.0], [0.0, 0.0, 1.0, 1.0]]), tlim=[0.0, 2.0])
0.0
def lv(spiketimes, min_vals=20):
299def lv(spiketimes, min_vals=20):
300    r"""Compute the Shinomoto local variation metric.
301
302    Definition
303    ----------
304    Using the same consecutive inter-spike interval notation as `cv_two(...)`,
305    the local variation is
306
307    $$
308    \mathrm{LV} =
309    \left\langle
310    \frac{3\,(I_{n+1} - I_n)^2}{(I_{n+1} + I_n)^2}
311    \right\rangle_n
312    $$
313
314    Unlike $\mathrm{CV}^2$, this statistic is local in time and is therefore
315    less sensitive to slow rate drifts.
316
317    Examples
318    --------
319    >>> round(float(lv(np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]), min_vals=2)), 3)
320    0.091
321    """
322    consecutive_isis = _consecutive_intervals(spiketimes)
323    ms = (
324        3
325        * (consecutive_isis[:, 0] - consecutive_isis[:, 1]) ** 2
326        / (consecutive_isis[:, 0] + consecutive_isis[:, 1]) ** 2
327    )
328    if len(ms) >= min_vals:
329        return pylab.mean(ms)
330    return pylab.nan

Compute the Shinomoto local variation metric.

Definition

Using the same consecutive inter-spike interval notation as cv_two(...), the local variation is

$$ \mathrm{LV} = \left\langle \frac{3\,(I_{n+1} - I_n)^2}{(I_{n+1} + I_n)^2} \right\rangle_n $$

Unlike $\mathrm{CV}^2$, this statistic is local in time and is therefore less sensitive to slow rate drifts.

Examples
>>> round(float(lv(np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]), min_vals=2)), 3)
0.091
def kernel_fano(spiketimes, window, dt=1.0, tlim=None, components=False):
136def kernel_fano(spiketimes, window, dt=1.0, tlim=None, components=False):
137    """Estimate a sliding-window Fano factor from binned spike counts.
138
139    Parameters
140    ----------
141    spiketimes:
142        Canonical spike representation.
143    window:
144        Counting window in ms.
145    dt:
146        Bin width in ms.
147    tlim:
148        Optional analysis interval in ms.
149    components:
150        If `True`, return variance and mean separately instead of their ratio.
151
152    Examples
153    --------
154    >>> fano, time = kernel_fano(
155    ...     np.array([[0.0, 2.0, 0.0, 2.0], [0.0, 0.0, 1.0, 1.0]]),
156    ...     window=2.0,
157    ...     dt=1.0,
158    ...     tlim=[0.0, 4.0],
159    ... )
160    >>> fano.shape, time.shape
161    ((3,), (3,))
162    """
163    if tlim is None:
164        tlim = get_time_limits(spiketimes)
165    if tlim[1] - tlim[0] == window:
166        binary, time = spiketimes_to_binary(spiketimes, tlim=tlim)
167        counts = binary.sum(axis=1)
168        return pylab.array([counts.var() / counts.mean()]), pylab.array(time.mean())
169
170    counts, time = sliding_counts(spiketimes, window, dt, tlim)
171
172    vars_ = counts.var(axis=0)
173    means = counts.mean(axis=0)
174    fano = pylab.zeros_like(vars_) * pylab.nan
175    if components:
176        return vars_, means, time
177    fano[means > 0] = vars_[means > 0] / means[means > 0]
178    return fano, time

Estimate a sliding-window Fano factor from binned spike counts.

Parameters
  • spiketimes:: Canonical spike representation.
  • window:: Counting window in ms.
  • dt:: Bin width in ms.
  • tlim:: Optional analysis interval in ms.
  • components:: If True, return variance and mean separately instead of their ratio.
Examples
>>> fano, time = kernel_fano(
...     np.array([[0.0, 2.0, 0.0, 2.0], [0.0, 0.0, 1.0, 1.0]]),
...     window=2.0,
...     dt=1.0,
...     tlim=[0.0, 4.0],
... )
>>> fano.shape, time.shape
((3,), (3,))
def time_resolved_cv2( spiketimes, window=None, ot=5.0, tlim=None, pool=False, tstep=1.0, bessel_correction=False, minvals=0, return_all=False):
def time_resolved_cv2(
    spiketimes,
    window=None,
    ot=5.0,
    tlim=None,
    pool=False,
    tstep=1.0,
    bessel_correction=False,
    minvals=0,
    return_all=False,
):
    r"""Estimate Cv2 in sliding windows over time.

    Parameters
    ----------
    spiketimes:
        Canonical spike representation.
    window:
        Window width in ms. If `None`, it is inferred from the mean ISI.
    ot:
        Multiplicative factor used when inferring `window`.
    tlim:
        Optional analysis interval in ms.
    pool:
        Whether to pool intervals across trials or units.
    tstep:
        Window step size in ms.
    bessel_correction:
        Use `ddof=1` in the interval variance.
    minvals:
        Minimum number of intervals per row.
    return_all:
        Return per-row values for each window instead of a single mean.

    Definition
    ----------
    For a window of width $W$ starting at $t$, this function computes

    $$
    \mathrm{CV}^2(t) =
    \mathrm{CV}^2\left(\{I_n : \text{both spikes of } I_n \in [t, t + W)\}\right)
    $$

    and reports it at the window center. If `window=None`, the code first
    estimates a characteristic window from the mean inter-spike interval and
    the factor `ot`.

    Examples
    --------
    >>> values, time = time_resolved_cv2(
    ...     np.array([[0.0, 2.0, 5.0, 9.0], [0.0, 0.0, 0.0, 0.0]]),
    ...     window=5.0,
    ...     tlim=[0.0, 10.0],
    ...     tstep=2.0,
    ... )
    >>> values.shape, time.shape
    ((3,), (3,))
    """
    if tlim is None:
        tlim = get_time_limits(spiketimes)

    if window is None:
        spikelist = spiketimes_to_list(spiketimes)
        isis = [pylab.diff(spikes) for spikes in spikelist]
        meanisis = [isi.mean() for isi in isis if len(isi) > 0]
        if meanisis:
            window = sum(meanisis) / float(len(meanisis)) * ot
        else:
            window = tlim[1] - tlim[0]

    time = []
    cvs = []
    tmin = tlim[0]
    tmax = tlim[0] + window
    order = pylab.argsort(spiketimes[0])
    ordered_spiketimes = spiketimes[:, order]
    while tmax < tlim[1]:
        start_ind = bisect_right(ordered_spiketimes[0], tmin)
        end_ind = bisect_right(ordered_spiketimes[0], tmax)
        window_spikes = ordered_spiketimes[:, start_ind:end_ind]
        cvs.append(
            cv2(
                window_spikes,
                pool,
                bessel_correction=bessel_correction,
                minvals=minvals,
                return_all=return_all,
            )
        )
        time.append(0.5 * (tmin + tmax))
        tmin += tstep
        tmax += tstep
    if return_all:
        if len(cvs) > 0:
            maxlen = max(len(cv) for cv in cvs)
            cvs = [cv.tolist() + [pylab.nan] * (maxlen - len(cv)) for cv in cvs]
        else:
            cvs = [[]]
    return pylab.array(cvs), pylab.array(time)
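
The windowing loop above — select the spikes inside `[tmin, tmax)` with `bisect_right`, summarize their intervals, slide by `tstep` — can be sketched in isolation (an illustration only; `cv2_of_window` is a hypothetical helper, not the library's `cv2`, and it uses the plain var/mean² definition):

```python
import numpy as np
from bisect import bisect_right

def cv2_of_window(times, tmin, tmax):
    """Squared coefficient of variation of the ISIs whose spikes fall in [tmin, tmax)."""
    lo = bisect_right(times, tmin)
    hi = bisect_right(times, tmax)
    isis = np.diff(times[lo:hi])  # only intervals fully inside the window
    if len(isis) < 2:
        return np.nan
    return float(isis.var() / isis.mean() ** 2)

# Homogeneous Poisson-like train: exponential ISIs, mean 10 ms.
rng = np.random.default_rng(1)
times = np.cumsum(rng.exponential(10.0, size=500))

# Non-overlapping 500 ms windows; CV^2 should hover around 1 in each.
starts = np.arange(0.0, 4000.0, 500.0)
vals = [cv2_of_window(times, t, t + 500.0) for t in starts]
```

The library version additionally pools or pads per-row values and reports each estimate at the window center `0.5 * (tmin + tmax)`.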

def time_resolved_cv_two(spiketimes, window=400, tlim=None, min_vals=10, tstep=1):
    r"""Compute rolling Cv2 using the optional C extension.

    Definition
    ----------
    This is the compiled analogue of applying `cv_two(...)` in successive
    windows. For each window $[t, t + W)$ it collects all finite neighboring
    interval pairs $(I_n, I_{n+1})$ whose spikes lie inside the window and
    evaluates

    $$
    \mathrm{Cv2}(t) =
    \left\langle
    \frac{2\,|I_{n+1} - I_n|}{I_{n+1} + I_n}
    \right\rangle_n
    $$

    if at least `min_vals` pairs are available. The window is then shifted by
    `tstep`.

    Notes
    -----
    This function requires the optional `Cspiketools` extension module.

    Expected output
    ---------------
    Returns `(values, time)` with the same window semantics as
    `time_resolved_cv2(...)` when the compiled extension is available.
    """
    return _time_resolved_cv_two(spiketimes, window, tlim, min_vals, tstep)
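
The local Cv2 average in the definition above can be written in a few lines of numpy (a pure-Python sketch of the per-pair statistic, useful as a reference when the `Cspiketools` extension is unavailable; `local_cv2` is an illustrative name, not the library API):

```python
import numpy as np

def local_cv2(times):
    """Mean of 2|I_{n+1} - I_n| / (I_{n+1} + I_n) over neighboring ISI pairs."""
    isis = np.diff(np.sort(np.asarray(times, dtype=float)))
    if len(isis) < 2:
        return np.nan
    return float(np.mean(2.0 * np.abs(np.diff(isis)) / (isis[1:] + isis[:-1])))

# A perfectly regular train has identical neighboring intervals, so Cv2 = 0.
print(local_cv2(np.arange(0.0, 100.0, 5.0)))  # -> 0.0
```

For a Poisson-like train the expectation of each pair term is 1, so `local_cv2` hovers near 1; regular trains (higher gamma order) fall well below 1, matching the `cv_two` value of 0.536 for trial 13 in the module-level example.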

def time_warped_cv2(
    spiketimes,
    window=None,
    ot=5.0,
    tstep=1.0,
    tlim=None,
    rate=None,
    kernel_width=50.0,
    nstd=3.0,
    pool=True,
    dt=1.0,
    inspection_plots=False,
    kernel_type="gaussian",
    bessel_correction=False,
    interpolate=False,
    minvals=0,
    return_all=False,
):
    """Estimate Cv2 after compensating for slow rate fluctuations by time warping.

    Parameters
    ----------
    spiketimes:
        Canonical spike representation.
    window:
        Window width in warped time. If `None`, infer from the mean ISI.
    ot:
        Multiplicative factor used for automatic window selection.
    tstep:
        Window step size in ms.
    tlim:
        Optional `[tmin, tmax]` analysis interval.
    rate:
        Optional precomputed `(rate, time)` tuple.
    kernel_width:
        Gaussian kernel width in ms when `rate` is not provided.
    pool:
        Whether to pool intervals across rows in the Cv2 estimate.
    dt:
        Rate-estimation sampling interval in ms.
    inspection_plots:
        Plot the forward and backward time-warp mappings for debugging.
    bessel_correction:
        Use `ddof=1` in the interval variance.
    interpolate:
        If `True`, interpolate the output back onto a regular time grid.
    minvals:
        Minimum number of intervals per row.
    return_all:
        Return per-row values for each window instead of a single mean.

    Examples
    --------
    >>> values, time = time_warped_cv2(
    ...     np.array([[0.0, 3.0, 7.0, 12.0], [0.0, 0.0, 0.0, 0.0]]),
    ...     window=3.0,
    ...     tlim=[0.0, 13.0],
    ...     pool=True,
    ...     dt=1.0,
    ... )
    >>> time.ndim
    1
    """
    del nstd  # Unused but kept for API compatibility.
    del kernel_type  # Unused but kept for API compatibility.

    if tlim is None:
        tlim = get_time_limits(spiketimes)

    if rate is None:
        kernel = gaussian_kernel(kernel_width, dt)
        rate, trate = kernel_rate(spiketimes, kernel, tlim=tlim, dt=dt)
    else:
        trate = rate[1]
        rate = rate[0]

    rate = rate.flatten()
    rate[rate == 0] = 1e-4
    notnaninds = ~pylab.isnan(rate)
    rate = rate[notnaninds]
    trate = trate[notnaninds]

    # Normalized integrated rate: maps real time onto "operational" time.
    tdash = rate_integral(rate, dt)
    tdash -= tdash.min()
    tdash /= tdash.max()
    tdash *= trate.max() - trate.min()
    tdash += trate.min()

    transformed_spikes = spiketimes.copy().astype(float)
    transformed_spikes[0, :] = time_warp(transformed_spikes[0, :], trate, tdash)

    if inspection_plots:
        pylab.figure()
        pylab.subplot(1, 2, 1)
        stepsize = max(1, int(spiketimes.shape[1] / 100))
        for i in list(range(0, spiketimes.shape[1], stepsize)) + [-1]:
            if pylab.isnan(transformed_spikes[0, i]) or pylab.isnan(spiketimes[0, i]):
                continue
            idx_real = int(np.argmin(np.abs(trate - spiketimes[0, i])))
            idx_warp = int(np.argmin(np.abs(tdash - transformed_spikes[0, i])))
            pylab.plot(
                [spiketimes[0, i]] * 2,
                [0, tdash[idx_real]],
                "--k",
                linewidth=0.5,
            )
            pylab.plot(
                [0, trate[idx_warp]],
                [transformed_spikes[0, i]] * 2,
                "--k",
                linewidth=0.5,
            )
        pylab.plot(trate, tdash, linewidth=2.0)
        pylab.xlabel("real time")
        pylab.ylabel("transformed time")
        pylab.title("transformation of spiketimes")

    cv2_vals, tcv2 = time_resolved_cv2(
        transformed_spikes,
        window,
        ot,
        None,
        pool,
        tstep,
        bessel_correction=bessel_correction,
        minvals=minvals,
        return_all=return_all,
    )
    if inspection_plots:
        pylab.subplot(1, 2, 2)
        stepsize = max(1, int(len(tcv2) / 50))
        tcv22 = time_warp(tcv2, tdash, trate)
        for i in list(range(0, len(tcv2), stepsize)) + [-1]:
            idx_warp = int(np.argmin(np.abs(tdash - tcv2[i])))
            idx_real = int(np.argmin(np.abs(trate - tcv22[i])))
            pylab.plot(
                [tcv2[i]] * 2,
                [0, trate[idx_warp]],
                "--k",
                linewidth=0.5,
            )
            pylab.plot(
                [0, tdash[idx_real]],
                [tcv22[i]] * 2,
                "--k",
                linewidth=0.5,
            )
        pylab.plot(tdash, trate, linewidth=2)
        pylab.title("re-transformation of cv2")
        pylab.xlabel("transformed time")
        pylab.ylabel("real time")

    # Map the window centers back from warped time to real time.
    tcv2 = time_warp(tcv2, tdash, trate)

    if interpolate:
        time = pylab.arange(tlim[0], tlim[1], dt)
        if len(cv2_vals) == 0 or (return_all and cv2_vals.shape[1] == 0):
            cv2_vals = pylab.zeros_like(time) * pylab.nan
        else:
            if return_all:
                cv2_vals = pylab.array(
                    [
                        pylab.interp(time, tcv2, cv2_vals[:, i], left=pylab.nan, right=pylab.nan)
                        for i in range(cv2_vals.shape[1])
                    ]
                )
            else:
                cv2_vals = pylab.interp(time, tcv2, cv2_vals, left=pylab.nan, right=pylab.nan)
        return cv2_vals, time
    return cv2_vals, tcv2
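
The core warping step — mapping spike times through the normalized integrated rate so that slow rate fluctuations are flattened out in warped time — can be sketched with `np.interp` (an illustration assuming a strictly positive rate estimate; `warp_times` is a hypothetical stand-in for the `rate_integral`/`time_warp` pair used above):

```python
import numpy as np

def warp_times(spike_times, rate, t_rate, dt):
    """Map spike times through the normalized integrated rate (operational time)."""
    cum = np.cumsum(rate) * dt  # integrated rate, monotone when rate > 0
    cum -= cum.min()
    # Rescale so warped time spans the same range as real time.
    cum = cum / cum.max() * (t_rate.max() - t_rate.min()) + t_rate.min()
    return np.interp(spike_times, t_rate, cum)

t = np.arange(0.0, 1000.0, 1.0)
rate = 5.0 + 4.0 * np.sin(2 * np.pi * t / 1000.0)  # slowly modulated, always > 0
spikes = np.array([100.0, 300.0, 600.0, 900.0])
warped = warp_times(spikes, rate, t, 1.0)
```

Because the mapping is monotone, spike order is preserved; intervals during high-rate epochs are stretched and intervals during low-rate epochs compressed, so the subsequent Cv2 estimate is less biased by the rate profile.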
