Synaptipy API Reference
This document provides reference information for developers using Synaptipy as a library. Autodoc-generated class and function signatures are included below each section alongside usage examples.
Table of Contents
Core Components
Data Model
Core Domain Data Models for Synaptipy.
Defines the central classes representing electrophysiology concepts like Recording sessions and individual data Channels.
- class Synaptipy.core.data_model.UndoStack(max_depth=20)[source]
Bases:
object

Lightweight state-history stack for non-destructive editing (Command pattern).
Stores deep-copy snapshots of channel data before destructive operations so that a single Channel.undo() call instantly restores the previous state. Memory depth is capped at max_depth entries (oldest entries are evicted).
Usage:

```python
channel.push_undo("apply lowpass 300 Hz")
channel.data_trials = filtered_trials
# ...
channel.undo()  # restores data_trials to state before the filter
```

Initialise the undo stack.
- Parameters:
max_depth – Maximum number of undo levels retained (default 20).
- push(label, state)[source]
Save a named state snapshot.
- Parameters:
label – Human-readable description of the pending change (e.g. "notch 50 Hz").
state – Arbitrary serialisable dict representing the channel state to restore.
- pop()[source]
Remove and return the most recently saved state.
- Returns:
A (label, state) tuple, or None if the stack is empty.
- property depth
Number of undo levels currently stored.
- class Synaptipy.core.data_model.Channel(id, name, units, sampling_rate, data_trials, loader=None)[source]
Bases:
object

Represents a single channel of recorded data, potentially across multiple trials or segments.
Initializes a Channel object.
- Parameters:
id – A unique identifier for the channel (e.g., ‘0’, ‘1’, ‘Vm’).
name – A descriptive name for the channel (e.g., ‘Voltage’, ‘IN_0’).
units – The physical units of the data (e.g., ‘mV’, ‘pA’, ‘V’).
sampling_rate – The sampling frequency in Hz.
data_trials – A list where each element is a 1D NumPy array representing the data for one trial/segment.
loader – Optional callable/object with load_trial(index) method for lazy loading.
- property num_trials
Returns the number of trials/segments available for this channel.
- property num_samples
Returns the number of samples in the first trial.
WARNING: This property only checks the first trial. If trials have variable lengths, this value may be misleading. Use get_consistent_samples() for strict validation.
Returns 0 if no trials are present.
- get_consistent_samples()[source]
Returns the number of samples per trial, ensuring all trials have the same length. Raises ValueError if trials have different lengths. Returns 0 if no trials.
- get_data(trial_index)[source]
Returns the raw data for a specific trial. For lazy loading, this method will load the data from disk if not already loaded.
- get_primary_data_label()[source]
Determines a suitable label (‘Voltage’, ‘Current’, ‘Signal’) based on units.
- get_finite_data_bounds()[source]
Returns the min and max values across all trials, ensuring they are finite. Returns None if no finite data is found.
- push_undo(label='')[source]
Save the current data_trials state so that undo() can restore it.
Call this before any destructive operation (filter, event deletion, …).
- Parameters:
label – Short human-readable description of the upcoming change (e.g. "lowpass 300 Hz"). Stored for UI display only.
- undo()[source]
Restore data_trials to the last saved state.
- Returns:
True if a state was restored, False if the stack was empty.
- property can_undo
True when at least one undo level is available.
- class Synaptipy.core.data_model.Recording(source_file)[source]
Bases:
object

Represents data and metadata loaded from a single recording file. Contains multiple Channel objects.
Initializes a Recording object.
- Parameters:
source_file – The Path object pointing to the original data file.
- close()[source]
Release any underlying file handles held by the source handle.
Must be called when a recording is removed from the workspace to prevent Neo IO readers from keeping the source file locked.
- property num_channels
Returns the number of channels in this recording.
- property channel_names
Returns a list of the names of all channels.
- property max_trials
Returns the maximum number of trials found across all channels in this recording. Returns 0 if there are no channels or no trials.
- class Synaptipy.core.data_model.Experiment[source]
Bases:
object

Optional container representing a collection of Recordings, potentially from a single experimental session or related set. (Currently basic.)
Usage Example
```python
from pathlib import Path

from Synaptipy.core.data_model import Recording, Channel

recording = Recording(source_file=Path("/path/to/file.abf"))
sampling_rate = recording.sampling_rate
duration = recording.duration
channels = recording.channels
channel = recording.get_channel_by_name("Vm")
```
File Readers
Adapter for reading various electrophysiology file formats using the neo library and translating them into the application’s core domain model. IO class selection uses a predefined dictionary mapping extensions to IO names.
The read_recording method implements a robust “Header-First” approach:
1. Reads the file header first to discover all channels with their metadata.
2. Creates a definitive channel map before processing any signal data.
3. Aggregates data from segments to the correct channels using stable IDs.
4. Ensures all channels (including custom-labeled ones) are correctly identified.
This approach eliminates assumptions about data structure and ensures that what’s in the file header is what gets loaded, making the software truly versatile for WCP, ABF, and other supported file formats.
Also provides a method to generate a Qt file dialog filter based on its supported IOs.
- class Synaptipy.infrastructure.file_readers.neo_adapter.NeoAdapter[source]
Bases:
object

Reads ephys files using neo, translating data to the Core Domain Model. Uses a fixed dictionary (IODict) for IO selection and implements a robust “Header-First” approach for channel identification.
The Header-First approach ensures:
- All channels are discovered from the file header before data processing
- Custom channel labels are preserved and correctly mapped
- No assumptions are made about data structure
- Versatile support for WCP, ABF, and other formats
- get_supported_file_filter()[source]
Generates a file filter string for QFileDialog based on the IODict.
Usage Example
```python
from Synaptipy.infrastructure.file_readers.neo_adapter import NeoAdapter

adapter = NeoAdapter()
recording = adapter.read_recording("/path/to/data.abf")
files = adapter.list_compatible_files("/path/to/directory")
```
Analysis Modules
Synaptipy organises its 17 built-in analysis routines into five core modules. Each module corresponds to a tab in the GUI Analyser and is also available as a composable unit in the batch processing pipeline.
Module 1 - Passive Membrane Properties
Baseline (RMP), Input Resistance, Membrane Time Constant (Tau), Sag Ratio, I-V Curve, and Capacitance.
Core Protocol Module 1: Passive Membrane Properties.
Consolidates: Resting Membrane Potential (RMP), Input Resistance (Rin), Membrane Time Constant (Tau), Sag Ratio, and Capacitance analysis.
All registry wrapper functions return the standard namespaced schema:
```python
{
    "module_used": "passive_properties",
    "metrics": {...},  # flat result keys
}
```
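A minimal sketch of consuming this namespaced schema (the dict shape is taken from the listing above; the helper name and the metric keys shown are illustrative, not part of the library):

```python
def extract_metrics(result: dict, module: str) -> dict:
    """Validate the namespaced schema and return the flat metrics dict."""
    if result.get("module_used") != module:
        raise ValueError(
            f"expected module {module!r}, got {result.get('module_used')!r}"
        )
    return result.get("metrics", {})

# Illustrative result dict; real metric keys depend on the wrapper called.
result = {"module_used": "passive_properties", "metrics": {"value": -65.2, "unit": "mV"}}
metrics = extract_metrics(result, "passive_properties")
print(metrics["value"])  # -65.2
```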
- Synaptipy.core.analysis.passive_properties.apply_ljp_correction(voltage, ljp_mv)[source]
Return a copy of voltage with the Liquid Junction Potential subtracted.
$V_{true} = V_{recorded} - LJP$
When ljp_mv is 0.0 the original array is returned unchanged to avoid unnecessary allocations in the common case.
- Parameters:
voltage – 1-D voltage array in mV.
ljp_mv – Liquid Junction Potential in mV (positive value shifts voltage downward; negative shifts upward).
- Returns:
Corrected voltage array (same dtype as input).
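The correction itself is a one-liner; a minimal NumPy sketch of the documented behaviour (a re-implementation for illustration, not the library code):

```python
import numpy as np

def apply_ljp_correction_sketch(voltage: np.ndarray, ljp_mv: float) -> np.ndarray:
    """V_true = V_recorded - LJP; return the input unchanged when ljp_mv is 0.0."""
    if ljp_mv == 0.0:
        return voltage  # avoid an unnecessary copy in the common case
    return voltage - ljp_mv

v = np.array([-60.0, -55.0, -70.0])
print(apply_ljp_correction_sketch(v, 10.0))  # [-70. -65. -80.]
```

Note that a positive LJP shifts the corrected trace downward, as stated above.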
- Synaptipy.core.analysis.passive_properties.calculate_rmp(data, time, baseline_window)[source]
Calculate the Resting Membrane Potential (RMP) from a defined baseline window.
Algorithm
1. Extract the voltage samples whose timestamps fall within baseline_window.
2. RMP = numpy.mean() of those samples (mV).
3. Drift estimate: a uniform moving-average kernel of ~50 ms is applied to suppress high-frequency noise, then numpy.polyfit() (degree 1) fits a line to the smoothed segment. The slope is the drift rate (mV s⁻¹). The kernel width is capped at one-third of the baseline length to avoid boundary artefacts on short protocols.
- param data:
1-D voltage array (mV).
- type data:
np.ndarray
- param time:
1-D time array aligned with data (s).
- type time:
np.ndarray
- param baseline_window:
(start_time, end_time) defining the pre-stimulus baseline period (s). Must satisfy start_time < end_time.
- type baseline_window:
- returns:
Attributes populated on success:
- value (float) – mean baseline voltage (mV).
- std_dev (float) – standard deviation of baseline (mV).
- drift (float or None) – linear drift rate of the smoothed baseline (mV s⁻¹); None when estimation fails.
- duration (float) – window duration (s).
- unit (str) – always 'mV'.
- is_valid (bool) – False when the window contains no samples or the data array is malformed.
- rtype:
RmpResult
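The three algorithm steps can be sketched in plain NumPy (a simplified stand-in for calculate_rmp, returning a dict instead of an RmpResult; the synthetic trace carries a deliberate 2 mV/s drift):

```python
import numpy as np

def rmp_sketch(data, time, baseline_window, kernel_s=0.05):
    """Mean/std of the baseline window plus a linear drift estimate (mV/s)."""
    start, end = baseline_window
    mask = (time >= start) & (time <= end)
    seg, tw = data[mask], time[mask]
    if seg.size < 2:
        return None
    dt = tw[1] - tw[0]
    # ~50 ms moving-average kernel, capped at one third of the window length.
    k = max(1, min(int(kernel_s / dt), seg.size // 3))
    smooth = np.convolve(seg, np.ones(k) / k, mode="valid")
    drift = float(np.polyfit(tw[:smooth.size], smooth, 1)[0]) if smooth.size > 1 else None
    return {"value": float(seg.mean()), "std_dev": float(seg.std()), "drift": drift}

t = np.arange(0, 0.5, 1e-4)      # 0.5 s at 10 kHz
v = -65.0 + 2.0 * t              # synthetic baseline with 2 mV/s upward drift
res = rmp_sketch(v, t, (0.0, 0.5))
```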
- Synaptipy.core.analysis.passive_properties.calculate_baseline_stats(time, voltage, start_time, end_time)[source]
Return (mean, std_dev) for a baseline window or None.
- Synaptipy.core.analysis.passive_properties.find_stable_baseline(data, sample_rate, window_duration_s=0.5, step_duration_s=0.1)[source]
Find the most stable (lowest variance) baseline segment by sliding window.
- Synaptipy.core.analysis.passive_properties.calculate_rin(voltage_trace, time_vector, current_amplitude, baseline_window, response_window, parameters=None, rs_artifact_blanking_ms=0.5)[source]
Calculate Input Resistance (Rin) from a current-clamp voltage deflection.
Algorithm
Uses Ohm’s Law applied to the membrane voltage deflection:
\[R_{in} = \frac{|\Delta V|}{|\Delta I|}\]
where \(\Delta I\) is current_amplitude converted from pA to nA (\(\Delta I_{nA} = |I_{pA}| / 1000\)).
Three Rin estimates are returned:
1. Mean Rin (value): \(\Delta V_{mean} / \Delta I_{nA}\), where \(\Delta V_{mean}\) is the mean of the full (Rs-blanked) response window relative to the baseline mean.
2. Peak Rin (rin_peak_mohm): \(\Delta V_{peak} / \Delta I_{nA}\), where \(\Delta V_{peak}\) is the point of maximum absolute voltage deflection. Most sensitive to Ih sag.
3. Steady-state Rin (rin_steady_state_mohm): \(\Delta V_{ss} / \Delta I_{nA}\), where \(\Delta V_{ss}\) is the mean of the last 20 % of the (blanked) response window, reflecting the true membrane resistance after Ih has settled.
The first rs_artifact_blanking_ms ms of the response window are excluded from all three estimates to prevent the series-resistance jump from contaminating the measurement.
- param voltage_trace:
1-D voltage array (mV).
- type voltage_trace:
np.ndarray
- param time_vector:
1-D time array aligned with voltage_trace (s).
- type time_vector:
np.ndarray
- param current_amplitude:
Amplitude of the injected current step (pA). Sign is preserved; a negative value indicates a hyperpolarising step.
- type current_amplitude:
- param baseline_window:
(start, end) of the pre-stimulus baseline period (s).
- type baseline_window:
- param response_window:
(start, end) of the current-step response period (s).
- type response_window:
- param parameters:
Arbitrary parameter dict stored verbatim in the returned RinResult for provenance tracking.
- type parameters:
dict, optional
- param rs_artifact_blanking_ms:
Duration (ms) to skip at the onset of the response window, excluding the fast series-resistance voltage jump from all Rin estimates (default 0.5 ms).
- type rs_artifact_blanking_ms:
float, optional
- returns:
Attributes populated on success:
- value (float) – mean Rin (MΩ).
- rin_peak_mohm (float) – peak Rin from maximum deflection (MΩ).
- rin_steady_state_mohm (float) – steady-state Rin from last 20 % of response window (MΩ).
- conductance (float) – membrane conductance, 1/Rin (µS).
- voltage_deflection (float) – mean ΔV (mV).
- current_injection (float) – current step amplitude (pA).
- baseline_voltage (float) – mean baseline voltage (mV).
- steady_state_voltage (float) – mean steady-state voltage (mV).
- is_valid (bool) – False when either window contains no samples or current_amplitude is zero.
- rtype:
RinResult
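The three estimates and the blanking step can be sketched as follows (a simplified stand-in for calculate_rin, with illustrative key names and an idealised -10 mV step response; ΔV = 10 mV at ΔI = 0.1 nA gives 100 MΩ):

```python
import numpy as np

def rin_sketch(v, t, i_pa, baseline_window, response_window, blank_ms=0.5):
    """Mean, peak and steady-state Rin (MΩ) from a current-clamp deflection."""
    base = v[(t >= baseline_window[0]) & (t <= baseline_window[1])].mean()
    r0 = response_window[0] + blank_ms / 1000.0        # Rs-artifact blanking
    resp = v[(t >= r0) & (t <= response_window[1])]
    di_na = abs(i_pa) / 1000.0                         # pA -> nA
    dv_mean = abs(resp.mean() - base)
    dv_peak = np.max(np.abs(resp - base))
    dv_ss = abs(resp[-max(1, resp.size // 5):].mean() - base)  # last 20 %
    return {k: dv / di_na for k, dv in
            [("rin_mean", dv_mean), ("rin_peak", dv_peak), ("rin_ss", dv_ss)]}

t = np.arange(0, 1.0, 1e-4)
v = np.where((t >= 0.3) & (t < 0.8), -75.0, -65.0)     # idealised -10 mV deflection
res = rin_sketch(v, t, -100.0, (0.0, 0.25), (0.3, 0.8))
```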
- Synaptipy.core.analysis.passive_properties.calculate_vc_transient_parameters(current_trace, time_vector, step_onset_time, voltage_step_mv, baseline_window_s=0.005, transient_window_ms=20.0)[source]
Fit the capacitive transient from a VC step to extract Rs and Cm.
In voltage-clamp, a command voltage step elicits a capacitive transient whose peak height and time-integral allow decomposition of series resistance (Rs) and whole-cell capacitance (Cm):
Rs = delta_V / I_peak
Cm = Q_transient / delta_V   (Q is the charge under the transient)
tau_c = Rs * Cm              (capacitive charging time constant)
A mono-exponential is also fit to the transient decay to yield tau_c and a refined Cm_fit estimate.
- Parameters:
current_trace (np.ndarray) – 1-D current array (pA).
time_vector (np.ndarray) – 1-D time array aligned with current_trace (seconds).
step_onset_time (float) – Time of the voltage-clamp step onset (seconds).
voltage_step_mv (float) – Amplitude of the voltage step (mV). Sign is preserved.
baseline_window_s (float) – Duration of the pre-step baseline used to compute holding current (s).
transient_window_ms (float) – Duration of the post-step window searched for the transient peak (ms).
- Returns:
Keys: rs_mohm, cm_pf, tau_c_ms, cm_fit_pf, transient_peak_pa, transient_charge_pa_s. All values are float (np.nan) when the fit fails.
- Return type:
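The peak and charge-integral parts of this decomposition can be sketched in plain NumPy (a simplified stand-in, omitting the exponential refinement; note the unit conversions: mV/pA = GΩ, so ×1000 gives MΩ, and pA·s = pC, so pC/mV × 1000 gives pF). The synthetic transient below has Rs = 10 MΩ and Cm = 100 pF:

```python
import numpy as np

def vc_transient_sketch(i_pa, t, onset, dv_mv, baseline_s=0.005, window_ms=20.0):
    """Rs and Cm from a voltage-clamp capacitive transient (peak + charge integral)."""
    hold = i_pa[(t >= onset - baseline_s) & (t < onset)].mean()   # holding current
    m = (t >= onset) & (t <= onset + window_ms / 1000.0)
    trans = i_pa[m] - hold
    i_peak = trans[np.argmax(np.abs(trans))]                      # signed peak (pA)
    dt = t[1] - t[0]
    q = trans.sum() * dt                                          # charge (pA·s = pC)
    rs_mohm = 1000.0 * abs(dv_mv) / abs(i_peak)                   # mV/pA -> MΩ
    cm_pf = 1000.0 * abs(q) / abs(dv_mv)                          # pC/mV -> pF
    return {"rs_mohm": rs_mohm, "cm_pf": cm_pf}

t = np.arange(0, 0.05, 1e-5)                                      # 50 ms at 100 kHz
i = np.where(t >= 0.01, 1000.0 * np.exp(-(t - 0.01) / 1e-3), 0.0) # 1 nA peak, tau = 1 ms
res = vc_transient_sketch(i, t, onset=0.01, dv_mv=10.0)
```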
- Synaptipy.core.analysis.passive_properties.calculate_cc_series_resistance_fast(voltage_trace, time_vector, step_onset_time, current_step_pa, artifact_window_ms=0.1, tau_ms=None, rin_mohm=None)[source]
Estimate CC series resistance from the fast voltage artifact and derive Cm.
In current clamp, the fast resistive drop at step onset reflects the series (pipette + access) resistance before the membrane charges:
Rs = delta_V_fast / I_step (first ``artifact_window_ms`` after onset)
When tau_ms and rin_mohm are supplied, CC membrane capacitance is derived as:
Cm = tau_CC / R_in
- Parameters:
voltage_trace (np.ndarray) – 1-D voltage array (mV).
time_vector (np.ndarray) – 1-D time array (s).
step_onset_time (float) – Time of the current step onset (seconds).
current_step_pa (float) – Amplitude of the injected current step (pA). Must be non-zero.
artifact_window_ms (float) – Duration of the initial fast artifact window used to measure the instantaneous voltage drop (ms, default 0.1).
tau_ms (float, optional) – Membrane time constant from exponential fit (ms). Required to derive cm_derived_pf.
rin_mohm (float, optional) – Input resistance (MOhm). Required to derive cm_derived_pf.
- Returns:
Keys: rs_cc_mohm, cm_derived_pf. Values are float (np.nan) when the calculation fails.
- Return type:
- Synaptipy.core.analysis.passive_properties.calculate_conductance(current_trace, time_vector, voltage_step, baseline_window, response_window, parameters=None)[source]
Calculate Conductance (G = delta_I / delta_V) from a voltage-clamp current trace.
- Synaptipy.core.analysis.passive_properties.calculate_iv_curve(sweeps, time_vectors, current_steps, baseline_window, response_window)[source]
Calculate the I-V relationship across multiple sweeps.
- Synaptipy.core.analysis.passive_properties.calculate_tau(voltage_trace, time_vector, stim_start_time, fit_duration, model='mono', tau_bounds=None, artifact_blanking_ms=0.5)[source]
Calculate the membrane time constant (tau) by fitting an exponential model.
Algorithm
1. Extract the fit window from stim_start_time + artifact_blanking_ms / 1000 to stim_start_time + fit_duration.
2. Sag detection (Ih suppression): if the voltage peak (minimum for hyperpolarising steps, maximum for depolarising steps) falls in the second half of the window, the window is truncated at that peak to prevent Ih-sag rebound from contaminating the exponential fit.
3. Initial-guess estimation: numpy.polyfit() (degree 1) is applied to log(|V - V_ss|) vs. time to obtain a robust tau seed, avoiding dependence on the (potentially noisy) first sample.
4. Curve fitting via scipy.optimize.curve_fit() using the Trust-Region Reflective (TRF) algorithm (maxfev=5000) with amplitude bounds derived from the data range (±2× peak-to-peak):
Mono-exponential (model='mono'):
\[V(t) = V_{ss} + (V_0 - V_{ss})\,e^{-t/\tau}\]
Bi-exponential (model='bi'):
\[V(t) = V_{ss} + A_{fast}\,e^{-t/\tau_{fast}} + A_{slow}\,e^{-t/\tau_{slow}}\]
5. Quality gate (mono only): fits with R² < 0.80 are rejected and tau_ms is set to NaN to prevent physiologically implausible values from propagating silently.
- param voltage_trace:
1-D voltage array (mV).
- type voltage_trace:
np.ndarray
- param time_vector:
1-D time array aligned with voltage_trace (s).
- type time_vector:
np.ndarray
- param stim_start_time:
Onset time of the current step (s).
- type stim_start_time:
- param fit_duration:
Duration of the fit window measured from stim_start_time (s).
- type fit_duration:
- param model:
Exponential model to fit. 'mono' (default) is appropriate for most standard passive-properties protocols. 'bi' separates fast and slow time constants for multi-compartment cells.
- type model:
{'mono', 'bi'}, optional
- param tau_bounds:
(tau_min, tau_max) in seconds, passed directly as bounds to scipy.optimize.curve_fit(). Defaults to (1e-4, 1.0) (0.1 ms – 1 s), covering the physiological range of cortical and hippocampal neurons.
- type tau_bounds:
- param artifact_blanking_ms:
Duration to skip at the start of the fit window to exclude the fast capacitive artefact and the series-resistance voltage drop (ms, default 0.5 ms).
- type artifact_blanking_ms:
float, optional
- returns:
For model='mono': {'tau_ms': float, '_fit_time': list, '_fit_values': list, 'r_squared': float}. tau_ms is float('nan') when R² < 0.80 or the fit diverges.
For model='bi': {'tau_fast_ms': float, 'tau_slow_ms': float, 'amplitude_fast': float, 'amplitude_slow': float, 'V_ss': float, '_fit_time': list, '_fit_values': list}.
Returns None when fewer than 3 samples are available in the fit window after blanking.
- rtype:
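The log-linear initial-guess step (algorithm step 3) is easy to demonstrate on its own; this sketch estimates tau from a noiseless mono-exponential, standing in for the seed computation only (the full routine then refines it with scipy.optimize.curve_fit):

```python
import numpy as np

def tau_loglinear_sketch(v, t, v_ss):
    """Tau seed from a degree-1 polyfit of log|V - V_ss| vs. time."""
    resid = np.abs(v - v_ss)
    mask = resid > 0                       # log() is undefined at exactly V_ss
    slope = np.polyfit(t[mask], np.log(resid[mask]), 1)[0]
    return -1.0 / slope                    # tau in the units of t

t = np.arange(0, 0.1, 1e-4)
v = -75.0 + 10.0 * np.exp(-t / 0.02)       # V_ss = -75 mV, tau = 20 ms
tau_s = tau_loglinear_sketch(v, t, -75.0)
```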
- Synaptipy.core.analysis.passive_properties.calculate_sag_ratio(voltage_trace, time_vector, baseline_window, response_peak_window, response_steady_state_window, peak_smoothing_ms=5.0, rebound_window_ms=100.0)[source]
Calculate Sag Potential Ratio from a hyperpolarising current step.
Returns dict with keys sag_ratio, sag_percentage, v_peak, v_ss, v_baseline, rebound_depolarization.
- Synaptipy.core.analysis.passive_properties.calculate_capacitance_cc(tau_ms, rin_mohm, rs_mohm=None)[source]
Calculate Cell Capacitance (Cm) from Current-Clamp data.
When rs_mohm is provided, the series resistance is subtracted from the input resistance before computing capacitance, giving the corrected formula:
Cm = tau / (Rin - Rs)
Without rs_mohm (or when rs_mohm is None), the simpler approximation Cm = tau / Rin is used and a warning is logged to remind the caller that the result may be over-estimated.
- Parameters:
tau_ms – Membrane time constant (ms).
rin_mohm – Input resistance (MOhm).
rs_mohm – Series (access) resistance (MOhm), optional.
- Returns:
Membrane capacitance in pF, or None when inputs are invalid.
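The unit handling is the only subtlety here: ms / MΩ = nF, so multiplying by 1000 reports pF. A minimal sketch of the corrected formula (an illustration, not the library implementation):

```python
def capacitance_cc_sketch(tau_ms, rin_mohm, rs_mohm=None):
    """Cm (pF) = tau / (Rin - Rs); ms / MΩ = nF, so ×1000 converts to pF."""
    r = rin_mohm - (rs_mohm or 0.0)
    if tau_ms <= 0 or r <= 0:
        return None                        # invalid inputs
    return 1000.0 * tau_ms / r

print(capacitance_cc_sketch(tau_ms=15.0, rin_mohm=160.0, rs_mohm=10.0))  # 100.0
```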
- Synaptipy.core.analysis.passive_properties.calculate_capacitance_vc(current_trace, time_vector, baseline_window, transient_window, voltage_step_amplitude_mv)[source]
Calculate Cm and Rs from a voltage-clamp capacitive transient.
Method:
1. Rs = delta_V / I_peak (Ohm’s law at the instant of the step).
2. Fit a mono-exponential to the transient decay -> tau_transient.
3. Cm = tau_transient / Rs.
Falls back to the charge-integral (AUC) method for Cm when the exponential fit fails.
- Parameters:
current_trace – 1-D current array (pA).
time_vector – Corresponding time array (s).
baseline_window – (start, end) in seconds for pre-step baseline.
transient_window – (start, end) in seconds covering the capacitive transient.
voltage_step_amplitude_mv – Voltage command step (mV). Sign is ignored.
- Returns:
Dict with keys capacitance_pf (pF) and series_resistance_mohm (MOhm), or None on failure.
- Synaptipy.core.analysis.passive_properties.run_rmp_analysis_wrapper(data_list, time_list, sampling_rate, **kwargs)[source]
Wrapper for RMP analysis. Accepts single or multi-trial data. Returns namespaced schema.
- Synaptipy.core.analysis.passive_properties.run_sag_ratio_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for Sag Ratio analysis. Returns namespaced schema.
- Synaptipy.core.analysis.passive_properties.run_rin_analysis_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for Input Resistance analysis. Returns namespaced schema.
- Synaptipy.core.analysis.passive_properties.run_tau_analysis_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for Tau analysis. Returns namespaced schema.
- Synaptipy.core.analysis.passive_properties.run_iv_curve_wrapper(data_list, time_list, sampling_rate, **kwargs)[source]
Wrapper for multi-trial I-V Curve analysis. Returns namespaced schema.
Module 2 - Single Spike Analysis
Spike Detection and Phase Plane (dV/dt vs V) analysis.
Core Protocol Module 2: Single Spike Analysis.
Consolidates: Spike Detection, AP Characterisation (threshold, amplitude, half-width, rise/decay times, AHP) and Phase Plane (dV/dt vs V) analysis.
All registry wrapper functions return:
```python
{
    "module_used": "single_spike",
    "metrics": {...},  # flat result keys
}
```
- Synaptipy.core.analysis.single_spike.detect_spikes_threshold(data, time, threshold, refractory_samples, peak_search_window_samples=None, parameters=None, dvdt_threshold=20.0)[source]
Detect action potentials using a two-stage dV/dt-threshold crossing algorithm.
Algorithm
1. First-derivative computation: numpy.gradient() is applied to data with sample spacing dt = time[1] - time[0] (s), yielding dV/dt in mV s⁻¹.
2. dV/dt crossing detection: candidate spike onsets are identified as upward crossings of dvdt_threshold * 1000 (mV s⁻¹). Each crossing is the sample where dV/dt transitions from strictly below to at-or-above the threshold.
3. Refractory period enforcement: candidate crossings separated by fewer than refractory_samples are suppressed, retaining only the first crossing in each refractory interval (greedy forward scan).
4. Peak localisation: for each accepted onset, the voltage maximum within the next peak_search_window_samples is found. The candidate is accepted as a spike only if data[peak_idx] >= threshold (mV).
- param data:
1-D voltage array (mV).
- type data:
np.ndarray
- param time:
1-D time array aligned with data (s).
- type time:
np.ndarray
- param threshold:
Minimum voltage a candidate peak must reach to be accepted as a spike (mV). Guards against sub-threshold dV/dt transients.
- type threshold:
- param refractory_samples:
Minimum number of samples between successive accepted spike onsets. Convert from time: int(refractory_period_s * sampling_rate_hz).
- type refractory_samples:
- param peak_search_window_samples:
Number of samples to search forward from each onset crossing for the voltage peak. Defaults to refractory_samples when None.
- type peak_search_window_samples:
int, optional
- param parameters:
Arbitrary parameter dict stored verbatim in the returned SpikeTrainResult for provenance.
- type parameters:
dict, optional
- param dvdt_threshold:
dV/dt threshold for onset detection (V s⁻¹, default 20.0). Converted internally to mV s⁻¹ by multiplication with 1000.
- type dvdt_threshold:
float, optional
- returns:
Attributes populated on success:
- value (int) – total spike count.
- spike_times (np.ndarray) – peak times (s).
- spike_indices (np.ndarray) – peak sample indices.
- mean_frequency (float) – mean instantaneous firing rate (n_spikes - 1) / (t_last - t_first) (Hz); 0.0 for ≤ 1 spike.
- is_valid (bool) – False when input arrays are malformed.
- rtype:
SpikeTrainResult
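The four algorithm stages above can be sketched end-to-end in plain NumPy (a simplified stand-in for detect_spikes_threshold; the two triangular "spikes" in the test trace are synthetic):

```python
import numpy as np

def detect_spikes_sketch(data, time, threshold, refractory_samples,
                         peak_window=None, dvdt_threshold=20.0):
    """Two-stage detection: dV/dt onset crossings, refractory pruning, peak check."""
    dt = time[1] - time[0]
    dvdt = np.gradient(data, dt)                       # mV/s
    thr = dvdt_threshold * 1000.0                      # V/s -> mV/s
    cross = np.flatnonzero((dvdt[:-1] < thr) & (dvdt[1:] >= thr)) + 1
    peak_window = peak_window or refractory_samples
    peaks, last = [], -refractory_samples
    for idx in cross:
        if idx - last < refractory_samples:            # greedy refractory scan
            continue
        seg = data[idx: idx + peak_window]
        p = idx + int(np.argmax(seg))
        if data[p] >= threshold:                       # voltage gate on the peak
            peaks.append(p)
            last = idx
    return np.array(peaks)

rate = 10_000.0
v = np.full(1000, -65.0)
tri = np.concatenate([np.linspace(-65, 20, 11), np.linspace(20, -65, 11)[1:]])
for start in (300, 600):                               # two synthetic 2 ms spikes
    v[start:start + tri.size] = tri
t = np.arange(v.size) / rate
spikes = detect_spikes_sketch(v, t, threshold=-20.0, refractory_samples=20)
```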
- Synaptipy.core.analysis.single_spike.calculate_spike_features(data, time, spike_indices, dvdt_threshold=20.0, ahp_window_sec=0.05, onset_lookback=0.01, fahp_window_ms=(1.0, 5.0), mahp_window_ms=(10.0, 50.0))[source]
Calculate detailed features for each detected spike (vectorised NumPy).
Returns list of dicts per spike: ap_threshold, amplitude, half_width, rise_time_10_90, decay_time_90_10, fahp_depth, mahp_depth, ahp_duration_half, adp_amplitude, max_dvdt, min_dvdt.
AP threshold is detected via the peak of d2V/dt2 in the pre-spike lookback window (maximum curvature method). Falls back to the first dV/dt crossing above dvdt_threshold when d2V/dt2 gives a boundary result.
- Parameters:
data – 1-D voltage array (mV).
time – Corresponding time array (s).
spike_indices – Array of sample indices for each spike peak.
dvdt_threshold – Fallback dV/dt threshold for AP onset (V/s).
ahp_window_sec – Duration of AHP/ADP search window (s).
onset_lookback – Lookback window before each spike peak (s).
fahp_window_ms – (start, end) of fast-AHP window after peak (ms).
mahp_window_ms – (start, end) of medium-AHP window after peak (ms).
- Synaptipy.core.analysis.single_spike.calculate_isi(spike_times)[source]
Return inter-spike intervals from spike_times array.
- Synaptipy.core.analysis.single_spike.analyze_multi_sweep_spikes(data_trials, time_vector, threshold, refractory_samples, dvdt_threshold=20.0)[source]
Detect spikes across multiple sweeps.
- Synaptipy.core.analysis.single_spike.calculate_dvdt(voltage, sampling_rate, sigma_ms=0.1)[source]
Calculate dV/dt (V/s) with optional Savitzky-Golay smoothing.
Computes the raw derivative first, then applies a Savitzky-Golay filter (polynomial order 3) directly to the derivative array. This preserves the true max dV/dt better than pre-smoothing the voltage with a Gaussian, which attenuates the sharp upstroke of action potentials.
- Parameters:
voltage – 1D voltage array (mV).
sampling_rate – Sampling rate (Hz).
sigma_ms – Smoothing window (ms). The SG window length is derived as max(5, int(sigma_ms / 1000 * sampling_rate)), rounded up to the next odd integer. Set to 0 for no smoothing.
- Returns:
1D array of dV/dt in V/s.
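The differentiate-then-smooth order described above can be sketched with SciPy's Savitzky-Golay filter (a simplified stand-in for calculate_dvdt; the window-length derivation follows the parameter description above):

```python
import numpy as np
from scipy.signal import savgol_filter

def dvdt_sketch(voltage_mv, sampling_rate_hz, sigma_ms=0.1):
    """dV/dt in V/s: differentiate first, then smooth the derivative with
    Savitzky-Golay (polyorder 3) so the AP upstroke is not attenuated."""
    dvdt = np.gradient(voltage_mv, 1.0 / sampling_rate_hz) / 1000.0  # mV/s -> V/s
    if sigma_ms <= 0:
        return dvdt
    win = max(5, int(sigma_ms / 1000.0 * sampling_rate_hz))
    win += (win + 1) % 2                   # round up to the next odd integer
    return savgol_filter(dvdt, window_length=win, polyorder=3)

ramp = np.arange(100, dtype=float)         # 1 mV per sample at 10 kHz
d = dvdt_sketch(ramp, 10_000.0)            # constant 10 V/s
```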
- Synaptipy.core.analysis.single_spike.get_phase_plane_trajectory(voltage, sampling_rate, sigma_ms=0.1)[source]
Return (voltage, dvdt) phase-plane trajectory.
- Synaptipy.core.analysis.single_spike.detect_threshold_kink(voltage, sampling_rate, dvdt_threshold=20.0, kink_slope=10.0, search_window_ms=5.0, peak_indices=None)[source]
Detect AP threshold using the dV/dt kink method.
Returns array of threshold indices.
- Synaptipy.core.analysis.single_spike.run_spike_detection_wrapper(data, time, sampling_rate, threshold=-20.0, refractory_period=0.002, peak_search_window=0.005, dvdt_threshold=20.0, ahp_window=0.05, onset_lookback=0.01, **kwargs)[source]
Wrapper for spike detection. Returns namespaced schema.
- Synaptipy.core.analysis.single_spike.phase_plane_analysis_wrapper(voltage, time, sampling_rate, sigma_ms=0.1, dvdt_threshold=20.0, **kwargs)[source]
Wrapper for Phase Plane analysis. Returns namespaced schema.
- Synaptipy.core.analysis.single_spike.phase_plane_analysis(voltage, time, sampling_rate, sigma_ms=0.1, dvdt_threshold=20.0, **kwargs)
Wrapper for Phase Plane analysis. Returns namespaced schema.
Module 3 - Firing Dynamics
Excitability (F-I Curve), Burst Analysis, and Spike Train Dynamics.
Core Protocol Module 3: Firing Dynamics.
Consolidates: Excitability (F-I curve), Burst Analysis, and Spike Train Dynamics into one self-contained module.
All registry wrapper functions return:
```python
{
    "module_used": "firing_dynamics",
    "metrics": {...},  # flat result keys
}
```
- Synaptipy.core.analysis.firing_dynamics.calculate_fi_curve(sweeps, time_vectors, current_steps=None, threshold=-20.0, refractory_ms=2.0)[source]
Calculate F-I Curve properties from a set of sweeps.
- Parameters:
sweeps – List of voltage traces (1D arrays).
time_vectors – List of corresponding time vectors.
current_steps – List of current amplitudes for each sweep. If None, inferred.
threshold – Spike detection threshold (mV).
refractory_ms – Refractory period (ms).
- Returns:
Dictionary with rheobase_pa, fi_slope, max_freq, spike_counts, frequencies, adaptation_ratios, current_steps.
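A sketch of how the F-I summary values are derived once spikes have been counted per sweep (a simplification: calculate_fi_curve starts from raw sweeps, whereas this stand-in takes pre-computed spike counts and assumes ascending current steps):

```python
import numpy as np

def fi_curve_sketch(spike_counts, current_steps_pa, duration_s=1.0):
    """Rheobase, F-I slope and max rate from per-sweep spike counts."""
    counts = np.asarray(spike_counts, dtype=float)
    steps = np.asarray(current_steps_pa, dtype=float)   # assumed ascending
    freqs = counts / duration_s                         # Hz
    firing = counts > 0
    rheobase = steps[firing][0] if firing.any() else None
    # Slope (Hz/pA) from a linear fit over the supra-threshold points.
    slope = np.polyfit(steps[firing], freqs[firing], 1)[0] if firing.sum() >= 2 else None
    return {"rheobase_pa": rheobase, "fi_slope": slope, "max_freq": freqs.max()}

res = fi_curve_sketch([0, 0, 2, 6, 10], [-50, 0, 50, 100, 150])
```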
- Synaptipy.core.analysis.firing_dynamics.run_excitability_analysis_wrapper(data_list, time_list, sampling_rate, **kwargs)[source]
Wrapper for Excitability Analysis (F-I Curve).
- Synaptipy.core.analysis.firing_dynamics.calculate_bursts_logic(spike_times, max_isi_start=0.01, max_isi_end=0.2, min_spikes=2, dynamic_burst=False, burst_isi_fraction=0.3, parameters=None)[source]
Detect bursts in a spike train.
- Parameters:
spike_times – 1D array of spike times (seconds).
max_isi_start – Max ISI to start a burst (s). Ignored when dynamic_burst=True.
max_isi_end – Max ISI to continue a burst (s). Ignored when dynamic_burst=True.
min_spikes – Minimum spikes per burst.
dynamic_burst – When True, compute the mean ISI of the whole train and define the burst boundary as burst_isi_fraction * mean_isi. This abandons hardcoded thresholds in favour of the train’s own temporal structure.
burst_isi_fraction – Fraction of mean ISI used as burst boundary when dynamic_burst=True (default 0.3, i.e. 30 %).
- Returns:
BurstResult object.
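The dynamic-boundary variant can be sketched as follows (a simplified stand-in for calculate_bursts_logic that returns lists of spike times instead of a BurstResult; the six-spike train is synthetic):

```python
import numpy as np

def detect_bursts_sketch(spike_times, burst_isi_fraction=0.3, min_spikes=2):
    """Dynamic burst detection: ISIs below burst_isi_fraction * mean_isi
    glue consecutive spikes into the same burst."""
    spike_times = np.asarray(spike_times, dtype=float)
    if spike_times.size < min_spikes:
        return []
    isis = np.diff(spike_times)
    boundary = burst_isi_fraction * isis.mean()
    bursts, current = [], [spike_times[0]]
    for t_next, isi in zip(spike_times[1:], isis):
        if isi <= boundary:
            current.append(t_next)
        else:
            if len(current) >= min_spikes:
                bursts.append(current)
            current = [t_next]
    if len(current) >= min_spikes:
        bursts.append(current)
    return bursts

# Two 3-spike bursts (10 ms ISIs) separated by a 1 s pause.
train = [0.0, 0.01, 0.02, 1.02, 1.03, 1.04]
bursts = detect_bursts_sketch(train)
```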
- Synaptipy.core.analysis.firing_dynamics.analyze_spikes_and_bursts(data, time, sampling_rate, threshold, max_isi_start, max_isi_end, refractory_ms=2.0, dynamic_burst=False, burst_isi_fraction=0.3, parameters=None)[source]
Detect spikes then detect bursts.
- Synaptipy.core.analysis.firing_dynamics.run_burst_analysis_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for Burst Analysis.
- class Synaptipy.core.analysis.firing_dynamics.TrainDynamicsResult(value, unit, is_valid=True, error_message=None, quality_flags=<factory>, metadata=<factory>, spike_count=0, mean_isi_s=None, cv=None, cv2=None, lv=None, adaptation_index=None, isis=None, parameters=<factory>)[source]
Bases:
AnalysisResult

Result object for spike train dynamics analysis.
- spike_count = 0
- mean_isi_s = None
- cv = None
- cv2 = None
- lv = None
- adaptation_index = None
- isis = None
- parameters
- Synaptipy.core.analysis.firing_dynamics.calculate_train_dynamics(spike_times)[source]
Compute native spike train statistical metrics.
- Parameters:
spike_times – 1D NumPy array of spike times in seconds.
- Returns:
TrainDynamicsResult.
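The cv, cv2 and lv fields of TrainDynamicsResult can be sketched assuming the conventional definitions of these ISI-variability metrics (a stand-in returning a dict; a perfectly regular train should score ~0 on all three):

```python
import numpy as np

def train_dynamics_sketch(spike_times):
    """CV, CV2 and LV of the ISI sequence (conventional definitions)."""
    isis = np.diff(np.asarray(spike_times, dtype=float))
    if isis.size < 2:
        return None
    cv = isis.std() / isis.mean()                       # global variability
    pair_sum = isis[:-1] + isis[1:]
    pair_diff = isis[1:] - isis[:-1]
    cv2 = np.mean(2.0 * np.abs(pair_diff) / pair_sum)   # local, adjacent-pair
    lv = 3.0 * np.mean((pair_diff / pair_sum) ** 2)     # local variation
    return {"cv": cv, "cv2": cv2, "lv": lv}

regular = np.arange(0, 1, 0.1)          # perfectly regular 10-spike train
res = train_dynamics_sketch(regular)
```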
Module 4 - Synaptic Events
Threshold-based detection, Template Match detection, and Baseline-Peak detection.
Core Protocol Module 4: Synaptic Events.
Consolidates all synaptic event detection methods (adaptive threshold, template matching, baseline-peak-kinetics) from event_detection.py into one self-contained module.
All registry wrapper functions return:
```python
{
    "module_used": "synaptic_events",
    "metrics": {...},  # flat result keys
}
```
Exports detect_minis_threshold as a backward-compatibility alias.
- Synaptipy.core.analysis.synaptic_events.find_quiescent_baseline_rms(data, sample_rate, window_ms=20.0)[source]
Identify the quietest (minimum-variance) segment in a trace via a sliding window and return its RMS as the noise floor.
Unlike a fixed pre-trace window (e.g. trace[0:50]), this approach is robust to recordings with spontaneous activity at the start: the search considers the entire trace, selecting the 20 ms chunk with the smallest variance regardless of its position.
- Parameters:
data – 1D signal array (mV or pA).
sample_rate – Sampling rate (Hz).
window_ms – Duration of the sliding window (ms, default 20).
- Returns:
Tuple of (rms_noise_floor, (start_idx, end_idx)) where the indices define the quiescent window used for the RMS calculation.
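The sliding minimum-variance search can be sketched as follows (a simplified stand-in: the 75 % overlap step size is an assumption of this sketch, and the RMS is taken about the segment's own mean; the quiet patch in the test trace is synthetic):

```python
import numpy as np

def quiescent_rms_sketch(data, sample_rate, window_ms=20.0):
    """Slide a window over the trace, pick the minimum-variance segment,
    and return its RMS (about its own mean) as the noise floor."""
    n = max(2, int(window_ms / 1000.0 * sample_rate))
    best_var, best_start = np.inf, 0
    for start in range(0, data.size - n + 1, max(1, n // 4)):  # 75 % overlap steps
        var = data[start:start + n].var()
        if var < best_var:
            best_var, best_start = var, start
    seg = data[best_start:best_start + n]
    rms = float(np.sqrt(np.mean((seg - seg.mean()) ** 2)))
    return rms, (best_start, best_start + n)

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 5.0, 2000)                 # noisy trace at 10 kHz
trace[1000:1300] = rng.normal(0.0, 0.1, 300)       # quiet 30 ms patch
rms, (lo, hi) = quiescent_rms_sketch(trace, 10_000.0)
```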
- Synaptipy.core.analysis.synaptic_events.calculate_event_charge_dynamic(data, event_index, sample_rate, local_baseline, polarity='negative', max_duration_ms=100.0)[source]
Integrate event charge (area under curve) with a dynamic boundary.
The integration ends at whichever comes first:
1. The signal returns to local_baseline (event complete).
2. A large derivative transient indicates the onset of a subsequent summating event (onset detected as dV/dt > 3x the noise in the early derivative).
- Parameters:
data – 1D signal array.
event_index – Sample index of the event peak.
sample_rate – Sampling rate (Hz).
local_baseline – Local baseline voltage/current level.
polarity – "negative" or "positive".
max_duration_ms – Hard cap on integration window (ms).
- Returns:
Signed charge (area under curve relative to baseline, in units of data * seconds, e.g. mV·s or pA·s).
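The dynamic integration boundary can be sketched as follows (illustrative; this simplified version stops only on baseline return or the hard cap, omitting the derivative-transient check):

```python
import numpy as np

def event_charge_sketch(data, event_index, sample_rate, local_baseline,
                        polarity="negative", max_duration_ms=100.0):
    """Integrate signed area under an event until baseline return (illustrative)."""
    data = np.asarray(data, dtype=float)
    sign = -1.0 if polarity == "negative" else 1.0
    max_n = int(sample_rate * max_duration_ms / 1000.0)
    # Rectify so the event deflection is positive regardless of polarity
    seg = sign * (data[event_index:event_index + max_n] - local_baseline)
    crossed = np.nonzero(seg <= 0.0)[0]          # first return to baseline
    end = int(crossed[0]) if crossed.size else seg.size
    dt = 1.0 / sample_rate
    return sign * float(seg[:end].sum()) * dt    # signed charge, units * s
```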
- Synaptipy.core.analysis.synaptic_events.fit_biexponential_decay(data, event_index, sample_rate, local_baseline, polarity='negative', fit_window_ms=80.0)[source]
Fit mono- and bi-exponential decays to a synaptic event.
Tries a bi-exponential fit first. Falls back to mono-exponential when the bi-exp fit does not converge or yields non-physical parameters (negative amplitudes or time constants).
Bi-exponential model (relative to baseline):
f(t) = A_fast * exp(-t / tau_fast) + A_slow * exp(-t / tau_slow)
with A_fast + A_slow normalised by the event peak amplitude at t=0.
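The mono-exponential fallback can be illustrated with a log-linear fit (a sketch, not the library's curve-fitting code; it assumes a baseline-subtracted, strictly positive decay segment):

```python
import numpy as np

def mono_decay_tau_sketch(decay, sample_rate):
    """Estimate a mono-exponential time constant via a log-linear fit (illustrative)."""
    decay = np.asarray(decay, dtype=float)
    t = np.arange(decay.size) / sample_rate
    # log(A * exp(-t/tau)) = log(A) - t/tau, so the slope is -1/tau
    slope, _ = np.polyfit(t, np.log(decay), 1)
    return -1000.0 / slope  # tau in ms
```

A least-squares fit of the bi-exponential model itself requires nonlinear optimisation (e.g. scipy.optimize.curve_fit), which is why a mono-exp fallback is useful when that fit fails to converge.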
- Parameters:
data – 1-D signal array.
event_index – Sample index of the event peak.
sample_rate – Sampling rate (Hz).
local_baseline – Local baseline level (same units as data).
polarity – "negative" or "positive".
fit_window_ms – Maximum duration of the decay segment to fit (ms).
- Returns:
tau_mono_ms – mono-exp time constant (ms)
tau_fast_ms – fast component time constant (ms); None if bi-exp did not converge
tau_slow_ms – slow component time constant (ms); None if bi-exp did not converge
bi_exp_converged – True when bi-exp fit was accepted
decay_fit_error – None when fit succeeded; error message string otherwise
- Return type:
Dict with keys
- Synaptipy.core.analysis.synaptic_events.compute_local_pre_event_baseline(data, event_indices, sample_rate, pre_event_window_ms=2.0, polarity='negative')[source]
Compute a local pre-event baseline voltage for each detected event.
For summating synaptic events that ride on the decay of a previous event, the global resting potential is a poor amplitude reference. This function searches the pre_event_window_ms immediately preceding each event peak and returns the local “foot” voltage:
Negative polarity: the maximum (most depolarised) voltage in the search window - i.e. the point before the hyperpolarising/inward current event begins to deflect the trace.
Positive polarity: the minimum (most hyperpolarised) voltage.
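A minimal sketch of the local-foot search (illustrative, not the exact implementation):

```python
import numpy as np

def local_baseline_sketch(data, event_indices, sample_rate,
                          pre_event_window_ms=2.0, polarity="negative"):
    """Per-event 'foot' value from the window just before each peak (illustrative)."""
    data = np.asarray(data, dtype=float)
    win = max(1, int(sample_rate * pre_event_window_ms / 1000.0))
    # Negative events: the foot is the most depolarised point before the deflection
    pick = np.max if polarity == "negative" else np.min
    out = []
    for idx in event_indices:
        start = max(0, int(idx) - win)
        window = data[start:idx] if idx > start else data[idx:idx + 1]
        out.append(pick(window))
    return np.asarray(out, dtype=float)
```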
- Parameters:
data – 1D voltage/current array.
event_indices – Integer indices of detected event peaks.
sample_rate – Sampling rate in Hz.
pre_event_window_ms – Duration (ms) of the search window before each peak (default 2.0 ms, valid range 1-3 ms recommended).
polarity – “negative” or “positive”.
- Returns:
1D float array of local baseline values, one per event.
- Synaptipy.core.analysis.synaptic_events.calculate_paired_pulse_ratio(data, time, stimulus_onsets_s, sample_rate, response_window_ms=20.0, polarity='negative')[source]
Calculate the Paired-Pulse Ratio (PPR) with residual decay correction.
A naive amplitude ratio (P2/P1) is contaminated by the unresolved decay tail of the first response. This function:
Fits a mono-exponential decay to the P1 tail.
Evaluates the fitted tail at the P2 onset to obtain the residual baseline offset.
Measures P2 amplitude relative to the corrected (residual-subtracted) baseline instead of the global baseline.
PPR = A2_corrected / A1
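The residual correction reduces to simple arithmetic once the P1 tail fit is known. A sketch (hypothetical helper, not part of the API):

```python
import numpy as np

def ppr_corrected_sketch(a1, tau_s, dt_s, p2_measured):
    """Residual-decay-corrected PPR arithmetic (illustrative).

    a1          -- first response amplitude
    tau_s       -- decay time constant fitted to the P1 tail (s)
    dt_s        -- interpulse interval (s)
    p2_measured -- P2 amplitude measured against the global baseline
    """
    residual = a1 * np.exp(-dt_s / tau_s)    # predicted P1 tail at P2 onset
    a2_corrected = p2_measured - residual    # remove contamination from P1
    return a2_corrected / a1, p2_measured / a1   # (ppr, ppr_naive)
```

Note how short interpulse intervals (dt_s comparable to tau_s) leave a large residual, so the naive ratio overstates facilitation.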
- Parameters:
data (np.ndarray) – 1-D signal array (pA or mV).
time (np.ndarray) – 1-D time array (seconds), same length as data.
stimulus_onsets_s (np.ndarray) – 1-D array of stimulus onset times (seconds). At least two are required.
sample_rate (float) – Sampling rate (Hz).
response_window_ms (float) – Duration (ms) of the post-stimulus window searched for the peak.
polarity (str) – "negative" for inward currents/hyperpolarisations; "positive" for depolarisations.
- Returns:
Keys:
ppr – corrected P2/P1 amplitude ratio
ppr_naive – uncorrected P2/P1 ratio (no residual subtraction)
amplitude_p1 – P1 amplitude relative to pre-stimulus baseline
amplitude_p2_corrected – P2 amplitude after residual subtraction
amplitude_p2_naive – P2 amplitude relative to global baseline
residual_at_p2 – predicted P1 tail amplitude at P2 onset
tau_p1_ms – time constant of the P1 decay fit (ms)
interpulse_interval_ms – interval between S1 and S2 (ms)
error – error message string when calculation fails
- Return type:
Dict
- Synaptipy.core.analysis.synaptic_events.detect_events_threshold(data, time, threshold, polarity='negative', refractory_period=0.002, rolling_baseline_window_ms=100.0, artifact_mask=None, use_quiescent_noise_floor=True, quiescent_window_ms=20.0)[source]
Detect events using topological prominence to handle shifting baselines.
By default uses a quiescent-noise-floor estimate: the RMS of the minimum-variance 20 ms chunk in the trace is used to set a dynamic noise threshold, preventing false positives even when spontaneous activity dominates the beginning of the recording.
- Synaptipy.core.analysis.synaptic_events.detect_minis_threshold(data, time, threshold, polarity='negative', refractory_period=0.002, rolling_baseline_window_ms=100.0, artifact_mask=None, use_quiescent_noise_floor=True, quiescent_window_ms=20.0)
Detect events using topological prominence to handle shifting baselines.
By default uses a quiescent-noise-floor estimate: the RMS of the minimum-variance 20 ms chunk in the trace is used to set a dynamic noise threshold, preventing false positives even when spontaneous activity dominates the beginning of the recording.
- Synaptipy.core.analysis.synaptic_events.run_event_detection_threshold_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for adaptive threshold event detection.
- Synaptipy.core.analysis.synaptic_events.detect_events_template(data, sampling_rate, threshold_std, tau_rise, tau_decay, polarity='negative', rolling_baseline_window_ms=100.0, artifact_mask=None, time=None, min_event_distance_ms=0.0)[source]
Detect events using a multi-kernel matched-filter bank.
Three kernels are built using the specified tau_rise and tau_decay × 1, 2, 3 to tolerate dendritic filtering that prolongs event decay (Cable theory predicts a ~2-3× slowdown for distal inputs). A combined z-score trace (pointwise maximum across the three filtered traces) is used for peak detection, improving sensitivity to both somatic and dendritic events.
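The kernel-bank idea can be sketched as follows (illustrative; kernel length, normalisation, and alignment details are assumptions, not Synaptipy's exact code):

```python
import numpy as np

def matched_filter_bank_sketch(data, sampling_rate, tau_rise, tau_decay):
    """Pointwise-max z-score across three biexponential kernels (decay x1, x2, x3)."""
    data = np.asarray(data, dtype=float)
    combined = np.full(data.size, -np.inf)
    for mult in (1, 2, 3):
        td = tau_decay * mult
        t = np.arange(0.0, 5.0 * td, 1.0 / sampling_rate)
        kernel = np.exp(-t / td) - np.exp(-t / tau_rise)
        kernel /= np.linalg.norm(kernel)                   # unit-energy template
        score = np.correlate(data, kernel, mode="valid")   # index = template onset
        z = (score - score.mean()) / score.std()
        z = np.pad(z, (0, data.size - z.size), constant_values=-np.inf)
        combined = np.maximum(combined, z)                 # pointwise maximum
    return combined
```

A somatic-like event matches the x1 kernel while a dendritically filtered one matches x2 or x3, so the combined trace stays sensitive to both.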
- Synaptipy.core.analysis.synaptic_events.run_event_detection_template_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for template-matching event detection.
- Synaptipy.core.analysis.synaptic_events.detect_events_baseline_peak_kinetics(data, sample_rate, direction='negative', baseline_window_s=0.5, baseline_step_s=0.1, threshold_sd_factor=3.0, filter_freq_hz=None, min_event_separation_ms=5.0, auto_baseline=True, rolling_baseline_window_ms=0.0)[source]
Detect events via stable-baseline estimation then prominence-based peak finding.
Module 5 - Evoked Responses
Evoked Sync (TTL-gated stimulus correlation, latency, response probability, and jitter), Paired-Pulse Ratio with residual decay subtraction, and Stimulus Train short-term plasticity (STP) analysis.
Core Protocol Module 5: Evoked Responses.
Consolidates optogenetic stimulus synchronization (TTL-gated latency, probability, jitter analysis) from optogenetics.py.
All registry wrapper functions return:
{
"module_used": "evoked_responses",
"metrics": { ... flat result keys ... }
}
- class Synaptipy.core.analysis.evoked_responses.OptoSyncResult(value, unit, is_valid=True, error_message=None, quality_flags=<factory>, metadata=<factory>, optical_latency_ms=None, response_probability=None, spike_jitter_ms=None, stimulus_count=0, success_count=0, failure_count=0, stimulus_onsets=None, stimulus_offsets=None, responding_spikes=<factory>, parameters=<factory>)[source]
Bases:
AnalysisResult
Result object for optogenetic synchronization analysis.
- optical_latency_ms = None
- response_probability = None
- spike_jitter_ms = None
- stimulus_count = 0
- success_count = 0
- failure_count = 0
- stimulus_onsets = None
- stimulus_offsets = None
- responding_spikes
- parameters
- Synaptipy.core.analysis.evoked_responses.extract_ttl_epochs(ttl_data, time, threshold=2.5, auto_threshold=True)[source]
Extract rising and falling edges of a digital TTL signal.
- Returns:
Tuple of (onsets, offsets) arrays in seconds.
- Synaptipy.core.analysis.evoked_responses.calculate_optogenetic_sync(ttl_data, action_potential_times, time, ttl_threshold=2.5, response_window_ms=20.0)[source]
Correlate TTL stimuli with action potential times.
- Parameters:
ttl_data – Digital signal data trace.
action_potential_times – Pre-calculated spike/event times (seconds).
time – Timestamps of the trace.
ttl_threshold – Voltage threshold for TTL edge detection.
response_window_ms – Search window for APs after stimulus onset (ms).
- Returns:
OptoSyncResult.
- Synaptipy.core.analysis.evoked_responses.calculate_paired_pulse_ratio(data, time, stim1_onset_s, stim2_onset_s, response_window_ms=20.0, baseline_window_ms=5.0, fit_decay_from_ms=5.0, fit_decay_window_ms=30.0, polarity='negative', artifact_blanking_ms=1.0)[source]
Calculate Paired-Pulse Ratio with residual decay subtraction.
Without subtracting the residual exponential decay of the first event under the second stimulus window, the measured amplitude of the second response is artificially inflated (facilitation) or deflated (depression), yielding biologically invalid PPR values.
Algorithm:
Measure amplitude of response 1 (R1) relative to its local pre-stimulus baseline.
Fit a mono-exponential decay to the tail of R1 (from fit_decay_from_ms to fit_decay_window_ms after stim1_onset).
Extrapolate the decay curve to estimate the residual baseline level at stim2_onset.
Measure amplitude of response 2 (R2_raw) relative to its own pre-stimulus sample.
Subtract the residual decay value from R2_raw to obtain R2_corrected.
Return paired_pulse_ratio = R2_corrected / R1.
- Parameters:
data – 1-D voltage/current array (mV or pA).
time – 1-D time array (s).
stim1_onset_s – Time of first stimulus onset (s).
stim2_onset_s – Time of second stimulus onset (s).
response_window_ms – Duration after each stimulus to search for peak (ms).
baseline_window_ms – Pre-stimulus baseline window (ms) to compute local baseline for each response.
fit_decay_from_ms – Offset from stim1_onset to start fitting decay (ms). Should be after the initial transient.
fit_decay_window_ms – Window duration for decay fit (ms).
polarity – "negative" (inward/downward events, e.g. EPSCs) or "positive".
artifact_blanking_ms – Duration (ms) after each stimulus onset to ignore when searching for the peak response (default 1.0). Prevents the stimulus shock-wave artefact from being identified as the biological response peak.
- Returns:
r1_amplitude – amplitude of first response (baseline-subtracted)
r2_amplitude_raw – raw amplitude of second response
r2_amplitude_corrected – R2 after subtracting residual decay
residual_at_stim2 – estimated residual baseline at stim2_onset
paired_pulse_ratio – R2_corrected / R1
decay_tau_ms – time constant of first event decay (ms)
ppr_error – None on success; error string on failure
- Return type:
Dict with keys
- Synaptipy.core.analysis.evoked_responses.run_opto_sync_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for optogenetic synchronization analysis.
Correlates TTL/optical stimulus pulses with detected events.
- Synaptipy.core.analysis.evoked_responses.run_ppr_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for Paired-Pulse Ratio analysis with optional TTL-based onset detection.
- Synaptipy.core.analysis.evoked_responses.calculate_stimulus_train_stp(data, time, stim_onsets, polarity='negative', response_window_ms=20.0, baseline_window_ms=5.0, artifact_blanking_ms=1.0)[source]
Compute short-term plasticity (STP) amplitudes for a stimulus train.
For each stimulus onset the function measures a baseline immediately preceding the stimulus, then finds the peak response in a post-stimulus window (after artifact blanking). Amplitudes are normalised to R1 to yield the STP profile.
- Parameters:
data – 1-D voltage or current trace.
time – 1-D time vector (seconds, same length as data).
stim_onsets – Stimulus onset times in seconds, ordered chronologically.
polarity – "negative" for inward/hyperpolarising events, "positive" for outward/depolarising events.
response_window_ms – Duration of the post-stimulus peak-search window in milliseconds.
baseline_window_ms – Duration of the pre-stimulus baseline window in milliseconds.
artifact_blanking_ms – Data within this interval after each onset are excluded from peak detection.
- Returns:
Dictionary with keys amplitudes, amplitudes_norm, pulse_numbers, stim_onsets and descriptive metric keys.
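The per-pulse measurement loop described above can be sketched like this (an illustrative stand-in, not the library function):

```python
import numpy as np

def stp_profile_sketch(data, time, stim_onsets, sample_rate, polarity="negative",
                       response_window_ms=20.0, baseline_window_ms=5.0,
                       artifact_blanking_ms=1.0):
    """Baseline-subtracted per-pulse amplitudes normalised to R1 (illustrative)."""
    data = np.asarray(data, dtype=float)
    amps = []
    for onset in stim_onsets:
        i = int(np.searchsorted(time, onset))
        # Local baseline just before the stimulus
        b0 = max(0, i - int(sample_rate * baseline_window_ms / 1000.0))
        baseline = data[b0:i].mean() if i > b0 else data[i]
        # Peak search after the artifact-blanking interval
        j0 = i + int(sample_rate * artifact_blanking_ms / 1000.0)
        j1 = i + int(sample_rate * response_window_ms / 1000.0)
        seg = data[j0:j1] - baseline
        amps.append(seg.min() if polarity == "negative" else seg.max())
    amps = np.asarray(amps)
    return amps, amps / amps[0]   # (amplitudes, amplitudes_norm)
```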
- Synaptipy.core.analysis.evoked_responses.run_stimulus_train_stp_wrapper(data, time, sampling_rate, **kwargs)[source]
Wrapper for Stimulus Train STP analysis.
Stimulus times are either detected from an optional TTL/trigger channel or generated from a user-supplied frequency and start time. For each pulse the wrapper measures a baseline-subtracted peak amplitude and normalises the series to R1 to produce the STP profile.
Analysis Registry
The central decorator-based registry that maps named analysis functions to the GUI and batch engine.
Analysis Registry for dynamic function registration and lookup.
This module provides a registry pattern that allows analysis functions to register themselves via decorators, enabling flexible pipeline configuration.
- class Synaptipy.core.analysis.registry.AnalysisRegistry[source]
Bases:
object
Registry for analysis functions.
Functions can be registered using the @AnalysisRegistry.register decorator, and then retrieved by name for use in batch processing pipelines.
- classmethod register(name, type='analysis', **kwargs)[source]
Decorator to register an analysis or preprocessing function.
- Parameters:
name – Unique identifier for the function (e.g., “spike_detection”)
type – The type of function (“analysis” or “preprocessing”)
**kwargs – Additional metadata to store with the function (e.g., ui_params)
- Returns:
Decorator function
Example:
@AnalysisRegistry.register("spike_detection", ui_params=[...])
def run_spike_detection(data, time, sampling_rate, **kwargs):
    # ... analysis logic ...
    return results_dict
- classmethod register_processor(name, **kwargs)[source]
Decorator to register a preprocessing function. Alias for register(name, type="preprocessing", **kwargs).
- classmethod get_function(name)[source]
Retrieve a registered analysis function by name.
- Parameters:
name – The registered name of the function
- Returns:
The registered function, or None if not found
- classmethod get_metadata(name)[source]
Retrieve metadata for a registered analysis function.
- Parameters:
name – The registered name of the function
- Returns:
Dictionary of metadata, or empty dict if not found
- classmethod list_registered()[source]
Get a list of all registered analysis function names.
- Returns:
List of registered function names
- classmethod list_by_type(type_str)[source]
Get registered function names filtered by type.
- Parameters:
type_str – The type to filter by (e.g., “analysis”, “preprocessing”)
- Returns:
List of function names matching the given type
- classmethod list_analysis()[source]
Get all registered analysis function names (excludes preprocessing).
- classmethod mark_core_snapshot()[source]
Record the current registry keys as the immutable core set.
Call this once after importing the built-in analysis package but before loading any external plugins.
unregister_plugins() uses this snapshot to know which entries must never be removed.
- classmethod unregister_plugins()[source]
Remove all analyses that are NOT part of the core package.
Safe to call multiple times. Only affects entries added after the last mark_core_snapshot() call (i.e. plugin-contributed entries).
Usage Example - Registry
import Synaptipy.core.analysis # triggers all @register decorators
from Synaptipy.core.analysis.registry import AnalysisRegistry
names = AnalysisRegistry.list_registered()
meta = AnalysisRegistry.get_metadata("sag_ratio_analysis")
func = AnalysisRegistry.get_function("sag_ratio_analysis")
result = func(data, time, sampling_rate, baseline_start=0.0, baseline_end=0.1)
Batch Processing
Batch Analysis Engine for Synaptipy. Handles processing multiple files and aggregating results using a flexible registry-based pipeline.
The engine uses a registry-based architecture where analysis functions register themselves via decorators, and the pipeline configuration defines what analyses to run on which data scopes.
Output Design Principles
Every row is fully traceable to its source (file, channel, trial, analysis).
Metadata columns appear first; analysis results in the middle; internal/debug last.
Scalar results live in their own columns; array values are summarised for tabular compatibility (Excel, Origin, R, MATLAB) and the raw arrays are kept under private
_-prefixed keys that are stripped during CSV export.
Channel physical units are always recorded so downstream scripts can auto-label axes.
Recording-level metadata (protocol, duration, session time) is propagated when available.
Author: Anzal K Shahul <anzal.ks@gmail.com>
- class Synaptipy.core.analysis.batch_engine.BatchAnalysisEngine(neo_adapter=None, max_workers=1)[source]
Bases:
object
Engine for running analysis across multiple files/recordings using a flexible pipeline.
The engine uses a registry-based architecture where analysis functions register themselves via decorators, and the pipeline configuration defines what analyses to run on which data scopes.
Example:
engine = BatchAnalysisEngine()
files = [Path("file1.abf"), Path("file2.abf")]
pipeline = [
    {
        'analysis': 'spike_detection',
        'scope': 'all_trials',
        'params': {'threshold': -15.0, 'refractory_ms': 2.0}
    },
    {
        'analysis': 'rmp_analysis',
        'scope': 'average',
        'params': {'baseline_start': 0.0, 'baseline_end': 0.1}
    }
]
results_df = engine.run_batch(files, pipeline)
Initialize the batch analysis engine.
- Parameters:
neo_adapter – Optional NeoAdapter instance. If None, creates a new one.
max_workers – Number of parallel worker processes for run_batch(). 1 (default) means fully sequential execution. Values > 1 enable ProcessPoolExecutor parallelism. Pass -1 to use all available CPU cores.
- update_performance_settings(settings)[source]
Dynamically update performance limits without restarting.
Reads max_cpu_cores from settings and updates max_workers immediately so the next run_batch() call picks up the new value. This is the subscriber side of the pub/sub preferences_changed signal.
- Parameters:
settings – Dict that may contain "max_cpu_cores" (int) and/or "max_ram_allocation_gb" (float, logged but not enforced here).
- static list_available_analyses()[source]
Get a list of all registered analysis function names.
- Returns:
List of available analysis names.
- static get_analysis_info(name)[source]
Get information about a registered analysis function.
- Parameters:
name – The registered name of the analysis function.
- Returns:
Dictionary with function info (docstring, etc.) or None if not found.
- run_batch(files, pipeline_config, progress_callback=None, channel_filter=None, rs_tolerance=0.2)[source]
Run analysis on a list of files/recordings using a flexible pipeline configuration.
When max_workers > 1 and files contains at least two items, the file-level loop is distributed across worker processes via ProcessPoolExecutor. The GUI thread is never blocked in either mode; callers should wrap this in a BatchWorker QThread.
- Parameters:
files – List of file paths OR Recording objects to process.
pipeline_config – List of task dictionaries.
progress_callback – Optional callback (current, total, status_msg).
channel_filter – Optional list of channel names/IDs to process.
rs_tolerance – Maximum fractional increase in series resistance compared to the first valid Rs measurement before a sweep is flagged with
rs_qc_warning. Default 0.20 (20 %). Set to float('inf') to disable the check.
- Returns:
pandas DataFrame containing aggregated results with metadata.
Usage Example
from Synaptipy.core.analysis.batch_engine import BatchAnalysisEngine
from pathlib import Path
engine = BatchAnalysisEngine(max_workers=4)
pipeline = [
{
'analysis': 'spike_detection',
'scope': 'all_trials',
'params': {'threshold': -20.0, 'refractory_ms': 2.0}
},
]
files = [Path("file1.abf"), Path("file2.abf")]
results_df = engine.run_batch(files, pipeline)
Epoch Manager
EpochManager - Hardware TTL and manual experimental epoch management.
Experimental recordings often contain distinct phases (Baseline, Stimulation,
Washout). EpochManager either auto-detects these boundaries from a
TTL/Digital-Input channel or lets the researcher define them manually.
Once epochs are defined, per-epoch data slices can be extracted from any
Channel for downstream analysis
(e.g. tracking plasticity changes across Stim vs. Baseline).
- class Synaptipy.core.analysis.epoch_manager.Epoch(name, start_time, end_time, epoch_type='manual', metadata=<factory>)[source]
Bases:
object
A named time window within a recording.
- name
Human-readable label (e.g. "Baseline", "Stim", "Washout").
- Type:
str
- start_time
Epoch start in seconds (relative to recording onset).
- Type:
float
- end_time
Epoch end in seconds.
- Type:
float
- epoch_type
Either "ttl" (auto-detected from hardware) or "manual".
- Type:
str
- metadata
Optional arbitrary key/value annotations.
- Type:
Dict[str, Any]
- name
- start_time
- end_time
- epoch_type = 'manual'
- metadata
- property duration
Epoch duration in seconds.
- contains(t)[source]
Return True if time t falls within [start_time, end_time].
- class Synaptipy.core.analysis.epoch_manager.EpochManager[source]
Bases:
object
Manage experimental epoch boundaries for a recording.
Epochs are ordered by Epoch.start_time. Overlapping epochs are allowed so that the same window can be labelled with multiple semantic tags.
Typical workflow:
em = EpochManager()

# Option A: auto-detect from a TTL channel
em.from_ttl(ttl_data, time_vector, pre_stim_s=1.0, post_stim_s=1.0)

# Option B: manual definition
em.add_manual_epoch("Baseline", 0.0, 60.0)
em.add_manual_epoch("Stim", 60.0, 120.0)
em.add_manual_epoch("Washout", 120.0, 300.0)

# Slice channel data by epoch
slices = em.get_epoch_slices(channel, trial_index=0)
- DEFAULT_EPOCH_NAMES = ('Baseline', 'Stim', 'Washout')
- property epochs
Sorted list of all defined epochs.
- property epoch_names
Names of all defined epochs in time order.
- add_manual_epoch(name, start_time, end_time, **metadata)[source]
Add a manually defined epoch.
- Parameters:
name – Label for the epoch.
start_time – Start time in seconds.
end_time – End time in seconds.
**metadata – Optional key/value annotations stored in Epoch.metadata.
- Returns:
The newly created Epoch.
- Raises:
ValueError – If end_time <= start_time.
- from_ttl(ttl_data, time, ttl_threshold=2.5, pre_stim_s=1.0, post_stim_s=1.0, min_inter_epoch_s=0.5, stim_name='Stim', baseline_name='Baseline', washout_name='Washout')[source]
Auto-generate epochs from a TTL / Digital-Input channel.
Detects TTL pulse boundaries using extract_ttl_epochs(), then creates:
A Baseline epoch from time[0] to first_onset - pre_stim_s.
A Stim epoch spanning the detected TTL activity (first_onset - pre_stim_s to last_offset + post_stim_s).
A Washout epoch from last_offset + post_stim_s to time[-1], if enough time remains.
- Returns:
List of the newly created Epoch objects. The manager's epochs list is also updated in place.
- get_epoch(name)[source]
Return the first epoch whose name matches name (case-insensitive).
- epochs_at_time(t)[source]
Return all epochs that contain time t.
- get_epoch_slices(channel, trial_index=0)[source]
Extract (data, time) slices for every epoch from a channel trial.
- Parameters:
channel – A Channel instance.
trial_index – Which trial to slice (default 0).
- Returns:
Dict mapping epoch name to (data_slice, time_slice) ndarrays. Epochs with no overlapping samples map to empty arrays.
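Epoch slicing by time mask can be sketched independently of the Channel class (illustrative; the real method reads trial data from a Channel):

```python
import numpy as np

def epoch_slice_sketch(data, time, epochs):
    """Slice (data, time) by named epoch windows (illustrative).

    epochs: dict mapping name -> (start_s, end_s).
    """
    out = {}
    for name, (t0, t1) in epochs.items():
        mask = (time >= t0) & (time <= t1)      # samples inside the epoch
        out[name] = (data[mask], time[mask])
    return out
```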
- remove_epoch(name)[source]
Remove the first epoch matching name.
- Returns:
True if an epoch was removed, False if not found.
- clear()[source]
Remove all epochs.
Signal Processing
Signal processing utilities for Synaptipy. Includes filtering and trace quality checks.
- Synaptipy.core.signal_processor.validate_sampling_rate(fs)[source]
Validate sampling rate and warn if suspiciously low.
- Parameters:
fs – Sampling rate in Hz.
- Returns:
True if valid (positive), False otherwise.
- Synaptipy.core.signal_processor.check_trace_quality(data, sampling_rate)[source]
Assess the quality of a recording trace.
Checks for:
- Signal-to-Noise Ratio (SNR) estimation
- Baseline drift
- 50/60 Hz line-noise contamination
- Parameters:
data – 1D numpy array of the signal (e.g., voltage in mV or current in pA).
sampling_rate – Sampling rate in Hz.
- Returns:
Dictionary containing quality metrics and flags.
- Synaptipy.core.signal_processor.bandpass_filter(data, lowcut, highcut, fs, order=5)[source]
Apply a Butterworth bandpass filter to the data. Uses Second Order Sections (SOS) for numerical stability.
- Parameters:
data – Input signal array
lowcut – Low cutoff frequency in Hz
highcut – High cutoff frequency in Hz
fs – Sampling frequency in Hz
order – Filter order (1-10, default 5)
- Returns:
Filtered data, or original data if filtering fails
- Synaptipy.core.signal_processor.lowpass_filter(data, cutoff, fs, order=5)[source]
Apply a Butterworth lowpass filter. Uses Second Order Sections (SOS) for numerical stability.
- Parameters:
data – Input signal array
cutoff – Cutoff frequency in Hz
fs – Sampling frequency in Hz
order – Filter order (1-10, default 5)
- Returns:
Filtered data, or original data if filtering fails
- Synaptipy.core.signal_processor.highpass_filter(data, cutoff, fs, order=5)[source]
Apply a Butterworth highpass filter. Uses Second Order Sections (SOS) for numerical stability.
- Parameters:
data – Input signal array
cutoff – Cutoff frequency in Hz
fs – Sampling frequency in Hz
order – Filter order (1-10, default 5)
- Returns:
Filtered data, or original data if filtering fails
- Synaptipy.core.signal_processor.notch_filter(data, freq, Q, fs)[source]
Apply a notch filter to remove a specific frequency. Uses SOS format via zpk2sos for numerical stability.
- Parameters:
data – Input signal array
freq – Notch frequency in Hz
Q – Quality factor (higher = narrower notch)
fs – Sampling frequency in Hz
- Returns:
Filtered data, or original data if filtering fails
- Synaptipy.core.signal_processor.comb_filter(data, freq, Q, fs)[source]
Apply an IIR comb filter to remove a fundamental frequency and its harmonics. Useful for line noise (e.g., 50Hz or 60Hz).
- Parameters:
data – Input signal array
freq – Fundamental frequency to remove in Hz (e.g., 50 or 60)
Q – Quality factor (higher = narrower notches)
fs – Sampling frequency in Hz
- Returns:
Filtered data, or original data if filtering fails
- Synaptipy.core.signal_processor.subtract_baseline_mode(data, decimals=None)[source]
Subtract baseline using the mode of the distribution of values.
- Parameters:
data – Input signal array
decimals – Number of decimal places to round to for mode calculation. If None, it tries to infer a reasonable precision or defaults to 1.
- Returns:
Data with baseline subtracted (aligned to 0)
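The mode-based baseline is useful when a trace sits at a flat level most of the time, so the most frequent rounded value is the baseline. A sketch (illustrative; fixed decimals, no precision inference):

```python
import numpy as np

def mode_baseline_sketch(data, decimals=1):
    """Subtract the most frequent (rounded) value from the trace (illustrative)."""
    data = np.asarray(data, dtype=float)
    rounded = np.round(data, decimals)
    values, counts = np.unique(rounded, return_counts=True)
    mode = values[np.argmax(counts)]    # most common rounded level
    return data - mode
```

Unlike the mean, the mode is not pulled by large transient events, so the quiescent level lands near zero.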
- Synaptipy.core.signal_processor.subtract_baseline_mean(data)[source]
Subtract the mean of the entire signal.
- Synaptipy.core.signal_processor.subtract_baseline_median(data)[source]
Subtract the median of the entire signal.
- Synaptipy.core.signal_processor.subtract_baseline_linear(data)[source]
Subtract a linear trend (detrend) from the signal. Useful for removing drift.
- Synaptipy.core.signal_processor.subtract_baseline_region(data, t, start_t, end_t)[source]
Subtract the mean value calculated from a specific time window.
- Parameters:
data – Signal array
t – Time vector (must be same length as data)
start_t – Start time of baseline window
end_t – End time of baseline window
- Synaptipy.core.signal_processor.blank_artifact(data, time_vector, onset_time, duration_ms, method='hold')[source]
Suppress a stimulus artifact by replacing a time window.
Three interpolation modes are available:
"hold" — replace the artifact window with the last pre-artifact sample value (sample-and-hold).
"zero" — set the artifact window to zero.
"linear" — linearly interpolate between the pre- and post-artifact boundary values.
- Parameters:
data – 1-D signal array.
time_vector – 1-D time array (same length as data), in seconds.
onset_time – Start of the artifact window, in seconds.
duration_ms – Duration of the artifact window, in milliseconds.
method – Interpolation mode: "hold", "zero", or "linear". Default "hold".
- Returns:
Copy of data with the artifact window replaced.
- Raises:
ValueError – If method is not one of the recognised modes.
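The three modes can be sketched as follows (illustrative; assumes the artifact window is interior to the trace):

```python
import numpy as np

def blank_artifact_sketch(data, time, onset_time, duration_ms, method="hold"):
    """Replace an artifact window by hold / zero / linear interpolation (illustrative)."""
    out = np.asarray(data, dtype=float).copy()
    i0 = int(np.searchsorted(time, onset_time))
    i1 = int(np.searchsorted(time, onset_time + duration_ms / 1000.0))
    if method == "hold":
        out[i0:i1] = out[i0 - 1] if i0 > 0 else out[i1]   # sample-and-hold
    elif method == "zero":
        out[i0:i1] = 0.0
    elif method == "linear":
        # Interior points of a line between the boundary samples
        out[i0:i1] = np.linspace(out[i0 - 1], out[min(i1, out.size - 1)],
                                 i1 - i0 + 2)[1:-1]
    else:
        raise ValueError(f"unknown method: {method!r}")
    return out
```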
- Synaptipy.core.signal_processor.find_artifact_windows(data, fs, slope_threshold, padding_ms=2.0)[source]
Identify time windows containing high-slope artifacts.
Algorithm:
1. Calculate the absolute gradient of the data.
2. Threshold the gradient to find high-slope points.
3. Dilate the boolean mask by padding_ms to capture the artifact tail/ringing.
- Parameters:
data – Signal array.
fs – Sampling rate in Hz.
slope_threshold – Threshold for the absolute gradient (e.g. pA/sample or mV/sample).
padding_ms – Time to expand the mask around detected peaks (in milliseconds).
- Returns:
Boolean mask of the same shape as data, where True indicates an artifact.
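A sketch of the threshold-and-dilate algorithm (illustrative):

```python
import numpy as np

def artifact_mask_sketch(data, fs, slope_threshold, padding_ms=2.0):
    """Gradient-threshold artifact mask with +/- padding dilation (illustrative)."""
    data = np.asarray(data, dtype=float)
    hot = np.abs(np.gradient(data)) > slope_threshold   # high-slope samples
    pad = int(fs * padding_ms / 1000.0)
    mask = np.zeros(data.size, dtype=bool)
    for i in np.flatnonzero(hot):                       # dilate each hot sample
        mask[max(0, i - pad):i + pad + 1] = True
    return mask
```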
- Synaptipy.core.signal_processor.compute_psd(data, sampling_rate, nperseg=None, window='hann')[source]
Compute Power Spectral Density (PSD) using Welch’s method.
- Parameters:
data – 1D signal array.
sampling_rate – Sampling rate in Hz.
nperseg – FFT segment length. Defaults to min(len(data), 4096).
window – Window function name passed to scipy.signal.welch() (default "hann").
- Returns:
Tuple (frequencies, psd) where frequencies is in Hz and psd is in (data_units)^2/Hz. Both arrays are 1-D float64. On failure or missing scipy an empty-array pair is returned.
- Synaptipy.core.signal_processor.multi_harmonic_notch(data, fundamental_hz, fs, max_harmonics=None, Q=30.0)[source]
Strip a fundamental frequency and its harmonics using cascaded notch filters.
Applies a discrete notch at fundamental_hz, 2 * fundamental_hz, 3 * fundamental_hz, …, up to the Nyquist limit (or max_harmonics, whichever comes first).
Prefer comb_filter() (IIR comb via scipy.signal.iircomb()) when the scipy version supports it. This function falls back to cascaded notch_filter() calls, which is always available.
- Parameters:
data – Input signal array.
fundamental_hz – Fundamental line-noise frequency to remove (e.g. 50 or 60).
fs – Sampling rate in Hz.
max_harmonics – Maximum number of harmonics to remove including the fundamental.
None means remove all harmonics below Nyquist.
Q – Quality factor for each notch (higher = narrower). Default 30.
- Returns:
Filtered signal, or original data if filtering is impossible.
Processing Pipeline
Signal Processing Pipeline.
Formalizes the order of operations for signal processing (e.g., Baseline -> Filter). Ensures that both visualization and analysis use the exact same processing sequence.
- class Synaptipy.core.processing_pipeline.SignalProcessingPipeline[source]
Bases:
object
Manages an ordered list of signal processing steps.
- Synaptipy.core.processing_pipeline.apply_trace_corrections(data, time, fs, *, ljp_mv=0.0, pn_traces=None, pn_scale=1.0, pre_event_window_s=None, artifact_interp_steps=None, filter_steps=None)[source]
Apply the immutable five-step trace correction in a guaranteed order.
Regardless of the order the user toggles settings in the GUI, this function must be used as the single entry point for all backend corrections so that the execution order is always:
- Step A - LJP Voltage Offset
V_true = V_recorded - ljp_mv
- Step B - P/N Leak Subtraction
If pn_traces is supplied, compute the per-sample mean across the sub-threshold repetitions, scale by pn_scale, and subtract from the corrected trace. This removes capacitive transients and steady-state leak currents without affecting the signal of interest.
- Step C - Scalar Noise-Floor Zeroing
Subtract the median of the user-specified pre-event window
pre_event_window_s = (t_start, t_end). Because the LJP and P/N corrections have already been applied, this median reflects only the residual noise floor, not a physiological offset.
- Step D - Pre-filter Artifact Interpolation
Linearly interpolate across each stimulus artifact defined in artifact_interp_steps. Running this after A-C and before filtering prevents Gibbs ringing: the DSP filter operates on an already-flat waveform without sharp transient edges.
- Step E - DSP Filtering
Apply any filters listed in filter_steps (same dict schema as
SignalProcessingPipeline: {'type': 'filter', 'method': 'lowpass', 'cutoff': 1000, 'order': 5}). Running filters after A-D prevents edge artefacts introduced by the transient subtraction from being smeared across the waveform.
- Parameters:
data – Raw (uncorrected) signal array.
time – Time vector aligned with data (seconds).
fs – Sampling rate in Hz.
ljp_mv – Liquid Junction Potential in mV. Step A only runs when ljp_mv != 0.0.
pn_traces – 2-D array of shape (n_sweeps, n_samples) containing the sub-threshold P/N sweeps. Step B is skipped when pn_traces is None.
pn_scale – Scalar factor applied to the averaged P/N template before subtraction (default 1.0).
pre_event_window_s – (t_start, t_end) tuple in seconds. Step C is skipped when this is None.
artifact_interp_steps – List of artifact dicts with keys onset_time (s) and duration_ms (ms). Each defines a stimulus artifact to linearly interpolate. Step D is skipped when None.
filter_steps – List of filter dicts consumed by SignalProcessingPipeline.process(). Step E is skipped when the list is empty or None.
- Returns:
Corrected signal array (always a copy — the input is never mutated).
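A plain-NumPy sketch of this ordering may help. The following hypothetical function re-implements only steps A-C for illustration; it is not the Synaptipy implementation, which also performs steps D and E:

```python
import numpy as np

def sketch_corrections(data, time, ljp_mv=0.0, pn_traces=None,
                       pn_scale=1.0, pre_event_window_s=None):
    """Hypothetical re-implementation of steps A-C, in the documented order."""
    out = np.asarray(data, dtype=float).copy()  # never mutate the input
    # Step A - LJP voltage offset: V_true = V_recorded - ljp_mv
    if ljp_mv != 0.0:
        out -= ljp_mv
    # Step B - P/N leak subtraction: scaled mean of sub-threshold sweeps
    if pn_traces is not None:
        out -= pn_scale * np.mean(pn_traces, axis=0)
    # Step C - zero the residual noise floor via the pre-event median
    if pre_event_window_s is not None:
        t0, t1 = pre_event_window_s
        mask = (time >= t0) & (time <= t1)
        out -= np.median(out[mask])
    return out
```

The input array is copied up front, matching the documented guarantee that the input is never mutated.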
Plugin System
Plugin Manager for Synaptipy.
Scans two plugin directories and dynamically loads external Python scripts. Any script using the @AnalysisRegistry.register decorator will automatically populate the UI and Batch Engine.
Search order:
Built-in examples:
<project_root>/examples/plugins/ - shipped with the package so features work out of the box.
User plugins:
~/.synaptipy/plugins/ - personal or third-party additions.
When the same stem name appears in both directories the user’s copy takes precedence and a warning is logged.
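The stem-precedence rule can be sketched as follows (a hypothetical helper, not the actual PluginManager code):

```python
from pathlib import Path

def dedupe_by_stem(example_files, user_files):
    """Merge two plugin file lists; user copies with the same stem win."""
    chosen = {p.stem: p for p in example_files}    # examples first...
    chosen.update({p.stem: p for p in user_files})  # ...user copies override
    return sorted(chosen.values())
```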
This file is part of Synaptipy, licensed under the GNU Affero General Public License v3.0. See the LICENSE file in the root of the repository for full license details.
- class Synaptipy.application.plugin_manager.PluginManager[source]
Bases:
object
Manages the discovery, loading, and registration of third-party plugins.
- classmethod get_plugin_files()[source]
Returns a deduplicated list of plugin
.py files from both examples/plugins/ and ~/.synaptipy/plugins/.
The user directory takes precedence: if a file with the same stem exists in both locations, the examples copy is skipped and a warning is emitted so the author knows their local version is active.
- classmethod load_plugins()[source]
Dynamically imports all plugins discovered by
get_plugin_files().
Plugins from examples/plugins/ are loaded first, then user plugins. A bad plugin (ImportError, SyntaxError, or any other exception) is skipped gracefully so it does not crash the main application.
Loading is skipped entirely when the enable_plugins QSettings key is False (set via Preferences -> Extensions).
- classmethod reload_plugins()[source]
Hot-reload plugins without restarting the application.
Purges all plugin-contributed analyses from
AnalysisRegistry, then re-loads plugins if the enable_plugins setting is True. Call this after the user toggles the “Enable Custom Plugins” preference, then rebuild the Analyser UI to reflect the change.
Usage Example
from Synaptipy.application.plugin_manager import PluginManager

# Discover and import all plugins at application startup
PluginManager.load_plugins()

# Later, after the user toggles the plugins preference, hot-reload:
PluginManager.reload_plugins()
Exporters
NWB Exporter
Exporter for saving Recording data to the NWB:N 2.0 format. Utilizes metadata extracted by NeoAdapter and stored in data_model objects.
- class Synaptipy.infrastructure.exporters.nwb_exporter.NWBHDF5IO(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
- class Synaptipy.infrastructure.exporters.nwb_exporter.NWBFile(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
- class Synaptipy.infrastructure.exporters.nwb_exporter.CurrentClampSeries(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
- class Synaptipy.infrastructure.exporters.nwb_exporter.CurrentClampStimulusSeries(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
- class Synaptipy.infrastructure.exporters.nwb_exporter.IntracellularElectrode(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
- class Synaptipy.infrastructure.exporters.nwb_exporter.PatchClampSeries(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
- class Synaptipy.infrastructure.exporters.nwb_exporter.VoltageClampSeries(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
- class Synaptipy.infrastructure.exporters.nwb_exporter.VoltageClampStimulusSeries(*args, **kwargs)[source]
Bases:
object
Sentinel: pynwb not installed.
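These sentinel classes let the module import cleanly even when pynwb is absent; instantiating one fails with a clear error instead of a bare ImportError at import time. A hypothetical sketch of the pattern (not the actual Synaptipy code):

```python
def make_sentinel(name, package):
    """Create a placeholder class that raises ImportError when instantiated."""
    def __init__(self, *args, **kwargs):
        raise ImportError(
            f"{package} is required to use {name}; install it first."
        )
    return type(name, (object,), {"__init__": __init__})

# Used as a fallback when the real import fails:
try:
    from pynwb import NWBFile
except ImportError:
    NWBFile = make_sentinel("NWBFile", "pynwb")
```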
- class Synaptipy.infrastructure.exporters.nwb_exporter.NWBExporter[source]
Bases:
object
Handles exporting Recording domain objects to NWB files.
- export(recording, output_path, session_metadata, analysis_results=None)[source]
Exports the given Recording object to an NWB file.
- Parameters:
recording – The Recording object containing data and metadata.
output_path – The Path object where the .nwb file will be saved.
session_metadata – A dictionary containing user-provided or default metadata required for NWBFile creation. Expected keys: ‘session_description’, ‘identifier’, ‘session_start_time’, plus optional ‘experimenter’, ‘lab’, ‘institution’, ‘session_id’, and Subject/Device/Electrode info.
analysis_results – Optional dict (or list of dicts) produced by BatchAnalysisEngine.run_batch. When provided, discrete-event arrays stored under the _raw_arrays key are written into an NWB ProcessingModule as DynamicTable objects. No analysis computation is re-run.
- Raises:
ExportError – If any error occurs during the NWB file creation or writing.
Usage Example
from datetime import datetime, timezone
from pathlib import Path
from Synaptipy.infrastructure.exporters.nwb_exporter import NWBExporter

exporter = NWBExporter()
# Include the required keys ('session_description', 'identifier',
# 'session_start_time') alongside the optional ones:
metadata = {
    'session_description': 'Recording session',
    'identifier': 'unique-recording-id',
    'session_start_time': datetime.now(timezone.utc),
    'experimenter': 'Researcher Name',
    'lab': 'Lab Name',
    'institution': 'Institution',
    'experiment_description': 'Experiment details',
    'session_id': 'session123'
}
# recording: a previously loaded Recording domain object
exporter.export(recording, Path("/path/to/output.nwb"), metadata)
Licensing
Synaptipy is released under the GNU Affero General Public License Version 3 (AGPL-3.0). See the LICENSE file for full terms.