Open folders with user altered filenames in SpikeGLX #1608

Open · wants to merge 9 commits into master
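For context, a minimal usage sketch of what this PR enables (the folder path is hypothetical, and this assumes the usual SpikeGLXRawIO folder-based API):

from neo.rawio import SpikeGLXRawIO

# Folder whose .bin/.meta files were renamed by the user after acquisition.
reader = SpikeGLXRawIO(dirname="path/to/renamed_spikeglx_folder")
reader.parse_header()  # streams/segments are now resolved from meta["fileName"]
print(reader.header["signal_streams"])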
80 changes: 53 additions & 27 deletions neo/rawio/spikeglxrawio.py
@@ -82,14 +82,11 @@ class SpikeGLXRawIO(BaseRawWithBufferApiIO):

Notes
-----
* Contrary to other implementations this IO reads the entire folder and subfolders and:
Contributor Author:

Changed this because it is not clear which other implementations this refers to; probably the earlier ones, but in that case I don't think the note belongs here.

deals with several segments based on the `_gt0`, `_gt1`, `_gt2`, etc postfixes
deals with all signals "imec0", "imec1" for neuropixel probes and also
external signal like"nidq". This is the "device"
* For imec device both "ap" and "lf" are extracted so one device have several "streams"
* There are several versions depending the neuropixel probe generation (`1.x`/`2.x`/`3.x`)
* Here, we assume that the `meta` file has the same structure across all generations.
* This IO is developed based on neuropixel generation 2.0, single shank recordings.
Contributor Author:

I think this note dates from the first version, but we have gradually added support since then. This IO handles multi-shank probes just fine, as well as NHP and other probe varieties.

* This IO reads the entire folder and subfolders, locating the `.bin` and `.meta` files
* Handles gates and triggers as segments (based on the `_gt0`, `_gt1`, `_t0`, `_t1` suffixes in filenames)
Contributor Author:

Added a specific mention of gates and triggers, as that is more in line with the SpikeGLX documentation.
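A minimal sketch of how those gate/trigger suffixes end up as segments, using parse_spikeglx_fname from this module (the file stems below are hypothetical):

from neo.rawio.spikeglxrawio import parse_spikeglx_fname

# Hypothetical stems from one run with two gates, each with two triggers.
stems = [
    "my_run_g0_t0.imec0.ap",
    "my_run_g0_t1.imec0.ap",
    "my_run_g1_t0.imec0.ap",
    "my_run_g1_t1.imec0.ap",
]

for stem in stems:
    run_name, gate_num, trigger_num, device, stream_kind = parse_spikeglx_fname(stem)
    # Each unique (gate_num, trigger_num) pair becomes one Neo segment.
    print(stem, "->", (gate_num, trigger_num))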

* Handles all signals coming from different acquisition cards ("imec0", "imec1", etc.) in a typical
PXIe chassis setup and also external signals like "nidq".
h-mayorquin marked this conversation as resolved.
* For imec devices both "ap" and "lf" are extracted, so even a one-device setup will have several "streams"

Examples
--------
@@ -125,7 +122,6 @@ def _parse_header(self):
stream_names = sorted(list(srates.keys()), key=lambda e: srates[e])[::-1]
nb_segment = np.unique([info["seg_index"] for info in self.signals_info_list]).size

# self._memmaps = {}
self.signals_info_dict = {}
# one unique block
self._buffer_descriptions = {0: {}}
@@ -166,7 +162,6 @@ def _parse_header(self):

stream_id = stream_name

stream_index = stream_names.index(info["stream_name"])
signal_streams.append((stream_name, stream_id, buffer_id))

# add channels to global list
@@ -250,7 +245,6 @@ def _parse_header(self):
# insert some annotation at some place
self._generate_minimal_annotations()
self._generate_minimal_annotations()
block_ann = self.raw_annotations["blocks"][0]

for seg_index in range(nb_segment):
seg_ann = self.raw_annotations["blocks"][0]["segments"][seg_index]
@@ -354,22 +348,44 @@ def scan_files(dirname):
if len(info_list) == 0:
raise FileNotFoundError(f"No appropriate combination of .meta and .bin files were detected in {dirname}")

# the segment index will depend on both 'gate_num' and 'trigger_num'
# so we order by 'gate_num' then 'trigger_num'
# None is before any int
def make_key(info):
k0 = info["gate_num"]
if k0 is None:
k0 = -1
k1 = info["trigger_num"]
if k1 is None:
k1 = -1
return (k0, k1)

order_key = list({make_key(info) for info in info_list})
order_key = sorted(order_key)
# This sets non-integer values before integers
Contributor Author:

This section is the core of the PR. It applies the same mechanism that we already had for gates and triggers to the probe index as well:

  • It creates a set of the unique combinations of (gate, trigger) and (slot, port, dock).
  • Each unique combination, once sorted, is mapped to an index (the segment index and the probe index, respectively).
  • The fields of `info` are then used to look up the segment and probe index in a dict.

The `normalize` lambda unifies this behavior. Maybe there is a better name, but I could not come up with anything; it just follows the convention that Sam had before.
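A self-contained toy sketch of that mapping (field names follow the diff below; the values are made up):

# Toy illustration of the normalize -> sorted unique tuples -> index mapping.
normalize = lambda x: x if isinstance(x, int) else -1

infos = [
    {"gate_num": 0, "trigger_num": 0},
    {"gate_num": 0, "trigger_num": 1},
    {"gate_num": None, "trigger_num": None},  # e.g. a file without gate/trigger suffixes
]

keys = sorted({(normalize(i["gate_num"]), normalize(i["trigger_num"])) for i in infos})
key_to_index = {key: idx for idx, key in enumerate(keys)}

for info in infos:
    info["seg_index"] = key_to_index[(normalize(info["gate_num"]), normalize(info["trigger_num"]))]

# keys == [(-1, -1), (0, 0), (0, 1)], so the suffix-less entry gets seg_index 0.
# The probe index below is built the same way from (slot, port, dock) tuples.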

normalize = lambda x: x if isinstance(x, int) else -1

# Segment index is determined by the gate_num and trigger_num in that order
def get_segment_tuple(info):
# Create a key from the normalized gate_num and trigger_num
gate_num = normalize(info.get("gate_num"))
trigger_num = normalize(info.get("trigger_num"))
return (gate_num, trigger_num)

unique_segment_tuples = {get_segment_tuple(info) for info in info_list}
sorted_keys = sorted(unique_segment_tuples)

# Map each unique key to a corresponding index
segment_tuple_to_segment_index = {key: idx for idx, key in enumerate(sorted_keys)}

for info in info_list:
info["seg_index"] = segment_tuple_to_segment_index[get_segment_tuple(info)]


# Probe index calculation
# This ensures that all nidq entries come before any other keys, which corresponds to index 0.
def get_probe_tuple(info):
slot = normalize(info.get("probe_slot"))
port = normalize(info.get("probe_port"))
dock = normalize(info.get("probe_dock"))
return (slot, port, dock)

unique_probe_tuples = {get_probe_tuple(info) for info in info_list}
sorted_probe_keys = sorted(unique_probe_tuples)
probe_tuple_to_probe_index = {key: idx for idx, key in enumerate(sorted_probe_keys)}

for info in info_list:
info["seg_index"] = order_key.index(make_key(info))
if info.get("device") == "nidq":
info["device_index"] = 0 # TODO: Handle multi nidq case
else:
info["device_index"] = probe_tuple_to_probe_index[get_probe_tuple(info)]


return info_list

@@ -488,7 +504,10 @@ def extract_stream_info(meta_file, meta):
else:
# NIDQ case
has_sync_trace = False
fname = Path(meta_file).stem

bin_file_path = meta["fileName"]
Contributor Author:

This is the part that uses meta["fileName"] (the original name recorded by SpikeGLX) instead of the file path on the system (which the user might have changed).
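A small sketch of the difference (both paths are hypothetical):

from pathlib import Path

# The .meta file on disk was renamed by the user, but SpikeGLX recorded the
# original bin file name inside the meta itself.
meta_file = "/renamed/by/user/session1.imec0.ap.meta"                       # path on disk
meta = {"fileName": "/original/acquisition/my_run_g0_t0.imec0.ap.bin"}      # written by SpikeGLX

old_fname = Path(meta_file).stem          # "session1.imec0.ap"  -> gate/trigger info lost
new_fname = Path(meta["fileName"]).stem   # "my_run_g0_t0.imec0.ap" -> parses correctly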

fname = Path(bin_file_path).stem

run_name, gate_num, trigger_num, device, stream_kind = parse_spikeglx_fname(fname)

if "imec" in fname.split(".")[-2]:
@@ -550,6 +569,10 @@ def extract_stream_info(meta_file, meta):
gain_factor = float(meta["niAiRangeMax"]) / 32768
channel_gains = per_channel_gain * gain_factor

probe_slot = meta.get("imDatPrb_slot", None)
probe_port = meta.get("imDatPrb_port", None)
probe_dock = meta.get("imDatPrb_dock", None)

info = {}
info["fname"] = fname
info["meta"] = meta
@@ -569,6 +592,9 @@ def extract_stream_info(meta_file, meta):
info["channel_gains"] = channel_gains
info["channel_offsets"] = np.zeros(info["num_chan"])
info["has_sync_trace"] = has_sync_trace
info["probe_slot"] = int(probe_slot) if probe_slot else None
info["probe_port"] = int(probe_port) if probe_port else None
info["probe_dock"] = int(probe_dock) if probe_dock else None

if "nidq" in device:
info["digital_channels"] = []