
Neuropyxels with Open Ephys data #96

Open
javierzsm opened this issue Oct 8, 2021 · 42 comments

Comments

@javierzsm

Hi,
This is super cool, although I can't use it because I'm not using SpikeGLX. Is there any chance of having it work with data acquired using Open Ephys?
Best,
Javier

@m-beau
Owner

m-beau commented Oct 8, 2021

Hi Javier,

I could actually use your help! Can you send me an example dataset? Are you recording with neuropixels 1.0, the commercial version?

Maxime

@javierzsm
Author

Sure, you can find a little piece of data here https://owncloud.icm-institute.org/index.php/s/b8i5s4lVV8Dub6C
I put the entire directory with all the sub-folders and files created by Open Ephys during an acquisition session.
It's been acquired with a single neuropixels 1.0 probe using the Open Ephys Neuropixels-PXI plugin.

@m-beau
Owner

m-beau commented Oct 10, 2021

Cheers, and which spike-sorter did you use?

@javierzsm
Author

Kilosort 3, I added a folder with the spike sorting outputs.
Let me know if that works.

@javierzsm
Author

Hi!
I saw the README was updated and it should also work with Open Ephys recorded data. Yet, the script doesn't seem to find the binary file.
How should the data be organized for it? KS outputs + .dat file in the same folder?
Thanks!

@m-beau
Owner

m-beau commented Oct 19, 2021

Hi! I am currently working on it - I edited the readme file a bit prematurely, bear with me!

@m-beau
Owner

m-beau commented Oct 23, 2021

Hi Javier, almost there - I am trying to work out how to infer the bits-to-microvolts conversion factor from the Open Ephys metadata.

In SpikeGLX, the amplification factor can be customized and is typically 500. From there, I can compute the conversion factor, which is about 2.34: glx_convfactor = Vrange/2**bits_encoding/ampFactor

In the .oebin file, I do not find any amplification factor, but I do find another number per channel, "bit_volts", which must be what I am looking for. Its value is 0.19499999284744262695 on all of your channels.

I assume that this number must be 1/glx_convfactor, but that would mean you have an amplification factor of ~228.5, which is rather odd. Can you let me know what your amplification factor was?

Maxime
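The arithmetic above can be sketched as follows. The 1.2 V range and 10-bit encoding are assumptions based on the Neuropixels 1.0 specs, not values taken from this dataset's metadata.

```python
# Hedged sketch of the conversion-factor arithmetic discussed above.
# V_RANGE and BITS_ENCODING are assumed Neuropixels 1.0 values.
V_RANGE = 1.2         # ADC input range in volts (assumption)
BITS_ENCODING = 10    # 10-bit ADC (assumption)
AMP_FACTOR = 500      # typical SpikeGLX amplification factor

# SpikeGLX-style conversion factor, in microvolts per bit
glx_convfactor_uv = V_RANGE / 2**BITS_ENCODING / AMP_FACTOR * 1e6
print(round(glx_convfactor_uv, 2))  # 2.34, matching the value quoted above

# Open Ephys instead stores a per-channel "bit_volts" value directly
# in the .oebin file (0.19499999... on all channels in this dataset)
oe_bit_volts = 0.19499999284744262695
```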

@m-beau
Owner

m-beau commented Oct 24, 2021

FYI, I figured it out - the compatibility layer is now written, and updated on GitHub and pip.

The only missing bit is how to format the sync channel in events/Neuropixels-PXI-100.0; for that, I need an example recording with an actual sync signal.

@javierzsm
Author

javierzsm commented Oct 26, 2021 via email

@javierzsm
Author

Hello Maxime,
Here is a sample recording of about 180 s during which I gave a 30 s TTL train at 1 Hz.
I also included the KS3 sorting results. Let me know if it's what you needed.
https://owncloud.icm-institute.org/index.php/s/NsvvdgKWsFDrkW5
Cheers,
j

@m-beau
Owner

m-beau commented Nov 10, 2021

Hi Javier,

I am not prioritizing this at the moment - you can now use npyx with Open Ephys, just not load the sync channel through npyx yet. That isn't critical with Open Ephys, though, as you can simply load the sync signal from 'events/Neuropix-PXI-100.0/TTL_1/timestamps.npy' using numpy! Let me know how npyx works for you, and don't hesitate to ask if you encounter further issues!
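The manual workaround above can be sketched with plain numpy. The 'events/Neuropix-PXI-100.0/TTL_1/timestamps.npy' layout and the 30 kHz sampling rate are assumptions from this thread; the file is simulated here so the snippet is self-contained.

```python
import os
import tempfile

import numpy as np

# Simulate the file Open Ephys writes (assumed layout from this thread):
#   events/Neuropix-PXI-100.0/TTL_1/timestamps.npy
tmp_dir = tempfile.mkdtemp()
ts_path = os.path.join(tmp_dir, "timestamps.npy")
np.save(ts_path, np.array([30000, 60000, 90000]))  # fake TTL sample indices

# Loading it is a single numpy call - no npyx needed
timestamps = np.load(ts_path)
onsets_sec = timestamps / 30000.0  # assuming a 30 kHz AP-band sampling rate
print(onsets_sec.tolist())  # [1.0, 2.0, 3.0]
```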

@m-beau
Owner

m-beau commented Dec 17, 2021

Hi Javier,

Did it work out for you? Everything should work smoothly on an Open Ephys dataset, apart from extracting the sync channel, which is very easy to do manually (it is pre-extracted by Open Ephys, at 'events/Neuropix-PXI-100.0/TTL_1/timestamps.npy', I reckon).

@m-beau
Owner

m-beau commented Jul 18, 2022

Confirmed verbally that it worked - closing this issue

@m-beau m-beau closed this as completed Jul 18, 2022
@chris-angeloni
Contributor

FWIW, there was a recent OpenEphys update that has made most of this code no longer work for data recorded with the new GUI (mostly due to folder name changes). I'm working on updating it!

@m-beau
Owner

m-beau commented Oct 26, 2022

Damn! Thanks, pull requests from open ephys users are welcome! I am not acquainted with the system :)

@davoudian

Just dropping in to say I believe @chris-angeloni did fix this in his fork of the repo. I tried to open a PR but don't have permission. It could be worth merging so more Open Ephys folks can make use of this excellent repo.

@m-beau
Owner

m-beau commented Aug 3, 2023

Hey, thanks for letting me know - let me figure out how to edit the PR permissions (@brandonStell managed to open PRs in the past, I am surprised that it didn't work for you)

@m-beau m-beau reopened this Aug 3, 2023
@m-beau
Owner

m-beau commented Aug 3, 2023

It may be because @chris-angeloni 's fork is pretty far behind the current master branch (202 commits behind!)?

In any case @davoudian, if you were willing to make a fresh fork, edit the relevant bits of code, and then try to open a PR, it'd be dope.

Alternatively, I am happy to work on it myself - I just need someone to send me an openEphys dataset so that I can do proper unit testing.

@chris-angeloni
Contributor

Ah, yes, sorry I did not do a pull request sooner!

I am currently gearing up to do a bunch of neuropixels recordings, at which point I will need to revisit the analysis pipeline. At that time I will try to take the opportunity to clone the new branch and reimplement the fixes I previously added.

@davoudian

thanks @m-beau ! I may have some time early next week to try and open a PR.

in the meantime i've also emailed a short ~15min recording using oephys to the email listed on your github.

@m-beau
Owner

m-beau commented Aug 14, 2023

Hey guys!

I got your recording - I am working on this now.

the npyx.inout.metadata function loads openephys' metadata (inside the .oebin file) properly.

Am I right that the dataset you sent me isn't spike-sorted though? I cannot find any params.py or spike_times.npy file in your directory structure.

Maxime

@davoudian

yeah I just did a quick recording while setting up / de-noising a new rig. I can spike sort it via spikeinterface/KS if that would help though.

let me know

@m-beau
Owner

m-beau commented Aug 14, 2023

yep it'd be useful! just send me the output of the sorter, to make sure that other npyx functions work well.

Other question: in a dataset that someone sent me in the past, the AP and LFP folders were called Neuropix-PXI-100.0 and 100.1 instead: can you confirm that the Open Ephys-generated directories are always called 'blahblah-AP' and 'blahblah-LFP' these days for the AP and LFP channels?

@m-beau
Owner

m-beau commented Aug 14, 2023

I have just pushed npyx version 3.0.0 - the functions npyx.inout.metadata(dp) and npyx.inout.get_npix_sync(dp) work on Open Ephys datasets, when dp = "path/to/data/dir" where 'dir' is the folder which contains the .oebin file and the ./continuous and ./events folders (see the npyx readme file for details).

@m-beau
Owner

m-beau commented Aug 14, 2023

If you guys could try it out on a bunch of recordings for me, to make sure that it doesn't crash, that would be great. Happy to implement any fixes ASAP this week (download from pip or install from the master branch, making sure that version 3.0.0 is printed in your notebook/terminal after import)!

@davoudian

davoudian commented Aug 14, 2023

@m-beau fantastic!

  • I'll give it a shot today/tomorrow on some recordings and analysis tools I've built atop npyx.
  • Re: naming - yes, in my experience it is something like continuous\Neuropix-PXI-100.ProbeA-AP or continuous\Neuropix-PXI-100.ProbeA-LFP.
  • Just in case, I shared some spike-sorted data with you via email.

@davoudian

@m-beau two things I noticed by running some bare-bones analysis quickly:

  • Support for loading Open Ephys binary files was dropped from def get_binary_file_path at some point?
  • You use df.append a bit in loading/processing, but that has been deprecated. Not sure where else it is used, but you can either switch to concatenating lists of dataframes or require pandas<2.0 in setup.py.

I'll try to keep track of these and any other bugs I find, and hopefully open a PR to squash them when I have some more time. Thanks again for sharing all your hard work, glad to not be in matlab
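The df.append fix suggested in the second bullet can be sketched like this (toy column names, not npyx's actual ones): accumulate the pieces in a plain list and call pd.concat once at the end, which works on pandas 2.x.

```python
import pandas as pd

# Instead of repeatedly calling df.append (removed in pandas 2.0),
# accumulate row chunks in a list and concatenate once at the end.
rows = []
for i in range(3):
    rows.append(pd.DataFrame({"unit": [i], "rate_hz": [i * 10.0]}))

df = pd.concat(rows, ignore_index=True)
print(len(df))  # 3
```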

@m-beau
Owner

m-beau commented Aug 30, 2023

I just merged @chris-angeloni's pull request - it should do the trick. I am opening a new issue #318 regarding the pandas .append deprecation - I will not close this issue for another month to make sure we didn't miss any OE incompatibility!

@TsungChihTsai

I use Open Ephys for recording and Kilosort 3 for sorting, and create phy output. However, I still have problems using NeuroPyxels with Open Ephys data:

#WARNING only probe version Acquisition Board not handled with openEphys

I attempted to use quickstart.ipynb, but the cell (# Threshold crosses of the sync channel acquired with the SMA port on the acquisition board; onsets, offsets = get_npix_sync(dp); onsets) shows:


AssertionError Traceback (most recent call last)
Cell In[13], line 2
1 # Threshold crosses of the sync channel acquired with the SMA port on the acquisition board
----> 2 onsets, offsets = get_npix_sync(dp)
3 onsets

File ~\AppData\Local\miniconda3\envs\si_env\lib\site-packages\npyx\inout.py:460, in get_npix_sync(dp, output_binary, filt_key, unit, verbose, again, sample_span)
457 assert unit in ['seconds', 'samples']
459 sync_dp = dp/'sync_chan'
--> 460 meta = read_metadata(dp)
461 srate = meta[filt_key]['sampling_rate'] if unit=='seconds' else 1
463 # initialize variables

File ~\AppData\Local\miniconda3\envs\si_env\lib\site-packages\npyx\inout.py:48, in read_metadata(dp)
46 meta[probe] = metadata(dpx)
47 else:
---> 48 meta = metadata(dp)
50 return meta

File ~\AppData\Local\miniconda3\envs\si_env\lib\site-packages\npyx\inout.py:135, in metadata(dp)
133 # find probe version
134 oe_probe_version = meta_oe["continuous"][0]["source_processor_name"]
--> 135 assert oe_probe_version in probe_versions['oe'].keys(),
136 f'WARNING only probe version {oe_probe_version} not handled with openEphys - post an issue at www.github.com/m-beau/NeuroPyxels'
137 meta['probe_version']=probe_versions['oe'][oe_probe_version]
138 meta['probe_version_int'] = probe_versions['int'][meta['probe_version']]

AssertionError: WARNING only probe version Acquisition Board not handled with openEphys - post an issue at www.github.com/m-beau/NeuroPyxels

I think it means I should modify the probe info, but I don't know where I could change it.

I also followed the directory structure with "myrecording.oebin, params.py, spike_times.npy, spike_clusters.npy, cluster_groups.tsv # if manually curated with phy", and I used "Preprocess binary data":

from npyx.inout import preprocess_binary_file # star import is sufficient, but I like explicit imports!

# can perform bandpass filtering (butterworth 3 nodes) and median subtraction (aka common average referencing, CAR)
# in the future: ADC realignment (like CatGT), whitening, spatial filtering (experimental).
filtered_fname = preprocess_binary_file(dp, filt_key='ap', median_subtract=True, f_low=None, f_high=300, order=3, verbose=True)

and the results showed that

"Preprocessing not_found...
- median subtraction (aka common average referencing CAR),
- filtering in time (between 0 and 300 Hz) forward:True, backward:False.

AssertionError Traceback (most recent call last)
Cell In[23], line 5
1 from npyx.inout import preprocess_binary_file # star import is sufficient, but I like explicit imports!
3 # can perform bandpass filtering (butterworth 3 nodes) and median subtraction (aka common average referenceing, CAR)
4 # in the future: ADC realignment (like CatGT), whitening, spatial filtering (experimental).
----> 5 filtered_fname = preprocess_binary_file(dp, filt_key='ap', median_subtract=True, f_low=None, f_high=300, order=3, verbose=True)

File ~\AppData\Local\miniconda3\envs\si_env\lib\site-packages\npyx\inout.py:840, in preprocess_binary_file(dp, filt_key, fname, target_dp, move_orig_data, ADC_realign, median_subtract, f_low, f_high, order, filter_forward, filter_backward, spatial_filt, whiten, whiten_range, again_Wrot, verbose, again_if_preprocessed_filename, delete_original_data, data_deletion_double_check)
838 # fetch metadata
839 fk = {'ap':'highpass', 'lf':'lowpass'}[filt_key]
--> 840 meta = read_metadata(dp)
841 fs = meta[fk]['sampling_rate']
842 binary_byte_size = meta[fk]['binary_byte_size']

File ~\AppData\Local\miniconda3\envs\si_env\lib\site-packages\npyx\inout.py:48, in read_metadata(dp)
46 meta[probe] = metadata(dpx)
47 else:
---> 48 meta = metadata(dp)
50 return meta

File ~\AppData\Local\miniconda3\envs\si_env\lib\site-packages\npyx\inout.py:116, in metadata(dp)
114 glx_found = np.any(glx_ap_files) or np.any(glx_lf_files)
115 oe_found = np.any(oe_files)
--> 116 assert glx_found or oe_found,
117 f'WARNING no .ap/lf.meta (spikeGLX) or .oebin (OpenEphys) file found at {dp}.'
118 assert not (glx_found and oe_found),
119 'WARNING dataset seems to contain both an open ephys and spikeGLX metafile - fix this!'
120 assert len(glx_ap_files)==1 or len(glx_lf_files)==1 or len(oe_files)==1,
121 'WARNING more than 1 .ap.meta or 1 .oebin files found!'

AssertionError: WARNING no .ap/lf.meta (spikeGLX) or .oebin (OpenEphys) file found at ..

@m-beau
Owner

m-beau commented Feb 9, 2024

Hey there,

Try to install npyx from the master repo: pip install git+https://github.com/m-beau/NeuroPyxels.git

@TsungChihTsai

I did that, and it showed the following:

Collecting git+https://github.com/m-beau/NeuroPyxels.git
Cloning https://github.com/m-beau/NeuroPyxels.git to c:\users\ttsai\appdata\local\temp\pip-req-build-5i5y17np
Running command git clone --filter=blob:none --quiet https://github.com/m-beau/NeuroPyxels.git 'C:\Users\ttsai\AppData\Local\Temp\pip-req-build-5i5y17np'
Resolved https://github.com/m-beau/NeuroPyxels.git to commit 517bc12
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [8 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<string>", line 34, in <module>
File "C:\Users\ttsai\AppData\Local\Temp\pip-req-build-5i5y17np\setup.py", line 23, in <module>
readme = readme_file.read()
File "C:\Users\ttsai\AppData\Local\miniconda3\envs\si_env\lib\encodings\cp1252.py", line 23, in decode
return codecs.charmap_decode(input,self.errors,decoding_table)[0]
UnicodeDecodeError: 'charmap' codec can't decode byte 0x9d in position 1034: character maps to <undefined>
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

@TsungChihTsai

I thought my issue might be the directory structure, but I use the same directory structure as your README. I still cannot use test_npyx or onsets, offsets = get_npix_sync(dp) on my dataset. Could you please share the dataset used in quickstart.ipynb ("dp = "/media/maxime/AnalysisSSD/test_dataset" # dp stands for datapath")?

@m-beau
Owner

m-beau commented Feb 9, 2024

Try creating a new conda environment; it sounds like you have versioning issues.

Make sure that the Python version of your conda environment is 3.10 or lower! And try to install npyx from a local repository:

conda create -n my_environment python=3.10
conda activate my_environment
cd 'any folder'  # e.g. Downloads
git clone https://github.com/m-beau/NeuroPyxels
cd NeuroPyxels
pip install .

@TsungChihTsai

Thanks for your guidance and assistance!

I installed npyx from a local repository:
conda create -n Neuropyxels python=3.10
conda activate Neuropyxels
git clone https://github.com/m-beau/NeuroPyxels
cd NeuroPyxels
pip install .
cd ..
pip install jupyter
jupyter notebook

Then, I opened "quickstart.ipynb".

When I ran
"# Threshold crosses of the sync channel acquired with the SMA port on the acquisition board
onsets, offsets = get_npix_sync(dp)
onsets"

it showed the following:

UnboundLocalError Traceback (most recent call last)
Cell In[9], line 4
1 probe_index=0
3 # Threshold crosses of the sync channel acquired with the SMA port on the acquisition board
----> 4 onsets, offsets = get_npix_sync(dp)
5 onsets

File ~\NeuroPyxels\npyx\inout.py:479, in get_npix_sync(dp, output_binary, filt_key, unit, verbose, again, sample_span)
476 assert unit in ['seconds', 'samples']
478 sync_dp = dp / 'sync_chan'
--> 479 meta = read_metadata(dp)
480 srate = meta[filt_key]['sampling_rate'] if unit=='seconds' else 1
482 # initialize variables

File ~\NeuroPyxels\npyx\inout.py:48, in read_metadata(dp)
46 meta[probe] = metadata(dpx)
47 else:
---> 48 meta = metadata(dp)
50 return meta

File ~\NeuroPyxels\npyx\inout.py:138, in metadata(dp)
136 probe_index = i
137 break
--> 138 oe_probe_version = meta_oe["continuous"][probe_index]["source_processor_name"]
139 assert oe_probe_version in probe_versions['oe'].keys(),
140 f'WARNING only probe version {oe_probe_version} not handled with openEphys - post an issue at www.github.com/m-beau/NeuroPyxels'
141 meta['probe_version']=probe_versions['oe'][oe_probe_version]

UnboundLocalError: local variable 'probe_index' referenced before assignment

I'd like to know how to define "probe_index" before this.

I used the probe as below:
manufacturer = 'cambridgeneurotech'
probe_name = 'ASSY-196-P-1'
channels=16

@m-beau
Owner

m-beau commented Feb 11, 2024

Great to see you managed to install npyx!

The Open Ephys version hasn't been tested super well by myself, as I do not use it, but I think other users on this thread already solved this - @chris-angeloni @javierzsm @davoudian, if you managed to pull the latest version of npyx, copy-paste the section of your code which solves compatibility issues with Open Ephys and then commit or open a pull request (from a fork) - it would be much appreciated!

@TsungChihTsai

[screenshot attached]

I also saw "cupy could not be imported - some functions dealing with the binary file (filtering, whitening...) will not work."

@m-beau
Owner

m-beau commented Feb 11, 2024

Yep this is normal, check out the installation instructions in the readme file.

@yiyangcrick

Hi all - I am trying npyx on my data (i.e., Open Ephys + KS2.5 + phy-curated). I got an error when loading metadata (read_metadata) and spikes (trn(dp, u)). I followed the directory structure specified in the readme file. The error was as below:

TypeError Traceback (most recent call last)
Cell In[25], line 1
----> 1 meta = read_metadata(dp)

File c:\Users\yangy1\AppData\Local\anaconda3\envs\npyx\lib\site-packages\npyx\inout.py:48, in read_metadata(dp)
46 meta[probe] = metadata(dpx)
47 else:
---> 48 meta = metadata(dp)
50 return meta

File c:\Users\yangy1\AppData\Local\anaconda3\envs\npyx\lib\site-packages\npyx\inout.py:163, in metadata(dp)
161 meta[filt_key]={}
162 filt_key_i=filt_index[filt_key]
--> 163 meta[filt_key]['sampling_rate']=float(meta_oe["continuous"][filt_key_i]['sample_rate'])
164 meta[filt_key]['n_channels_binaryfile']=int(meta_oe["continuous"][filt_key_i]['num_channels'])
165 if params_f.exists():

TypeError: list indices must be integers or slices, not list

Here are the directories/files in my dp:
cluster_group.tsv npyxMemory spike_times.npy
continuous params.py structure.oebin
events spike_clusters.npy sync_messages.txt

Does anyone have any clue about what could be wrong?

@m-beau
Owner

m-beau commented May 10, 2024

Hi!

NeuroPyxels hasn't been super well tested with open ephys datasets - would you mind sending me your dataset as a Dropbox link so that I can try running it on my end and understand what's going on?

Cheers

@chris-angeloni
Contributor

chris-angeloni commented May 10, 2024

Do you have the most recent version of neuropyxels? I had to add some code before where your error is occurring to prevent this:

# index for highpass and lowpass
filt_index = {'highpass': [], 'lowpass': []}
for i,processor in enumerate(meta_oe['continuous']):
    if 'AP' in processor['folder_name']:
        filt_index['highpass'] = i
    if 'LFP' in processor['folder_name']:
        filt_index['lowpass'] = i

And the line numbers in your error don't quite line up with the current version of the code?
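To illustrate, the folder-name matching above can be exercised on a fake .oebin "continuous" list; the folder names below are modeled on those reported earlier in this thread, not read from a real recording.

```python
# Fake metadata mimicking the .oebin "continuous" entries (assumed names)
meta_oe = {"continuous": [
    {"folder_name": "Neuropix-PXI-100.ProbeA-AP/"},
    {"folder_name": "Neuropix-PXI-100.ProbeA-LFP/"},
]}

# Same matching logic as the snippet above: map each band to its list index
filt_index = {"highpass": [], "lowpass": []}
for i, processor in enumerate(meta_oe["continuous"]):
    if "AP" in processor["folder_name"]:
        filt_index["highpass"] = i
    if "LFP" in processor["folder_name"]:
        filt_index["lowpass"] = i

print(filt_index)  # {'highpass': 0, 'lowpass': 1}
```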

@yiyangcrick

yiyangcrick commented May 14, 2024

@chris-angeloni I used npyx 4.0.5 before. The error persists after updating to the latest 4.0.6.


@yiyangcrick

yiyangcrick commented May 14, 2024

@m-beau Thanks for the reply, and I am happy to share my data. The only problem is that the current recording is quite large, over 120 GB. I can send you a shorter one; it hasn't been curated using phy but has all the KS2.5 output. I think this should be enough to replicate the error? https://www.dropbox.com/scl/fo/0qi3zr16l3xr7zzob8oib/ACRRrywnHDf7aq1gdWz1_Vk?rlkey=jnvebqzty2lndwc3i744txlki&st=5q1f7qoq&dl=0

