Replace load_extractor with load function from loading.py #3613
base: main
Conversation
Again mostly cosmetic things on my part.
src/spikeinterface/core/base.py
Outdated
@@ -761,62 +761,71 @@ def load(file_path: Union[str, Path], base_folder: Optional[Union[Path, str, boo
        * save (...) a folder which contain data + json (or pickle) + metadata.

        """
        print(file_path, is_path_remote(file_path))
Should these be prints or warnings, in general?
removed
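For context, a minimal sketch of the alternative the question points at, assuming a hypothetical helper name (in the PR the debug print was simply removed):

```python
import warnings

def _load_with_feedback(file_path: str) -> None:
    # Hypothetical helper, only to illustrate the print-vs-warning question above.
    # A print always goes to stdout and cannot be silenced, while a warning goes
    # through the warnings machinery and can be filtered or turned into an error.
    warnings.warn(f"Loading {file_path}", UserWarning, stacklevel=2)

_load_with_feedback("my_analyzer.zarr")
```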
src/spikeinterface/core/base.py
Outdated
            with open(file_path, "rb") as f:
                d = pickle.load(f)
        else:
            raise ValueError(f"Impossible to load {file_path}")
Can we provide any other info here to help the user? "Impossible to load xx" is terse. Do we know the most common reason, so we could say "try this instead", or is it impossible to know the problem?
I unified all error messages now :)
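A rough sketch of what a unified, more helpful message could look like; the helper name and wording below are illustrative, not the text that was actually merged:

```python
def _raise_load_error(file_path) -> None:
    # Illustrative only: one shared message that states what failed and what the
    # loader expects, instead of a bare "Impossible to load {file_path}".
    raise ValueError(
        f"Could not load '{file_path}'. "
        "spikeinterface can only load objects that were previously saved with "
        "`.dump()`/`.save()` as json, pickle, or zarr. Check that the path "
        "points to such a file or folder."
    )
```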
src/spikeinterface/core/base.py
Outdated
            f = folder / f"cached.{dump_ext}"
            extractor = read_zarr(folder)
        else:
            # the is spikeinterface<=0.94.0
Shall we fix this comment while we are adjusting things? :)
done!
src/spikeinterface/core/base.py
Outdated
raise ValueError(f"This folder is not a cached folder {file_path}") | ||
extractor = BaseExtractor.load(file, base_folder=folder) | ||
else: | ||
error_msg = ( |
I think this is a great error message, as it states the problem and then gives the solution :)
src/spikeinterface/core/base.py
Outdated
        return extractor
            extractor = read_zarr(file_path)
        else:
            raise NotImplementedError("Only zarr format is supported for remote files")
Do we want to say: to convert your spikeinterface object to zarr, do xx with format=zarr? That might be getting too specific if the plan is to extend this to more formats in the future.
Hey @zm711
I don't understand this comment. I think it will be easy to extend the comment if and when we support more remote storage formats. What do you think?
I mean something like
NotImplementedError("We only currently support zarr format for remote files. Your file is format xx. You can convert this file to zarr by doing extractor.save(format=zarr) with the necessary arguments.")
but if the plan is to support a bunch of other formats, then it wouldn't make sense to overemphasize zarr in the error message, because the message won't stay true once we support more formats. If our goal is only to use zarr for remote connection, then it might be nice to have a message that states the problem and gives the solution, like above. Does that make sense?
so this branch is only for remote storage, so we would also need to add how to upload to a cloud bucket? :)
If we triggered the remote branch, then that would mean they were successful in getting it to the cloud, right? Because we check that the path is remote. But this final check is that they uploaded the wrong file type, right? Or am I misunderstanding? I think explaining the cloud would be overkill for this message; it would be better to just say "see our cloud docs" (and then make some docs on SI in the cloud :) )
Ahhh I see your point! Ok let me give it a try :)
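A sketch of the problem-plus-solution message the thread converges on; the suffix check, the helper name, and the exact wording are assumptions, and `extractor.save(format="zarr")` is the conversion path the reviewer suggests pointing users to:

```python
def _check_remote_format(file_path: str) -> None:
    # Illustrative only: name the problem (only zarr is supported for remote
    # paths), report what was found, and point at the fix.
    if not str(file_path).rstrip("/").endswith(".zarr"):
        raise NotImplementedError(
            f"Only zarr format is currently supported for remote files, and "
            f"'{file_path}' does not look like a zarr store. Convert the object "
            f"locally with `extractor.save(format='zarr')` and upload the "
            f"resulting folder (see the cloud documentation)."
        )
```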
Co-authored-by: Zach McKenzie <[email protected]>
for more information, see https://pre-commit.ci
I understand the need but I do not like the design.
Created a new loading.py file and moved the load function there (kept load_extractor for backward compatibility). The new function will be a magic load of all SI objects: Recording/Sorting/SortingAnalyzer/(WaveformExtractor).
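A hypothetical usage sketch of the interface described above; the function name `load` and the import path are taken from the PR description and are not guaranteed to match the final API:

```python
# The generic loader is meant to return whichever SI object was saved at the
# given path: Recording, Sorting, SortingAnalyzer, or a legacy WaveformExtractor.
from spikeinterface import load, load_extractor  # import path assumed

recording = load("my_recording_folder")  # a previously saved Recording
analyzer = load("my_analyzer.zarr")      # a previously saved SortingAnalyzer

# Older code that used load_extractor keeps working (kept for backward compatibility).
recording_again = load_extractor("my_recording_folder")
```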