
AttributeError: 'IncrementalPCA' object has no attribute 'components_'. Did you mean: 'n_components'? #3219

Closed
MarinManuel opened this issue Jul 17, 2024 · 6 comments · Fixed by #3224
Labels
postprocessing Related to postprocessing module

Comments

@MarinManuel

I started getting this error recently when computing extensions on my analyzer. What is curious to me is that it happens when processing sortings from Kilosort 2.5 and Kilosort 4, but not Kilosort 3.

My code looks like this:

analyzer = si.create_sorting_analyzer(
    sorting=sorting,
    recording=rec,
    sparse=True,
    format="zarr",
    folder=analyzer_destination,
)
analyzer.compute("random_spikes", method="uniform", max_spikes_per_unit=500)
analyzer.compute("waveforms", ms_before=1.5, ms_after=2.0)
analyzer.compute("templates", operators=["average", "median", "std"])
analyzer.compute("noise_levels")
analyzer.compute("correlograms")
analyzer.compute("unit_locations")
analyzer.compute("spike_amplitudes")
analyzer.compute("template_similarity")
analyzer.compute("principal_components", n_jobs=1)  # using n_jobs=-1 makes this function take much longer
analyzer.compute("quality_metrics")

si.export_to_phy(analyzer, output_folder=phy_destination, remove_if_exists=False)

and I get the error:

(...)
Projecting waveforms:  79%|███████▉  | 908/1146 [00:17<00:04, 52.80it/s]
Projecting waveforms:  80%|███████▉  | 913/1146 [00:17<00:04, 51.61it/s]
Traceback (most recent call last):
  File "/work/pi_mmanuel_uri_edu/Analysis/run_slurm.py", line 78, in <module>
    analyzer.compute("principal_components", n_jobs=1)  # using n_jobs=-1 makes this function take much longer
  File "/work/pi_mmanuel_uri_edu/src/spikeinterface/src/spikeinterface/core/sortinganalyzer.py", line 915, in compute
    return self.compute_one_extension(extension_name=input, save=save, verbose=verbose, **kwargs)
  File "/work/pi_mmanuel_uri_edu/src/spikeinterface/src/spikeinterface/core/sortinganalyzer.py", line 996, in compute_one_extension
    extension_instance.run(save=save, verbose=verbose, **job_kwargs)
  File "/work/pi_mmanuel_uri_edu/src/spikeinterface/src/spikeinterface/core/sortinganalyzer.py", line 1673, in run
    self._run(**kwargs)
  File "/work/pi_mmanuel_uri_edu/src/spikeinterface/src/spikeinterface/postprocessing/principal_component.py", line 291, in _run
    pca_projection = self._transform_waveforms(some_spikes, some_waveforms, pca_model, progress_bar)
  File "/work/pi_mmanuel_uri_edu/src/spikeinterface/src/spikeinterface/postprocessing/principal_component.py", line 494, in _transform_waveforms
    proj = pca_model.transform(wfs[:, :, wf_ind])
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/utils/_set_output.py", line 313, in wrapped
    data_to_wrap = f(self, X, *args, **kwargs)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/decomposition/_incremental_pca.py", line 414, in transform
    return super().transform(X)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/utils/_set_output.py", line 313, in wrapped
    data_to_wrap = f(self, X, *args, **kwargs)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/decomposition/_base.py", line 139, in transform
    xp, _ = get_namespace(X, self.components_, self.explained_variance_)
AttributeError: 'IncrementalPCA' object has no attribute 'components_'. Did you mean: 'n_components'?
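For context, the bottom of the traceback shows `transform` being called on an `IncrementalPCA` whose `components_` was never set, i.e. a model that was never (partially) fitted. A minimal sketch of the failure, independent of spikeinterface (the exact exception type may vary with the scikit-learn version):

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

ipca = IncrementalPCA(n_components=3)

# transform() before any partial_fit(): components_ was never set,
# so sklearn fails inside transform (AttributeError here; newer
# versions may raise NotFittedError instead)
try:
    ipca.transform(np.zeros((5, 10)))
    raised = False
except Exception:
    raised = True
print(raised)
```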
@alejoe91
Member

Hi @MarinManuel

What version are you using? It should be fixed on main by a "simple" fix that we merged a few days ago:
#3178

@alejoe91 alejoe91 added the postprocessing Related to postprocessing module label Jul 17, 2024
@MarinManuel
Author

Yes, upgrading to 0.101rc1 seems to have fixed it

@MarinManuel
Author

Actually, I spoke too soon:

si.__version__
'0.101.0rc1'

si.export_to_phy(analyzer, output_folder=output_folder, remove_if_exists=True)
write_binary_recording: 100% 8134/8134 [08:56<00:00, 18.52it/s]
extract PCs:  67% 5422/8134 [15:22<04:08, 10.89it/s]

---------------------------------------------------------------------------
_RemoteTraceback                          Traceback (most recent call last)
_RemoteTraceback: 
"""
Traceback (most recent call last):
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/process.py", line 246, in _process_worker
    r = call_item.fn(*call_item.args, **call_item.kwargs)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/process.py", line 205, in _process_chunk
    return [fn(*args) for args in chunk]
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/process.py", line 205, in <listcomp>
    return [fn(*args) for args in chunk]
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/spikeinterface/core/job_tools.py", line 463, in function_wrapper
    return _func(segment_index, start_frame, end_frame, _worker_ctx)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/spikeinterface/postprocessing/principal_component.py", line 640, in _all_pc_extractor_chunk
    all_pcs[i, :, c] = pca_model[chan_ind].transform(w)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/utils/_set_output.py", line 313, in wrapped
    data_to_wrap = f(self, X, *args, **kwargs)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/decomposition/_incremental_pca.py", line 414, in transform
    return super().transform(X)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/utils/_set_output.py", line 313, in wrapped
    data_to_wrap = f(self, X, *args, **kwargs)
  File "/work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/sklearn/decomposition/_base.py", line 139, in transform
    xp, _ = get_namespace(X, self.components_, self.explained_variance_)
AttributeError: 'IncrementalPCA' object has no attribute 'components_'. Did you mean: 'n_components'?
"""

The above exception was the direct cause of the following exception:

AttributeError                            Traceback (most recent call last)
Cell In[33], line 1
----> 1 si.export_to_phy(analyzer, output_folder='/scratch/workspace/mmanuel_uri_edu-2024-07-11/2024-07-11_probeA_4-11_kilosort2_5_phy', remove_if_exists=True)

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/spikeinterface/exporters/to_phy.py:230, in export_to_phy(sorting_analyzer, output_folder, compute_pc_features, compute_amplitudes, sparsity, copy_binary, remove_if_exists, template_mode, add_quality_metrics, add_template_metrics, additional_properties, dtype, verbose, use_relative_path, **job_kwargs)
    226     sorting_analyzer.compute("principal_components", n_components=5, mode="by_channel_local", **job_kwargs)
    228 pca_extension = sorting_analyzer.get_extension("principal_components")
--> 230 pca_extension.run_for_all_spikes(output_folder / "pc_features.npy", **job_kwargs)
    232 max_num_channels_pc = max(len(chan_inds) for chan_inds in used_sparsity.unit_id_to_channel_indices.values())
    233 pc_feature_ind = -np.ones((len(unit_ids), max_num_channels_pc), dtype="int64")

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/spikeinterface/postprocessing/principal_component.py:411, in ComputePrincipalComponents.run_for_all_spikes(self, file_path, verbose, **job_kwargs)
    399 init_args = (
    400     recording,
    401     sorting.to_multiprocessing(job_kwargs["n_jobs"]),
   (...)
    406     pca_model,
    407 )
    408 processor = ChunkRecordingExecutor(
    409     recording, func, init_func, init_args, job_name="extract PCs", verbose=verbose, **job_kwargs
    410 )
--> 411 processor.run()

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/spikeinterface/core/job_tools.py:425, in ChunkRecordingExecutor.run(self)
    422 if self.progress_bar:
    423     results = tqdm(results, desc=self.job_name, total=len(all_chunks))
--> 425 for res in results:
    426     if self.handle_returns:
    427         returns.append(res)

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/tqdm/notebook.py:250, in tqdm_notebook.__iter__(self)
    248 try:
    249     it = super().__iter__()
--> 250     for obj in it:
    251         # return super(tqdm...) will not catch exception
    252         yield obj
    253 # NB: except ... [ as ...] breaks IPython async KeyboardInterrupt

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/site-packages/tqdm/std.py:1181, in tqdm.__iter__(self)
   1178 time = self._time
   1180 try:
-> 1181     for obj in iterable:
   1182         yield obj
   1183         # Update and possibly print the progressbar.
   1184         # Note: does not call self.update(1) for speed optimisation.

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/process.py:575, in _chain_from_iterable_of_lists(iterable)
    569 def _chain_from_iterable_of_lists(iterable):
    570     """
    571     Specialized implementation of itertools.chain.from_iterable.
    572     Each item in *iterable* should be a list.  This function is
    573     careful not to keep references to yielded objects.
    574     """
--> 575     for element in iterable:
    576         element.reverse()
    577         while element:

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/_base.py:621, in Executor.map.<locals>.result_iterator()
    618 while fs:
    619     # Careful not to keep a reference to the popped future
    620     if timeout is None:
--> 621         yield _result_or_cancel(fs.pop())
    622     else:
    623         yield _result_or_cancel(fs.pop(), end_time - time.monotonic())

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/_base.py:319, in _result_or_cancel(***failed resolving arguments***)
    317 try:
    318     try:
--> 319         return fut.result(timeout)
    320     finally:
    321         fut.cancel()

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/_base.py:451, in Future.result(self, timeout)
    449     raise CancelledError()
    450 elif self._state == FINISHED:
--> 451     return self.__get_result()
    453 self._condition.wait(timeout)
    455 if self._state in [CANCELLED, CANCELLED_AND_NOTIFIED]:

File /work/pi_mmanuel_uri_edu/envs/NPX/lib/python3.10/concurrent/futures/_base.py:403, in Future.__get_result(self)
    401 if self._exception:
    402     try:
--> 403         raise self._exception
    404     finally:
    405         # Break a reference cycle with the exception in self._exception
    406         self = None

AttributeError: 'IncrementalPCA' object has no attribute 'components_'

@alejoe91
Member

@MarinManuel can you try from this PR? I protected the transform in `run_for_all_spikes` as well: #3224
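The actual fix is in #3224; as a rough sketch of what "protecting the transform" can look like (hypothetical code, not the PR's implementation), one can transform only models that were actually fitted and fall back to zeros for the rest:

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(0)
waveforms = rng.normal(size=(40, 10))  # (n_spikes, n_samples); made-up shapes

fitted = IncrementalPCA(n_components=3)
fitted.partial_fit(waveforms)

unfitted = IncrementalPCA(n_components=3)  # e.g. a channel that never got spikes

projections = []
for model in (fitted, unfitted):
    if hasattr(model, "components_"):
        # model was fitted at least once: safe to transform
        projections.append(model.transform(waveforms))
    else:
        # guard: skip unfitted models instead of crashing, fill with zeros
        projections.append(np.zeros((waveforms.shape[0], 3)))
```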

@MarinManuel
Author

Hi Alessio,
I get the same error, but maybe I didn't install the right version? I ran pip install git+https://github.com/SpikeInterface/spikeinterface.git@refs/pull/3224/merge to install #3224; is that correct?
the version installed now is

pip freeze | grep spike
spikeinterface @ git+https://github.com/SpikeInterface/spikeinterface.git@c929e514533cb8f4e19dd5f77ade8a4b5f6309f8

@alejoe91
Member

@MarinManuel we just merged to main. Can you try to pull from there? I'm 99% sure that was the issue, but let's see
