Don't let decimate mess with times and skim tests #3519
Conversation
Couple comments. I think Joe will definitely be able to comment better on the time component.
```diff
@@ -93,22 +90,26 @@ class DecimateRecordingSegment(BaseRecordingSegment):
     def __init__(
         self,
         parent_recording_segment,
-        resample_rate,
+        decimated_sampling_frequency,
```
This would be a breaking change no? Should we deprecate?
Yes, but it's so deep in the API that I'm 100% sure it wouldn't affect anyone's workflow.
If it were at the `DecimateRecording` level, then we would have had to worry about back-compatibility because of saved objects/JSON files. But since it's the segment, which is instantiated on the fly, we don't have to worry about it (and I think the naming is much more in line with the overall API).
Absolutely agree with the naming. And makes sense if it's deep. I wasn't sure if this was more on the private or public side, but makes sense that the Segment level is basically private.
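For reference, had the rename been at a public level, a deprecation path is straightforward. This is only a hypothetical sketch (the class name, signature, and warning text are illustrative, not the actual SpikeInterface code): keep the old keyword alive for one release and warn.

```python
import warnings


class DecimateRecordingSegmentSketch:
    """Hypothetical sketch: accept the old `resample_rate` keyword for one
    release while steering callers toward `decimated_sampling_frequency`."""

    def __init__(self, parent_recording_segment=None,
                 decimated_sampling_frequency=None, resample_rate=None):
        if resample_rate is not None:
            # Old name still works, but emits a DeprecationWarning.
            warnings.warn(
                "`resample_rate` is deprecated; use `decimated_sampling_frequency` instead.",
                DeprecationWarning,
                stacklevel=2,
            )
            decimated_sampling_frequency = resample_rate
        self.parent_recording_segment = parent_recording_segment
        self.decimated_sampling_frequency = decimated_sampling_frequency
```

Since the segment is private and rebuilt on the fly, the PR skips this machinery entirely, which seems reasonable.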
```python
new_t_start = None
if parent_recording_segment.time_vector is not None:
    time_vector = parent_recording_segment.time_vector[decimation_offset::decimation_factor]
    decimated_sampling_frequency = None
```
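The slicing in the diff above can be sketched standalone (all values here are hypothetical, chosen just to illustrate the strided selection):

```python
import numpy as np

# Hypothetical parameters mirroring the names in the diff.
decimation_factor = 4
decimation_offset = 1

# A parent segment with one time stamp per sample: 100 samples at 100 Hz.
time_vector = np.arange(0.0, 1.0, 0.01)

# Decimating the time vector is plain strided slicing: keep exactly the
# time stamps of the samples that survive decimation.
decimated_times = time_vector[decimation_offset::decimation_factor]

# With an explicit time vector, the sampling frequency is left unset,
# since times are defined per sample rather than by a fixed rate.
decimated_sampling_frequency = None
```

The key point of the fix is that the kept time stamps stay aligned with the kept samples, instead of the time vector being dropped.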
What's this None for? I think Joe has thought about this more, so it's tricky for me to reason about time vector vs t_start and when we want a frequency of None vs a value.
Currently the `time_vector` and `sampling_frequency`/`t_start` representations of time in a segment are still mutually exclusive, so we need to set the `sampling_frequency` to None.
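To illustrate the mutual exclusivity (a sketch with made-up numbers, not the actual segment internals): a segment's times come either from a rate plus start time, or from an explicit per-sample vector, never both.

```python
import numpy as np

# Representation 1: regular sampling, times derived from rate and t_start.
sampling_frequency = 100.0  # Hz
t_start = 2.0
num_samples = 5
times_from_rate = t_start + np.arange(num_samples) / sampling_frequency

# Representation 2: an explicit, possibly irregular, per-sample time vector.
time_vector = np.array([2.0, 2.011, 2.019, 2.032, 2.040])

# A segment holds one representation or the other. When a time_vector is
# attached, sampling_frequency must be None so the two cannot disagree.
sampling_frequency_when_vector_set = None
```

This is why the decimated segment sets `decimated_sampling_frequency = None` whenever it inherits a time vector from its parent.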
Okay, cool. Maybe we can have a chat about this at some point. I don't want to take up developer-meeting time for this necessarily, but since I don't use the time API I don't know it well enough. :)
RTD has been so flaky. It has consistently failed on one of my Neo PRs for no reason...
Yeah, we can look at that later... let me know if it looks good!
```python
decimated_rec = DecimateRecording(rec, decimation_factor, decimation_offset=decimation_offset)

for segment_index in range(rec.get_num_segments()):
    assert np.allclose(
```
Should we explicitly set the tolerance here? We often get flakiness due to floating-point rounding. Does `get_times` return floats?
Good point. I think exact equality will work here too. I'll push an update.
@zm711 use `np.testing.assert_array_almost_equal(..., decimal=10)`
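For context, a minimal sketch of the suggested assertion (the arrays here are made up; the PR's actual test compares `get_times` outputs): `assert_array_almost_equal` passes as long as the elementwise difference stays below roughly `0.5 * 10**-decimal`.

```python
import numpy as np

# Two arrays that agree to ~1e-15, the error level reported in this thread.
expected = np.linspace(0.0, 1.0, 11)
actual = expected + 1e-15

# Passes: 1e-15 is far below the decimal=10 threshold.
np.testing.assert_array_almost_equal(actual, expected, decimal=10)

# A difference of 1e-3 would fail the same check with an AssertionError.
```

Unlike a bare `==` comparison, this also treats NaNs in matching positions as equal, which `assert np.allclose(...)` does not do by default.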
You think 10 decimal places? It looks like `assert_array_almost_equal` also deals with NaNs, which is nice. I vaguely remember Heberto having to slowly relax one of these styles of test because it kept failing. 10 decimals seems super exact. Why not the default? (Which is 7, based on the most recent docs.)
In my tests the absolute error was ~1e-15 :)
This works for me :)
Deep bug in decimate that deletes existing time vectors for segments!
This PR modifies the behavior so that `time_vector`s are also decimated, which is much more intuitive. It also slims down the tests a bit (previously 2100 parameterized tests, now "only" 25).