
Use common beam instead of beam averaging #656

Open
wants to merge 63 commits into master from e-koch:remove_beamaverage
Conversation

e-koch
Contributor

@e-koch e-koch commented Aug 15, 2020

Remove beam averaging and force operations to use the common beam. This depends on improvements to the common beam algorithm in radio-beam (radio-astro-tools/radio-beam#81). Addresses #615.

  • Remove current usage of beam averaging

  • [ ] Generalize identification of bad beams to use differences in area instead of deviation from the median. Bad beams are outliers; the current implementation should be fine in most cases.

  • Toggle to enforce common beam operations vs. allow small deviations.

  • Cache the common beam somewhere if goodbeams_mask doesn't change. Added a VRSC.common_beam property that can be recomputed with VRSC.set_common_beam (e.g., if you change goodbeams_mask, alter the common beam kwarg inputs, etc.)

  • Add beam area checking for DaskVaryingResSpectralCube

  • Use VRSC.goodbeams_mask by default for selecting which beams to mask in VRSC.set_common_beam. The prior default was mask='compute', which triggers a full read-in of the cube mask.

  • Cache a version of cube.mask.any(axis=(1, 2)) as SpectralCube.unmasked_channels (see the point directly above).

  • Expand docs on the new changes and on how beams are handled in general.

  • Added VRSC.common_beam, which is set by enabling compute_commonbeam=True on read (SpectralCube.read). This computes the common beam for the cube in its state at read-in.

Key changes

  • The common beam estimator requires scipy to be installed (it needs scipy.spatial.ConvexHull). Changed scipy to be a required package (we could split this out to keep it optional, used only when there's a VRSC).
  • Replaced VRSC.average_beams with VRSC._compute_common_beam, which returns a common beam as a radio_beam.Beam. On read-in, the property VRSC.common_beam can be set by enabling compute_commonbeam=True in SpectralCube.read. VRSC.average_beams now raises a DeprecationWarning and passes the original args to VRSC.set_common_beam.
  • Changed the default masking behaviour for the common beam to only use VRSC.goodbeams_mask in VRSC.set_common_beam (mask='goodbeams'). The previous default was mask='compute', which combines goodbeams_mask with the full cube mask, triggering a full read-in of the data.
  • Added a bool flag VRSC.strict_beam_match which enforces exact beam matching (VRSC.beam_threshold = 0.0). This raises an error for any spectral operation, with a message to convolve to the same beam first.
  • Added SpectralCube.unmasked_channels, which caches cube.mask.include().any(axis=(1, 2)). The latter requires the whole mask to be read in/computed and is quite slow for big cubes. cube.unmasked_channels is for checking which channels are fully masked.
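The unmasked_channels reduction described above is a simple any() over the two spatial axes of the boolean include mask; a minimal numpy sketch (the standalone function here is illustrative, not the actual cube method):

```python
import numpy as np

def unmasked_channels(include_mask):
    """Return a boolean array marking channels with at least one
    unmasked (included) pixel. `include_mask` is the 3D boolean
    include mask with shape (n_chan, n_y, n_x); this mirrors the
    cube.mask.include().any(axis=(1, 2)) reduction."""
    return include_mask.any(axis=(1, 2))

# Example: a 4-channel cube where channel 0 is fully masked out.
mask = np.ones((4, 3, 2), dtype=bool)
mask[0] = False
print(unmasked_channels(mask))  # [False  True  True  True]
```

Checking this small array is cheap, but the point of caching it is that for a real cube the reduction forces every mask chunk to be read or computed.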

@coveralls

coveralls commented Aug 15, 2020

Coverage Status

Coverage increased (+0.5%) to 86.522% when pulling 522690c on e-koch:remove_beamaverage into 1497c07 on radio-astro-tools:master.

@e-koch
Contributor Author

e-koch commented Aug 21, 2020

@keflavich -- Computing the mask to remove fully-masked channels is a bit of a problem: https://github.com/radio-astro-tools/spectral-cube/pull/656/files#diff-8f799a9c2fa6fd38a2278d484cd416efR595. It triggers a full read-in of the data (into memory for non-dask cubes) and is very slow.

I'm wondering if the default should be to avoid loading the whole cube mask and just use the goodbeams_mask instead.

@e-koch
Contributor Author

e-koch commented Aug 26, 2020

@astrofrog - Have you seen this InvocationError before? https://travis-ci.org/github/radio-astro-tools/spectral-cube/jobs/721172051 I can't reproduce it locally.

@astrofrog
Member

I’ll try and take a look shortly!

@e-koch
Contributor Author

e-koch commented Aug 26, 2020

@astrofrog Fixed it (somehow)! I think a file wasn't being closed when the test failed.

@e-koch e-koch marked this pull request as ready for review August 28, 2020 00:00
@e-koch e-koch changed the title from "WIP: Use common beam instead of beam averaging" to "Use common beam instead of beam averaging" Aug 28, 2020
@e-koch
Contributor Author

e-koch commented Aug 28, 2020

Tests on Travis are giving an InvocationError again when one test using a dask cube fails; I think the file doesn't get closed.

Here's the failure I get locally:

___________________________________________________________________ test_unmasked_channels[True-data_vda] ___________________________________________________________________

filename = PosixPath('/tmp/pytest-of-eric/pytest-60/test_unmasked_channels_True_da0/vda.fits'), use_dask = True

    @pytest.mark.parametrize('filename', ['data_vda', 'data_vda_beams'],
                             indirect=['filename'])
    def test_unmasked_channels(filename, use_dask):
    
        cube, data = cube_and_raw(filename, use_dask=use_dask)
    
        # Add a mask to the cube
        mask = np.ones(cube.shape, dtype=bool)
        mask[0] = False
        cube = cube.with_mask(mask)
    
>       np.testing.assert_equal(cube.unmasked_channels, mask.any(axis=(1, 2)))

spectral_cube/tests/test_spectral_cube.py:2570: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
spectral_cube/utils.py:18: in wrapper
    self._cache[(func, args)] = func(self, *args)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = DaskSpectralCube with shape=(4, 3, 2) and unit=K and chunk size (4, 3, 2):
 n_x:      2  type_x: RA---SIN  unit_x: deg...094 deg:   29.935209 deg
 n_s:      4  type_s: VOPT      unit_s: km / s  range:     -321.215 km / s:    -317.350 km / s

    @property
    @cached
    def unmasked_channels(self):
        if isinstance(self._data, da.Array):
>           return self._compute(da.any(self._mask_include, axis=(1, 2)))
E           AttributeError: 'DaskSpectralCube' object has no attribute '_mask_include'

spectral_cube/base_class.py:353: AttributeError
========================================================================== short test summary info ==========================================================================
FAILED spectral_cube/tests/test_spectral_cube.py::test_unmasked_channels[True-data_vda] - AttributeError: 'DaskSpectralCube' object has no attribute '_mask_include'

The non-varying-resolution cube isn't getting its mask set at an earlier stage, but the test with a VRSC is fine.
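The @cached wrapper visible in the traceback (spectral_cube/utils.py) stores results on the instance keyed by (func, args), so a property like unmasked_channels is only computed once. A minimal self-contained sketch of that pattern; the Cube class and its return value here are illustrative, not the real implementation:

```python
from functools import wraps

def cached(func):
    """Sketch of the caching wrapper seen in the traceback: store each
    result in the instance's _cache dict keyed on (func, args), so
    repeated property access does not rerun expensive reductions."""
    @wraps(func)
    def wrapper(self, *args):
        if not hasattr(self, '_cache'):
            self._cache = {}
        if (func, args) not in self._cache:
            self._cache[(func, args)] = func(self, *args)
        return self._cache[(func, args)]
    return wrapper

class Cube:
    def __init__(self):
        self.calls = 0

    @property
    @cached
    def unmasked_channels(self):
        self.calls += 1  # the expensive mask reduction would happen here
        return (False, True, True, True)

cube = Cube()
cube.unmasked_channels
cube.unmasked_channels
print(cube.calls)  # 1: the second access hits the cache
```

A consequence of caching on the instance is the staleness concern raised later in this thread: if the mask changes after the first access, the cached value no longer reflects the cube.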

@e-koch
Contributor Author

e-koch commented Aug 28, 2020

@astrofrog @keflavich -- ready for review

Contributor

@keflavich keflavich left a comment


Some minor changes.

I just want to make sure: the default behavior is not to convolve to a common beam, is it? I want to be sure we're not triggering massive convolution operations without warning the user, since that tends to crash computers.

docs/beam_handling.rst (review comments, outdated, resolved)
@e-koch
Contributor Author

e-koch commented Sep 22, 2020

No, there's no auto-trigger of a whole-cube convolution. An error is raised if the beams deviate too much from the common beam, similar to what we already have with the median/average beam.
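The deviation check described here can be sketched as follows; the function and argument names are hypothetical illustrations, not the PR's actual API:

```python
def check_beams_near_common(beam_areas, common_beam_area, threshold=0.01):
    """Raise if any channel's beam area deviates fractionally from the
    common beam area by more than `threshold`, instead of silently
    triggering a whole-cube convolution."""
    for area in beam_areas:
        deviation = abs(area - common_beam_area) / common_beam_area
        if deviation > threshold:
            raise ValueError(
                "Beams deviate by more than {0:.1%} from the common beam; "
                "convolve the cube to the common beam first "
                "(e.g. cube.convolve_to(common_beam)).".format(threshold))

# Beams within 1% of the common beam pass silently:
check_beams_near_common([1.00, 1.005, 0.998], common_beam_area=1.0)
```

With threshold=0.0 (the strict_beam_match behaviour described above) any mismatch at all would raise, forcing an explicit convolution step.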

Copy link
Member

@astrofrog astrofrog left a comment


Looks good overall! A question below about API, in particular I think it would be cleaner to return a common beam when it is computed than store it in a stateful way on the cube.

This will also definitely need an entry in the changelog! 😆

docs/beam_handling.rst (review comments, outdated, resolved)
in the cube. To do this, we can use
:meth:`~spectral_cube.VaryingResolutionSpectralCube.set_common_beam`::

cube.set_common_beam()
Member


Is there a reason why we need to make a stateful change on the cube and set common_beam? We could instead do something like:

common_beam = cube.find_common_beam()
cube.convolve_to(common_beam)

? Having it stateful means it could get out of sync if e.g. some channels are masked and so on. In general we might want to avoid stateful changes to cubes if we can?

With find_common_beam above one could even find a common beam with different settings and compare them without having to make a stateful change each time.

Contributor Author


The only issue is that the common beam operation can be slow for cubes with many (>1e3) channels. Also, the deviation checks are baked in: we need to check for variation from the common beam, where we previously used the average beam. That's currently our only check for whether to ignore small beam changes in operations along the spectral axis.

Allowing a stateful change could also work if we force the beam masking to not be stateful (which I think it can be now).
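One way to reconcile the two concerns (an expensive computation vs. a stateless API) would be to memoize a stateless common-beam lookup on the goodbeams mask, so repeated calls with an unchanged mask are cheap without storing state on the cube. A rough sketch under that assumption; every name here is hypothetical:

```python
def make_find_common_beam(compute_common_beam):
    """Wrap an expensive common-beam computation with a cache keyed on
    the goodbeams mask, so unchanged masks reuse the prior result."""
    cache = {}

    def find_common_beam(goodbeams_mask):
        key = tuple(goodbeams_mask)  # hashable snapshot of the mask
        if key not in cache:
            cache[key] = compute_common_beam(goodbeams_mask)
        return cache[key]

    return find_common_beam

calls = []

def slow_common_beam(mask):
    calls.append(1)       # stands in for the >1e3-channel computation
    return sum(mask)      # placeholder for the real common beam

find_common_beam = make_find_common_beam(slow_common_beam)
find_common_beam([True, True, False])
find_common_beam([True, True, False])
print(len(calls))  # 1: the second call reuses the cached result
```

Because the cache key is the mask itself, masking extra channels automatically invalidates the cached beam, which avoids the out-of-sync hazard of a stored cube attribute.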

spectral_cube/base_class.py (review comments, outdated, resolved)
e-koch and others added 22 commits November 11, 2020 11:12
Co-authored-by: Adam Ginsburg <[email protected]>
Co-authored-by: Adam Ginsburg <[email protected]>
Co-authored-by: Adam Ginsburg <[email protected]>
Co-authored-by: Thomas Robitaille <[email protected]>
…be deconvolved; add that option in our tests here for failure cases
@e-koch
Contributor Author

e-koch commented Nov 19, 2020

  • Tests for loading varying-res cubes with common beam computed (compute_commonbeam=True)
  • Docs for loading with compute_commonbeam=True
  • Test for with_bad_beams_masked (replacing state change in identify_bad_beams)
  • Update docs for with_bad_beams_masked
  • Test for new convolve_to_commonbeam
  • Add docs to convolution operations for convolve_to_commonbeam

@keflavich
Contributor

Discussion today: @e-koch will try splitting up this PR to avoid state-change related issues.

Maybe we need to remove the median(beams) operation.
