Make cutouts out of TICA cubes #60

Merged: 66 commits, Nov 1, 2022

Commits (66)
db9e269
initializing TicaCutoutFactory
jaymedina Apr 25, 2022
f444a92
_parse_info_table modified
jaymedina Apr 25, 2022
cc19b3d
cutout limits are applied
jaymedina Apr 25, 2022
a550525
adding tpf build etc
jaymedina Apr 26, 2022
63de19b
tpf built
jaymedina Apr 28, 2022
8bd3a3b
writing tpf to file
jaymedina May 4, 2022
14164d1
adding FFI_TYPE primary header kw denoting tica or tess cutout
jaymedina May 11, 2022
5c2f4f9
.
jaymedina May 11, 2022
e99582e
adding ffi_type primary header kwd to both tica and tess
jaymedina May 18, 2022
1a95cb2
comment update
jaymedina Jun 2, 2022
d015b12
updating kwds
jaymedina Jun 3, 2022
71b0fd5
adding img kwds in init
jaymedina Jun 3, 2022
8bf77fc
updating primary header kwds. NOTE:
jaymedina Jun 3, 2022
7910322
removing print statements
jaymedina Jun 21, 2022
06439c8
updating TICA primary header to match SPOC primary header
jaymedina Aug 2, 2022
e52d93c
bug fix. BJDREFI needed to be called before TELAPSE gets calculated
jaymedina Aug 2, 2022
f2d24cf
updating img kwds
jaymedina Aug 4, 2022
031037a
updating unit tests to iterating thru spoc and tica
jaymedina Aug 4, 2022
3a2c66b
cleaning
jaymedina Aug 4, 2022
309c43f
clean
jaymedina Aug 4, 2022
7aac689
.
jaymedina Aug 4, 2022
408a8bc
.
jaymedina Aug 4, 2022
73fa88b
fixing key error bug when calculating TIME column
jaymedina Sep 5, 2022
0fae7b4
fixing array type for CADENCEN0
jaymedina Sep 5, 2022
6a59345
fixing typo in flux col under _build_tpf
jaymedina Sep 6, 2022
01f0d42
adding FIRST_FFI and LAST_FFI prim. hdr kwds to denote the FFIs used …
jaymedina Oct 17, 2022
3a3f041
updating TIMESYS keyword for TICA
jaymedina Oct 18, 2022
66d96f1
adding TICA functionality into CutoutFactory
jaymedina Oct 18, 2022
458a35c
removing TicaCutoutFactory
jaymedina Oct 18, 2022
a68208c
fixing header keyword comments; removing unnecessary primary header k…
jaymedina Oct 19, 2022
050bc64
fixing TIMEREF and TASSIGN kwd vals for tica
jaymedina Oct 19, 2022
fea4919
removing time, exptime, and filter kwds from tica ext 1 header
jaymedina Oct 19, 2022
f3b37e1
populating EXT 1 bin table CADENCEN0 column for TICA with CADENCE kwd…
jaymedina Oct 19, 2022
b3f95f0
general cleanup; adding verbose comments;
jaymedina Oct 19, 2022
c2351e5
header cleanup for spoc and tica
jaymedina Oct 19, 2022
a13e371
removing ticacutoutfactory import from tests
jaymedina Oct 19, 2022
1287967
updating all calls to cube_cut
jaymedina Oct 19, 2022
908bd16
changing TIMESYS back to TDB for tica
jaymedina Oct 20, 2022
1109fe0
replacing NAs with None for consistency
jaymedina Oct 21, 2022
6c64746
changing pixel unit back to e- for tica
jaymedina Oct 21, 2022
39fb556
changing SIMDATA to False for tica
jaymedina Oct 21, 2022
675fb4e
general cleanup; adding product kwd desc. to all methods calling it;
jaymedina Oct 24, 2022
85e3f63
using or instead of |
jaymedina Oct 25, 2022
8704ee4
typo
jaymedina Oct 25, 2022
e8a8ca6
typo
jaymedina Oct 25, 2022
7ed74cd
adding a check for TICA to make sure we have WCS info in header
jaymedina Oct 25, 2022
3b071be
reverting indentation changes and spacing changes
jaymedina Oct 25, 2022
cd64048
removing conditional in wcs check
jaymedina Oct 25, 2022
0074d0c
removing calls to TicaCutoutFactory
jaymedina Oct 25, 2022
09ad86c
taking cutout_maker out of conditional
jaymedina Oct 25, 2022
78564b2
adding header keyword regression tests
jaymedina Oct 26, 2022
a3957a4
importing time
jaymedina Oct 26, 2022
270ed6f
initial commit to fix tests
jaymedina Oct 26, 2022
25242a0
fixed nan test with BARYCORR kwd; fixed DATE-OBS calculation check f…
jaymedina Oct 27, 2022
c9f8031
code style
jaymedina Oct 27, 2022
8eb05c8
test fix
jaymedina Oct 27, 2022
ed9d420
fixed test_cutouts
jaymedina Oct 27, 2022
bc4fa08
fixed make_cube test
jaymedina Oct 27, 2022
1431a64
pep8
jaymedina Oct 27, 2022
cdc5520
replacing None strings with None values
jaymedina Oct 28, 2022
50407ca
noting that product default is SPOC for cube_cut function
jaymedina Oct 28, 2022
62c53d8
removing extra line
jaymedina Oct 28, 2022
30b343f
removing call to first_ffi, last_ffi
jaymedina Oct 28, 2022
7b9f163
fixed typo on ASTATE comment
jaymedina Oct 28, 2022
1ae9a7d
renaming check1 ffi_type argument to product
jaymedina Oct 28, 2022
1275009
removing product arg from check1
jaymedina Oct 28, 2022
177 changes: 139 additions & 38 deletions astrocut/cube_cut.py
@@ -8,12 +8,13 @@
from time import time
from itertools import product

import numpy as np
import astropy.units as u
import numpy as np

from astropy.io import fits
from astropy.coordinates import SkyCoord
from astropy import wcs
from astropy.coordinates import SkyCoord
from astropy.io import fits
from astropy.time import Time

from . import __version__
from .exceptions import InputWarning, InvalidQueryError
@@ -30,14 +31,17 @@ class CutoutFactory():
"""
Class for creating image cutouts.

This class emcompasses all of the cutout functionality.
In the current version this means creating cutout target pixel files from TESS full frame image cubes.
This class encompasses all of the cutout functionality.
In the current version this means creating cutout target pixel files from both
SPOC (Science Processing Operations Center) and TICA (Tess Image CAlibration)
full frame image cubes.

Future versions will include more generalized cutout functionality.
"""

def __init__(self):
"""
Initiazation function.
Initialization function.
"""

self.cube_wcs = None # WCS information from the image cube
@@ -48,7 +52,8 @@ def __init__(self):
self.cutout_lims = np.zeros((2, 2), dtype=int) # Cutout pixel limits, [[ymin,ymax],[xmin,xmax]]
self.center_coord = None # Central skycoord

# Extra keywords from the FFI image headers (TESS specific)
# Extra keywords from the FFI image headers in SPOC.
# These are applied to both SPOC and TICA cutouts for consistency.
self.img_kwds = {"BACKAPP": [None, "background is subtracted"],
"CDPP0_5": [None, "RMS CDPP on 0.5-hr time scales"],
"CDPP1_0": [None, "RMS CDPP on 1.0-hr time scales"],
@@ -84,27 +89,34 @@ def __init__(self):
"VIGNAPP": [None, "vignetting or collimator correction applied"]}


def _parse_table_info(self, table_data, verbose=False):
def _parse_table_info(self, product, table_data, verbose=False):
Member: We do use this function in tesscut. Perhaps we shouldn't be, since it's "internal", or perhaps it should not be an internal function. Nevertheless, it's fine as you have it, but it might have been nice to have had product be a keyword argument that defaulted to SPOC instead of a positional argument, so that downstream code wouldn't have to change. I believe you are also taking care of the downstream code, at least in tesscut. Are there any other packages that might use this function? Astroquery?

Member: I'm fine with not changing this now; just pointing out that we need to change it in tesscut (also in the s3_support branch, when we switch over to using this).

jaymedina (Collaborator, Author) Oct 28, 2022: I'm in favor of changing this here to prevent any future bugs elsewhere. I'll work on this next.

EDIT: On second thought, it might be better to leave product as a required positional argument. We would still have to remember to change product to the Enum call wherever this function is used in TESSCut, because we can't have SPOC as the default for both SPOC and TICA. And a "missing positional argument" error is more straightforward to fix than whatever error would come from attempting to parse SPOC info in a TICA table (it would probably be a missing keyword error, but still).

Also, _parse_table_info doesn't seem to be explicitly used in astroquery.

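A minimal, self-contained sketch of the trade-off discussed in the thread above, using hypothetical stand-in functions rather than astrocut's actual _parse_table_info:

# Hypothetical stand-ins (not astrocut code) contrasting a required positional
# `product` argument with a keyword argument that defaults to 'SPOC'.

def wcsaxes_kwd_positional(product, table_row):
    # Callers must always pass `product`; forgetting it fails immediately
    # with a TypeError at the call site.
    return 'WCSAXES' if product == 'SPOC' else 'WCAX3'

def wcsaxes_kwd_default(table_row, product='SPOC'):
    # Existing call sites keep working unchanged, but a TICA row passed
    # without product='TICA' is silently treated as SPOC and only fails
    # later (e.g. a KeyError on the missing 'WCSAXES' keyword).
    return 'WCSAXES' if product == 'SPOC' else 'WCAX3'

tica_row = {'WCAX3': 2, 'FFI_FILE': 'hlsp_tica_ffi.fits'}  # hypothetical TICA-style row

print(wcsaxes_kwd_positional('TICA', tica_row))  # -> 'WCAX3'
print(wcsaxes_kwd_default(tica_row))             # -> 'WCSAXES' (would break downstream)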
"""
Takes the header and one entry from the cube table of image header data,
builds a WCS object that encalpsulates the given WCS information,
Takes the header and the middle entry from the cube table (EXT 2) of image header data,
builds a WCS object that encapsulates the given WCS information,
and collects into a dictionary the other keywords we care about.

The WCS is stored in ``self.cube_wcs``, and the extra keywords in ``self.img_kwds``

Parameters
----------
product : str
The product type to make the cutouts from.
Can either be 'SPOC' or 'TICA'.
table_data : `~astropy.io.fits.fitsrec.FITS_rec`
The cube image header data table.
"""

data_ind = len(table_data)//2 # using the middle file for table info
# Populating `table_row` with the primary header keywords
# of the middle FFI
data_ind = len(table_data)//2
table_row = None

# Making sure we have a row with wcs info
while table_row is None:
table_row = table_data[data_ind]
if table_row["WCSAXES"] != 2:

# Making sure we have a row with wcs info.
wcsaxes_keyword = 'WCSAXES' if product == 'SPOC' else 'WCAX3'
if table_row[wcsaxes_keyword] != 2:
table_row = None
data_ind += 1
if data_ind == len(table_data):
@@ -129,7 +141,7 @@ def _parse_table_info(self, table_data, verbose=False):
# Filling the img_kwds dictionary while we are here
for kwd in self.img_kwds:
self.img_kwds[kwd][0] = wcs_header.get(kwd)
# Adding the info about which FFI we got the
# Adding the info about which FFI we got the WCS info from
self.img_kwds["WCS_FFI"] = [table_data[data_ind]["FFI_FILE"],
"FFI used for cutout WCS"]

@@ -160,7 +172,7 @@ def _get_cutout_limits(self, cutout_size):
lims = np.zeros((2, 2), dtype=int)

for axis, size in enumerate(cutout_size):

if not isinstance(size, u.Quantity): # assume pixels
dim = size / 2
elif size.unit == u.pixel: # also pixels
@@ -322,7 +334,6 @@ def _get_cutout_wcs_dict(self):

cutout_wcs_dict = dict()


## Cutout array keywords ##

cutout_wcs_dict["WCAX{}"] = [wcs_header['WCSAXES'], "number of WCS axes"]
@@ -440,7 +451,7 @@ def _get_cutout(self, transposed_cube, verbose=True):
return img_cutout, uncert_cutout, aperture


def _update_primary_header(self, primary_header):
def _update_primary_header(self, product, primary_header):
"""
Updates the primary header for the cutout target pixel file by filling in
the object ra and dec with the central cutout coordinates and filling in
@@ -450,27 +461,102 @@ def _update_primary_header(self, primary_header):

Parameters
----------
product : str
The product type to make the cutouts from.
Can either be 'SPOC' or 'TICA'.
primary_header : `~astropy.io.fits.Header`
The primary header from the cube file that will be modified in place for the cutout.
"""

# Adding cutout specific headers
primary_header['CREATOR'] = ('astrocut', 'software used to produce this file')
primary_header['PROCVER'] = (__version__, 'software version')
primary_header['FFI_TYPE'] = (product, 'the FFI type used to make the cutouts')
# TODO : The name of FIRST_FFI (and LAST_FFI) is too long to be a header kwd value.
# Find a way to include these in the headers without breaking astropy (maybe abbreviate?).
# primary_header['FIRST_FFI'] = (self.first_ffi, 'the FFI used for the primary header
# keyword values, except TSTOP')
# primary_header['LAST_FFI'] = (self.last_ffi, 'the FFI used for the TSTOP keyword value')

primary_header['RA_OBJ'] = (self.center_coord.ra.deg, '[deg] right ascension')
primary_header['DEC_OBJ'] = (self.center_coord.dec.deg, '[deg] declination')

primary_header['TIMEREF'] = ('SOLARSYSTEM', 'barycentric correction applied to times')
primary_header['TASSIGN'] = ('SPACECRAFT', 'where time is assigned')
timeref = 'SOLARSYSTEM' if product == 'SPOC' else None
tassign = 'SPACECRAFT' if product == 'SPOC' else None
primary_header['TIMEREF'] = (timeref, 'barycentric correction applied to times')
primary_header['TASSIGN'] = (tassign, 'where time is assigned')
primary_header['TIMESYS'] = ('TDB', 'time system is Barycentric Dynamical Time (TDB)')

primary_header['BJDREFI'] = (2457000, 'integer part of BTJD reference date')
primary_header['BJDREFF'] = (0.00000000, 'fraction of the day in BTJD reference date')
primary_header['TIMEUNIT'] = ('d', 'time unit for TIME, TSTART and TSTOP')

delete_kwds_wildcards = ['SC_*', 'RMS*', 'A_*', 'AP_*', 'B_*', 'BP*', 'GAIN*', 'TESS_*', 'CD*',
'CT*', 'CRPIX*', 'CRVAL*', 'MJD*']
delete_kwds = ['COMMENT', 'FILTER', 'TIME', 'EXPTIME', 'ACS_MODE', 'DEC_TARG', 'FLXWIN', 'RA_TARG',
'CCDNUM', 'CAMNUM', 'WCSGDF', 'UNITS', 'CADENCE', 'SCIPIXS', 'INT_TIME', 'PIX_CAT',
'REQUANT', 'DIFF_HUF', 'PRIM_HUF', 'QUAL_BIT', 'SPM', 'STARTTJD', 'ENDTJD', 'CRM',
'TJD_ZERO', 'CRM_N', 'ORBIT_ID', 'MIDTJD']

if product == 'TICA':

# Adding some missing kwds not in TICA (but in Ames-produced SPOC ffis)
primary_header['EXTVER'] = ('1', 'extension version number (not format version)')
primary_header['SIMDATA'] = (False, 'file is based on simulated data')
primary_header['NEXTEND'] = ('2', 'number of standard extensions')
primary_header['TSTART'] = (primary_header['STARTTJD'], 'observation start time in TJD of first FFI')
primary_header['TSTOP'] = (primary_header['ENDTJD'], 'observation stop time in TJD of last FFI')
primary_header['CAMERA'] = (primary_header['CAMNUM'], 'Camera number')
primary_header['CCD'] = (primary_header['CCDNUM'], 'CCD chip number')
primary_header['ASTATE'] = (None, 'archive state F indicates single orbit processing')
primary_header['CRMITEN'] = (primary_header['CRM'], 'spacecraft cosmic ray mitigation enabled')
primary_header['CRBLKSZ'] = (None, '[exposures] s/c cosmic ray mitigation block siz')
primary_header['FFIINDEX'] = (primary_header['CADENCE'], 'number of FFI cadence interval of first FFI')
primary_header['DATA_REL'] = (None, 'data release version number')

date_obs = Time(primary_header['TSTART']+primary_header['BJDREFI'], format='jd').iso
date_end = Time(primary_header['TSTOP']+primary_header['BJDREFI'], format='jd').iso
primary_header['DATE-OBS'] = (date_obs, 'TSTART as UTC calendar date of first FFI')
primary_header['DATE-END'] = (date_end, 'TSTOP as UTC calendar date of last FFI')

primary_header['FILEVER'] = (None, 'file format version')
primary_header['RADESYS'] = (None, 'reference frame of celestial coordinates')
primary_header['SCCONFIG'] = (None, 'spacecraft configuration ID')
primary_header['TIMVERSN'] = (None, 'OGIP memo number for file format')

# Bulk removal with wildcards. Most of these should only live in EXT 1 header.
for kwd in delete_kwds_wildcards:
try:
del primary_header[kwd]
except KeyError:
continue

# Removal of specific kwds not relevant for cutouts.
# Most likely these describe a single FFI, and not
# the whole cube, which is misleading because we are
# working with entire stacks of FFIs. Other keywords
# are analogs to ones that have already been added
# to the primary header in the lines above.
for kwd in delete_kwds:
try:
del primary_header[kwd]
except KeyError:
continue

telapse = primary_header.get("TSTOP", 0) - primary_header.get("TSTART", 0)
primary_header['TELAPSE '] = (telapse, '[d] TSTOP - TSTART')


# Updating card comment to be more explicit
primary_header['DATE'] = (primary_header['DATE'], 'FFI cube creation date')

# Specifying that some of these headers keyword values are inherited from the first FFI
if product == 'SPOC':
primary_header['TSTART'] = (primary_header['TSTART'], 'observation start time in TJD of first FFI')
primary_header['TSTOP'] = (primary_header['TSTOP'], 'observation stop time in TJD of last FFI')
primary_header['DATE-OBS'] = (primary_header['DATE-OBS'], 'TSTART as UTC calendar date of first FFI')
primary_header['DATE-END'] = (primary_header['DATE-END'], 'TSTOP as UTC calendar date of last FFI')
primary_header['FFIINDEX'] = (primary_header['FFIINDEX'], 'number of FFI cadence interval of first FFI')

# These are all the things in the TESS pipeline tpfs about the object that we can't fill
primary_header['OBJECT'] = ("", 'string version of target id ')
primary_header['TCID'] = (0, 'unique tess target identifier')
@@ -538,6 +624,10 @@ def _add_img_kwds(self, table_header):
"""

for key in self.img_kwds:
# We'll skip these TICA-specific image keywords that are single-FFI specific
# or just not helpful
if (key == 'TIME') or (key == 'EXPTIME') or (key == 'FILTER'):
continue
table_header[key] = tuple(self.img_kwds[key])


@@ -564,12 +654,15 @@ def _apply_header_inherit(self, hdu_list):
hdu.header[kwd] = (primary_header[kwd], primary_header.comments[kwd])


def _build_tpf(self, cube_fits, img_cube, uncert_cube, cutout_wcs_dict, aperture, verbose=True):
def _build_tpf(self, product, cube_fits, img_cube, uncert_cube, cutout_wcs_dict, aperture):
"""
Building the cutout target pixel file (TPF) and formatting it to match TESS pipeline TPFs.

Parameters
----------
product : str
The product type to make the cutouts from.
Can either be 'SPOC' or 'TICA'.
cube_fits : `~astropy.io.fits.hdu.hdulist.HDUList`
The cube hdu list.
img_cube : `numpy.array`
@@ -594,41 +687,48 @@ def _build_tpf(self, cube_fits, img_cube, uncert_cube, cutout_wcs_dict, aperture
# The primary hdu is just the main header, which is the same
# as the one on the cube file
primary_hdu = cube_fits[0]
self._update_primary_header(primary_hdu.header)
self._update_primary_header(product, primary_hdu.header)

cols = list()

# Adding the cutouts
tform = str(img_cube[0].size) + "E"
dims = str(img_cube[0].shape[::-1])
empty_arr = np.zeros(img_cube.shape)

# Adding the Time relates columns
start = 'TSTART' if product == 'SPOC' else 'STARTTJD'
stop = 'TSTOP' if product == 'SPOC' else 'ENDTJD'
cols.append(fits.Column(name='TIME', format='D', unit='BJD - 2457000, days', disp='D14.7',
array=(cube_fits[2].columns['TSTART'].array + cube_fits[2].columns['TSTOP'].array)/2))
array=(cube_fits[2].columns[start].array + cube_fits[2].columns[stop].array)/2))

cols.append(fits.Column(name='TIMECORR', format='E', unit='d', disp='E14.7',
array=cube_fits[2].columns['BARYCORR'].array))
if product == 'SPOC':
cols.append(fits.Column(name='TIMECORR', format='E', unit='d', disp='E14.7',
array=cube_fits[2].columns['BARYCORR'].array))

# Adding CADENCENO as zeros b/c we don't have this info
cols.append(fits.Column(name='CADENCENO', format='J', disp='I10', array=empty_arr[:, 0, 0]))
# Adding CADENCENO as zeros for SPOC b/c we don't have this info
cadence_array = empty_arr[:, 0, 0] if product == 'SPOC' else cube_fits[2].columns['CADENCE'].array
cols.append(fits.Column(name='CADENCENO', format='J', disp='I10', array=cadence_array))

# Adding counts (-1 b/c we don't have data)
cols.append(fits.Column(name='RAW_CNTS', format=tform.replace('E', 'J'), unit='count', dim=dims, disp='I8',
array=empty_arr-1, null=-1))

# Adding flux and flux_err (data we actually have!)
cols.append(fits.Column(name='FLUX', format=tform, dim=dims, unit='e-/s', disp='E14.7', array=img_cube))
cols.append(fits.Column(name='FLUX_ERR', format=tform, dim=dims, unit='e-/s', disp='E14.7', array=uncert_cube))
pixel_unit = 'e-/s' if product == 'SPOC' else 'e-'
cols.append(fits.Column(name='FLUX', format=tform, dim=dims, unit=pixel_unit, disp='E14.7', array=img_cube))
cols.append(fits.Column(name='FLUX_ERR', format=tform, dim=dims, unit=pixel_unit, disp='E14.7',
array=uncert_cube))

# Adding the background info (zeros b.c we don't have this info)
cols.append(fits.Column(name='FLUX_BKG', format=tform, dim=dims, unit='e-/s', disp='E14.7', array=empty_arr))
cols.append(fits.Column(name='FLUX_BKG', format=tform, dim=dims, unit=pixel_unit, disp='E14.7',
array=empty_arr))
cols.append(fits.Column(name='FLUX_BKG_ERR', format=tform, dim=dims,
unit='e-/s', disp='E14.7', array=empty_arr))
unit=pixel_unit, disp='E14.7', array=empty_arr))

# Adding the quality flags
data_quality = 'DQUALITY' if product == 'SPOC' else 'QUAL_BIT'
cols.append(fits.Column(name='QUALITY', format='J', disp='B16.16',
array=cube_fits[2].columns['DQUALITY'].array))
array=cube_fits[2].columns[data_quality].array))

# Adding the position correction info (zeros b.c we don't have this info)
cols.append(fits.Column(name='POS_CORR1', format='E', unit='pixel', disp='E14.7', array=empty_arr[:, 0, 0]))
Expand Down Expand Up @@ -672,9 +772,8 @@ def _build_tpf(self, cube_fits, img_cube, uncert_cube, cutout_wcs_dict, aperture
return cutout_hdu_list



def cube_cut(self, cube_file, coordinates, cutout_size,
target_pixel_file=None, output_path=".", verbose=False):
product='SPOC', target_pixel_file=None, output_path=".", verbose=False):
"""
Takes a cube file (as created by `~astrocut.CubeFactory`), and makes a cutout target pixel
file of the given size around the given coordinates. The target pixel file is formatted like
@@ -697,6 +796,9 @@ def cube_cut(self, cube_file, coordinates, cutout_size,
order. Scalar numbers in ``cutout_size`` are assumed to be in
units of pixels. `~astropy.units.Quantity` objects must be in pixel or
angular units.
product : str
The product type to make the cutouts from.
Can either be 'SPOC' or 'TICA' (default is 'SPOC').
target_pixel_file : str
Optional. The name for the output target pixel file.
If no name is supplied, the file will be named:
@@ -721,7 +823,7 @@ def cube_cut(self, cube_file, coordinates, cutout_size,
with fits.open(cube_file, mode='denywrite', memmap=True) as cube:

# Get the info we need from the data table
self._parse_table_info(cube[2].data, verbose)
self._parse_table_info(product, cube[2].data, verbose)

if isinstance(coordinates, SkyCoord):
self.center_coord = coordinates
@@ -766,7 +868,7 @@ def cube_cut(self, cube_file, coordinates, cutout_size,
cutout_wcs_dict = self._get_cutout_wcs_dict()

# Build the TPF
tpf_object = self._build_tpf(cube, img_cutout, uncert_cutout, cutout_wcs_dict, aperture)
tpf_object = self._build_tpf(product, cube, img_cutout, uncert_cutout, cutout_wcs_dict, aperture)

if verbose:
write_time = time()
@@ -799,4 +901,3 @@ def cube_cut(self, cube_file, coordinates, cutout_size,
print("Total time: {:.2} sec".format(time()-start_time))

return target_pixel_file

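For context, a minimal usage sketch of the product keyword added in this PR. The file name, coordinates, and output path are hypothetical, and it assumes a TICA cube already built with astrocut's CubeFactory:

from astrocut import CutoutFactory

my_cutter = CutoutFactory()

# Cut a 5x5 pixel target pixel file out of a TICA cube.
# `product` defaults to 'SPOC', so it must be set explicitly for TICA cubes.
cutout_file = my_cutter.cube_cut(
    "tica-cube.fits",                          # hypothetical cube file
    coordinates="217.42893801 -62.67949189",   # RA/Dec string or a SkyCoord
    cutout_size=5,
    product="TICA",
    output_path="./cutouts",
    verbose=True,
)

print(cutout_file)  # path to the written cutout target pixel file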